Abstract
Background: Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires the assessment of entrustable professional activities (EPAs). Dashboards could be used to track the completion of EPAs to support program evaluation.
Methods: Using a design-based research process, we identified program evaluation needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. We interviewed leaders from the emergency medicine program and postgraduate medical education office at the University of Saskatchewan. Two investigators thematically analyzed the interview transcripts to identify program evaluation needs, which were audited by two additional investigators. Identified needs were described using quotes, analytics, and visualizations.
Results: Between July 1, 2019, and April 6, 2021, we conducted 17 interviews with six participants (two program leaders and four institutional leaders). Four needs emerged as themes: tracking changes in overall assessment metrics, comparing metrics to the assessment plan, evaluating rotation performance, and engaging with the assessment metrics. We addressed these needs by presenting analytics and visualizations within a dashboard.
Conclusions: We identified program evaluation needs related to EPA assessments and designed dashboard elements to meet them. This work will inform the development of other CBME assessment dashboards designed to support program evaluation.
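The analytics described in the Results (for example, tracking changes in overall assessment metrics and comparing them to the assessment plan) can be illustrated with a minimal sketch. The code below is not drawn from the article or its dashboard; the sample data, column names, and planned monthly target are assumptions made purely for illustration.

```python
# Illustrative sketch only (assumed data and column names, not the authors'
# dashboard code): aggregate EPA assessment records into monthly metrics and
# compare them to a hypothetical assessment-plan target.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical EPA assessment records.
records = pd.DataFrame({
    "date": pd.to_datetime([
        "2019-07-03", "2019-07-15", "2019-08-02",
        "2019-08-20", "2019-09-05", "2019-09-28",
    ]),
    "epa": ["F1", "F2", "F1", "C1", "F3", "C1"],
    "entrustment_score": [3, 4, 4, 3, 5, 4],  # assumed 1-5 entrustment scale
})

# Tracking changes in overall assessment metrics: monthly volume and mean score.
monthly = (
    records.set_index("date")
    .resample("MS")
    .agg({"epa": "count", "entrustment_score": "mean"})
    .rename(columns={"epa": "completed", "entrustment_score": "mean_score"})
)

# Comparing metrics to the assessment plan: a constant planned target is assumed.
PLANNED_PER_MONTH = 4
monthly["planned"] = PLANNED_PER_MONTH

ax = monthly[["completed", "planned"]].plot(kind="bar", title="EPA assessments per month")
ax.set_ylabel("Assessments")
plt.tight_layout()
plt.savefig("epa_dashboard_sketch.png")
```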