Abstracts
Abstract
Background: Educators now use reported observations when assessing trainees' performance. However, they have little guidance on how to design and implement assessments based on reported observations.
Objective: The purpose of this scoping review was to map the literature on the use of reported observations in judging the performance of health professions education (HPE) trainees.
Methods: Arksey and O'Malley's (2005) framework was used with four databases (ERIC, CINAHL, MEDLINE, PsycINFO). Eligibility criteria for articles were: (1) written in English or French; (2) reporting primary data; (3) addressing initial or professional training; (4) situated in an HPE program; (5) addressing workplace-based assessment; and (6) addressing assessment based on reported observations. The inclusion/exclusion and data extraction steps were each performed by two reviewers (agreement rate > 90%). Data were charted with a pre-established, iteratively refined extraction grid. Descriptive analyses were used to summarize quantitative data, and thematic analysis was conducted for qualitative data.
Results: Based on 36 papers and 13 consultations, the team identified six steps characterizing trainee performance assessment based on reported observations in HPE: (1) making first contact, (2) observing and documenting the trainee performance, (3) collecting and completing assessment data, (4) aggregating assessment data, (5) inferring the level of competence, and (6) documenting and communicating the decision to the stakeholders.
Discussion: Mapping the design and implementation of assessment based on reported observations is a first step towards quality implementation, guiding the educators and administrators responsible for graduating competent professionals. Future research might focus on understanding the context beyond assessor cognition to ensure the quality of meta-assessors' decisions.
Résumé
Contexte : Les éducateurs utilisent désormais les observations rapportées pour évaluer la performance de leurs stagiaires. Malheureusement, ils disposent de peu d'informations sur la manière de concevoir et de mettre en œuvre des évaluations basées sur les observations rapportées.
Objectif : L'objectif de cette étude de la portée des écrits était de recenser la littérature sur l'utilisation des observations rapportées lors de l'évaluation de la performance de stagiaires dans les programmes professionnalisants en santé (PPS).
Méthodes : La méthode d'Arksey et O'Malley (2005) a été utilisée et quatre bases de données ont été interrogées (ERIC, CINAHL, MEDLINE, PsycINFO). Les critères d'admissibilité des articles étaient les suivants : (1) écrit en anglais ou en français ; (2) comprend des données primaires ; (3) traite de la formation initiale ou professionnelle ; (4) se situe en formation d'un PPS ; (5) traite de l'évaluation en stage ; et (6) traite de l'évaluation basée sur des observations rapportées. Les étapes d'inclusion/exclusion et d'extraction des données ont été réalisées par deux personnes (taux d'accord > 90%). Nous avons extrait les données avec une grille d'extraction préétablie et itérative. Des analyses descriptives ont été menées pour résumer les données quantitatives et une analyse thématique pour résumer les données qualitatives.
Résultats : Sur la base de 36 articles et de 13 consultations, nous avons identifié six étapes caractérisant l'évaluation de la performance des stagiaires basée sur des observations rapportées dans les PPS : (1) établir un premier contact, (2) observer et documenter la performance du stagiaire, (3) recueillir et compléter les données d'évaluation, (4) agréger les données d'évaluation, (5) déduire le niveau de compétence, et (6) documenter et communiquer la décision aux parties prenantes.
Discussion : La conception et la mise en œuvre de l'évaluation sur la base d'observations rapportées constituent un premier pas vers une évaluation de qualité, en guidant les éducateurs et les administrateurs responsables de la formation de professionnels compétents. Les recherches futures pourraient se concentrer sur la compréhension du contexte au-delà de la cognition de l'évaluateur afin de garantir la qualité des décisions prises par les métaévaluateurs.
Appendices
Bibliography
- Gofton W, Dudek N, Barton G, Bhanji F. Workplace-based assessment implementation guide: formative tips for medical teaching practice. R Coll Physicians Surg Can. 2017;1‑12.
- Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676‑82. https://doi.org/10.3109/0142159X.2010.500704
- Gingerich A, Kogan J, Yeates P, Govaerts M, Holmboe E. Seeing the 'black box' differently: assessor cognition from three research perspectives. Med Educ. 2014;48(11):1055‑68. https://doi.org/10.1111/medu.12546
- Kinnear B, Warm EJ, Hauer KE. Twelve tips to maximize the value of a clinical competency committee in postgraduate medical education. Med Teach. 2018;40(11):1110‑5. https://doi.org/10.1080/0142159X.2018.1474191
- Schuwirth LWT, van der Vleuten CPM. A history of assessment in medical education. Adv Health Sci Educ. 2020;25(5):1045‑56. https://doi.org/10.1007/s10459-020-10003-0
- van der Vleuten CPM, Schuwirth LWT, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205‑14. https://doi.org/10.3109/0142159X.2012.652239
- Hodges B. Assessment in the post-psychometric era: learning to love the subjective and collective. Med Teach. 2013;35(7):564‑8. https://doi.org/10.3109/0142159X.2013.789134
- Colbert CY, French JC, Herring ME, Dannefer EF. Fairness: the hidden challenge for competency-based postgraduate medical education programs. Perspect Med Educ. 2017;6:347‑55. https://doi.org/10.1007/s40037-017-0359-8
- Colbert CY, Dannefer EF, French JC. Clinical competency committees and assessment: changing the conversation in graduate medical education. J Grad Med Educ. 2015;7(2):162‑5. https://doi.org/10.4300/JGME-D-14-00448.1
- French JC, Dannefer EF, Colbert CY. A systematic approach toward building a fully operational clinical competency committee. J Surg Educ. 2014;71(6):e22‑7. https://doi.org/10.1016/j.jsurg.2014.04.005
- Larocque S, Luhanga FL. Exploring the issue of failure to fail in a nursing program. Int J Nurs Educ Scholarsh. 2013;10(1):115‑22. https://doi.org/10.1515/ijnes-2012-0037
- Kane M. The argument-based approach to validation. Sch Psychol Rev. 2013;42(4):448‑57. https://doi.org/10.1080/02796015.2013.12087465
- Kane MT. Validating the interpretations and uses of test scores. J Educ Meas. 2013;50(1):1‑73. https://doi.org/10.1111/jedm.12000
- Zumbo BD, Chan EKH. Validity and validation in social, behavioral, and health sciences. Cham: Springer; 2014. https://doi.org/10.1007/978-3-319-07794-9
- Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19‑32. https://doi.org/10.1080/1364557032000119616
- Peters MDJ, Marnie C, Tricco AC, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth. 2020;18(10):2119‑26. https://doi.org/10.11124/JBIES-20-00167
- Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69. https://doi.org/10.1186/1748-5908-5-69
- Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467‑73. https://doi.org/10.7326/M18-0850
- Thomas A, Lubarsky S, Durning SJ, Young ME. Knowledge syntheses in medical education: demystifying scoping reviews. Acad Med. 2017;92(2):161‑6. https://doi.org/10.1097/ACM.0000000000001452
- Braun V, Clarke V. Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern‐based qualitative analytic approaches. Couns Psychother Res. 2021;21(1):37‑47. https://doi.org/10.1002/capr.12360
- Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77‑101. https://doi.org/10.1191/1478088706qp063oa
- Donato AA, Alweis R, Wenderoth S. Design of a clinical competency committee to maximize formative feedback. J Community Hosp Intern Med Perspect. 2016;6(6):33533. https://doi.org/10.3402/jchimp.v6.33533
- Moonen-van Loon JMW, Overeem K, Govaerts MJB, Verhoeven BH, van der Vleuten CPM, Driessen EW. The reliability of multisource feedback in competency-based assessment programs: the effects of multiple occasions and assessor groups. Acad Med. 2015;90(8):1093‑9. https://doi.org/10.1097/ACM.0000000000000763
- Keeley MG, Gusic ME, Morgan HK, Aagaard EM, Santen SA. Moving toward summative competency assessment to individualize the postclerkship phase. Acad Med. 2019;94(12):1858‑64. https://doi.org/10.1097/ACM.0000000000002830
- Hemmer PA, Hawkins R, Jackson JL, Pangaro LN. Assessing how well three evaluation methods detect deficiencies in medical students’ professionalism in two settings of an internal medicine clerkship. Acad Med. 2000;75(2):167‑73. https://doi.org/10.1097/00001888-200002000-00016
- Murray KE, Lane JL, Carraccio C, et al. Crossing the gap: using competency-based assessment to determine whether learners are ready for the undergraduate-to-graduate transition. Acad Med. 2019;94(3):338‑45. https://doi.org/10.1097/ACM.0000000000002535
- Hicks PJ, Margolis MJ, Carraccio CL, et al. A novel workplace-based assessment for competency-based decisions and learner feedback. Med Teach. 2018;40(11):1143‑50. https://doi.org/10.1080/0142159X.2018.1461204
- Rich JV, Fostaty Young S, Donnelly C, et al. Competency-based education calls for programmatic assessment: but what does this look like in practice? J Eval Clin Pract. 2020;26(4):1087‑95. https://doi.org/10.1111/jep.13328
- Driessen EW, van Tartwijk J, Govaerts M, Teunissen P, van der Vleuten CP. The use of programmatic assessment in the clinical workplace: a Maastricht case report. Med Teach. 2012;34(3):226‑31. https://doi.org/10.3109/0142159X.2012.652242
- Cianciolo AT, Hingle S, Hudali T, Beason AM. Evaluating clerkship competency without exams. Clin Teach. 2020;17(6):624‑8. https://doi.org/10.1111/tct.13114
- Lass SL, Kornreich HK, Hoffmann KI, Friedman DB. Consistency in ratings of clinical performance of the same students throughout medical school and internship. Annu Conf Res Med Educ. 1977;16:147‑52. PMID: 606069
- Perry M, Linn A, Munzer BW, et al. Programmatic assessment in emergency medicine: implementation of best practices. J Grad Med Educ. 2018;10(1):84‑90. https://doi.org/10.4300/JGME-D-17-00094.1
- Helminen K, Tossavainen K, Turunen H. Assessing clinical practice of student nurses: views of teachers, mentors and students. Nurse Educ Today. 2014;34(8):1161‑6. https://doi.org/10.1016/j.nedt.2014.04.007
- Duitsman ME, Fluit CRMG, van der Goot WE, et al. Judging residents’ performance: a qualitative study using grounded theory. BMC Med Educ. 2019;19(1):13. https://doi.org/10.1186/s12909-018-1446-1
- Hemmer PA, Dadekian GA, Terndrup C, et al. Regular formal evaluation sessions are effective as frame-of-reference training for faculty evaluators of clerkship medical students. J Gen Intern Med. 2015;30(9):1313‑8. https://doi.org/10.1007/s11606-015-3294-6
- Yonge O, Myrick F, Ferguson LM. Preceptored students in rural settings want feedback. Int J Nurs Educ Scholarsh. 2011;8(1). https://doi.org/10.2202/1548-923X.2047
- Battistone M, Pendleton B, Milne C, et al. Global descriptive evaluations are more responsive than global numeric ratings in detecting students’ progress during the inpatient portion of an internal medicine clerkship. Acad Med. 2001;76(10):S105‑7. https://doi.org/10.1097/00001888-200110001-00035
- Wu XV, Enskär K, Pua LH, Heng DGN, Wang W. Clinical nurse leaders’ and academics’ perspectives in clinical assessment of final‐year nursing students: a qualitative study. Nurs Health Sci. 2017;19(3):287‑93. https://doi.org/10.1111/nhs.12342
- Borman KR, Augustine R, Leibrandt T, Pezzi CM, Kukora JS. Initial performance of a modified milestones global evaluation tool for semiannual evaluation of residents by faculty. J Surg Educ. 2013;70(6):739‑49. https://doi.org/10.1016/j.jsurg.2013.08.004
- Hauer KE, Chesluk B, Iobst W, et al. Reviewing residents’ competence: a qualitative study of the role of clinical competency committees in performance assessment. Acad Med. 2015;90(8):1084‑92. https://doi.org/10.1097/ACM.0000000000000736
- Goodyear HM, Lakshminarayana I, Wall D, Bindal T. A multisource feedback tool to assess ward round leadership skills of senior paediatric trainees: (2) Testing reliability and practicability. Postgrad Med J. 2015;91(1075):268‑73. https://doi.org/10.1136/postgradmedj-2015-133308
- Ekpenyong A, Baker E, Harris I, et al. How do clinical competency committees use different sources of data to assess residents’ performance on the internal medicine milestones? A mixed methods pilot study. Med Teach. 2017;39(10):1074‑83. https://doi.org/10.1080/0142159X.2017.1353070
- Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1(2):278‑86. https://doi.org/10.4300/JGME-D-09-00010.1
- Berger JS, Pan E, Thomas J. A randomized, controlled crossover study to discern the value of 360-degree versus traditional, faculty-only evaluation for performance improvement of anesthesiology residents. J Educ Perioper Med. 2009;11(2):E053. https://doi.org/10.46374/volxi-issue2-berger
- Nabors C, Forman L, Peterson SJ, et al. Milestones: a rapid assessment method for the Clinical Competency Committee. Arch Med Sci. 2017;13(1):201‑9. https://doi.org/10.5114/aoms.2016.64045
- Chan TM, Sherbino J, Mercuri M. Nuance and noise: lessons learned from longitudinal aggregated assessment data. J Grad Med Educ. 2017;9(6):724‑9. https://doi.org/10.4300/JGME-D-17-00086.1
- Hauer KE, Clauser J, Lipner RS, et al. The internal medicine reporting milestones: cross-sectional description of initial implementation in U.S. residency programs. Ann Intern Med. 2016;165(5):356‑62. https://doi.org/10.7326/M15-2411
- Friedman KA, Balwan S, Cacace F, Katona K, Sunday S, Chaudhry S. Impact on house staff evaluation scores when changing from a Dreyfus- to a Milestone-based evaluation model: one internal medicine residency program’s findings. Med Educ Online. 2014;19(1). https://doi.org/10.3402/meo.v19.25185
- Aagaard E, Kane GC, Conforti L, et al. Early feedback on the use of the internal medicine reporting milestones in assessment of resident performance. J Grad Med Educ. 2013;5(3):433‑8. https://doi.org/10.4300/JGME-D-13-00001.1
- Chan TM, Sebok-Syer SS, Sampson C, Monteiro S. The quality of assessment of learning (Qual) score: validity evidence for a scoring system aimed at rating short, workplace-based comments on trainee performance. Teach Learn Med. 2020;32(3):319‑29. https://doi.org/10.1080/10401334.2019.1708365
- Ogunyemi D, Eno M, Rad S, Fong A, Alexander C, Azziz R. Evaluating professionalism, practice-based learning and improvement, and systems-based practice: utilization of a compliance form and correlation with conflict styles. J Grad Med Educ. 2010;2(3):423‑9. https://doi.org/10.4300/JGME-D-10-00048.1
- Engström M, Löfmark A, Vae KJU, Mårtensson G. Nursing students’ perceptions of using the Clinical Education Assessment tool AssCE and their overall perceptions of the clinical learning environment - A cross-sectional correlational study. Nurse Educ Today. 2017;51:63‑7. https://doi.org/10.1016/j.nedt.2017.01.009
- Park YS, Zar FA, Norcini JJ, Tekian A. Competency evaluations in the next accreditation system: contributing to guidelines and implications. Teach Learn Med. 2016;28(2):135‑45. https://doi.org/10.1080/10401334.2016.1146607
- Watson RS, Borgert AJ, O’Heron CT, et al. A multicenter prospective comparison of the Accreditation Council for Graduate Medical Education milestones: clinical competency committee vs. resident self-assessment. J Surg Educ. 2017;74(6):e8‑14. https://doi.org/10.1016/j.jsurg.2017.06.009
- Ginsburg S, Regehr G, Lingard L, Eva KW. Reading between the lines: faculty interpretations of narrative evaluation comments. Med Educ. 2015;49(3):296‑306. https://doi.org/10.1111/medu.12637
- Douglass KA, Jacquet GA, Hayward AS, et al. Development of a global health milestones tool for learners in emergency medicine: a pilot project. AEM Educ Train. 2017;1(4):269‑79. https://doi.org/10.1002/aet2.10046
- Bartlett KW, Whicker SA, Bookman J, et al. Milestone-based assessments are superior to Likert-type assessments in illustrating trainee progression. J Grad Med Educ. 2015;7(1):75‑80. https://doi.org/10.4300/JGME-D-14-00389.1
- The Royal College of Physicians and Surgeons of Canada. CanMEDS Milestones. Ottawa (ON): The Royal College of Physicians and Surgeons of Canada; 2015. Available from: https://canmeds.royalcollege.ca/en/milestones
- Marceau M, Gallagher F, Young M, St-Onge C. Validity as a social imperative for assessment in health professions education: a concept analysis. Med Educ. 2018;52(6):641‑53. https://doi.org/10.1111/medu.13574
- American Educational Research Association, American Psychological Association, National Council on Measurement in Education, Joint Committee on Standards for Educational and Psychological Testing (U.S.). Standards for educational and psychological testing. Washington (DC): American Educational Research Association; 2014.
- Gauthier G, St‐Onge C, Tavares W. Rater cognition: review and integration of research findings. Med Educ. 2016;50(5):511‑22. https://doi.org/10.1111/medu.12973
- St-Onge C, Chamberland M, Lévesque A, Varpio L. Expectations, observations, and the cognitive processes that bind them: expert assessment of examinee performance. Adv Health Sci Educ. 2016;21:627‑42. https://doi.org/10.1007/s10459-015-9656-3
- Govaerts M, van der Vleuten CPM. Validity in work‐based assessment: expanding our horizons. Med Educ. 2013;47(12):1164‑74. https://doi.org/10.1111/medu.12289
- Kogan JR, Conforti L, Bernabeo E, Iobst W, Holmboe E. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45(10):1048‑60. https://doi.org/10.1111/j.1365-2923.2011.04025.x
- Yeates P, Cardell J, Byrne G, Eva KW. Relatively speaking: contrast effects influence assessors’ scores and narrative feedback. Med Educ. 2015;49(9):909‑19. https://doi.org/10.1111/medu.12777
- Lee V, Brain K, Martin J. From opening the ‘black box’ to looking behind the curtain: cognition and context in assessor-based judgements. Adv Health Sci Educ. 2019;24:85‑102. https://doi.org/10.1007/s10459-018-9851-0
- Pack R, Lingard L, Watling CJ, Chahine S, Cristancho SM. Some assembly required: tracing the interpretative work of Clinical Competency Committees. Med Educ. 2019;53(7):723‑34. https://doi.org/10.1111/medu.13884
- Hauer KE, ten Cate O, Boscardin CK, et al. Ensuring resident competence: a narrative review of the literature on group decision making to inform the work of Clinical Competency Committees. J Grad Med Educ. 2016;8(2):156‑64. https://doi.org/10.4300/JGME-D-15-00144.1
- Cheung WJ, Patey AM, Frank JR, Mackay M, Boet S. Barriers and enablers to direct observation of trainees’ clinical performance: a qualitative study using the theoretical domains framework. Acad Med. 2019;94(1):101‑14. https://doi.org/10.1097/ACM.0000000000002396
- Watling C, LaDonna KA, Lingard L, Voyer S, Hatala R. ‘Sometimes the work just needs to be done’: socio‐cultural influences on direct observation in medical training. Med Educ. 2016;50(10):1054‑64. https://doi.org/10.1111/medu.13062
- St-Onge C. Enjeux et défis de l’évaluation longitudinale: quelques pistes de réflexion préalables à son implantation. Pédagogie Médicale. 2018;19(3):137‑42. https://doi.org/10.1051/pmed/2019022
- Madan R, Conn D, Dubo E, Voore P, Wiesenfeld L. The enablers and barriers to the use of direct observation of trainee clinical skills by supervising faculty in a psychiatry residency program. Can J Psychiatry. 2012;57(4):269‑72. https://doi.org/10.1177/070674371205700411
- Kogan JR, Conforti LN, Yamazaki K, Iobst W, Holmboe ES. Commitment to change and challenges to implementing changes after workplace-based assessment rater training. Acad Med. 2017;92(3):394‑402. https://doi.org/10.1097/ACM.0000000000001319