1
dos S. Ribeiro C, van de Burgwal LHM, Regeer BJ. Overcoming challenges for designing and implementing the One Health approach: A systematic review of the literature. One Health. 2019;7. doi: 10.1016/j.onehlt.2019.100085
2
Hodges BD. A practical guide for medical teachers. Fifth edition. Edinburgh: Elsevier 2017.
3
Association for the Study of Medical Education. Understanding medical education: evidence, theory, and practice. Third edition. Hoboken, NJ: Wiley-Blackwell 2019.
4
Marshall S, editor. A handbook for teaching and learning in higher education: enhancing academic practice. Fifth edition. Abingdon, Oxon: Routledge 2020.
5
Holmboe ES, Durning SJ. Practical Guide to the Assessment of Clinical Competence. Third edition. 2024.
6
Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Medical Teacher. 2011;33:206–14. doi: 10.3109/0142159X.2011.551559
7
Van Der Vleuten CPM. The assessment of professional competence: Developments, research and practical implications. Advances in Health Sciences Education. 1996;1:41–67. doi: 10.1007/BF00596229
8
van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Medical Education. 2005;39:309–17. doi: 10.1111/j.1365-2929.2005.02094.x
9
Schuwirth LW, van der Vleuten CP. How to Design a Useful Test: The Principles of Assessment. In: Swanwick T, ed. Understanding Medical Education. Oxford, UK: Wiley-Blackwell 2019:277–89.
10
van der Vleuten CPM, Schuwirth LWT, Driessen EW, et al. A model for programmatic assessment fit for purpose. Medical Teacher. 2012;34:205–14. doi: 10.3109/0142159X.2012.652239
11
Biggs J. Enhancing Teaching through Constructive Alignment. Higher Education. 1996;32:347–64.
12
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65:S63–7.
13
Schuwirth LWT, van der Vleuten CPM. Programmatic assessment and Kane’s validity perspective. Medical Education. 2012;46:38–48. doi: 10.1111/j.1365-2923.2011.04098.x
14
Epstein RM, Hundert EM. Defining and Assessing Professional Competence. JAMA. 2002;287:226–35. doi: 10.1001/jama.287.2.226
15
ten Cate O. Nuts and Bolts of Entrustable Professional Activities. Journal of Graduate Medical Education. 2013;5:157–8. doi: 10.4300/JGME-D-12-00380.1
16
Ten Cate O. Competency-Based Postgraduate Medical Education: Past, Present and Future. GMS Journal for Medical Education. 2017;34. doi: 10.3205/zma001146
17
Cruess RL, Cruess SR, Steinert Y. Amending Miller’s Pyramid to Include Professional Identity Formation. Academic Medicine. 2016;91:180–5. doi: 10.1097/ACM.0000000000000913
18
Cobb KA, Brown G, Jaarsma DADC, et al. The educational impact of assessment: A comparison of DOPS and MCQs. Medical Teacher. 2013;35:e1598–607. doi: 10.3109/0142159X.2013.803061
19
Jolly B, Dalton MJ. Written Assessment. In: Swanwick T, Forrest K, O’Brien BC, eds. Understanding Medical Education. Chichester, UK: John Wiley & Sons, Ltd 2018:291–317.
20
Jolly B. Written Assessment. In: Swanwick T, ed. Understanding Medical Education. Oxford, UK: Wiley-Blackwell 2019:261.
21
Epstein RM. Assessment in Medical Education. New England Journal of Medicine. 2007;356:387–96. doi: 10.1056/NEJMra054784
22
Hift RJ. Should essays and other "open-ended"-type questions retain a place in written summative assessment in clinical medicine? BMC Medical Education. 2014;14. doi: 10.1186/s12909-014-0249-2
23
Paniagua M, Swygert K, editors. The Gold Book: Constructing Written Test Questions for the Basic and Clinical Sciences. Philadelphia: National Board of Medical Examiners.
24
Schuwirth LWT, van der Vleuten CPM. ABC of learning and teaching in medicine: Written assessment. BMJ. 2003;326:643–5. doi: 10.1136/bmj.326.7390.643
25
Schuwirth LWT, van der Vleuten CPM. Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education. 2004;38:974–9. doi: 10.1111/j.1365-2929.2004.01916.x
26
Schuwirth LWT, van der Vleuten CPM. General overview of the theories used in assessment: AMEE Guide No. 57. Medical Teacher. 2011;33:783–97. doi: 10.3109/0142159X.2011.611022
27
Charlin B, Roy L, Brailovsky C, et al. The Script Concordance Test: A Tool to Assess the Reflective Clinician. Teaching and Learning in Medicine. 2000;12:189–95. doi: 10.1207/S15328015TLM1204_5
28
Fournier J, Demeester A, Charlin B. Script Concordance Tests: Guidelines for Construction. BMC Medical Informatics and Decision Making. 2008;8. doi: 10.1186/1472-6947-8-18
29
Case SM, Swanson DB. Extended‐matching items: A practical alternative to free‐response questions. Teaching and Learning in Medicine. 1993;5:107–15. doi: 10.1080/10401339309539601
30
Dory V, Gagnon R, Vanpee D, et al. How to construct and implement script concordance tests: insights from a systematic review. Medical Education. 2012;46:552–63. doi: 10.1111/j.1365-2923.2011.04211.x
31
Farmer EA, Page G. A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education. 2005;39:1188–94. doi: 10.1111/j.1365-2929.2005.02339.x
32
Fenderson B. The virtues of extended matching and uncued tests as alternatives to multiple choice questions. Human Pathology. 1997;28:526–32. doi: 10.1016/S0046-8177(97)90073-3
33
Haladyna TM, Downing SM, Rodriguez MC. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Applied Measurement in Education. 2002;15:309–33. doi: 10.1207/S15324818AME1503_5
34
McCoubrie P. Improving the fairness of multiple-choice questions: a literature review. Medical Teacher. 2004;26:709–12. doi: 10.1080/01421590400013495
35
Miller MD, Linn RL, Gronlund NE. Measurement and assessment in teaching. 11th ed., International ed. Boston, Mass: Pearson Education 2013.
36
Palmer EJ, Devitt PG. Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions?: research paper. BMC Medical Education. 2007;7. doi: 10.1186/1472-6920-7-49
37
Lubarsky S, Dory V, Meterissian S, et al. Examining the effects of gaming and guessing on script concordance test scores. Perspectives on Medical Education. 2018;7:174–81. doi: 10.1007/s40037-018-0435-8
38
Anderson LW, Bloom BS. A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. Abridged ed. New York, N.Y.: Longman 2001.
39
Sam AH, Field SM, Collares CF, et al. Very-short-answer questions: reliability, discrimination and acceptability. Medical Education. 2018;52:447–55. doi: 10.1111/medu.13504
40
Cotton DRE, Cotton PA, Shipway JR. Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International. 2024;61:228–39. doi: 10.1080/14703297.2023.2190148
41
Boursicot KAM, Roberts TE, Burdick WP. Structured Assessments of Clinical Competence. In: Swanwick T, Forrest K, O’Brien BC, eds. Understanding Medical Education. Chichester, UK: John Wiley & Sons, Ltd 2018:335–45.
42
Schoonheim-Klein M, Muijtjens A, Habets L, et al. Who will pass the dental OSCE? Comparison of the Angoff and the borderline regression standard setting methods. European Journal of Dental Education. 2009;13:162–71. doi: 10.1111/j.1600-0579.2008.00568.x
43
Regehr G, MacRae H, Reznick RK, et al. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Academic Medicine. 1998;73:993–7.
44
Harden RM. Misconceptions and the OSCE. Medical Teacher. 2015;37:608–10. doi: 10.3109/0142159X.2015.1042443
45
Carraccio C, Wolfsthal SD, Englander R, et al. Shifting Paradigms: From Flexner to Competencies. Academic Medicine. 2002;77:361–7.
46
Rushforth HE. Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Education Today. 2007;27:481–90. doi: 10.1016/j.nedt.2006.08.009
47
Spielman A, Fulmer T, Eisenberg E, et al. Dentistry, Nursing, and Medicine: A Comparison of Core Competencies. Journal of Dental Education. 2005;69:1257–71.
48
Harden RM, Stevenson M, Downie WW, et al. Assessment of clinical competence using objective structured examination. BMJ. 1975;1:447–51. doi: 10.1136/bmj.1.5955.447
49
Watson R, Stimpson A, Topping A, et al. Clinical competence assessment in nursing: a systematic review of the literature. Journal of Advanced Nursing. 2002;39:421–31. doi: 10.1046/j.1365-2648.2002.02307.x
50
Williams DM, Davies S, Horner M, et al. Peer and near-peer OSCE examiners. Medical Teacher. 2016;38:212–3. doi: 10.3109/0142159X.2015.1072266
51
Brown C, Ross S, Cleland J, et al. Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Medical Teacher. 2015;37:653–9. doi: 10.3109/0142159X.2015.1033389
52
Meskell P, Burke E, Kropmans TJB, et al. Back to the future: An online OSCE Management Information System for nursing OSCEs. Nurse Education Today. 2015;35:1091–6. doi: 10.1016/j.nedt.2015.06.010
53
Tavakol M, Doody GA. A novel psychometric programme for the rapid analysis of OSCE data. Medical Teacher. 2016;38:104–5. doi: 10.3109/0142159X.2015.1062085
54
Eva KW, Rosenfeld J, Reiter HI, et al. An admissions OSCE: the multiple mini-interview. Medical Education. 2004;38:314–26. doi: 10.1046/j.1365-2923.2004.01776.x
55
Lane P. Recruitment into training for general practice—the winds of change or a breath of fresh air? BMJ. 2005;331:s153. doi: 10.1136/bmj.331.7520.s153
56
Hodges B, Regehr G, McNaughton N, et al. OSCE checklists do not capture increasing levels of expertise. Academic Medicine. 1999;74:1129–34.
57
Hodges B, McIlroy JH. Analytic global OSCE ratings are sensitive to level of training. Medical Education. 2003;37:1012–6. doi: 10.1046/j.1365-2923.2003.01674.x
58
Ma IWY, Zalunardo N, Pachev G, et al. Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation. Advances in Health Sciences Education. 2012;17:457–70. doi: 10.1007/s10459-011-9322-3
59
Wood TJ, Humphrey-Murto SM, Norman GR. Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method. Advances in Health Sciences Education. 2006;11:115–22. doi: 10.1007/s10459-005-7853-1
60
Harden RM. Revisiting ‘Assessment of clinical competence using an objective structured clinical examination (OSCE)’. Medical Education. 2016;50:376–9. doi: 10.1111/medu.12801
61
Harden RM, Lilley P, Patricio M, et al. The definitive guide to the OSCE: the Objective Structured Clinical Examination as a performance assessment. Edinburgh: Elsevier 2016.
62
Denison A, Bate E, Thompson J. Tablet versus paper marking in assessment: feedback matters. Perspectives on Medical Education. 2016;5:108–13. doi: 10.1007/s40037-016-0262-8
63
ten Cate O. Nuts and Bolts of Entrustable Professional Activities. Journal of Graduate Medical Education. 2013;5:157–8. doi: 10.4300/JGME-D-12-00380.1
64
Harden RM. Learning outcomes as a tool to assess progression. Medical Teacher. 2007;29:678–82. doi: 10.1080/01421590701729955
65
Ross M. Entrustable professional activities. The Clinical Teacher. 2015;12:223–5. doi: 10.1111/tct.12436
66
ten Cate O, Young JQ. The patient handover as an entrustable professional activity: adding meaning in teaching and practice. BMJ Quality & Safety. 2012;21:i9–12. doi: 10.1136/bmjqs-2012-001213
67
Aylward M, Nixon J, Gladding S. An Entrustable Professional Activity (EPA) for Handoffs as a Model for EPA Assessment Development. Academic Medicine. 2014;89:1335–40. doi: 10.1097/ACM.0000000000000317
68
Hauer KE, Soni K, Cornett P, et al. Developing Entrustable Professional Activities as the Basis for Assessment of Competence in an Internal Medicine Residency: A Feasibility Study. Journal of General Internal Medicine. 2013;28:1110–4. doi: 10.1007/s11606-013-2372-x
69
Orsini C, Binnie VI. Entrustment decisions in dental education: Is it time to start formalising? Medical Teacher. 2016;38:322–322. doi: 10.3109/0142159X.2015.1114598
70
Auewarakul C, Downing SM, Praditsuwan R, et al. Item Analysis to Improve Reliability for an Internal Medicine Undergraduate OSCE. Advances in Health Sciences Education. 2005;10:105–13. doi: 10.1007/s10459-005-2315-3
71
Newble DI, Swanson DB. Psychometric characteristics of the objective structured clinical examination. Medical Education. 1988;22:325–34. doi: 10.1111/j.1365-2923.1988.tb00761.x
72
Sturpe DA. Objective Structured Clinical Examinations in Doctor of Pharmacy Programs in the United States. American Journal of Pharmaceutical Education. 2010;74. doi: 10.5688/aj7408148
73
Snell LS, Frank JR. Competencies, the tea bag model, and the end of time. Medical Teacher. 2010;32:629–30. doi: 10.3109/0142159X.2010.500707
74
Gravina EW. Competency-Based Education and Its Effect on Nursing Education: A Literature Review. Teaching and Learning in Nursing. 2017;12:117–21. doi: 10.1016/j.teln.2016.11.004
75
Read EK, Bell C, Rhind S, et al. The Use of Global Rating Scales for OSCEs in Veterinary Medicine. PLOS ONE. 2015;10. doi: 10.1371/journal.pone.0121000
76
Wood TJ, Pugh D. Are rating scales really better than checklists for measuring increasing levels of expertise? Medical Teacher. 2020;42:46–51. doi: 10.1080/0142159X.2019.1652260
77
Hagel CM, Hall AK, Dagnone JD. Queen’s University Emergency Medicine Simulation OSCE: an Advance in Competency-Based Assessment. CJEM. 2016;18:230–3. doi: 10.1017/cem.2015.34
78
Tekian A, Ten Cate O, Holmboe E, et al. Entrustment decisions: Implications for curriculum development and assessment. Medical Teacher. 2020;1–7. doi: 10.1080/0142159X.2020.1733506
79
Peters H, Holzhausen Y, Boscardin C, et al. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Medical Teacher. 2017;39:802–7. doi: 10.1080/0142159X.2017.1331031
80
Kakadia R, Chen E, Ohyama H. Implementing an online OSCE during the COVID‐19 pandemic. Journal of Dental Education. Published Online First: 23 July 2020. doi: 10.1002/jdd.12323
81
Ryan A, Carson A, Reid K, et al. Fully online OSCEs: A large cohort case study. MedEdPublish. 2020;9. doi: 10.15694/mep.2020.000214.1
82
Boyle JG. Viva la VOSCE? BMC Medical Education. 2020;20.
83
Hopwood J, Myers G, Sturrock A. Twelve tips for conducting a virtual OSCE. Medical Teacher. 2021;43:633–6. doi: 10.1080/0142159X.2020.1830961
84
Norcini JJ. The Mini-CEX: A Method for Assessing Clinical Skills. Annals of Internal Medicine. 2003;138. doi: 10.7326/0003-4819-138-6-200303180-00012
85
Kessel D, Jenkins J, Neville E. Workplace based assessments are no more. BMJ. Published Online First: 26 September 2012. doi: 10.1136/bmj.e6193
86
Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher. 2007;29:855–71. doi: 10.1080/01421590701775453
87
Elstein AS, Sprafka SA, Shulman LS. Medical Problem Solving: An Analysis of Clinical Reasoning. Harvard University Press 2013.
88
Noel GL. How Well Do Internal Medicine Faculty Members Evaluate the Clinical Skills of Residents? Annals of Internal Medicine. 1992;117:757–65. doi: 10.7326/0003-4819-117-9-757
89
Kogan JR, Bellini LM, Shea JA. Feasibility, Reliability, and Validity of the Mini-Clinical Evaluation Exercise (mCEX) in a Medicine Core Clerkship. Academic Medicine. 2003;78.
90
Durning SJ, Cation LJ, Markert RJ, et al. Assessing the Reliability and Validity of the Mini-Clinical Evaluation Exercise for Internal Medicine Residency Training. Academic Medicine. 2002;77.
91
Holmboe ES, Huot S, Chung J, et al. Construct Validity of the MiniClinical Evaluation Exercise (miniCEX). Academic Medicine. 2003;78.
92
Torsney KM, Cocker DM, Slesser AAP. The Modern Surgeon and Competency Assessment: Are the Workplace-Based Assessments Evidence-Based? World Journal of Surgery. 2015;39:623–33. doi: 10.1007/s00268-014-2875-6
93
Mitchell C, Bhat S, Herbert A, et al. Workplace-based assessments of junior doctors: do scores predict training difficulties? Medical Education. 2011;45:1190–8. doi: 10.1111/j.1365-2923.2011.04056.x
94
Williams RG, Verhulst S, Colliver JA, et al. Assuring the reliability of resident performance appraisals: More items or more observations? Surgery. 2005;137:141–7. doi: 10.1016/j.surg.2004.06.011
95
Murphy DJ, Bruce DA, Mercer SW, et al. The reliability of workplace-based assessment in postgraduate medical education and training: a national evaluation in general practice in the United Kingdom. Advances in Health Sciences Education. 2009;14:219–32. doi: 10.1007/s10459-008-9104-8
96
Archer JC. Use of SPRAT for peer review of paediatricians in training. BMJ. 2005;330:1251–3. doi: 10.1136/bmj.38447.610451.8F
97
Quantrill SJ, Tun JK. Workplace-based assessment as an educational tool. Guide supplement 31.5 – Viewpoint. Medical Teacher. 2012;34:417–8. doi: 10.3109/0142159X.2012.668234
98
Hurst YK, Prescott-Clements LE, Rennie JS. The patient assessment questionnaire: A new instrument for evaluating the interpersonal skills of vocational dental practitioners. British Dental Journal. 2004;197:497–500. doi: 10.1038/sj.bdj.4811750
99
Humphrey-Murto S, Côté M, Pugh D, et al. Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise. Teaching and Learning in Medicine. 2018;30:152–61. doi: 10.1080/10401334.2017.1387553
100
Rekman J, Hamstra SJ, Dudek N, et al. A New Instrument for Assessing Resident Competence in Surgical Clinic: The Ottawa Clinic Assessment Tool. Journal of Surgical Education. 2016;73:575–82. doi: 10.1016/j.jsurg.2016.02.003
101
Sutherland RM, Reid KJ, Chiavaroli NG, et al. Assessing Diagnostic Reasoning Using a Standardized Case-Based Discussion. Journal of Medical Education and Curricular Development. 2019;6. doi: 10.1177/2382120519849411
102
Driessen EW, Muijtjens AMM, van Tartwijk J, et al. Web- or paper-based portfolios: is there a difference? Medical Education. 2007;41:1067–73. doi: 10.1111/j.1365-2923.2007.02859.x
103
Driessen E, van Tartwijk J. Portfolios in personal and professional development. In: Swanwick T, ed. Understanding Medical Education. Chichester, UK: Wiley-Blackwell 2013:255–62.
104
Siau K, Dunckley P, Valori R, et al. Changes in scoring of Direct Observation of Procedural Skills (DOPS) forms and the impact on competence assessment. Endoscopy. 2018;50:770–8. doi: 10.1055/a-0576-6667
105
Martinsen SSS, Espeland T, Berg EAR, et al. Examining the educational impact of the mini-CEX: a randomised controlled study. BMC Medical Education. 2021;21. doi: 10.1186/s12909-021-02670-3
106
Cohen L, Manion L, Morrison K. Research methods in education. Eighth edition. London: Routledge 2018.
107
Joanna Briggs Institute QARI. https://jbi.global/
108
Buckley S, Coleman J, Davison I, et al. The educational effects of portfolios on undergraduate student learning: A Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 11. Medical Teacher. 2009;31:282–98. doi: 10.1080/01421590902889897
109
Brookfield S. Developing critical thinkers: challenging adults to explore alternative ways of thinking and acting. Milton Keynes: Open University Press 1987.
110
Burls A. What is critical appraisal? 2009.
111
The Campbell Collaboration. http://www.campbellcollaboration.org/
112
CASP Critical Appraisal Skills Programme Oxford UK. http://www.casp-uk.net/
113
Cochrane | Trusted evidence. Informed decisions. Better health. http://www.cochrane.org/
114
Kee F, Bickle I. Critical thinking and critical appraisal: the chicken and the egg? QJM. 2004;97:609–14. doi: 10.1093/qjmed/hch099
115
Da Silva A, Dennick R. Corpus analysis of problem-based learning transcripts: an exploratory study. Medical Education. 2010;44:280–8. doi: 10.1111/j.1365-2923.2009.03575.x
116
Garrison DR. Critical thinking and adult education: a conceptual model for developing critical thinking in adult learners. International Journal of Lifelong Education. 1991;10:287–303. doi: 10.1080/0260137910100403
117
Hammick M, Dornan T, Steinert Y. Conducting a best evidence systematic review. Part 1: From idea to data coding. BEME Guide No. 13. Medical Teacher. 2010;32:3–15. doi: 10.3109/01421590903414245
118
Horsley T, Hyde C, Santesso N, et al. Teaching critical appraisal skills in healthcare settings. Cochrane Database of Systematic Reviews. Published Online First: 1 September 1996. doi: 10.1002/14651858.CD001270.pub2
119
Huang GC, Newman LR, Schwartzstein RM. Critical Thinking in Health Professions Education: Summary and Consensus Statements of the Millennium Conference 2011. Teaching and Learning in Medicine. 2014;26:95–102. doi: 10.1080/10401334.2013.857335
120
Jenicek M. The hard art of soft science: Evidence-Based Medicine, Reasoned Medicine or both? Journal of Evaluation in Clinical Practice. 2006;12:410–9. doi: 10.1111/j.1365-2753.2006.00718.x
121
Kirkpatrick D. Great Ideas Revisited: Revisiting Kirkpatrick’s Four-Level Model. Training and Development. 1996;50:54–9.
122
Missimer CA. Good arguments: an introduction to critical thinking. 3rd ed. Englewood Cliffs, N.J.: Prentice Hall 1995.
123
Moore TJ. Critical thinking and disciplinary thinking: a continuing debate. Higher Education Research & Development. 2011;30:261–74. doi: 10.1080/07294360.2010.501328
124
Paul R. Critical thinking: how to prepare students for a rapidly changing world. Foundation for Critical Thinking 1995.
125
Paul R, Elder L. The Miniature Guide to Critical Thinking: Concepts and Tools. 2006.
126
Yardley S, Dornan T. Kirkpatrick’s levels and education ‘evidence’. Medical Education. 2012;46:97–106. doi: 10.1111/j.1365-2923.2011.04076.x
127
Devine OP, Harborne AC, McManus IC. Assessment at UK medical schools varies substantially in volume, type and intensity and correlates with postgraduate attainment. BMC Medical Education. 2015;15. doi: 10.1186/s12909-015-0428-9
128
Ertmer PA, Richardson JC, Belland B, et al. Using Peer Feedback to Enhance the Quality of Student Online Postings: An Exploratory Study. Journal of Computer-Mediated Communication. 2007;12:412–33. doi: 10.1111/j.1083-6101.2007.00331.x
129
Nicol DJ, Macfarlane‐Dick D. Formative assessment and self‐regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education. 2006;31:199–218. doi: 10.1080/03075070600572090
130
Piedra N, Chicaiza J, Lopez J, et al. Measuring collaboration and creativity skills through rubrics: Experience from UTPL collaborative social networks course. IEEE EDUCON 2010 Conference. IEEE 2010:1511–6.
131
De Wever B, Van Keer H, Schellens T, et al. Assessing collaboration in a wiki: The reliability of university students’ peer assessment. The Internet and Higher Education. 2011;14:201–6. doi: 10.1016/j.iheduc.2011.07.003
132
Verkerk MA, de Bree MJ, Mourits MJE. Reflective professionalism: interpreting CanMEDS’ ‘professionalism’. Journal of Medical Ethics. 2007;33:663–6. doi: 10.1136/jme.2006.017954
133
Cleland JA, Knight LV, Rees CE, et al. Is it me or is it them? Factors that influence the passing of underperforming students. Medical Education. 2008;42:800–9. doi: 10.1111/j.1365-2923.2008.03113.x
134
Cruess R. The Professionalism Mini-Evaluation Exercise: A Preliminary Investigation. Academic Medicine. 2006;81.
135
Goldie J. Assessment of professionalism: A consolidation of current thinking. Medical Teacher. 2013;35:e952–6. doi: 10.3109/0142159X.2012.714888
136
van Mook WNKA, Gorter SL, O’Sullivan H, et al. Approaches to professional behaviour assessment: Tools in the professionalism toolbox. European Journal of Internal Medicine. 2009;20:e153–7. doi: 10.1016/j.ejim.2009.07.012
137
Ginsburg S, Regehr G, Lingard L. Basing the evaluation of professionalism on observable behaviours: a cautionary tale. Academic Medicine. 2004;79:S1–4.
138
Hodges BD, Ginsburg S, Cruess R, et al. Assessment of professionalism: Recommendations from the Ottawa 2010 Conference. Medical Teacher. 2011;33:354–63. doi: 10.3109/0142159X.2011.577300
139
Schubert S, Ortwein H, Dumitsch A, et al. A situational judgement test of professional behaviour: development and validation. Medical Teacher. 2008;30:528–33. doi: 10.1080/01421590801952994
140
Arnold L, Shue CK, Kritt B, et al. Medical students’ views on peer assessment of professionalism. Journal of General Internal Medicine. 2005;20:819–24. doi: 10.1111/j.1525-1497.2005.0162.x
141
Arnold L, Shue CK, Kalishman S, et al. Can There Be a Single System for Peer Assessment of Professionalism among Medical Students? A Multi-Institutional Study. Academic Medicine. 2007;82:578–86. doi: 10.1097/ACM.0b013e3180555d4e
142
Finn G, Sawdon M, Clipsham L, et al. Peer estimation of lack of professionalism correlates with low Conscientiousness Index scores. Medical Education. 2009;43:960–7. doi: 10.1111/j.1365-2923.2009.03453.x
143
Gaufberg E, Fitzpatrick A. The favour: a professional boundaries OSCE station. Medical Education. 2008;42:529–30. doi: 10.1111/j.1365-2923.2008.03067.x
144
Ginsburg S. Context, Conflict, and Resolution: A New Conceptual Framework for Evaluating Professionalism. Academic Medicine. 2000;75.
145
Ginsburg S, Regehr G, Mylopoulos M. From behaviours to attributions: further concerns regarding the evaluation of professionalism. Medical Education. 2009;43:414–25. doi: 10.1111/j.1365-2923.2009.03335.x
146
Ginsburg S, van der Vleuten C, Eva KW, et al. Hedging to save face: a linguistic analysis of written comments on in-training evaluation reports. Advances in Health Sciences Education. 2016;21:175–88. doi: 10.1007/s10459-015-9622-0
147
GMC. Development of generic professional capabilities. http://www.gmc-uk.org/education/23581.asp
148
Kelly M, O’Flynn S, McLachlan J, et al. The Clinical Conscientiousness Index. Academic Medicine. 2012;87:1218–24. doi: 10.1097/ACM.0b013e3182628499
149
McCormack WT, Lazarus C, Stern D, et al. Peer Nomination: A Tool for Identifying Medical Student Exemplars in Clinical Competence and Caring, Evaluated at Three Medical Schools. Academic Medicine. 2007;82:1033–9. doi: 10.1097/01.ACM.0000285345.75528.ee
150
McLachlan JC, Finn G, Macnaughton J. The Conscientiousness Index: A Novel Tool to Explore Students’ Professionalism. Academic Medicine. 2009;84:559–65. doi: 10.1097/ACM.0b013e31819fb7ff
151
Norcini JJ. Peer assessment of competence. Medical Education. 2003;37:539–43. doi: 10.1046/j.1365-2923.2003.01536.x
152
Papadakis MA, Teherani A, Banach MA, et al. Disciplinary Action by Medical Boards and Prior Behavior in Medical School. New England Journal of Medicine. 2005;353:2673–82. doi: 10.1056/NEJMsa052596
153
Pohl CA, Hojat M, Arnold L. Peer Nominations as Related to Academic Attainment, Empathy, Personality, and Specialty Interest. Academic Medicine. 2011;86:747–51. doi: 10.1097/ACM.0b013e318217e464
154
Ramsey PG. Use of Peer Ratings to Evaluate Physician Performance. JAMA: The Journal of the American Medical Association. 1993;269. doi: 10.1001/jama.1993.03500130069034
155
Royal College of Physicians. Doctors in Society: Medical professionalism in a changing world. 2005.
156
Stern DT, Frohna AZ, Gruppen LD. The prediction of professional behaviour. Medical Education. 2005;39:75–82. doi: 10.1111/j.1365-2929.2004.02035.x
157
Stern DT. Measuring medical professionalism. New York: Oxford University Press 2006.
158
van Mook WNKA, van Luijk SJ, O’Sullivan H, et al. General considerations regarding assessment of professional behaviour. European Journal of Internal Medicine. 2009;20:e90–5. doi: 10.1016/j.ejim.2008.11.011
159
Wilkinson TJ, Wade WB, Knock LD. A Blueprint to Assess Professionalism: Results of a Systematic Review. Academic Medicine. 2009;84:551–8. doi: 10.1097/ACM.0b013e31819fbaa2
160
Zijlstra-Shaw S, Robinson PG, Roberts T. Assessing professionalism within dental education; the need for a definition. European Journal of Dental Education. 2012;16:e128–36. doi: 10.1111/j.1600-0579.2011.00687.x
161
Patterson F, Zibarras L, Ashworth V. Situational judgement tests in medical education and training: Research, theory and practice: AMEE Guide No. 100. Medical Teacher. 2016;38:3–17. doi: 10.3109/0142159X.2015.1072619
162
Gorania R. Situational judgement stress. British Dental Journal. 2021;231:426. doi: 10.1038/s41415-021-3577-8
163
Royal College of Physicians. Advancing medical professionalism. 2018. https://www.healthcarevalues.ox.ac.uk/files/ampsummarypdf
164
Rimmer A. Situational judgment test is scrapped under new system for allocating foundation training places. BMJ. Published Online First: 2 June 2023. doi: 10.1136/bmj.p1269
165
McKinley DW, Norcini JJ. How to set standards on performance-based examinations: AMEE Guide No. 85. Medical Teacher. 2014;36:97–110. doi: 10.3109/0142159X.2013.853119
166
Ben-David MF. AMEE Guide No. 18: Standard setting in student assessment. Medical Teacher. 2000;22:120–30. doi: 10.1080/01421590078526
167
De Champlain AF. Standard Setting Methods in Medical Education. In: Swanwick T, ed. Understanding Medical Education. Oxford, UK: Wiley-Blackwell 2019:347–59.
168
Cohen-Schotanus J, van der Vleuten CPM. A standard setting method with the best performing students as point of reference: Practical and affordable. Medical Teacher. 2010;32:154–60. doi: 10.3109/01421590903196979
169
Downing SM, Tekian A, Yudkowsky R. RESEARCH METHODOLOGY: Procedures for Establishing Defensible Absolute Passing Scores on Performance Examinations in Health Professions Education. Teaching and Learning in Medicine. 2006;18:50–7. doi: 10.1207/s15328015tlm1801_11
170
Hofstee WKB. The Case for Compromise in Educational Selection and Grading. On Educational Testing. Published Online First: 1984.
171
Fowell SL, Fewtrell R, McLaughlin PJ. Estimating the Minimum Number of Judges Required for Test-centred Standard Setting on Written Assessments. Do Discussion and Iteration have an Influence? Advances in Health Sciences Education. 2008;13:11–24. doi: 10.1007/s10459-006-9027-1
172
Taylor CA. Development of a modified Cohen method of standard setting. Medical Teacher. 2011;33:e678–82. doi: 10.3109/0142159X.2011.611192
173
Karantonis A, Sireci SG. The Bookmark Standard-Setting Method: A Literature Review. Educational Measurement: Issues and Practice. 2006;25:4–12. doi: 10.1111/j.1745-3992.2006.00047.x
174
Puryer J, O’Sullivan D. An introduction to standard setting methods in dentistry. BDJ. 2015;219:355–8. doi: 10.1038/sj.bdj.2015.755
175
Linn AMJ, Tonkin A, Duggan P. Standard setting of script concordance tests using an adapted Nedelsky approach. Medical Teacher. 2013;35:314–9. doi: 10.3109/0142159X.2012.746446
176
Woodhouse L. Comparison of Cohen and Angoff methods of standard setting: is Angoff worth it? European Board of Medical Assessors Annual Academic Conference: Crossing Boundaries: Assessment in Medical Education.
177
Plake BS, Impara JC, Irwin PM. Consistency of Angoff-Based Predictions of Item Performance: Evidence of Technical Quality of Results from the Angoff Standard Setting Method. Journal of Educational Measurement. 2000;37.
178
McManus IC. Implementing statistical equating for MRCP(UK) Parts 1 and 2. BMC Medical Education. 2014;14.
179
Wood DF. Formative Assessment. In: Swanwick T, ed. Understanding Medical Education. Oxford, UK: John Wiley & Sons, Ltd 2019:317–28.
180
Bing-You RG. Why Medical Educators May Be Failing at Feedback. JAMA. 2009;302. doi: 10.1001/jama.2009.1393
181
Ramani S, Krackov SK. Twelve tips for giving feedback effectively in the clinical environment. Medical Teacher. 2012;34:787–91. doi: 10.3109/0142159X.2012.684916
182
Van De Ridder JMM, Stokking KM, McGaghie WC, et al. What is feedback in clinical education? Medical Education. 2008;42:189–97. doi: 10.1111/j.1365-2923.2007.02973.x
183
Rushton A. Formative assessment: a key to deep learning? Medical Teacher. 2005;27:509–13. doi: 10.1080/01421590500129159
184
Sender Liberman A, Liberman M, Steinert Y, et al. Surgery residents and attending surgeons have different perceptions of feedback. Medical Teacher. 2005;27:470–2. doi: 10.1080/01421590500129183
185
Duers LE, Brown N. An exploration of student nurses’ experiences of formative assessment. Nurse Education Today. 2009;29:654–9. doi: 10.1016/j.nedt.2009.02.007
186
Olson BL, McDonald JL. Influence of Online Formative Assessment Upon Student Learning in Biomedical Science Courses. Journal of Dental Education. 2004;68:656–9. doi: 10.1002/j.0022-0337.2004.68.6.tb03783.x
187
Garrison C, Ehringhaus M. Formative and Summative Assessments in the Classroom. Published Online First: 2007.
188
Sadler DR. Formative Assessment: revisiting the territory. Assessment in Education: Principles, Policy & Practice. 1998;5:77–84. doi: 10.1080/0969595980050104
189
Ditchfield C. How do learners make sense of the formative assessment opportunities available to inform their learning in a PBL course? 2007.
190
Weaver MR. Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education. 2006;31:379–94. doi: 10.1080/02602930500353061
191
Ferrell G. Supporting assessment and feedback practice with technology: from tinkering to transformation. 2013.
192
Black P, Wiliam D. Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice. 1998;5:7–74. doi: 10.1080/0969595980050102
193
Sadler DR. Formative assessment and the design of instructional systems. Instructional Science. 1989;18:119–44. doi: 10.1007/BF00117714
194
Elnicki DM, Layne RD, Ogden PE, et al. Oral versus written feedback in medical clinic. Journal of General Internal Medicine. 1998;13:155–8. doi: 10.1046/j.1525-1497.1998.00049.x
195
Norcross WA. The Consultation: An Approach to Learning and Teaching. JAMA: The Journal of the American Medical Association. 1985;253. doi: 10.1001/jama.1985.03350270123038
196
Jackson JL, Kay C, Jackson WC, et al. The Quality of Written Feedback by Attendings of Internal Medicine Residents. Journal of General Internal Medicine. 2015;30:973–8. doi: 10.1007/s11606-015-3237-2
197
Rudolph J, Raemer D, Shapiro J. We know what they did wrong, but not why: the case for ‘frame-based’ feedback. The Clinical Teacher. 2013;10:186–9. doi: 10.1111/j.1743-498X.2012.00636.x
198
Scally G, Donaldson LJ. Looking forward: Clinical governance and the drive for quality improvement in the new NHS in England. BMJ. 1998;317:61–5. doi: 10.1136/bmj.317.7150.61
199
Shaw S. Research governance: where did it come from, what does it mean? Journal of the Royal Society of Medicine. 2005;98:496–502.
200
Dimensions of Quality. 2010.
201
Anderson D, Ackerman-Anderson LS. Beyond change management: how to achieve breakthrough results through conscious change leadership. 2nd ed. San Francisco: Pfeiffer 2010.
202
Hay D, Kinchin I, Lygo‐Baker S. Making learning visible: the role of concept mapping in higher education. Studies in Higher Education. 2008;33:295–311. doi: 10.1080/03075070802049251
203
Hay DB, Tan PL, Whaites E. Non‐traditional learners in higher education: comparison of a traditional MCQ examination with concept mapping to assess learning in a dental radiological science course. Assessment & Evaluation in Higher Education. 2010;35:577–95. doi: 10.1080/02602931003782525
204
Richstone L, Schwartz MJ, Seideman C, et al. Eye Metrics as an Objective Assessment of Surgical Skill. Annals of Surgery. 2010;252:177–82. doi: 10.1097/SLA.0b013e3181e464fb
205
Suetsugu N, Ohki M, Kaku T. Quantitative Analysis of Nursing Observation Employing a Portable Eye-Tracker. Open Journal of Nursing. 2016;6:53–61. doi: 10.4236/ojn.2016.61006
206
Gould J, Day P. Hearing you loud and clear: student perspectives of audio feedback in higher education. Assessment & Evaluation in Higher Education. 2013;38:554–66. doi: 10.1080/02602938.2012.660131
207
Frost J, de Pont G, Brailsford I. Expanding assessment methods and moments in history. Assessment & Evaluation in Higher Education. 2012;37:293–304. doi: 10.1080/02602938.2010.531247
208
Harrison CJ, Molyneux AJ, Blackwell S, et al. How we give personalised audio feedback after summative OSCEs. Medical Teacher. 2015;37:323–6. doi: 10.3109/0142159X.2014.932901
209
Voelkel S, Mello LV. Audio Feedback – Better Feedback? Bioscience Education. 2014;22:16–30. doi: 10.11120/beej.2014.00022
210
Ashraf H, Sodergren MH, Merali N, et al. Eye-tracking technology in medical education: A systematic review. Medical Teacher. 2018;40:62–9. doi: 10.1080/0142159X.2017.1391373
211
Mayer RE. Cognitive Learning. Encyclopedia of the sciences of learning. [S.l.]: Springer 2012.
212
Ho VW, Harris PG, Kumar RK, et al. Knowledge maps: a tool for online assessment with automated feedback. Medical Education Online. 2018;23. doi: 10.1080/10872981.2018.1457394
213
Guraya SY. The Desired Concept Maps and Goal Setting for Assessing Professionalism in Medicine. Journal of Clinical and Diagnostic Research. Published Online First: 2016. doi: 10.7860/JCDR/2016/19917.7832
214
Kassab SE, Fida M, Radwan A, et al. Generalisability theory analyses of concept mapping assessment scores in a problem-based medical curriculum. Medical Education. 2016;50:730–7. doi: 10.1111/medu.13054
215
Courteille O, Bergin R, Courteille O, et al. The use of a virtual patient case in an OSCE-based exam – A pilot study. Medical Teacher. 2008;30:e66–76. doi: 10.1080/01421590801910216
216
Downing SM. Guessing on selected-response examinations. Medical Education. 2003;37:670–1. doi: 10.1046/j.1365-2923.2003.01585.x
217
Tonni I, Gadbury‐Amyot CC, Govaerts M, et al. ADEA‐ADEE Shaping the Future of Dental Education III. Journal of Dental Education. 2020;84:97–104. doi: 10.1002/jdd.12024