1.
Ribeiro, C. dos S., van de Burgwal, L.H.M., Regeer, B.J.: Overcoming challenges for designing and implementing the One Health approach: A systematic review of the literature. One Health. 7, (2019). https://doi.org/10.1016/j.onehlt.2019.100085.
2.
Hodges, B.D.: A practical guide for medical teachers. Elsevier, Edinburgh (2017).
3.
Association for the Study of Medical Education: Understanding medical education: evidence, theory, and practice. Wiley-Blackwell, Hoboken, NJ (2019).
4.
Marshall, S. (ed.): A handbook for teaching and learning in higher education: enhancing academic practice. Routledge, Abingdon, Oxon (2020).
5.
Holmboe, E.S., Durning, S.J.: Practical Guide to the Assessment of Clinical Competence. (2024).
6.
Norcini, J., Anderson, B., Bollela, V., Burch, V., Costa, M.J., Duvivier, R., Galbraith, R., Hays, R., Kent, A., Perrott, V., Roberts, T.: Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Medical Teacher. 33, 206–214 (2011). https://doi.org/10.3109/0142159X.2011.551559.
7.
van der Vleuten, C.P.M.: The assessment of professional competence: Developments, research and practical implications. Advances in Health Sciences Education. 1, 41–67 (1996). https://doi.org/10.1007/BF00596229.
8.
van der Vleuten, C.P.M., Schuwirth, L.W.T.: Assessing professional competence: from methods to programmes. Medical Education. 39, 309–317 (2005). https://doi.org/10.1111/j.1365-2929.2005.02094.x.
9.
Schuwirth, L.W., van der Vleuten, C.P.: How to Design a Useful Test: The Principles of Assessment. In: Swanwick, T. (ed.) Understanding Medical Education. pp. 277–289. Wiley-Blackwell, Oxford, UK (2019). https://doi.org/10.1002/9781119373780.ch20.
10.
van der Vleuten, C.P.M., Schuwirth, L.W.T., Driessen, E.W., Dijkstra, J., Tigelaar, D., Baartman, L.K.J., van Tartwijk, J.: A model for programmatic assessment fit for purpose. Medical Teacher. 34, 205–214 (2012). https://doi.org/10.3109/0142159X.2012.652239.
11.
Biggs, J.: Enhancing Teaching through Constructive Alignment. Higher Education. 32, 347–364 (1996).
12.
Miller, G.: The assessment of clinical skills/competence/performance. Academic Medicine. 65, (1990).
13.
Schuwirth, L.W.T., van der Vleuten, C.P.M.: Programmatic assessment and Kane’s validity perspective. Medical Education. 46, 38–48 (2012). https://doi.org/10.1111/j.1365-2923.2011.04098.x.
14.
Epstein, R.M.: Defining and Assessing Professional Competence. JAMA. 287, (2002). https://doi.org/10.1001/jama.287.2.226.
15.
ten Cate, O.: Nuts and Bolts of Entrustable Professional Activities. Journal of Graduate Medical Education. 5, 157–158 (2013). https://doi.org/10.4300/JGME-D-12-00380.1.
16.
ten Cate, O.: Competency-Based Postgraduate Medical Education: Past, Present and Future. GMS Journal for Medical Education. 34, (2017). https://doi.org/10.3205/zma001146.
17.
Cruess, R.L., Cruess, S.R., Steinert, Y.: Amending Miller’s Pyramid to Include Professional Identity Formation. Academic Medicine. 91, 180–185 (2016). https://doi.org/10.1097/ACM.0000000000000913.
18.
Cobb, K.A., Brown, G., Jaarsma, D.A.D.C., Hammond, R.A.: The educational impact of assessment: A comparison of DOPS and MCQs. Medical Teacher. 35, e1598–e1607 (2013). https://doi.org/10.3109/0142159X.2013.803061.
19.
Jolly, B., Dalton, M.J.: Written Assessment. In: Swanwick, T., Forrest, K., and O’Brien, B.C. (eds) Understanding Medical Education. pp. 291–317. John Wiley & Sons, Ltd, Chichester, UK (2018). https://doi.org/10.1002/9781119373780.ch21.
20.
Jolly, B.: Written Assessment. In: Swanwick, T. (ed.) Understanding Medical Education. pp. 261–261. Wiley-Blackwell, Oxford, UK (2019). https://doi.org/10.1002/9781119373780.ch21.
21.
Epstein, R.M.: Assessment in Medical Education. New England Journal of Medicine. 356, 387–396 (2007). https://doi.org/10.1056/NEJMra054784.
22.
Hift, R.J.: Should essays and other “open-ended”-type questions retain a place in written summative assessment in clinical medicine? BMC Medical Education. 14, (2014). https://doi.org/10.1186/s12909-014-0249-2.
23.
Paniagua, M., Swygert, K. (eds): The Gold Book – constructing written test questions for the basic and clinical sciences, https://www.nbme.org/sites/default/files/2020-01/IWW_Gold_Book.pdf.
24.
Schuwirth, L.W.T., van der Vleuten, C.P.M.: ABC of learning and teaching in medicine: Written assessment. BMJ. 326, 643–645 (2003). https://doi.org/10.1136/bmj.326.7390.643.
25.
Schuwirth, L.W.T., van der Vleuten, C.P.M.: Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education. 38, 974–979 (2004). https://doi.org/10.1111/j.1365-2929.2004.01916.x.
26.
Schuwirth, L.W.T., van der Vleuten, C.P.M.: General overview of the theories used in assessment: AMEE Guide No. 57. Medical Teacher. 33, 783–797 (2011). https://doi.org/10.3109/0142159X.2011.611022.
27.
Charlin, B., Roy, L., Brailovsky, C., Goulet, F., van der Vleuten, C.: The Script Concordance Test: A Tool to Assess the Reflective Clinician. Teaching and Learning in Medicine. 12, 189–195 (2000). https://doi.org/10.1207/S15328015TLM1204_5.
28.
Fournier, J., Demeester, A., Charlin, B.: Script Concordance Tests: Guidelines for Construction. BMC Medical Informatics and Decision Making. 8, (2008). https://doi.org/10.1186/1472-6947-8-18.
29.
Case, S.M., Swanson, D.B.: Extended‐matching items: A practical alternative to free‐response questions. Teaching and Learning in Medicine. 5, 107–115 (1993). https://doi.org/10.1080/10401339309539601.
30.
Dory, V., Gagnon, R., Vanpee, D., Charlin, B.: How to construct and implement script concordance tests: insights from a systematic review. Medical Education. 46, 552–563 (2012). https://doi.org/10.1111/j.1365-2923.2011.04211.x.
31.
Farmer, E.A., Page, G.: A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education. 39, 1188–1194 (2005). https://doi.org/10.1111/j.1365-2929.2005.02339.x.
32.
Fenderson, B.: The virtues of extended matching and uncued tests as alternatives to multiple choice questions. Human Pathology. 28, 526–532 (1997). https://doi.org/10.1016/S0046-8177(97)90073-3.
33.
Haladyna, T.M., Downing, S.M., Rodriguez, M.C.: A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Applied Measurement in Education. 15, 309–333 (2002). https://doi.org/10.1207/S15324818AME1503_5.
34.
McCoubrie, P.: Improving the fairness of multiple-choice questions: a literature review. Medical Teacher. 26, 709–712 (2004). https://doi.org/10.1080/01421590400013495.
35.
Miller, M.D., Linn, R.L., Gronlund, N.E.: Measurement and assessment in teaching. Pearson Education, Boston, Mass (2013).
36.
Palmer, E.J., Devitt, P.G.: Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions?: research paper. BMC Medical Education. 7, (2007). https://doi.org/10.1186/1472-6920-7-49.
37.
Lubarsky, S., Dory, V., Meterissian, S., Lambert, C., Gagnon, R.: Examining the effects of gaming and guessing on script concordance test scores. Perspectives on Medical Education. 7, 174–181 (2018). https://doi.org/10.1007/s40037-018-0435-8.
38.
Anderson, L.W., Bloom, B.S.: A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. Longman, New York, N.Y. (2001).
39.
Sam, A.H., Field, S.M., Collares, C.F., van der Vleuten, C.P.M., Wass, V.J., Melville, C., Harris, J., Meeran, K.: Very-short-answer questions: reliability, discrimination and acceptability. Medical Education. 52, 447–455 (2018). https://doi.org/10.1111/medu.13504.
40.
Cotton, D.R.E., Cotton, P.A., Shipway, J.R.: Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International. 61, 228–239 (2024). https://doi.org/10.1080/14703297.2023.2190148.
41.
Boursicot, K.A.M., Roberts, T.E., Burdick, W.P.: Structured Assessments of Clinical Competence. In: Swanwick, T., Forrest, K., and O’Brien, B.C. (eds) Understanding Medical Education. pp. 335–345. John Wiley & Sons, Ltd, Chichester, UK (2018). https://doi.org/10.1002/9781119373780.ch23.
42.
Schoonheim-Klein, M., Muijtjens, A., Habets, L., Manogue, M., van der Vleuten, C., van der Velden, U.: Who will pass the dental OSCE? Comparison of the Angoff and the borderline regression standard setting methods. European Journal of Dental Education. 13, 162–171 (2009). https://doi.org/10.1111/j.1600-0579.2008.00568.x.
43.
Regehr, G., MacRae, H., Reznick, R.K., Szalay, D.: Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Academic Medicine. 73, (1998).
44.
Harden, R.M.: Misconceptions and the OSCE. Medical Teacher. 37, 608–610 (2015). https://doi.org/10.3109/0142159X.2015.1042443.
45.
Carraccio, C., Wolfsthal, S.D., Englander, R., Ferentz, K., Martin, C.: Shifting Paradigms: From Flexner to Competencies. Academic Medicine. 77, (2002).
46.
Rushforth, H.E.: Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Education Today. 27, 481–490 (2007). https://doi.org/10.1016/j.nedt.2006.08.009.
47.
Spielman, A., Fulmer, T., Eisenberg, E., Alfano, M.: Dentistry, Nursing, and Medicine: A Comparison of Core Competencies. Journal of Dental Education. 69, 1257–1271 (2005).
48.
Harden, R.M., Stevenson, M., Downie, W.W., Wilson, G.M.: Assessment of clinical competence using objective structured examination. BMJ. 1, 447–451 (1975). https://doi.org/10.1136/bmj.1.5955.447.
49.
Watson, R., Stimpson, A., Topping, A., Porock, D.: Clinical competence assessment in nursing: a systematic review of the literature. Journal of Advanced Nursing. 39, 421–431 (2002). https://doi.org/10.1046/j.1365-2648.2002.02307.x.
50.
Williams, D.M., Davies, S., Horner, M., Handley, J.: Peer and near-peer OSCE examiners. Medical Teacher. 38, 212–213 (2016). https://doi.org/10.3109/0142159X.2015.1072266.
51.
Brown, C., Ross, S., Cleland, J., Walsh, K.: Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Medical Teacher. 37, 653–659 (2015). https://doi.org/10.3109/0142159X.2015.1033389.
52.
Meskell, P., Burke, E., Kropmans, T.J.B., Byrne, E., Setyonugroho, W., Kennedy, K.M.: Back to the future: An online OSCE Management Information System for nursing OSCEs. Nurse Education Today. 35, 1091–1096 (2015). https://doi.org/10.1016/j.nedt.2015.06.010.
53.
Tavakol, M., Doody, G.A.: A novel psychometric programme for the rapid analysis of OSCE data. Medical Teacher. 38, 104–105 (2016). https://doi.org/10.3109/0142159X.2015.1062085.
54.
Eva, K.W., Rosenfeld, J., Reiter, H.I., Norman, G.R.: An admissions OSCE: the multiple mini-interview. Medical Education. 38, 314–326 (2004). https://doi.org/10.1046/j.1365-2923.2004.01776.x.
55.
Lane, P.: Recruitment into training for general practice—the winds of change or a breath of fresh air? BMJ. 331, s153–s153 (2005). https://doi.org/10.1136/bmj.331.7520.s153.
56.
Hodges, B., Regehr, G., McNaughton, N., Tiberius, R., Hanson, M.: OSCE checklists do not capture increasing levels of expertise. Academic Medicine. 74, (1999).
57.
Hodges, B., McIlroy, J.H.: Analytic global OSCE ratings are sensitive to level of training. Medical Education. 37, 1012–1016 (2003). https://doi.org/10.1046/j.1365-2923.2003.01674.x.
58.
Ma, I.W.Y., Zalunardo, N., Pachev, G., Beran, T., Brown, M., Hatala, R., McLaughlin, K.: Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation. Advances in Health Sciences Education. 17, 457–470 (2012). https://doi.org/10.1007/s10459-011-9322-3.
59.
Wood, T.J., Humphrey-Murto, S.M., Norman, G.R.: Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method. Advances in Health Sciences Education. 11, 115–122 (2006). https://doi.org/10.1007/s10459-005-7853-1.
60.
Harden, R.M.: Revisiting ‘Assessment of clinical competence using an objective structured clinical examination (OSCE)’. Medical Education. 50, 376–379 (2016). https://doi.org/10.1111/medu.12801.
61.
Harden, R.M., Lilley, P., Patricio, M., Norman, G.R.: The definitive guide to the OSCE: the Objective Structured Clinical Examination as a performance assessment. Elsevier, Edinburgh (2016).
62.
Denison, A., Bate, E., Thompson, J.: Tablet versus paper marking in assessment: feedback matters. Perspectives on Medical Education. 5, 108–113 (2016). https://doi.org/10.1007/s40037-016-0262-8.
63.
ten Cate, O.: Nuts and Bolts of Entrustable Professional Activities. Journal of Graduate Medical Education. 5, 157–158 (2013). https://doi.org/10.4300/JGME-D-12-00380.1.
64.
Harden, R.M.: Learning outcomes as a tool to assess progression. Medical Teacher. 29, 678–682 (2007). https://doi.org/10.1080/01421590701729955.
65.
Ross, M.: Entrustable professional activities. The Clinical Teacher. 12, 223–225 (2015). https://doi.org/10.1111/tct.12436.
66.
ten Cate, O., Young, J.Q.: The patient handover as an entrustable professional activity: adding meaning in teaching and practice. BMJ Quality & Safety. 21, i9–i12 (2012). https://doi.org/10.1136/bmjqs-2012-001213.
67.
Aylward, M., Nixon, J., Gladding, S.: An Entrustable Professional Activity (EPA) for Handoffs as a Model for EPA Assessment Development. Academic Medicine. 89, 1335–1340 (2014). https://doi.org/10.1097/ACM.0000000000000317.
68.
Hauer, K.E., Soni, K., Cornett, P., Kohlwes, J., Hollander, H., Ranji, S.R., ten Cate, O., Widera, E., Calton, B., O’Sullivan, P.S.: Developing Entrustable Professional Activities as the Basis for Assessment of Competence in an Internal Medicine Residency: A Feasibility Study. Journal of General Internal Medicine. 28, 1110–1114 (2013). https://doi.org/10.1007/s11606-013-2372-x.
69.
Orsini, C., Binnie, V.I.: Entrustment decisions in dental education: Is it time to start formalising? Medical Teacher. 38, 322–322 (2016). https://doi.org/10.3109/0142159X.2015.1114598.
70.
Auewarakul, C., Downing, S.M., Praditsuwan, R., Jaturatamrong, U.: Item Analysis to Improve Reliability for an Internal Medicine Undergraduate OSCE. Advances in Health Sciences Education. 10, 105–113 (2005). https://doi.org/10.1007/s10459-005-2315-3.
71.
Newble, D.I., Swanson, D.B.: Psychometric characteristics of the objective structured clinical examination. Medical Education. 22, 325–334 (1988). https://doi.org/10.1111/j.1365-2923.1988.tb00761.x.
72.
Sturpe, D.A.: Objective Structured Clinical Examinations in Doctor of Pharmacy Programs in the United States. American Journal of Pharmaceutical Education. 74, (2010). https://doi.org/10.5688/aj7408148.
73.
Snell, L.S., Frank, J.R.: Competencies, the tea bag model, and the end of time. Medical Teacher. 32, 629–630 (2010). https://doi.org/10.3109/0142159X.2010.500707.
74.
Gravina, E.W.: Competency-Based Education and Its Effect on Nursing Education: A Literature Review. Teaching and Learning in Nursing. 12, 117–121 (2017). https://doi.org/10.1016/j.teln.2016.11.004.
75.
Read, E.K., Bell, C., Rhind, S., Hecker, K.G.: The Use of Global Rating Scales for OSCEs in Veterinary Medicine. PLOS ONE. 10, (2015). https://doi.org/10.1371/journal.pone.0121000.
76.
Wood, T.J., Pugh, D.: Are rating scales really better than checklists for measuring increasing levels of expertise? Medical Teacher. 42, 46–51 (2020). https://doi.org/10.1080/0142159X.2019.1652260.
77.
Hagel, C.M., Hall, A.K., Dagnone, J.D.: Queen’s University Emergency Medicine Simulation OSCE: an Advance in Competency-Based Assessment. CJEM. 18, 230–233 (2016). https://doi.org/10.1017/cem.2015.34.
78.
Tekian, A., Ten Cate, O., Holmboe, E., Roberts, T., Norcini, J.: Entrustment decisions: Implications for curriculum development and assessment. Medical Teacher. 1–7 (2020). https://doi.org/10.1080/0142159X.2020.1733506.
79.
Peters, H., Holzhausen, Y., Boscardin, C., ten Cate, O., Chen, H.C.: Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Medical Teacher. 39, 802–807 (2017). https://doi.org/10.1080/0142159X.2017.1331031.
80.
Kakadia, R., Chen, E., Ohyama, H.: Implementing an online OSCE during the COVID‐19 pandemic. Journal of Dental Education. (2020). https://doi.org/10.1002/jdd.12323.
81.
Ryan, A., Carson, A., Reid, K., Smallwood, D., Judd, T.: Fully online OSCEs: A large cohort case study. MedEdPublish. 9, (2020). https://doi.org/10.15694/mep.2020.000214.1.
82.
Boyle, J.G.: Viva la VOSCE? BMC Medical Education. 20, (2020).
83.
Hopwood, J., Myers, G., Sturrock, A.: Twelve tips for conducting a virtual OSCE. Medical Teacher. 43, 633–636 (2021). https://doi.org/10.1080/0142159X.2020.1830961.
84.
Norcini, J.J.: The Mini-CEX: A Method for Assessing Clinical Skills. Annals of Internal Medicine. 138, (2003). https://doi.org/10.7326/0003-4819-138-6-200303180-00012.
85.
Kessel, D., Jenkins, J., Neville, E.: Workplace based assessments are no more. BMJ. (2012). https://doi.org/10.1136/bmj.e6193.
86.
Norcini, J., Burch, V.: Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher. 29, 855–871 (2007). https://doi.org/10.1080/01421590701775453.
87.
Elstein, A.S., Sprafka, S.A., Shulman, L.S.: Medical Problem Solving: An Analysis of Clinical Reasoning. Harvard University Press (2013).
88.
Noel, G.L.: How Well Do Internal Medicine Faculty Members Evaluate the Clinical Skills of Residents? Annals of Internal Medicine. 117, (1992). https://doi.org/10.7326/0003-4819-117-9-757.
89.
Kogan, J.R., Bellini, L.M., Shea, J.A.: Feasibility, Reliability, and Validity of the Mini-Clinical Evaluation Exercise (mCEX) in a Medicine Core Clerkship. Academic Medicine. 78, (2003).
90.
Durning, S.J., Cation, L.J., Markert, R.J., Pangaro, L.N.: Assessing the Reliability and Validity of the Mini-Clinical Evaluation Exercise for Internal Medicine Residency Training. Academic Medicine. 77, (2002).
91.
Holmboe, E.S., Huot, S., Chung, J., Norcini, J., Hawkins, R.E.: Construct Validity of the MiniClinical Evaluation Exercise (miniCEX). Academic Medicine. 78, (2003).
92.
Torsney, K.M., Cocker, D.M., Slesser, A.A.P.: The Modern Surgeon and Competency Assessment: Are the Workplace-Based Assessments Evidence-Based? World Journal of Surgery. 39, 623–633 (2015). https://doi.org/10.1007/s00268-014-2875-6.
93.
Mitchell, C., Bhat, S., Herbert, A., Baker, P.: Workplace-based assessments of junior doctors: do scores predict training difficulties? Medical Education. 45, 1190–1198 (2011). https://doi.org/10.1111/j.1365-2923.2011.04056.x.
94.
Williams, R.G., Verhulst, S., Colliver, J.A., Dunnington, G.L.: Assuring the reliability of resident performance appraisals: More items or more observations? Surgery. 137, 141–147 (2005). https://doi.org/10.1016/j.surg.2004.06.011.
95.
Murphy, D.J., Bruce, D.A., Mercer, S.W., Eva, K.W.: The reliability of workplace-based assessment in postgraduate medical education and training: a national evaluation in general practice in the United Kingdom. Advances in Health Sciences Education. 14, 219–232 (2009). https://doi.org/10.1007/s10459-008-9104-8.
96.
Archer, J.C.: Use of SPRAT for peer review of paediatricians in training. BMJ. 330, 1251–1253 (2005). https://doi.org/10.1136/bmj.38447.610451.8F.
97.
Quantrill, S.J., Tun, J.K.: Workplace-based assessment as an educational tool. Guide supplement 31.5 – Viewpoint. Medical Teacher. 34, 417–418 (2012). https://doi.org/10.3109/0142159X.2012.668234.
98.
Hurst, Y.K., Prescott-Clements, L.E., Rennie, J.S.: The patient assessment questionnaire: A new instrument for evaluating the interpersonal skills of vocational dental practitioners. British Dental Journal. 197, 497–500 (2004). https://doi.org/10.1038/sj.bdj.4811750.
99.
Humphrey-Murto, S., Côté, M., Pugh, D., Wood, T.J.: Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise. Teaching and Learning in Medicine. 30, 152–161 (2018). https://doi.org/10.1080/10401334.2017.1387553.
100.
Rekman, J., Hamstra, S.J., Dudek, N., Wood, T., Seabrook, C., Gofton, W.: A New Instrument for Assessing Resident Competence in Surgical Clinic: The Ottawa Clinic Assessment Tool. Journal of Surgical Education. 73, 575–582 (2016). https://doi.org/10.1016/j.jsurg.2016.02.003.
101.
Sutherland, R.M., Reid, K.J., Chiavaroli, N.G., Smallwood, D., McColl, G.J.: Assessing Diagnostic Reasoning Using a Standardized Case-Based Discussion. Journal of Medical Education and Curricular Development. 6, (2019). https://doi.org/10.1177/2382120519849411.
102.
Driessen, E.W., Muijtjens, A.M.M., van Tartwijk, J., van der Vleuten, C.P.M.: Web- or paper-based portfolios: is there a difference? Medical Education. 41, 1067–1073 (2007). https://doi.org/10.1111/j.1365-2923.2007.02859.x.
103.
Driessen, E., van Tartwijk, J.: Portfolios in personal and professional development. In: Swanwick, T. (ed.) Understanding Medical Education. pp. 255–262. Wiley-Blackwell, Chichester, UK (2013). https://doi.org/10.1002/9781119373780.ch18.
104.
Siau, K., Dunckley, P., Valori, R., Feeney, M., Hawkes, N., Anderson, J., Beales, I., Wells, C., Thomas-Gibson, S., Johnson, G.: Changes in scoring of Direct Observation of Procedural Skills (DOPS) forms and the impact on competence assessment. Endoscopy. 50, 770–778 (2018). https://doi.org/10.1055/a-0576-6667.
105.
Martinsen, S.S.S., Espeland, T., Berg, E.A.R., Samstad, E., Lillebo, B., Slørdahl, T.S.: Examining the educational impact of the mini-CEX: a randomised controlled study. BMC Medical Education. 21, (2021). https://doi.org/10.1186/s12909-021-02670-3.
106.
Cohen, L., Manion, L., Morrison, K.: Research methods in education. Routledge, London (2018).
107.
Joanna Briggs Institute QARI, https://jbi.global/.
108.
Buckley, S., Coleman, J., Davison, I., Khan, K.S., Zamora, J., Malick, S., Morley, D., Pollard, D., Ashcroft, T., Popovic, C., Sayers, J.: The educational effects of portfolios on undergraduate student learning: A Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 11. Medical Teacher. 31, 282–298 (2009). https://doi.org/10.1080/01421590902889897.
109.
Brookfield, S.: Developing critical thinkers: challenging adults to explore alternative ways of thinking and acting. Open University Press, Milton Keynes (1987).
110.
Burls, A.: What is critical appraisal?, https://www.academia.edu/92786872/What_Is_Critical_Appraisal, (2009).
111.
The Campbell Collaboration, http://www.campbellcollaboration.org/.
112.
CASP Critical Appraisal Skills Programme Oxford UK, http://www.casp-uk.net/.
113.
Cochrane, http://www.cochrane.org/.
114.
Kee, F., Bickle, I.: Critical thinking and critical appraisal: the chicken and the egg? QJM. 97, 609–614 (2004). https://doi.org/10.1093/qjmed/hch099.
115.
Da Silva, A., Dennick, R.: Corpus analysis of problem-based learning transcripts: an exploratory study. Medical Education. 44, 280–288 (2010). https://doi.org/10.1111/j.1365-2923.2009.03575.x.
116.
Garrison, D.R.: Critical thinking and adult education: a conceptual model for developing critical thinking in adult learners. International Journal of Lifelong Education. 10, 287–303 (1991). https://doi.org/10.1080/0260137910100403.
117.
Hammick, M., Dornan, T., Steinert, Y.: Conducting a best evidence systematic review. Part 1: From idea to data coding. BEME Guide No. 13. Medical Teacher. 32, 3–15 (2010). https://doi.org/10.3109/01421590903414245.
118.
Horsley, T., Hyde, C., Santesso, N., Parkes, J., Milne, R., Stewart, R.: Teaching critical appraisal skills in healthcare settings. Cochrane Database of Systematic Reviews. (2011). https://doi.org/10.1002/14651858.CD001270.pub2.
119.
Huang, G.C., Newman, L.R., Schwartzstein, R.M.: Critical Thinking in Health Professions Education: Summary and Consensus Statements of the Millennium Conference 2011. Teaching and Learning in Medicine. 26, 95–102 (2014). https://doi.org/10.1080/10401334.2013.857335.
120.
Jenicek, M.: The hard art of soft science: Evidence-Based Medicine, Reasoned Medicine or both? Journal of Evaluation in Clinical Practice. 12, 410–419 (2006). https://doi.org/10.1111/j.1365-2753.2006.00718.x.
121.
Kirkpatrick, D.: Great Ideas Revisited: Revisiting Kirkpatrick’s Four-Level Model. Training and Development. 50, 54–59 (1996).
122.
Missimer, C.A.: Good arguments: an introduction to critical thinking. Prentice Hall, Englewood Cliffs, N.J. (1995).
123.
Moore, T.J.: Critical thinking and disciplinary thinking: a continuing debate. Higher Education Research & Development. 30, 261–274 (2011). https://doi.org/10.1080/07294360.2010.501328.
124.
Paul, R.: Critical thinking: how to prepare students for a rapidly changing world. Foundation for Critical Thinking (1995).
125.
Paul, R., Elder, L.: The Miniature Guide to Critical Thinking: Concepts and Tools, https://www.criticalthinking.org/files/Concepts_Tools.pdf, (2006).
126.
Yardley, S., Dornan, T.: Kirkpatrick’s levels and education ‘evidence’. Medical Education. 46, 97–106 (2012). https://doi.org/10.1111/j.1365-2923.2011.04076.x.
127.
Devine, O.P., Harborne, A.C., McManus, I.C.: Assessment at UK medical schools varies substantially in volume, type and intensity and correlates with postgraduate attainment. BMC Medical Education. 15, (2015). https://doi.org/10.1186/s12909-015-0428-9.
128.
Ertmer, P.A., Richardson, J.C., Belland, B., Camin, D., Connolly, P., Coulthard, G., Lei, K., Mong, C.: Using Peer Feedback to Enhance the Quality of Student Online Postings: An Exploratory Study. Journal of Computer-Mediated Communication. 12, 412–433 (2007). https://doi.org/10.1111/j.1083-6101.2007.00331.x.
129.
Nicol, D.J., Macfarlane‐Dick, D.: Formative assessment and self‐regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education. 31, 199–218 (2006). https://doi.org/10.1080/03075070600572090.
130.
Piedra, N., Chicaiza, J., Lopez, J., Romero, A., Tovar, E.: Measuring collaboration and creativity skills through rubrics: Experience from UTPL collaborative social networks course. In: IEEE EDUCON 2010 Conference. pp. 1511–1516. IEEE (2010). https://doi.org/10.1109/EDUCON.2010.5492349.
131.
De Wever, B., Van Keer, H., Schellens, T., Valcke, M.: Assessing collaboration in a wiki: The reliability of university students’ peer assessment. The Internet and Higher Education. 14, 201–206 (2011). https://doi.org/10.1016/j.iheduc.2011.07.003.
132.
Verkerk, M.A., de Bree, M.J., Mourits, M.J.E.: Reflective professionalism: interpreting CanMEDS’ ‘professionalism’. Journal of Medical Ethics. 33, 663–666 (2007). https://doi.org/10.1136/jme.2006.017954.
133.
Cleland, J.A., Knight, L.V., Rees, C.E., Tracey, S., Bond, C.M.: Is it me or is it them? Factors that influence the passing of underperforming students. Medical Education. 42, 800–809 (2008). https://doi.org/10.1111/j.1365-2923.2008.03113.x.
134.
Cruess, R.: The Professionalism Mini-Evaluation Exercise: A Preliminary Investigation. Academic Medicine. 81, (2006).
135.
Goldie, J.: Assessment of professionalism: A consolidation of current thinking. Medical Teacher. 35, e952–e956 (2013). https://doi.org/10.3109/0142159X.2012.714888.
136.
van Mook, W.N.K.A., Gorter, S.L., O’Sullivan, H., Wass, V., Schuwirth, L.W., van der Vleuten, C.P.M.: Approaches to professional behaviour assessment: Tools in the professionalism toolbox. European Journal of Internal Medicine. 20, e153–e157 (2009). https://doi.org/10.1016/j.ejim.2009.07.012.
137.
Ginsburg, S., Regehr, G., Lingard, L.: Basing the evaluation of professionalism on observable behaviours: a cautionary tale. Academic Medicine. 79, S1–S4 (2004).
138.
Hodges, B.D., Ginsburg, S., Cruess, R., Cruess, S., Delport, R., Hafferty, F., Ho, M.-J., Holmboe, E., Holtman, M., Ohbu, S., Rees, C., Ten Cate, O., Tsugawa, Y., Van Mook, W., Wass, V., Wilkinson, T., Wade, W.: Assessment of professionalism: Recommendations from the Ottawa 2010 Conference. Medical Teacher. 33, 354–363 (2011). https://doi.org/10.3109/0142159X.2011.577300.
139.
Schubert, S., Ortwein, H., Dumitsch, A., Schwantes, U., Wilhelm, O., Kiessling, C.: A situational judgement test of professional behaviour: development and validation. Medical Teacher. 30, 528–533 (2008). https://doi.org/10.1080/01421590801952994.
140.
Arnold, L., Shue, C.K., Kritt, B., Ginsburg, S., Stern, D.T.: Medical students’ views on peer assessment of professionalism. Journal of General Internal Medicine. 20, 819–824 (2005). https://doi.org/10.1111/j.1525-1497.2005.0162.x.
141.
Arnold, L., Shue, C.K., Kalishman, S., Prislin, M., Pohl, C., Pohl, H., Stern, D.T.: Can There Be a Single System for Peer Assessment of Professionalism among Medical Students? A Multi-Institutional Study. Academic Medicine. 82, 578–586 (2007). https://doi.org/10.1097/ACM.0b013e3180555d4e.
142.
Finn, G., Sawdon, M., Clipsham, L., McLachlan, J.: Peer estimation of lack of professionalism correlates with low Conscientiousness Index scores. Medical Education. 43, 960–967 (2009). https://doi.org/10.1111/j.1365-2923.2009.03453.x.
143.
Gaufberg, E., Fitzpatrick, A.: The favour: a professional boundaries OSCE station. Medical Education. 42, 529–530 (2008). https://doi.org/10.1111/j.1365-2923.2008.03067.x.
144.
Ginsburg, S.: Context, Conflict, and Resolution: A New Conceptual Framework for Evaluating Professionalism. Academic Medicine. 75, (2000).
145.
Ginsburg, S., Regehr, G., Mylopoulos, M.: From behaviours to attributions: further concerns regarding the evaluation of professionalism. Medical Education. 43, 414–425 (2009). https://doi.org/10.1111/j.1365-2923.2009.03335.x.
146.
Ginsburg, S., van der Vleuten, C., Eva, K.W., Lingard, L.: Hedging to save face: a linguistic analysis of written comments on in-training evaluation reports. Advances in Health Sciences Education. 21, 175–188 (2016). https://doi.org/10.1007/s10459-015-9622-0.
147.
GMC: Development of generic professional capabilities, http://www.gmc-uk.org/education/23581.asp.
148.
Kelly, M., O’Flynn, S., McLachlan, J., Sawdon, M.A.: The Clinical Conscientiousness Index. Academic Medicine. 87, 1218–1224 (2012). https://doi.org/10.1097/ACM.0b013e3182628499.
149.
McCormack, W.T., Lazarus, C., Stern, D., Small, P.A.: Peer Nomination: A Tool for Identifying Medical Student Exemplars in Clinical Competence and Caring, Evaluated at Three Medical Schools. Academic Medicine. 82, 1033–1039 (2007). https://doi.org/10.1097/01.ACM.0000285345.75528.ee.
150.
McLachlan, J.C., Finn, G., Macnaughton, J.: The Conscientiousness Index: A Novel Tool to Explore Students’ Professionalism. Academic Medicine. 84, 559–565 (2009). https://doi.org/10.1097/ACM.0b013e31819fb7ff.
151.
Norcini, J.J.: Peer assessment of competence. Medical Education. 37, 539–543 (2003). https://doi.org/10.1046/j.1365-2923.2003.01536.x.
152.
Papadakis, M.A., Teherani, A., Banach, M.A., Knettler, T.R., Rattner, S.L., Stern, D.T., Veloski, J.J., Hodgson, C.S.: Disciplinary Action by Medical Boards and Prior Behavior in Medical School. New England Journal of Medicine. 353, 2673–2682 (2005). https://doi.org/10.1056/NEJMsa052596.
153.
Pohl, C.A., Hojat, M., Arnold, L.: Peer Nominations as Related to Academic Attainment, Empathy, Personality, and Specialty Interest. Academic Medicine. 86, 747–751 (2011). https://doi.org/10.1097/ACM.0b013e318217e464.
154.
Ramsey, P.G.: Use of Peer Ratings to Evaluate Physician Performance. JAMA: The Journal of the American Medical Association. 269, (1993). https://doi.org/10.1001/jama.1993.03500130069034.
155.
Royal College of Physicians: Doctors in Society: Medical professionalism in a changing world, https://cdn.shopify.com/s/files/1/0924/4392/files/doctors_in_society_reportweb.pdf?15745311214883953343, (2005).
156.
Stern, D.T., Frohna, A.Z., Gruppen, L.D.: The prediction of professional behaviour. Medical Education. 39, 75–82 (2005). https://doi.org/10.1111/j.1365-2929.2004.02035.x.
157.
Stern, D.T.: Measuring medical professionalism. Oxford University Press, New York (2006).
158.
van Mook, W.N.K.A., van Luijk, S.J., O’Sullivan, H., Wass, V., Schuwirth, L.W., van der Vleuten, C.P.M.: General considerations regarding assessment of professional behaviour. European Journal of Internal Medicine. 20, e90–e95 (2009). https://doi.org/10.1016/j.ejim.2008.11.011.
159.
Wilkinson, T.J., Wade, W.B., Knock, L.D.: A Blueprint to Assess Professionalism: Results of a Systematic Review. Academic Medicine. 84, 551–558 (2009). https://doi.org/10.1097/ACM.0b013e31819fbaa2.
160.
Zijlstra-Shaw, S., Robinson, P.G., Roberts, T.: Assessing professionalism within dental education; the need for a definition. European Journal of Dental Education. 16, e128–e136 (2012). https://doi.org/10.1111/j.1600-0579.2011.00687.x.
161.
Patterson, F., Zibarras, L., Ashworth, V.: Situational judgement tests in medical education and training: Research, theory and practice: AMEE Guide No. 100. Medical Teacher. 38, 3–17 (2016). https://doi.org/10.3109/0142159X.2015.1072619.
162.
Gorania, R.: Situational judgement stress. British Dental Journal. 231, 426 (2021). https://doi.org/10.1038/s41415-021-3577-8.
163.
Royal College of Physicians: Advancing medical professionalism, https://www.healthcarevalues.ox.ac.uk/files/ampsummarypdf.
164.
Rimmer, A.: Situational judgment test is scrapped under new system for allocating foundation training places. BMJ. (2023). https://doi.org/10.1136/bmj.p1269.
165.
McKinley, D.W., Norcini, J.J.: How to set standards on performance-based examinations: AMEE Guide No. 85. Medical Teacher. 36, 97–110 (2014). https://doi.org/10.3109/0142159X.2013.853119.
166.
Ben-David, M.F.: AMEE Guide No. 18: Standard setting in student assessment. Medical Teacher. 22, 120–130 (2000). https://doi.org/10.1080/01421590078526.
167.
De Champlain, A.F.: Standard Setting Methods in Medical Education. In: Swanwick, T. (ed.) Understanding Medical Education. pp. 347–359. Wiley-Blackwell, Oxford, UK (2019). https://doi.org/10.1002/9781119373780.ch24.
168.
Cohen-Schotanus, J., van der Vleuten, C.P.M.: A standard setting method with the best performing students as point of reference: Practical and affordable. Medical Teacher. 32, 154–160 (2010). https://doi.org/10.3109/01421590903196979.
169.
Downing, S.M., Tekian, A., Yudkowsky, R.: RESEARCH METHODOLOGY: Procedures for Establishing Defensible Absolute Passing Scores on Performance Examinations in Health Professions Education. Teaching and Learning in Medicine. 18, 50–57 (2006). https://doi.org/10.1207/s15328015tlm1801_11.
170.
Hofstee, W.K.B.: The Case for Compromise in Educational Selection and Grading. On Educational Testing. (1984).
171.
Fowell, S.L., Fewtrell, R., McLaughlin, P.J.: Estimating the Minimum Number of Judges Required for Test-centred Standard Setting on Written Assessments. Do Discussion and Iteration have an Influence? Advances in Health Sciences Education. 13, 11–24 (2008). https://doi.org/10.1007/s10459-006-9027-1.
172.
Taylor, C.A.: Development of a modified Cohen method of standard setting. Medical Teacher. 33, e678–e682 (2011). https://doi.org/10.3109/0142159X.2011.611192.
173.
Karantonis, A., Sireci, S.G.: The Bookmark Standard-Setting Method: A Literature Review. Educational Measurement: Issues and Practice. 25, 4–12 (2006). https://doi.org/10.1111/j.1745-3992.2006.00047.x.
174.
Puryer, J., O’Sullivan, D.: An introduction to standard setting methods in dentistry. BDJ. 219, 355–358 (2015). https://doi.org/10.1038/sj.bdj.2015.755.
175.
Linn, A.M.J., Tonkin, A., Duggan, P.: Standard setting of script concordance tests using an adapted Nedelsky approach. Medical Teacher. 35, 314–319 (2013). https://doi.org/10.3109/0142159X.2012.746446.
176.
Woodhouse, L.: Comparison of Cohen and Angoff methods of standard setting: is Angoff worth it? European Board of Medical Assessors Annual Academic Conference: Crossing Boundaries: Assessment in Medical Education.
177.
Plake, B.S., Impara, J.C., Irwin, P.M.: Consistency of Angoff-Based Predictions of Item Performance: Evidence of Technical Quality of Results from the Angoff Standard Setting Method. Journal of Educational Measurement. 37, (2000).
178.
McManus, I.C.: Implementing statistical equating for MRCP(UK) parts 1 and 2. BMC Medical Education. 14, (2014).
179.
Wood, D.F.: Formative Assessment. In: Swanwick, T. (ed.) Understanding Medical Education. pp. 317–328. John Wiley & Sons, Ltd, Oxford, UK (2019). https://doi.org/10.1002/9781119373780.ch25.
180.
Bing-You, R.G.: Why Medical Educators May Be Failing at Feedback. JAMA. 302, (2009). https://doi.org/10.1001/jama.2009.1393.
181.
Ramani, S., Krackov, S.K.: Twelve tips for giving feedback effectively in the clinical environment. Medical Teacher. 34, 787–791 (2012). https://doi.org/10.3109/0142159X.2012.684916.
182.
Van De Ridder, J.M.M., Stokking, K.M., McGaghie, W.C., Ten Cate, O.T.J.: What is feedback in clinical education? Medical Education. 42, 189–197 (2008). https://doi.org/10.1111/j.1365-2923.2007.02973.x.
183.
Rushton, A.: Formative assessment: a key to deep learning? Medical Teacher. 27, 509–513 (2005). https://doi.org/10.1080/01421590500129159.
184.
Sender Liberman, A., Liberman, M., Steinert, Y., McLeod, P., Meterissian, S.: Surgery residents and attending surgeons have different perceptions of feedback. Medical Teacher. 27, 470–472 (2005). https://doi.org/10.1080/01421590500129183.
185.
Duers, L.E., Brown, N.: An exploration of student nurses’ experiences of formative assessment. Nurse Education Today. 29, 654–659 (2009). https://doi.org/10.1016/j.nedt.2009.02.007.
186.
Olson, B.L., McDonald, J.L.: Influence of Online Formative Assessment Upon Student Learning in Biomedical Science Courses. Journal of Dental Education. 68, 656–659 (2004). https://doi.org/10.1002/j.0022-0337.2004.68.6.tb03783.x.
187.
Garrison, C., Ehringhaus, M.: Formative and Summative Assessments in the Classroom. (2007).
188.
Sadler, D.R.: Formative Assessment: revisiting the territory. Assessment in Education: Principles, Policy & Practice. 5, 77–84 (1998). https://doi.org/10.1080/0969595980050104.
189.
Ditchfield, C.: How do learners make sense of the formative assessment opportunities available to inform their learning in a PBL course. (2007).
190.
Weaver, M.R.: Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education. 31, 379–394 (2006). https://doi.org/10.1080/02602930500353061.
191.
Ferrell, G.: Supporting assessment and feedback practice with technology: from tinkering to transformation, https://repository.jisc.ac.uk/5450/, (2013).
192.
Black, P., Wiliam, D.: Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice. 5, 7–74 (1998). https://doi.org/10.1080/0969595980050102.
193.
Sadler, D.R.: Formative assessment and the design of instructional systems. Instructional Science. 18, 119–144 (1989). https://doi.org/10.1007/BF00117714.
194.
Elnicki, D.M., Layne, R.D., Ogden, P.E., Morris, D.K.: Oral versus written feedback in medical clinic. Journal of General Internal Medicine. 13, 155–158 (1998). https://doi.org/10.1046/j.1525-1497.1998.00049.x.
195.
Norcross, W.A.: The Consultation: An Approach to Learning and Teaching. JAMA: The Journal of the American Medical Association. 253, (1985). https://doi.org/10.1001/jama.1985.03350270123038.
196.
Jackson, J.L., Kay, C., Jackson, W.C., Frank, M.: The Quality of Written Feedback by Attendings of Internal Medicine Residents. Journal of General Internal Medicine. 30, 973–978 (2015). https://doi.org/10.1007/s11606-015-3237-2.
197.
Rudolph, J., Raemer, D., Shapiro, J.: We know what they did wrong, but not why: the case for ‘frame-based’ feedback. The Clinical Teacher. 10, 186–189 (2013). https://doi.org/10.1111/j.1743-498X.2012.00636.x.
198.
Scally, G., Donaldson, L.J.: Looking forward: Clinical governance and the drive for quality improvement in the new NHS in England. BMJ. 317, 61–65 (1998). https://doi.org/10.1136/bmj.317.7150.61.
199.
Shaw, S.: Research governance: where did it come from, what does it mean? Journal of the Royal Society of Medicine. 98, 496–502 (2005).
200.
Dimensions of Quality, https://www.advance-he.ac.uk/knowledge-hub/dimensions-quality, (2010).
201.
Anderson, D., Ackerman-Anderson, L.S.: Beyond change management: how to achieve breakthrough results through conscious change leadership. Pfeiffer, San Francisco (2010).
202.
Hay, D., Kinchin, I., Lygo‐Baker, S.: Making learning visible: the role of concept mapping in higher education. Studies in Higher Education. 33, 295–311 (2008). https://doi.org/10.1080/03075070802049251.
203.
Hay, D.B., Tan, P.L., Whaites, E.: Non‐traditional learners in higher education: comparison of a traditional MCQ examination with concept mapping to assess learning in a dental radiological science course. Assessment & Evaluation in Higher Education. 35, 577–595 (2010). https://doi.org/10.1080/02602931003782525.
204.
Richstone, L., Schwartz, M.J., Seideman, C., Cadeddu, J., Marshall, S., Kavoussi, L.R.: Eye Metrics as an Objective Assessment of Surgical Skill. Annals of Surgery. 252, 177–182 (2010). https://doi.org/10.1097/SLA.0b013e3181e464fb.
205.
Suetsugu, N., Ohki, M., Kaku, T.: Quantitative Analysis of Nursing Observation Employing a Portable Eye-Tracker. Open Journal of Nursing. 6, 53–61 (2016). https://doi.org/10.4236/ojn.2016.61006.
206.
Gould, J., Day, P.: Hearing you loud and clear: student perspectives of audio feedback in higher education. Assessment & Evaluation in Higher Education. 38, 554–566 (2013). https://doi.org/10.1080/02602938.2012.660131.
207.
Frost, J., de Pont, G., Brailsford, I.: Expanding assessment methods and moments in history. Assessment & Evaluation in Higher Education. 37, 293–304 (2012). https://doi.org/10.1080/02602938.2010.531247.
208.
Harrison, C.J., Molyneux, A.J., Blackwell, S., Wass, V.J.: How we give personalised audio feedback after summative OSCEs. Medical Teacher. 37, 323–326 (2015). https://doi.org/10.3109/0142159X.2014.932901.
209.
Voelkel, S., Mello, L.V.: Audio Feedback – Better Feedback? Bioscience Education. 22, 16–30 (2014). https://doi.org/10.11120/beej.2014.00022.
210.
Ashraf, H., Sodergren, M.H., Merali, N., Mylonas, G., Singh, H., Darzi, A.: Eye-tracking technology in medical education: A systematic review. Medical Teacher. 40, 62–69 (2018). https://doi.org/10.1080/0142159X.2017.1391373.
211.
Mayer, R.E.: Cognitive Learning. In: Encyclopedia of the sciences of learning. Springer, [S.l.] (2012).
212.
Ho, V.W., Harris, P.G., Kumar, R.K., Velan, G.M.: Knowledge maps: a tool for online assessment with automated feedback. Medical Education Online. 23, (2018). https://doi.org/10.1080/10872981.2018.1457394.
213.
Guraya, S.Y.: The Desired Concept Maps and Goal Setting for Assessing Professionalism in Medicine. Journal of Clinical and Diagnostic Research. (2016). https://doi.org/10.7860/JCDR/2016/19917.7832.
214.
Kassab, S.E., Fida, M., Radwan, A., Hassan, A.B., Abu-Hijleh, M., O’Connor, B.P.: Generalisability theory analyses of concept mapping assessment scores in a problem-based medical curriculum. Medical Education. 50, 730–737 (2016). https://doi.org/10.1111/medu.13054.
215.
Courteille, O., Bergin, R., Stockeld, D., Ponzer, S., Fors, U.: The use of a virtual patient case in an OSCE-based exam – A pilot study. Medical Teacher. 30, e66–e76 (2008). https://doi.org/10.1080/01421590801910216.
216.
Downing, S.M.: Guessing on selected-response examinations. Medical Education. 37, 670–671 (2003). https://doi.org/10.1046/j.1365-2923.2003.01585.x.
217.
Tonni, I., Gadbury-Amyot, C.C., Govaerts, M., ten Cate, O., Davis, J., Garcia, L.T., Valachovic, R.W.: ADEA-ADEE Shaping the Future of Dental Education III. Journal of Dental Education. 84, 97–104 (2020). https://doi.org/10.1002/jdd.12024.