[1]
C. dos S. Ribeiro, L. H. M. van de Burgwal, and B. J. Regeer, ‘Overcoming challenges for designing and implementing the One Health approach: A systematic review of the literature’, One Health, vol. 7, June 2019, doi: 10.1016/j.onehlt.2019.100085
[2]
B. D. Hodges, A practical guide for medical teachers, Fifth edition. Edinburgh: Elsevier, 2017. Available: https://www.vlebooks.com/vleweb/product/openreader?id=GlasgowUni&isbn=9780702068935
[3]
Association for the Study of Medical Education, Understanding medical education: evidence, theory, and practice, Third edition. Hoboken, NJ: Wiley-Blackwell, 2019. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://dx.doi.org/10.1002/9781119373780
[4]
S. Marshall, Ed., A handbook for teaching and learning in higher education: enhancing academic practice, Fifth edition. Abingdon, Oxon: Routledge, 2020. Available: https://ebookcentral.proquest.com/lib/gla/detail.action?docID=5983041
[5]
E. S. Holmboe and S. J. Durning, Practical Guide to the Assessment of Clinical Competence, Third edition. 2024.
[6]
J. Norcini et al., ‘Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference’, Medical Teacher, vol. 33, no. 3, pp. 206–214, Mar. 2011, doi: 10.3109/0142159X.2011.551559
[7]
C. P. M. Van Der Vleuten, ‘The assessment of professional competence: Developments, research and practical implications’, Advances in Health Sciences Education, vol. 1, no. 1, pp. 41–67, Jan. 1996, doi: 10.1007/BF00596229
[8]
C. P. M. van der Vleuten and L. W. T. Schuwirth, ‘Assessing professional competence: from methods to programmes’, Medical Education, vol. 39, no. 3, pp. 309–317, Mar. 2005, doi: 10.1111/j.1365-2929.2005.02094.x
[9]
L. W. Schuwirth and C. P. van der Vleuten, ‘How to Design a Useful Test: The Principles of Assessment’, in Understanding Medical Education, T. Swanwick, Ed., 3rd ed. Oxford, UK: Wiley-Blackwell, 2019, pp. 277–289. doi: 10.1002/9781119373780.ch20. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://onlinelibrary.wiley.com/doi/10.1002/9781119373780.ch20
[10]
C. P. M. van der Vleuten et al., ‘A model for programmatic assessment fit for purpose’, Medical Teacher, vol. 34, no. 3, pp. 205–214, Mar. 2012, doi: 10.3109/0142159X.2012.652239
[11]
J. Biggs, ‘Enhancing Teaching through Constructive Alignment’, Higher Education, vol. 32, no. 3, pp. 347–364, 1996, Available: https://www.jstor.org/stable/3448076
[12]
G. Miller, ‘The assessment of clinical skills/competence/performance’, Academic Medicine, vol. 65, no. 9, 1990, Available: https://ezproxy.lib.gla.ac.uk/login?url=https://oce.ovid.com/article/00001888-199009000-00045/PDF
[13]
L. W. T. Schuwirth and C. P. M. van der Vleuten, ‘Programmatic assessment and Kane’s validity perspective’, Medical Education, vol. 46, no. 1, pp. 38–48, Jan. 2012, doi: 10.1111/j.1365-2923.2011.04098.x
[14]
R. M. Epstein, ‘Defining and Assessing Professional Competence’, JAMA, vol. 287, no. 2, Jan. 2002, doi: 10.1001/jama.287.2.226
[15]
O. ten Cate, ‘Nuts and Bolts of Entrustable Professional Activities’, Journal of Graduate Medical Education, vol. 5, no. 1, pp. 157–158, Mar. 2013, doi: 10.4300/JGME-D-12-00380.1
[16]
O. Ten Cate, ‘Competency-Based Postgraduate Medical Education: Past, Present and Future’, GMS Journal for Medical Education, vol. 34, no. 5, 2017, doi: 10.3205/zma001146. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5704607/
[17]
R. L. Cruess, S. R. Cruess, and Y. Steinert, ‘Amending Miller’s Pyramid to Include Professional Identity Formation’, Academic Medicine, vol. 91, no. 2, pp. 180–185, Feb. 2016, doi: 10.1097/ACM.0000000000000913
[18]
K. A. Cobb, G. Brown, D. A. D. C. Jaarsma, and R. A. Hammond, ‘The educational impact of assessment: A comparison of DOPS and MCQs’, Medical Teacher, vol. 35, no. 11, pp. e1598–e1607, Nov. 2013, doi: 10.3109/0142159X.2013.803061
[19]
B. Jolly and M. J. Dalton, ‘Written Assessment’, in Understanding Medical Education, T. Swanwick, K. Forrest, and B. C. O’Brien, Eds., 3rd ed. Chichester, UK: John Wiley & Sons, Ltd, 2018, pp. 291–317. doi: 10.1002/9781119373780.ch21. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://doi.wiley.com/10.1002/9781119373780.ch21
[20]
B. Jolly, ‘Written Assessment’, in Understanding Medical Education, T. Swanwick, Ed., 3rd ed. Oxford, UK: Wiley-Blackwell, 2019, pp. 261–261. doi: 10.1002/9781119373780.ch21. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://onlinelibrary.wiley.com/doi/10.1002/9781119373780.ch21
[21]
R. M. Epstein, ‘Assessment in Medical Education’, New England Journal of Medicine, vol. 356, no. 4, pp. 387–396, Jan. 2007, doi: 10.1056/NEJMra054784
[22]
R. J. Hift, ‘Should essays and other “open-ended”-type questions retain a place in written summative assessment in clinical medicine?’, BMC Medical Education, vol. 14, no. 1, Dec. 2014, doi: 10.1186/s12909-014-0249-2
[23]
M. Paniagua and K. Swygert, Eds., ‘The Gold Book – Constructing Written Test Questions for the Basic and Clinical Sciences’. National Board of Medical Examiners. Available: https://www.nbme.org/sites/default/files/2020-01/IWW_Gold_Book.pdf
[24]
L. W. T. Schuwirth and C. P. M. van der Vleuten, ‘ABC of learning and teaching in medicine: Written assessment’, BMJ, vol. 326, no. 7390, pp. 643–645, Mar. 2003, doi: 10.1136/bmj.326.7390.643
[25]
L. W. T. Schuwirth and C. P. M. van der Vleuten, ‘Different written assessment methods: what can be said about their strengths and weaknesses?’, Medical Education, vol. 38, no. 9, pp. 974–979, Sept. 2004, doi: 10.1111/j.1365-2929.2004.01916.x
[26]
L. W. T. Schuwirth and C. P. M. van der Vleuten, ‘General overview of the theories used in assessment: AMEE Guide No. 57’, Medical Teacher, vol. 33, no. 10, pp. 783–797, Oct. 2011, doi: 10.3109/0142159X.2011.611022
[27]
B. Charlin, L. Roy, C. Brailovsky, F. Goulet, and C. van der Vleuten, ‘The Script Concordance Test: A Tool to Assess the Reflective Clinician’, Teaching and Learning in Medicine, vol. 12, no. 4, pp. 189–195, Oct. 2000, doi: 10.1207/S15328015TLM1204_5
[28]
J. Fournier, A. Demeester, and B. Charlin, ‘Script Concordance Tests: Guidelines for Construction’, BMC Medical Informatics and Decision Making, vol. 8, 2008, doi: 10.1186/1472-6947-8-18. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/1472-6947-8-18
[29]
S. M. Case and D. B. Swanson, ‘Extended‐matching items: A practical alternative to free‐response questions’, Teaching and Learning in Medicine, vol. 5, no. 2, pp. 107–115, Jan. 1993, doi: 10.1080/10401339309539601
[30]
V. Dory, R. Gagnon, D. Vanpee, and B. Charlin, ‘How to construct and implement script concordance tests: insights from a systematic review’, Medical Education, vol. 46, no. 6, pp. 552–563, June 2012, doi: 10.1111/j.1365-2923.2011.04211.x
[31]
E. A. Farmer and G. Page, ‘A practical guide to assessing clinical decision-making skills using the key features approach’, Medical Education, vol. 39, no. 12, pp. 1188–1194, Dec. 2005, doi: 10.1111/j.1365-2929.2005.02339.x
[32]
B. Fenderson, ‘The virtues of extended matching and uncued tests as alternatives to multiple choice questions’, Human Pathology, vol. 28, no. 5, pp. 526–532, May 1997, doi: 10.1016/S0046-8177(97)90073-3
[33]
T. M. Haladyna, S. M. Downing, and M. C. Rodriguez, ‘A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment’, Applied Measurement in Education, vol. 15, no. 3, pp. 309–333, July 2002, doi: 10.1207/S15324818AME1503_5
[34]
P. McCoubrie, ‘Improving the fairness of multiple-choice questions: a literature review’, Medical Teacher, vol. 26, no. 8, pp. 709–712, Dec. 2004, doi: 10.1080/01421590400013495
[35]
M. D. Miller, R. L. Linn, and N. E. Gronlund, Measurement and assessment in teaching, 11th ed., International ed. Boston, Mass: Pearson Education, 2013.
[36]
E. J. Palmer and P. G. Devitt, ‘Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions?: research paper’, BMC Medical Education, vol. 7, no. 1, 2007, doi: 10.1186/1472-6920-7-49
[37]
S. Lubarsky, V. Dory, S. Meterissian, C. Lambert, and R. Gagnon, ‘Examining the effects of gaming and guessing on script concordance test scores’, Perspectives on Medical Education, vol. 7, no. 3, pp. 174–181, June 2018, doi: 10.1007/s40037-018-0435-8
[38]
L. W. Anderson and B. S. Bloom, A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives, Abridged ed. New York, N.Y.: Longman, 2001.
[39]
A. H. Sam et al., ‘Very-short-answer questions: reliability, discrimination and acceptability’, Medical Education, vol. 52, no. 4, pp. 447–455, Apr. 2018, doi: 10.1111/medu.13504
[40]
D. R. E. Cotton, P. A. Cotton, and J. R. Shipway, ‘Chatting and cheating: Ensuring academic integrity in the era of ChatGPT’, Innovations in Education and Teaching International, vol. 61, no. 2, pp. 228–239, Mar. 2024, doi: 10.1080/14703297.2023.2190148
[41]
K. A. M. Boursicot, T. E. Roberts, and W. P. Burdick, ‘Structured Assessments of Clinical Competence’, in Understanding Medical Education, T. Swanwick, K. Forrest, and B. C. O’Brien, Eds., Chichester, UK: John Wiley & Sons, Ltd, 2018, pp. 335–345. doi: 10.1002/9781119373780.ch23. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://doi.wiley.com/10.1002/9781119373780.ch23
[42]
M. Schoonheim-Klein, A. Muijtjens, L. Habets, M. Manogue, C. Van Der Vleuten, and U. Van Der Velden, ‘Who will pass the dental OSCE? Comparison of the Angoff and the borderline regression standard setting methods’, European Journal of Dental Education, vol. 13, no. 3, pp. 162–171, Aug. 2009, doi: 10.1111/j.1600-0579.2008.00568.x
[43]
G. Regehr, H. MacRae, R. K. Reznick, and D. Szalay, ‘Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination’, Academic Medicine, vol. 73, no. 9, pp. 993–997, 1998, Available: https://www.ncbi.nlm.nih.gov/pubmed/9759104
[44]
R. M. Harden, ‘Misconceptions and the OSCE’, Medical Teacher, vol. 37, no. 7, pp. 608–610, July 2015, doi: 10.3109/0142159X.2015.1042443
[45]
C. Carraccio, S. D. Wolfsthal, R. Englander, K. Ferentz, and C. Martin, ‘Shifting Paradigms: From Flexner to Competencies’, Academic Medicine, vol. 77, no. 5, May 2002, Available: https://journals.lww.com/academicmedicine/Fulltext/2002/05000/Shifting_Paradigms__From_Flexner_to_Competencies.3.aspx
[46]
H. E. Rushforth, ‘Objective structured clinical examination (OSCE): Review of literature and implications for nursing education’, Nurse Education Today, vol. 27, no. 5, pp. 481–490, July 2007, doi: 10.1016/j.nedt.2006.08.009
[47]
A. Spielman, T. Fulmer, E. Eisenberg, and M. Alfano, ‘Dentistry, Nursing, and Medicine: A Comparison of Core Competencies’, Journal of Dental Education, vol. 69, no. 11, pp. 1257–1271, 2005, Available: https://pubmed.ncbi.nlm.nih.gov/16275689/
[48]
R. M. Harden, M. Stevenson, W. W. Downie, and G. M. Wilson, ‘Assessment of clinical competence using objective structured examination.’, BMJ, vol. 1, no. 5955, pp. 447–451, Feb. 1975, doi: 10.1136/bmj.1.5955.447
[49]
R. Watson, A. Stimpson, A. Topping, and D. Porock, ‘Clinical competence assessment in nursing: a systematic review of the literature’, Journal of Advanced Nursing, vol. 39, no. 5, pp. 421–431, Sept. 2002, doi: 10.1046/j.1365-2648.2002.02307.x
[50]
D. M. Williams, S. Davies, M. Horner, and J. Handley, ‘Peer and near-peer OSCE examiners’, Medical Teacher, vol. 38, no. 2, pp. 212–213, Feb. 2016, doi: 10.3109/0142159X.2015.1072266
[51]
C. Brown, S. Ross, J. Cleland, and K. Walsh, ‘Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE)’, Medical Teacher, vol. 37, no. 7, pp. 653–659, July 2015, doi: 10.3109/0142159X.2015.1033389
[52]
P. Meskell, E. Burke, T. J. B. Kropmans, E. Byrne, W. Setyonugroho, and K. M. Kennedy, ‘Back to the future: An online OSCE Management Information System for nursing OSCEs’, Nurse Education Today, vol. 35, no. 11, pp. 1091–1096, Nov. 2015, doi: 10.1016/j.nedt.2015.06.010
[53]
M. Tavakol and G. A. Doody, ‘A novel psychometric programme for the rapid analysis of OSCE data’, Medical Teacher, vol. 38, no. 1, pp. 104–105, Jan. 2016, doi: 10.3109/0142159X.2015.1062085
[54]
K. W. Eva, J. Rosenfeld, H. I. Reiter, and G. R. Norman, ‘An admissions OSCE: the multiple mini-interview’, Medical Education, vol. 38, no. 3, pp. 314–326, Mar. 2004, doi: 10.1046/j.1365-2923.2004.01776.x
[55]
P. Lane, ‘Recruitment into training for general practice—the winds of change or a breath of fresh air?’, BMJ, vol. 331, no. 7520, pp. s153–s153, Oct. 2005, doi: 10.1136/bmj.331.7520.s153
[56]
B. Hodges, G. Regehr, N. McNaughton, R. Tiberius, and M. Hanson, ‘OSCE checklists do not capture increasing levels of expertise’, Academic Medicine, vol. 74, no. 10, 1999, Available: https://journals.lww.com/academicmedicine/abstract/1999/10000/osce_checklists_do_not_capture_increasing_levels.17.aspx
[57]
B. Hodges and J. H. McIlroy, ‘Analytic global OSCE ratings are sensitive to level of training’, Medical Education, vol. 37, no. 11, pp. 1012–1016, Nov. 2003, doi: 10.1046/j.1365-2923.2003.01674.x
[58]
I. W. Y. Ma et al., ‘Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation’, Advances in Health Sciences Education, vol. 17, no. 4, pp. 457–470, Oct. 2012, doi: 10.1007/s10459-011-9322-3
[59]
T. J. Wood, S. M. Humphrey-Murto, and G. R. Norman, ‘Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method’, Advances in Health Sciences Education, vol. 11, no. 2, pp. 115–122, May 2006, doi: 10.1007/s10459-005-7853-1
[60]
R. M. Harden, ‘Revisiting “Assessment of clinical competence using an objective structured clinical examination (OSCE)”’, Medical Education, vol. 50, no. 4, pp. 376–379, Apr. 2016, doi: 10.1111/medu.12801
[61]
R. M. Harden, P. Lilley, M. Patricio, and G. R. Norman, The definitive guide to the OSCE: the Objective Structured Clinical Examination as a performance assessment. Edinburgh: Elsevier, 2016. Available: https://www.vlebooks.com/vleweb/product/openreader?id=GlasgowUni&isbn=9780702055492
[62]
A. Denison, E. Bate, and J. Thompson, ‘Tablet versus paper marking in assessment: feedback matters’, Perspectives on Medical Education, vol. 5, no. 2, pp. 108–113, Apr. 2016, doi: 10.1007/s40037-016-0262-8
[63]
O. ten Cate, ‘Nuts and Bolts of Entrustable Professional Activities’, Journal of Graduate Medical Education, vol. 5, no. 1, pp. 157–158, Mar. 2013, doi: 10.4300/JGME-D-12-00380.1
[64]
R. M. Harden, ‘Learning outcomes as a tool to assess progression’, Medical Teacher, vol. 29, no. 7, pp. 678–682, Jan. 2007, doi: 10.1080/01421590701729955
[65]
M. Ross, ‘Entrustable professional activities’, The Clinical Teacher, vol. 12, no. 4, pp. 223–225, Aug. 2015, doi: 10.1111/tct.12436
[66]
O. ten Cate and J. Q. Young, ‘The patient handover as an entrustable professional activity: adding meaning in teaching and practice’, BMJ Quality & Safety, vol. 21, no. Suppl 1, pp. i9–i12, Dec. 2012, doi: 10.1136/bmjqs-2012-001213
[67]
M. Aylward, J. Nixon, and S. Gladding, ‘An Entrustable Professional Activity (EPA) for Handoffs as a Model for EPA Assessment Development’, Academic Medicine, vol. 89, no. 10, pp. 1335–1340, Oct. 2014, doi: 10.1097/ACM.0000000000000317
[68]
K. E. Hauer et al., ‘Developing Entrustable Professional Activities as the Basis for Assessment of Competence in an Internal Medicine Residency: A Feasibility Study’, Journal of General Internal Medicine, vol. 28, no. 8, pp. 1110–1114, Aug. 2013, doi: 10.1007/s11606-013-2372-x
[69]
C. Orsini and V. I. Binnie, ‘Entrustment decisions in dental education: Is it time to start formalising?’, Medical Teacher, vol. 38, no. 3, pp. 322–322, Mar. 2016, doi: 10.3109/0142159X.2015.1114598
[70]
C. Auewarakul, S. M. Downing, R. Praditsuwan, and U. Jaturatamrong, ‘Item Analysis to Improve Reliability for an Internal Medicine Undergraduate OSCE’, Advances in Health Sciences Education, vol. 10, no. 2, pp. 105–113, June 2005, doi: 10.1007/s10459-005-2315-3
[71]
D. I. Newble and D. B. Swanson, ‘Psychometric characteristics of the objective structured clinical examination’, Medical Education, vol. 22, no. 4, pp. 325–334, July 1988, doi: 10.1111/j.1365-2923.1988.tb00761.x
[72]
D. A. Sturpe, ‘Objective Structured Clinical Examinations in Doctor of Pharmacy Programs in the United States’, American journal of pharmaceutical education, vol. 74, no. 8, 2010, doi: 10.5688/aj7408148. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2987288/
[73]
L. S. Snell and J. R. Frank, ‘Competencies, the tea bag model, and the end of time’, Medical Teacher, vol. 32, no. 8, pp. 629–630, Aug. 2010, doi: 10.3109/0142159X.2010.500707
[74]
E. W. Gravina, ‘Competency-Based Education and Its Effect on Nursing Education: A Literature Review’, Teaching and Learning in Nursing, vol. 12, no. 2, pp. 117–121, Apr. 2017, doi: 10.1016/j.teln.2016.11.004
[75]
E. K. Read, C. Bell, S. Rhind, and K. G. Hecker, ‘The Use of Global Rating Scales for OSCEs in Veterinary Medicine’, PLOS ONE, vol. 10, no. 3, Mar. 2015, doi: 10.1371/journal.pone.0121000
[76]
T. J. Wood and D. Pugh, ‘Are rating scales really better than checklists for measuring increasing levels of expertise?’, Medical Teacher, vol. 42, no. 1, pp. 46–51, Jan. 2020, doi: 10.1080/0142159X.2019.1652260
[77]
C. M. Hagel, A. K. Hall, and J. D. Dagnone, ‘Queen’s University Emergency Medicine Simulation OSCE: an Advance in Competency-Based Assessment’, CJEM, vol. 18, no. 3, pp. 230–233, May 2016, doi: 10.1017/cem.2015.34
[78]
A. Tekian, O. Ten Cate, E. Holmboe, T. Roberts, and J. Norcini, ‘Entrustment decisions: Implications for curriculum development and assessment’, Medical Teacher, pp. 1–7, Mar. 2020, doi: 10.1080/0142159X.2020.1733506
[79]
H. Peters, Y. Holzhausen, C. Boscardin, O. ten Cate, and H. C. Chen, ‘Twelve tips for the implementation of EPAs for assessment and entrustment decisions’, Medical Teacher, vol. 39, no. 8, pp. 802–807, Aug. 2017, doi: 10.1080/0142159X.2017.1331031
[80]
R. Kakadia, E. Chen, and H. Ohyama, ‘Implementing an online OSCE during the COVID‐19 pandemic’, Journal of Dental Education, July 2020, doi: 10.1002/jdd.12323
[81]
A. Ryan, A. Carson, K. Reid, D. Smallwood, and T. Judd, ‘Fully online OSCEs: A large cohort case study’, MedEdPublish, vol. 9, no. 1, 2020, doi: 10.15694/mep.2020.000214.1
[82]
J. G. Boyle, ‘Viva la VOSCE?’, BMC Medical Education, vol. 20, no. 1, 2020, Available: https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-020-02444-3
[83]
J. Hopwood, G. Myers, and A. Sturrock, ‘Twelve tips for conducting a virtual OSCE’, Medical Teacher, vol. 43, no. 6, pp. 633–636, June 2021, doi: 10.1080/0142159X.2020.1830961
[84]
J. J. Norcini, ‘The Mini-CEX: A Method for Assessing Clinical Skills’, Annals of Internal Medicine, vol. 138, no. 6, Mar. 2003, doi: 10.7326/0003-4819-138-6-200303180-00012
[85]
D. Kessel, J. Jenkins, and E. Neville, ‘Workplace based assessments are no more’, BMJ, Sept. 2012, doi: 10.1136/bmj.e6193
[86]
J. Norcini and V. Burch, ‘Workplace-based assessment as an educational tool: AMEE Guide No. 31’, Medical Teacher, vol. 29, no. 9–10, pp. 855–871, Jan. 2007, doi: 10.1080/01421590701775453
[87]
A. S. Elstein, S. A. Sprafka, and L. S. Shulman, Medical Problem Solving: An Analysis of Clinical Reasoning. Harvard University Press, 2013.
[88]
G. L. Noel, ‘How Well Do Internal Medicine Faculty Members Evaluate the Clinical Skills of Residents?’, Annals of Internal Medicine, vol. 117, no. 9, Nov. 1992, doi: 10.7326/0003-4819-117-9-757
[89]
J. R. Kogan, L. M. Bellini, and J. A. Shea, ‘Feasibility, Reliability, and Validity of the Mini-Clinical Evaluation Exercise (mCEX) in a Medicine Core Clerkship’, Academic Medicine, vol. 78, no. 10, 2003, Available: https://journals.lww.com/academicmedicine/Fulltext/2003/10001/Feasibility,_Reliability,_and_Validity_of_the.11.aspx
[90]
S. J. Durning, L. J. Cation, R. J. Markert, and L. N. Pangaro, ‘Assessing the Reliability and Validity of the Mini-Clinical Evaluation Exercise for Internal Medicine Residency Training’, Academic Medicine, vol. 77, no. 9, 2002, Available: https://journals.lww.com/academicmedicine/pages/articleviewer.aspx?year=2002&issue=09000&article=00020&type=abstract
[91]
E. S. Holmboe, S. Huot, J. Chung, J. Norcini, and R. E. Hawkins, ‘Construct Validity of the Mini-Clinical Evaluation Exercise (Mini-CEX)’, Academic Medicine, vol. 78, no. 8, 2003, Available: https://journals.lww.com/academicmedicine/pages/articleviewer.aspx?year=2003&issue=08000&article=00018&type=abstract
[92]
K. M. Torsney, D. M. Cocker, and A. A. P. Slesser, ‘The Modern Surgeon and Competency Assessment: Are the Workplace-Based Assessments Evidence-Based?’, World Journal of Surgery, vol. 39, no. 3, pp. 623–633, Mar. 2015, doi: 10.1007/s00268-014-2875-6
[93]
C. Mitchell, S. Bhat, A. Herbert, and P. Baker, ‘Workplace-based assessments of junior doctors: do scores predict training difficulties?’, Medical Education, vol. 45, no. 12, pp. 1190–1198, Dec. 2011, doi: 10.1111/j.1365-2923.2011.04056.x
[94]
R. G. Williams, S. Verhulst, J. A. Colliver, and G. L. Dunnington, ‘Assuring the reliability of resident performance appraisals: More items or more observations?’, Surgery, vol. 137, no. 2, pp. 141–147, Feb. 2005, doi: 10.1016/j.surg.2004.06.011
[95]
D. J. Murphy, D. A. Bruce, S. W. Mercer, and K. W. Eva, ‘The reliability of workplace-based assessment in postgraduate medical education and training: a national evaluation in general practice in the United Kingdom’, Advances in Health Sciences Education, vol. 14, no. 2, pp. 219–232, May 2009, doi: 10.1007/s10459-008-9104-8
[96]
J. C. Archer, ‘Use of SPRAT for peer review of paediatricians in training’, BMJ, vol. 330, no. 7502, pp. 1251–1253, May 2005, doi: 10.1136/bmj.38447.610451.8F
[97]
S. J. Quantrill and J. K. Tun, ‘Workplace-based assessment as an educational tool. Guide supplement 31.5 – Viewpoint’, Medical Teacher, vol. 34, no. 5, pp. 417–418, May 2012, doi: 10.3109/0142159X.2012.668234
[98]
Y. K. Hurst, L. E. Prescott-Clements, and J. S. Rennie, ‘The patient assessment questionnaire: A new instrument for evaluating the interpersonal skills of vocational dental practitioners’, British Dental Journal, vol. 197, no. 8, pp. 497–500, Oct. 2004, doi: 10.1038/sj.bdj.4811750
[99]
S. Humphrey-Murto, M. Côté, D. Pugh, and T. J. Wood, ‘Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise’, Teaching and Learning in Medicine, vol. 30, no. 2, pp. 152–161, Apr. 2018, doi: 10.1080/10401334.2017.1387553
[100]
J. Rekman, S. J. Hamstra, N. Dudek, T. Wood, C. Seabrook, and W. Gofton, ‘A New Instrument for Assessing Resident Competence in Surgical Clinic: The Ottawa Clinic Assessment Tool’, Journal of Surgical Education, vol. 73, no. 4, pp. 575–582, July 2016, doi: 10.1016/j.jsurg.2016.02.003
[101]
R. M. Sutherland, K. J. Reid, N. G. Chiavaroli, D. Smallwood, and G. J. McColl, ‘Assessing Diagnostic Reasoning Using a Standardized Case-Based Discussion’, Journal of Medical Education and Curricular Development, vol. 6, Jan. 2019, doi: 10.1177/2382120519849411
[102]
E. W. Driessen, A. M. M. Muijtjens, J. van Tartwijk, and C. P. M. van der Vleuten, ‘Web- or paper-based portfolios: is there a difference?’, Medical Education, vol. 41, no. 11, pp. 1067–1073, Nov. 2007, doi: 10.1111/j.1365-2923.2007.02859.x
[103]
E. Driessen and J. van Tartwijk, ‘Portfolios in personal and professional development’, in Understanding Medical Education, T. Swanwick, Ed., 3rd ed. Chichester, UK: Wiley-Blackwell, 2013, pp. 255–262. doi: 10.1002/9781119373780.ch18. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://onlinelibrary.wiley.com/doi/10.1002/9781119373780.ch18
[104]
K. Siau et al., ‘Changes in scoring of Direct Observation of Procedural Skills (DOPS) forms and the impact on competence assessment’, Endoscopy, vol. 50, no. 08, pp. 770–778, Aug. 2018, doi: 10.1055/a-0576-6667
[105]
S. S. S. Martinsen, T. Espeland, E. A. R. Berg, E. Samstad, B. Lillebo, and T. S. Slørdahl, ‘Examining the educational impact of the mini-CEX: a randomised controlled study’, BMC Medical Education, vol. 21, no. 1, Dec. 2021, doi: 10.1186/s12909-021-02670-3
[106]
L. Cohen, L. Manion, and K. Morrison, Research methods in education, Eighth edition. London: Routledge, 2018. Available: https://www.vlebooks.com/vleweb/product/openreader?id=GlasgowUni&isbn=9781315456522
[107]
‘Joanna Briggs Institute QARI’. Available: https://jbi.global/
[108]
S. Buckley et al., ‘The educational effects of portfolios on undergraduate student learning: A Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 11’, Medical Teacher, vol. 31, no. 4, pp. 282–298, Jan. 2009, doi: 10.1080/01421590902889897
[109]
S. Brookfield, Developing critical thinkers: challenging adults to explore alternative ways of thinking and acting. Milton Keynes: Open University Press, 1987.
[110]
A. Burls, ‘What is critical appraisal?’ 2009. Available: https://www.academia.edu/92786872/What_Is_Critical_Appraisal
[111]
‘The Campbell Collaboration’. Available: http://www.campbellcollaboration.org/
[112]
‘CASP Critical Appraisal Skills Programme Oxford UK’. Available: http://www.casp-uk.net/
[113]
‘Cochrane | Trusted evidence. Informed decisions. Better health.’ Available: http://www.cochrane.org/
[114]
F. Kee and I. Bickle, ‘Critical thinking and critical appraisal: the chicken and the egg?’, QJM, vol. 97, no. 9, pp. 609–614, Sept. 2004, doi: 10.1093/qjmed/hch099. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://academic.oup.com/qjmed/article/97/9/609/1594870
[115]
A. Da Silva and R. Dennick, ‘Corpus analysis of problem-based learning transcripts: an exploratory study.’, Medical Education, vol. 44, no. 3, pp. 280–288, 2010, doi: 10.1111/j.1365-2923.2009.03575.x
[116]
D. R. Garrison, ‘Critical thinking and adult education: a conceptual model for developing critical thinking in adult learners’, International Journal of Lifelong Education, vol. 10, no. 4, pp. 287–303, Oct. 1991, doi: 10.1080/0260137910100403
[117]
M. Hammick, T. Dornan, and Y. Steinert, ‘Conducting a best evidence systematic review. Part 1: From idea to data coding. BEME Guide No. 13’, Medical Teacher, vol. 32, no. 1, pp. 3–15, Jan. 2010, doi: 10.3109/01421590903414245
[118]
T. Horsley, C. Hyde, N. Santesso, J. Parkes, R. Milne, and R. Stewart, ‘Teaching critical appraisal skills in healthcare settings’, Cochrane Database of Systematic Reviews, no. 11, Nov. 2011, doi: 10.1002/14651858.CD001270.pub2
[119]
G. C. Huang, L. R. Newman, and R. M. Schwartzstein, ‘Critical Thinking in Health Professions Education: Summary and Consensus Statements of the Millennium Conference 2011’, Teaching and Learning in Medicine, vol. 26, no. 1, pp. 95–102, Jan. 2014, doi: 10.1080/10401334.2013.857335
[120]
M. Jenicek, ‘The hard art of soft science: Evidence-Based Medicine, Reasoned Medicine or both?’, Journal of Evaluation in Clinical Practice, vol. 12, no. 4, pp. 410–419, Aug. 2006, doi: 10.1111/j.1365-2753.2006.00718.x
[121]
D. Kirkpatrick, ‘Great Ideas Revisited: Revisiting Kirkpatrick’s Four-Level Model’, Training and Development, vol. 50, no. 1, pp. 54–59, 1996, Available: https://ezproxy.lib.gla.ac.uk/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=tfh&AN=9602066395&site=ehost-live
[122]
C. A. Missimer, Good arguments: an introduction to critical thinking, 3rd ed. Englewood Cliffs, N.J.: Prentice Hall, 1995.
[123]
T. J. Moore, ‘Critical thinking and disciplinary thinking: a continuing debate’, Higher Education Research & Development, vol. 30, no. 3, pp. 261–274, June 2011, doi: 10.1080/07294360.2010.501328
[124]
R. Paul, Critical thinking: how to prepare students for a rapidly changing world. Foundation for Critical Thinking, 1995.
[125]
R. Paul and L. Elder, ‘The Miniature Guide to Critical Thinking: Concepts and Tools’. 2006. Available: https://www.criticalthinking.org/files/Concepts_Tools.pdf
[126]
S. Yardley and T. Dornan, ‘Kirkpatrick’s levels and education “evidence”’, Medical Education, vol. 46, no. 1, pp. 97–106, Jan. 2012, doi: 10.1111/j.1365-2923.2011.04076.x
[127]
O. P. Devine, A. C. Harborne, and I. C. McManus, ‘Assessment at UK medical schools varies substantially in volume, type and intensity and correlates with postgraduate attainment’, BMC Medical Education, vol. 15, no. 1, Dec. 2015, doi: 10.1186/s12909-015-0428-9
[128]
P. A. Ertmer et al., ‘Using Peer Feedback to Enhance the Quality of Student Online Postings: An Exploratory Study’, Journal of Computer-Mediated Communication, vol. 12, no. 2, pp. 412–433, Jan. 2007, doi: 10.1111/j.1083-6101.2007.00331.x
[129]
D. J. Nicol and D. Macfarlane‐Dick, ‘Formative assessment and self‐regulated learning: a model and seven principles of good feedback practice’, Studies in Higher Education, vol. 31, no. 2, pp. 199–218, Apr. 2006, doi: 10.1080/03075070600572090
[130]
N. Piedra, J. Chicaiza, J. Lopez, A. Romero, and E. Tovar, ‘Measuring collaboration and creativity skills through rubrics: Experience from UTPL collaborative social networks course’, in IEEE EDUCON 2010 Conference, IEEE, 2010, pp. 1511–1516. doi: 10.1109/EDUCON.2010.5492349
[131]
B. De Wever, H. Van Keer, T. Schellens, and M. Valcke, ‘Assessing collaboration in a wiki: The reliability of university students’ peer assessment’, The Internet and Higher Education, vol. 14, no. 4, pp. 201–206, Sept. 2011, doi: 10.1016/j.iheduc.2011.07.003
[132]
M. A. Verkerk, M. J. de Bree, and M. J. E. Mourits, ‘Reflective professionalism: interpreting CanMEDS’ “professionalism”’, Journal of Medical Ethics, vol. 33, no. 11, pp. 663–666, Nov. 2007, doi: 10.1136/jme.2006.017954
[133]
J. A. Cleland, L. V. Knight, C. E. Rees, S. Tracey, and C. M. Bond, ‘Is it me or is it them? Factors that influence the passing of underperforming students’, Medical Education, vol. 42, no. 8, pp. 800–809, Aug. 2008, doi: 10.1111/j.1365-2923.2008.03113.x
[134]
R. Cruess, ‘The Professionalism Mini-Evaluation Exercise: A Preliminary Investigation’, Academic Medicine, vol. 81, no. 10, 2006, Available: https://journals.lww.com/academicmedicine/Fulltext/2006/10001/The_Professionalism_Mini_Evaluation_Exercise__A.19.aspx
[135]
J. Goldie, ‘Assessment of professionalism: A consolidation of current thinking’, Medical Teacher, vol. 35, no. 2, pp. e952–e956, Feb. 2013, doi: 10.3109/0142159X.2012.714888
[136]
W. N. K. A. van Mook, S. L. Gorter, H. O’Sullivan, V. Wass, L. W. Schuwirth, and C. P. M. van der Vleuten, ‘Approaches to professional behaviour assessment: Tools in the professionalism toolbox’, European Journal of Internal Medicine, vol. 20, no. 8, pp. e153–e157, Dec. 2009, doi: 10.1016/j.ejim.2009.07.012
[137]
S. Ginsburg, G. Regehr, and L. Lingard, ‘Basing the evaluation of professionalism on observable behaviours: a cautionary tale’, Academic Medicine, vol. 79, no. 10, pp. S1–S4, 2004, Available: https://ezproxy.lib.gla.ac.uk/login?url=https://journals.lww.com/academicmedicine/Fulltext/2004/10001/Basing_the_Evaluation_of_Professionalism_on.1.aspx
[138]
B. D. Hodges et al., ‘Assessment of professionalism: Recommendations from the Ottawa 2010 Conference’, Medical Teacher, vol. 33, no. 5, pp. 354–363, May 2011, doi: 10.3109/0142159X.2011.577300
[139]
S. Schubert et al., ‘A situational judgement test of professional behaviour: development and validation’, Medical Teacher, vol. 30, no. 5, pp. 528–533, Jan. 2008, doi: 10.1080/01421590801952994
[140]
L. Arnold, C. K. Shue, B. Kritt, S. Ginsburg, and D. T. Stern, ‘Medical students’ views on peer assessment of professionalism’, Journal of General Internal Medicine, vol. 20, no. 9, pp. 819–824, Sept. 2005, doi: 10.1111/j.1525-1497.2005.0162.x
[141]
L. Arnold et al., ‘Can There Be a Single System for Peer Assessment of Professionalism among Medical Students? A Multi-Institutional Study’, Academic Medicine, vol. 82, no. 6, pp. 578–586, June 2007, doi: 10.1097/ACM.0b013e3180555d4e
[142]
G. Finn, M. Sawdon, L. Clipsham, and J. McLachlan, ‘Peer estimation of lack of professionalism correlates with low Conscientiousness Index scores’, Medical Education, vol. 43, no. 10, pp. 960–967, Oct. 2009, doi: 10.1111/j.1365-2923.2009.03453.x
[143]
E. Gaufberg and A. Fitzpatrick, ‘The favour: a professional boundaries OSCE station’, Medical Education, vol. 42, no. 5, pp. 529–530, May 2008, doi: 10.1111/j.1365-2923.2008.03067.x
[144]
S. Ginsburg et al., ‘Context, Conflict, and Resolution: A New Conceptual Framework for Evaluating Professionalism’, Academic Medicine, vol. 75, no. 10, Oct. 2000, Available: https://journals.lww.com/academicmedicine/Fulltext/2000/10001/Context,_Conflict,_and_Resolution__A_New.3.aspx
[145]
S. Ginsburg, G. Regehr, and M. Mylopoulos, ‘From behaviours to attributions: further concerns regarding the evaluation of professionalism’, Medical Education, vol. 43, no. 5, pp. 414–425, May 2009, doi: 10.1111/j.1365-2923.2009.03335.x
[146]
S. Ginsburg, C. van der Vleuten, K. W. Eva, and L. Lingard, ‘Hedging to save face: a linguistic analysis of written comments on in-training evaluation reports’, Advances in Health Sciences Education, vol. 21, no. 1, pp. 175–188, Mar. 2016, doi: 10.1007/s10459-015-9622-0
[147]
GMC, ‘Development of generic professional capabilities’. Available: http://www.gmc-uk.org/education/23581.asp
[148]
M. Kelly, S. O’Flynn, J. McLachlan, and M. A. Sawdon, ‘The Clinical Conscientiousness Index’, Academic Medicine, vol. 87, no. 9, pp. 1218–1224, Sept. 2012, doi: 10.1097/ACM.0b013e3182628499
[149]
W. T. McCormack, C. Lazarus, D. Stern, and P. A. Small, ‘Peer Nomination: A Tool for Identifying Medical Student Exemplars in Clinical Competence and Caring, Evaluated at Three Medical Schools’, Academic Medicine, vol. 82, no. 11, pp. 1033–1039, Nov. 2007, doi: 10.1097/01.ACM.0000285345.75528.ee
[150]
J. C. McLachlan, G. Finn, and J. Macnaughton, ‘The Conscientiousness Index: A Novel Tool to Explore Students’ Professionalism’, Academic Medicine, vol. 84, no. 5, pp. 559–565, May 2009, doi: 10.1097/ACM.0b013e31819fb7ff
[151]
J. J. Norcini, ‘Peer assessment of competence’, Medical Education, vol. 37, no. 6, pp. 539–543, June 2003, doi: 10.1046/j.1365-2923.2003.01536.x
[152]
M. A. Papadakis et al., ‘Disciplinary Action by Medical Boards and Prior Behavior in Medical School’, New England Journal of Medicine, vol. 353, no. 25, pp. 2673–2682, Dec. 2005, doi: 10.1056/NEJMsa052596
[153]
C. A. Pohl, M. Hojat, and L. Arnold, ‘Peer Nominations as Related to Academic Attainment, Empathy, Personality, and Specialty Interest’, Academic Medicine, vol. 86, no. 6, pp. 747–751, June 2011, doi: 10.1097/ACM.0b013e318217e464
[154]
P. G. Ramsey, ‘Use of Peer Ratings to Evaluate Physician Performance’, JAMA: The Journal of the American Medical Association, vol. 269, no. 13, Apr. 1993, doi: 10.1001/jama.1993.03500130069034
[155]
Royal College of Physicians, ‘Doctors in Society: Medical professionalism in a changing world’. 2005. Available: https://cdn.shopify.com/s/files/1/0924/4392/files/doctors_in_society_reportweb.pdf?15745311214883953343
[156]
D. T. Stern, A. Z. Frohna, and L. D. Gruppen, ‘The prediction of professional behaviour’, Medical Education, vol. 39, no. 1, pp. 75–82, Jan. 2005, doi: 10.1111/j.1365-2929.2004.02035.x
[157]
D. T. Stern, Measuring medical professionalism. New York: Oxford University Press, 2006. Available: https://ebookcentral.proquest.com/lib/gla/detail.action?docID=3053707
[158]
W. N. K. A. van Mook, S. J. van Luijk, H. O’Sullivan, V. Wass, L. W. Schuwirth, and C. P. M. van der Vleuten, ‘General considerations regarding assessment of professional behaviour’, European Journal of Internal Medicine, vol. 20, no. 4, pp. e90–e95, July 2009, doi: 10.1016/j.ejim.2008.11.011
[159]
T. J. Wilkinson, W. B. Wade, and L. D. Knock, ‘A Blueprint to Assess Professionalism: Results of a Systematic Review’, Academic Medicine, vol. 84, no. 5, pp. 551–558, May 2009, doi: 10.1097/ACM.0b013e31819fbaa2
[160]
S. Zijlstra-Shaw, P. G. Robinson, and T. Roberts, ‘Assessing professionalism within dental education; the need for a definition’, European Journal of Dental Education, vol. 16, no. 1, pp. e128–e136, Feb. 2012, doi: 10.1111/j.1600-0579.2011.00687.x
[161]
F. Patterson, L. Zibarras, and V. Ashworth, ‘Situational judgement tests in medical education and training: Research, theory and practice: AMEE Guide No. 100’, Medical Teacher, vol. 38, no. 1, pp. 3–17, Jan. 2016, doi: 10.3109/0142159X.2015.1072619
[162]
R. Gorania, ‘Situational judgement stress’, British Dental Journal, vol. 231, no. 8, p. 426, Oct. 2021, doi: 10.1038/s41415-021-3577-8
[163]
Royal College of Physicians, ‘Advancing medical professionalism’, 2018. Available: https://www.healthcarevalues.ox.ac.uk/files/ampsummarypdf
[164]
A. Rimmer, ‘Situational judgment test is scrapped under new system for allocating foundation training places’, BMJ, June 2023, doi: 10.1136/bmj.p1269
[165]
D. W. McKinley and J. J. Norcini, ‘How to set standards on performance-based examinations: AMEE Guide No. 85’, Medical Teacher, vol. 36, no. 2, pp. 97–110, Feb. 2014, doi: 10.3109/0142159X.2013.853119
[166]
M. F. Ben-David, ‘AMEE Guide No. 18: Standard setting in student assessment’, Medical Teacher, vol. 22, no. 2, pp. 120–130, Jan. 2000, doi: 10.1080/01421590078526
[167]
A. F. De Champlain, ‘Standard Setting Methods in Medical Education’, in Understanding Medical Education, T. Swanwick, Ed., Oxford, UK: Wiley-Blackwell, 2019, pp. 347–359. doi: 10.1002/9781119373780.ch24. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://onlinelibrary.wiley.com/doi/10.1002/9781119373780.ch24
[168]
J. Cohen-Schotanus and C. P. M. van der Vleuten, ‘A standard setting method with the best performing students as point of reference: Practical and affordable’, Medical Teacher, vol. 32, no. 2, pp. 154–160, Jan. 2010, doi: 10.3109/01421590903196979
[169]
S. M. Downing, A. Tekian, and R. Yudkowsky, ‘Procedures for Establishing Defensible Absolute Passing Scores on Performance Examinations in Health Professions Education’, Teaching and Learning in Medicine, vol. 18, no. 1, pp. 50–57, Jan. 2006, doi: 10.1207/s15328015tlm1801_11
[170]
W. K. B. Hofstee, ‘The Case for Compromise in Educational Selection and Grading’, On Educational Testing, 1984, Available: https://benwilbrink.nl/publicaties/83hofstee_compromise.htm
[171]
S. L. Fowell, R. Fewtrell, and P. J. McLaughlin, ‘Estimating the Minimum Number of Judges Required for Test-centred Standard Setting on Written Assessments. Do Discussion and Iteration have an Influence?’, Advances in Health Sciences Education, vol. 13, no. 1, pp. 11–24, Mar. 2008, doi: 10.1007/s10459-006-9027-1
[172]
C. A. Taylor, ‘Development of a modified Cohen method of standard setting’, Medical Teacher, vol. 33, no. 12, pp. e678–e682, Dec. 2011, doi: 10.3109/0142159X.2011.611192
[173]
A. Karantonis and S. G. Sireci, ‘The Bookmark Standard-Setting Method: A Literature Review’, Educational Measurement: Issues and Practice, vol. 25, no. 1, pp. 4–12, Mar. 2006, doi: 10.1111/j.1745-3992.2006.00047.x
[174]
J. Puryer and D. O’Sullivan, ‘An introduction to standard setting methods in dentistry’, British Dental Journal, vol. 219, no. 7, pp. 355–358, Oct. 2015, doi: 10.1038/sj.bdj.2015.755
[175]
A. M. J. Linn, A. Tonkin, and P. Duggan, ‘Standard setting of script concordance tests using an adapted Nedelsky approach’, Medical Teacher, vol. 35, no. 4, pp. 314–319, Apr. 2013, doi: 10.3109/0142159X.2012.746446
[176]
L. Woodhouse, ‘Comparison of Cohen and Angoff methods of standard setting: is Angoff worth it?’, European Board of Medical Assessors Annual Academic Conference: Crossing Boundaries: Assessment in Medical Education, Available: https://eprints.ncl.ac.uk/251674
[177]
B. S. Plake, J. C. Impara, and P. M. Irwin, ‘Consistency of Angoff-Based Predictions of Item Performance: Evidence of Technical Quality of Results from the Angoff Standard Setting Method’, Journal of Educational Measurement, vol. 37, no. 4, 2000, Available: https://www.jstor.org/stable/1435245
[178]
I. C. McManus, ‘Implementing statistical equating for MRCP(UK) parts 1 and 2’, BMC Medical Education, vol. 14, no. 1, 2014, doi: 10.1186/1472-6920-14-204. Available: https://bmcmededuc.biomedcentral.com/articles/10.1186/1472-6920-14-204
[179]
D. F. Wood, ‘Formative Assessment’, in Understanding Medical Education, T. Swanwick, Ed., Oxford, UK: John Wiley & Sons, Ltd, 2019, pp. 317–328. doi: 10.1002/9781119373780.ch25. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://onlinelibrary.wiley.com/doi/abs/10.1002/9781119373780.ch25
[180]
R. G. Bing-You, ‘Why Medical Educators May Be Failing at Feedback’, JAMA, vol. 302, no. 12, Sept. 2009, doi: 10.1001/jama.2009.1393
[181]
S. Ramani and S. K. Krackov, ‘Twelve tips for giving feedback effectively in the clinical environment’, Medical Teacher, vol. 34, no. 10, pp. 787–791, Oct. 2012, doi: 10.3109/0142159X.2012.684916
[182]
J. M. M. Van De Ridder, K. M. Stokking, W. C. McGaghie, and O. T. J. Ten Cate, ‘What is feedback in clinical education?’, Medical Education, vol. 42, no. 2, pp. 189–197, Jan. 2008, doi: 10.1111/j.1365-2923.2007.02973.x
[183]
A. Rushton, ‘Formative assessment: a key to deep learning?’, Medical Teacher, vol. 27, no. 6, pp. 509–513, Sept. 2005, doi: 10.1080/01421590500129159
[184]
A. S. Liberman, M. Liberman, Y. Steinert, P. McLeod, and S. Meterissian, ‘Surgery residents and attending surgeons have different perceptions of feedback’, Medical Teacher, vol. 27, no. 5, pp. 470–472, Aug. 2005, doi: 10.1080/01421590500129183
[185]
L. E. Duers and N. Brown, ‘An exploration of student nurses’ experiences of formative assessment’, Nurse Education Today, vol. 29, no. 6, pp. 654–659, Aug. 2009, doi: 10.1016/j.nedt.2009.02.007
[186]
B. L. Olson and J. L. McDonald, ‘Influence of Online Formative Assessment Upon Student Learning in Biomedical Science Courses’, Journal of Dental Education, vol. 68, no. 6, pp. 656–659, June 2004, doi: 10.1002/j.0022-0337.2004.68.6.tb03783.x
[187]
C. Garrison and M. Ehringhaus, ‘Formative and Summative Assessments in the Classroom’, 2007, Available: https://www.amle.org/portals/0/pdf/articles/Formative_Assessment_Article_Aug2013.pdf
[188]
D. R. Sadler, ‘Formative Assessment: revisiting the territory’, Assessment in Education: Principles, Policy & Practice, vol. 5, no. 1, pp. 77–84, Mar. 1998, doi: 10.1080/0969595980050104
[189]
C. Ditchfield, ‘How do learners make sense of the formative assessment opportunities available to inform their learning in a PBL course?’, 2007.
[190]
M. R. Weaver, ‘Do students value feedback? Student perceptions of tutors’ written responses’, Assessment & Evaluation in Higher Education, vol. 31, no. 3, pp. 379–394, June 2006, doi: 10.1080/02602930500353061
[191]
G. Ferrell, ‘Supporting assessment and feedback practice with technology: from tinkering to transformation’. 2013. Available: https://repository.jisc.ac.uk/5450/
[192]
P. Black and D. Wiliam, ‘Assessment and Classroom Learning’, Assessment in Education: Principles, Policy & Practice, vol. 5, no. 1, pp. 7–74, Mar. 1998, doi: 10.1080/0969595980050102
[193]
D. R. Sadler, ‘Formative assessment and the design of instructional systems’, Instructional Science, vol. 18, no. 2, pp. 119–144, June 1989, doi: 10.1007/BF00117714
[194]
D. M. Elnicki, R. D. Layne, P. E. Ogden, and D. K. Morris, ‘Oral versus written feedback in medical clinic’, Journal of General Internal Medicine, vol. 13, no. 3, pp. 155–158, Mar. 1998, doi: 10.1046/j.1525-1497.1998.00049.x
[195]
W. A. Norcross, ‘The Consultation: An Approach to Learning and Teaching’, JAMA: The Journal of the American Medical Association, vol. 253, no. 3, Jan. 1985, doi: 10.1001/jama.1985.03350270123038
[196]
J. L. Jackson, C. Kay, W. C. Jackson, and M. Frank, ‘The Quality of Written Feedback by Attendings of Internal Medicine Residents’, Journal of General Internal Medicine, vol. 30, no. 7, pp. 973–978, July 2015, doi: 10.1007/s11606-015-3237-2
[197]
J. Rudolph, D. Raemer, and J. Shapiro, ‘We know what they did wrong, but not why: the case for “frame-based” feedback’, The Clinical Teacher, vol. 10, no. 3, pp. 186–189, June 2013, doi: 10.1111/j.1743-498X.2012.00636.x
[198]
G. Scally and L. J. Donaldson, ‘Looking forward: Clinical governance and the drive for quality improvement in the new NHS in England’, BMJ, vol. 317, no. 7150, pp. 61–65, July 1998, doi: 10.1136/bmj.317.7150.61
[199]
S. Shaw, ‘Research governance: where did it come from, what does it mean?’, Journal of the Royal Society of Medicine, vol. 98, no. 11, pp. 496–502, Nov. 2005, Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1275997/pdf/496.pdf
[200]
G. Gibbs, ‘Dimensions of Quality’. Higher Education Academy, 2010. Available: https://www.advance-he.ac.uk/knowledge-hub/dimensions-quality
[201]
D. Anderson and L. S. Ackerman-Anderson, Beyond change management: how to achieve breakthrough results through conscious change leadership, 2nd ed. San Francisco: Pfeiffer, 2010. Available: https://ebookcentral.proquest.com/lib/gla/detail.action?docID=624404
[202]
D. Hay, I. Kinchin, and S. Lygo‐Baker, ‘Making learning visible: the role of concept mapping in higher education’, Studies in Higher Education, vol. 33, no. 3, pp. 295–311, June 2008, doi: 10.1080/03075070802049251
[203]
D. B. Hay, P. L. Tan, and E. Whaites, ‘Non‐traditional learners in higher education: comparison of a traditional MCQ examination with concept mapping to assess learning in a dental radiological science course’, Assessment & Evaluation in Higher Education, vol. 35, no. 5, pp. 577–595, Aug. 2010, doi: 10.1080/02602931003782525
[204]
L. Richstone, M. J. Schwartz, C. Seideman, J. Cadeddu, S. Marshall, and L. R. Kavoussi, ‘Eye Metrics as an Objective Assessment of Surgical Skill’, Annals of Surgery, vol. 252, no. 1, pp. 177–182, July 2010, doi: 10.1097/SLA.0b013e3181e464fb
[205]
N. Suetsugu, M. Ohki, and T. Kaku, ‘Quantitative Analysis of Nursing Observation Employing a Portable Eye-Tracker’, Open Journal of Nursing, vol. 6, no. 1, pp. 53–61, 2016, doi: 10.4236/ojn.2016.61006
[206]
J. Gould and P. Day, ‘Hearing you loud and clear: student perspectives of audio feedback in higher education’, Assessment & Evaluation in Higher Education, vol. 38, no. 5, pp. 554–566, Aug. 2013, doi: 10.1080/02602938.2012.660131
[207]
J. Frost, G. de Pont, and I. Brailsford, ‘Expanding assessment methods and moments in history’, Assessment & Evaluation in Higher Education, vol. 37, no. 3, pp. 293–304, May 2012, doi: 10.1080/02602938.2010.531247
[208]
C. J. Harrison, A. J. Molyneux, S. Blackwell, and V. J. Wass, ‘How we give personalised audio feedback after summative OSCEs’, Medical Teacher, vol. 37, no. 4, pp. 323–326, Apr. 2015, doi: 10.3109/0142159X.2014.932901
[209]
S. Voelkel and L. V. Mello, ‘Audio Feedback – Better Feedback?’, Bioscience Education, vol. 22, no. 1, pp. 16–30, July 2014, doi: 10.11120/beej.2014.00022
[210]
H. Ashraf, M. H. Sodergren, N. Merali, G. Mylonas, H. Singh, and A. Darzi, ‘Eye-tracking technology in medical education: A systematic review’, Medical Teacher, vol. 40, no. 1, pp. 62–69, Jan. 2018, doi: 10.1080/0142159X.2017.1391373
[211]
R. E. Mayer, ‘Cognitive Learning’, in Encyclopedia of the Sciences of Learning, [S.l.]: Springer, 2012. Available: https://ezproxy.lib.gla.ac.uk/login?url=https://link.springer.com/referenceworkentry/10.1007/978-1-4419-1428-6_390
[212]
V. W. Ho, P. G. Harris, R. K. Kumar, and G. M. Velan, ‘Knowledge maps: a tool for online assessment with automated feedback’, Medical Education Online, vol. 23, no. 1, Jan. 2018, doi: 10.1080/10872981.2018.1457394
[213]
S. Y. Guraya, ‘The Desired Concept Maps and Goal Setting for Assessing Professionalism in Medicine’, Journal of Clinical and Diagnostic Research, 2016, doi: 10.7860/JCDR/2016/19917.7832
[214]
S. E. Kassab, M. Fida, A. Radwan, A. B. Hassan, M. Abu-Hijleh, and B. P. O’Connor, ‘Generalisability theory analyses of concept mapping assessment scores in a problem-based medical curriculum’, Medical Education, vol. 50, no. 7, pp. 730–737, July 2016, doi: 10.1111/medu.13054
[215]
O. Courteille et al., ‘The use of a virtual patient case in an OSCE-based exam – A pilot study’, Medical Teacher, vol. 30, no. 3, pp. e66–e76, Jan. 2008, doi: 10.1080/01421590801910216
[216]
S. M. Downing, ‘Guessing on selected-response examinations’, Medical Education, vol. 37, no. 8, pp. 670–671, Aug. 2003, doi: 10.1046/j.1365-2923.2003.01585.x
[217]
I. Tonni, C. C. Gadbury-Amyot, M. Govaerts, O. ten Cate, J. Davis, L. T. Garcia, and R. W. Valachovic, ‘ADEA-ADEE Shaping the Future of Dental Education III’, Journal of Dental Education, vol. 84, no. 1, pp. 97–104, 2020, doi: 10.1002/jdd.12024