Anderson, D., & Ackerman-Anderson, L. S. (2010). Beyond change management: how to achieve breakthrough results through conscious change leadership (2nd ed., Vol. 36) [Electronic resource]. Pfeiffer. https://ebookcentral.proquest.com/lib/gla/detail.action?docID=624404
Anderson, L. W., & Bloom, B. S. (2001). A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives (Abridged ed.). Longman.
Archer, J. C. (2005). Use of SPRAT for peer review of paediatricians in training. BMJ, 330(7502), 1251–1253. https://doi.org/10.1136/bmj.38447.610451.8F
Arnold, L., Shue, C. K., Kalishman, S., Prislin, M., Pohl, C., Pohl, H., & Stern, D. T. (2007). Can There Be a Single System for Peer Assessment of Professionalism among Medical Students? A Multi-Institutional Study. Academic Medicine, 82(6), 578–586. https://doi.org/10.1097/ACM.0b013e3180555d4e
Arnold, L., Shue, C. K., Kritt, B., Ginsburg, S., & Stern, D. T. (2005). Medical students’ views on peer assessment of professionalism. Journal of General Internal Medicine, 20(9), 819–824. https://doi.org/10.1111/j.1525-1497.2005.0162.x
Ashraf, H., Sodergren, M. H., Merali, N., Mylonas, G., Singh, H., & Darzi, A. (2018). Eye-tracking technology in medical education: A systematic review. Medical Teacher, 40(1), 62–69. https://doi.org/10.1080/0142159X.2017.1391373
Association for the Study of Medical Education. (2019). Understanding medical education: evidence, theory, and practice (T. Swanwick, K. Forrest, & B. C. O’Brien, Eds.; 3rd ed.). Wiley-Blackwell. https://ezproxy.lib.gla.ac.uk/login?url=https://dx.doi.org/10.1002/9781119373780
Auewarakul, C., Downing, S. M., Praditsuwan, R., & Jaturatamrong, U. (2005). Item Analysis to Improve Reliability for an Internal Medicine Undergraduate OSCE. Advances in Health Sciences Education, 10(2), 105–113. https://doi.org/10.1007/s10459-005-2315-3
Aylward, M., Nixon, J., & Gladding, S. (2014). An Entrustable Professional Activity (EPA) for Handoffs as a Model for EPA Assessment Development. Academic Medicine, 89(10), 1335–1340. https://doi.org/10.1097/ACM.0000000000000317
Plake, B. S., Impara, J. C., & Irwin, P. M. (2000). Consistency of Angoff-Based Predictions of Item Performance: Evidence of Technical Quality of Results from the Angoff Standard Setting Method. Journal of Educational Measurement, 37(4). https://www.jstor.org/stable/1435245
Ben-David, M. F. (2000). AMEE Guide No. 18: Standard setting in student assessment. Medical Teacher, 22(2), 120–130. https://doi.org/10.1080/01421590078526
Biggs, J. (1996). Enhancing Teaching through Constructive Alignment. Higher Education, 32(3), 347–364. https://www.jstor.org/stable/3448076
Bing-You, R. G. (2009). Why Medical Educators May Be Failing at Feedback. JAMA, 302(12). https://doi.org/10.1001/jama.2009.1393
Black, P., & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/0969595980050102
Boursicot, K. A. M., Roberts, T. E., & Burdick, W. P. (2018). Structured Assessments of Clinical Competence. In T. Swanwick, K. Forrest, & B. C. O’Brien (Eds.), Understanding Medical Education (pp. 335–345). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781119373780.ch23
Brookfield, S. (1987). Developing critical thinkers: challenging adults to explore alternative ways of thinking and acting. Open University Press.
Brown, C., Ross, S., Cleland, J., & Walsh, K. (2015). Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Medical Teacher, 37(7), 653–659. https://doi.org/10.3109/0142159X.2015.1033389
Buckley, S., Coleman, J., Davison, I., Khan, K. S., Zamora, J., Malick, S., Morley, D., Pollard, D., Ashcroft, T., Popovic, C., & Sayers, J. (2009). The educational effects of portfolios on undergraduate student learning: A Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 11. Medical Teacher, 31(4), 282–298. https://doi.org/10.1080/01421590902889897
Burls, A. (2009). What is critical appraisal? https://www.academia.edu/92786872/What_Is_Critical_Appraisal
Carraccio, C., Wolfsthal, S. D., Englander, R., Ferentz, K., & Martin, C. (2002). Shifting Paradigms: From Flexner to Competencies. Academic Medicine, 77(5). https://journals.lww.com/academicmedicine/Fulltext/2002/05000/Shifting_Paradigms__From_Flexner_to_Competencies.3.aspx
Case, S. M., & Swanson, D. B. (1993). Extended‐matching items: A practical alternative to free‐response questions. Teaching and Learning in Medicine, 5(2), 107–115. https://doi.org/10.1080/10401339309539601
CASP Critical Appraisal Skills Programme Oxford UK. (n.d.). http://www.casp-uk.net/
Charlin, B., Roy, L., Brailovsky, C., Goulet, F., & van der Vleuten, C. (2000). The Script Concordance Test: A Tool to Assess the Reflective Clinician. Teaching and Learning in Medicine, 12(4), 189–195. https://doi.org/10.1207/S15328015TLM1204_5
Cleland, J. A., Knight, L. V., Rees, C. E., Tracey, S., & Bond, C. M. (2008). Is it me or is it them? Factors that influence the passing of underperforming students. Medical Education, 42(8), 800–809. https://doi.org/10.1111/j.1365-2923.2008.03113.x
Cobb, K. A., Brown, G., Jaarsma, D. A. D. C., & Hammond, R. A. (2013). The educational impact of assessment: A comparison of DOPS and MCQs. Medical Teacher, 35(11), e1598–e1607. https://doi.org/10.3109/0142159X.2013.803061
Cochrane | Trusted evidence. Informed decisions. Better health. (n.d.). http://www.cochrane.org/
Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education (8th ed.). Routledge. https://www.vlebooks.com/vleweb/product/openreader?id=GlasgowUni&isbn=9781315456522
Cohen-Schotanus, J., & van der Vleuten, C. P. M. (2010). A standard setting method with the best performing students as point of reference: Practical and affordable. Medical Teacher, 32(2), 154–160. https://doi.org/10.3109/01421590903196979
Cotton, D. R. E., Cotton, P. A., & Shipway, J. R. (2024). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International, 61(2), 228–239. https://doi.org/10.1080/14703297.2023.2190148
Courteille, O., Bergin, R., Stockeld, D., Ponzer, S., & Fors, U. (2008). The use of a virtual patient case in an OSCE-based exam – A pilot study. Medical Teacher, 30(3), e66–e76. https://doi.org/10.1080/01421590801910216
Cruess, R. (2006). The Professionalism Mini-Evaluation Exercise: A Preliminary Investigation. Academic Medicine, 81(10). https://journals.lww.com/academicmedicine/Fulltext/2006/10001/The_Professionalism_Mini_Evaluation_Exercise__A.19.aspx
Cruess, R. L., Cruess, S. R., & Steinert, Y. (2016). Amending Miller’s Pyramid to Include Professional Identity Formation. Academic Medicine, 91(2), 180–185. https://doi.org/10.1097/ACM.0000000000000913
Da Silva, A., & Dennick, R. (2010). Corpus analysis of problem-based learning transcripts: an exploratory study. Medical Education, 44(3), 280–288. https://doi.org/10.1111/j.1365-2923.2009.03575.x
De Champlain, A. F. (2019). Standard Setting Methods in Medical Education. In T. Swanwick (Ed.), Understanding Medical Education (pp. 347–359). Wiley-Blackwell. https://doi.org/10.1002/9781119373780.ch24
De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2011). Assessing collaboration in a wiki: The reliability of university students’ peer assessment. The Internet and Higher Education, 14(4), 201–206. https://doi.org/10.1016/j.iheduc.2011.07.003
Denison, A., Bate, E., & Thompson, J. (2016). Tablet versus paper marking in assessment: feedback matters. Perspectives on Medical Education, 5(2), 108–113. https://doi.org/10.1007/s40037-016-0262-8
Devine, O. P., Harborne, A. C., & McManus, I. C. (2015). Assessment at UK medical schools varies substantially in volume, type and intensity and correlates with postgraduate attainment. BMC Medical Education, 15(1). https://doi.org/10.1186/s12909-015-0428-9
Dimensions of Quality. (2010). https://www.advance-he.ac.uk/knowledge-hub/dimensions-quality
Ditchfield, C. (2007). How do learners make sense of the formative assessment opportunities available to inform their learning in a PBL course?
Dory, V., Gagnon, R., Vanpee, D., & Charlin, B. (2012). How to construct and implement script concordance tests: insights from a systematic review. Medical Education, 46(6), 552–563. https://doi.org/10.1111/j.1365-2923.2011.04211.x
dos S. Ribeiro, C., van de Burgwal, L. H. M., & Regeer, B. J. (2019). Overcoming challenges for designing and implementing the One Health approach: A systematic review of the literature. One Health, 7. https://doi.org/10.1016/j.onehlt.2019.100085
Downing, S. M. (2003). Guessing on selected-response examinations. Medical Education, 37(8), 670–671. https://doi.org/10.1046/j.1365-2923.2003.01585.x
Downing, S. M., Tekian, A., & Yudkowsky, R. (2006). RESEARCH METHODOLOGY: Procedures for Establishing Defensible Absolute Passing Scores on Performance Examinations in Health Professions Education. Teaching and Learning in Medicine, 18(1), 50–57. https://doi.org/10.1207/s15328015tlm1801_11
Driessen, E., & van Tartwijk, J. (2013). Portfolios in personal and professional development. In T. Swanwick (Ed.), Understanding Medical Education (3rd ed., pp. 255–262). Wiley-Blackwell. https://doi.org/10.1002/9781119373780.ch18
Driessen, E. W., Muijtjens, A. M. M., van Tartwijk, J., & van der Vleuten, C. P. M. (2007). Web- or paper-based portfolios: is there a difference? Medical Education, 41(11), 1067–1073. https://doi.org/10.1111/j.1365-2923.2007.02859.x
Duers, L. E., & Brown, N. (2009). An exploration of student nurses’ experiences of formative assessment. Nurse Education Today, 29(6), 654–659. https://doi.org/10.1016/j.nedt.2009.02.007
Durning, S. J., Cation, L. J., Markert, R. J., & Pangaro, L. N. (2002). Assessing the Reliability and Validity of the Mini-Clinical Evaluation Exercise for Internal Medicine Residency Training. Academic Medicine, 77(9). https://journals.lww.com/academicmedicine/pages/articleviewer.aspx?year=2002&issue=09000&article=00020&type=abstract
Elnicki, D. M., Layne, R. D., Ogden, P. E., & Morris, D. K. (1998). Oral versus written feedback in medical clinic. Journal of General Internal Medicine, 13(3), 155–158. https://doi.org/10.1046/j.1525-1497.1998.00049.x
Elstein, A. S., Sprafka, S. A., & Shulman, L. S. (1978). Medical Problem Solving: An Analysis of Clinical Reasoning. Harvard University Press.
Epstein, R. M. (2002). Defining and Assessing Professional Competence. JAMA, 287(2). https://doi.org/10.1001/jama.287.2.226
Epstein, R. M. (2007). Assessment in Medical Education. New England Journal of Medicine, 356(4), 387–396. https://doi.org/10.1056/NEJMra054784
Ertmer, P. A., Richardson, J. C., Belland, B., Camin, D., Connolly, P., Coulthard, G., Lei, K., & Mong, C. (2007). Using Peer Feedback to Enhance the Quality of Student Online Postings: An Exploratory Study. Journal of Computer-Mediated Communication, 12(2), 412–433. https://doi.org/10.1111/j.1083-6101.2007.00331.x
Eva, K. W., Rosenfeld, J., Reiter, H. I., & Norman, G. R. (2004). An admissions OSCE: the multiple mini-interview. Medical Education, 38(3), 314–326. https://doi.org/10.1046/j.1365-2923.2004.01776.x
Farmer, E. A., & Page, G. (2005). A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education, 39(12), 1188–1194. https://doi.org/10.1111/j.1365-2929.2005.02339.x
Fenderson, B. (1997). The virtues of extended matching and uncued tests as alternatives to multiple choice questions. Human Pathology, 28(5), 526–532. https://doi.org/10.1016/S0046-8177(97)90073-3
Ferrell, G. (2013). Supporting assessment and feedback practice with technology: from tinkering to transformation. https://repository.jisc.ac.uk/5450/
Finn, G., Sawdon, M., Clipsham, L., & McLachlan, J. (2009). Peer estimation of lack of professionalism correlates with low Conscientiousness Index scores. Medical Education, 43(10), 960–967. https://doi.org/10.1111/j.1365-2923.2009.03453.x
Fournier, J., Demeester, A., & Charlin, B. (2008). Script Concordance Tests: Guidelines for Construction. BMC Medical Informatics and Decision Making, 8. https://doi.org/10.1186/1472-6947-8-18
Fowell, S. L., Fewtrell, R., & McLaughlin, P. J. (2008). Estimating the Minimum Number of Judges Required for Test-centred Standard Setting on Written Assessments. Do Discussion and Iteration have an Influence? Advances in Health Sciences Education, 13(1), 11–24. https://doi.org/10.1007/s10459-006-9027-1
Frost, J., de Pont, G., & Brailsford, I. (2012). Expanding assessment methods and moments in history. Assessment & Evaluation in Higher Education, 37(3), 293–304. https://doi.org/10.1080/02602938.2010.531247
Garrison, C., & Ehringhaus, M. (2007). Formative and Summative Assessments in the Classroom. https://www.amle.org/portals/0/pdf/articles/Formative_Assessment_Article_Aug2013.pdf
Garrison, D. R. (1991). Critical thinking and adult education: a conceptual model for developing critical thinking in adult learners. International Journal of Lifelong Education, 10(4), 287–303. https://doi.org/10.1080/0260137910100403
Gaufberg, E., & Fitzpatrick, A. (2008). The favour: a professional boundaries OSCE station. Medical Education, 42(5), 529–530. https://doi.org/10.1111/j.1365-2923.2008.03067.x
Ginsburg, S. (2000). Context, Conflict, and Resolution: A New Conceptual Framework for Evaluating Professionalism. Academic Medicine, 75(10). https://journals.lww.com/academicmedicine/Fulltext/2000/10001/Context,_Conflict,_and_Resolution__A_New.3.aspx
Ginsburg, S., Regehr, G., & Lingard, L. (2004). Basing the evaluation of professionalism on observable behaviours: a cautionary tale. Academic Medicine, 79(10), S1–S4. https://ezproxy.lib.gla.ac.uk/login?url=https://journals.lww.com/academicmedicine/Fulltext/2004/10001/Basing_the_Evaluation_of_Professionalism_on.1.aspx
Ginsburg, S., Regehr, G., & Mylopoulos, M. (2009). From behaviours to attributions: further concerns regarding the evaluation of professionalism. Medical Education, 43(5), 414–425. https://doi.org/10.1111/j.1365-2923.2009.03335.x
Ginsburg, S., van der Vleuten, C., Eva, K. W., & Lingard, L. (2016). Hedging to save face: a linguistic analysis of written comments on in-training evaluation reports. Advances in Health Sciences Education, 21(1), 175–188. https://doi.org/10.1007/s10459-015-9622-0
GMC. (n.d.). Development of generic professional capabilities. General Medical Council. http://www.gmc-uk.org/education/23581.asp
Goldie, J. (2013). Assessment of professionalism: A consolidation of current thinking. Medical Teacher, 35(2), e952–e956. https://doi.org/10.3109/0142159X.2012.714888
Gorania, R. (2021). Situational judgement stress. British Dental Journal, 231(8), 426–426. https://doi.org/10.1038/s41415-021-3577-8
Gould, J., & Day, P. (2013). Hearing you loud and clear: student perspectives of audio feedback in higher education. Assessment & Evaluation in Higher Education, 38(5), 554–566. https://doi.org/10.1080/02602938.2012.660131
Gravina, E. W. (2017). Competency-Based Education and Its Effect on Nursing Education: A Literature Review. Teaching and Learning in Nursing, 12(2), 117–121. https://doi.org/10.1016/j.teln.2016.11.004
Guraya, S. Y. (2016). The Desired Concept Maps and Goal Setting for Assessing Professionalism in Medicine. Journal of Clinical and Diagnostic Research. https://doi.org/10.7860/JCDR/2016/19917.7832
Hagel, C. M., Hall, A. K., & Dagnone, J. D. (2016). Queen’s University Emergency Medicine Simulation OSCE: an Advance in Competency-Based Assessment. CJEM, 18(3), 230–233. https://doi.org/10.1017/cem.2015.34
Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Applied Measurement in Education, 15(3), 309–333. https://doi.org/10.1207/S15324818AME1503_5
Hammick, M., Dornan, T., & Steinert, Y. (2010). Conducting a best evidence systematic review. Part 1: From idea to data coding. BEME Guide No. 13. Medical Teacher, 32(1), 3–15. https://doi.org/10.3109/01421590903414245
Harden, R. M. (2007). Learning outcomes as a tool to assess progression. Medical Teacher, 29(7), 678–682. https://doi.org/10.1080/01421590701729955
Harden, R. M. (2015). Misconceptions and the OSCE. Medical Teacher, 37(7), 608–610. https://doi.org/10.3109/0142159X.2015.1042443
Harden, R. M. (2016). Revisiting ‘Assessment of clinical competence using an objective structured clinical examination (OSCE)’. Medical Education, 50(4), 376–379. https://doi.org/10.1111/medu.12801
Harden, R. M., Lilley, P., Patricio, M., & Norman, G. R. (2016). The definitive guide to the OSCE: the Objective Structured Clinical Examination as a performance assessment. Elsevier. https://www.vlebooks.com/vleweb/product/openreader?id=GlasgowUni&isbn=9780702055492
Harden, R. M., Stevenson, M., Downie, W. W., & Wilson, G. M. (1975). Assessment of clinical competence using objective structured examination. BMJ, 1(5955), 447–451. https://doi.org/10.1136/bmj.1.5955.447
Harrison, C. J., Molyneux, A. J., Blackwell, S., & Wass, V. J. (2015). How we give personalised audio feedback after summative OSCEs. Medical Teacher, 37(4), 323–326. https://doi.org/10.3109/0142159X.2014.932901
Hauer, K. E., Soni, K., Cornett, P., Kohlwes, J., Hollander, H., Ranji, S. R., ten Cate, O., Widera, E., Calton, B., & O’Sullivan, P. S. (2013). Developing Entrustable Professional Activities as the Basis for Assessment of Competence in an Internal Medicine Residency: A Feasibility Study. Journal of General Internal Medicine, 28(8), 1110–1114. https://doi.org/10.1007/s11606-013-2372-x
Hay, D. B., Tan, P. L., & Whaites, E. (2010). Non‐traditional learners in higher education: comparison of a traditional MCQ examination with concept mapping to assess learning in a dental radiological science course. Assessment & Evaluation in Higher Education, 35(5), 577–595. https://doi.org/10.1080/02602931003782525
Hay, D., Kinchin, I., & Lygo‐Baker, S. (2008). Making learning visible: the role of concept mapping in higher education. Studies in Higher Education, 33(3), 295–311. https://doi.org/10.1080/03075070802049251
Hift, R. J. (2014). Should essays and other “open-ended”-type questions retain a place in written summative assessment in clinical medicine? BMC Medical Education, 14(1). https://doi.org/10.1186/s12909-014-0249-2
Ho, V. W., Harris, P. G., Kumar, R. K., & Velan, G. M. (2018). Knowledge maps: a tool for online assessment with automated feedback. Medical Education Online, 23(1). https://doi.org/10.1080/10872981.2018.1457394
Hodges, B. D. (2017). A practical guide for medical teachers (J. A. Dent, R. M. Harden, & D. Hunt, Eds.; 5th ed.). Elsevier. https://www.vlebooks.com/vleweb/product/openreader?id=GlasgowUni&isbn=9780702068935
Hodges, B. D., Ginsburg, S., Cruess, R., Cruess, S., Delport, R., Hafferty, F., Ho, M.-J., Holmboe, E., Holtman, M., Ohbu, S., Rees, C., Ten Cate, O., Tsugawa, Y., Van Mook, W., Wass, V., Wilkinson, T., & Wade, W. (2011). Assessment of professionalism: Recommendations from the Ottawa 2010 Conference. Medical Teacher, 33(5), 354–363. https://doi.org/10.3109/0142159X.2011.577300
Hodges, B., & McIlroy, J. H. (2003). Analytic global OSCE ratings are sensitive to level of training. Medical Education, 37(11), 1012–1016. https://doi.org/10.1046/j.1365-2923.2003.01674.x
Hodges, B., Regehr, G., McNaughton, N., Tiberius, R., & Hanson, M. (1999). OSCE checklists do not capture increasing levels of expertise. Academic Medicine, 74(10). https://journals.lww.com/academicmedicine/abstract/1999/10000/osce_checklists_do_not_capture_increasing_levels.17.aspx
Hofstee, W. K. B. (1984). The Case for Compromise in Educational Selection and Grading. On Educational Testing. https://benwilbrink.nl/publicaties/83hofstee_compromise.htm
Holmboe, E. S., Huot, S., Chung, J., Norcini, J., & Hawkins, R. E. (2003). Construct Validity of the Mini-Clinical Evaluation Exercise (Mini-CEX). Academic Medicine, 78(8). https://journals.lww.com/academicmedicine/pages/articleviewer.aspx?year=2003&issue=08000&article=00018&type=abstract
Holmboe, E. S., & Durning, S. J. (2024). Practical Guide to the Assessment of Clinical Competence (3rd ed.).
Hopwood, J., Myers, G., & Sturrock, A. (2021). Twelve tips for conducting a virtual OSCE. Medical Teacher, 43(6), 633–636. https://doi.org/10.1080/0142159X.2020.1830961
Horsley, T., Hyde, C., Santesso, N., Parkes, J., Milne, R., & Stewart, R. (2011). Teaching critical appraisal skills in healthcare settings. Cochrane Database of Systematic Reviews. https://doi.org/10.1002/14651858.CD001270.pub2
Huang, G. C., Newman, L. R., & Schwartzstein, R. M. (2014). Critical Thinking in Health Professions Education: Summary and Consensus Statements of the Millennium Conference 2011. Teaching and Learning in Medicine, 26(1), 95–102. https://doi.org/10.1080/10401334.2013.857335
Humphrey-Murto, S., Côté, M., Pugh, D., & Wood, T. J. (2018). Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise. Teaching and Learning in Medicine, 30(2), 152–161. https://doi.org/10.1080/10401334.2017.1387553
Hurst, Y. K., Prescott-Clements, L. E., & Rennie, J. S. (2004). The patient assessment questionnaire: A new instrument for evaluating the interpersonal skills of vocational dental practitioners. British Dental Journal, 197(8), 497–500. https://doi.org/10.1038/sj.bdj.4811750
McManus, I. C. (2014). Implementing statistical equating for MRCP(UK) parts 1 and 2. BMC Medical Education, 14(1). https://bmcmededuc.biomedcentral.com/articles/10.1186/1472-6920-14-204
Tonni, I., Gadbury‐Amyot, C. C., Govaerts, M., ten Cate, O., Davis, J., Garcia, L. T., & Valachovic, R. W. (2020). ADEA‐ADEE Shaping the Future of Dental Education III. Journal of Dental Education, 84(1), 97–104. https://doi.org/10.1002/jdd.12024
Boyle, J. G. (2020). Viva la VOSCE? BMC Medical Education, 20(1). https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-020-02444-3
Jackson, J. L., Kay, C., Jackson, W. C., & Frank, M. (2015). The Quality of Written Feedback by Attendings of Internal Medicine Residents. Journal of General Internal Medicine, 30(7), 973–978. https://doi.org/10.1007/s11606-015-3237-2
Jenicek, M. (2006). The hard art of soft science: Evidence-Based Medicine, Reasoned Medicine or both? Journal of Evaluation in Clinical Practice, 12(4), 410–419. https://doi.org/10.1111/j.1365-2753.2006.00718.x
Joanna Briggs Institute QARI. (n.d.). https://jbi.global/
Jolly, B. (2019). Written Assessment. In T. Swanwick (Ed.), Understanding Medical Education (3rd ed., pp. 261–261). Wiley-Blackwell. https://doi.org/10.1002/9781119373780.ch21
Jolly, B., & Dalton, M. J. (2018). Written Assessment. In T. Swanwick, K. Forrest, & B. C. O’Brien (Eds.), Understanding Medical Education (3rd ed., pp. 291–317). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781119373780.ch21
Kakadia, R., Chen, E., & Ohyama, H. (2020). Implementing an online OSCE during the COVID‐19 pandemic. Journal of Dental Education. https://doi.org/10.1002/jdd.12323
Karantonis, A., & Sireci, S. G. (2006). The Bookmark Standard-Setting Method: A Literature Review. Educational Measurement: Issues and Practice, 25(1), 4–12. https://doi.org/10.1111/j.1745-3992.2006.00047.x
Kassab, S. E., Fida, M., Radwan, A., Hassan, A. B., Abu-Hijleh, M., & O’Connor, B. P. (2016). Generalisability theory analyses of concept mapping assessment scores in a problem-based medical curriculum. Medical Education, 50(7), 730–737. https://doi.org/10.1111/medu.13054
Kee, F., & Bickle, I. (2004). Critical thinking and critical appraisal: the chicken and the egg? QJM, 97(9), 609–614. https://doi.org/10.1093/qjmed/hch099
Kelly, M., O’Flynn, S., McLachlan, J., & Sawdon, M. A. (2012). The Clinical Conscientiousness Index. Academic Medicine, 87(9), 1218–1224. https://doi.org/10.1097/ACM.0b013e3182628499
Kessel, D., Jenkins, J., & Neville, E. (2012). Workplace based assessments are no more. BMJ. https://doi.org/10.1136/bmj.e6193
Kirkpatrick, D. (1996). Great Ideas Revisited: Revisiting Kirkpatrick’s Four-Level Model. Training and Development, 50(1), 54–59. https://ezproxy.lib.gla.ac.uk/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=tfh&AN=9602066395&site=ehost-live
Kogan, J. R., Bellini, L. M., & Shea, J. A. (2003). Feasibility, Reliability, and Validity of the Mini-Clinical Evaluation Exercise (mCEX) in a Medicine Core Clerkship. Academic Medicine, 78(10). https://journals.lww.com/academicmedicine/Fulltext/2003/10001/Feasibility,_Reliability,_and_Validity_of_the.11.aspx
Lane, P. (2005). Recruitment into training for general practice—the winds of change or a breath of fresh air? BMJ, 331(7520), s153–s153. https://doi.org/10.1136/bmj.331.7520.s153
Linn, A. M. J., Tonkin, A., & Duggan, P. (2013). Standard setting of script concordance tests using an adapted Nedelsky approach. Medical Teacher, 35(4), 314–319. https://doi.org/10.3109/0142159X.2012.746446
Lubarsky, S., Dory, V., Meterissian, S., Lambert, C., & Gagnon, R. (2018). Examining the effects of gaming and guessing on script concordance test scores. Perspectives on Medical Education, 7(3), 174–181. https://doi.org/10.1007/s40037-018-0435-8
Ma, I. W. Y., Zalunardo, N., Pachev, G., Beran, T., Brown, M., Hatala, R., & McLaughlin, K. (2012). Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation. Advances in Health Sciences Education, 17(4), 457–470. https://doi.org/10.1007/s10459-011-9322-3
Marshall, S. (Ed.). (2020). A handbook for teaching and learning in higher education: enhancing academic practice (5th ed.). Routledge. https://ebookcentral.proquest.com/lib/gla/detail.action?docID=5983041
Martinsen, S. S. S., Espeland, T., Berg, E. A. R., Samstad, E., Lillebo, B., & Slørdahl, T. S. (2021). Examining the educational impact of the mini-CEX: a randomised controlled study. BMC Medical Education, 21(1). https://doi.org/10.1186/s12909-021-02670-3
Mayer, R. E. (2012). Cognitive Learning [Electronic resource]. In Encyclopedia of the sciences of learning. Springer. https://ezproxy.lib.gla.ac.uk/login?url=https://link.springer.com/referenceworkentry/10.1007/978-1-4419-1428-6_390
McCormack, W. T., Lazarus, C., Stern, D., & Small, P. A. (2007). Peer Nomination: A Tool for Identifying Medical Student Exemplars in Clinical Competence and Caring, Evaluated at Three Medical Schools. Academic Medicine, 82(11), 1033–1039. https://doi.org/10.1097/01.ACM.0000285345.75528.ee
McCoubrie, P. (2004). Improving the fairness of multiple-choice questions: a literature review. Medical Teacher, 26(8), 709–712. https://doi.org/10.1080/01421590400013495
McKinley, D. W., & Norcini, J. J. (2014). How to set standards on performance-based examinations: AMEE Guide No. 85. Medical Teacher, 36(2), 97–110. https://doi.org/10.3109/0142159X.2013.853119
McLachlan, J. C., Finn, G., & Macnaughton, J. (2009). The Conscientiousness Index: A Novel Tool to Explore Students’ Professionalism. Academic Medicine, 84(5), 559–565. https://doi.org/10.1097/ACM.0b013e31819fb7ff
Meskell, P., Burke, E., Kropmans, T. J. B., Byrne, E., Setyonugroho, W., & Kennedy, K. M. (2015). Back to the future: An online OSCE Management Information System for nursing OSCEs. Nurse Education Today, 35(11), 1091–1096. https://doi.org/10.1016/j.nedt.2015.06.010
Miller, G. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9). https://ezproxy.lib.gla.ac.uk/login?url=https://oce.ovid.com/article/00001888-199009000-00045/PDF
Miller, M. D., Linn, R. L., & Gronlund, N. E. (2013). Measurement and assessment in teaching (11th ed., International ed). Pearson Education.
Missimer, C. A. (1995). Good arguments: an introduction to critical thinking (3rd ed.). Prentice Hall.
Mitchell, C., Bhat, S., Herbert, A., & Baker, P. (2011). Workplace-based assessments of junior doctors: do scores predict training difficulties? Medical Education, 45(12), 1190–1198. https://doi.org/10.1111/j.1365-2923.2011.04056.x
Moore, T. J. (2011). Critical thinking and disciplinary thinking: a continuing debate. Higher Education Research & Development, 30(3), 261–274. https://doi.org/10.1080/07294360.2010.501328
Murphy, D. J., Bruce, D. A., Mercer, S. W., & Eva, K. W. (2009). The reliability of workplace-based assessment in postgraduate medical education and training: a national evaluation in general practice in the United Kingdom. Advances in Health Sciences Education, 14(2), 219–232. https://doi.org/10.1007/s10459-008-9104-8
Newble, D. I., & Swanson, D. B. (1988). Psychometric characteristics of the objective structured clinical examination. Medical Education, 22(4), 325–334. https://doi.org/10.1111/j.1365-2923.1988.tb00761.x
Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090
Noel, G. L. (1992). How Well Do Internal Medicine Faculty Members Evaluate the Clinical Skills of Residents? Annals of Internal Medicine, 117(9). https://doi.org/10.7326/0003-4819-117-9-757
Norcini, J., Anderson, B., Bollela, V., Burch, V., Costa, M. J., Duvivier, R., Galbraith, R., Hays, R., Kent, A., Perrott, V., & Roberts, T. (2011). Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Medical Teacher, 33(3), 206–214. https://doi.org/10.3109/0142159X.2011.551559
Norcini, J., & Burch, V. (2007). Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher, 29(9–10), 855–871. https://doi.org/10.1080/01421590701775453
Norcini, J. J. (2003a). Peer assessment of competence. Medical Education, 37(6), 539–543. https://doi.org/10.1046/j.1365-2923.2003.01536.x
Norcini, J. J. (2003b). The Mini-CEX: A Method for Assessing Clinical Skills. Annals of Internal Medicine, 138(6). https://doi.org/10.7326/0003-4819-138-6-200303180-00012
Norcross, W. A. (1985). The Consultation: An Approach to Learning and Teaching. JAMA: The Journal of the American Medical Association, 253(3). https://doi.org/10.1001/jama.1985.03350270123038
Olson, B. L., & McDonald, J. L. (2004). Influence of Online Formative Assessment Upon Student Learning in Biomedical Science Courses. Journal of Dental Education, 68(6), 656–659. https://doi.org/10.1002/j.0022-0337.2004.68.6.tb03783.x
Orsini, C., & Binnie, V. I. (2016). Entrustment decisions in dental education: Is it time to start formalising? Medical Teacher, 38(3), 322–322. https://doi.org/10.3109/0142159X.2015.1114598
Palmer, E. J., & Devitt, P. G. (2007). Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions?: research paper. BMC Medical Education, 7(1). https://doi.org/10.1186/1472-6920-7-49
Paniagua, M., & Swygert, K. (Eds.). (n.d.). The Gold Book - constructing written test questions for the basic and clinical sciences. https://www.nbme.org/sites/default/files/2020-01/IWW_Gold_Book.pdf
Papadakis, M. A., Teherani, A., Banach, M. A., Knettler, T. R., Rattner, S. L., Stern, D. T., Veloski, J. J., & Hodgson, C. S. (2005). Disciplinary Action by Medical Boards and Prior Behavior in Medical School. New England Journal of Medicine, 353(25), 2673–2682. https://doi.org/10.1056/NEJMsa052596
Patterson, F., Zibarras, L., & Ashworth, V. (2016). Situational judgement tests in medical education and training: Research, theory and practice: AMEE Guide No. 100. Medical Teacher, 38(1), 3–17. https://doi.org/10.3109/0142159X.2015.1072619
Paul, R. (1995). Critical thinking: how to prepare students for a rapidly changing world. Foundation for Critical Thinking.
Paul, R., & Elder, L. (2006). The Miniature Guide to Critical Thinking: Concepts and Tools. https://www.criticalthinking.org/files/Concepts_Tools.pdf
Peters, H., Holzhausen, Y., Boscardin, C., ten Cate, O., & Chen, H. C. (2017). Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Medical Teacher, 39(8), 802–807. https://doi.org/10.1080/0142159X.2017.1331031
Piedra, N., Chicaiza, J., Lopez, J., Romero, A., & Tovar, E. (2010). Measuring collaboration and creativity skills through rubrics: Experience from UTPL collaborative social networks course. IEEE EDUCON 2010 Conference, 1511–1516. https://doi.org/10.1109/EDUCON.2010.5492349
Pohl, C. A., Hojat, M., & Arnold, L. (2011). Peer Nominations as Related to Academic Attainment, Empathy, Personality, and Specialty Interest. Academic Medicine, 86(6), 747–751. https://doi.org/10.1097/ACM.0b013e318217e464
Puryer, J., & O’Sullivan, D. (2015). An introduction to standard setting methods in dentistry. BDJ, 219(7), 355–358. https://doi.org/10.1038/sj.bdj.2015.755
Quantrill, S. J., & Tun, J. K. (2012). Workplace-based assessment as an educational tool. Guide supplement 31.5 – Viewpoint. Medical Teacher, 34(5), 417–418. https://doi.org/10.3109/0142159X.2012.668234
Ramani, S., & Krackov, S. K. (2012). Twelve tips for giving feedback effectively in the clinical environment. Medical Teacher, 34(10), 787–791. https://doi.org/10.3109/0142159X.2012.684916
Ramsey, P. G. (1993). Use of Peer Ratings to Evaluate Physician Performance. JAMA: The Journal of the American Medical Association, 269(13). https://doi.org/10.1001/jama.1993.03500130069034
Read, E. K., Bell, C., Rhind, S., & Hecker, K. G. (2015). The Use of Global Rating Scales for OSCEs in Veterinary Medicine. PLOS ONE, 10(3). https://doi.org/10.1371/journal.pone.0121000
Regehr, G., MacRae, H., Reznick, R. K., & Szalay, D. (1998). Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Academic Medicine. https://www.ncbi.nlm.nih.gov/pubmed/9759104
Rekman, J., Hamstra, S. J., Dudek, N., Wood, T., Seabrook, C., & Gofton, W. (2016). A New Instrument for Assessing Resident Competence in Surgical Clinic: The Ottawa Clinic Assessment Tool. Journal of Surgical Education, 73(4), 575–582. https://doi.org/10.1016/j.jsurg.2016.02.003
Richstone, L., Schwartz, M. J., Seideman, C., Cadeddu, J., Marshall, S., & Kavoussi, L. R. (2010). Eye Metrics as an Objective Assessment of Surgical Skill. Annals of Surgery, 252(1), 177–182. https://doi.org/10.1097/SLA.0b013e3181e464fb
Rimmer, A. (2023). Situational judgment test is scrapped under new system for allocating foundation training places. BMJ. https://doi.org/10.1136/bmj.p1269
Ross, M. (2015). Entrustable professional activities. The Clinical Teacher, 12(4), 223–225. https://doi.org/10.1111/tct.12436
Royal College of Physicians. (2005). Doctors in Society: Medical professionalism in a changing world. https://cdn.shopify.com/s/files/1/0924/4392/files/doctors_in_society_reportweb.pdf?15745311214883953343
Royal College of Physicians. (2018). Advancing medical professionalism. https://www.healthcarevalues.ox.ac.uk/files/ampsummarypdf
Rudolph, J., Raemer, D., & Shapiro, J. (2013). We know what they did wrong, but not why: the case for ‘frame-based’ feedback. The Clinical Teacher, 10(3), 186–189. https://doi.org/10.1111/j.1743-498X.2012.00636.x
Rushforth, H. E. (2007). Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Education Today, 27(5), 481–490. https://doi.org/10.1016/j.nedt.2006.08.009
Rushton, A. (2005). Formative assessment: a key to deep learning? Medical Teacher, 27(6), 509–513. https://doi.org/10.1080/01421590500129159
Ryan, A., Carson, A., Reid, K., Smallwood, D., & Judd, T. (2020). Fully online OSCEs: A large cohort case study. MedEdPublish, 9(1). https://doi.org/10.15694/mep.2020.000214.1
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. https://doi.org/10.1007/BF00117714
Sadler, D. R. (1998). Formative Assessment: revisiting the territory. Assessment in Education: Principles, Policy & Practice, 5(1), 77–84. https://doi.org/10.1080/0969595980050104
Sam, A. H., Field, S. M., Collares, C. F., van der Vleuten, C. P. M., Wass, V. J., Melville, C., Harris, J., & Meeran, K. (2018). Very-short-answer questions: reliability, discrimination and acceptability. Medical Education, 52(4), 447–455. https://doi.org/10.1111/medu.13504
Scally, G., & Donaldson, L. J. (1998). Looking forward: Clinical governance and the drive for quality improvement in the new NHS in England. BMJ, 317(7150), 61–65. https://doi.org/10.1136/bmj.317.7150.61
Schoonheim-Klein, M., Muijtjens, A., Habets, L., Manogue, M., van der Vleuten, C., & van der Velden, U. (2009). Who will pass the dental OSCE? Comparison of the Angoff and the borderline regression standard setting methods. European Journal of Dental Education, 13(3), 162–171. https://doi.org/10.1111/j.1600-0579.2008.00568.x
Schubert, S., Ortwein, H., Dumitsch, A., Schwantes, U., Wilhelm, O., & Kiessling, C. (2008). A situational judgement test of professional behaviour: development and validation. Medical Teacher, 30(5), 528–533. https://doi.org/10.1080/01421590801952994
Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2003). ABC of learning and teaching in medicine: Written assessment. BMJ, 326(7390), 643–645. https://doi.org/10.1136/bmj.326.7390.643
Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2004). Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education, 38(9), 974–979. https://doi.org/10.1111/j.1365-2929.2004.01916.x
Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2011). General overview of the theories used in assessment: AMEE Guide No. 57. Medical Teacher, 33(10), 783–797. https://doi.org/10.3109/0142159X.2011.611022
Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2012). Programmatic assessment and Kane’s validity perspective. Medical Education, 46(1), 38–48. https://doi.org/10.1111/j.1365-2923.2011.04098.x
Schuwirth, L. W., & van der Vleuten, C. P. (2019). How to Design a Useful Test: The Principles of Assessment. In T. Swanwick (Ed.), Understanding Medical Education (3rd ed., pp. 277–289). Wiley-Blackwell. https://doi.org/10.1002/9781119373780.ch20
Sender Liberman, A., Liberman, M., Steinert, Y., McLeod, P., & Meterissian, S. (2005). Surgery residents and attending surgeons have different perceptions of feedback. Medical Teacher, 27(5), 470–472. https://doi.org/10.1080/01421590500129183
Shaw, S. (2005). Research governance: where did it come from, what does it mean? Journal of the Royal Society of Medicine, 98(11), 496–502. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1275997/pdf/496.pdf
Siau, K., Dunckley, P., Valori, R., Feeney, M., Hawkes, N., Anderson, J., Beales, I., Wells, C., Thomas-Gibson, S., & Johnson, G. (2018). Changes in scoring of Direct Observation of Procedural Skills (DOPS) forms and the impact on competence assessment. Endoscopy, 50(08), 770–778. https://doi.org/10.1055/a-0576-6667
Snell, L. S., & Frank, J. R. (2010). Competencies, the tea bag model, and the end of time. Medical Teacher, 32(8), 629–630. https://doi.org/10.3109/0142159X.2010.500707
Spielman, A., Fulmer, T., Eisenberg, E., & Alfano, M. (2005). Dentistry, Nursing, and Medicine: A Comparison of Core Competencies. Journal of Dental Education, 69(11), 1257–1271. https://pubmed.ncbi.nlm.nih.gov/16275689/
Stern, D. T. (2006). Measuring medical professionalism [Electronic resource]. Oxford University Press. https://ebookcentral.proquest.com/lib/gla/detail.action?docID=3053707
Stern, D. T., Frohna, A. Z., & Gruppen, L. D. (2005). The prediction of professional behaviour. Medical Education, 39(1), 75–82. https://doi.org/10.1111/j.1365-2929.2004.02035.x
Sturpe, D. A. (2010). Objective Structured Clinical Examinations in Doctor of Pharmacy Programs in the United States. American Journal of Pharmaceutical Education, 74(8). https://doi.org/10.5688/aj7408148
Suetsugu, N., Ohki, M., & Kaku, T. (2016). Quantitative Analysis of Nursing Observation Employing a Portable Eye-Tracker. Open Journal of Nursing, 06(01), 53–61. https://doi.org/10.4236/ojn.2016.61006
Sutherland, R. M., Reid, K. J., Chiavaroli, N. G., Smallwood, D., & McColl, G. J. (2019). Assessing Diagnostic Reasoning Using a Standardized Case-Based Discussion. Journal of Medical Education and Curricular Development, 6. https://doi.org/10.1177/2382120519849411
Tavakol, M., & Doody, G. A. (2016). A novel psychometric programme for the rapid analysis of OSCE data. Medical Teacher, 38(1), 104–105. https://doi.org/10.3109/0142159X.2015.1062085
Taylor, C. A. (2011). Development of a modified Cohen method of standard setting. Medical Teacher, 33(12), e678–e682. https://doi.org/10.3109/0142159X.2011.611192
Tekian, A., Ten Cate, O., Holmboe, E., Roberts, T., & Norcini, J. (2020). Entrustment decisions: Implications for curriculum development and assessment. Medical Teacher, 1–7. https://doi.org/10.1080/0142159X.2020.1733506
ten Cate, O. (2013). Nuts and Bolts of Entrustable Professional Activities. Journal of Graduate Medical Education, 5(1), 157–158. https://doi.org/10.4300/JGME-D-12-00380.1
ten Cate, O. (2017). Competency-Based Postgraduate Medical Education: Past, Present and Future. GMS Journal for Medical Education, 34(5). https://doi.org/10.3205/zma001146
ten Cate, O., & Young, J. Q. (2012). The patient handover as an entrustable professional activity: adding meaning in teaching and practice. BMJ Quality & Safety, 21(Suppl 1), i9–i12. https://doi.org/10.1136/bmjqs-2012-001213
The Campbell Collaboration. (n.d.). http://www.campbellcollaboration.org/
Torsney, K. M., Cocker, D. M., & Slesser, A. A. P. (2015). The Modern Surgeon and Competency Assessment: Are the Workplace-Based Assessments Evidence-Based? World Journal of Surgery, 39(3), 623–633. https://doi.org/10.1007/s00268-014-2875-6
van de Ridder, J. M. M., Stokking, K. M., McGaghie, W. C., & ten Cate, O. T. J. (2008). What is feedback in clinical education? Medical Education, 42(2), 189–197. https://doi.org/10.1111/j.1365-2923.2007.02973.x
van der Vleuten, C. P. M. (1996). The assessment of professional competence: Developments, research and practical implications. Advances in Health Sciences Education, 1(1), 41–67. https://doi.org/10.1007/BF00596229
van der Vleuten, C. P. M., & Schuwirth, L. W. T. (2005). Assessing professional competence: from methods to programmes. Medical Education, 39(3), 309–317. https://doi.org/10.1111/j.1365-2929.2005.02094.x
van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Dijkstra, J., Tigelaar, D., Baartman, L. K. J., & van Tartwijk, J. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205–214. https://doi.org/10.3109/0142159X.2012.652239
van Mook, W. N. K. A., Gorter, S. L., O’Sullivan, H., Wass, V., Schuwirth, L. W., & van der Vleuten, C. P. M. (2009). Approaches to professional behaviour assessment: Tools in the professionalism toolbox. European Journal of Internal Medicine, 20(8), e153–e157. https://doi.org/10.1016/j.ejim.2009.07.012
van Mook, W. N. K. A., van Luijk, S. J., O’Sullivan, H., Wass, V., Schuwirth, L. W., & van der Vleuten, C. P. M. (2009). General considerations regarding assessment of professional behaviour. European Journal of Internal Medicine, 20(4), e90–e95. https://doi.org/10.1016/j.ejim.2008.11.011
Verkerk, M. A., de Bree, M. J., & Mourits, M. J. E. (2007). Reflective professionalism: interpreting CanMEDS’ ‘professionalism’. Journal of Medical Ethics, 33(11), 663–666. https://doi.org/10.1136/jme.2006.017954
Voelkel, S., & Mello, L. V. (2014). Audio Feedback – Better Feedback? Bioscience Education, 22(1), 16–30. https://doi.org/10.11120/beej.2014.00022
Watson, R., Stimpson, A., Topping, A., & Porock, D. (2002). Clinical competence assessment in nursing: a systematic review of the literature. Journal of Advanced Nursing, 39(5), 421–431. https://doi.org/10.1046/j.1365-2648.2002.02307.x
Weaver, M. R. (2006). Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31(3), 379–394. https://doi.org/10.1080/02602930500353061
Wilkinson, T. J., Wade, W. B., & Knock, L. D. (2009). A Blueprint to Assess Professionalism: Results of a Systematic Review. Academic Medicine, 84(5), 551–558. https://doi.org/10.1097/ACM.0b013e31819fbaa2
Williams, D. M., Davies, S., Horner, M., & Handley, J. (2016). Peer and near-peer OSCE examiners. Medical Teacher, 38(2), 212–213. https://doi.org/10.3109/0142159X.2015.1072266
Williams, R. G., Verhulst, S., Colliver, J. A., & Dunnington, G. L. (2005). Assuring the reliability of resident performance appraisals: More items or more observations? Surgery, 137(2), 141–147. https://doi.org/10.1016/j.surg.2004.06.011
Wood, D. F. (2019). Formative Assessment. In T. Swanwick (Ed.), Understanding Medical Education (pp. 317–328). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781119373780.ch25
Wood, T. J., Humphrey-Murto, S. M., & Norman, G. R. (2006). Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method. Advances in Health Sciences Education, 11(2), 115–122. https://doi.org/10.1007/s10459-005-7853-1
Wood, T. J., & Pugh, D. (2020). Are rating scales really better than checklists for measuring increasing levels of expertise? Medical Teacher, 42(1), 46–51. https://doi.org/10.1080/0142159X.2019.1652260
Woodhouse, L. (n.d.). Comparison of Cohen and Angoff methods of standard setting: is Angoff worth it? European Board of Medical Assessors Annual Academic Conference: Crossing Boundaries: Assessment in Medical Education. https://eprints.ncl.ac.uk/251674
Yardley, S., & Dornan, T. (2012). Kirkpatrick’s levels and education ‘evidence’. Medical Education, 46(1), 97–106. https://doi.org/10.1111/j.1365-2923.2011.04076.x
Zijlstra-Shaw, S., Robinson, P. G., & Roberts, T. (2012). Assessing professionalism within dental education; the need for a definition. European Journal of Dental Education, 16(1), e128–e136. https://doi.org/10.1111/j.1600-0579.2011.00687.x