Article

Standard setting for clinical competence at graduation from medical school: A comparison of passing scores across five medical schools

Journal

ADVANCES IN HEALTH SCIENCES EDUCATION
Volume 11, Issue 2, Pages 173-183

Publisher

SPRINGER
DOI: 10.1007/s10459-005-5291-8

Keywords

Angoff; assessment; high-stakes examinations; OSCEs; standard setting

Abstract

While Objective Structured Clinical Examinations (OSCEs) have become widely used to assess clinical competence at the end of undergraduate medical courses, the method of setting the passing score varies greatly, and there is no agreed best methodology. Although it is assumed that the passing standard at graduation is the same at all medical schools, there is very little quantitative evidence in the field. In the United Kingdom there is no national licensing examination; each medical school sets its own graduating assessment, and successful completion leads to the right to practise, licensed by the General Medical Council. Academics at five UK medical schools were asked to set passing scores for six OSCE stations using the Angoff method, following a briefing session on the technique. The results were collated and analysed. The passing scores set for each of the stations varied widely across the five medical schools. The implication for individual students is that a candidate with the same level of competence may pass at one medical school yet fail at another, even when the test is identical. Postulated reasons for this difference include differing conceptions of the minimal level of competence acceptable in graduating students and the possible unsuitability of the Angoff method for performance-based clinical tests.
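
The abstract does not give the computational details used in the study, but in the conventional Angoff procedure each judge estimates the probability that a borderline (minimally competent) candidate would succeed on each item of a station, and the station's passing score is taken as the mean of the judges' expected scores. The Python sketch below illustrates only that conventional calculation; the judge names and estimates are hypothetical and are not drawn from the paper.

    # Minimal sketch of a conventional Angoff cut-score calculation
    # (hypothetical data; not the study's actual ratings or procedure).
    from statistics import mean

    # Each judge's probability estimates (0-1) that a borderline candidate
    # would succeed on each of four items within one OSCE station.
    judge_estimates = {
        "judge_1": [0.80, 0.60, 0.70, 0.50],
        "judge_2": [0.75, 0.65, 0.60, 0.55],
        "judge_3": [0.85, 0.55, 0.65, 0.60],
    }

    def angoff_cut_score(estimates):
        """Return the station passing score as a percentage of the maximum.

        Each judge's estimates are summed over items to give that judge's
        expected score for a borderline candidate; the cut score is the
        mean of those expected scores, scaled to a percentage.
        """
        n_items = len(next(iter(estimates.values())))
        per_judge_expected = [sum(e) for e in estimates.values()]
        return 100 * mean(per_judge_expected) / n_items

    print(f"Station passing score: {angoff_cut_score(judge_estimates):.1f}%")
    # -> Station passing score: 65.0%

Running the same calculation independently at each school, with each school's own judges, would yield the school-specific station cut scores whose variation the paper reports.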

