The development, implementation and evaluation of a short course in objective structured clinical examination (OSCE) skills
Abstract
Background: Objective structured clinical examination (OSCE) examiner training is widely employed to address some of the reliability and validity issues that accompany the use of this assessment tool. An OSCE skills course was developed and implemented at the Stellenbosch Faculty of Health Sciences, and its influence on participants (clinicians) was evaluated.
Method: Participants attended the OSCE skills course, which included theoretical sessions on topics such as standard setting, examiner influence and assessment instruments, as well as two staged OSCEs, one at the beginning and the other at the end of the course. During the latter, each participant was video-recorded while examining a student role-player performing a technical skill. Participants’ behaviour and assessment results from the two OSCEs were evaluated, along with participant feedback on the course and group interviews with the student role-players.
Results: There was a significant improvement in inter-rater reliability, as well as a slight decrease in inappropriate examiner behaviour, such as teaching and prompting during assessment of students. Furthermore, overall feedback from participants and the perceptions of student role-players were positive.
Conclusions: In this study, examiner conduct and inter-rater reliability were positively influenced by the following interventions: examiner briefing, involvement of examiners in constructing assessment instruments, and examiners viewing (on DVD) and reflecting on their assessment behaviour. This study proposes that the development and implementation of an OSCE skills course is a worthwhile endeavour for improving the validity and reliability of the OSCE as an assessment tool.
Keywords: OSCE, Objective Structured Clinical Examination, examiner training, examiner conduct, inter-rater reliability, inter-rater agreement