Original Article

Evaluation of Checklist and Inter-Rater Agreement in Oral Case Presentation of Undergraduate Medical Students

Jungwon Huh, Miae Lee, Whasoon Chung
Department of Laboratory Medicine, School of Medicine, Ewha Womans University, Korea.

Copyright ⓒ 2007. Ewha Womans University School of Medicine. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Published Online: Mar 30, 2007

Abstract

Background

Undergraduate medical students should learn oral presentation skills, which are central to physician-physician communication. The purpose of this study was to compare checklist scores with global ratings for the evaluation of oral case presentations and to investigate inter-rater agreement in the scoring of checklists.

Methods

The study group included twenty-one teams of undergraduate medical students who completed a two-week clerkship in the Department of Laboratory Medicine, Mokdong Hospital, School of Medicine, Ewha Womans University, from January 2005 to October 2006. Three faculty raters independently evaluated oral case presentations using a checklist composed of five items. A consensus global rating was determined after discussion. Inter-rater agreement was measured using the intraclass correlation coefficient (ICC); ICC values approaching 1.0 indicate higher inter-rater agreement.
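The abstract does not state which ICC model was used. The minimal sketch below assumes a two-way random-effects, absolute-agreement, single-rater model, ICC(2,1), computed from ANOVA mean squares; the function name and example scores are hypothetical and only illustrate the calculation.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    scores: 2-D array with one row per presentation (target) and one
    column per rater.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape                      # n targets, k raters
    grand_mean = scores.mean()
    target_means = scores.mean(axis=1)       # mean per presentation
    rater_means = scores.mean(axis=0)        # mean per rater

    # Mean squares from a two-way ANOVA without replication
    ss_target = k * ((target_means - grand_mean) ** 2).sum()
    ss_rater = n * ((rater_means - grand_mean) ** 2).sum()
    ss_total = ((scores - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_target - ss_rater

    ms_target = ss_target / (n - 1)
    ms_rater = ss_rater / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_target - ms_error) / (
        ms_target + (k - 1) * ms_error + k * (ms_rater - ms_error) / n
    )

# Hypothetical checklist totals for five teams scored by three raters
example = [[12, 11, 10],
           [14, 13, 12],
           [ 9,  8,  9],
           [13, 12, 10],
           [11, 10,  9]]
print(round(icc_2_1(example), 3))
```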

Results

The mean of the consensus global ratings was significantly higher than the mean of the checklist scores by the three faculty raters (12.6±1.7 vs 11.1±2.0, P<0.001). Spearman's correlation coefficient between global ratings and checklist scores was r=0.82 (P<0.01). Overall checklist scores differed significantly among the three raters (12.3±2.0, 10.8±2.8, 10.0±2.7, P<0.05). ICC values for checklist scoring were 0.750 for overall scores and 0.350-0.753 for individual checklist items.
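The abstract does not specify how the paired comparison or the correlation was computed. A minimal sketch, assuming Spearman's rank correlation and a paired t-test on per-team scores, is shown below; the arrays are hypothetical placeholders, not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores per team (consensus global rating vs.
# mean checklist score); the real data are not given in the abstract.
global_ratings  = np.array([13, 14, 12, 11, 15, 12, 13, 10])
checklist_means = np.array([11, 12, 11, 10, 13, 10, 12,  9])

rho, p_rho = stats.spearmanr(global_ratings, checklist_means)
t, p_t = stats.ttest_rel(global_ratings, checklist_means)

print(f"Spearman rho = {rho:.2f} (P = {p_rho:.3f})")
print(f"Paired t-test: t = {t:.2f}, P = {p_t:.3f}")
```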

Conclusions

These results suggest that checklist scoring by faculty raters could be a useful tool for evaluating oral case presentations, provided that the checklist is revised to be less ambiguous and more objective, and that faculty raters are given opportunities for education and training in the evaluation of oral case presentations.

Keywords: Oral case presentation; Undergraduate medical students; Global rating; Checklist; Inter-rater agreement