Abstract
Study Objective: To assess the reliability and reproducibility of a recently instituted anesthesiology resident applicant interview scoring system at our institution.
Design: Retrospective evaluation of 2 years of interview data collected with a newly implemented scoring system using randomly assigned interviewing faculty.
Setting: Interview scoring evaluations were completed as standard practice in a large academic anesthesiology department.
Subjects: All anesthesiology resident applicants interviewed over the 2013/14 and 2014/15 seasons by a stable cohort of faculty interviewers. Data collection was blinded for both interviewers and interviewees.
Interventions: None specific to the study; collation of blinded data was already standard practice during the interview process and analysis.
Measurements: None specific to the study.
Main Results: Good inter-rater faculty reliability of day-of interview scoring and excellent inter-faculty reliability of pre-interview application review.
Conclusions: Development of a department-specific interview scoring system incorporating many elements beyond traditional standardized tests shows good to excellent reliability of faculty scoring of both the interview itself (including non-technical skills) and the application resume.
Original language | English (US)
---|---
Pages (from-to) | 131-136
Number of pages | 6
Journal | Journal of Clinical Anesthesia
Volume | 31
DOIs |
State | Published - Jun 1 2016
Keywords
- Applicant
- Faculty
- Interview
- Reliability
- Reproducibility
- Resident
ASJC Scopus subject areas
- Anesthesiology and Pain Medicine