ORIGINAL RESEARCH PAPER
Education for Health | Year: 2011 | Volume: 24 | Issue: 2 | Page: 421
Clinical Skills Assessment: Comparison of Student and Examiner Assessment in an Objective Structured Clinical Examination
F Jahan, S Sadaf, S Bhanji, N Naeem, R Qureshi
Aga Khan University Hospital (AKUH), Karachi, Pakistan
Date of Submission: 13-Nov-2009
Date of Acceptance: 07-Dec-2010
Date of Web Publication: 10-Aug-2011
Family Medicine Department, AKUH, Stadium Road, Karachi
Source of Support: None, Conflict of Interest: None
Background: Learning of basic clinical skills is introduced in Years 1 and 2 of the MBBS Program at the Aga Khan University, Pakistan, through a structured Clinical Skills Teaching program. Acquisition of competence in performing these skills is assessed through use of the Objective Structured Clinical Examination (OSCE). Self-assessment is defined broadly as the involvement of learners in judging whether or not learner-identified standards have been met.
Objective: We compared Year 2 students' self-assessment of clinical skills with examiners' assessment of performance in an OSCE using a standard rating scale.
Methods: A self-assessment questionnaire was completed by all Year 2 students immediately after the OSCE. Students assessed their performance at three stations, using a performance rating scale. Examiners observed and evaluated the students during history-taking and physical examination using the same rating scale.
Results: There were significant positive correlations between examiners' assessments of performance and students' self-assessed ratings in taking consent, obtaining demographic information, history of presenting problems and summarization. Significant differences were observed in pre-procedural skills, commenting on the prostate, liver palpation and percussion, and spleen percussion.
Conclusions: Findings highlight the strengths and weaknesses in clinical competence at the end of Year 2 and provide a direction to improve the gaps in the Clinical Skills Teaching program.
Keywords: Self-assessment, medical education, clinical skills, OSCE
How to cite this article:
Jahan F, Sadaf S, Bhanji S, Naeem N, Qureshi R. Clinical Skills Assessment: Comparison of Student and Examiner Assessment in an Objective Structured Clinical Examination. Educ Health 2011;24:421
Clinical skills and theoretical knowledge are two equally important parts of medical education [1]. Early introduction of clinical skills makes students more comfortable in performing patient assessment during their clerkship years [2]. It is important for undergraduate medical students to acquire interviewing/communication techniques and physical examination skills early in medical education, in order to adopt correct behaviors in approaching patients' problems [3],[4]. However, early introduction of clinical skills has several challenges which can affect student learning [5].
The objective structured clinical examination (OSCE) has been recognized not only as a useful assessment tool but also as a valuable method of promoting student learning. Multiple skills assessment can be performed by combining a global rating scale with a checklist in an OSCE. Student self-assessment is also viewed as a means of helping students recognize their strengths and weaknesses, understand the relevance of core learning objectives and take more responsibility in the medical encounter. In health professions education, self-assessment is a key step in the continuing professional development cycle.
Self-assessment of knowledge and accuracy of skill performance is essential to the practice of medicine and self-directed life-long learning [6]. However, overestimation of competencies can lead to misdiagnosis, inadequate performance and premature closure. Underestimation, on the other hand, usually leads to overuse of diagnostic tests, excessive uncertainty and unnecessary referrals. Comparing student with faculty assessments can focus attention on areas in need of feedback and provide input on the accuracy of self-assessment. It is critical to consider behavioral indications of awareness of the limits of self-assessment ability [7].
One major component of the curriculum at Aga Khan University (AKU) targets acquiring patient interviewing and communication skills, history-taking and development of physical examination skills for each body system. For Years 1 and 2, this has been done using simulated patients. Clinical skills teaching uses small-group demonstrations by facilitators, followed by performance by each student with feedback from peers, facilitators and patients. There are 12 modules in total for Years 1 and 2 in the clinical skills curriculum. These cover symptom-based history-taking, general physical examination, and examination of the abdomen (including digital rectal examination), chest, cardiovascular system, central nervous system and musculoskeletal system, as well as pelvic and breast examination. These clinical examinations and practices are performed on standardized patients, models, mannequins and peers.
At the end of Year 1, there is a formative clinical skills objective structured clinical examination, followed by feedback. At the end of Year 2, students attend a mandatory on-campus clinical rotation in ambulatory clinics and wards in the specialties of Surgery, Medicine and Family Medicine. This is the first encounter with real patients and is supervised by the respective department's clinical faculty. Students must pass the end of Year 2 summative clinical skills OSCE to be promoted to Year 3. An evaluation objective at this stage is to better understand how capable students are of self-assessment and to examine any differences between students' self-assessments and faculty assessments. This provides an opportunity to assist students in identifying their strengths and weaknesses. Within this context, in the present study, we compared self-assessments of clinical skills by Year 2 students with assessments of their performance by the examiners at the year-end OSCE, using a standard rating scale.
Study Design: Our cross-sectional study focused on second year medical students in the OSCE program in June 2009. Students had completed the scheduled and mandatory clinical skills training in the previous two years with clinical faculty. There were 16 stations in the OSCE, conducted in three parallel circuits to accommodate all eligible students in one day. Each OSCE circuit comprised stations on history-taking, communication skills, physical examination and procedures, with a rest station. The last four stations were designated as study stations designed to help the students identify strengths and weaknesses in their developing repertoire of clinical skills. These stations addressed: history of flank pain; examination of the liver and spleen; procedural skills in the digital rectal examination; and self-assessment (rest) station.
Three of the study stations involved working with a standardized patient (SP) and mannequin, which was a true replica of the OSCE. As mentioned, participating Year 2 students were asked to self-assess their competence in clinical skills using a rating scale also used by a faculty member, who assessed the same set of competences in clinical skills of the student.
Study Setting: The study took place at Aga Khan University-Medical College (AKU-MC), focusing on Year 2 students eligible for the OSCE. Students who had missed more than two clinical skills sessions were not eligible for the present study. Students were given two briefing sessions before the OSCE, where the goals and objectives of the study were explained, queries and concerns were addressed and consent for participation was taken. Participation by completing the study questionnaire was on a voluntary basis and students were assured that non-responders would not be penalized.
A total of 15 faculty examiners were nominated for the OSCE by their respective departments. There were three stations where the study was conducted, with examiners needed for each of the three parallel circuits, or nine examiners in all. Names of nominated faculty were listed in alphabetical order and, using SPSS 16.0 software, we selected the nine examiners through random sampling. All examiners were clinical teaching faculty and had attended the workshop 'Introductory Short Course in Clinical Teaching' at AKU. The primary investigator also conducted two sessions on 'Orientation to the OSCE Rating Scale' for the examiners prior to the OSCE, to standardize the process of assessment on the rating scale. The project was initiated after approval from the AKU Ethics Review committee. Consent for participation in the study was obtained from both faculty and students prior to the study.
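The selection step described above, drawing nine examiners at random from an alphabetized list of 15 nominees, can be reproduced with any standard sampling routine. A minimal Python sketch (the examiner names are placeholders, not the actual nominee list; the seed value is arbitrary):

```python
import random

# Placeholder names standing in for the 15 nominated faculty members.
nominees = sorted(f"Examiner {chr(c)}" for c in range(ord("A"), ord("A") + 15))

random.seed(2009)                      # fixed seed so the draw is reproducible
selected = random.sample(nominees, 9)  # simple random sample, no replacement
print(selected)
```

Sampling without replacement guarantees nine distinct examiners, one per study station in each of the three parallel circuits.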
Instrument and Data Collection: A rating scale consisting of items relevant to specific history-taking and physical examination skills was developed keeping in view the objectives of the clinical skills training received in the first two years of medical school. Face and content validity of each rating scale was established through review and consensus by a core group of senior faculty. Stations were selected to represent the curricular goals and objectives and to reflect authentic clinical situations. Rating scales were designed based on a literature search as well as through consultation with the faculty of the Department for Educational Development and Family Medicine. Based on discussions, consensus was achieved on the items and structure of the rating scale, which included the features viewed to be most important by the development committee. Generic aspects of history-taking, like questioning skills, professional manner and organization of interviews with time management and closing of interviews were included in the rating scale. Correctness of skills, purposeful and logical flow and completion of all relevant components were included as items in the scale.
The evaluation of performance was done using an eight-point response range consisting of not done, poor, marginal, satisfactory, good, very good, excellent and outstanding. Satisfactory was the minimum passing criterion. All attributes were assessed in the same manner. The rating scale also included assessment of overall performance on a global seven-point response range, from poor to outstanding. Additionally, the rating scale for the study station on history of a patient with flank pain included communication skills with items on meet and greet, self-introduction and taking consent before history-taking.
Collection of patient demographic details included age, occupation and residence or part of the city in which the person lived. Items related to details of flank pain included site, onset, duration, frequency, progression, character, radiation, intensity with pain score and aggravating and relieving factors. Associated symptoms were nausea, vomiting, fever and dysuria. History-taking also included past medical and surgical problems, family history and personal and drug history. At the end of history-taking, students needed to summarize the history to the patient.
The task at the station on digital rectal examination was performed on a mannequin. The rating scale for this station included an item where the student, at the start of the task, had to state that, had this been a real patient, s/he would greet the patient, explain the procedure, obtain consent and identify the need for a chaperone as prerequisites to the examination. Items on the rating scale also included mentioning the position of the patient, wearing gloves and using lubricant, inspecting the rectal area, inserting a finger into the rectum, commenting on the mucosa and prostate, withdrawing the finger and checking the finger stall.
The items on the rating scale for the station on examination of liver and spleen, performed on a simulated patient, included meet and greet, introduction of self, taking consent after explaining the exposure to the patient and asking about pain. Other items to be rated were: inspection from foot end and bedside; examination of the liver by palpation starting in right iliac fossa with correct hand positioning; palpation of the spleen starting from right iliac fossa diagonally towards left costal margin and turning the patient to right lateral position; and percussion to measure liver span and splenic enlargement.
Data Analysis: Results were analyzed using SPSS 16.0 for Windows. For each attribute, the mean and standard deviation of assessment scores for both students and faculty were calculated. These were tested for significant differences using a paired Student's t-test or the Wilcoxon signed-rank test, as appropriate. The level of significance was set at p < 0.05. Correlations between student and faculty evaluations were calculated using a Spearman rank-order correlation.
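The same three tests can be sketched in Python with SciPy as a stand-in for the SPSS procedures named above. The paired arrays below are illustrative, not the study's data; each index represents one student's rating on the same attribute by self and by faculty:

```python
import numpy as np
from scipy import stats

# Illustrative paired ratings on the 8-point scale (not the study's data).
student = np.array([5, 4, 5, 3, 4, 5, 4, 3, 5, 4], dtype=float)
faculty = np.array([4, 4, 3, 3, 4, 4, 3, 2, 4, 3], dtype=float)

# Paired Student's t-test for a mean difference between the two raters.
t_stat, t_p = stats.ttest_rel(student, faculty)

# Wilcoxon signed-rank test, the non-parametric alternative suited to
# ordinal rating-scale data (zero differences are dropped by default).
w_stat, w_p = stats.wilcoxon(student, faculty)

# Spearman rank-order correlation between student and faculty ratings.
rho, rho_p = stats.spearmanr(student, faculty)

print(f"paired t: t={t_stat:.2f}, p={t_p:.3f}")
print(f"Wilcoxon: W={w_stat:.2f}, p={w_p:.3f}")
print(f"Spearman: rho={rho:.2f}, p={rho_p:.3f}")
```

With real rating-scale data, the t-test and Wilcoxon test answer whether the two raters differ systematically, while the Spearman correlation answers whether they rank students similarly; the two questions can come apart, as the study's results show.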
There were 93 participating students, 41 female and 52 male. The nine examiners were from Surgery (four), Medicine (two), Anesthesiology (two) and Family Medicine (one).
First, the reliability of the rating scale was calculated to be 0.93 (Cronbach's alpha). In terms of overall performance, 93% of students scored above the minimum competence level, i.e., better than satisfactory, in their overall scores.
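For context, Cronbach's alpha for a rating scale compares the sum of the item variances with the variance of the total scores. A minimal sketch of the computation (the score matrix is made up for illustration, not the study's data):

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Made-up ratings for 6 students on 4 items of a rating scale.
scores = np.array([
    [5, 4, 5, 4],
    [3, 3, 4, 3],
    [4, 4, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 2, 3, 2],
])
print(round(cronbach_alpha(scores), 2))
```

When items move together across respondents, as in this toy matrix, the total-score variance dominates the summed item variances and alpha approaches 1; values above 0.9, like the 0.93 reported here, indicate high internal consistency.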
Table 1 presents results of both student and faculty assessments of competency in skill performance. Students most highly self-assessed their performance in relation to performing liver palpation and percussion (5.0 and 5.2, respectively), introducing themselves to patients (5.0) and obtaining consent (5.0). Faculty assessed the students significantly lower on each of these components.
Students' lower self-assessments included competencies in commenting on the mucosa (4.0), summarizing the history in general (4.0) and taking a personal history (4.1). Summarizing the history in general was also the component rated weakest by faculty (2.9). Faculty tended to assess students' competencies more highly in procedural skills (4.6) as well as in introductions to the patient, obtaining consent and taking a family history (4.4, 4.4 and 4.3, respectively). Overall, students rated their competencies significantly higher than faculty did on 12 of the 16 dimensions.
There was a statistically significant strong correlation (r=0.84; p<0.001) between students’ assessment on the seven-point global rating scale and the History station and moderate correlations for global rating and the Procedures (r=0.66; p<0.001) and Physical Examination stations (r=0.62; p<0.001). Correlations between faculty assessment on the standardized and global scale were statistically significant and very strong for all three study stations: History (r=0.89; p<0.001); Procedures (r=0.86; p<0.001); and Physical Examination (r=0.92; p<0.001). The overall correlation between student and faculty assessments was low and not significant (r=0.06; p=0.55).
Table 1: Comparison between student and faculty assessments of competency in clinical skills performance
Students seemed to demonstrate appropriate self-assessment skills in various attributes, such as introductions to patients, exploring presenting problems, drug and family history and pre-procedural skills, as students are usually motivated in focused practical sessions. It has been shown that early introduction of clinical skills can enhance students' learning interest and confidence [8]. Pre-clinical patient contact helps students to better understand their own weaknesses and their impact on a patient's care [9].
In our study, students overestimated their skills in history-taking and abdominal examination, which they had opportunities for practicing in the last two years through focused clinical skills sessions, including review sessions and on-campus clinical rotations. Students demonstrated appropriate self-evaluation skills and were quite motivated to participate in skills learning. As Regehr and Eva describe, self-assessment is a tool for self-regulating professionals, and usually relates to the frequency of performance, as in introductory history-taking, exploring presenting problems and taking drug and family histories [10].
On the other hand, students underestimated their skills mainly related to procedures, especially not being able to comment on the mucosa of the rectum and prostate. This may be attributed to the fact that they had practiced procedures on mannequins, where they focused more on the technical aspects of the procedure rather than the actual steps in a clinical setting with real patients. Self-assessment accuracy is stable and may be influenced by task familiarity [11],[12]. Longitudinal skills learning influences learning abilities, but also depends on teaching style and exposure in the specific areas.
Comparisons between student and examiner assessments showed significant differences in such areas as communication skills, exploring presenting problems and in liver examination and spleen percussion. The differences in procedural skills were mainly related to pre-procedural skills and prostate examination. One past study has shown that students tended to assess their skills much lower than expected by their teachers [13].
Overall, self-assessment provides students with the opportunity to identify their strengths and weaknesses [14]. Self-assessment as a tool for further learning and professional development, incorporated into structured clinical skills teaching in the early years of education, makes a great contribution to positive skill development in later years [15]. Self-assessment of knowledge and accuracy of skill performance is essential to the practice of medicine and self-directed life-long learning, which is an ultimate goal of medical education [16],[17].
Interestingly, our medical students were more confident in their communication skills while taking a history, in contrast to a study in Germany [18] where medical students evaluated their competence and found themselves to be deficient in communication skills and diagnosis. We found a significant positive relation between actual performance and self-rating in communication skills and history-taking.
The introduction of structured clinical skills teaching in the early years of education can contribute greatly to the development of students' clinical skills by helping them build on their strengths and address their learning needs [19]. Our study demonstrates the OSCE as a convenient tool for providing deeper insight into students' ability to prioritize, self-assess and promote their own learning [20]. Comparing medical students' self-assessment with examiners' assessment in the OSCE highlighted students' strengths and weaknesses, as well as challenges in curriculum implementation [21],[22].
Future cohorts' performance and self-assessment skills could be improved by conducting mid-clerkship formative assessments in order to identify and address weaknesses in a timely fashion.
Additionally, to strengthen the clinical skills curriculum and ensure consistent and uniform exposure to core competences, it is important that clinical skills teaching is accompanied with rigorous feedback and review sessions at the end of each teaching module.
The authors acknowledge contributions from the following faculty members: Dr. Rukhsana Zuberi, Associate Dean of Education (Family Medicine/Department for Educational Development); Dr. Rashida Ahmed, Associate Professor, Department. of Pathology and Microbiology; Dr. Naheed Nabi (Family Medicine); Dr. Aziza Hussaini (Anesthesia); Dr. Tanveer ul Rehman (Surgery); Dr. Tashfeen Ahmed (Surgery); Dr. Junaid Patel (Medicine); Dr. Kamran Hafeez (Surgery); Dr. Muhammad Shahid (Emergency Medicine); Dr. Iram Naz (Surgery); Dr. Muneer Amanullah (Cardiac Surgery); and Dr. Faisal Shamim (Anesthesia).
1. Remmen R, Scherpbier A, van der Vleuten C, Denekens J, Derese A, Hermann I, Hoogenboom R, Kramer A, Van Rossum H, Van Royen P, Bossaert L. Effectiveness of basic clinical skills training programs: A cross-sectional comparison of four medical schools. Medical Education. 2001; 35:121-128.
2. Diemers AD, Dolmans DHJM, Verwijnen MGM, Heineman E, Scherpbier AJJA. Students’ opinion about the effects of preclinical patient contacts on their learning. Advances in Health Sciences Education: Theory and Practice. 2008; 13(5):633-647.
3. Sanson-Fisher RW, Rolfe IE, Jones P, Ringland C, Agrez M. Trialling a new way to learn clinical skills: Systematic clinical appraisal and learning. Medical Education. 2002; 36(11):1028-1034.
4. Kamalski DM, Braak EW, Cate OT, Borleffs JC. Early clerkships. Medical Teacher. 2007; 29(9):915-920.
5. Lam TP, Irwin M, Chow LWC, Chan P. Early introduction of clinical skills teaching in a medical curriculum—Factors affecting students’ learning. Medical Education. 2002; 36(3):233-240.
6. Pierre RB, Wierenga A, Barton M, Thame K, Branday JM, Christie CDC. Student self-assessment in a paediatric objective structured clinical examination. West Indian Medical Journal. 2005; 54(2):144-148.
7. Eva KW, Regehr G. Knowing when to look it up: A new conception of self assessment ability. Academic Medicine. 2007; 82(10 suppl):S81-S84.
8. Makoul G, Altman M. Early assessment of medical students’ clinical skills. Academic Medicine. 2002; 77(11):1156.
9. Eva KW, Regehr G. Self-assessment in the health professions: A reformulation and research agenda. Academic Medicine. 2005; 80(10 suppl):46-54.
10. Regehr G, Eva K. Self assessment, self direction and self regulating professional. Clinical Orthopedics Related Research. 2006; 449:34-48.
11. Bianchi F, Stobbe K, Eva K. Comparing academic performance of medical students in distributed learning sites: The McMaster experience. Medical Teacher. 2008;30(1):67-71.
12. Tara M. Using assessment for learning and learning from assessment. Assessment Evaluation in Higher Education. 2002;27:501-510.
13. Sičaja M, Romić D, Prka Z. Medical students’ clinical skills do not match their teachers’ expectations: Survey at Zagreb University School of Medicine, Croatia. Croatian Medical Journal. 2006; 47(1):169-175.
14. Benbassat J, Baumal R. Enhancing self-awareness in medical students. An overview of teaching approaches. Academic Medicine. 2005; 80(2):156-161.
15. Gudal D, Ozcaker N, Yeniceri N, Dontlu C, Ulusel B. Comparison of clinical skills of 3rd-year students who completed structured clinical skills program with 6th-year students who acquired clinical skills in unsystematic way. Teaching and Learning in Medicine. 2005; 17(1):21-26.
16. Evans AW, McKenna C, Oliver M. Self assessment in medical practice. Journal of the Royal Society of Medicine. 2002; 95(10):511-513.
17. Moercke AM, Eika B. What are the clinical skills levels of newly graduated physicians? Self-assessment study of an intended curriculum identified by a Delphi process. Medical Education. 2002; 36(5):472-478.
18. Fischer T, Chenot JF, Simmenroth-Nayda A, Heinemann S, Kochen MM, Himmel W. Learning core clinical skills - A survey at 3 time points during medical education. Medical Teacher. 2007; 29(4):397-399.
19. Fitzgerald JT, Gruppen LD, White CB. The influence of task formats on the accuracy of medical students' self-assessments. Academic Medicine. 2000; 75:737-741.
20. Mattheos N, Nattestad A, Falk-Nilsson E, Attström R. The interactive examination: Assessing students’ self assessment ability. Medical Education. 2004;38(4):378-389.
21. Hodder RV, Rivington RN, Calcutt LE, Hart IR. The effectiveness of immediate feedback during the objective structured clinical examination. Medical Education. 1989;23(2):184-188.
22. Eva KW, Cunnington JP, Reiter HI, Keane DR, Norman GR. How can I know what I don't know? Poor self-assessment in a well defined domain. Advances in Health Sciences Education: Theory and Practice. 2004; 9:211-224.