ORIGINAL RESEARCH PAPER

Year: 2012 | Volume: 25 | Issue: 3 | Page: 148-152
Use of Computer-based Clinical Examination to Assess Medical Students in Surgery
Gamal E H A El Shallaly, Abdelrahman M Mekki
Alzaiem Alazhari University, Khartoum, Sudan
Date of Web Publication: 29-Mar-2013
Correspondence Address: Gamal E H A El Shallaly, Surgical Department, Alzaiem Alazhari University, PO Box 2910, Khartoum, Sudan
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/1357-6283.109789
Abstract

Introduction: To improve the viewing of the video-projected structured clinical examination (ViPSCE), we developed a computerized version: the computer-based clinical examination (CCE). This was used to assess medical students' higher knowledge and problem-solving skills in surgery. We present how we did this, test score descriptive statistics, and the students' evaluation of the CCE.

Methods: A CCE in surgery was administered to assess a class of 43 final-year medical students at the end of their surgical clerkship. Like the ViPSCE, the exam was delivered as a slide show using the PowerPoint computer program. However, instead of projecting it onto a screen, each student used a computer. There were 20 slides containing either still photos or short video clips of clinical situations in surgery. The students answered in handwriting on the exam papers. At the end, they completed evaluation forms. The exam papers were corrected manually. Test score descriptive statistics were calculated and correlated with the students' scores in other exams in surgery.

Results: Administration of the CCE was straightforward. The test scores were normally distributed (mean = median = 4.9). They correlated significantly with the total scores obtained by the students in surgery (r = 0.68), and with each of the other exam modalities in surgery, such as the multiple choice and structured essay questions. Acceptability of the CCE to the students was high, and they recommended the use of the CCE in other departments.

Discussion: The CCE is feasible and popular with students. It inherits the validity and reliability of the ViPSCE with the added advantage of improving the viewing of the slides.

Keywords: Assessment of medical students, computer-assisted assessment (CAA), computer-based assessment (CBA), e-assessment, end-of-surgical-clerkship examination, objective structured clinical examination (OSCE), undergraduate medical education, video-projected structured clinical examination (ViPSCE)
How to cite this article: El Shallaly GE, Mekki AM. Use of Computer-based Clinical Examination to Assess Medical Students in Surgery. Educ Health 2012;25:148-52.
Introduction
Alzaiem Alazhari is a governmental (public) university, named after the first president of the Republic of Sudan. It was established in 1994 and, to date, over 1400 doctors have graduated from its school of medicine. Sudan is a sub-Saharan African country with limited resources.
The authors invented the video-projected structured clinical examination (ViPSCE) in 2001. [1],[2] The ViPSCE replaced the oral (viva) examination as a student assessment tool in clinical surgery in our department. It has been useful for assessing students' cognitive skills, including problem-solving abilities. It proved to be valid, reliable, and practicable, and it saved money and time for both staff and students. [2] It is not surprising, therefore, that it rapidly became popular with teachers and students alike, both in other departments in our Faculty and in other medical schools across Sudan.
There were some limitations with the ViPSCE, however. The exam consisted of slides of still photographs or short video clips of patients, instruments, investigations, or procedures. One concern in administering the ViPSCE has been the visibility and clarity of the slides when projected on a screen. This was partially alleviated through several measures. We used a room with special characteristics: it was large enough to accommodate the increasing numbers of students, who could reach 120 to 150, and wide enough to accommodate a large screen visible from a wide angle. Light within the room could be dimmed near the screen to enhance visibility of the slides but was brighter elsewhere in the room to allow students to read and answer questions printed on paper. High-resolution video projectors with strong illumination power were used. Students with weak vision were allowed to sit in the first row near the screen.
We were fortunate to have an information technology (IT) center with 50 computers built near the medical school. It was thought that presenting the exam slides on computer screens would increase slide clarity for students and reduce the room requirements. The plan to transform the ViPSCE into a computer-based exam was thus born. The idea of using computers in the assessment of medical students is not new. We present a delivery model that could be employed at institutions with limited resources, provided the computers are available. The study was exempt from human subjects review.
Methods
The preparation of the exam
We assessed 43 medical students at the end of their surgical clerkship. The students were all of Sudanese nationality and within a narrow age range. The exam was in the form of a slide show. It consisted of 20 slides created with the PowerPoint (PPt) program (Microsoft Office 2007). The slides changed automatically every 3 minutes, and each change was marked by an applause sound to draw the students' attention to the appearance of the next slide.
Some of the slides contained still photographs; others showed short video clips lasting less than a minute. The material presented included photographs of patients, investigations such as X-rays with and without contrast, computed tomography (CT) scans, surgical instruments, catheters, and various surgical and anesthetic tubes. Video clips showed patients undergoing physical examination of conditions such as hernias, and procedures such as endoscopy and spinal anesthesia. All pictures and videos featured local patients who were treated in our governmental teaching hospital, after obtaining their consent. Pre-exam meetings, attended by all staff of the department of surgery, were held to make sure that the exam covered the entire surgery curriculum evenly, including emergency and nonemergency cases. The topics within surgery were covered evenly using the 'blueprint' we had already designed for the ViPSCE. [2] The distribution was as follows: 12 questions on general surgery, and 2 each on orthopedics, urology, pediatric surgery, and anesthesia.
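Assembling such a slide deck can also be scripted. The sketch below is purely illustrative and not part of the original study: it assumes the python-pptx library and hypothetical file names, and the 3-minute auto-advance with the applause cue would still be set through PowerPoint's own transition settings, which python-pptx does not expose.

# A minimal sketch, assuming python-pptx and hypothetical image/video files;
# the authors built the deck directly in PowerPoint 2007.
from pptx import Presentation
from pptx.util import Inches

prs = Presentation()
blank_layout = prs.slide_layouts[6]              # blank layout in the default template

# Hypothetical exam material: (kind, path) pairs for the 20 slides
exam_items = [
    ("photo", "slides/01_abdominal_xray.jpg"),
    ("video", "slides/02_hernia_examination.mp4"),
    # ... remaining 18 items ...
]

for kind, path in exam_items:
    slide = prs.slides.add_slide(blank_layout)
    if kind == "photo":
        slide.shapes.add_picture(path, Inches(0.5), Inches(0.5), height=Inches(6.5))
    else:
        slide.shapes.add_movie(path, Inches(0.5), Inches(0.5),
                               width=Inches(9), height=Inches(6.5),
                               mime_type="video/mp4")

prs.save("cce_surgery_exam.pptx")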
The examination paper consisted of questions related to the slides, with each question numbered to match its slide. These questions were constructed in an objective and structured way, mainly in the form of surgical problems. We avoided open-ended questions. They explored all levels of knowledge, from simple identification to problem solving. Examples of the questions students answered and their layout are shown in [Figure 1].
Figure 1: Examples of the questions students answered (objective and structured)
The students answered the questions in handwriting in the spaces allotted on the exam paper.
Administration of the exam
There were two computer rooms available to the department, each containing 25 computers. The 43 students were therefore divided into two groups. Each student had his or her own computer [Figure 2] and [Figure 3]. There were 2-3 IT tutors present in each room to help the students in case of technical problems. In addition, at least one consultant surgeon was present in each room. Both groups took the same test using the same administration approach.
Before the exam began, the students were briefed about it. Computer use was kept simple, as the PPt presentation was set to start with a single mouse click. The two groups started and finished the exam at approximately the same time to prevent communication between the groups. Mobile (cell) phones and other communication media were prohibited, and the IT tutors helped to monitor the students [Figure 2].
Figure 2: Administration of the CCE (front view). Students handwrite their answers on the question sheet. (Note the monitor standing at the back.)
Figure 3: Administration of the CCE, viewed from the back of the class. Note the greater clarity of the picture on the computer screens compared with the projected image.
The students were shown both projected and computerized images of the same slides at the same time [Figure 3]. This was done because the students were asked to compare the two images in the evaluation form they completed at the end of the exam.
The exam duration was 60 minutes (3 minutes per slide). The slides changed automatically. After the last slide (end of exam), the students were allowed 3 more minutes for review. During these final 3 minutes, students were allowed free use of the PPt program on their own computers to revisit the slides if they wished. The students filled in evaluation forms following the exam.
The exam was graded manually using a model answer sheet. The maximum test score was 100, with 5 points per slide. This was divided by 10 to yield a score with a maximum of 10. Thus the contribution of the CCE as a tool was 10% of the total surgery score of 100. The other tools of assessment in surgery included an OSCE (40 points), multiple choice questions (MCQs) (20 points), and short structured exam questions or SSEQ (30 points).
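As a worked illustration of this weighting (the function and variable names are ours, not the authors'), the following sketch converts the 20 raw slide marks into the 10-point CCE contribution and combines it with the other surgery components.

# Hypothetical helpers reflecting the weighting described above:
# 20 slides x 5 points = 100 raw marks, rescaled to a maximum of 10,
# i.e. 10% of the 100-point surgery total (OSCE 40, MCQ 20, SSEQ 30).

def cce_score(raw_slide_marks):
    """Scale raw CCE marks (5 points per slide, 20 slides) to a 10-point score."""
    assert len(raw_slide_marks) == 20
    raw_total = sum(raw_slide_marks)          # out of 100
    return raw_total / 10                     # out of 10

def surgery_total(cce, osce, mcq, sseq):
    """Combine the four assessment tools into the 100-point surgery score."""
    return cce + osce + mcq + sseq            # maxima: 10 + 40 + 20 + 30 = 100

# Example: a student scoring 3 of 5 points on every slide
print(cce_score([3] * 20))                    # 6.0 out of 10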
Analysis of CCE scores and evaluation forms
The scores in the CCE were described statistically and correlated with the scores obtained by the students in the other tools of assessment in surgery, using the SPSS 16.0 statistical package (SPSS Inc., Chicago, IL, USA).
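The same analysis can be reproduced outside SPSS. The sketch below uses pandas and SciPy with an assumed per-student score file and column names, neither of which comes from the original study.

# A minimal sketch, assuming a CSV of per-student scores with hypothetical
# column names (cce, osce, mcq, sseq); the study itself used SPSS 16.0.
import pandas as pd
from scipy import stats

scores = pd.read_csv("surgery_scores.csv")

scores["paper_total"] = scores["mcq"] + scores["sseq"]                        # total paper exam
scores["surgery_total"] = scores[["cce", "osce", "mcq", "sseq"]].sum(axis=1)

print(scores["cce"].describe())     # mean, SD, minimum and maximum of the CCE scores

# Two-tailed Pearson correlations of the CCE with the other components
for other in ["surgery_total", "paper_total", "sseq", "mcq"]:
    r, p = stats.pearsonr(scores["cce"], scores[other])
    print(f"CCE vs {other}: r = {r:.3f}, p = {p:.4g}")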
The evaluation forms were analyzed manually by simple counting and calculating percentages. Common themes were identified and positive and negative statements counted.
Results
The scores: distribution and correlations
[Figure 4] shows the distribution of the CCE scores. The histogram shows a normal distribution with a mean of 4.9, a median of 4.9, a standard deviation of 1.17, and a range of 5 (2.5-7.5).
Figure 4: Histogram showing the distribution of CCE scores of 43 students [maximum test score = 10; highest score = 7.5; lowest score = 2.5]
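A histogram like Figure 4 can be generated directly from a per-student score file; the sketch below assumes the same hypothetical surgery_scores.csv used in the analysis example and is not the authors' original output.

# A minimal sketch, assuming the hypothetical surgery_scores.csv from the
# analysis example; plots the distribution of the 10-point CCE scores.
import pandas as pd
import matplotlib.pyplot as plt

scores = pd.read_csv("surgery_scores.csv")

plt.hist(scores["cce"], bins=10, range=(0, 10), edgecolor="black")
plt.xlabel("CCE score (maximum 10)")
plt.ylabel("Number of students")
plt.title("Distribution of CCE scores (n = 43)")
plt.savefig("cce_score_histogram.png", dpi=150)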
The two-tailed Pearson correlation tests showed a strong correlation (P < 0.001) between each student's CCE score and their total surgery score (r = 0.683), their combined SSEQ and MCQ scores [i.e., total paper exam score] (r = 0.513), and, when correlated separately with the CCE scores, the SSEQs (r = 0.505) and MCQs (r = 0.482) [Table 1].
Table 1: Two-tailed Pearson correlation between the CCE scores and scores on the various components of the surgery exam for each student
Students' acceptability and evaluation
Twenty-eight students completed evaluation forms (65% response rate). Regarding computer literacy, 93% of the students reported prior experience using computers and 61% reported that they accessed the Internet regularly.
Most of the students thought the exam instructions (85%) and the slides (86%) were clear or very clear [Table 2]. The seating was thought by most to be comfortable (79%). Most thought the computerized pictures and videos were better than the screen projections (65%), and that the CCE should be recommended for use in other departments (64%). Free comments showed that most of the students were happy with the exam, particularly because it used IT.
Only six students (21%) reported problems with the exam, relating either to the time, which they thought was too short, or to the questions, which they thought were too difficult.
Table 2: Students' evaluation of the computerized clinical exam (CCE) [n = 28]
Discussion
At the Alazhari University medical school, we use a variety of assessment tools in the summative examination at the end of the surgical clerkship. Historically, this exam consisted of a 'theoretical' and a 'clinical' part, each constituting 50% of the exam's total points. The oral exam formed a part of the traditional clinical exam; it has been replaced by the ViPSCE, [1],[2] and the traditional clinical exam by the OSCE. In this paper, we describe the development of the ViPSCE into a CCE and present our experience with it, a statistical analysis of its scores, and students' ratings of its acceptability.
Computers have been used for assessment in medicine since the 1960s to test knowledge and problem-solving skills. [3] A survey of United Kingdom medical schools has shown that computer-based assessment (CBA) has the potential to improve the assessment of physicians and other health professionals by supporting the development of more valid exams. It is generally popular with candidates and efficient in the delivery and marking of tests. [4] CBA, also called computer-based testing (CBT) and computer-assisted assessment (CAA), has been used increasingly as an assessment tool in many medical disciplines. [5],[6],[7] We investigated a delivery model that could be employed at institutions with limited resources, as long as computers are available.
Like the ViPSCE, the CCE has the potential to assess all levels of knowledge, from simple identification to critical analysis and problem solving. In this way it has been complementary to the OSCE, in which surgical skills, including history-taking and physical examination, and students' attitudes can be assessed. The OSCE in some medical schools consists of both static and interactive stations. We have used the ViPSCE to accommodate the 'static stations' and thus reserve our OSCE for purely interactive stations. As a result, our practical clinical exam could be expanded to 27 stations, which has the advantage of increasing the scope and depth of the exam.
The ease with which the exam was carried out demonstrated its feasibility, and it saved time for faculty. The initial cost of building IT expertise and air-conditioned rooms supplied with enough computers may be high. Student numbers are ever increasing, and this must be matched by more computers and facilities. Establishing IT centers can nevertheless be profitable in the long run, as they can be used by students and also rented out for commercial courses, seminars, and other needs.
There were no major problems or glitches during the administration of the CCE, likely in part because the program was installed on the computers and rehearsed with the IT staff prior to the exam. One or two students were noted to begin the exam before the start signal was given, and these students were warned and instructed to wait for the signal.
The scores of the CCE were normally distributed around a mean of 4.97, very close to the pass mark of 5. This, with scores ranging from 2.5 to 7.5, indicated a balanced exam that was neither too easy nor too difficult for students.
The scores for each student correlated with their scores in the other tools of surgical assessment, denoting a measure of reliability of the CCE. It is worth noting that the CCE does not assess surgical skills; nevertheless, its scores correlated strongly with those of the OSCE. This validated the name we gave this tool: the computer-based clinical exam. Our next step is to develop software that enables the exam to be answered and corrected on the computer. There is also potential for computer use as an educational tool and for self-assessment.
The 65% response rate may be considered low. Students who responded were not found to be particularly different from nonrespondents. Some students simply did not hand in their evaluations; it would have helped if we had asked students to hand them in immediately after the exam, before leaving the exam room.
Our survey showed high computer literacy among our students, although the exam itself did not require much knowledge of computer use. The CCE proved popular with the candidates, who generally seemed to like the idea of using computers as an exam tool. Most students thought the slides were clear and the seating comfortable, and they generally approved of the CCE.
The minority who disapproved expressed their negative opinions about the time allowed and the difficulty of the questions, but notably not about the exam's use of computers. [8]
Computers are here to stay, and computerized exams have shown considerable potential as a tool for student self-assessment. [9]
Conclusions
The CCE is a useful assessment method that complements the OSCE. It is also feasible, given the availability of enough computers. The CCE inherits the positive characteristics of the ViPSCE as a valid, reliable, and practical exam that is acceptable to students. It has the added advantage of improving the viewing of the slides, and it has the potential to permit computerized marking.
References
1. El Shallaly GH, Ali EA. Use of a video-projected structured clinical examination (ViPSCE) instead of oral examination in the assessment of final year medical students. Med Educ 2003;37:1048.
2. El Shallaly G, Ali E. Use of a video-projected structured clinical examination (ViPSCE) instead of oral examination in the assessment of final year medical students. Educ Health (Abingdon) 2004;17:17-26.
3. Swets JD, Feurzeig W. Computer aided instruction. Science 1965;150:572-6.
4. Cantillon P, Irish B, Sales D. Using computers for assessment in medicine. Br Med J 2004;329:606-9.
5. Wolfson PJ, Veloski JJ, Robeson MR, Maxwell KS. Administration of open-ended test questions by computer in a clerkship final examination. Acad Med 2001;76:835-9.
6. Hulsman RL, Mollema ED, Hoos AM, De Haes JC, Donnison-Speijer JD. Assessment of medical communication skills by computer: assessment method and student experiences. Med Educ 2004;38:813-24.
7. Lieberman SA, Frye AW, Litwins SD, Rasmusson KA, Boulet JR. Introduction of patient video clips into computer-based testing: effects on item statistics and reliability estimates. Acad Med 2003;78:48-51.
8. Hochlehnert A, Brass K, Moeltner A, Juenger J. Does medical students' preference of test format (computer-based vs. paper-based) have an influence on performance? BMC Med Educ 2011;11:89.
9. Agrawal S, Norman GR, Eva KW. Influences on medical students' self-regulated learning after test completion. Med Educ 2012;46:326-35.