ORIGINAL RESEARCH ARTICLE
Year : 2017  |  Volume : 30  |  Issue : 3  |  Page : 198-202

Use of International Foundations of Medicine Clinical Sciences Examination to evaluate students' performance in the local examination at the University of Sharjah, United Arab Emirates


1 Department of Clinical Sciences, College of Medicine, University of Sharjah, Sharjah, United Arab Emirates
2 Department of Medical Education, College of Medicine, University of Sharjah, Sharjah, United Arab Emirates
3 Department of Family and Community Medicine, College of Medicine, University of Sharjah, Sharjah, United Arab Emirates

Date of Web Publication: 18-Apr-2018

Correspondence Address:
Dr. Nihar Ranjan Dash
Department of Clinical Sciences, College of Medicine, University of Sharjah, Sharjah 27272
United Arab Emirates

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/efh.EfH_339_16

Abstract


Background: Several medical schools around the world are moving away from isolated, locally developed in-house assessments and introducing external examinations into their curricula. Although the objectives vary, this is typically done to evaluate, audit, and compare students' performance against international standards. In line with this trend, the International Foundations of Medicine Clinical Sciences Examination (IFOM-CSE) was introduced in the College of Medicine at the University of Sharjah as an external assessment criterion in addition to the existing in-house assessments. The aim of this study was to compare student performance on this newly introduced IFOM-CSE with performance on the college's existing in-house final examination. Methods: The scores of three consecutive final-year undergraduate medical student batches (2013-2015) who took both the IFOM-CSE and the existing in-house final examination were analyzed. Pearson correlation and one-way analysis of variance tests were conducted using SPSS 22. Results: Students' scores on the IFOM-CSE and on the locally prepared final examination were highly correlated, with Pearson correlation coefficients of 0.787 for the 2013 batch, 0.827 for the 2014 batch, and 0.830 for the 2015 batch (P < 0.0005). Interestingly, while the mean IFOM-CSE scores of the three batches in 2013, 2014, and 2015 (475, 492, and 513, respectively) showed an improvement of borderline significance (F(2, 226) = 2.73, P = 0.067), local examination scores improved significantly over the study period (F(2, 277) = 52.87, P < 0.0005). Discussion: The findings of this study showed that students' scores in the local examination were consistently correlated with their scores on the IFOM-CSE across all three batches. Thus, an external examination can be an important addition to a comprehensive internal assessment system, providing evidence of external validity.

Keywords: Assessment, benchmarking, examination, International Foundations of Medicine, Sharjah, undergraduate


How to cite this article:
Dash NR, Abdalla ME, Hussein A. Use of International Foundations of Medicine Clinical Sciences Examination to evaluate students' performance in the local examination at the University of Sharjah, United Arab Emirates. Educ Health 2017;30:198-202





Introduction


Traditionally, medical schools around the world have measured student performance and curriculum outcomes using internal assessment systems. Recently, the trend has been changing as more and more medical schools introduce some form of external examination or criterion into their curricula, aiming to improve the quality of assessments, better evaluate and monitor outcomes, provide evidence of external validity, and serve as a quality improvement tool.[1],[2] The use of external examinations varies from place to place; some colleges use them for quality assurance, while others have used them as a component of their assessment. In an interesting development, some medical schools in the Netherlands and Australia have collaborated in developing benchmark assessments in their respective countries and have added them to, or replaced, their own in-house assessments.[3],[4],[5] In France, a consortium of medical schools uses the expertise and materials of the National Board of Medical Examiners (NBME), USA, to select candidates for the national residency program.[6]

The NBME has developed an examination called the International Foundations of Medicine (IFOM) to assess the core knowledge of undergraduate medical students globally. The IFOM examination has two parts: the IFOM Basic Sciences Examination is designed to assess the common international core of knowledge in the preclinical curriculum, and the IFOM Clinical Sciences Examination (CSE) is designed to assess the international core of clinical knowledge in medicine, surgery, pediatrics, obstetrics and gynecology, and psychiatry expected of students in the final year of undergraduate medical education. Both examinations contain 160 single-best-answer multiple-choice questions (MCQs) and can be administered on paper or via the web, as required.[7] The commonality of core knowledge, learning objectives, and assessment tools such as MCQs makes external examinations such as the IFOM a reliable assessment procedure.

Recently, several medical schools around the world, including many in the Middle East, have started using IFOM examinations for various purposes. Some have used the test for undergraduate assessment, others for selecting students into residency programs.[8],[9],[10] However, published data comparing and correlating students' performance in these external examinations with their grades in local in-house assessments are lacking.

The College of Medicine at the University of Sharjah, United Arab Emirates, offers a 6-year medical program. It uses a problem-based learning curriculum in the preclerkship years, followed by clerkship rotations during the last 2 years of study.[11] To graduate at the end of the 6th year, students must pass a final examination that uses several assessment instruments, including a 3-h, 120-MCQ paper covering medicine, surgery, obstetrics and gynecology, pediatrics, and mental health. In 2013, the college added an external examination, the IFOM-CSE, with the objective of evaluating student achievement. The aim of this study was to compare student performance on this newly introduced IFOM-CSE with performance on the college's existing in-house final examination.


Methods


Study design and participants

This retrospective study analyzed the grades of students who sat both the IFOM-CSE and the final examination conducted in the college in three consecutive academic years: 2013, 2014, and 2015.

Ninety-one students in 2013 and 92 in 2014 sat for both examinations, and their scores were compared. During these years, the IFOM-CSE was obligatory for final-year students, alongside the scheduled in-house examination. In 2015, however, the IFOM-CSE was reclassified as optional and was open to students who requested it. Therefore, 46 of the 97 students in the 2015 batch sat both examinations and had their grades correlated. In total, the scores of 229 students who sat both examinations were analyzed in this study.

Examination setting

The IFOM-CSE was scheduled 2 weeks before the local examination. A web-based version comprising 160 MCQs was administered in the college over 4½ h. The examination covered core knowledge in medicine, surgery, obstetrics and gynecology, pediatrics, and mental health. The locally developed final examination is a 120-MCQ paper, taken over 3 h, testing knowledge in the same subjects. The college's student assessment committee follows a rigorous procedure for compiling the in-house MCQ paper and keeps the examination content standardized. The process includes preparing an examination blueprint based on the learning outcomes, writing of MCQs by subject-matter experts, and qualitative review of the questions by assessment committee members to produce the final version of the examination. After the examination, item responses are analyzed and stored in the question bank.

Data collection and statistical analysis

Approval to use students' academic grades for research purposes was obtained from the college assessment committee on the condition that confidentiality would be maintained and individual student scores would remain anonymous.

Students' scores on the IFOM-CSE and the in-house MCQ paper were entered and analyzed using SPSS version 22. Scores on both examinations were kept as continuous variables, on a scale of 200–800 for the IFOM-CSE and 0–100 (percent correct) for the local MCQ paper. Means and standard deviations (SDs) were used to summarize normally distributed data, and medians were used to describe skewed scores. The Kolmogorov–Smirnov test was used to test the normality of score distributions. Scatter plots were used to check for bivariate linear correlation between students' scores on the two examinations, and the Pearson correlation coefficient was calculated to measure the direction and strength of linear correlations. A one-way analysis of variance (ANOVA) test was used to check for significant differences in mean scores among the three batches. A significant ANOVA result was followed by a post hoc test, Tukey's honestly significant difference, to check for significant differences between each pair of means. P ≤ 0.05 indicated a statistically significant difference.
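
For readers who want to reproduce this pipeline outside SPSS, the sketch below walks through the same steps (normality testing, per-batch Pearson correlation, one-way ANOVA, and Tukey's post hoc test) in Python with pandas, SciPy, and statsmodels. This is a minimal sketch under stated assumptions: the file scores.csv and the column names ifom, local, and batch are hypothetical, and SciPy's plain Kolmogorov–Smirnov test stands in for the Lilliefors-corrected version that SPSS applies.

import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical data: one row per student who sat both examinations, with
# columns ifom (200-800), local (0-100), and batch (2013, 2014, or 2015).
df = pd.read_csv("scores.csv")

for batch, grp in df.groupby("batch"):
    # Normality of each score distribution. Standardizing and testing
    # against N(0, 1) approximates, but does not exactly match, SPSS's
    # Lilliefors-corrected Kolmogorov-Smirnov test.
    for col in ("ifom", "local"):
        z = (grp[col] - grp[col].mean()) / grp[col].std(ddof=1)
        ks_stat, ks_p = stats.kstest(z, "norm")
        print(batch, col, f"KS p = {ks_p:.3f}")
    # Direction and strength of the linear correlation between the
    # IFOM-CSE score and the local MCQ score within the batch.
    r, p = stats.pearsonr(grp["ifom"], grp["local"])
    print(batch, f"Pearson r = {r:.3f}, p = {p:.2g}")

# One-way ANOVA comparing mean IFOM-CSE scores across the three batches.
groups = [g["ifom"].values for _, g in df.groupby("batch")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

# Tukey's honestly significant difference, run only when the ANOVA is
# significant, to locate which pairs of batch means differ.
if p_val <= 0.05:
    print(pairwise_tukeyhsd(df["ifom"], df["batch"]))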


Results


In 2013, the students' mean score on the IFOM-CSE was 475 (SD = 78.8). Scores ranged between 315 and 682 and were normally distributed. The mean score on the locally administered MCQ paper was 73.8% (SD = 6.89); these scores were also normally distributed, with minimum and maximum scores of 58.3% and 85.7%, respectively [Table 1]. The Pearson correlation coefficient between students' scores on the IFOM-CSE and the in-house examination was 0.79 (P < 0.0005), as shown in [Figure 1].
Table 1: Students' scores in the International Foundations of Medicine Clinical Sciences Examination and local examination, by year

Figure 1: Scatterplots showing bivariate correlation between students' scores on the International Foundations of Medicine Clinical Sciences Examination and the multiple-choice question paper of the local examination
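
The reported significance of these correlations can be verified from r and the sample size alone: for a Pearson correlation, t = r·sqrt(n - 2)/sqrt(1 - r^2) follows a t-distribution with n - 2 degrees of freedom. A short check in Python (a sketch, using the 2013 values reported above):

from math import sqrt
from scipy.stats import t

# 2013 batch: r = 0.79 across n = 91 students who sat both examinations.
r, n = 0.79, 91
t_stat = r * sqrt(n - 2) / sqrt(1 - r ** 2)
p = 2 * t.sf(t_stat, n - 2)              # two-tailed p-value
print(f"t = {t_stat:.1f}, p = {p:.1e}")  # t is about 12.2, so p << 0.0005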


In 2014, the students' mean scores on the IFOM-CSE and the in-house MCQ paper were 492.5 (SD = 107.0) and 68.8% (SD = 10.54), respectively. Scores on the IFOM-CSE were normally distributed and ranged between 304 and 748, while scores on the in-house MCQ paper were negatively skewed and ranged between 32.3% and 86.9%, with a median of 70.7%. The Pearson correlation coefficient between students' scores on the two examinations was 0.83 (P < 0.0005) [Figure 1].

In 2015, the students' mean score on the in-house MCQ paper increased to 80.5% (SD = 5.13), with minimum and maximum scores of 65.2% and 88.9%. These scores were negatively skewed, with a median of 80.3%. On the IFOM-CSE, the mean score was 513.0 (SD = 81.9); scores ranged between 330 and 673 and were normally distributed about the mean. IFOM-CSE and in-house examination scores were highly correlated, with a Pearson correlation coefficient of r = 0.830 (P < 0.0005) [Figure 1]. There was a marginal difference among the three batches' IFOM-CSE scores, as determined by the ANOVA test (F(2, 226) = 2.73, P = 0.067). In contrast, students' scores on the locally developed MCQ paper differed significantly across the 3 years (F(2, 277) = 52.87, P < 0.0005).
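
As a quick consistency check (not part of the original analysis), the reported p-values can be recovered from the F statistics and their degrees of freedom: with three batches, the between-groups df is 2, and the within-groups df is N - 3, giving 226 for the 229 IFOM-CSE takers and 277 for the 280 students who sat the local examination. A sketch using SciPy:

from scipy.stats import f

# Upper-tail probability of each F statistic under the null hypothesis.
p_ifom = f.sf(2.73, 2, 226)     # IFOM-CSE across batches
p_local = f.sf(52.87, 2, 277)   # local MCQ examination across batches
print(f"IFOM-CSE:  p = {p_ifom:.3f}")   # about 0.067, borderline
print(f"Local MCQ: p = {p_local:.1e}")  # far below 0.0005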


Discussion


Medical schools nowadays seek to compare their students' performance with that of other medical schools, or against known standards, as a measure of quality and external validity. Evidence is accumulating in favor of the IFOM as an external examination to supplement the local assessment process in many medical schools.[1],[8],[12] In the same vein, our study demonstrated a strong correlation between students' scores on the external IFOM-CSE and on the local examination in the college. This positive trend held consistently over the 3 years from 2013 to 2015. Similar comparisons and correlations have been carried out by medical schools that introduced the IFOM examination into their curricula, providing evidence that the IFOM can serve as evidence of external validity and a supplement to the local assessment system.[13],[14]

The mean IFOM-CSE score showed a positive trend across the 3 years, from 475 in 2013 to 513 in 2015. However, it remained well below the NBME's recommended international standard competence score of 557, similar to findings from an Australian study.[8] Several reasons may explain this. In the first 2 years, the IFOM-CSE was compulsory for all students but purely formative: IFOM scores did not count toward final grades in the college assessment, although students received feedback from the NBME. Many students therefore did not take the IFOM-CSE seriously yet sat it because it was compulsory. In 2015, when the IFOM-CSE became optional, only students who were genuinely interested, and who regarded it as a reflection of other NBME examinations such as the USMLE, opted to take it, which likely improved the mean score. Another reason might be content focus: when preparing for the local examination, students concentrate on local health issues and spend considerable time studying them. A similar observation was made at the University of Queensland, Australia, when the IFOM-CSE was first introduced for final-year students in 2012.[8]

When comparing students' performance between batches, it is important to note that the content (the questions used) of both the IFOM-CSE and the local examination varied among the three batches, although the examinations were developed from the same blueprints and had similar difficulty levels. Likewise, student ability was not taken into account in the comparisons and was treated as natural variation. However, students in the three batches underwent the same selection process upon entry to the college and had the same curriculum and training, making the three cohorts reasonably similar.

There were several limitations to this study. First, only the knowledge component, assessed through MCQs, was used to compare students' performance, although clinical skills and attitudes are also integral parts of the learning process. This is because IFOM examinations are designed to test only core knowledge; therefore, other assessment components, such as directly observed clinical examination and objective structured clinical examination scores, were not analyzed in this study. Second, the number of students varied between batches: the 2015 batch (46) was not comparable in size to the 2013 (91) and 2014 (92) batches because of the small number who took the IFOM-CSE. Finally, the study considered only the total IFOM score and did not analyze students' performance in individual disciplines such as medicine, surgery, obstetrics and gynecology, and pediatrics, which might have given better evidence of the strengths and weaknesses of students' performance per discipline.


Conclusions


External examinations such as the IFOM are being increasingly adopted by medical schools in the United Arab Emirates and the wider Middle East. This study provides evidence that the IFOM can be an important addition to a comprehensive in-house assessment system and can be used to evaluate students' performance in the local examination. Used judiciously, it can provide external validity for the learning process and students' achievement. Exposure to such an international examination may also lead to improvement of the locally prepared examinations, and these improvements can feed directly into students' learning, as “assessment drives learning.”

Acknowledgment

We would like to extend our sincere thanks to the chair of the students' assessment committee, College of Medicine, University of Sharjah, for giving access to the data sets used in the study.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Winward ML, De Champlain AF, Grabovsky I, Scoles PV, Swanson DB, Holtzman KZ, et al. Gathering evidence of external validity for the Foundations of Medicine examination: A collaboration between the National Board of Medical Examiners and the University of Minho. Acad Med 2009;84:S116-9.
2. Stuetz A, Green W, McAllister L, Eley DS. Preparing medical graduates for an interconnected world: Current practices and future possibilities for internationalizing the medical curriculum. J Stud Int Educ 2014;9:28-45.
3. O'Mara DA, Canny BJ, Rothnie IP, Wilson IG, Barnard J, Davies L, et al. The Australian Medical Schools Assessment Collaboration: Benchmarking the preclinical performance of medical students. Med J Aust 2015;202:95-8.
4. Edwards D, Wilkinson D, Canny BJ, Pearce J, Coates H. Developing outcomes assessments for collaborative, cross-institutional benchmarking: Progress of the Australian Medical Assessment Collaboration. Med Teach 2014;36:139-47.
5. van der Vleuten CP, Schuwirth LW, Muijtjens AM, Thoben AJ, Cohen-Schotanus J, van Boven CP, et al. Cross institutional collaboration in assessment: A case on progress testing. Med Teach 2004;26:719-25.
6. De Champlain AF, Melnick D, Scoles P, Subhiyah R, Holtzman K, Swanson D, et al. Assessing medical students' clinical sciences knowledge in France: A collaboration between the NBME and a consortium of French medical schools. Acad Med 2003;78:509-17.
7. International Foundations of Medicine, Examination Program Guide. National Board of Medical Examiners. Available from: http://www.nbme.org/pdf/ifom/IFOM_Program_Guide.pdf. [Last retrieved on 2016 Oct 25].
8. Wilkinson D, Schafer J, Hewett D, Eley DS, Swanson D. Global benchmarking of Australian medical student learning outcomes: Implementation and pilot results of the International Foundations of Medicine Clinical Sciences Exam at the University of Queensland. Med Teach 2014;36:62-7.
9. Oman Medical Specialty Board (OMSB). Available from: http://www.omsb.org/OMSB_A_Srv/Templates/Documents/IFOM%20-2016.pdf. [Last retrieved on 2016 Oct 10].
10. Hamad Medical Corporation (HMC). Available from: http://www.hamad.qa/EN/Education-and-research/Medical-Education/Documents/HMC%20Residency%20Matching%20Program%20Application%20Form.pdf. [Last retrieved on 2016 Oct 25].
11. Abdelkhalek N, Hussein A, Gibbs T, Hamdy H. Using team-based learning to prepare medical students for future problem-based learning. Med Teach 2010;32:123-9.
12. Palha JA, Almeida A, Correia-Pinto J, Costa MJ, Ferreira MA, Sousa N, et al. Longitudinal evaluation, acceptability and long-term retention of knowledge on a horizontally integrated organic and functional systems course. Perspect Med Educ 2015;4:191-5.
13. Annual Report. Weill Cornell Medical College in Qatar. Available from: http://www.qatar-weill.cornell.edu/media/documents/WC-AR%20_%202012_Eng%20LR.pdf. [Last retrieved on 2016 Oct 20].
14. International Projects. Faculty of Medicine. KU Leuven. Available from: http://www.med.kuleuven.be/eng/internationalisation/projects. [Last retrieved on 2016 Oct 20].





 
