BRIEF COMMUNICATION

Year: 2010 | Volume: 23 | Issue: 2 | Page: 334
Factors That Affect Implementation of Web-based Faculty Evaluation Forms: Residents' Perspectives from a Developing Country
SH Ibrahim1, SK Ali1, S Sadaf2
1 Aga Khan University, Karachi, Pakistan; 2 Aga Khan University, Karachi, Pakistan
Date of Submission: 01-Apr-2009
Date of Acceptance: 01-May-2010
Date of Web Publication: 16-Aug-2010
Correspondence Address: S H Ibrahim, Department of Paediatrics and Child Health, Faculty Office Building, Aga Khan University, Karachi, Pakistan
Source of Support: None. Conflict of Interest: None.
PMID: 20853237 
Context: A web-based evaluation system for residents to provide feedback on faculty was piloted in four training programs at the Aga Khan University prior to institution-wide implementation. Residents in three of the four programs submitted fewer than 50% of the forms, while residents in the fourth program submitted more than 70%. This study was conducted to identify the reasons for the programs' varying participation rates, with a view to improving the system.

Methods: A qualitative approach was employed using focus group discussions (FGDs). Volunteers were invited and three groups of eight to eleven residents each were formed, with participants drawn from all residency years. The FGDs explored residents' perceptions of the web-based faculty evaluation system and their problems and concerns in completing the web-based faculty evaluation forms.

Results: Technical issues in completing and submitting the forms online were the main deterrents to completing the evaluation forms. The lack of an accessible resource person to resolve technical problems with the software and the burden of taking time out to complete the forms were cited as limiting factors by many residents. Residents recommended a focused orientation session to the new system within their departments.

Conclusion: Residents' confidence and support are key to promoting adequate participation in web-based evaluations. Focused orientation sessions, reinforcement, reminders, assurances of confidentiality, and removal of technical glitches should help to improve resident participation.

Keywords: Web-based, evaluation, residents
Background
Regular faculty evaluations are considered critical for the professional development of residents as well as their supervisors, and they form an integral part of the overall evaluation of many residency programs[1].
A number of studies have reported the limitations of paper-based evaluation systems, including high costs, the need for an elaborate system for storing and collating information, and difficulty retrieving information quickly[2]. Web-based assessment systems offer advantages: they allow easy analysis and retrieval, and respondents can complete evaluations on their own time[2-4]. However, when planning to use web-based technology in a teaching hospital in a developing country, it is important to understand students' prior experience with computers, since those not comfortable with computers will be challenged by a web-based approach[5].
This study was conducted to identify the reasons why many residents of three programs at Aga Khan University (AKU) failed to complete faculty evaluation forms, and why 70% of the residents of the fourth program did complete them. The findings will be used to improve the web-based faculty evaluation system before implementing it across all residency programs at AKU, and they have implications for the use of similar web-based systems in other developing countries.
Context and intervention
AKU is one of the first private universities in Pakistan accredited by relevant accrediting and regulatory authorities for its undergraduate and postgraduate programs. It has a 500-bed hospital and is located in the heart of Karachi, Pakistan’s biggest metropolitan city.
The postgraduate programs at AKU were started in 1983, and a postgraduate medical education (PGME) department was formalized in 1993 to oversee the quality of residency programs. To date, a paper-based system of faculty evaluation has been in place. The PGME Committee (PGMEC) observed that the paper-based evaluations were not being completed regularly by the residents, and both faculty and residents voiced concerns about the system. Faculty were concerned that a lack of expertise in collating and organizing evaluation data delayed the submission of reports to department heads, and that handwritten feedback had to be typed by departmental secretaries. The residents had concerns over confidentiality, and the PGMEC perceived that the feedback received was not a true reflection of the quality of interactions between faculty and residents. Although studies on residents' discomfort in using paper-based evaluations have not been widely reported, some studies have commented on the awkwardness students feel in evaluating their faculty[6].
A taskforce constituted by the PGMEC to strengthen the assessment and evaluation processes in the postgraduate programs proposed a web-based evaluation approach to address the concerns of both faculty and residents around efficiency and confidentiality. Web-based evaluations have been shown to be an effective approach with the potential to provide more dependable and reliable data[3]. A web-based system also helps maintain the confidentiality of responding learners: at our school, raw data would be submitted directly to the University's Information Systems Department (ISD), and separate reports would be generated for each faculty member. Residents come to AKU from different universities, and computer literacy is not gauged at the time of selection. As in other developing countries, computer use is taught in very few undergraduate schools in Pakistan; hence, residents' knowledge and skills in computer use vary[7-10]. AKU has a separate ISD that provides technical support to both faculty and residents at all times. However, no dedicated ISD contact had been identified whom residents could call if they had difficulty using the computers or understanding the system.
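The confidentiality model described above (raw data held only by ISD, with each faculty member receiving a separate, aggregated report) can be illustrated with a short sketch. The following Python snippet is a minimal illustration only, not AKU's actual software; the data layout, field names and the compile_faculty_reports helper are hypothetical.

```python
# Illustrative sketch (hypothetical, not AKU's system): raw evaluation
# submissions stay with ISD; faculty receive only aggregated reports
# with resident identifiers stripped.
from collections import defaultdict
from statistics import mean

# Raw submissions as ISD might hold them; resident identity exists
# only here and never appears in the per-faculty reports.
raw_submissions = [
    {"resident_id": "R-014", "faculty": "Dr. A", "scores": [4, 5, 3]},
    {"resident_id": "R-022", "faculty": "Dr. A", "scores": [5, 5, 4]},
    {"resident_id": "R-014", "faculty": "Dr. B", "scores": [3, 2, 4]},
]

def compile_faculty_reports(submissions):
    """Group submissions by faculty member and drop resident identifiers."""
    by_faculty = defaultdict(list)
    for sub in submissions:
        by_faculty[sub["faculty"]].append(sub["scores"])
    return {
        faculty: {
            "n_respondents": len(score_lists),
            "mean_score": round(mean(s for scores in score_lists for s in scores), 2),
        }
        for faculty, score_lists in by_faculty.items()
    }

for faculty, report in compile_faculty_reports(raw_submissions).items():
    print(faculty, report)  # e.g. Dr. A {'n_respondents': 2, 'mean_score': 4.33}
```

In such a design, the only table linking responses to residents stays with ISD; everything forwarded to departments is anonymous and aggregated.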
The existing faculty evaluation form was revised and an electronic version developed in consultation with experts from the AKU ISD. An orientation session was held for all residents, introducing them to the new system and the forms accessible on the university's intranet. The system was piloted in four residency programs: Paediatrics, Neurology, Psychiatry and Diagnostic Radiology. After six months of piloting, the residents' response rate was 70% in Diagnostic Radiology but less than 50% in the other three disciplines, despite repeated reminders from program directors (PDs).
Methodology
Volunteers were invited and three focus groups of 8 to 11 residents each were formed. Participants for the focus group discussions (FGDs) were selected from all residency years. Advance appointments were made through the respective program directors, and a meeting venue and time were scheduled after discussion with the chief residents.
The first group comprised residents (n=11) from Paediatrics, Neurology and Psychiatry, where the response rate to the online evaluation was less than 50%. The three disciplines were combined because of the small numbers of residents in Neurology and Psychiatry.
The second group comprised residents (n=8) from Diagnostic Radiology, where the response rate was over 70%. The third group comprised residents from Medicine (n=10), where the pilot was not run, although Medicine residents had attended the orientation session on the web-based system.
A qualitative approach was employed, using focus group discussions to identify residents' problems and concerns with completing the web-based faculty evaluation forms. Three FGDs of two hours each were conducted in February and March of 2008. All of the authors were involved in conducting the FGDs and analyzing the data. The FGDs were conducted in the ward seminar rooms of the three principal disciplines and were recorded. Written and verbal consent was obtained from participating residents to record and disseminate the information they provided during the discussion. One of the groups initially had reservations about being recorded but consented after being reassured of confidentiality.
The FGDs were recorded by the principal investigator (PI), who is a faculty member of the Department of Paediatrics. Discussions were led by two of the co-investigators, faculty members from the Department for Educational Development who are known to the residents through the various workshops they conduct for them. The presence of a faculty member from the Department of Paediatrics could have inhibited residents' willingness to comment openly; however, the PI reassured the groups that she was there only to record the conversation and would not take part in the discussion. The PI used a flipchart to record key points raised in the discussions so that participants could see each point and contest it if they felt it did not accurately reflect their intended meaning.
The written and taped data were transcribed, followed by systematic analysis and independent coding by the first and second authors. The process included the researchers' familiarization with the raw data, identification of themes and sub-themes, and interpretation[11]. The coding was then shared to check consistency and agreement. Disagreements in the identification of a theme were resolved by rereading the transcript together and reaching a consensus.
The following areas were explored in groups 1 and 2 where the forms had been piloted:
- How many residents (of those present in this meeting) had filled out the forms? This opening question let the FGD leaders confirm that enough residents who had, and who had not, completed the form were present before moving on to the next question.
- What were the reasons for completing or not completing the forms?
- What were the opinions of the residents who submitted the forms regarding time required to complete the form?
- Did they face any problems in completing the forms?
- What changes would the residents suggest to make the system more user-friendly and facilitate completing the forms?
In the group where the forms were not piloted, the following questions were asked:
- Were the participants aware of the online forms?
- What issues did they think there would be in completing the online forms?
- What changes/modifications should be made to make the process more user-friendly and facilitate the completion of the forms?
Results
Framework analysis helped identify themes, which were based on concerns raised by a majority of the residents in all three groups. The themes included issues with the web-based forms, the burdensome number of faculty members to be evaluated, confidentiality of the evaluations submitted, maintenance of privacy, technology-related problems and the questionable value of the feedback. Details of the themes are described below.
Web-based forms
Most of the residents found the form too long, with too many response options. They thought the form design was poor: it was not user-friendly and required too much time to complete.
“Five to six residents had to sit together and brainstorm before they could figure out the usage.” (Participant 3 in Group 1- P3G1)
Number of faculty members to be evaluated
The burdensome number of faculty to be evaluated came up as a theme in all three FGDs. Sub-themes included not knowing how many faculty were to be evaluated, having too many faculty members to evaluate, and not knowing whether to review only those with whom they had recently rotated or all the faculty they had encountered during the preceding six-month period. Residents were generally of the view that, since they were supervised by all faculty members at different times, they should evaluate all of them.
“There are 35 faculty members and it took my colleague 2-3 hours to complete.” (P5G1)
A few commented that the list was not updated, so faculty members who had left were still on it, while others said that the list was incomplete.
Some of the queries that the residents had and which they felt were not addressed in the orientation session were:
“What if the rotation was of one month and the form was to be filled out after 6 months? We would tend to forget.” (P6G3)
“What if the resident rotated with the same faculty more than once in the stipulated period?” (P4G3)
Confidentiality of the evaluations submitted
Confidentiality of evaluations was identified as a major concern by participants in all three groups, since residents used their own user IDs to complete the evaluation forms and could therefore potentially be tracked. One participant said that this concern was based on a previous experience.
“There was another form that we filled out, we were told that it is confidential but it was found later that it was not.” (P7G2)
However, a few of the residents said that they did not fear repercussions even if their identity were disclosed.
Maintenance of privacy
A related issue was the difficulty of maintaining privacy while the forms were being filled out. As identified by one of the participants:
“Computers are available but there is no place one could complete it with privacy, it would be easy for others to see the forms were being filled out.” (P3G1)
Another participant said that:
“One could not leave the terminal till the forms were complete.” (P7G1)
Information technology related problems
Many sub-themes were identified relating to information technology (IT) problems. These included issues of accessibility from residents’ homes, the second page of the form not opening after completing page one, and the form being technically difficult and taxing.
“It is difficult to complete and if one gets paged in the middle one has to close it and then start all over again. It is easier with a hard copy as you can take it with you wherever you go and fill it piecemeal whenever you want.” (P4G2)
Another common problem was that previous records were not deleted or cleared when residents logged off the system; if the computer was then opened by another person, they could see the previous resident's completed form.
“Forms did not disappear from the window and whenever they reopened the window the same filled form reappeared.”(P1G2)
Some also said that the dates did not change, and others had problems getting the system to accept their completed form. One participant said that forms would not be accepted unless comments were written on the last page, which was what the form designers had intended. Others felt that a ‘not applicable’ option should have been offered instead.
The residents of the department where the largest number of forms was completed reported that a person from ISD was based within the department and available to them. They also reported receiving two briefings: the general one given to all residents and a second given by this resource person.
“The IT person walked us through the form and actually filled out a form and showed us how to do it.” (P2G2)
Residents who did not have IT support personnel in their departments did not find the single general orientation session very useful; residents of the department where such support was available did not comment on it. Repeated reminders from the departmental secretary via the paging system were also found very helpful in prompting residents to complete their evaluations.
Value of the feedback
Some of the participants were unsure about the value of the activity. They felt that although the faculty would take critical analysis seriously, they were unsure whether the PGMEC would do the same. For this reason, some residents chose to mark all faculty as excellent, reasoning that there was no point in providing honest, critical feedback that would be ignored.
“…..no information on where the forms would go and how would the information be circulated to the faculty.” (P8G1)
The PD of one department sent repeated reminders for the forms to be filled out and emphasized the importance of completing the forms.
“We were constantly paged by the secretary and sent emails as reminders to fill the forms.” (P4G2)
Recommendations by residents
When asked how the activity could be made more valuable and how greater participation could be promoted, the residents gave the following recommendations:
- A shortcut pathway on the desktop should be provided so that the residents can access the web page directly, and the system should be made user-friendly.
- A timeframe for completing the forms should not be set; residents should be able to fill them out whenever they wish.
- Individual forms should not reach the faculty; rather, faculty should only receive compiled data from all reviewers.
- Residents should be reminded to complete forms through general reminders every three months.
- The Chief Residents should be sent information as to how many forms have been filled by each resident and they should then remind individual residents.
- The web page should be designed in such a way that, when the web page is opened, only the names appear of faculty who each resident is to evaluate.
- The residents need protected time to complete the forms. Separate mandatory academic sessions of 1½ hours should be set aside for residents; these could be arranged in the Learning Resource Center.
- The residents should be given special codes to access the file, so that only the ISD knows their real identity (a minimal sketch of one such mechanism follows this list).
- Incentives should be provided for participation.
- There should be a way to give early, urgent feedback: “If there was a pressing matter that needed sorting out, then six months is a long time to wait.”
- The residents should be told more about how the data are used, through a meeting with program directors.
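The "special codes" recommendation amounts to pseudonymization: residents log in with opaque codes, and only ISD can map a code back to a real identity. A minimal sketch of one possible mechanism, keyed hashing with a secret held solely by ISD, is given below; ISD_SECRET_KEY and evaluation_code are hypothetical names, and this is not the scheme AKU implemented.

```python
# Pseudonymization sketch (hypothetical): a keyed hash (HMAC-SHA256)
# turns a resident ID into a stable, opaque code. Only ISD holds the
# secret key, so only ISD can regenerate a code and link it back to
# a resident; faculty and program directors see codes only.
import hashlib
import hmac

ISD_SECRET_KEY = b"held-only-by-ISD"  # hypothetical; kept securely by ISD

def evaluation_code(resident_id: str) -> str:
    """Derive a stable pseudonymous code for a resident."""
    digest = hmac.new(ISD_SECRET_KEY, resident_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:10]

# The evaluation database stores only the code, never the resident ID;
# the same resident always maps to the same code.
print(evaluation_code("R-014"))
```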
Discussion
Online evaluation and assessment systems can not only provide data on trainee competence, but also valuable feedback on faculty teaching, curriculum quality, and the learning environment[12]. In one study, a web-based system increased residents' compliance in providing faculty and rotation evaluations from 20% and 34%, respectively, to 100%, which was then maintained for 22 months[13]. In our study, compliance of more than 70% was reported in the one program where logistic and IT support was available.
A web-based system can satisfy requirements and provide more complete and better quality data than paper systems can. D'Cunha et al.[2] have reported that completion of evaluations improved from 50% to 80% in the initial six months of implementing a web-based system (p<0.01), with no significant difference between faculty and resident compliance. In their evaluation of "ease of use," a total of 612 responses were received over this period, with an overall average score of 3.5 (5-point scale, 5=strongly agree). Residents' opinions (average score, 3.69) were slightly more positive than those of faculty (average score, 3.31). Confidentiality was felt to have improved over paper-based systems through a detailed security network. These data show that an internet-based evaluation system is a potentially powerful instrument for evaluation that can help improve residency programs and their educational systems in a timely and efficient manner[2].
FGDs were well suited to our study since they allowed participants not only to identify issues and concerns but also to propose solutions to specific problems[14]. We believe the residents spoke freely during the FGDs, without inhibition. They felt that the facilitators were representing PGME and that whatever they said would go directly to the Associate Dean; they felt their feedback would help improve the system rather than cause any backlash for them. Participants' comfort may also have been due to the fact that two of the investigators were not clinical faculty but rather from the Department of Educational Development, which residents might have perceived as neutral.
Based on residents' comments, we feel that a proper orientation of residents is required before initiating a web-based system. Further, residents should be assured by either the Dean of the Medical College or the Associate Dean of PGME that confidentiality will be maintained and that no electronic traces will remain that could allow comments to be traced back to specific residents. Gaining residents' confidence in the system's confidentiality is one key to the success of a web-based system.
Advantages of the web-based system over the traditional paper-based evaluation were identified, including the ability to provide notifications to residents of overdue evaluations, routine reminders, more immediate feedback to residents and faculty, and automatically summated data for the faculty. These were seen mainly in the program where IT support services were available on-site. The positive response by residents of this one program could also be attributed to the fact that the program director was part of the taskforce and provided repeated reminders to the residents to complete the forms.
The system failed in part because technological support was not available in the remaining three programs. Although Internet services have been available at the hospital for a number of years, the residents come from different areas of Pakistan and not all are proficient in using a web-based system. Because Pakistan is a developing country, IT is not taught in all schools and colleges, and the system may therefore not have worked in our hospital because of residents' varying levels of computer literacy and the lack of technological support when needed[7].
Confidentiality is a critical issue in any evaluation process, particularly when it involves residents and faculty, and breaches of confidentiality remain a fear among some of the residents. We have not seen this reported before in places where web-based evaluations have been used. Confidence in a foolproof, confidential system needs to be reinforced with residents before it is used. Residents' fear of reprimand needs to be studied further to assess whether it is a broad cultural phenomenon or an issue specific to our institution.
Conclusion
Web-based faculty evaluation systems are still in their infancy in our country, and computers are not used in all undergraduate programs. Educators wishing to use online evaluations need to be mindful of the computer competency of residents and departments. Based on our residents' comments, orientation to the web-based system should be intensive, so that residents feel comfortable using it. Time should be allocated for residents to complete the forms, and all efforts should be made to ensure that confidentiality is maintained. Residents' confidence in and support of the system are key to promoting adequate participation in web-based evaluation.
Ethics approval
This study was approved by the Aga Khan University Ethics Committee.
References
1. Rosenberg ME, Watson K, Paul J, Miller W, Harris I, Valdivia T. Development and implementation of a web-based evaluation system for an internal medicine residency program. Academic Medicine. 2001; 76(1): 92-5.
2. D'Cunha J, Larson CE, Maddaus MA, Landis GH. An Internet-based evaluation system for a surgical residency program. Journal of the American College of Surgeons. 2003; 196(6): 905-910.
3. Collins J, Herring W, Kwakwa F, Tarver RD, Blinder RA, Gray-Leithe L, Wood B. Current practices in evaluating radiology residents, faculty, and programs: results of a survey of radiology residency program directors. Academic Radiology. 2004;11(7): 787-794.
4. Benjamin S, Robbins LI, Kung S. Online resources for assessment and evaluation. Academic Psychiatry. 2006; 30(6): 498-504.
5. Samuel M, Coombes JC, Miranda JJ, Melvin R, Young EJW, Azarmina P. Assessing computer skills in Tanzania Medical Students: an elective experience. BioMed Central Public Health. 2004; 4:37.
6. Willett RM, Lawson SR, Garry JS, Kancitis IA. Medical student evaluation of faculty in student-preceptor pairs. Academic Medicine. 2007; 82(10 Suppl): S30-33.
7. Shoaib SF, Mirza S, Murad F, Malik AZ. Current status of e-health awareness among healthcare professionals in teaching hospitals of Rawalpindi: a survey. Telemedicine Journal and e-Health. 2009; 15(4):347-352.
8. Mansoor I. Computer skills among medical learners: a survey at King Abdul Aziz University, Jeddah. Journal of Ayub Medical College, Abbottabad. 2002; 14(3):13-15.
9. Ajuwon GA. Computer and internet use by first year clinical and nursing students in a Nigerian teaching hospital. BioMed Central Medical Informatics and Decision Making. 2003; 3:1-10.
10. Link TM, Marz R. Computer literacy and attitudes towards e-learning among first year medical students. BioMed Central Medical Education. 2006; 6:34.
11. Liamputtong P, Ezzy D. ‘Focus Group Discussions’, in P Liamputtong, D Ezzy (eds), Qualitative Research Methods. Hong Kong: Oxford University Press; 2005.
12. Schell SR, Lind DS. An Internet-based tool for evaluating third-year medical student performance. American Journal of Surgery. 2003; 185(3): 211-215.
13. Civetta JM, Morejon OV, Kirton OC, Reilly PS, Serebriakov II, Dobkin ED et al. Beyond requirements: residency management through the internet. Archives of Surgery. 2001; 136:412-417.
14. Kitzinger J. Qualitative research. Introducing focus groups. British Medical Journal. 1995; 311(7000):299-302.