
Year : 2008  |  Volume : 21  |  Issue : 2  |  Page : 233

In the News! An Opinion: Measuring Effects of Faculty Development

Associate Editor, Education for Health

Date of Web Publication: 12-Jan-2013

Correspondence Address:
J van Dalen
Associate Editor, Education for Health


Source of Support: None, Conflict of Interest: None

Rights and Permissions

How to cite this article:
van Dalen J. In the News! An Opinion: Measuring Effects of Faculty Development. Educ Health 2008;21:233


On an earlier occasion in this same space, I addressed the growing attention to staff training, or faculty development (van Dalen, 2006). Since then there has been a steady stream of publications describing attempts to improve teachers' efforts to help learners acquire new knowledge and insights.

Yvonne Steinert is one of the most frequently cited authors in this respect.

In the April 2008 issue of Medical Teacher her group published a report on the success of a workshop for educators, addressing how to develop successful workshops (Steinert et al., 2008).

In that same period (July 2008), Medical Education published two papers about the short- and long-term effects of faculty development programmes for medical educators. One group from the USA described the long-term follow-up of a 10-month programme in curriculum development (Gozu et al., 2008). The other, a Danish paper, described a controlled study of the short- and long-term effects of a Train-the-Trainers course (Rubak et al., 2008).

In his well-known work on the effects of teaching activities, Kirkpatrick (1998) stated that training effects can be expressed on four levels: reaction, learning, behaviour, and organization. Studies of the effects of educational activities rarely reach beyond self-reported learning.

The three studies mentioned above all attempt to move beyond this level. The dependent variables in Steinert et al.'s study were immediate and delayed post-workshop evaluations, assessment of self-perceived efficacy, and tracking of site-specific activities. Gozu et al. surveyed cohorts and compared them with control groups on self-reports of curriculum development activities actually deployed and of needs analyses done. Moreover, participants were asked to judge the impact of the programme in a follow-up study 6 to 13 years after completion. Assessing the long-term effect is clearly an additional contribution to our knowledge about the effect of teaching programmes. The Danish group claim to have gone even further: they tried to measure what actually happened after their Train-the-Trainers course. Their dependent variables were knowledge, teaching behaviour and learning climate. They state that 'learning climate' could be used as an indicator of the effect at Kirkpatrick's organizational level. Whether this is actually the case remains to be seen; after all, they used self-report questionnaires to measure learning climate. The relation between these self-reports and external (learners') judgments still needs to be clarified.

Nevertheless, these three studies are very good attempts to address the difficult issue of the effects of faculty development programmes. As with any social scientific research, the multi-causality of the proposed outcome measures is a serious handicap. Do learners learn more (compared with before) because their teachers have participated in a course? Can learners see the difference? Is 'learning more' the desired outcome? Or are we also happy with 'learning differently'?

In view of the rapidly developing field of faculty development, and in view of our quest to reveal how good learning can be facilitated, more studies like these are badly needed.

Jan van Dalen

Associate Editor Education for Health

Gozu, A., Windish, D.M., Knight, A.M., Thomas, P.A., Kolodner, K., Bass, E.B., Sisson, S.D. & Kern, D.E. (2008). Long term follow-up of a 10-month programme in curriculum development for medical educators: a cohort study. Medical Education, 42:7, 684-692.

Kirkpatrick, D.L. (1998). Evaluating training programmes: the four levels. London: Berrett-Koehler Publishers.

Rubak, S., Mortensen, L., Ringsted, C. & Malling, B. (2008). A controlled study of the short- and long-term effects of a Train-the-Trainers course. Medical Education, 42:7, 693-702.

Steinert, Y., Boillat, M., Meterissian, S., Liben, S. & McLeod, P. (2008). Developing successful workshops: a workshop for educators. Medical Teacher, 30:3, 328-330.

van Dalen, J. (2006). Staff training, health professions education or curriculum reform? Education for Health, 19:2, 271-272.

