Self-Observation and Peer Feedback as a Faculty Development Approach for Problem-Based Learning Tutors: A Program Evaluation
GARCIA, Irène, et al.
Teaching and Learning in Medicine, 2017, vol. 29, no. 3, pp. 313-325. DOI: 10.1080/10401334.2017.1279056
EDUCATIONAL CASE REPORTS
Irène Garcia^a, Richard W. James^b, Paul Bischof^c and Anne Baroffio^d

^a Department of Pathology and Immunology, University of Geneva Faculty of Medicine, Geneva, Switzerland; ^b Department of Internal Medicine, University of Geneva Faculty of Medicine, Geneva, Switzerland; ^c Department of Gynecology and Obstetrics, University of Geneva Faculty of Medicine, Geneva, Switzerland; ^d Unit of Development and Research in Medical Education, University of Geneva Faculty of Medicine, Geneva, Switzerland
KEY WORDS: peer coaching, reflection, peer feedback, problem-based learning (PBL) tutors, faculty development
CONTACT: Anne Baroffio, email@example.com - Geneva University Medical School - Unit for Development and Research in Medical Education - 1 rue Michel Servet - 1211 Geneva 4, Switzerland - tel +41 22 379 59 39 - fax +41 22 379
Good teaching requires spontaneous, immediate, and appropriate action in response to various situations. It is even more crucial in problem-based learning (PBL) tutorials, as the tutors, while directing students toward the identification and attainment of learning objectives, must stimulate them to contribute to the process and provide them with constructive feedback. PBL tutors in medicine lack opportunities to receive feedback from their peers on their teaching strategies. Moreover, as tutorials provide little or no time to stop and think, more could be learned by reflecting on the experience than from the experience itself. We designed and evaluated a faculty development approach for PBL tutors that combined self-reflection and peer feedback, both powerful techniques for improving performance in education.
We developed an observation instrument for PBL facilitation, to be used both by tutors to self-observe and reflect on their own teaching strategies, and by peers to observe tutors and provide them with feedback. Twenty PBL sessions were video-recorded. Tutors completed the instrument immediately after their PBL session and again while watching their video-recorded session (self-observation). A group of three observers completed the instrument while watching each recorded session and provided feedback to each tutor (peer observation and feedback). We investigated tutors' perceptions of the feasibility and acceptability of the approach, and gathered data on its effectiveness in enhancing tutors' facilitation skills.
The preclinical medical curriculum at the University of Geneva is essentially taught by PBL. A new program of faculty development based on self-observation and peer-feedback was offered to voluntary tutors and evaluated.
Our results suggest that self-observation and peer feedback, supported by an instrument, can be effective in enhancing tutors' facilitation skills. Reflection on self-observation raised teachers' awareness of the effectiveness of the strategies they used to foster student learning, which motivated them to change their teaching practice. However, for the changes to become operative, peer feedback was required, as it provided the cues and strategies needed to improve facilitation skills.
Lessons Learned
Peer coaching was considered feasible and useful to improve tutors’ facilitation skills. Evaluating the program made it possible to assess tutors’ needs and the reasons underlying their difficulties, and this in turn provided the basis for advanced workshops. Nonetheless, aspects related to logistics and the time constraints of such an individualized approach, as well as the cultural appropriation of peer coaching, might be obstacles that need to be addressed.
Helping teachers to become competent and professionalize their teaching practices requires faculty development.1 To be effective, faculty development activities should, inter alia, be evidence-based, tailored to suit individual needs, encourage experiential learning, provide feedback, and use strategies that stimulate reflection.1-3 Until now, the faculty development approaches most commonly used have been formal, such as seminars, workshops, and longitudinal programs.3
Informal approaches, such as workplace learning and peer coaching, have been used to a much lesser extent.3 Peer coaching (also referred to as peer observation of teaching) has proved to be an
effective technique for improving teaching skills4-6 and helping teachers to implement new strategies,6 resulting in a better application of learned skills and long-term changes in practice.7
Peer coaching can be defined as a collaborative relationship between the coach and coachee for the purpose of attaining professional and/or personal developments valued by the coachee.8 Although the definitions in the literature vary, there are common features among them: (1) It is a voluntary relationship based on collaboration, (2) it includes a component of self-evaluation, (3) the coach gives feedback to the coachee, (4) the partners discuss goals or preferred outcomes, and (5) the focus is on participants' strengths and the amplification of capacity.9 Thus, in addition to meeting individual needs, peer coaching promotes feedback, which is one of the most effective techniques for improving performance in education.10-13 Moreover, mentoring and obtaining input from supportive peers is one of the most influential elements in enabling the development of reflection,14-17 which is an essential aspect of professional competence.14 What seems particularly coherent and effective in peer coaching is that it associates the processes of self-evaluation, self-reflection, and feedback.
Moreover, reflection seems to be the process through which feedback becomes effective.18-23 Altogether, these studies suggest that the reflection process can lead to the realization of one’s deficiencies, but that actual changes are attained provided that the process is guided through peer feedback.24 This combined approach may change teaching practice.25
In PBL, the tutors' main role is to create optimal conditions for student learning, which requires that tutors master the subject matter of the PBL cases and facilitate the learning process.2, 26 Therefore, tutors need to develop facilitation skills that stimulate students' cognitive activities, such as elaborating, making connections, and synthesizing and integrating knowledge.27 Tutors also have to help students
identify learning needs and resources and monitor their own learning. In addition, their role is to provide feedback and to facilitate the group process.28-30
Faculty development is crucial for the effective implementation of PBL.31, 32 At our institution, we implemented workshops, communities of practice, and co-tutoring, some of which were shown to be effective in improving tutors’ facilitation skills.33-35 However, tutors in medicine have very few
opportunities to receive feedback on their teaching strategies from their peers and they could learn more by reflecting on their experience than from the experience itself. As peer coaching has been reported as helpful for PBL,32 we sought to design a peer coaching approach for PBL tutors to enable further improvement in our faculty development program.
Goals and Steps
We designed and tested a faculty development approach combining self-reflection and peer
feedback, and investigated its feasibility, its acceptability to the tutors, and its potential effectiveness in improving tutors' facilitation skills. First, we video-recorded tutorial sessions and developed a new instrument to be used both by peers to observe, record, and provide feedback to tutors on their facilitation strategies, and by the tutors themselves to self-record and reflect on their teaching strategies. Second, we conducted the intervention with 20 tutorial groups. Third, we investigated tutors' perceptions of the feasibility and acceptability of the approach through semi-structured interviews and gathered preliminary data on the effectiveness of the approach by exploring tutors' self-perceptions of changes in their actual practice, as well as examining their students' ratings of teaching skills.
The project was carried out in accordance with the Declaration of Helsinki and was exempted from formal review by the Chair of the Ethics Committee for Public Health Research.
This program was conducted during the 2006/2007 and 2008/2009 academic years within the context of the faculty development program for tutors teaching in the preclinical years. Briefly, the undergraduate curriculum is divided into a selection year (1st year of study), two preclinical years (2nd and 3rd years), two clinical years (4th and 5th years), and one elective year (6th year). The learning activities in the preclinical years are integrated, and in large part based on PBL. Each PBL case was studied in two sessions: a tutorial (case opening) and a reporting (case wrap-up) session.
They were typically scheduled to last two hours. In the tutorial session, a group of 10 students analyzed and attempted to explain the problem and generated the learning objectives. Using these objectives and the reading references provided, the students engaged in self-directed study. Around
two to three days later, they reconvened in the reporting session to put together their newly acquired knowledge to explain and answer the questions elicited by the problem in greater depth.
They finished the session by analyzing their learning processes and newly acquired knowledge and evaluating their group functioning. Both sessions were carried out with the tutors, whose role was as follows: to guide the students’ learning and their processes of analyzing and explaining the problem;
to encourage group communication and collaboration; to help students to evaluate group
functioning, in particular whether they attained the set learning objectives; to provide students with feedback.2,28-30
To recruit tutors, we first presented the study and its goals to the directors of the 11 PBL
instructional units (PIU) of the preclinical program. The directors were asked to relay to their tutors our need to enlist 2 or 3 volunteers per PIU. Twenty-seven tutors (1 to 3 tutors from each of 10 of the 11 PIUs) agreed to participate. Of these, 5 were ultimately not included in the study: 2 because the students did not consent to the video-recording of their session, 2 due to scheduling difficulties, and 1 because the tutor had changed his mind. We video-recorded 22 PBL sessions, each including a tutorial and a reporting session. Two sessions (2 tutors) were used to pilot the instrument and 20 were included in the study.
Of the 22 tutors, 16 had more than 10 years of tutoring experience and 6 had from 4 to 6 years’
experience. Each tutor facilitated a student group during one PIU (2-6 weeks) once a year.
The four authors were tutors with more than 10 years of experience and acted as peer observers.
One of them (AB) had initiated the study, and the other three had volunteered to participate in testing the approach.
An instrument was designed to guide both peer feedback and the tutors' self-reflection processes. It was developed based on the key facilitation strategies considered to optimize student learning, derived from the peer observers' tutoring experiences and from the existing literature.2, 28-30 The 24 items described observable behaviors or strategies21, 36, 37 with explicit performance criteria, easy to observe and assess, and were organized according to the main steps of the PBL tutorial (problem analysis, preparation for self-directed learning, problem synthesis, discussing the group process, and managing group dynamics). The facilitation strategies were rated on a four-point
scale as optimally (score 4), partially (score 3), insufficiently (score 2), or not (score 1) promoting student learning, with the extreme scores (1 and 4) described as rubrics or anchoring behaviors.38,39-41 The first version of the instrument was pre-tested and completed by the four peer observers while they watched two video-recorded PBL sessions. Two videotaped tutors also completed the
instrument, and were then interviewed to gather their suggestions and comments. Altogether, the strategies described in the instrument were confirmed as valid by the tutors’ and peers’
observations, but a few adjustments in the formulations were found to be necessary. These were made by consensus among all peer observers and the version obtained was that used for the study (Table 1).
The intervention comprised the following sequence (see also Table 2):
• Before the PBL session, the volunteer tutors were oriented to the aims and the different steps of the study and were presented with the instrument and given guidance on how to complete it.
We emphasized that the procedure was formative. Similarly, the students were informed of the aim of the study and were asked to provide their consent to participate.
• All tutors conducted a tutorial and reporting session on a problem (a PBL session) with their student groups, and immediately rated themselves with the instrument on how they thought they had used the facilitation strategies to foster their students' learning. Sheets were returned to the investigators. The sessions were video-recorded to allow subsequent observation by peer observers and review of their own sessions by the tutors.
• During the following month, each tutor completed the instrument a second time while watching their recorded PBL session, rating how they observed themselves using the facilitation strategies and reflecting on the role of their actions in student learning (self-observation).
• Peer observers (in groups of three) independently reviewed each tutor’s recorded session while completing the instrument and rated how tutors used facilitation strategies (peer observation).
At the end of this phase, they compared and discussed their observations to prepare for the feedback session.
• During the next two weeks, a one-hour feedback session was scheduled between each observed tutor and the three observers. The feedback process essentially concentrated on whether the tutors' strategies were effective in facilitating student learning and used the following steps recommended for guiding reflection:13, 36, 42, 43 (1) needs assessment ("Are there any concerns, difficulties, or particular points that you observed in your videotaped session and want to discuss today?"); (2) tutor self-assessment ("Could you detail for us what you think your strong points are … and the points that could be improved?"); (3) peer feedback (based on needs and self-assessment, reinforcing strong points, confirming points for improvement, and when necessary raising weaknesses not recognized by the tutor); (4) outcomes (propositions and exchanges concerning strategies that could be used for improvement); (5) transfer ("What have you taken from this discussion that you want to apply during your next PBL session?").
Preliminary Psychometric Properties of the Observation Instrument
We used the data obtained from the video peer ratings to explore (1) the internal consistency of the instrument (i.e., whether the items were related to each other and measured a single characteristic;
Cronbach’s α coefficients considered satisfactory if ≥ 0.80), (2) inter-coder reliability (i.e., whether the three peer observers coded similarly; Kendall concordance coefficients considered satisfactory if
≥ 0.75), and (3) the capacity of the instrument to discriminate tutors (relative generalizability coefficients G considered satisfactory if ≥ 0.80). G coefficients estimate the proportion of true
variance, that is, the differentiation variance due to what was measured (i.e., tutors) in relation to the total variance (i.e., the sum of the differentiation variance and the error variance due to other sources such as instrumentation effects). They were calculated through the generalizability method using the symmetry principle:44 the measurement plan included tutors (T) as the differentiation facet, and raters (R) and items (I) as the instrumentation facets.
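The reliability statistics above were computed with SPSS and EduG. Purely as an illustration, the first two coefficients can be sketched from their classical formulas: Cronbach's α over a sessions-by-items score matrix, and Kendall's W over a raters-by-tutors matrix (this simplified version converts scores to within-rater ranks and does not correct for ties). The function names and toy matrices are ours, not the study's:

```python
import numpy as np

def cronbach_alpha(scores):
    """Internal consistency: scores is an (observations x items) matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    # Sum of per-item variances vs. variance of the total score.
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def kendalls_w(ratings):
    """Inter-coder concordance: ratings is a (raters x subjects) matrix.
    Scores are converted to within-rater ranks; ties are not averaged."""
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    ranks = np.empty_like(ratings)
    for i, row in enumerate(ratings):
        # Rank 1 goes to the lowest-scored subject, rank n to the highest.
        ranks[i, np.argsort(row)] = np.arange(1, n + 1)
    rank_sums = ranks.sum(axis=0)
    # Deviation of rank sums from their mean, normalized to [0, 1].
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))
```

Raters who order the subjects identically yield W = 1, and two raters with fully reversed orderings yield W = 0; the G coefficient, by contrast, requires a full variance-component decomposition and was obtained from the EduG software.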
Tutors' Perceptions of the Feasibility, Acceptability, and Utility of the Approach

Tutors' perceptions of the feasibility and utility of the approach were investigated through individual semi-structured interviews with 18 tutors immediately after the feedback session, aiming to
investigate the potential impact of the video-recordings on group and tutor functioning, the
relevance and ease of use of the instrument, and the utility of the approach (see Table 3 for details of questions asked). The open answers to the questions were content analyzed and the issues raised by tutors were classified by frequency to determine their relative importance.
Effectiveness in Improving Facilitation Skills

Tutors' perceptions
A second individual semi-structured interview was planned during the tutors' next PBL session, which occurred one year later, to investigate whether tutors used new facilitation strategies and had improved their teaching effectiveness following the faculty development intervention (see Table 3 for details of questions asked). It was analyzed with the same procedure as the first interview.
Tutors’ self-ratings of facilitation skills
To investigate whether "self-observation" led tutors to reappraise how they used facilitation strategies, their self-ratings upon reviewing their videos were compared to their pre-intervention self-ratings and to the video peer ratings (Table 2) with a repeated-measures MANOVA.
To investigate whether "receiving peer feedback" led tutors to modify their facilitation strategies, they were asked to complete the instrument once again one year later, after their PBL session. These post-intervention self-ratings were compared to the pre-intervention ratings (Table 2) with a repeated-measures MANOVA.
Students’ evaluation ratings of tutors
To complement tutors’ self-ratings, we used ratings obtained from their students. We compared the student ratings that 16 tutors obtained before and one year after the intervention. The students’
ratings were obtained from the standard tutor evaluation questionnaire that is routinely
administered after each teaching unit33 (17 items rated on a 5-point scale). Assuming that only low-rated tutors would potentially improve their ratings, we distinguished between tutors rated as "good" by students before the intervention (mean rating ≥ 4 out of 5; n=12) and those rated as "needing improvement" (mean rating < 4; n=4), the criterion adopted by our institution for identifying a tutor in need of remediation. Comparisons were made with a 3-way ANOVA (pre- vs. post-intervention; good vs. needing improvement; tutor).
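The classification step above is simple to state: a pre-intervention mean student rating of at least 4 out of 5 classifies a tutor as "good", anything lower as "needing improvement". A minimal sketch with hypothetical ratings (the numbers are illustrative only, not the study's data):

```python
import numpy as np

# Hypothetical mean student ratings (5-point scale) for eight tutors,
# before and one year after the intervention; illustrative values only.
pre  = np.array([4.5, 4.1, 3.6, 4.3, 3.2, 4.7, 3.9, 4.4])
post = np.array([4.6, 4.1, 4.2, 4.3, 3.9, 4.6, 4.3, 4.4])

good = pre >= 4.0      # institutional criterion for a "good" tutor
needing = ~good        # "needing improvement" (candidate for remediation)

# Mean pre-to-post change per group: only the low-rated group
# is expected to move, mirroring the interaction effect reported.
change_good = (post[good] - pre[good]).mean()
change_needing = (post[needing] - pre[needing]).mean()
```

In the study itself the comparison was run as a 3-way ANOVA; the sketch shows only the grouping and per-group change, not the inferential test.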
We used the Statistical Package for the Social Sciences (SPSS release 22, SPSS Inc., Chicago, IL, USA) for the descriptive and inferential analyses, and the EduG software (version 3.07, EduG Inc., Québec, QC, Canada) for analyses involving the G study.
Preliminary Psychometric Properties of the Observation Instrument
With a G coefficient (df 356, N=17) = 0.84 (error variance lower than 20%), the instrument seemed to allow peers to discriminate reliably between the tutors. The internal consistency of the instrument, Cronbach's α (N=24) = 0.875, and the inter-coder reliability, Kendall's coefficient (N=3) = 0.75, were both found to be satisfactory.
Tutors' Perceptions of the Feasibility, Acceptability, and Utility of the Approach

The detailed analysis of tutors' responses in the first semi-structured interview is shown in the first part of Table 3.
Process of video recording
Most tutors reported that the tutorial process they had undergone during the study was essentially representative of their usual tutorials. A few, however, reported modest changes such as speaking less than they usually did (justified by these tutors as an awareness of speaking too much during their usual tutorials) or students being less active in the tutorial.
Tutors assessed the instrument as relevant, feasible, and easy to complete. Some, however, raised the need for the items and scale to be introduced explicitly before use.
The use of an instrument was considered relevant and useful in reminding tutors of the steps and strategies to be used during a PBL tutorial, and in guiding them to reflect on how they acted with regard to students' learning. Self-observation of the video allowed them either to confirm their initial mental image or to identify problems in their functioning of which they had been unaware. A few felt positively impressed by their own performance. More than half felt motivated to change by using new strategies. One tutor, who had been confronted with unprofessional student behavior but had not reacted, resolved not to tolerate such situations in the future.
Faculty development approach
Self-observation of the video was useful for tutors in recognizing and learning what strategies worked or failed with students. Peer feedback was experienced as formative, reassuring, and useful, making them aware of their strengths and weaknesses, providing new strategies, and promoting self-reflection. All tutors acknowledged that the approach should be part of the tutor training, either for experienced tutors or for new tutors, and should be included once every 2 to 5 years. Most of them were willing to pair with another tutor to exchange observations and feedback. However, they were divided on whether the partnering tutor should be from the same teaching unit, allowing discussion of issues related to the content being taught, or from another teaching unit, with an "appropriate"
distance so that the discussion could be objective and focused on the pedagogical process. A few tutors expressed a preference to be observed and receive feedback from educational experts.
Effectiveness in Improving Facilitation Skills

Tutors' perceptions (Table 3, second part)
Of the 17 tutors who were interviewed one year later, all but one remembered the points they had wanted to improve. More than half of them reported having modified their strategies following the recommendations and thought they had improved as tutors. Examples of new strategies were
“asking questions back to students instead of answering them,” “giving more feedback to students,”
and “positively reinforcing their learning.” Some tutors felt more at ease and comfortable because they perceived their practice as having been reinforced.
Asked to rank what had been most useful in their training, 11 tutors considered that "self-observation" had been even more important for them than "getting peer feedback" (5 tutors) and "using the instrument as a reminder of the tutor role" (1 tutor).
Tutors’ self-ratings of facilitation skills (Table 4)
The phase of self-observation did not lead tutors to reappraise the facilitation strategies they had used (pre- vs. self-observation, F(6,6) = 1.08; p=.434; effect size=.394). Moreover, mean self-ratings differed significantly from mean peer ratings (self- vs. peer observation, F(6,6) = 9.14; p=.001; effect size=.820). This was essentially due to "self-directed learning" and "discussing the group process," which peers reported as not happening as often as tutors thought. In contrast, after receiving peer feedback, tutors rated themselves as having better facilitation skills (pre- vs. post-intervention, F(6,6) = 4.71; p=.041; effect size=.825). Thus, we infer from these results that it is peer feedback combined with self-observation that induces tutors to improve their self-ratings, confirming their impression of being better tutors.
Students’ evaluation ratings of tutors (Table 5)
Tutors who had been rated by students as "needing improvement" (mean rating < 4; n=4) significantly increased their ratings after the intervention, whereas the ratings of those rated as "good" (mean rating ≥ 4; n=12) stayed stable (interaction effect, F(1,1) = 13.71; p=.002; effect size=.495). This confirmed the tutors' self-ratings, at least for those tutors who needed to and could improve (according to their students).
The purpose of this study was to design and evaluate a peer coaching approach to support and improve the practice of PBL tutors in medicine. It was based on self-reflection and peer feedback and used a single instrument designed to guide tutors to self-reflect on their facilitation strategies and to
aid peer observers in giving feedback to tutors. This study suggests that the approach is acceptable to tutors, feasible, and effective.
Faculty Development Approach In General
Generally, the tutors observed and the peer observers found the approach feasible and acceptable, in addition to being relevant and useful. Videotaping affected the tutorial process moderately or not at all; this is an important issue as it offers much more flexibility to observers in watching and rating tutorial sessions, and is also necessary for tutors to reflect on their action.
Like other peer coaching programs,45-48 our participants evaluated the whole process as relevant and important in that it provided an opportunity for insight and reflection,49, 50 which in turn helped create a positive learning environment. They stated that it should be integral to the faculty development program. Similarly to Sullivan et al.,47 we also observed an additional benefit for the peer observers: the multiple observations allowed us to benchmark what constitutes good performance and deduce rules for good functioning. This in turn promoted the peer observers' reflection on their own facilitation style and its content.
Utility of the Instrument for Observation of PBL Facilitation
The instrument's content was judged by tutors as coherent with the PBL tutorial process in terms of strategies that tutors can use to enhance student learning. They declared it easy to use, provided that explanations on how to apply it are given first. The four peer observers who developed and tested the instrument found it useful and sufficiently wide-ranging to give a reliable account of what occurred during the tutorial. The descriptive nature of each item and the use of a rating scale based on rubrics40, 41 were found to be useful for identifying the strengths and weaknesses of the tutors observed, and facilitated preparation for the feedback sessions by providing the information needed for a constructive dialogue concerning potential improvements in teaching skills. Our preliminary psychometric analysis suggested satisfactory reliability and inter-peer agreement.
More importantly, tutors reported the instrument to be useful for reviewing or clarifying their role and enabling them to reflect on their own strategies or habits of fulfilling or ignoring certain steps.
The provision of a self-assessment tool can provide clear standards and a definition of excellence in tutoring; in addition, it may increase the influence of external change agents on facilitation practices, in our case the influence of peers.24
Respective Roles of Self-Observation and Peer Feedback
Self-observation and the rating of their own video-recorded performance in PBL sessions were perceived by tutors as providing opportunities to self-assess the actions and strategies they used to foster student learning. It also increased their motivation for change. Peer feedback was experienced as reassuring and useful for making progress, by promoting awareness and reflection on their own strengths and weaknesses and by providing new strategies to facilitate student learning.
Self-observation is an external source of information that can illustrate gaps between desired and actual practices and inform self-assessment. However, unlike the reported lowering of residents' ratings of their performance in simulated resuscitations after video review,51 our tutors did not rate themselves differently after self-observation. Moreover, their self-ratings disagreed with those of their peers on two parts of the tutorial process: "preparing students for self-directed learning" and "discussing the group process." Peer ratings have been shown to be more reliable and valid than self-ratings.52 The discrepancy observed in this study could be due to tutors' inaccurate representation of their role, making them unaware of the expected level of performance. Alternatively, it could relate to difficulty in acknowledging real deficiencies, since self-assessment has been widely demonstrated to be inaccurate.53 The field of self-assessment is very complex and still needs further research.54
Although tutors were not able to adjust their monitoring when confronted with divergences between what they thought they had taught and what they observed they had taught, most of them considered this step to be even more important than receiving peer feedback in their reflection process. Thus, although not effective in terms of self-assessment, self-observation may have stimulated reflection. As recently theorized,55 reflection can build on the observations, judgments, and reactions the tutor generates during (reflection-in-action) and after (reflection-on-action) the tutoring, when monitoring their performance prior to receiving external feedback; reflection also builds on peer feedback itself, by weighing peer observations against one's own and forming an opinion on that feedback (reflection-on-feedback). The reflection process can lead to the realization of one's own deficiencies, but actual changes are attained provided that the process is guided through peer feedback.18-23, 24
The way in which feedback is perceived and discussed influences whether it will be adopted and will thus contribute to learning.10, 56 This depends in particular on the environment of trust and the benevolent position of the feedback provider.57 In our setting, the peer observers were colleagues with similar tutoring experience, and the observed tutors were engaged as active partners in a relaxed, non-normative feedback discussion. Moreover, the analysis was focused on student learning rather than on the tutor's performance, and good practices were reinforced by peer observers. This might have constituted a safe and trusting environment favoring acceptance of feedback. As raised by others,58, 59 it is more likely the feedback dialogue, rather than the feedback itself, that facilitates the analysis of teaching practices. It encouraged teachers to explain their intent, to reflect on their action, and to enact subsequent changes,16, 25 which in turn motivated performance improvement.43, 60
Effectiveness of the Intervention in Long-Term Teaching Practice
The intervention did indeed seem to allow participants to change and improve their teaching practices. During the first debriefing session, many reported that the intervention had motivated them to change. One year later, during the second debriefing session, most of them remembered what they had decided to change; moreover, they reported having modified their teaching strategies and perceived themselves as being better tutors (adapted Kirkpatrick level 3a "behavior"3).
In addition to this perception, they also rated themselves higher on teaching skills such as "guiding problem analysis," "preparation for self-learning," "problem synthesis," and "managing group dynamics." As noted earlier, self-assessments are inaccurate and could be inflated by processes linked to satisfaction with the training.53 However, the better evaluations received from students (adapted Kirkpatrick level 3b "behavior"3) argue in favor of a real improvement, at least for the tutors who volunteered for this study and had been rated by their students as needing improvement. This is in line with other peer observation programs, which promote participants' deeper critical reflection and the development of concrete plans to change their teaching,61 and whose participants' subsequent teaching is rated by students as better or much better.62
Conditions, Strengths, and Limitations
The main strength of this evaluation is the long-term follow-up of the program participants, together with the collection of both qualitative and quantitative data on the potential outcomes of the approach.
However, our study also has several limitations. First, it needs to be extended to a larger population of tutors to confirm the data presented here. Second, the participants were volunteers, who might thus be more committed to change; compulsory attendance could elicit negative reactions and compromise the success of the approach. On the other hand, these tutors comprised a mixed population of good and underperforming tutors (from the students' point of view), and the effect of the intervention might be greater in a population consisting only of tutors who need to improve. Third, the progress in teaching skills attained by the tutors who attended this faculty development activity should be confirmed by peer observers, which would require a second peer observation one year later. Fourth, we did not provide tutors with a written account of the feedback discussion, nor did we ask them to use reflective writing; either could have reinforced their learning.
Fifth, the four peer observers involved in this study had designed the instrument and had been able to create a common understanding of the tutor's role and teaching effectiveness. Consequently, the inter-coder reliability, which was satisfactory in our setting, might be lower with untrained peer observers.38 A comprehensive peer observation program could involve many tutors with differing views on the process of tutoring. We hope the instrument is sufficiently detailed to prevent misinterpretation, but we nevertheless think that any tutor involved in such a program should be trained in analyzing tutorials.63
In any case, peer observations are not intended as a summative measure of performance but should remain formative: the main purpose of the approach is to elicit a dialogue between the observed and observer tutors that fosters reflection and improves facilitation skills, so a lower reliability would not affect the usefulness of the process.
Impact of the Study on the Institutional Faculty Development Program
Before this study, our institution offered comprehensive pedagogical training for first-time PBL tutors so that they would be ready to start teaching.31 Additional workshops addressed potential difficulties encountered by tutors, but these were not based on an individualized approach directed at tutors' personal needs. As an outcome of this study, the faculty development program evolved in two directions. First, peer coaching was offered to all tutors wanting to improve their skills and to tutors identified through students' evaluations of teaching as needing support. Second, as the peer observations revealed recurrent tutoring difficulties, we developed a set of two-hour, hands-on lunchtime workshops directed at specific issues (such as "providing effective feedback," "facilitating learning in small groups," and "using reflective practice to improve teaching"). The situations observed during the study were used to construct video clips and vignettes intended to prompt tutors to think and practice in various simulated situations.64 The observations also made us realize the importance of reflective practice in tutors' professional development and prompted us to develop a framework of practical questions, based on Schön's19 and Kolb's18 models, to help teachers engage in reflection and work on their own difficult and as yet unresolved teaching situations.
After four years' experience, we consider that the program has grown considerably: more than 600 participants have attended 53 advanced workshops, and 2–5 volunteer tutors per year choose to attend the peer coaching program. The workshops are highly rated (average of 4.5 on a 5-point Likert scale) and have received a considerable amount of positive feedback. Peer-coached tutors have been very satisfied and have improved their performance. We now intend to extend the peer coaching program to all tutors, as it is an opportunity to work at the institutional level, as recommended,3 to reinforce a culture of feedback, reflective practice, communities of practice, collaborative thinking, and exchange. As with any institutional change, this implies a cultural appropriation of the approach, and work remains to be done before each unit director agrees to make the program mandatory for every tutor. There is also evidence that "not having enough time" is one of the first barriers nurturing a "resistance to change,"65 especially with regard to such a time-intensive approach. However, we feel that the culture has slowly been changing with regard to faculty development initiatives, in particular because of the success of the advanced workshops.
Peer coaching appears to be a promising faculty development approach for PBL tutors. The study participants considered it not only feasible and acceptable but also valuable and useful for improving their teaching skills, a view supported by data obtained one year later.
Introducing and evaluating the program offered a great opportunity 1) to gain insight into how tutors actually function, 2) to evaluate their needs, and 3) to better understand the hidden reasons underlying their difficulties. Nonetheless, the logistics and time constraints of such an individualized approach, as well as the cultural appropriation of peer coaching, might be obstacles that need to be addressed to facilitate the implementation of such a program. We feel that its application in other settings would benefit from a similar program evaluation to adapt the approach to the local culture and context.
References
1. McLean M, Cilliers F, Van Wyk JM. Faculty development: yesterday, today and tomorrow. Medical Teacher 2008;30;6:555-84.
2. Dolmans DH, Gijselaers WH, Moust JH, de Grave WS, Wolfhagen IH, van der Vleuten CP. Trends in research on the tutor in problem-based learning: conclusions and implications for educational practice and research. Medical Teacher 2002;24;2:173-80.
3. Steinert Y, Mann K, Anderson B, Barnett BM, Centeno A, Naismith L, et al. A systematic review of faculty development initiatives designed to enhance teaching effectiveness: A 10-year update:
BEME Guide No. 40. Medical Teacher 2016:1-18.
4. McLeod PJ, Steinert Y. Peer coaching as an approach to faculty development. Medical Teacher 2009;31;12:1043-4.
5. Bell M. Supported reflective practice: a programme of peer observation and feedback for academic teaching development. International Journal for Academic Development 2001;6;1:29- 39.
6. Showers B, Joyce B. The evolution of peer coaching. Educational Leadership 1996;53;6:12-6.
7. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. Journal of Consulting and Clinical Psychology
8. Grant AM, Cavanagh MJ, Parker HM. The State of Play in Coaching Today: A Comprehensive Review of the Field. International Review of Industrial and Organizational Psychology 2010. Wiley- Blackwell, Oxford UK, 2010; 25; 125-67.
9. Schwellnus H, Carnahan H. Peer-coaching with health care professionals: What is the current status of the literature and what are the key components necessary in peer-coaching? A scoping review. Medical Teacher 2014;36;1:38-46.
10. Hattie J, Timperley H. The Power of Feedback. Review of Educational Research 2007;77;1:81-112.
11. Norcini J. The power of feedback. Medical Education 2010;44;1:16-7.
12. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians clinical performance: BEME Guide No. 7. Medical Teacher 2006;28;2:117-28.
13. Wood BP. Feedback: a key feature of medical training. Radiology 2000;215;1:17-9.
14. Mann K, Gordon J, MacLeod A. Reflection and reflective practice in health professions education:
a systematic review. Advances in Health Sciences Education 2009;14;4:595-621.
15. Bell A, Mladenovic R. The benefits of peer observation of teaching for tutor development. Higher Education 2008;55;6:735-52.
16. Siddiqui ZS, Jonas-Dwyer D, Carr SE. Twelve tips for peer observation of teaching. Medical Teacher 2007;29;4:297-300.
17. Vidmar DJ. Reflective peer coaching: Crafting collaborative self-assessment in teaching. Research Strategies 2005;20;3:135-48.
18. Kolb DA. Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall; 1984.
19. Schön DA. The reflective practitioner: How professionals think in action. New York: Basic Books; 1983.
20. Pelgrim EAM, Kramer AWM, Mokkink HGA, van der Vleuten CPM. Reflection as a component of formative assessment appears to be instrumental in promoting the use of feedback; an
observational study. Medical Teacher 2013;35;9:772-8.
21. Archer JC. State of the science in health professional education: effective feedback. Medical Education 2010;44;1:101-8.
22. Sargeant J, Mann K, van der Vleuten C, Metsemakers J. Reflection: a link between receiving and using assessment feedback. Advances in Health Sciences Education 2009;14;3:399-410.
23. Sandars J. The use of reflection in medical education: AMEE Guide No. 44. Medical Teacher 2009;31;8:685-95.
24. Ross JA, Bruce CD. Teacher self-assessment: A mechanism for facilitating professional growth.
Teaching and Teacher Education 2007;23;2:146-59.
25. Britton LR, Anderson KA. Peer coaching and pre-service teachers: Examining an underutilised concept. Teaching and Teacher Education 2010;26;2:306-14.
26. Stoddard HA, Borges NJ. A typology of teaching roles and relationships for medical education.
Medical Teacher 2015;38;3:1-6.
27. Gerhardt-Szep S, Kunkel F, Moeltner A, Hansen M, Bockers A, Ruttermann S, et al. Evaluating differently tutored groups in problem-based learning in a German dental curriculum: a mixed methods study. BMC Medical Education 2016;16;1:14.
28. Rosado Pinto P, Rendas A, Gamboa T. Tutors' performance evaluation: a feedback tool for the PBL learning process. Medical Teacher 2001;23;3:289-94.
29. De Grave WS, Dolmans DH, van der Vleuten CP. Profiles of effective tutors in problem-based learning: scaffolding student learning. Medical Education 1999;33;12:901-6.
30. Papinczak T, Tunny T, Young L. Conducting the symphony: a qualitative study of facilitation in problem-based learning tutorials. Medical Education 2009;43;4:377-83.
31. Farmer EA. Faculty development for problem-based learning. European Journal of Dental Education 2004;8;2:59-66.
32. Lim A, Choy L. Preparing staff for problem-based learning: Outcomes of a comprehensive faculty development program. International Journal of Research Studies in Education 2014;3;4:53-68.
33. Baroffio A, Kayser B, Vermeulen B, Jacquet J, Vu NV. Improvement of tutorial skills: an effect of workshops or experience? Academic Medicine 1999;74;10 Suppl:S75-7.
34. Baroffio A, Nendaz MR, Perrier A, Layat C, Vermeulen B, Vu NV. Effect of teaching context and tutor workshop on tutorial skills. Medical Teacher 2006;28;4:e112-9.
35. Baroffio A, Nendaz MR, Perrier A, Vu NV. Tutor training, evaluation criteria and teaching
environment influence students' ratings of tutor feedback in problem-based learning. Advances in Health Sciences Education 2007;12;4:427-39.
36. Ende J. Feedback in clinical medical education. JAMA 1983;250;6:777-81.
37. Kaprielian VS, Gradison M. Effective use of feedback. Family Medicine 1998;30;6:406-7.
38. Berk RA, Naumann PL, Appling SE. Beyond student ratings: peer observation of classroom and clinical teaching. International Journal of Nursing Education Scholarship 2004;1:Article10.
39. Huba M, Freed J. Using rubrics to provide feedback to students. In M Huba and J Freed (Eds.) Learner-centered assessment on college campuses: Shifting the focus from teaching to learning.
Boston, Ally & Bacon,2000. p. 151-200.
40. Peets A, Cooke L, Wright B, Coderre S, McLaughlin K. A prospective randomized trial of content expertise versus process expertise in small group teaching. BMC Medical Education 2010;10;1:70.
41. Allen D, Tanner K. Rubrics: tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE Life Sciences education 2006;5;3:197-203.
42. Hewson MG, Little ML. Giving feedback in medical education: verification of recommended techniques. Journal of General Internal Medicine 1998;13;2:111-6.
43. Cantillon P, Sargeant J. Giving feedback in clinical settings. BMJ 2008;337:a1961.
44. Cardinet J, Johnson S, Pini G. Applying generalizability theory using EduG. New York: Routledge/Taylor & Francis; 2010.
45. Flynn SP, Bedinghaus J, Snyder C, Hekelman F. Peer coaching in clinical teaching: a case report.
Family Medicine 1994;26;9:569-70.
46. O'Keefe M, Lecouteur A, Miller J, McGowan U. The Colleague Development Program: a
multidisciplinary program of peer observation partnerships. Medical Teacher 2009;31;12:1060-5.
47. Sullivan PB, Bickle A, Nicky G, Atkinson SH. Peer observation of teaching as a faculty development tool. BMC Medical Education 2012;12;26:2-6.
48. Thijs A, van den Berg E. Peer coaching as part of a professional development program for science teachers in Botswana. International Journal of Educational Development 2002;22;1:55-68.
49. Finn K, Chiappa V, Puig A, Hunt DP. How to become a better clinical teacher: A collaborative peer observation process. Medical Teacher 2011;33;2:151-5.
50. Soisangwarn A, Wongwanich S. Promoting the Reflective Teacher through Peer Coaching to Improve Teaching Skills. Social and Behavioral Sciences 2014;116:2504-11.
51. Plant J, Corden M, Mourad M, O’Brien B, van Schaik S. Understanding self-assessment as an informed process: residents’ use of external information for self-assessment of performance in simulated resuscitations. Advances in Health Sciences Education 2013;18;2:181-92.
52. Eva KW. Assessing tutorial-based assessment. Advances in Health Sciences Education 2001;6;3:243-57.
53. Eva KW, Cunnington JP, Reiter HI, Keane DR, Norman GR. How can I know what I don't know?
Poor self assessment in a well-defined domain. Advances in Health Sciences Education 2004;9;3:211-24.
54. Eva KW, Regehr G. Self-Assessment in the Health Professions: A Reformulation and Research Agenda. Academic Medicine 2005;80;10:S46-S54.
55. Pelgrim E, Kramer A, Mokkink H, Van der Vleuten C. Reflection as a component of formative assessment appears to be instrumental in promoting the use of feedback; an observational study.
Medical Teacher 2013;35:772 - 8.
56. Watling C, Lingard L. Toward meaningful evaluation of medical trainees: the influence of
participants’ perceptions of the process. Advances in Health Sciences Education 2012;17;2:183-94.
57. Eva K, Armson H, Holmboe E, Lockyer J, Loney E, Mann K, et al. Factors influencing responsiveness to feedback: on the interplay between fear, confidence, and reasoning processes. Advances in Health Sciences Education 2012;17;1:15-26.
58. Tigelaar DEH, Dolmans DHJM, De Grave WS, Wolfhagen IHAP, Van Der Vleuten CPM. Participants' opinions on the usefulness of a teaching portfolio. Medical Education 2006;40;4:371-8.
59. Dolmans DJM. Self-assessment and dialogue: can it improve learning? Advances in Health Sciences Education 2013;18;2:193-5.
60. van den Boom G, Paas F, van Merriënboer JJG. Effects of elicited reflections combined with tutor or peer feedback on self-regulated learning and learning outcomes. Learning and Instruction 2007;17;5:532-48.
61. Boerboom TBB, Jaarsma D, Dolmans DHJM, Scherpbier AJJA, Mastenbroek NJJM, Van Beukelen P.
Peer group reflection helps clinical teachers to critically reflect on their teaching. Medical Teacher 2011;33;11:e615-e23.
62. Pattison AT, Sherwood M, Lumsden CJ, Gale A, Markides M. Foundation observation of teaching project – A developmental model of peer observation of teaching. Medical Teacher
63. Orlander JD, Gupta M, Fincke BG, Manning ME, Hershman W. Co-teaching: a faculty development strategy. Medical Education 2000;34;4:257-65.
64. Bosse HM, Huwendiek S, Skelin S, Kirschfink M, Nikendei C. Interactive film scenes for tutor training in problem-based learning (PBL): dealing with difficult situations. BMC Medical Education 2010;10;1:1-14.
65. Spallek H, O'Donnell JA, Yoo YI. Preparing faculty members for significant curricular revisions in a school of dental medicine. Journal of Dental Education 2010;74;3:275-88.
Table 1: Instrument for observation of PBL facilitation (translated from the original French version)
The tutor's contribution ranges from "makes learning uncertain" (1) to "optimally promotes learning" (4).

1. Problem analysis (rated 1–4 or NN*)
1 Defining the problem: (1) does not make the group define the problem; (4) ensures the group defines the problem and raises relevant questions
2 Prior knowledge: (1) does not encourage students to apply prior knowledge; (4) stimulates students to exploit prior knowledge
3 Links: (1) leaves the group to enumerate / make a list of acquired knowledge and concepts; (4) encourages students to regroup acquired knowledge and concepts and schematise them
4 In-depth analysis: (1) inappropriately interrupts the group to seek or give information, without considering the group's own reasoning; (4) encourages students to reason and develop their own hypotheses
5 Structuring / synthesizing: (1) allows detailed discussion of minor or irrelevant points; (4) helps the group to structure its reasoning and to summarise or synthesize when appropriate
6 Time management: (1) poor time management; (4) ensures all aspects of the problem are discussed within the allotted timeframe

2. Self-directed learning
7 Learning objectives: (1) after analysing the problem, does not help the group to formulate its own questions and objectives; (4) helps the group to formulate its own questions and objectives and advises on the information required
8 Resources: (1) does not discuss appropriate sources of information; (4) discusses sources of information appropriate for the objectives

3. Group dynamics
9 Working atmosphere: (1) reacts in a negative manner to students' errors; (4) establishes a working atmosphere that encourages student participation
10 Student participation: (1) accepts non-contributing students; (4) ensures that all students participate
11 Group regulation: (1) does not help the group manage inappropriate student behaviour (dominant student, non-contributing student); (4) helps the group manage inappropriate student behaviour

1. Problem synthesis (rated 1–4)
1 Discussion of reference texts: (1) starts the report without discussing the self-learning phase; (4) discusses the self-learning phase, the references used and any problems encountered
2 Validating student comprehension: (1) does not confirm correct interpretations and comprehension; (4) confirms and compliments correct interpretations and comprehension
3 Links: (1) leaves the group to enumerate / make a list of acquired knowledge and concepts; (4) encourages students to regroup acquired knowledge and concepts and schematise them
4 Structuring / synthesizing: (1) allows detailed discussion of minor or irrelevant points; (4) helps the group to structure its reasoning and to summarise or synthesize when appropriate
5 Depth of knowledge: (1) allows the group to explain the problem without defining the depth of knowledge required; (4) ensures with appropriate questions that the group has attained its objectives and level of comprehension
6 Common thread and/or transfer of knowledge: (1) does not stimulate links with other problems / teaching units or application to other situations; (4) stimulates discussion of links with other problems / teaching units or application to similar cases
7 Return to case: (1) does not incite the group to reconsider the case; (4) stimulates the group to use newly acquired knowledge to explain the case
8 Time management: (1) poor time management; (4) ensures the report covers all objectives within the allotted timeframe

2. Discussing group process
9 Attaining objectives: (1) does not stimulate the group to analyse whether they have covered the objectives; (4) stimulates the group to analyse whether they have covered the objectives, gives feedback on and compliments the group
10 Group functioning: (1) does not discuss how the group functioned; (4) stimulates the group to analyse how they functioned (interactions, atmosphere, behaviour, etc.) and gives feedback

3. Group dynamics
11 Working atmosphere: (1) reacts in a negative manner to students' errors; (4) establishes a working atmosphere that encourages student participation
12 Student participation: (1) accepts non-contributing students; (4) ensures that all students participate
13 Group regulation: (1) does not help the group manage inappropriate student behaviour (dominant student, non-contributing student); (4) helps the group manage inappropriate student behaviour

*NN: intervention of tutor not required, or spontaneously done by the group. Comments (free text).
Table 2: Study design
(facilitation skills rated with the instrument; students' ratings of tutors)

BEFORE INTERVENTION (YEAR −1)
Students complete systematic program and tutor evaluation → pre-intervention student ratings

INTERVENTION (YEAR 0)
Orientation: aims and steps of the study; presentation of the instrument
Live PBL session (yr 0): tutor facilitates the students' group PBL process and rates how he thinks he used facilitation strategies → pre-intervention self-ratings
Tutor watches the video-recorded session and rates how he observes he uses facilitation strategies
3 peer observers watch the video-recorded session, rate how the tutor uses facilitation strategies, discuss their observations and prepare the feedback session
Peer observers give feedback to the tutor on his facilitation strategies ("guided reflection")
Semi-structured interview 1 (yr 0): explores the tutor's views about the feasibility of the approach (effect of videorecording on the tutorial process; instrument appropriateness and ease of use; procedure acceptability and utility)

ONE YEAR AFTER THE INTERVENTION (YEAR +1)
Live PBL session (yr +1): tutor facilitates the students' group PBL process and rates how he thinks he used facilitation strategies → post-intervention self-ratings
Semi-structured interview 2 (yr +1): explores the tutor's views about the effectiveness of the approach (changes in tutoring practice; improved teaching)
Students complete systematic program and tutor evaluation → post-intervention student ratings
Table 3: Summary of the content analysis of tutors' answers to 2 semi-structured interviews
Values are N tutors (%).

Semi-structured interview 1 (immediately after the intervention), n = 18 tutors

Process of video recording
Did videotaping impact you as tutor?
- no effect: 10 (56)
- it changed something but it is difficult to say what: 4 (22)
- spoke less: 2 (11)
- felt some emotion or tension: 2 (11)
Did videotaping impact the functioning of your student group?
- no effect: 11 (61)
- more participation: 5 (28)
- less participation: 2 (11)
Did videotaping impact the tutorial process compared to your previous experiences?
- representative of usual tutorials: 12 (67)
- modest changes: 6 (33)
- video room is not optimal: 12 (67)
Are the items appropriate and relevant? Are there missing or useless items?
- items are relevant: 18 (100)
- missing items: 2 (11)
- useless items: 1 (6)
Are the items feasible and easy to record?
- items are easy to record: 8 (44)
- some items need explanations: 4 (22)
- scale unfamiliar: 3 (17)
- easier when watching the video than immediately after the tutorial: 3 (17)
Is the use of an instrument relevant?
- useful and relevant: 6 (33)
- checklist, guide for the tutor role in PBL steps: 6 (33)
- initiates reflection: 4 (22)
Does the process of watching yourself in action make you aware of something in your functioning?
- no, it confirms my mental image: 8 (44)
- yes, identified problems: 7 (39)
- yes, positively impressed: 3 (17)
Does the intervention reinforce your tutoring practice?
- yes, feel positively reinforced: 10 (56)
Does the intervention motivate you to change? If yes, what?
- yes: 11 (61)
- adapt interventions to better guide student learning: 7 (39)
- help silent students to participate: 4 (22)
- discuss group process and give more feedback: 3 (17)
- not tolerate non-professional behaviors: 1 (6)

Faculty development approach
Do you consider the approach useful?
- self-observation is useful for self-evaluation: 7 (39)
- peer feedback is useful, formative and reassuring: 5 (28)
- useful to progress: 3 (17)
- a look from the outside is helpful: 3 (17)
- necessary to reflect on how you are facilitating: 1 (6)
- necessary for quality control of facilitation: 1 (6)
- could serve as pedagogical support for new tutors: 1 (6)
- I did not learn a lot about myself: 1 (6)
Should this approach be part of tutor training? If yes, how often?
- yes: 18 (100)
- as advanced training: 9 (50)
- as basic training: 4 (22)
- for heads of teaching units: 1 (6)
Would you like to exchange observations and feedback in pairs of tutors?
- yes, with a tutor of another teaching unit: 5 (28)
- yes, with a tutor of the same teaching unit: 5 (28)
- yes, both solutions would be useful: 2 (11)
- would prefer to be observed by education experts: 3 (17)
- videotaped tutorial only: 1 (6)

Semi-structured interview 2 (1 year after the intervention), n = 17 tutors

Do you remember the points discussed during the feedback session that you were motivated to change?
- yes, remember precisely: 16 (94)
- nothing to improve but felt positively reinforced: 2 (12)
- better guide the learning process: 5 (29)
- discuss group process and give more feedback: 6 (35)
- establish small-group functioning rules: 2 (12)
- ask students about their self-learning: 1 (6)
Did the procedure induce changes in your tutoring practice?
- yes: 10 (59)
Did the procedure improve your teaching effectiveness?
- yes: 9 (53)
What in the intervention has been most useful to you?
- self-observation: 11 (65)
- receiving peer feedback: 5 (29)
- using the instrument as a reminder of the tutor role: 1 (6)
Table 4: Self- and peer-rated facilitation skills of PBL tutors (mean scores ± SD) using the instrument

Column order follows the original layout: self-rating of the live PBL session (yr 0); self-rating of the live PBL session (yr +1), with MANOVA p and effect size (ES) for the self-observation pre- vs. post-intervention comparison; self-observation of the video-recorded session, with MANOVA p and ES; peer observation of the video-recorded session, with MANOVA p and ES for the self- vs. peer-observation comparison. N per column: 18, 18, 16, 16, 14, 12, 12, 18, 18, 18.

Problem analysis: 3.1 ± 0.4; 3.2 ± 0.6 (p = .824, ES = .003); 3.5 ± 0.4 (p = .000, ES = .692); 3.2 ± 0.8 (p = .806, ES = .004)
Self-directed learning: 2.8 ± 0.9; 3.1 ± 0.9 (p = .333, ES = .063); 3.0 ± 0.7 (p = .026, ES = .373); 2.3 ± 1.1 (p = .003, ES = .420)
Group dynamics (tutorial): 3.1 ± 0.6; 3.1 ± 0.6 (p = .794, ES = .005); 3.3 ± 0.5 (p = .231, ES = .127); 3.3 ± 0.6 (p = .320, ES = .058)
Problem synthesis: 3.1 ± 0.4; 3.1 ± 0.3 (p = .764, ES = .006); 3.4 ± 0.3 (p = .001, ES = .627); 3.0 ± 0.7 (p = .259, ES = .074)
Group process: 2.8 ± 0.7; 2.6 ± 0.8 (p = 1.000, ES = .000); 3.1 ± 0.6 (p = .121, ES = .205); 1.7 ± 0.9 (p = .000, ES = .610)
Group dynamics (report): 3.0 ± 0.6; 3.0 ± 0.6 (p = .531, ES = .027); 3.3 ± 0.5 (p = .006, ES = .513); 3.2 ± 0.7 (p = .099, ES = .152)
Overall: p = .434, ES = .394; p = .041, ES = .825; p = .001, ES = .820
Table 5: Students' evaluation ratings of PBL tutors (mean scores ± SD; 95% CI) before and after the intervention

Students' ratings of tutors:
                      Good (n = 12)              Needing improvement (n = 4)
Pre-intervention      4.46 ± 0.15 (4.34–4.58)    3.68 ± 0.31 (3.47–3.89)
Post-intervention     4.48 ± 0.18 (4.37–4.60)    4.21 ± 0.20 (4.01–4.41)

ANOVA                                               F        p      effect size   power
Good vs. needing improvement (inter-subject)        35.778   .000   .719          1.000
Pre- vs. post-intervention (intra-subject)          16.601   .001   .542          .966
Pre/post × group (interaction)                      13.707   .002   .495          .930
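As a quick arithmetic check on Table 5, the reported effect sizes are consistent with partial eta squared computed from the F values and their degrees of freedom. The mapping η²p = F·df1 / (F·df1 + df2), with df1 = 1 and df2 = 14 (16 tutors minus 2 groups), is our assumption: the table does not name its effect-size measure.

```python
# Hedged check: do Table 5's "effect size" values match partial eta squared
# derived from F and the error degrees of freedom? The formula and
# df2 = 14 (16 tutors, 2 groups) are assumptions, not stated in the paper.

def partial_eta_squared(f_value: float, df1: int, df2: int) -> float:
    """Partial eta squared from an F statistic and its degrees of freedom."""
    return f_value * df1 / (f_value * df1 + df2)

effects = [
    ("good vs. needing improvement (inter-subject)", 35.778, 0.719),
    ("pre- vs. post-intervention (intra-subject)",   16.601, 0.542),
    ("interaction",                                  13.707, 0.495),
]

for label, f_value, reported in effects:
    computed = partial_eta_squared(f_value, df1=1, df2=14)
    print(f"{label}: computed {computed:.3f}, reported {reported}")
```

All three computed values round to the reported effect sizes, which also supports the stated design of 12 + 4 = 16 tutors.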