
Teaching Portfolio

 

Craig William Clarkson, Ph.D.

 

SECTION 6: CURRICULUM ASSESSMENT

 

Curricular Analysis: Large Block Exams Result in Fatigue & Decreased Performance

Prior to 1996, our 2nd-year course in Medical Pharmacology was taught between February and May, with 2-3 hrs of lecture per day and laboratories typically occurring on Tuesday & Thursday afternoons. Our course did not overlap with any other core courses, and was taught as a traditional "silo" course. During the 1996-97 academic year, we began to "integrate" our 2nd-yr medical curriculum into thematic system-based blocks in which topics from several specialties (Med Pharm, Path, Pathophysiology, Microbiology, Immunology, Clinical Diagnosis) were coordinated. As a result, in 1996-97 we moved ~25% of our Medical Pharmacology curriculum to the Fall block (Inflammation, Cancer Chemotherapy & Antimicrobials). The remainder of the course took place in March through May, after students had completed the Pathology course. The same design was used during the 1997-98 academic year.

During the 1998-1999 academic year the extent of curricular coordination & integration was increased further, and during the Spring of 1999 we conducted our first experiment in which the content covered in all courses (e.g. Pathophysiology, Pathology & Medical Pharmacology) was assessed on a single "mega" exam given on a Friday morning. Because of the amount of material covered, this exam was very long (~160 questions), requiring ~4 hours for many students to complete. The questions from the different sub-disciplines were randomly arranged throughout the exam.


Noticing the visible signs of fatigue on our students' faces during the latter half of the exam, I became alarmed and conducted a statistical analysis of exam performance after the exam. The analysis revealed several significant trends (a sketch of how such an analysis can be run appears after the list below).

  • Student performance on the first 50 questions of the exam (Mean = 82.2% ± 8.8%) was significantly higher than on the last 50 questions (Mean = 73.3% ± 11.2%). The difference was highly significant (paired t-test, P < 0.00000001).
  • The number of students with a failing grade on the last 50 questions (n=35) was seven times higher than on the first 50 questions (n=5).
  • To determine whether the top-performing students also exhibited a similar significant "fatigue", I re-analyzed the data, looking only at those students with the top 20 scores on the exam (the brightest students). The results for the top-performing students were qualitatively similar to those for the entire class. The average for the top 20 students dropped from 93.9% ± 2.3% on the first 50 questions to 85.8% ± 8.8% on the last 50 questions, a significant difference (P < 0.00025). The number of top-performing students with an "Honors" grade also dropped from 13 on the first 50 questions to 4 on the last 50.
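
For illustration, a minimal sketch of how this paired comparison could be reproduced (in Python, using SciPy) is shown below. The variable names, example scores, and failing cutoff are hypothetical placeholders, not the actual 1999 exam data.

    # Paired comparison of per-student scores on the first vs. last 50
    # questions. Example data are hypothetical, not the 1999 exam results.
    from scipy import stats

    first_50 = [84.0, 90.0, 78.0, 82.0, 88.0, 71.0]  # % correct, first 50 questions
    last_50  = [72.0, 86.0, 64.0, 75.0, 80.0, 62.0]  # % correct, last 50 questions

    # Each student serves as their own control, so a paired t-test on the
    # within-student differences isolates the "fatigue" effect.
    t_stat, p_value = stats.ttest_rel(first_50, last_50)
    print(f"paired t = {t_stat:.2f}, P = {p_value:.6f}")

    # Number of students falling below a failing threshold on each half
    FAIL = 70.0  # hypothetical pass/fail cutoff
    print("failing on first 50:", sum(s < FAIL for s in first_50))
    print("failing on last 50:", sum(s < FAIL for s in last_50))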

Following this analysis, as a partial solution to reduce fatigue, we began separating our exams by course, with 15-30 minute breaks between exams. However, the Mechanisms of Disease (Pathology & Pathophysiology) and Pharmacology exams were still given on the final Friday morning of a given coordinated block.

Nevertheless, the use of large Friday block exams continued to result in decreased student performance (see attached figure), and a progressive increase in the number of students "failing" Medical Pharmacology with each block exam during the 1998-1999 academic year.

  • The cumulative class average declined significantly, exam by exam, across the three exams given after the block exam design was initiated during the 1998-99 academic year.
  • The number of students in the class with a failing grade increased progressively, from 1 to 4 over the course of the three block exams.

My conclusion from these results is that "fatigue" is a real (reproducible & statistically significant) phenomenon associated with the "block exam" model, even when mini-breaks are given between examinations of different subjects.

 

Is it Valid to Use the NBME Exam as a Grade for the Final Exam?

One question that must be addressed before we can legitimately use scores from the National Board of Medical Examiners (NBME) Shelf Exam as a final exam grade is whether there is a good correlation between a student's grade at the end of the course (before the final exam) and their performance on the NBME exam. If, for example, we were teaching a course on plant biology, I would not expect a good correlation between the course grade and performance on a standardized pharmacology exam, since the two topics are "apples vs. oranges". To assess this, I periodically compare students' average course scores with their scores on the NBME shelf exam. To date, there has always been a good correlation (R ≥ 0.70, P < 0.0001), as illustrated in the analyses conducted at the end of the 2004, 2005, 2006, 2008 and 2009 academic years (a sketch of this correlation check appears below). I therefore conclude that the NBME score can be used as a valid assessment of a student's knowledge of pharmacology, and that a student's knowledge of pharmacology at the end of our course correlates well (on average) with their performance on the NBME exam.
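
As an illustration, the sketch below shows how this kind of correlation check could be run in Python using SciPy; the variable names and paired values are hypothetical, not actual student records.

    # Correlation between pre-final course average and NBME shelf score.
    # Example data are hypothetical, not actual student records.
    from scipy import stats

    course_avg = [88.1, 79.4, 92.3, 74.6, 85.0, 81.2]  # course average before the final (%)
    nbme_score = [82.0, 70.0, 90.0, 65.0, 80.0, 77.0]  # NBME shelf exam score

    r, p_value = stats.pearsonr(course_avg, nbme_score)
    print(f"R = {r:.2f}, P = {p_value:.4f}")
    # A result with R >= 0.70 and P < 0.0001 (as observed in each year
    # analyzed) would support using the NBME score as the final exam grade.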

 

Spreading Out the Medical Pharmacology Course Results in Decreased NBME Performance:

At the end of the first two years of curriculum integration (1998-99 & 1999-2000) a trend of decreased student performance in Medical Pharmacology was also observed when analyzing student scores obtained on the NBME shelf exam:

  • The class average on the end-of-course NBME shelf exam decreased significantly (P<0.001) after the 1998-1999 and 1999-2000 academic years when compared to the three previous years, when there was minimal curricular integration and block exams were not used (a sketch of this comparison follows the list).
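
For illustration, a year-over-year comparison like this one could be run as an unpaired (two-sample) t-test; the sketch below is a minimal Python/SciPy version with hypothetical pooled scores, not the actual NBME data.

    # Unpaired comparison of NBME shelf scores between the pre-integration
    # years and the first two integrated years. Example data are hypothetical.
    from scipy import stats

    pre_integration  = [82.0, 86.0, 84.5, 88.0, 81.0, 83.5]  # pooled scores, earlier years
    post_integration = [75.0, 78.0, 73.5, 80.0, 71.5, 76.0]  # pooled scores, 1998-2000

    # Different students sit the exam each year, so an unpaired t-test
    # (rather than the paired test used above) is the appropriate choice.
    t_stat, p_value = stats.ttest_ind(pre_integration, post_integration)
    print(f"t = {t_stat:.2f}, P = {p_value:.4f}")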

One might ask why there was a decrease in NBME test scores. It seems unlikely that longer "fatiguing" exams (a series of acute events) could decrease student performance on a standardized exam given at the end of the year. While there is more than one possible explanation, my best guess is that the decrease in NBME scores reflects the fact that the pharmacology course became significantly more "spread out" over the entire academic year (from August to May), resulting in lower student retention of the subjects covered in the Fall semester. Prior to 1996-97, the pharmacology course began in February and ended in May (~3 hours per day), with no other significant competition in the curriculum. My hypothesis is that giving the entire course just prior to a standardized NBME exam most likely increases student performance, due to a more significant contribution of short-term retention. The fact that the end-of-course class average did not show a similar dip at the end of the 1999 & 2000 academic years suggests that integration of the curriculum per se did not decrease exam performance (as one separate indicator of learning).

 

                                  “Insanity: doing the same thing over and over again and expecting different results.”
                                    Albert Einstein

 

Active Steps Taken To Enhance Student Review for the NBME & USMLE Exams:

After recognizing the "potential" negative impact of curricular change on exam performance in our Medical Pharmacology course, I began an ongoing dialog (over several years) with our Owl Club representatives to try to determine what resources we could provide that might increase student learning and help students review and master our material. As a result of these discussions, I made a series of changes to the Medical Pharmacology curriculum that included:

  • Developing & implementing learning objectives for each lecture (actually initiated during 1998-1999, prior to noticing the dip).
  • Developing on-line interactive self-assessment exams, first piloted during the 2000-2001 academic year (described in Section 5). Owl Club feedback was highly favorable.
  • Developing "drug profiles" for each drug covered during the course (begun in 2003, completed in Jan 2005; they took over a year to develop). These profiles are small tables containing a flash-card-like synopsis of each drug's mechanism of action, indications, contraindications, drug interactions & side effects. The references used to create these Drug Profiles were the course text (Katzung) & an on-line version of the Physician's Desk Reference (www.rxlist.com). Once created, the drug profiles were distributed prior to each lecture block.

After two years, the downward trend in NBME scores reversed. As shown in the attached figure, our students' performance on the shelf exam rebounded during the 2000-2001 academic year & remained between the 70th and 80th national percentile during the next 5 years, until Katrina. While I would like to believe that the changes we made are directly responsible for the rebound, I know that education is an inexact science. Part of the explanation could very well be that students (in response to feedback from classmates ahead of them) "adapted" their study habits, resulting in enhanced performance on the NBME exam. Since we did not have a separate randomly selected "control group" that was unexposed to our newly provided resources, we will probably never know the real answer. (I considered such a design impractical.)

 

Assessing Active Learning & Learner-Centered Teaching Strategies:

During the 2008-09 academic year I initiated the development & implementation of active learning strategies in classroom instruction for both our graduate & medical curricula. For the graduate curriculum I developed a one-credit pilot course on "Concepts in Pharmacology". In the medical curriculum I collaborated with two department colleagues to convert 7 traditional Medical Pharmacology lectures to "Just-in-Time-Teaching" sessions with "Peer Instruction". (These active learning strategies are described at the end of Section 5.) The use of Peer Instruction significantly increased class performance on questions from 63.3% (1st vote) to 89.4% (2nd vote) (n=20, P<0.0001); a sketch of this comparison appears below. Further expansion of these techniques is planned for the 2009-10 academic year.
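
A minimal sketch of how such a first-vote vs. second-vote comparison could be analyzed is shown below, assuming the percent of the class answering correctly was recorded for each question on both votes; the example per-question data are hypothetical, not the actual classroom results.

    # Paired comparison of class performance before and after peer
    # discussion. Example per-question percentages are hypothetical.
    from scipy import stats

    first_vote  = [55.0, 62.0, 70.0, 48.0, 66.0, 59.0]  # % correct on the 1st vote
    second_vote = [85.0, 92.0, 95.0, 78.0, 90.0, 88.0]  # % correct on the 2nd vote

    gains = [b - a for a, b in zip(first_vote, second_vote)]
    print(f"mean gain = {sum(gains) / len(gains):.1f} percentage points")

    # The two votes on each question are paired, so a paired t-test applies.
    t_stat, p_value = stats.ttest_rel(second_vote, first_vote)
    print(f"paired t = {t_stat:.2f}, P = {p_value:.6f}")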

