Bok Assessment: Using Feedback to Improve Learning

June 30, 2016

In fall 2015, Harvard T.H. Chan School of Public Health began its first "blended" master's degree in public health, combining online course work with three-week on-campus residencies at the start, middle, and end of the two-year program, the first degree of its type at Harvard. The program was committed to ensuring excellence consistent with its residential MPH program in terms of academic rigor, students' exposure to teaching staff, and the camaraderie students develop within their cohort. The 50 students, each with a master's or doctoral degree, and 60 percent of them physicians, were strong academically, but most had demanding full-time jobs across the country and even abroad. How would Harvard keep them engaged with coursework and facilitate collaboration across borders and time zones?

For the first six weeks of 2016, two versions of the same course, taught by the same faculty, were administered to two groups: one for blended-degree students and one for residential students. The courses had similar syllabi and assignments, and it was assumed that students had similar exposure to programming. But evaluation of the course homework, combined with feedback from the online students, soon showed that the blended students were struggling more than expected with the programming assignments.

At the same time, the course was being evaluated by the Bok Center's director of educational research and assessment, Jenny Bergeron. Given the Center's mission of improving teaching for Harvard undergraduates, this was an unusual assignment for the Bok Center and for Bergeron. But Bergeron, hired from Stanford in 2012, was among Harvard's first assessment specialists, and HSPH had already called on her to assess two of its courses, including one of the first blended courses to integrate content from a HarvardX online course. Based on that work, Bergeron was on call at the beginning of the new MPH program to monitor seven courses: four residential and three online.

With the programming assignments, Bergeron identified two possible reasons for the discrepancy between the students in the blended and residential programs. An analysis of admissions data revealed that students in the two courses had similar quantitative aptitudes, as measured by their GRE and MCAT scores. However, student comments revealed that the blended students had not been exposed to the same level of programming before starting the course; they struggled to keep up with course content while learning new programming skills. The blended students also struggled to collaborate with their peers to troubleshoot and solve problems. This was easier for the residential students, who could meet in groups and work out solutions together. "The differences in time zones and the busy work schedules of the students, coupled with the asynchronous nature of online communication, made group work more challenging," recalled Bergeron.

The data also suggested a means for change. "Sometimes faculty think what they're doing is effective—until the evaluation. Then comes reflection," Bergeron said. As her colleague Courtney Hall put it, "Data can sometimes contradict intuition." The "intuition" had been that the same approaches and teaching methods would work for both sets of students. The evidence pointed instead to a difference in prior coursework. The immediate solution, midway through the spring course, was a quickly prepared online coding tutorial for the blended-program students. And for the "hole in the curriculum" that Bergeron diagnosed, a second fix, for the benefit of the second cohort beginning in June 2016, would be a revision of the online preparatory course that came first.

In spring 2016, Courtney Hall joined Bergeron on the Bok assessment team as an Educational Research Analyst. Assessment reports for each course, derived from evaluation data and covering many courses beyond HSPH, are long, often more than 40 pages, and immensely detailed; Bergeron and Hall now split the work. The Bok Center's assessment portfolio extends beyond course evaluations to much larger projects: evaluating the entire program in General Education (comprising more than 100 courses), assessing the Advising Program, and evaluating Harvard College's Administrative or "Ad" Board, which among other things adjudicates cases of student misconduct.

Right now one of the biggest projects in Harvard College is the decades-long renovation or "renewal" of the residential Houses, where 98 percent of undergraduates live. In 2015, Bergeron evaluated that, too. The study mattered greatly to the College: House Renewal is the $1-billion-plus "first-phase" program to modernize the eight Neo-Georgian undergraduate residences along the Charles River. By conducting focus groups and soliciting concrete feedback from student residents and deans, Bergeron was able to offer suggestions that will be implemented in all new construction going forward.

A principal finding was that students wanted more private social spaces for their gatherings and were disappointed with the disappearance of the en-suite common rooms typical of the old Houses.  So-called “cluster” common rooms were often under-used because of their physical structure: they just didn’t seem like the right kind of space for socializing, especially when students saw others studying in the small rooms. 

In some ways, the findings resembled the assessment of the two MPH courses.  Environment – whether physical or virtual – shapes student engagement.  The on-campus MPH students had significant opportunities to meet and talk, effectively sharing a space.  The online students would need help figuring out a virtual equivalent.  As Bergeron said, “What works in one teaching context, say the physical, doesn’t necessarily translate into another.” 

Early online courses at Harvard and elsewhere had lots of bells and whistles: assignments integrating technology that students didn't understand or felt was unnecessary busywork. For the early online course designers, Bergeron noted, it was easy to assume that "because these approaches were interesting and novel, the students were actually learning." But there is always a learning curve when implementing new approaches in a class, and "technology may not always be the answer." "This is why building in assessments should be a part of iterative course design," she said. As a result of this work, a series of recommendations was made for using new approaches in blended courses, which has led to the revision of numerous courses across campus.

For Hall, conducting a focus group with the online MPH students, a nice surprise was "how bonded they were, such a close-knit group." The evaluation data showed that this strong cohesion could be traced directly to starting together on campus in June 2015 for an intensive three-week session, during which many of the students even lived in the same dorm. Bergeron noted that there is sometimes a significant need for physical environments where learners can meet and establish supportive relationships and lasting connections before they enroll in an online course. "We found this to be a critical component of the program," says HSPH associate dean for professional education Nancy Turnbull, who has worked closely with Bergeron and Hall on a number of projects in the School. "It's not like joining an online class where you've never seen the other people. They really like being together! It's forced us to be much more interactive in the classroom for their second time on campus this June. They didn't want to come all this way and hear lectures. They want discussion and group activities."

Student orientation is a major part of starting a new cohort off right. Last year HSPH began making changes to its orientation for all of its major degree programs. Evaluation data helped with that, Turnbull says, and that data, too, came from the Bok team.
