The past, present and future of Penn Course Review


In 1959, The Daily Pennsylvanian published its first undergraduate course review - a comprehensive guide that aggregated data and comments from student course evaluations into a single book covering all courses offered at Penn in a given year.

After many iterations and decades of evolution, Penn Course Review now stands as an independent website that allows students to compare courses and professors based on data from years of evaluations. Over the years, Penn Course Review has taken many forms, but its fundamental purpose has remained the same - to act as a guide and a tool for students in making their academic decisions.

The Undergraduate Course Guide is born

The idea of a course guide started as a student-run initiative. In 1941, students began compiling data and comments from course evaluations into a book to sell to undergraduates.

In 1959, The Daily Pennsylvanian took over the publication of course reviews. At the time, the review was a 33-page pamphlet and cost students 25 cents.

The next year, after students pushed for more comprehensive critiques of courses, the official Undergraduate Course Guide was born. By 1966, the Course Guide had reached 96 pages and included data from more than 2,000 student surveys.

Over time, the popularity of the guide grew, and by 1971, more than 10,000 student surveys were filled out annually. That year, the Student Committee on Undergraduate Education took over publication from the DP, doubled the length of the book and added numerical ratings to the courses based on the reviews.

SCUE had been formed just six years earlier, in 1965, and the Undergraduate Course Guide was one of its major projects.

"The Course Review pre-dates SCUE, which, there aren't many things that are student-driven that pre-date SCUE, but the Review is one of them," Executive Director for Education and Academic Planning Rob Nelson said.

SCUE controlled publication of the guide until 1982, when the autonomous group Penn Course Review took over. The book remained for sale to students, and the Undergraduate Assembly funded its publication from 1986 to 1989.

Production of the Course Guide continued for more than a decade, with features like notable student quotes added as the survey drew more and more responses. In 1989, the Course Guide became available for purchase in the Penn Bookstore for $5.50. In its final year of print production, 2002, the Course Guide sold for $12.95.

Around the same time that the Course Guide moved to the Penn Bookstore, the survey itself underwent a facelift. A panel of students and faculty representatives from each school convened to redesign the course evaluation forms.

"They went through and decided that the blue form, which is everything other than a language course, had more questions to it. They wanted to sort of slim it down," Director of Institutional Research for the College of Arts and Sciences Bill McManus said. From the beginning, language courses were evaluated using a separate form that more specifically addressed the nuances of a foreign language course.

The new form emerged with fewer questions overall, while others were reworked or added to better assess student perspectives. "These are the questions students were interested in," McManus said of the new questions on the form.

The move online

In the fall of 2009, the course evaluations moved online, and student ratings took an unexpected nosedive.

Since 2009, evaluations have been distributed to students via their Penn email addresses, and the data is compiled and made accessible, free of charge, through the Penn Course Review website.

Along with that shift came a significant drop in ratings. Across all departments in all schools, courses and professors were consistently rated lower in quality.

"We could see [the drop]. We wanted to know why it happened," McManus said.

The University then began an intensive process of analyzing the data, trying to attribute the drop to a specific factor.

McManus, who was part of the team investigating the drop, said the team was able to determine that the drop was not due to an increased response rate or to the extended time students had to complete evaluations.

Ultimately, according to Nelson and McManus, the University concluded that the drop was likely due to two factors. The first: Students were now completing evaluations alone, on their personal devices, and so were more willing to be critical than when they were filling out evaluations in class. The second: Students were completing all of their evaluations at once, and were critically comparing all of their courses, instead of evaluating each course individually over the last several weeks of classes.

But McManus and Nelson said they have no way to prove this hypothesis.

"I don't know how we would test that. There's limited ability to - I mean we could do a focus group or something like that, but even then, you're not really, you know, unless you're in their head in the moment, how would you know?" Nelson said.

What they do know is that the skewed results were not a result of a different form - the questions students were asked remained exactly the same in the transition to online evaluations.

Nelson also said the data was skewed before the shift online, because some professors either didn't hand out the course evaluations, didn't make them mandatory or didn't leave the room while students completed them. The move online rectified some of those problems as well, creating data that Nelson characterized as likely more representative of all undergraduate students and classes.



Student use of survey data causes concerns

From the beginning, the survey data was shared with students via the Penn Course Review website. Despite debates at other colleges about whether course evaluation data should be available to students, Nelson said the question never became an issue at Penn.

"The reason is the historical framework through which this came into existence," Nelson said. "It was a collaboration."

Still, some faculty members have been frustrated by inaccuracies and errors in the survey data, and think the data can mislead students.

"The controversies I've been involved in were controversies where faculty members said, "This data is wrong, and we're not happy about it. Fix it,'" Nelson said.

Most recent among the errors that caused faculty backlash, Nelson said, was an issue that arose with special topics courses - where several small seminars within a single department are listed under the same course code but focus on different topics.

"Because the numbers are the same, even though the courses are very different, [they] somehow got in the system as if they were the same course and that was yielding really bad data," Nelson said.

Another problem in recent years concerns the written course summaries once synthesized by students: just two years after the evaluations moved online, in 2011, the summaries stopped being produced.

"Gradually, over the years, [students'] involvement physically in the process lessened and lessened to the point where we ended up hiring temps to do the sorting and the packaging," McManus said.

Nelson attributed the lack of editorial content on the Penn Course Review website to the decline of student participation in the project in recent years.

"We had this great system, I thought," Nelson said. "Then I noticed that the numbers were up there but the qualitative stuff wasn't, and I asked, and the editor kept getting back to me but not really being responsive, and then he graduated and suddenly there was nobody home," Nelson said. "As a tool for students to help make good academic decisions about which courses to take, that's essential."

Looking ahead

Engineering junior David Lakata has been co-director of PennLabs, the organization in charge of Penn Course Review, since the fall.

Lakata said that the administrators PennLabs works with, and the ones who sit on their advisory board, have been instrumental in getting course evaluation data.

For now, Lakata said the Penn Course Review group is still in existence, and there are no immediate plans for the system to change.

"In the short term, nothing is really going to change in the next few years," Lakata said.

But, Lakata did mention that a University-wide project, Next Generation Student Systems, would potentially replace the service in coming years.

On its website, NGSS is described as "a project [that] will replace three core systems built on inflexible Reagan-era technology: Student Accounts and Billing, Academic Records and Registration, and Financial Aid."

PenninTouch, course registration tools, and Penn Course Review data will all potentially be included in the new system, Lakata said.

"It's unclear how exhaustive the new product will be, but there will be some room for Penn Course Review to exist," Lakata said.

Intertwined with this question of a new system is the question of the future of separate, student-run projects that work in tandem with Penn Course Review - things like Penn Course Plus, a Chrome extension that embeds Penn Course Review data for course quality and difficulty into PenninTouch course listings, and Penn Course Notify, an app that alerts students when a course they are trying to get into opens up.

Despite his concerns, Nelson said he is supportive of these student-run initiatives and projects.

"There's a concern about taxing the system," Nelson said. "When you've got lots of students who are querying the system, it's a threat to system integrity."

For Nelson, the biggest question mark is whether the editorial, qualitative aspect of the course guide will ever return.

"That's really the piece that's sort of disappeared. Hopefully we'll find a way to revive that idea of student involvement in some kind of qualitative evaluation of the courses," Nelson said.

Nelson dreams of building a tool that allows students to share their own qualitative judgments about their courses, and he thinks the days of synthesized blurbs are over.