Faculty, students discuss FCEs
As the fall 2011 semester at Carnegie Mellon draws to a close, Faculty Course Evaluations (FCEs) have been made available, giving students the opportunity to evaluate their courses and professors. Students’ responses will be reviewed by the university, which uses them to inform decisions regarding course curricula and faculty promotion.
In an email, University Registrar John R. Papinchak explained the significance of FCEs and the importance of students’ feedback.
“Faculty Course Evaluations are a major source of teaching feedback to instructors,” he said. “An overwhelming majority of instructors read and seriously take to heart what students report on the evaluations. Your feedback can lead faculty to revise teaching methods, change textbooks, revise assignments and make other changes to help you learn.”
Papinchak also spoke about FCEs’ repercussions for Carnegie Mellon faculty. “Department heads and deans also review the course evaluations and use this information as one measure of teaching that contributes to decisions concerning faculty promotions.”
FCEs allow students to rate courses and professors in nine core areas on a scale of one to five, with one being the poorest rating and five the best.
The Carnegie Mellon website for Faculty Course Evaluations provides specific information about the FCE process, including a “Frequently Asked Questions” (FAQ) section. Among other topics, the site addresses student privacy, assuring students that FCE results are published anonymously once they complete the survey.
Faculty members, such as professors and advisers, also have a vested interest in the FCE process: tenure and promotion decisions are, in part, dependent upon students’ course ratings. Jared Day, adjunct professor and research associate in the department of history, voiced support for the FCE surveys, but also expressed concern about their implementation. Day said in an email, “I think they are essential benchmarking tools in modern academics and at Carnegie Mellon, especially.”
He added, however, that “students are not required to fill them out. Thus, they are often magnets for the most disgruntled students in any given class. If I were going to reform one thing about the system, I would say students should be required to fill them out in order to get their grades at the end of the semester. At least then you have a full picture.”
According to past FCE results, the college with the highest rate of FCE return in spring 2011 was CIT, where 63 percent of possible FCEs were filled out. The college with the lowest rate of return was CFA, with 41 percent.
Joe Selinger, a junior chemical engineering major, agreed that FCEs can be misleading. “I don’t really think it’s a good measure, because professors’ quality can vary a lot. It would be good if [the evaluations] were unanimously awful, but that’s it. Professors are just hard, so they get negative reviews because of that.” Selinger said that he rarely takes FCEs seriously unless he especially likes or dislikes a professor.
He also said that he is not convinced of the usefulness of FCEs. “I think there’s a fair amount of belief that they don’t really have much value, especially with older professors who, if you give them negative feedback, still don’t really change their habits,” Selinger said.
Others cited different problems with FCEs. Danny Davis, a sophomore double majoring in linguistics and chemistry, believes that FCE surveys for new courses must be created sooner, and the results disseminated more quickly.
He said via an online questionnaire, “I find them very useful, but I find it kind of inconvenient that the results aren’t posted for such a long time. Half my classes are either new enough or have a new enough professor so that they don’t have any evaluations at all.”