A Seton Hall student's research project has uncovered potential evidence of gender bias against female professors on Rate My Professor.
Elizaveta Sidorova, a senior psychology major with a certificate in data analysis and visualization, gave a presentation on the sexism of the website at a session on women in STEM during the Conference on Women and Gender on March 29.
Rate My Professor is a website where students and peers can post ratings and comments on professors. According to the website, it is a site “built for college students, by college students.”
The website also says, “the site does what students have been doing forever – checking in with each other – their friends, their brothers, their sisters, their classmates – to figure out who’s a great professor and who’s one you might want to avoid.”
Sidorova described the aims of the research. “The research that I am assisting on aims to look at how student evaluations of women in several academic disciplines compare to men in those disciplines,” Sidorova said. She added that the research looked at two time periods, one ending in 2008 and the other ending in 2018.
“The hypothesis is that there will be lower mean ratings of women compared to men in various aspects, and that this will be particularly true for women and men in STEM [Science, Technology, Engineering and Mathematics] fields,” she said. “We use Rate My Professor as the student evaluations.”
Sidorova provided background information on the research and said, “Student evaluations have been used quite often in very important ways, including deciding if professors should be eligible for tenure or not.
“With this in mind, it is important to see if they are accurate and unbiased measures of teaching effectiveness. When looking into it, we see that it is not the case because there are serious biases – and the one that this research focuses on is in gender, although race bias exists as well.”
Sidorova added that there is no doubt everyone has bias, but the important thing is to be able to acknowledge it and to grow from it.
Dr. Susan Nolan, a professor in the Department of Psychology, also commented on the research. “Previous research has suggested that there are gender differences in student evaluations,” Nolan said.
She explained an example of one study where researchers asked students to rate the professors of their online course. She said that the researchers then, “manipulated the name and photo of the professor so that some students thought they were taking a class with a female professor and some thought they were taking a class with a male professor.”
Nolan continued, “Students who thought their professor was a woman gave lower ratings than those who thought their professor was a man, even though it was the same professor – that is, those rating the ‘male’ and ‘female’ online professors were actually rating the same experience.”
Sidorova said this type of research is important because of the need for women in STEM. “This kind of research can help to show people the bias that exists and can help to put in place practices that encourage women to continue in their STEM path and be successful,” she said.
Nolan added, “If our research supports previous findings, we hope to shed light on the unreliability of online student evaluations which can impact professors’ careers.”
Dr. Andrew Simon, an associate professor in the Department of Psychology, commented on the research and its impact.
“Gender bias in the evaluation of professors is something that gets discussed on a personal, anecdotal level,” Simon said. “We are taking a systematic approach, studying evaluations across a range of professional specialties and a range of universities.”
Sidorova began working on the project in the spring semester of her sophomore year.
“I had absolutely no experience with research and wanted to explore as much about psychology as I could,” Sidorova said. “Fast-forward two years and here we are.”
When asked about the process of obtaining results, Nolan said that they are still gathering data.
“Lisa has been leading a team of research assistants as they gather comments from Rate My Professor and then code them into categories that include: comments about teaching well, teaching poorly, attractiveness, helpfulness, being funny, being boring,” Nolan said. “It’s a time-consuming process that requires careful work. Lisa has done an excellent job overseeing the process.”
“Rate My Professor is biased because only students who have had an incredible experience or a terrible experience go out of their way to post on this website,” Sidorova said.
She continued, “It just isn’t a representative enough sample. I discourage others around me from using it, and instead encourage them to come into a class with an open mind and be able to adapt to every professor’s instruction.”
Rhania Kamel can be reached at email@example.com.