
Study which said 34% of female university students were raped critically flawed

A national survey of third-level students, which found that 34.2% of female third-level students had been raped, contains a fundamental methodological flaw which renders the data collected non-representative, meaning it does not accurately reflect the student population, and effectively meaningless. The survey was carried out by the Higher Education Authority (HEA).

Whilst the sample size of the online survey looks, at first glance, to be rather respectable, that sample size is undermined by the fact that the survey was emailed to roughly 245,000 students and that the response rate was only 3.2%.
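For a sense of scale, the arithmetic is straightforward; this minimal sketch uses only the figures cited in this piece (the 7,901 student-respondent count is the HEA’s own, quoted in its response further below):

```python
# Back-of-the-envelope check on the HEA survey's coverage, using figures cited in this piece.
invited = 245_000     # approximate number of students the survey was emailed to
respondents = 7_901   # student respondents, per the HEA's own statement

print(f"Response rate:   {respondents / invited:.1%}")   # roughly 3.2%
print(f"Non-respondents: {invited - respondents:,}")      # roughly 237,000 students we know nothing about
```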

That low level of response means the entire work is undermined by a non-response bias of an unknown, but certainly critical, severity. The fact that the non-response bias cannot be accounted for means the HEA cannot demonstrate that the results are representative of the wider third-level student population.

Non-response bias refers to a situation in which the people who responded to the survey differ in some material fashion from those who did not. These differences can be obvious (for instance, the HEA survey was disproportionately completed by women) or they can be difficult, if not impossible, to perceive.

In a basic sense, non-response bias means that if you try to survey the general population about a subject, but some of those people are disproportionately interested in, or connected to, that subject and so more likely to respond to the survey, you can end up with a result which reflects those disproportionately interested people instead of the overall population. This means your survey cannot be used to draw conclusions about the subject in general, but can only be taken to reflect the views of those who responded, which is to say the survey results become, at best, basically useless and, at worst, actively misleading. The larger the number of people you survey, and the fewer who respond, the easier it is for non-response bias to undermine the results. It’s a constant concern when surveying large groups of people, but certain types of surveys, and certain topics, are particularly vulnerable.
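To make the mechanism concrete, here is a toy simulation – a minimal sketch using entirely invented numbers, not estimates of any real figure – which simply assumes that students affected by an issue are ten times more likely to answer a mass-email survey about it than students who are not:

```python
import random

# Deliberately simplified simulation of non-response bias.
# All numbers are invented for illustration only; they are NOT estimates of real prevalence.
random.seed(1)

population = 245_000            # roughly the number of students emailed
true_prevalence = 0.05          # assumed "true" rate of the experience being surveyed
base_response_prob = 0.02       # unaffected students: 2% answer the mass email
affected_response_prob = 0.20   # affected students: assumed ten times more likely to answer

respondents = 0
affected_respondents = 0
for _ in range(population):
    is_affected = random.random() < true_prevalence
    p_respond = affected_response_prob if is_affected else base_response_prob
    if random.random() < p_respond:
        respondents += 1
        affected_respondents += is_affected

print(f"Overall response rate:         {respondents / population:.1%}")
print(f"True prevalence in population: {true_prevalence:.0%}")
print(f"Prevalence among respondents:  {affected_respondents / respondents:.0%}")
```

With those made-up inputs, the prevalence measured among respondents comes out several times higher than the prevalence in the population, even though every individual answer is honest; the distortion comes entirely from who chose to respond.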

In relation to a topic as emotionally charged as sexual violence it is not difficult to see that there are particular groups who are likely to become much more engaged with surveys on the topic than the general student population. That goes some way to explaining why 77% of the respondents to the survey were women and only 20% were men, figures which do not represent the sex breakdown of the third-level student population.

Whilst it’s not possible to exactly determine the impact of the non-response bias seen in the HEA’s report, a 2021 paper, Differences in Nonresponse Bias and Victimization Reports Across Self-Administered Web-Based and Paper-and-Pencil Versions of a Campus Climate Survey, does give us some idea of how significant it can be.

The team behind that paper administered the same survey on violence against women in multiple different ways: an in-person anonymised survey given to all students in particular classes; an online survey emailed to students in particular classes; and an online survey accessed via a mass email sent to all students – the method used for the HEA study. The mass email had a response rate of roughly 8%, more than double that of the HEA study.

8.63% of students who completed the in-person survey said they had been raped, compared to 15.08% of those who completed the class-based online survey, and 27.3% of those who completed the mass-email online survey. That trend held true for every form of victimisation measured – those who responded to the mass-email survey were substantially more likely to report having been bullied, emotionally abused, raped, and physically assaulted than those who completed the survey in person. A move from in-person to online surveying also caused women to become a disproportionate percentage of the respondents, which is exactly in line with the results of the HEA survey.

[Table 1, Differences in Nonresponse Bias and Victimization Reports Across Self-Administered Web-Based and Paper-and-Pencil Versions of a Campus Climate Survey]

You can attempt to weaken the impact of non-response bias by ensuring your surveys are easy to access and quick to complete, on the basis that high barriers to completion will lead to an even greater over-representation of the disproportionately interested group in your results, as they’re willing to deal with those barriers whilst the average respondent is not.

In the case of the HEA report that doesn’t appear to have been attempted. The survey itself was 70 questions long, and merely accessing the survey required reading through four pages of explanatory text. Put that all together and you have a survey which had an absolutely massive pool of potential respondents, high barriers to entry, and an incredibly low response rate, and which produced results which would, if true, mean Irish third-level students are being raped at levels above those seen in the world’s most violent and lawless countries.

Gript asked the HEA a) what steps they had taken to minimise the risk of non-response bias in their results, and b) how confident they were that the results were representative given that risk and the exceptionally low response rate. The HEA told us that “These reports give a snapshot of the lived experiences of the 7,901 students and 3,516 staff members who responded to the survey. The HEA takes the issue of sexual violence and harassment on HEI campuses extremely seriously.”

It is worth noting that producing a snapshot of the “lived experiences” of those who completed your survey, and producing a piece of work which is representative and can be used to inform ourselves of the actual extent of sexual violence across the student population, are two very, very different things. That distinction is of particular importance as the HEA has previously said that the survey was designed to “create a robust evidence base for further policy on these issues.” It is somewhat concerning that the HEA did not even attempt to say the survey was representative.

It is also worth noting at this point that the issue of non-response bias is not the only complaint one could make against this survey; it is simply the focus of this piece as it is a failure at a fundamental level which renders the other potential issues moot.

Dr Pádraig MacNeela of NUIG’s Active* Consent programme, who led the writing of the reports on the HEA surveys, told Gript, in response to a query asking about his confidence that the results were representative, that “the findings matched patterns to be expected on the basis of past research – e.g., in terms of differences by gender, by sexual orientation, year in college, and so on. I take from those observations that the findings presented a coherent picture which is grounded in what we know from past research.”

Whilst Dr MacNeela did not specify the past research he was referring to, there was a 2020 study, carried out by the Union of Students in Ireland (USI), which found similar, although not identical, results to the HEA survey. However, that work also relied on a mass-email online survey, and participation in the survey was open to the public, which is to say it has almost exactly the same methodological issues as the HEA study. That’s not terribly surprising given that it was carried out in conjunction with NUIG’s Active* Consent programme, which worked with the HEA on this new release.

Interestingly, a USI online survey of third-level students conducted in 2013, which did not involve NUIG’s Active* Consent programme, found that 5.25% of women reported being raped and that 16% of students had experienced some form of unwanted sexual experience whilst in their current educational institution. Those figures, whilst high, are dramatically different from what we see in the later research conducted in collaboration with the Active* Consent team.

A cursory examination of the methodology of the HEA research should demonstrate clearly to anyone that it cannot be taken as representative, but that hasn’t stopped people from doing so. Recent headlines have declared that a third of Irish female students have been sexually assaulted based on the research, newspapers have written about the “widespread evidence” of sexual violence on college campuses based on the report, and national politicians have called for policies like mandatory consent classes to be implemented on the back of this research. None of this is a serious response to a piece of research like this.

If the results of this research were accurate, and 34% of female students were being raped during their time in universities, the appropriate response would not be to start a committee or implement consent classes for students – it would be to shut the universities in the name of student safety. It seems clear, looking at the solutions that are being proposed based on this research, that those proposing them either don’t actually believe these figures are accurate, or they have a spectacularly lackadaisical approach to protecting young women from a rather imminent threat of sexual violence.

Minister Harris said that the survey was conducted because “we needed a robust evidence base” and that the survey “gives us vital information to inform further actions.” The Minister is absolutely correct that sexual violence is an important enough issue to demand that we conduct serious research into the area, but research like this is, at best, meaningless and, at worst, actively lessens the amount of knowledge we have about sexual violence. Basing further actions on a work characterised by such clear deficits will not assist in combatting sexual violence at the level of public policy, because any effective actions which may emerge from such policy must be based on accurate knowledge of the area.
