Taxpayer-funded NGO The Hope and Courage Collective has told an Oireachtas Committee that social media recommender systems should be turned off so that certain posts on a range of issues are not “amplified.”
Speaking before the Justice Committee on Tuesday, Director Niamh McDonald, who was joined by Research and Communications lead Mark Malone, said she wanted to focus on “the dynamics of the far right,” along with the role of social media companies in spreading what she called “lies, hate and disinformation.”
Ms McDonald, a member of the Independent Left and former Chair of the Dublin Bay North Repeal the 8th Group, previously gave a presentation in Leinster House claiming that “extremists” were “fermenting anti-migrant protests.” The presentation, given in November, called for politicians to tighten social media regulations.
Speaking last year, McDonald claimed that the far-right in Ireland was defined by the presence of Christian fundamentalism, anti-science belief and ethno-nationalism. She also claimed X, formerly Twitter, had become an unregulated space since its takeover by Elon Musk – a claim which the platform’s owner has strongly denied.
This week, McDonald said a key focus of the group would be on “how we can tackle” the business model of social media companies. Ms McDonald, on behalf of the NGO previously known as the Far Right Observatory, said she wanted to leave the committee in no doubt that the “severity” of what was being seen on social media “leaves us fearful for people’s lives,” including those seeking asylum, members of the LGBT community, and the Roma community.
She claimed that “the intensity of hate, lies and misinformation on social media has been on the increase for a number of years,” further claiming that “violence is now manifesting across society.”
Ms McDonald claimed there were a number of factors “aiding the growth” of the far right, and that extremists were weaponising the issues communities are facing “by sowing hate and division.”
She also criticised the government for its “row back” on policies for those seeking asylum.
The comments come after the government admitted before Christmas that it had run out of accommodation for international protection applicants, following warnings about the capacity to house rising numbers of people coming to Ireland. Ms McDonald said that the government’s “get tough” narrative on immigration was “appeasing and emboldening the far right,” while also warning of an “exponential growth of lies, hate, and disinformation” over the past five years.
“This is part of a well documented ‘playbook’, with the purpose of creating the illusion that there is more support for the far right than actually exists,” she said.
The Hope and Courage Collective went on to accuse social media companies of having “provided extremism with such a massive platform”, adding that the business model of social media companies represents “a direct attack on our democracy.”
Irish NGOs have previously been accused by advocacy groups including Free Speech Ireland (FSI) of attempting to shut down debate in Ireland. Matt Treacy, writing for Gript last year, referred to the Coalition Against Hate Crime (CAHC) as one such Irish NGO “created to push for censorship of views and restrictions in conflict with their own.”
There are 22 organisations affiliated with the CAHC, a group which was hosted by Green Party TD Patrick Costello in September and which has called for the enactment of the Hate Speech Bill.
TRUSTED FLAGGER STATUS
Ms McDonald went on to refer to the Hope and Courage Collective’s status as a “trusted flagger” with all social media platforms. It follows the passing of the Digital Services Bill in January, with the Dáil hearing that the new EU law would have a “particular impact on free speech and the public realm.”
Under the law, organisations that wish to identify illegal online content can apply to qualify officially as “trusted flaggers.” Trusted flaggers operate as government-appointed entities – specifically not individuals – who take on the role of identifying and notifying platforms of content on their site which could potentially be illegal.
Speaking during Committee Stage on the Bill last month, Senator Sharon Keogan hit out at “the abuse of power” which she said had become “widespread” due to online fact checkers.
“Ever since 2016, the online spaces of social media have been plagued with the ambiguous fact checkers, the individuals and bodies which often, by some unknown mechanism, are granted the power to remove or suppress content that is deemed untrue or partially untrue,” the Independent Senator said.
“Obviously, the abuse of this power has become widespread almost immediately as partisan ideological moderators crack down on anything that challenges their worldview or paints their opponents in a favourable light.
“Mark Zuckerberg famously once told the US Congress that these fact checks were simply opinions given by individuals or bodies in order to not have Facebook liable as a publisher.
“The moniker of fact checking has now become a running joke online, synonymous with being merely an alternative opinion lending undue weight.
“Now that the fact checking frenzy has died down and largely gone away, Ireland wants to introduce this on a statutory basis and give people more power than ever before because we are always five years late and we seem to exclusively adopt policies that other countries are abandoning after seeing that they do not work,” she told the House.
“Many people in our communities are running a dangerous gauntlet of hate and extremism that is amplified by social media,” Ms McDonald continued, referring to “those suffering the harms of recommender systems.”
Social media platforms, Ms McDonald said, “have directly supported white supremacists to spread hate in towns and villages, not only across Ireland, but across the globe.”
“The solution,” according to the NGO, would be to turn off recommender systems by default.
Social media platforms have long used recommender algorithms to determine what users may like to see on their feeds, meaning people are directed to pages or services they may otherwise not have found on their own. Such algorithms provide relevant suggestions based on the choices social media users have made.
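The basic idea behind such systems can be illustrated with a toy collaborative-filtering sketch. The function and data below are purely hypothetical examples, not any platform’s actual algorithm: pages a user has not yet followed are suggested based on what similar users follow.

```python
# Toy illustration of a recommender system (hypothetical, for explanation only):
# suggest pages a user hasn't followed, ranked by how often they are followed
# by other users who share at least one page with that user.
from collections import Counter

def recommend(user, follows, top_n=3):
    """Return up to top_n pages `user` doesn't follow, scored by co-follows.

    follows: dict mapping each user to the set of pages they follow.
    """
    mine = follows[user]
    scores = Counter()
    for other, theirs in follows.items():
        if other == user or not (mine & theirs):
            continue  # skip users with no overlapping interests
        for page in theirs - mine:
            scores[page] += 1  # each similar follower adds weight
    return [page for page, _ in scores.most_common(top_n)]

# Hypothetical data: who follows which pages.
follows = {
    "alice": {"news", "sport"},
    "bob":   {"news", "cooking"},
    "carol": {"sport", "cooking", "travel"},
}

print(recommend("alice", follows))  # → ['cooking', 'travel']
```

In this sketch, “turning off” the recommender would simply mean showing users only the pages they chose to follow, rather than surfacing `cooking` and `travel` because similar users follow them.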
“This will not prevent people posting on social media, but it will reduce the amplification of hateful and dangerous posts. The right to freedom of speech is not the same as the right to amplification by global social media platforms,” Ms McDonald said.
The organisation further urged the committee to write to Media Minister Catherine Martin requesting that she set out an approach to holding social media companies to account.