Toxic Citizenship, Everyday Extremism and Social Media Governance



The violent rioting in the UK in early August 2024 and the decision by North East Ambulance Service (NEAS) to leave X (Twitter), announced on 16th August 2024, may appear to be entirely unconnected events. Yet both reflect widespread concern that social media is actively promoting division and networks based on hate-fuelled violence. As NEAS stated on leaving X: “We feel strongly that the failure to police content on X allows the perpetuation of unacceptable and offensive content, which has seen a sharp rise in hate speech and misinformation that is not consistent with our values” (Mark Cotton, NEAS assistant director of communications and engagement). What NEAS and the riots alike have brought into sharp focus is the lived online realities of vast swathes of the public and the need to fundamentally rethink social media governance.

The UK government has suggested that it may revisit the Online Safety Act in light of the riots – this is vital. Misinformation, hateful rhetoric and the co-ordination of violent disorder on social media were integral to the spread of the riots. However, the suggestion to extend the Act’s scope – so that social media companies are legally responsible for failures to police the algorithms within their platforms that allow misinformation to flourish – will do little to address the fundamental issues.

Instead, the UK government needs to review the foundations of social media governance – starting from an updated assumption that free speech online needs to be earned and is not an automatic entitlement. Otherwise, reforms centred on targeting misinformation will simply continue to uphold the existing security-centric approach to social media. Historically, the primary focus of government in the UK and elsewhere has been to root out would-be terrorists and to protect children (in particular) from illegal, malicious and harmful content online (see also Yar, 2018). The current proposals continue this security-centric approach by positioning social media companies as co-conspirators alongside harmful individuals or groups in creating a climate and pathway through which online violence morphs into real-world violence via the spread of misinformation. Rather, the governance of online interactions should be guided by the promotion of citizenship for all.

For too long, defending free speech has been treated as the untouchable guiding principle, with the internet and social media defended as ‘public goods’. Initially, this resulted in widespread optimism that social media was integral to holding anti-democratic regimes to account and could yield positive and productive social change, with the high watermark of this optimism perhaps being the Arab Spring of the early 2010s (Comunello & Anzera, 2012). Yet social media and the internet were never public spaces and have never enabled free speech in the ways the idealists claim. Social media corporations are big business, driven by ‘surveillance capitalism’ (Zuboff, 2019), with underpinning logics that generate profits through interaction with content – and hateful, antagonistic and divisive content drives greater interaction than the opposite (see Munn, 2019; Ribeiro et al., 2020 for a discussion).

The riots and the response of NEAS alike are thus reflections of the growth of what I here term ‘toxic citizenship’, made manifest in extreme forms in the rioting and violence unleashed on UK streets in August 2024. Social media has become a space in which widespread hate, misogyny and what I elsewhere term ‘everyday extremism’ are the prices that the majority must pay for the ‘benefits’ that social media yields. Members of the public frequently report trolling (being subjected to hostile interactions by those who seek to provoke a response), doxing (the unwarranted exposure of personal details online) and, more widely, being victims of abuse online, for example in relation to their race, gender or political views (Hannan, 2018; Burke, 2015). The riots are thus an extreme manifestation of experiences encountered by millions on a daily basis.

Aside from the mental health toll, the consequences for active citizenship are profound. Fear of being attacked by others results in the majority either not speaking on issues at all, or confining comments to forums populated by like-minded individuals within ‘echo chambers’ (Quattrociocchi et al., 2016). The expectation amongst large numbers of the public is that if they speak out on particular issues, they will be met with extreme hostility. These effects are felt most acutely by women, ethnic minorities, ecological activists, trans activists and other marginalised groups (see Döring and Mohseni, 2020 on gendered online hate speech).

The starting point for a review of the governance of social media should thus be the assumption that all social media platforms should be spaces that organisations such as NEAS would wish to be part of. In short, profound implications for citizenship stem from the self-exclusion and silencing of so many of the population on social media platforms – people who are fearful of speaking out on issues because they expect to be shouted down, or worse, by a highly vocal minority (Griffin, 2023).

One possible way forward would be to begin with thinking prompted by a parallel public sphere – namely, the current approach to the regulation of UK football stadia. This may seem an unlikely comparator, but we need to start with radical thinking, not least because it will prevent the inevitable path dependencies that emerge from changes locked into a security-centric, industry-focused logic. Thinking inspired by UK football regulation is also highly instructive for other reasons. Historically, football was itself highly toxic, with racism openly practised in stadia – such behaviour was widely seen as integral to the culture of the game (Jewell et al., 2014). However, whilst still not perfect, regulation of UK football is now extensive – for example, courts can issue football banning orders, which prevent transgressors from entering stadia, while hateful, racist and homophobic chanting is illegal and can be met with criminal prosecution (Pearson, 2021). The result has been a change in football culture centred on the growing self-policing of unacceptable behaviour (Pearson, 2012: 162-7).

The lessons are clear. Similar change to social media culture will take a long time, but to have any meaningful impact lawmakers must move beyond advocating incremental change and ask more searching questions – fundamentally, what sort of society have we become when an ambulance service is driven off social media?

References

Comunello, F. & Anzera, G. 2012. “Will the Revolution be Tweeted? A Conceptual Framework for Understanding the Social Media and the Arab Spring”. Islam and Christian–Muslim Relations, 23(4): 453–470. https://doi.org/10.1080/09596410.2012.712435

Döring, N. & Mohseni, M.R. 2020. “Gendered Hate Speech in YouTube and YouNow Comments: Results of Two Content Analyses”. SCM Studies in Communication and Media, 9(1): 62-88. https://doi.org/10.5771/2192-4007-2020-1-62

Griffin, R. 2023. “Public and Private Power in Social Media Governance: Multistakeholderism, the Rule of Law and Democratic Accountability”. Transnational Legal Theory, 14(1): 46–89. https://doi.org/10.1080/20414005.2023.2203538

Hannan, J. 2018. “Trolling Ourselves to Death? Social Media and Post-truth Politics”. European Journal of Communication, 33(2): 214-226. https://doi.org/10.1177/0267323118760323

Jewell, R.T., Simmons, R., & Szymanski, S. 2014. “Bad for Business? The Effects of Hooliganism on English Professional Football Clubs”. Journal of Sports Economics, 15(5): 429-450. https://doi.org/10.1177/1527002514535169

Munn, L. 2019. “Alt-Right Pipeline: Individual Journeys to Extremism Online”. First Monday, 24(6). https://doi.org/10.5210/fm.v24i6.10108

Pearson, G. 2012. An Ethnography of Football Fans: Cans, Cops and Carnivals. Manchester University Press.

Pearson, G. 2021. “A Beautiful Law for the Beautiful Game? Revisiting the Football Offences Act 1991”. The Journal of Criminal Law, 85(5): 362-374. https://doi.org/10.1177/00220183211007269

Quattrociocchi, W., Scala, A. & Sunstein, C.R. 2016. “Echo Chambers on Facebook”. SSRN Working Paper. http://dx.doi.org/10.2139/ssrn.2795110

Ribeiro, M.H., Ottoni, R., West, R., Almeida, V.A.F. & Meira Jr., W. 2020. “Auditing Radicalization Pathways on YouTube”. FAT* ’20: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, pp. 131-141. https://doi.org/10.1145/3351095.3372879

Winkelman, S.B., Oomen-Early, J., Walker, A.D., Chu, L. & Yick-Flanagan, A. 2015. “Exploring Cyber Harassment Among Women Who Use Social Media”. Universal Journal of Public Health, 3(5): 194-201. https://doi.org/10.13189/ujph.2015.030504

Yar, M. 2018. “A Failure to Regulate? The Demands and Dilemmas of Tackling Illegal Content and Behaviour on Social Media”. International Journal of Cybersecurity Intelligence & Cybercrime, 1(1): 5-20. https://doi.org/10.52306/01010318RVZE9940

Zuboff, S. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.
