
Social media platforms frequently recommend harmful content to young people: what parents need to know about algorithms


19th April 2024

 

Prof. Debbie Ging and Deniz Celikoglu, Anti-Bullying Centre, DCU

 

Illustration by Willow Cahill

 

This week in Ireland, two studies revealed that young people are at risk of being served harmful content by the social media platforms TikTok and YouTube Shorts. In the first, an investigative report by the national public service broadcaster RTE, journalists created three new TikTok accounts on three newly registered phones, each set to an age of 13. Without actively engaging with content or searching for specific topics, they observed and recorded the videos recommended by TikTok’s algorithm. Within 20 minutes, the accounts were shown videos referencing self-harm and suicide. After an hour of engagement, the recommender system was serving a constant stream of videos referencing depression, self-harm and suicidal thoughts. The findings are of serious concern to psychologists, particularly given that rates of self-harm are rising fastest among those aged 10 to 14.

The day after this programme aired, our own Anti-Bullying Centre published a report on the role of recommender algorithms on TikTok and YouTube Shorts in promoting male supremacist influencers to boys and young men. The manosphere, an online network of anti-feminist and male supremacist groups, has been growing steadily and becoming increasingly mainstream as social media algorithms amplify its messages. However, the social media companies do not reveal how their recommender algorithms work, so these systems operate as ‘black boxes’.

In response to this lack of algorithmic transparency, a number of recent studies have turned to experimental or ‘reverse-engineering’ methods to reveal the ways in which TikTok, in particular, exposes young users to various types of extreme content. Two recent (2023) global Amnesty International reports (Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation and I Feel Exposed: Caught in TikTok’s Surveillance Web) highlight how TikTok exposes children and young people with pre-existing mental health issues to depressive and suicidal content, including videos that romanticise and encourage depressive thinking, self-harm and suicide. In 2022, Reset Australia also used experimental accounts to track the content that YouTube and YouTube Shorts most frequently recommended to boys and young men and, more recently, a similar study by Kaitlyn Regehr and colleagues in the UK found that after only five days of TikTok use there was a four-fold increase in the level of misogynistic content presented on the platform’s ‘For You’ page.

The DCU Anti-Bullying Centre study, conducted by Professor Debbie Ging, Dr Catherine Baker and Dr Maja Andreasen, tracked, recorded and coded the content recommended to 10 experimental or ‘sockpuppet’ accounts on 10 blank smartphones – five on YouTube Shorts and five on TikTok. The researchers found that all of the male-identified accounts were fed masculinist, anti-feminist and other extremist content, irrespective of whether they sought out general or male supremacist-related content, and that they all received this content within the first 23 minutes of the experiment.

Once the accounts showed interest by watching this sort of content, the amount rapidly increased. By the last round of the experiment (after two to three hours of viewing), the vast majority of the content being recommended to the phones was harmful (76% on TikTok and 78% on YouTube Shorts), primarily falling into the manosphere (alpha male and anti-feminist) category. Much of this content rails against equality and promotes the submission of women. There was also a large amount of content devoted to male motivation, money-making and mental health. This material strategically taps into boys’ financial and emotional insecurities and is particularly dangerous in relation to mental health, as it frequently claims that depression is a sign of weakness and that therapy is ineffective. The other toxic categories were reactionary right and conspiracy content, which accounted for 13.6% of recommended content on TikTok and 5.2% on YouTube Shorts.

The findings of the report point to urgent and concerning issues for parents, teachers, policy makers, and society as a whole. For parents, the authors emphasise the need for open discussions with young people, without fear of rebuke. Rather than expressing disgust or outrage at celebrity ‘manfluencers’, they recommend that parents listen to their children and discuss why they are attracted to these figures. The reasons may be complex and varied, and it is important to understand where boys’ anxieties come from if we are to address them effectively.

In addition to this, both children and parents need support and resources in critical digital media literacy to understand how influencer culture and the algorithmic architectures of social media platforms actually work. The research of one of our PARTICIPATE Doctoral Candidates, Deniz Celikoglu, responds directly to this need. As the reports above indicate, parents’ digital literacy skills could play an important role in mitigating the online risks associated with children’s algorithmically curated social media consumption. It is important to understand that children’s online and offline lives have become inextricably linked and, while they may be intensely tech-savvy, young people are not necessarily aware of how algorithms shape their digital engagements. Despite the critical role of parents in their children’s social media consumption, research focusing on parents’ algorithmic literacy has been very limited (Das, 2023; Taylor & Brisini, 2024). Although some information about algorithms is available online for parents (e.g. Childnet), there is a need for more detailed resources that go beyond practical tips.

Many parents feel ill-equipped to navigate the complex system of algorithms. A recent study of parents’ understanding of social media algorithms revealed that they were aware of the risks of algorithms but believed that their concerns were an issue for the future (Das, 2023). The same research found that many parents misunderstood how algorithms worked and held incorrect assumptions about certain features of social media platforms. As the reports described above show, many young people are at risk of being exposed to harmful mental health content, and boys are especially vulnerable to being indoctrinated into manosphere thinking. While parents can both create risk factors and provide protection for their children, there is currently very little guidance available to help them understand and navigate this complex digital landscape.

To address this lack of algorithmic awareness in parents’ digital literacy, PARTICIPATE’s Work Package 4 will investigate parents’ awareness of the technological affordances of social media and mobile technologies, and will produce a set of Digital Safety Guidelines for Parents. Deniz’s work in particular will explore parental awareness of recommender algorithms, as well as the push factors that might make children more susceptible to indoctrination into these spaces. Based on the data gathered from her fieldwork, Deniz will create a toolkit for parents that specifically addresses how algorithms work, provides guidance on how to prevent online risks and utilise online opportunities (Taylor & Brisini, 2024), and identifies ‘red flags’, or signs that their children may be coming under the influence of harmful online groups and ideas.

The full-length report ‘Recommending Toxicity: the role of algorithmic recommender functions on YouTube Shorts and TikTok in promoting male supremacist influencers’ is available here: antibullyingcentre.ie/recommending-toxicity/

You can read more about the RTE Prime Time report here – https://about.rte.ie/2024/04/18/rte-prime-time-experiment-reveals-disturbing-content-recommended-to-13-year-old-tiktok-users-in-ireland/

 

Useful Resources for Parents

 

Positive masculinity interventions

 

Webwise – https://www.webwise.ie/

The Positive Masc Project – https://positivmasc.ki.se/

Beyond Equality – https://www.beyondequality.org/

Hope Not Hate – https://hopenothate.org.uk/communities/in-schools/

BBC Trending: How to Exit the Manosphere – https://www.bbc.co.uk/programmes/w3ct5d95

Childnet – https://www.childnet.com/blog/algorithms/

 

Statistics

 

European Institute for Gender Equality: Gender Equality Index, Ireland (2022) – https://eige.europa.eu/modules/custom/eige_gei/app/content/downloads/factsheets/IE_2022_factsheet.pdf

Rape Crisis Network Ireland (RCNI) 2022 Statistics – https://www.rcni.ie/wp-content/uploads/RCNI-Rape-Crisis-Statistics-2022.pdf

 

Podcasts

 

Who is Andrew Tate? The Journal.ie Explainer – https://podcasts.apple.com/ie/podcast/who-is-andrew-tate/id1452246930?i=1000595850526

Now and Men podcast: Men, Masculinities and Gender Equality – https://menengage.org/resources/now-and-men-podcast-men-masculinities-and-gender-equality/

 

 
