I deleted my Instagram page yesterday (11 October 2022) – a protest against Instagram and its obscene indifference in permitting posts, with rare exception, that promote self-hate, self-harm and suicide worldwide, especially among teenagers.
Countless teenagers and others, younger and older, suffer due to the harmful posts they receive. Instagram avoids taking full responsibility for the vile content on its platform. Instead, it issues meaningless rhetoric in the form of apologies and claims the corporation works to make its AI/algorithms do better.
I read online news reports last week of the tragic death in 2017 of 14-year-old Molly Russell from Harrow in north London. The Coroner’s Inquest into her death concluded she died from “an act of self-harm while suffering from depression and the negative effects of online content.”
Photo shows Molly Russell, 14.
After years, a person with authority in a Coroner’s Court stated what many of us have believed for years – certain posts on social media platforms, such as Instagram and Pinterest, can contribute to negative effects, including the self-harm and suicide of teenagers such as Molly.
Does Instagram stand in the gutter of social media platforms?
In her last six months, Molly used her Instagram account on average more than 120 times a day. She also liked more than 11,000 pieces of content and shared material more than 3,000 times, including more than 1,500 videos. Molly wrote a note before she died.
Despite all its efforts to deny the impact of online content, Instagram made no real effort to stop being the agent for the global distribution of information containing hate and harm to teenagers. Governments and the Judiciary still permit the owners of Instagram to engage in self-regulation.
The Court revealed some of the content/videos found on the 14-year-old’s Instagram account. The Coroner warned that these videos were almost impossible to watch. A member of his staff had to leave the room while they were viewed.
Examples of Harmful Posts on Platforms and Responses from the Vulnerable
- A photo of blood spattered on a tiled floor and a picture of a coffin with words over the top: “Now everyone loves me”.
- A cartoon of a person who committed suicide.
- A photo of a lighter with the words: “We’re all addicted to something that takes the pain away.”
- If parents asked a depressed child/teen about their feelings, posts advised the teen to smile and say: “I’m fine. Everything is OK.”
- A post described fresh cuts as ‘perfect’.
- Posts romanticised acts of self-harm by young people on themselves.
- Posts discouraged discussion with those who may have been able to help.
- One vulnerable teen said the three main images she received in posts were cutting, burning and overdosing.
- A teenager shared pictures of her fresh cuts with 8,000 followers.
- A subscriber said: “It’s OK. It doesn’t matter how bad it gets because they’re not dead, it hasn’t killed them yet.”
The family of Molly had expressed “frustration and disappointment” at delays by Meta, the parent company of Instagram, in providing evidence to the inquest into her death.
Instagram did not provide the information from its algorithms containing the names of accounts that sent harmful material to Molly. The Court heard the family found it very difficult to understand how a social media platform, like Instagram, could send material “unilaterally to a child but cannot send it to the family or coroner after her death”.
Instagram and Pinterest tried to wriggle out of their responsibilities. Instagram argued at a pre-inquest review that its evidence should be given via video link from America. Its lawyers said Instagram’s head of Health and Wellbeing policy would require “significant travel and time in London” to attend in person. Pinterest argued one of its executives could not travel to London due to an important business meeting.
Only a month before the Inquest and five years after Molly’s death, Instagram finally provided vital information on harmful posts. This long delay surely increased the number of vulnerable teenagers who engaged in self-hate, self-harm and suicide. The inquest should have been held years earlier.
Senior Coroner Andrew Walker ruled against the tech companies and ordered that their witnesses appear in court in person to give evidence.
Meta seems to have made the bereaved Russell family wait as long as possible to get answers regarding their child’s death. Zuckerberg and his empire reveal a lack of empathy and compassion for those in grief. Ian Russell, the father of Molly, told the media such platforms “prioritise profit by monetising the misery of children.”
The Coroner’s Court heard that Molly saved, liked or shared 2,100 posts related to depression, self-harm and suicide. She checked Instagram hours before she killed herself. The coroner ruled she saw images which “shouldn’t have been available for a child to see”.
Power, addiction to growth and profit appear to take priority for Meta over mental health. Meta reveals its own mental health issues – corporate denial, corporate narcissism and an inability to take full responsibility as merchants of misery. Molly’s loving father referred to the ‘life-sucking content’ of social media.
Parents and guardians listening to the loving and thoughtful words of Ian Russell could sense the heartbreak and the anguish within him. He said social media had created a ‘monster’ and Meta CEO Mark Zuckerberg had to listen to the conclusion of the Court.
Parents, guardians, schoolteachers and others need to provide as much guidance and support to the young as they can, so the young can protect themselves from the abuse that social media permits. That is a major undertaking, but necessary due to the repulsive behaviour of corporations, which must bear responsibility.
Pedlars of Pain
Instagram’s algorithms ensure binge periods of posts for teens, regardless of the teens’ wellbeing and without them even requesting the content.
This crude platform refuses to engage in a fundamental change of the algorithmic systems and design features set up to recommend all manner of posts for followers of Instagram to read.
An Instagram spokesperson said: “Our thoughts are with the Russell family and everyone who has been affected by this tragic death. We’re committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers.” Committed? Committed?
Mark Zuckerberg, Meta’s chief executive, has claimed social media was more likely to have positive mental health effects. So that makes self-hate, self-harm and suicide acceptable?
Adam Mosseri, head of Instagram, said he had seen research suggesting its effects on teenagers’ mental health were probably “quite small”. That is like saying Putin’s war on Ukraine is probably quite small.
Facebook Vice-President of Global Affairs and Communications Sir Nick Clegg said the company would do “whatever it takes” to make the platform safer for young people. He is a former British deputy prime minister. Where is the evidence to show Meta is doing whatever it takes to protect users of Meta’s platforms? It sounds like a diversionary platitude.
Meta makes its money by sending endless advertisements for products to users of its platforms. Having collated what users like, it endeavours to ensure Instagram users get sucked into advertisements. More clicks then convert into sales. Nothing must interfere with the business model, not even the mental health of teens. Algorithms also hook people into spending increasing time on the platform. Depression is not far from unhappiness.
One inquiry, which surveyed more than 3,000 young people, showed that social media can also lead to:
• damaged sleep patterns
• body image issues
• online grooming
• child abuse.
When the media flag a harmful post to Facebook, the platform immediately deletes it to show it is responding. But it is the entire body of algorithms that needs fundamental change, not tinkering with a handful of posts out of a possible tens of thousands or far more. Meta refuses to reveal the outcome of its research into harmful posts.
One senior researcher, an associate professor, wanted to see how fast she could get to a graphic image with blood, obvious self-harm or a weapon involved. “It took me about a minute and a half,” she said. Many of the posts were made under hashtags that included slight misspellings of banned or restricted searches.
We read reports and watch documentaries that AI/algorithms have taken over and are beyond control. Social media wants the public to believe this so they can continue to avoid the application of ethics and integrity to their platforms.
One documentary on algorithms was titled The Social Dilemma – an irresponsible name for a documentary investigating the use and abuse of AI and algorithms. There is no dilemma between ethical and unethical corporate behaviour. It requires action. Despite interviews with former social media bosses and senior AI/algorithm engineers, the documentary offers no solutions.
Global Influence of Instagram
• Instagram generated an estimated $47.6 billion revenue in 2021, accounting for almost 50% of Facebook’s total revenue.
• Instagram employs only 450 people, with just a handful of staff concerned with posts and mental health.
• Over 70% of Instagram users are under 35 years old
• Over two billion people use Instagram at least once a month
• Since 2018, Instagram has spent the largest portion of its budget on marketing to teenagers.
Instagram obsesses about keeping teens on the platform as most teens have long since exited Facebook. Teens can spend up to several hours a day on Instagram. A report said 32 per cent of teen girls said Instagram made them feel worse when they felt bad about their bodies.
A study found a total of more than 1.2 million Instagram posts over a period contained one of five popular hashtags related to self-injury: #cutting, #selfharm, #selfharmmm, #hatemyself and #selfharmawareness.
Ten Actions towards Eradicating the Harmful Behaviour of Instagram and Other Platforms on Under-16-Year-Olds and Others
1. To pass laws to end self-regulation of Instagram and social platforms
2. To appoint an independent Board of Governors for each of the platforms.
3. Experts in the fields of social media, social justice and related areas to regulate social media, with the power to advise the Crown Prosecution Service (CPS) to launch prosecutions against Instagram and others for criminal corporate behaviour, as accessories to causing mental suffering and grievous bodily harm to vulnerable subscribers to the platform.
4. To sustain critiques of Instagram and encourage whistle-blowers to expose the suffering Instagram permits to be inflicted on teenagers.
5. To expose Instagram so that no self-respecting individual would wish to work for Instagram
6. A Board of Governors on Corporate Behaviour to be created to advise the board of Meta and senior managers that they will be held accountable for the distribution of posts that contribute to self-hate, self-harm and suicide.
7. Instagram to increase its staff tenfold so that the new staff can ensure the algorithms change to maximise public mental health worldwide.
8. The Governors to set a time period for resolution of these issues. Failure to act will mean the closure of Meta in its entirety.
9. All social media companies to hand over to the authorities all their data on the impact of harmful posts on users of their platforms, instead of cherry-picking research.
10. Since any youngster can join Facebook without permission from an adult, governments must pass laws requiring under-16-year-olds to obtain authorisation from a parent/guardian to join a social platform. The parent/guardian must provide confirmation of identity, such as a passport photo, and confirmation that they live with the young person. The platform then provides the parent/guardian with a password for the young person to use. In the event of any forgery, the court imposes a fine on the parent/guardian and the pages of both adult and youngster are deleted.
Mr Russell concluded his remarks outside the Coroner’s Court by paying tribute to Molly. He thanked her for being his daughter – a brief, touching and precious response. Parents, guardians, teachers and many more empathised with his heartfelt words, which reminded us all to be vigilant about online content and the routine abuse of the young, and to hold these platforms accountable.
Social media platforms act as surveillance agencies that market products and get their subscribers addicted to their platforms in the long term.
Social media let down his daughter, her family and countless others. That’s what happens when greed matters more than grief.
For those needing advice and support (from the BBC):
In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email email@example.com. In the UK and Ireland, Samaritans can be contacted on 116 123 or by emailing firstname.lastname@example.org or email@example.com. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org. You can contact the mental health charity Mind by calling 0300 123 3393 or visiting mind.org.uk
Christopher is the grandfather of four children.