How Instagram Encourages Self-Harm

*Potentially triggering* Please be careful xx

Instagram is under increasing pressure to make its platform safer following the death of Molly Russell, who took her own life aged 14 in 2017 after being immersed in self-harm & suicide content on social media.

Following an inquest, Molly’s father has said that he believes Instagram is partly responsible for his daughter’s death.

In response, Instagram said that it “does not allow content that promotes or glorifies self-harm or suicide and will remove content of this kind.” Their terms & conditions state that content that embraces or encourages self-injury will be removed – or accounts will be disabled – if it is reported to them.

I know from experience that this statement from Instagram is entirely false. Content rarely gets deleted when reported, & some extremely dangerous content is easily available to young or vulnerable people – especially on private accounts. I once came across a video of a young woman self-harming that I found so distressing that I reported it several times a day for about a month before realising nothing was going to be done about it. As far as I know that user is still active & still posting self-harm content.

I will always advocate for honest conversations as a way to break down stigmas and improve mental health, & whilst social media can be a place to vent to like-minded people & somewhere to turn when you’re struggling, I see no benefit in uploading or viewing self-inflicted injuries. When you’re entrenched in the addiction of self-harm, it can only serve to normalise & perpetuate the behaviour.

I speak from my personal experience with self-harm, & from the experience of using Instagram to share it. Whilst I feel some shame in admitting this, it is important that I share my experience in the hope it will help others.

I have a separate, private account that I use purely to document or vent about my own self-harm. Initially I used it because I had no one ‘in real life’ to talk to, or who understood, but it soon became a way for me to justify my behaviour & convince myself it was okay because ‘all these other people are doing it’ & ‘these people accept it’ & ‘if they’re not getting help then I don’t need it either’. It took me a good couple of years to realise & acknowledge that that’s what I was using the account for, & even longer than that to stop lying to myself that I was using it to get support.

I believe that a lot of users will hurt themselves to receive the validation these accounts offer – whether they realise it or not – especially if they are feeling invalidated or unheard offline. That invalidation is perpetuated by social media’s tendency to make some people feel inadequate if they don’t get enough likes/comments/followers/shares.

Users are setting up their phones to record themselves cutting, burning or bruising their skin, or filming blood dripping from their bodies. There’s an element of competitiveness – the worse you hurt yourself, the more attention you get. That’s an incredibly powerful message to send to a young or vulnerable mind. Captions such as ‘I’ll do better next time’ or ‘my cuts are pathetic’ are plentiful.

I have had users comment ‘I wish I could cut as deep as you’ & others asking me how & what I used to do it. I never answered the questions, only discouraged people from doing it, but there is A LOT of ‘tip sharing’ going on about how to ‘cut deep’, how to avoid medical attention, & how to keep your behaviour a secret from family & friends. This happens despite Instagram’s claim that its “… community cares for each other, and is often a place where people facing difficult issues such as eating disorders, cutting, or other kinds of self-injury come together to create awareness or find support.”

There are many people behind these accounts who are in genuine danger of permanent damage or accidental death from their behaviour (not to mention suicide). I have been distressed, triggered & terrified by some of the content & have felt powerless to intervene beyond words of advice & support.

Instagram’s intervention against content promoting self-injury, suicide or eating disorders relies on other users reporting it – something which isn’t very common on private accounts, where users can select their followers based on having a similar account. Users can identify these accounts by profile images & certain keywords in the bio.

If reported, users will receive a pop-up when they next view their account. Clicking ‘See Support Resources’ takes users to a page where they are advised to talk to a friend or to the Samaritans (UK). It also provides very basic tips for relaxation, including going for a walk, having a snack or looking at clouds. I find this particularly patronising & over-simplistic, especially when considering the complex set of issues which lead to self-harm & emotional distress.

Clicking ‘Skip’ allows you to carry on using your account as normal, with no repercussions – not even a temporary ban.

I have received this pop-up on countless occasions; only once was an image deleted. Every other image stayed up, & the account remained active.

I now use the account more to offer support & advice to the users I am following – & to serve as a warning to those just starting to self-harm of how addictive & dangerous it can become. It never seems to make a difference or even sink in. I have offered to ring or email parents or teachers on users’ behalf to get them some support, but they’re not interested; they’re too entrenched in the behaviour. A behaviour accepted & validated by their Instagram activity & by the content shown to them by Instagram’s algorithms.

I find that I have urges to self-harm far more when I have been on that account. I strongly believe I would be further along in my recovery if I’d never created it. If it has that effect on me as a recovering adult, then it is definitely having that effect on younger users, or on more vulnerable adults.

Obviously there are many reasons why people struggle with their mental health, why they may turn to self-harm & why they are not getting the support that they need. Instagram cannot be held solely responsible for this, but they do have the power to put more controls in place on their platform to protect vulnerable users. It should not be a go-to platform for these behaviours to be learned about, validated & encouraged.

You can find more information about self-harm & recovery resources at MIND.

If you are a parent or carer of a child who uses Instagram, please aim to be aware of what they are using the platform for. Create an open environment for discussing what they are doing online, as well as any worries or difficult feelings they may have. Harmful content is out there for children to see – whether on Instagram or another platform – & the only way to protect children from normalising this content is to make sure we have open conversations with them & encourage good mental health.

Instagram provides tips for parents HERE.

And a downloadable parents’ guide to Instagram HERE.  

 
