Instagram has launched new technology to recognise self-harm and suicide content on its app in the UK and Europe.
The new tools can identify both images and words that break its rules on harmful posts.
Molly Russell, 14, took her own life in 2017. When her family looked into her Instagram account they found distressing material about depression and suicide.
Molly's father Ian says he believes Instagram is partly responsible for his daughter's death.
Natasha MacBryde was a beautiful, clever girl who hoped to become a vet or paediatrician. She seemed perfectly happy at school until a few months after her 15th birthday, when she suddenly wanted to dye her hair brown.
Her mother Jane discovered, over the next few weeks, that Tasha (as the family called her) had become the target of nasty messages from fellow pupils on Formspring, a website (which has since closed) where the senders could remain anonymous.
She was further upset at being rejected by a boy she liked. On the evening before Valentine's Day, she looked at Formspring one last time - with its latest, poisonous message - then slipped out of the house and walked to a nearby railway track. She was hit by a train.
A powerful and upsetting interactive site displays the 600 messages that led a young girl to take her own life.
Facebook is rolling out a new feature across the UK to help users who feel suicidal.
The Suicide Prevention tool has been developed in conjunction with the Samaritans.
It aims to provide advice and support for people struggling to cope, as well as for their friends and family.
People can now report posts they are worried about in a more direct way.