  • All graphic self-harm images will be removed from site: Instagram

    [File photo: Instagram vows to remove self-harm images]

    All graphic images of self-harm will be removed from Instagram, the head of the social media platform has said.

    The move comes after the father of 14-year-old Molly Russell, who took her own life in 2017, said Instagram had "helped kill" his daughter.

    Molly's family found she had been viewing graphic images of self-harm on the site prior to her death.

    Adam Mosseri said Instagram was trying to balance "the need to act now and the need to act responsibly".

    He added the site was "not where we need to be on the issues of self-harm and suicide".

    When asked by the BBC's Angus Crawford when the images would be removed, Mr Mosseri replied: "As quickly as we can, responsibly."

    Molly's father Ian Russell welcomed Instagram's commitment and said he hoped they would act swiftly to implement their plans.

    "It is now time for other social media platforms to take action to recognise the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people," he added.

    Health Secretary Matt Hancock described the death of Molly Russell as "every parent's modern nightmare".

    He said it was right for Instagram to take down "the most graphic material" but added that "we need to be led by what the clinicians and experts say needs to be taken down".

    Speaking after a meeting with social media companies as well as the Samaritans, Mr Hancock said he wanted to see a duty of care for all users of social media and that he was "perfectly prepared to legislate if necessary".

    Digital minister Margot James told BBC Radio 4's PM programme the government would "have to keep the situation very closely under review to make sure that these commitments are made real - and as swiftly as possible".

    Instagram currently relies on users to report graphic images of self-harm, but Mr Mosseri said the company was looking at ways that technology could help solve the problem in the future.

    He added: "Historically, we have allowed content related to self-harm that's 'admission' because people sometimes need to tell their story - but we haven't allowed anything that promoted self-harm.

    "But, moving forward, we're going to change our policy to not allow any graphic images of self-harm."