Social media apps will have to shield children from dangerous stunts


Social media firms will be ordered to protect children from encountering dangerous stunts and challenges on their platforms under changes to the online safety bill.

The legislation will explicitly refer to content that “encourages, promotes or provides instructions for a challenge or stunt highly likely to result in serious injury” as the type of material that under-18s should be protected from.

TikTok has been criticised for content featuring dares such as the blackout challenge, which encouraged users to choke themselves until they passed out, and another that encouraged users to climb precarious stacks of milk crates.

The app has banned such stunts from its platform, with its guidelines stating that it does not allow “showing or promoting dangerous activities and challenges”.

The bill will also require social media companies to proactively prevent children from seeing the highest risk forms of content, such as material encouraging suicide and self-harm. Tech firms could be required to use age-checking measures to prevent under-18s from seeing such material.

In another change to the legislation, which is expected to become law this year, social media platforms will have to introduce tougher age-checking measures to prevent children from accessing pornography – bringing them in line with the bill’s measures for mainstream sites such as Pornhub.

Services that publish or allow pornography on their sites will be required to introduce “highly effective” age-checking measures, such as tools that estimate someone’s age from a selfie.

Other amendments include requiring the communications watchdog Ofcom to produce guidance for tech firms on protecting women and girls online. Ofcom, which will oversee implementation of the act once it comes into force, will be required to consult with the domestic abuse commissioner and victims commissioner when producing the guidance, in order to ensure it reflects the voices of victims.

The updated bill will also criminalise the sharing of deepfake intimate images in England and Wales. In a further change, it will require platforms to ask adult users whether they wish to avoid content that promotes self-harm or eating disorders, or racist content.

Once the law comes into force, breaches will carry fines of up to £18m or 10% of global turnover. In the most extreme cases, Ofcom will be able to block platforms.

Lady Kidron, the crossbench peer and campaigner on children’s online safety, said it was a “good news day for kids”. The government also confirmed that it is adopting changes allowing bereaved families easier access to the social media histories of deceased children.

Richard Collard, associate head of child safety online policy at the NSPCC, said: “We’re pleased that the government has recognised the need for stronger protections for children in this crucial piece of legislation and will scrutinise these amendments to make sure they work in practice.”

Paul Scully, the technology minister, said the government aimed to make the bill the “global standard” for protecting children online: “This government will not allow the lives of our children to be put at stake whenever they go online; whether that is through facing abuse or viewing harmful content that could go on to have a devastating impact on their lives.”


