The UK government has not ruled out further tightening existing online safety rules by adding an Australian-style social media ban for under-16s, Technology Secretary Peter Kyle has said.
Back in the summer, the government warned it could tighten laws on tech platforms after riots believed to have been fueled by online disinformation that spread following a knife attack that left three girls dead.
It has since emerged that some of those charged over the riots were minors, raising concerns about the influence of social media on impressionable, developing minds.
Appearing on BBC Radio 4's Today program on Wednesday, Mr Kyle was asked whether the government would ban under-16s from using social media. He replied, “Everything is on the table with me.”
Kyle was being interviewed as the Department for Science, Innovation and Technology (DSIT) set out its priorities for implementing the Online Safety Act (OSA), which was passed by Parliament last year.
The OSA cracks down on a wide range of online harms, from cyberbullying and hate speech to intimate image abuse, fraudulent advertising and animal cruelty, with British lawmakers saying they want to make the country one of the safest places in the world to be online. The strongest driver is child protection, with lawmakers responding to concerns that children are accessing harmful and inappropriate content.
DSIT's Strategic Priorities Statement continues this theme, placing child safety at the top of the list.
Strategic priorities for online safety
The full text of DSIT's five priorities for OSA is as follows:
1. Safety by design: Embed safety by design to deliver safe online experiences for all users, especially children, tackle violence against women and girls, and work to ensure there is no safe haven for illegal content and activity, including fraud, child sexual exploitation and abuse, and illegal disinformation.
2. Transparency and accountability: Ensure industry transparency and accountability from platforms on online safety outcomes, fostering greater trust and a broader evidence base that delivers safer experiences for users.
3. Agile regulation: Enable an agile approach to regulation, ensuring the framework is robust in monitoring and tackling emerging harms, such as AI-generated content.
4. Inclusion and resilience: Create an inclusive, informed, and vibrant digital world that is resilient to potential harms, including disinformation.
5. Technology and innovation: Foster innovation in online safety technologies to improve user safety and drive growth.
The reference to “illegal disinformation” is notable, because the last government removed a clause from the bill that focused on this area, citing free speech concerns. In the wake of the summer riots, however, the government said it may review the OSA's powers and strengthen them to take into account how social media was used during the unrest.
In a note to ministers accompanying Wednesday's statement, Mr Kyle also wrote:
“An area of particular focus for the government is the huge amount of misinformation and disinformation that users may encounter online. Services must have strong policies and tools in place to combat misinformation and disinformation, while maintaining legitimate debate and free speech online. The growing presence of disinformation poses a unique threat to the UK's democratic processes and social cohesion, and must be decisively countered. Services must also remain agile in the face of new information threats, able to respond reliably and to minimise the negative impact on users, especially vulnerable groups.”
DSIT's intervention will shape how the law is enforced, as Ofcom is required to report against the government's priorities.
Ofcom, the regulator tasked with overseeing internet platforms' and services' compliance with the OSA, has spent more than a year preparing for the law's introduction, running consultations and producing detailed guidance in areas such as age verification technology.
The regime is expected to come into force next spring, when Ofcom will gain powers to fine tech companies that fail to comply with the law's duty of care up to 10% of their global annual turnover.
“What I want to do is look at the evidence,” Kyle said of children and social media, noting that he will also launch a “feasibility study” that will “look at areas where there is a lack of evidence.”
According to DSIT, the study will “examine the effects of smartphone and social media use on children, strengthening the research and evidence needed to create a safer online world.”
“We have a hypothesis about the impact [social media] is having on children and young people, but there's no hard peer-reviewed evidence,” Kyle also told the BBC, suggesting any UK ban on children's social media use must be evidence-based.
During an interview with the BBC's Emma Barnett, Kyle was also pressed on what the government has been doing to address gaps he previously suggested exist in the Online Safety Act. He responded by flagging proposed amendments that would require platforms to be more proactive about intimate image abuse.
Tackling intimate image abuse
In September, DSIT announced that sharing intimate images without consent would become a “priority offence” under the OSA, meaning social media and other in-scope platforms and services must proactively crack down on the abusive practice or risk hefty fines.
“This move effectively raises the severity of intimate image abuse offences under the Online Safety Act, requiring platforms to proactively remove the content and prevent it from appearing in the first place,” confirmed DSIT spokesperson Glenn McAlpine.
In further comments to the BBC, Kyle said the changes would require social media companies to use algorithms that prevent intimate images from being uploaded in the first place.
“They will have to proactively prove to our regulator, Ofcom, that their algorithms will stop that content from getting through in the first place. And if an image does make it online, it must be removed as soon as can reasonably be expected after a complaint is received,” he said, warning of “hefty fines” for non-compliance.
“This is one of those areas where you can see harm being prevented before it happens, rather than it getting out into society and us then having to deal with it, as has happened before,” he added. “Thousands of women are now protected, and have avoided humiliation, degradation, and even suicidal thoughts, because of that one mandate I introduced.”
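Neither the Act nor Kyle's comments prescribe how such pre-upload screening should work. A common industry approach, however, is hash-matching uploads against a database of images already flagged by moderators or complaints. The sketch below illustrates the idea under stated assumptions: it uses a hypothetical in-memory database and plain SHA-256, which only catches byte-identical copies, whereas production systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical database of hashes of images flagged via complaints.
# Illustrative only: real systems use perceptual hashes that tolerate
# re-encoding; SHA-256 matches byte-identical files exclusively.
KNOWN_ABUSE_HASHES: set[str] = {
    hashlib.sha256(b"previously-flagged-image-bytes").hexdigest(),
}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a flagged image and should be
    blocked before publication, False if it may proceed."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES

def register_complaint(image_bytes: bytes) -> None:
    """After a complaint is upheld, add the image's hash so future
    uploads of the same file are blocked proactively."""
    KNOWN_ABUSE_HASHES.add(hashlib.sha256(image_bytes).hexdigest())

print(screen_upload(b"previously-flagged-image-bytes"))  # True: blocked
print(screen_upload(b"new-image-bytes"))                 # False: allowed
```

In this model, the "proactive" duty Kyle describes corresponds to the check at upload time, while the complaint-handling duty corresponds to adding newly reported images to the database so repeat uploads are caught automatically.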