Finished reading Keeping Children Safe in Education and looking for a new 120,000-word read? Then how about the Online Safety Act? It may not sound like a barrel of laughs, but so much in the new law will shape the digital world your students inhabit and the online harms they face that it is worth keeping an eye on developments.
Six years ago, I took a group of teachers to Whitehall for a focus group on the online harms seen by schools. The green paper consultation this was part of led to a white paper, various drafts and revisions of a bill, lots of controversy in the meantime, and eventually the act. Today, a few weeks after the law received royal assent, regulator Ofcom has released guidance and proposed codes of practice for the online platforms it now regulates.
Why do schools need to be interested? Well, Ofcom’s latest research shows that three in five secondary-aged students have been made to feel uncomfortable online, with nearly one-third reporting unwanted friend requests and one in six either being sent or asked for nudes or semi-nudes. The regulator’s CEO, Dame Melanie Dawes, sums it up well: “If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house.”
The new proposals put flesh on the bones of the legislation, with lots of measures to protect children. These include hiding child accounts from adults, keeping children’s friend lists and locations private, and not ‘suggesting’ under-18s as friends. Given the scattergun approach many groomers use to befriend children online, this is a great step.
Like many other measures, however, this will only work where platforms know the age of their users. Currently, it is pretty easy to give a false date of birth or answer yes to “are you over 18?”. But the new law brings in a duty for platforms to carry out “highly effective” age checking, especially for the most harmful content (the current ‘primary priority content’ areas are pornography and the encouragement of suicide, self-harm and eating disorders).
As the new codes of practice are developed, we can hopefully look forward to a new ecosystem of apps which are appropriate for children, keeping them safe from many other harms too. This is helped by new duties of care that apply to all sites likely to be used by children. Up until now, it has been enough to say that your site is not intended for under-18s.
We won’t see change overnight, but even before enforcement begins towards the end of 2024, companies are already reacting, for example by quietly releasing new systems of (mostly voluntary, so far) age and identity verification to test the waters.
It would be naïve to think this will mean schools will be able to climb off the merry-go-round of class WhatsApp crises, letters home about under-age app use and concerns over inappropriate sexual material accessed and shared online. But the new legislation means the future is brighter than it was a few weeks ago. There is hope that the previous reliance on parents and schools will be bolstered by industry best efforts and regulatory teeth. The potential fines of up to £18 million or 10 per cent of revenue should certainly make most platforms pay attention.
There is plenty still to be worked out over the next year, and personally I would like to see more about parental controls, which are often hard to use. I’d also like to hear more about exemptions for smaller sites, which can do disproportionate damage. For example, sites that encourage eating disorders have by nature a niche user-base but can cause enormous harm.
In the meantime, schools can get on the front foot by informing pupils and parents of the new provisions under the act. A responsibility to keep children safe continues to fall on schools, and honest conversations about risks and harms can go a long way to shift a culture that continues to put children at risk.