In 2010 the Conservative Party manifesto pledged to “reverse the commercialisation of childhood”. Three years after the 2019 government white paper argued that existing regulatory and voluntary initiatives had “not gone far or fast enough”, the government has published 225 pages of new draft law to regulate the internet: the online safety bill.
Companies will be required to comply with an overarching duty of care, enacted through a raft of obligations borrowed from the offline legal framework for health and safety. There are threats of enormous fines and of holding executives criminally liable for content. But while this might sound straightforward, it isn’t.
Clear, enforceable definitions are missing so far. What’s legal offline might become unacceptable online where content is labelled “lawful but harmful”, and much of the detail is delegated to ministers or to Ofcom for further legislation or yet-to-be-written codes of practice. And some things that are clear are ill-thought-through. For example, the proposed obligations to act on complaints appear to ignore the real risk that groups with opposing views could weaponise account take-downs. Edu-Twitter beware!
More fundamental even than these consequences for ministerial powers and freedom of speech, the bill takes away from us, as internet users, the responsibility for assessing our own capability, risks and choices about the content we access, and gives it instead to companies. By encouraging the use of “behaviour identification technology” and profiling, it treats us merely as consumers, not creators. In the effort to protect children, the bill infantilises us all.
Yet when it comes to children, the proposals are neutered or tackle the wrong end of the problem. The section dedicated to the promotion of media literacy has been cut altogether from a previous draft and, instead of banning the systemic use of children’s data to target advertising, the bill creates a duty to stop children encountering inappropriate content in search engine results.
But online ad space is often sold via complex automated workflows that buy access to readers in real time, based on a variety of data such as shopping patterns. Will companies at one end of the chain be prosecuted for non-compliant content created by someone else at the other? Achieving a workable definition of who is “responsible” for content “likely to be accessed by children” will be hard.
Some global social media companies already use algorithms to try to retrospectively identify self-harm or suicide content, but the intent behind content is extremely hard to judge in context. Posts about recovery or online counselling services, or sites with evidence of war crimes and human rights abuses, are inevitably over-removed by automated tools that fail to make accurate distinctions.
Further, the bill demands age verification (AV). While presented as protection for children, in practice AV can mean everyone must give up their right to privacy and hand over personal details to commercial companies in order to create a credential identifying themselves to websites as not-a-child, or as a child of a defined age group. Not only can AV be worked around; it drives the opposite of the original intent.
The upshot is that the bill will not reverse but embed “the commercialisation of childhood” from an early age. The integrity of our identity-for-life will depend on those companies’ security. Either that, or some websites might decide not to offer services to children here at all rather than incur the costs of installing digital gatekeepers or tools to determine users’ location.
Young people’s own views have been ignored. A 2019 survey by the Australian e-safety commissioner found that over half of the young people surveyed (57 per cent) were uncomfortable with background monitoring processes, and 43 per cent were unsure whether such tools actually make them safer online.
Meanwhile, the departments jointly behind the bill pursue their policy goals. While other countries ban general surveillance, the Home Office stands accused by the Internet Society of pursuing “an internet that is more insecure and unsafe” in its efforts to undermine encryption. And the Department for Digital, Culture, Media and Sport is busy promoting the UK as a world-leading seller of safety tech, with a highly anticipated export market.
Commercialisation of childhood, indeed. Now with added British values.