Opinion

Supporting students to take control of leaked explicit images

17 Aug 2021, 5:00

A new tool helps young people remove explicit images of themselves from the internet, but it is only a final step in the safeguarding process for victims, writes Suzanne Houghton

Recently, the Internet Watch Foundation (IWF) and Childline announced the creation of a ‘world first’ service that allows minors to request the removal of leaked nude images or videos of themselves from the internet. For the first time, people under the age of 18 who are concerned that a nude photo of them is online – or could end up there – can flag the content, and it may be removed if it breaks the law.

How does the tool work?

The Report Remove tool is available through Childline’s website and can be used by any young person under the age of 18. To report content, a young person verifies their age, and Childline ensures they are safeguarded and supported throughout the process (unless they wish to remain anonymous).

Once the content has been flagged, the IWF will examine the image and work to remove it if it breaks the law. If the content has not yet appeared online but a young person is worried that it might, a digital fingerprint, or ‘hash’, can be created and shared with tech companies and law enforcement around the world to prevent it from being uploaded and shared.

The IWF keeps a regularly updated ‘Hash List’ of these digital fingerprints, which is sent daily to subscribers. The aim is that, through the list, companies will be able to automatically match known images and videos before they appear on their services, preventing them from being shared online. Subscribers to the Hash List will include technology companies such as filtering providers, hosting and file-sharing services, social media companies, chat services and data centres.
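To illustrate the underlying idea: a hash is a short fingerprint computed from a file, and a service can compare the fingerprint of a new upload against a list of known fingerprints without ever holding the image itself. The sketch below is a minimal illustration of that matching step only, using an ordinary cryptographic hash (SHA-256) and made-up placeholder values; the real Hash List uses industry hashing formats and workflows that are not described here.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known fingerprints, standing in for entries on a hash list.
# The value below is a placeholder for illustration only.
KNOWN_HASHES = {
    "placeholder-known-hash-value",
}

def fingerprint(path: Path) -> str:
    """Compute a SHA-256 digest of a file's bytes: a simple digital fingerprint."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known(path: Path) -> bool:
    """Check an upload's fingerprint against the set of known hashes."""
    return fingerprint(path) in KNOWN_HASHES

if __name__ == "__main__":
    upload = Path("upload.jpg")  # hypothetical file name
    if is_known(upload):
        print("Match found: block the upload and flag it for review.")
    else:
        print("No match against the known-hash list.")
```

A plain cryptographic hash like this only matches byte-identical files; the perceptual hashing technologies used in industry are designed to also catch resized or re-encoded copies, which is why a hash list can help block an image even after it has been altered.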

How can schools support students in these situations?

Conversations must emphasise that this type of image sharing is not the norm

The practice of sharing explicit images among young people unfortunately appears to be growing. The IWF reported that 38,000 self-generated images were registered in the first three months of this year, more than double the number recorded in the same period last year.

While use of Report Remove is limited to those who are the subjects of the content, schools and education providers will play an important role in making students aware of the tool, where it can be accessed and how it can be used.

Often, the motivation to share explicit photos comes from external pressure, so it’s vital that any conversations on the topic emphasise that this type of image sharing is not the norm and, contrary to popular belief, not ‘everyone does it’. Remind students that if they feel they are being pressured into swapping nude images, or have been a victim of this, they must alert a trusted adult, whether a family member or a teacher.

Every school and college must have an effective child protection policy in place that sets out the steps to take in these situations. Government guidance recommends that such incidents be reported to a designated safeguarding lead, who should then hold a review with all appropriate staff, including those who initially heard the disclosure.

Following this, the young person in question should be interviewed, with parents and carers informed at an early stage, unless there is reason to believe this would place the young person at risk of harm. A referral should also be made to children’s social care and the police if there is concern that the young person has been harmed or is at risk of harm at any stage.

As part of the interviews with the young person involved, it would now be appropriate to recommend they access the Report Remove tool to take action to remove the image from the internet.

The sharing of explicit images is a growing problem among an increasingly digitally savvy generation, and education providers need to respond quickly and with sensitivity. These incidents can leave those affected feeling powerless, with their own images out of their control; let’s hope Report Remove helps to give them back some sense of control.
