A “toxic” TikTok trend has left tormented pupils reluctant to attend school and their teachers yet again facing a drawn-out process to remove “derogatory and defamatory” videos.
In a new “Guess Who” TikTok trend, anonymous profiles, which in many cases include the school name, post videos with clues to a classmate’s identity.
These clues usually have the pupil’s initials or year group, but also often contain references to physical appearance, sexuality and personality. Some accounts also encourage other pupils to tag the victims in the comments section.
The Association of School and College Leaders (ASCL) will raise the problems with TikTok as school leaders say they are left “banging our heads against a brick wall” trying to get videos removed.
Schools Week has uncovered accounts for seven schools that have racked up more than 210,000 views.
TikTok says it is monitoring the situation and will remove all videos found to violate its bullying and harassment policies.
The social media giant faced scrutiny last year for its failure to quickly remove offensive videos that targeted teachers.
TikTok accounts rack up thousands of views
Brockington College in Enderby, Leicestershire, battled for weeks to get three offensive “Guess Who” accounts removed.
Despite descriptions such as “big nose”, “overweight”, “bad teeth” and “has chlamydia”, the school said TikTok found these did not violate its terms of use.
TikTok advises schools to contact the Professionals Online Safety Helpline (POSH) run by the UK Safer Internet Centre.
Following the teacher scandal, TikTok provided undisclosed funding to POSH, saying it was the fastest way for teachers to ensure reported content was investigated.
But Michael Jones, Brockington’s IT manager, said he contacted POSH on April 19, but the accounts were only removed this week – after he included the media and MPs in follow-up emails.

The helpline told the school the issue had been logged, but that this did not necessarily mean the accounts would be removed quickly, Jones said.
He said the “time-consuming” process felt “a little bit futile” and left the school “banging our head against a brick wall”.
Andy Burrows, the NSPCC’s head of child safety online policy, said TikTok needed to demonstrate it was on top of new trends that could harm children “and take appropriate action”.
A TikTok spokesperson said bullying and harassment had “no place on TikTok” and accounts that violated community guidelines had been removed.
The longer the accounts remain active the more views they rack up – with the Brockington accounts accumulating more than 63,000 views.
Dr Mary Bousted, the joint general secretary of the National Education Union, said the trend was “toxic”. It was “ridiculous” that schools had to go to “inordinate lengths” to get the material removed.
Victims fearful of attending school
Last week, online safety app Safer Schools published a briefing alerting leaders to the trend, which it said was used to “hurt, humiliate and bully”.
Schools Week uncovered examples of homophobic and transphobic insults, with rumours that individuals cheated on their partners.
Safer Schools said it found clues that included racist comments and accusations of sexual assault.
It warned that cyber-bullying could contribute to mental health disorders, substance misuse and suicide.
Jones said the accounts were “having a really damaging effect” on pupils’ mental health and left victimised pupils “concerned about coming into school”.
Last month, it was reported a teenager in St Helens was left “terrified” of going back to school after being targeted, while a Surrey parent whose child faced unfounded allegations of sexual misconduct said the trend created a “desperation” among victims and could lead to “self-harm”.
Attempts to remove the content have so far proved fruitless.
Geoff Barton, the general secretary of ASCL, said the union was “disturbed” to hear about the videos.
“There is no excuse for posting anything online that amounts to bullying. We will be raising this with TikTok and asking them to take down the offending accounts.”
Online Safety Bill
Social media sites are supposed to be responsible for identifying and removing harmful content.
The proposed Online Safety Bill, currently at the committee stage, will appoint Ofcom as the regulator for online safety. It will have the power to tell companies which content is acceptable.
Those failing to protect people would answer to Ofcom and face fines of up to 10 per cent of their revenues.
In the most serious cases, sites could be blocked in the UK.
The government said the new laws would ensure stronger protection from harmful activity for children, such as bullying.