TikTok Restructuring Puts Hundreds of UK Jobs at Risk Amid Shift to Automated Moderation

Web Reporter
Hundreds of jobs in the UK are under threat after TikTok confirmed plans to restructure its content moderation operations, shifting work to other European hubs as part of a wider overhaul of its Trust and Safety division.

The social media platform, which has more than one billion users globally, said the move is part of an ongoing reorganisation aimed at consolidating operations in fewer locations and increasing reliance on automated systems to police harmful content.

“We are continuing a reorganisation that we started last year to strengthen our global operating model for Trust and Safety, which includes concentrating our operations in fewer locations globally,” a TikTok spokesperson said.

The Communication Workers Union (CWU) sharply criticised the decision, accusing the company of putting profit ahead of responsibility. “TikTok workers have long been sounding the alarm over the real-world costs of cutting human moderation teams in favour of hastily developed, immature alternatives,” said John Chadfield, CWU National Officer for Tech. He added that the announcement comes at a critical moment, as staff prepare to vote on union recognition.

TikTok defended the restructuring, saying it would boost the “effectiveness and speed” of moderation while reducing the mental toll on staff exposed to harmful or distressing content. The company noted that its automated systems already remove around 85% of posts that breach platform rules before a user sees them.

Workers in London’s Trust and Safety unit, along with colleagues in Asia, are expected to be most affected. Staff facing redundancy will be able to apply for alternative roles within the company and will be prioritised if they meet basic criteria.

The shake-up comes as the UK tightens regulation of online platforms under the new Online Safety Act, which came into effect in July. The law requires technology firms to improve protections for users, including robust age verification, and threatens fines of up to 10% of global turnover for non-compliance.

TikTok has rolled out additional parental controls in response, allowing guardians to block accounts and manage teenagers’ privacy settings. Yet concerns remain over the platform’s safety record and handling of user data. In March, the UK’s Information Commissioner’s Office launched a major investigation into the company’s practices.

The restructuring underscores a growing dilemma for social media firms: balancing the efficiency of automated systems with the judgment and nuance offered by human moderators. Critics warn that over-reliance on technology risks missing subtle or emerging threats, while companies argue automation is vital to cope with the sheer scale of content.

For TikTok, the decision lands at a sensitive moment, with regulators increasing scrutiny and unionisation efforts gaining momentum. Whether its strategy to cut human moderation in favour of automated tools can reassure both workers and watchdogs remains an open question.
