COVID-19: ‘Ghost workers’ and the content moderation industry

As the global population reacts to the spread of the coronavirus, people are clearly spending more time online. Internet traffic is up by 70 per cent in virus-hit Italy and 50 per cent in the UK. As more people stay home and self-isolate, they are turning to social media sites such as Facebook, Instagram and TikTok, seeking virus-related news updates, connection with friends and family, and comic relief from the anxieties of living through a global pandemic.

However, less obvious is the virus’s impact on the ‘ghost labourers’ who work for the social media industry, namely the content moderation workers who keep our feeds free of violent, harmful and pornographic content. Roberts (2017) defines ‘commercial content moderation’ as “the organised practice of screening user-generated content posted to internet sites, social media and other online outlets, in order to determine the appropriateness of the content for a given site, locality or jurisdiction”. Content moderators work around the clock to remove harmful content and ensure a ‘seamless’ experience for users. A documentary on content moderators in Bangalore revealed that workers are expected to screen 2,000 photos per hour, with around 20 per cent of these breaking site guidelines. Many moderators are exposed to horrific and disturbing images and videos, including beheadings and child pornography, leading to workers suffering from PTSD. Content moderators for Facebook and YouTube are even required to sign a disclosure acknowledging that their work may impact their mental health.

Much content moderation is outsourced to agencies that hire contract workers in parts of the United States and Europe, but increasingly in India and the Philippines. These two countries have become major content moderation outsourcing hubs, partly due to their existing call centre infrastructure and the English-language capacity of their populations. In India, these workers are based in cities such as Bangalore, Hyderabad and Gurugram, and are contracted by companies such as Accenture or Genpact. Official statistics on content moderation are notoriously difficult to find, owing to the secretive and invisibilised nature of the work and platforms’ desire to present an image of ‘unedited’ content. This has meant that ‘basic facts around the industry remain a mystery’. However, Facebook recently stated that it employed 15,000 content moderators globally.

As a reaction to the coronavirus pandemic, Facebook required its own staff to work from home. Initially, however, this policy did not extend to content moderators, who are contracted on an hourly wage basis. Treated as ‘second-class workers’, these outsourced moderators were expected to keep coming to the office, since privacy and legal concerns prevent them from working from home.

However, after one week, moderators were also told to stay home to limit the spread of the virus, and much content moderation is now being conducted by AI. While it was always assumed that AI would eventually take over content moderation for social network sites, there are several reasons why this has not happened yet. Building algorithms that can identify images, videos and text they have not previously encountered is difficult. What counts as acceptable content varies vastly between countries, meaning each jurisdiction would require its own system. Finally, access to cheap labour in the global South means that companies can continue to hire human moderators, who are more adept than current AI technologies at identifying unacceptable content.

Now that AI is increasingly being used for content moderation, sites such as YouTube are warning users that they may see more videos removed, even ones that do not violate policies. Regulating user-generated content involves nuanced decision-making that is difficult even for human moderators, who must balance site policies, cultural differences and ambiguous content when deciding what to remove. Automated systems take time to translate algorithmic learning into accurate moderation decisions. One significant advantage of AI systems over human moderators, however, is the scale at which AI can comb through content. But is quantity over quality really the best strategy for content moderation? As Yuval Noah Harari reminds us, during global emergencies, “immature and even dangerous technologies are pressed into service, because the risks of doing nothing are bigger”. While this may be necessary in times of crisis, it should not be the long-term solution once the world starts to recover.

What is the future for human content moderators? For those contracted as content moderators in the Philippines and India who rely on this income, there are concerns that their jobs may be automated sooner rather than later. With AI being heavily relied upon during the pandemic, and potentially rolled out more widely in the near future, content moderators working for Accenture told Reuters that they worried they would receive an email in a few weeks saying “you don’t have to come back to work because you don’t have a job”. Facebook has stated that, for contract workers sent home as a reaction to the pandemic, “We’ll ensure that all workers are paid during this time”. However, it is not clear how long workers will be compensated, or whether their jobs will be secure in the future. Other platforms have not clarified the situation for their content moderators.

Although content moderation is, in many ways, a substandard job that exposes workers to psychologically damaging content whilst invisibilising their labour, workers are dependent on this income. Because content moderators are usually hired as contract labourers rather than full-time employees, they are not afforded the full status, security or pay of a regular employee at a big tech company such as Facebook or Google, leaving them in a precarious financial position.

In India, youth unemployment has hovered around 10 per cent since 2012 and will undoubtedly rise as a result of the coronavirus: industry groups estimate that tens of millions of jobs will be lost. Those who lose their jobs may therefore find it very difficult to secure another source of income in the current climate of uncertainty.

Although no one knows what the coming weeks will bring as the coronavirus runs its course, content moderators have long known that their jobs are at risk of being automated. This may just happen sooner than anyone could have imagined.

This article was first published by the authors under the auspices of the erstwhile Tandem Research.
