
Study: Consumers want humans moderating content online, not just AI

by Richard Carufel | Nov 9, 2022 | Public Relations

The vast majority (92 percent) of consumer respondents in a new survey from digital CX innovator TELUS International believe it is very or somewhat important to have humans reviewing online content rather than relying on AI alone. Of those respondents, nearly three-quarters (73 percent) said AI cannot understand or distinguish context and tone as well as a human can.

“AI continues to become increasingly sophisticated in detecting digital content that goes against brand standards and community guidelines and has proved to be a great first line of defense against harmful content—but, with new content types constantly emerging and the increased use of algospeak, it is practically impossible for AI to keep pace,” said Siobhan Hanna, managing director, AI data solutions at TELUS International, in a news release. “There continues to be a need for humans to handle more contextual decisions, as AI can only go so far in accurately making the sometimes difficult decisions about the intent behind a particular phrase or image. By employing a human-in-the-loop approach, brands can benefit from the speed and efficiencies of AI, while at the same time, ensuring nuanced content is accurately reviewed.”
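Hanna's description of a human-in-the-loop workflow, where AI serves as a fast first pass and ambiguous or nuanced items are escalated to people, can be pictured with a short sketch. This is a minimal illustration only, not TELUS International's actual pipeline; the classifier, confidence threshold and queue names below are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical human-in-the-loop moderation flow: an AI model scores each post,
# clear-cut cases are handled automatically, and low-confidence (nuanced) cases
# are routed to a human review queue.

CONFIDENCE_THRESHOLD = 0.90  # assumed cutoff; real systems tune this per policy


@dataclass
class ModerationResult:
    label: str         # e.g. "allow" or "remove"
    confidence: float  # model's confidence in the label


def classify(post_text: str) -> ModerationResult:
    """Stand-in for an AI moderation model; returns a label and a confidence."""
    # A real system would call a trained classifier here.
    flagged = any(word in post_text.lower() for word in ("spam", "scam"))
    return ModerationResult("remove" if flagged else "allow",
                            0.75 if flagged else 0.97)


def moderate(post_text: str, human_queue: list) -> str:
    result = classify(post_text)
    if result.confidence >= CONFIDENCE_THRESHOLD:
        return result.label                # AI handles the clear-cut case
    human_queue.append(post_text)          # nuanced case goes to a human reviewer
    return "pending_human_review"


queue: list = []
print(moderate("Great deal, totally not a scam!", queue))         # pending_human_review
print(moderate("Loved this recipe, thanks for sharing.", queue))  # allow
```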

Content moderation becoming increasingly difficult

The survey also found that more than half of respondents (53 percent) believe it has become harder for brands and social/gaming platforms to monitor the content on their sites over the past year. The top reasons they believe it has become harder are:

  • There are more people on each platform/channel (66 percent)
  • It’s becoming more commonplace to air grievances online (54 percent)
  • Younger generations are more digitally inclined (50 percent)
  • Content is being posted in more languages (29 percent)
  • 5G connectivity has enabled increased access to digital channels around the world (19 percent)

“With more people online across a variety of platforms and in many different languages, content moderation cannot be done effectively by AI or humans alone,” continued Hanna. “A robust content moderation strategy that leverages a blend of AI, whose algorithms have been built on a foundation of trusted datasets by a diverse team of annotators, and human review helps ensure that the data is accurate, context is properly taken into account and bias is responsibly mitigated.

“AI content moderation tools will only continue to improve, but human moderators will always be a necessary and valuable resource in ensuring safe online spaces for all. For this reason, it’s important that brands support content moderators with a robust wellness program to enable and empower them to perform their best work, while at the same time protecting their mental and physical health.”

The findings are based on a Pollfish survey conducted on Aug. 11, 2022, that included responses from 1,000 Americans.

Richard Carufel
Richard Carufel is editor of Bulldog Reporter and the Daily ’Dog, one of the web’s leading sources of PR and marketing communications news and opinions. He has been reporting on the PR and communications industry for over 17 years, and has interviewed hundreds of journalists and PR industry leaders. Reach him at richard.carufel@bulldogreporter.com; @BulldogReporter
