How AI Tackles Emerging Content Challenges


December 18, 2023
3 min read
By Cogito Tech.
310 views

Today there is a pressing need for social media management in the online space because of the growing volume of one-on-one interaction with third parties, who may be followers, customers, or complete strangers.

Content moderators have to ensure that real people receive genuine responses by screening online comments for profanity, obscenity, and offensive language touching on religion, political extremism, or outright hate speech.

Content moderation involves putting controls in place on social media platforms such as Facebook, Twitter, LinkedIn, and Tumblr to handle content that is not suited to a general audience.

Once content is posted, moderators find and remove offending items before they are widely viewed and go viral. Moderation generally takes one of three forms:

1. Pre-moderation: This type of moderation is used primarily in blogs and online forums rather than on social media. Comments are queued up for approval prior to posting.

2. Post-moderation: This involves reviewing comments after they have been posted.

3. Automated moderation: This involves using a content moderation tool to block a list of specific words or phrases (a minimal sketch of such a filter follows this list).
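To make the third approach concrete, here is a minimal, illustrative sketch of a list-based filter of the kind such tools rely on. The blocklist entries and function names are hypothetical, not taken from any particular moderation product.

```python
import re

# Illustrative blocklist only; a real tool would load a curated, maintained list.
BLOCKED_PHRASES = ["badword", "another slur", "buy followers now"]

# One case-insensitive pattern with word boundaries, so e.g. "class" is not
# flagged just because it contains "ass".
_pattern = re.compile(
    r"\b(" + "|".join(re.escape(p) for p in BLOCKED_PHRASES) + r")\b",
    re.IGNORECASE,
)

def is_allowed(comment: str) -> bool:
    """Return True if the comment contains none of the blocked words or phrases."""
    return _pattern.search(comment) is None

if __name__ == "__main__":
    for comment in ["hello there", "You should BUY FOLLOWERS NOW"]:
        print(comment, "->", "published" if is_allowed(comment) else "held for review")
```

As the article notes, this kind of rote matching is all automation can reliably do on its own; anything requiring context still falls to people.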

The growing requirement for content moderation has led companies to invest in machine learning algorithms, natural language processing, and other automation tools. Paradoxically, automation has led to the deployment of more people, not fewer.

Companies need workers to annotate image datasets and other media used to train these tools, and humans must cross-check whether the algorithms got their decisions right. The hope has been that automation would take over repetitive, annoying, or unpleasant labor; in the case of content moderation, that hope has simply been shoved aside.

Automation can only handle rote cases such as spam or content that already exists in a database. Moderation demands nuance along with linguistic and cultural competence. Take symbols, for instance: does a particular symbol carry a special meaning, or is it just a symbol like any other? A black sun (a Nazi symbol) may look like a simple geometric design to anyone unfamiliar with its context, so machines cannot be entrusted to make that distinction.

Challenges of Using AI in Content Moderation

Artificial intelligence (AI) is indispensable here, both because of the sheer scale of online content that needs moderating and because of its ability to identify coordinated inauthentic behavior, i.e. integrated networks of “bots”. Nonetheless, issues remain with over-dependence on automated tools.
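The article does not describe how such coordination is detected; one common heuristic, sketched below purely as an assumption, is to flag groups of accounts that post near-identical text within a short time window. The data layout, threshold values, and function names are all hypothetical.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivially varied copies match."""
    return " ".join(text.lower().split())

def find_coordinated_groups(posts, window_minutes=10, min_accounts=3):
    """posts: list of (account_id, timestamp, text). Return suspicious groups."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[normalize(text)].append((ts, account))

    groups = []
    window = timedelta(minutes=window_minutes)
    for text, items in by_text.items():
        items.sort()
        accounts = {acct for ts, acct in items}
        # Many distinct accounts posting identical text inside a tight window
        # is treated here as a crude proxy for a bot network.
        if len(accounts) >= min_accounts and items[-1][0] - items[0][0] <= window:
            groups.append((text, sorted(accounts)))
    return groups

if __name__ == "__main__":
    now = datetime(2023, 12, 18, 12, 0)
    sample = [
        ("bot_1", now, "Vote YES on measure 9!"),
        ("bot_2", now + timedelta(minutes=1), "vote yes on measure 9!"),
        ("bot_3", now + timedelta(minutes=2), "Vote yes on  measure 9!"),
        ("human", now, "Lovely weather today."),
    ]
    print(find_coordinated_groups(sample))
```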

According to David Kaye, co-director of the UCI Fair Elections and Free Speech Center, social media companies “use the power of artificial intelligence to drive these systems, but the systems are notoriously bad at evaluating context.”

Because algorithmic identification is imprecise, it misses the contextual cues that may be essential for distinguishing extremist speech from parody, documentary footage, or legitimate protest. As a result, speech intended to challenge or lampoon hate speech may itself be removed. AI can only be as good as the data it analyzes, and it relies on large datasets that may contain only information produced through biased methods; consequently, AI tends to reproduce bias against disadvantaged populations.

Deploying AI to flag content also carries a risk of over-censorship, and it limits the extent to which human moderators can take context into account. Since human moderators generally spend less than a minute reviewing a post, they cannot form a wider view of the circumstances: they see only one post at a time and have no visibility into the mass of other posts from the same location. They may therefore miss the fact that posts can be repetitive and numerous, producing cumulative effects.

Steps to Overcome Challenges Posed by AI

1. Pause the shift to AI and review its implications: Companies must pause their shift towards AI and conduct an in-depth, transparent, and independent review of its implications. They should disclose the false-positive rates of their flagging algorithms and investigate how much hateful content remains online because of how those systems are coded.

2. Encourage companies to shift away from a one-size-fits-all model: Programmers must build different models that take into account the conditions prevailing in each country, and machines and humans must be integrated into a hybrid system of social ordering.

3. Companies must take a pluralistic approach to content moderation: A pluralistic strategy should include sub-regional lists of slurs and hateful expressions drawn up in consultation with sociolinguists and other experts. When reviewing content, moderators must be permitted to consider external information about the political and cultural context of the country, particularly any indication of a risk of violence against an individual or group (a rough illustrative sketch follows this list).
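The sketch below is a rough, hypothetical illustration of the second and third recommendations: instead of one global blocklist, the tool keeps sub-regional lexicons (reduced here to placeholder entries) compiled with experts, and routes ambiguous matches to human moderators alongside contextual risk signals rather than removing them automatically. Every name, entry, and threshold here is an assumption.

```python
# Placeholder lexicons standing in for expert-curated, sub-regional lists.
REGIONAL_LEXICONS = {
    "region_a": {"placeholder_slur_a1", "placeholder_slur_a2"},
    "region_b": {"placeholder_slur_b1"},
}

# Hypothetical per-region flags supplied by local experts.
RISK_OF_VIOLENCE = {"region_a": True, "region_b": False}

def route_post(text: str, region: str) -> str:
    """Return 'remove', 'human_review', or 'publish' for a post from a given region."""
    lexicon = REGIONAL_LEXICONS.get(region, set())
    tokens = set(text.lower().split())
    if tokens & lexicon:
        # Where local experts report an elevated risk of violence, act immediately;
        # otherwise let a human moderator weigh the political and cultural context.
        return "remove" if RISK_OF_VIOLENCE.get(region, False) else "human_review"
    return "publish"

if __name__ == "__main__":
    print(route_post("placeholder_slur_a1 appears here", "region_a"))  # remove
    print(route_post("placeholder_slur_b1 appears here", "region_b"))  # human_review
    print(route_post("an ordinary post", "region_a"))                  # publish
```

The point of the design is the one the article makes: the machine handles the lookup at scale, while the human moderator retains the contextual judgment.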

In summary

Even though content moderation has its merits and demerits, it makes sense for companies running digital platforms to invest in it. When the moderation process is enforced at scale, it allows a platform to serve as a repository for large volumes of user-generated content. Content moderation makes it possible to publish content in vast quantities while still protecting users from malicious and undesirable material such as political extremism and hate speech.

