UGC-based services where content moderation is a necessity

UGC - a problematic cornerstone of Web 2.0

In the era of consumer-generated media, creators no longer have to be sought out. Today it's exactly the opposite - content creation has become a piece of cake, and the tools are available to everyone. User-generated content is the cornerstone of Web 2.0.

There are plenty of ways to turn the real into the virtual: taking a photo or video with a smartphone, capturing a Snap, or, for the sake of immersion, live-streaming an event with a GoPro camera. Reality captured this way needs only a platform - and the whole world can view it.

While the switch to Web 2.0 is generally considered a valuable breakthrough, this new paradigm poses next-level challenges, too. This is especially the case with content-creation features on mobile devices. With these on board, however private or fleeting a human experience is, it can be turned into a piece of content and, one way or another, make the news.

In the process, some ground rules can be broken. Any type of content can therefore be published, including the most disturbing kinds, usually labeled NSFW (Not Safe For Work). This general qualification covers nudity, pornography, gore, and hate speech, expressed in words or images, and much more.

Therefore, managers of Web 2.0 platforms need to employ some level of content moderation for images, videos, posts, and so on. Let's see which types of Web 2.0 services and features have an explicit need for moderation.

Features that need content moderation

  • Services using a personal profile feature: a virtual avatar is at the very heart of Web 2.0 and essential to personalizing the experience. On the other hand, some users' drive for publicity may result in posting objectionable content just to make the news. The attention a virtual avatar draws can feel no less real than attention received in person. The general rule is that the more controversial the content, the more attention it draws.
  • Online chatting: there is a certain thrill in talking to a complete stranger, face to face, via a webcam. This was the main source of inspiration for the creators of services such as Omegle or Chatroulette. The latter also learned the necessity of content moderation the hard way. Randomness is a wonderful aspect of our big world, and experiencing it is a great opportunity. However, if a service is flooded with NSFW content, the experience becomes ethically objectionable - and, often, simply dull.
  • Knowledge bases, encyclopedias, etc.: for example, those built on the omnipresent wiki model. Since wiki engines are open-source software, anyone can start such an encyclopedia, and image moderation is optional. Because wiki encyclopedias are meant to be developed collaboratively, there is a danger that the website will be flooded with objectionable content.
  • What about instant messaging apps and features? These may be an exception to the rule, because such apps allow private conversations between consenting people of an appropriate age. Therefore, for instance, there seems to be no need to moderate all nudity. On the other hand, some objectionable content is also illegal, and the manager of the service needs to make sure that its dissemination is not possible.
  • What about dating sites and apps? After all, this type of software is explicitly for adults only. However, that does not mean they should offer a free-for-all space! Whatever the users are after, some rules on what kind of content is acceptable are usually set. At the same time, the adult character of such apps may necessitate a particularly tight and effective moderation system.
  • Ready for a ride off the rails? There is a type of social media that specifically features anonymity of the content shared via the app, such as Whisper or Secret. These apps embrace users' need to expose themselves and provide an infrastructure for doing so. "The type of space we're trying to create with anonymity is one where we're asking users to put themselves out there and feel vulnerable," Whisper's CEO told Wired.

Beyond black and white


To blur the picture a bit more, we may add that even the shadow side of Web 2.0 is complex. Some of the content that's made public is definitely disturbing, but nevertheless worth showing. Whistleblowing of all kinds counts here. Twitter and Facebook are full of evidence of atrocities happening to civilians in unstable regions or even warzones around the globe. Many cases of the impact made by user-generated content and social media are seriously discussed - check out the Wikipedia entry on social media's role in the Arab countries' revolutions of the 21st century.

How to moderate effectively?

xModerator is an online service providing fully automatic image moderation. What are the implications of our automation?

Firstly, one ethical dilemma is swept away: xModerator removes the human factor from image moderation. Instead, algorithms as effective as human eyes and brains do the job. If content is undoubtedly objectionable, no one needs to view it personally in order to make the obvious judgement to delete it.

Also, automatic moderation allows for real-time image recognition and verification to determine compliance with your content policy. Dissemination of user-generated NSFW content through your social media tool changes the whole image of your product or service. It may severely hurt YOUR brand, and therefore a swift and highly accurate solution is exactly what you need.
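
To make the idea concrete, here is a minimal sketch of how such a real-time check might sit in an image upload flow. The endpoint URL, request fields, and response shape below are illustrative assumptions, not xModerator's actual API:

    import requests

    # Hypothetical endpoint and response shape -- illustrative assumptions,
    # not xModerator's documented API.
    MODERATION_URL = "https://api.example.com/v1/moderate"

    def is_image_safe(image_bytes: bytes, api_key: str) -> bool:
        """Ask the moderation service for a verdict on an uploaded image."""
        response = requests.post(
            MODERATION_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"image": ("upload.jpg", image_bytes)},
            timeout=5,  # the check runs inline with the upload, so fail fast
        )
        response.raise_for_status()
        verdict = response.json()  # assumed shape: {"nsfw": true/false, ...}
        return not verdict["nsfw"]

    def handle_upload(image_bytes: bytes, api_key: str) -> str:
        # Publish only what passes the automated check; flagged images
        # never reach the public feed or a human reviewer.
        return "published" if is_image_safe(image_bytes, api_key) else "rejected"

Because the verdict comes back synchronously, the upload handler can reject a flagged image before it is ever published, rather than cleaning up after the fact.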

An additional advantage of xModerator's automatic image moderation service is that the moderated content is not shared with any third party. What was flagged as meeting the criteria of objectionability and subsequently kept at bay, therefore, stays there for good.