App Store Content Policy - is your image moderation solution enough?

Image moderation has many applications. Today, however, visual content moderation is no longer a courtesy for anyone who wants to launch and monetize an app via Apple's App Store or Google Play. Both platforms put safety first when it comes to objectionable content, making visual content moderation a necessity.
 

 Visual content - The Ace of Spades

 

User-generated content lies at the heart of the massive success of every social media platform. These platforms essentially serve as frameworks that let people publicize content they have made themselves. Moreover, users are encouraged to generate as much content as possible and are given powerful tools to do it, so content creation and editing is a piece of cake. Take Instagram or Snapchat as examples here.

And if content is king, then visual content is the ace of spades.

 

 Users are messy

 

Checks and balances are required to moderate users and these tools. Obliging users to accept the rules before granting access is not enough. This is usually done with a EULA (End User License Agreement), which defines "objectionable content" and makes users promise not to generate any.

However, developers need to keep it real. People's intentions may be hard to discern, and accounts may be hacked one day. No one knows. If such an emergency happens and inappropriate content is spread through an app, a platform, a service or whatnot, then it's the developer's problem.

Developers are held accountable by the app stores for allowing even the mere possibility of such an emergency. Apple, for that matter, is known to be strict about content moderation policies and the measures developers take to keep the peace.

 

Apple App Store on content moderation 

 

Apple gives it to you straight. As the guidelines say:

"Apps with user-generated content present particular challenges, ranging from intellectual property infringement to anonymous bullying. To prevent abuse, apps with user-generated content or social networking services must include:

  • A method for filtering objectionable material from being posted to the app
  • A mechanism to report offensive content and timely responses to concerns
  • The ability to block abusive users from the service
  • Published contact information so users can easily reach you"

Read the full guidelines here.
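
To make these requirements concrete, here is a minimal sketch in Python of how a UGC backend might cover all four points. The function names, in-memory stores, and the check_image placeholder are assumptions for illustration only, not APIs mandated by Apple.

```python
# Minimal sketch of the four Apple-required capabilities.
# check_image stands in for whatever automated moderation service you use;
# the stores and names here are illustrative only.

SUPPORT_CONTACT = "support@example.com"   # published contact information

blocked_users: set[str] = set()           # users blocked from the service
reports: list[dict] = []                  # user reports awaiting a timely response


def check_image(image_bytes: bytes) -> bool:
    """Placeholder for an automated moderation call (e.g. nudity/violence detection)."""
    raise NotImplementedError("wire this to your image moderation provider")


def submit_post(user_id: str, image_bytes: bytes) -> bool:
    """Filter objectionable material before it is posted to the app."""
    if user_id in blocked_users:
        return False
    if not check_image(image_bytes):
        return False                      # rejected by the automated filter
    # ... persist the post ...
    return True


def report_content(reporter_id: str, post_id: str, reason: str) -> None:
    """Let users report offensive content so a moderator can respond quickly."""
    reports.append({"reporter": reporter_id, "post": post_id, "reason": reason})


def block_user(user_id: str) -> None:
    """Block an abusive user from the service."""
    blocked_users.add(user_id)
```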

 

Google Play Store content policy

 

The Google Play Store is no less determined to avoid any share in promoting apps that could, even potentially, cause distress. As examples of inappropriate content, the Play Store Developer Policy Center enumerates sexually explicit content, hate speech, violence, content capitalizing on a "natural disaster, atrocity, conflict, death, or other tragic event", and content that facilitates bullying and harassment.

Furthermore, the Play Store formulates content moderation requirements for app developers:

"Apps that contain or feature UGC must:

  • require that users accept the app's terms of use and/or user policy before users can create or upload UGC;
  • define, in a manner consistent with the spirit of Google Play's Developer Program Policies, UGC that is objectionable, and prohibit that UGC via the app’s terms of use and/or user policy;
  • implement robust, effective and ongoing UGC moderation, as is reasonable and consistent with the type(s) of UGC hosted by the app;
  • provide a user-friendly, in-app system for reporting and removal of objectionable UGC;
  • In the case of live-streaming apps, problematic UGC must be removed in as close to real-time as reasonably possible; and
  • remove or block abusive users who violate the app's terms of use and/or user policy."
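
The live-streaming point is the hardest one to meet with human reviewers alone, since "as close to real-time as reasonably possible" rules out a manual review queue. A common pattern is to sample frames from an active stream and run each one through an automated check. The sketch below assumes hypothetical moderate_frame, get_latest_frame, and stop_stream helpers and only illustrates the timing loop, not a production pipeline.

```python
import time

FRAME_SAMPLE_INTERVAL = 2.0   # seconds between sampled frames (illustrative value)


def moderate_frame(frame: bytes) -> bool:
    """Placeholder: send one frame to an image moderation service, return True if safe."""
    raise NotImplementedError("wire this to your image moderation provider")


def watch_stream(get_latest_frame, stop_stream) -> None:
    """Sample frames from a live stream and cut it off as soon as one fails moderation."""
    while True:
        frame = get_latest_frame()
        if frame is None:                 # stream ended
            break
        if not moderate_frame(frame):
            stop_stream()                 # remove problematic UGC in close to real time
            break
        time.sleep(FRAME_SAMPLE_INTERVAL)
```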
     

 Is your content moderation appropriately structured?
 

Clearly, human-only moderation is not good enough. Firstly, it does not scale: if your app grows big, as you hope it will, you would need to keep expanding your moderation team. An automated image moderation service is the only way out here.

Secondly, human moderation can only work as a response team. Your users will not accept any moderation lag, because immediate publication is not only the standard but also a cornerstone of some social media (e.g. Twitter as used by journalists or news media).

Therefore, a reliable AI-based visual content moderation solution is a must, too, bearing in mind the requirements of the two biggest app stores out there: Google Play and the Apple App Store. Only an automated moderation service will perform well and fast enough to guarantee safety.
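
In practice, such a service is usually called inline, right before a post is published, so the user never notices a moderation step. The sketch below shows what a pre-publication check might look like; the endpoint URL, field names, and response shape are assumptions, not any particular vendor's real API.

```python
import requests

MODERATION_ENDPOINT = "https://api.example-moderation.com/v1/check"  # hypothetical URL
API_KEY = "your-api-key"


def is_image_safe(image_bytes: bytes) -> bool:
    """Send an image for automated moderation and return whether it may be published.

    The endpoint, parameters, and response format are placeholders;
    substitute your provider's real API.
    """
    response = requests.post(
        MODERATION_ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"image": image_bytes},
        timeout=5,  # keep the check fast so publication feels immediate to the user
    )
    response.raise_for_status()
    return response.json().get("safe", False)
```

A short timeout keeps the check from delaying publication noticeably; on timeout you can either fail open (publish and re-check asynchronously) or fail closed, depending on your risk tolerance.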

On the other hand, only a quality AI will do: one that cannot be fooled and so let inappropriate content leak through. Cost-effectiveness is one thing, but if you are considering several services, the robustness of the algorithms always needs to come first. xModerator's human-like accuracy, guaranteed by our artificial intelligence, contributes to the quality of your service. Make sure that the environment you are developing, be it innovative or simply fun, is safe.

It's the app stores' priority. Clearly, it should be yours, too.