Artificial intelligence continues to divide people into two groups. The first is afraid of new technologies, claiming that the sophistication of modern developments will eventually enslave a humanity unable to prevail against much stronger, unfeeling machines. The second group, at the other end of the spectrum, is excited by any news of technological advancement.
Content moderation helps monitor spammers and supervise user-generated content in order to control its impact. It means applying pre-determined guidelines and a code of conduct to the content in order to verify whether a comment or piece of feedback may be published or not.
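To make the idea concrete, here is a minimal sketch of guideline-based comment moderation. The blocklist and the `is_allowed` function are hypothetical illustrations, not part of any real moderation product; production systems combine many more signals (machine-learning classifiers, user reputation, human review).

```python
# Hypothetical guideline list: terms that violate the code of conduct.
BLOCKED_TERMS = {"spam", "free money", "click here"}

def is_allowed(comment: str) -> bool:
    """Return True if the comment passes the pre-determined guidelines."""
    text = comment.lower()
    return not any(term in text for term in BLOCKED_TERMS)

print(is_allowed("Great article, thanks!"))     # True
print(is_allowed("Click here for FREE money"))  # False
```

A simple case-insensitive substring check like this is easy to reason about, which is why rule lists are often the first layer of a moderation pipeline even when more sophisticated tools sit behind them.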
User-generated content is said to be more trustworthy and more memorable than other types of media content. However, photos prepared by the users themselves carry some risk: without proper moderation, one never knows whether the images comply with all legal regulations.
Web 2.0 brought a real breakthrough: it blurred the line between content creators and the audience. Tearing down this wall became complete once mobile devices were universal. However, such empowerment by no means implies a free-for-all; content moderation tools are a necessity.
Image moderation has many applications. Today, however, visual content moderation is no longer a courtesy: it is a requirement for anyone who wants to launch and monetize an app via Apple's App Store or Google Play.
Many websites allow their users to upload images. Some of those images should be rejected because they are obscene, indecent, or otherwise inappropriate. There are several methods to handle this issue...
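One common method is to let an automated classifier score each image and route uncertain cases to a human moderator. The sketch below assumes a hypothetical `classify_image` stub standing in for a real vision model or third-party moderation API; the threshold values are illustrative only.

```python
def classify_image(image_bytes: bytes) -> float:
    """Hypothetical stub: return the probability that an image is inappropriate.

    A real system would call a trained vision model or an external
    moderation service here.
    """
    return 0.5  # placeholder score

def moderate(score: float,
             reject_above: float = 0.9,
             approve_below: float = 0.1) -> str:
    """Map a classifier confidence score to a moderation decision."""
    if score >= reject_above:
        return "reject"        # confidently inappropriate
    if score <= approve_below:
        return "approve"       # confidently safe
    return "human_review"      # uncertain: escalate to a moderator

print(moderate(0.95))  # reject
print(moderate(0.05))  # approve
print(moderate(0.50))  # human_review
```

The design choice worth noting is the middle band: rather than forcing a binary decision, ambiguous scores are escalated, which keeps automated mistakes rare while limiting the volume of content humans must review.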