
Various approaches to the moderation of user-provided images

Many websites allow their users to add images. These include social media, image sharing services, dating websites, advertising websites, internet fora, etc.
Some of the uploaded images should be rejected because they are obscene, indecent, or otherwise inappropriate. There are several methods to handle this issue. Below you will find a short description of each of them with pros and cons.


In-house moderation
The most obvious solution is to hire people who moderate user-provided images in real time.

Pros:
- significant control over the whole process
- high quality of moderation

Cons:
- the highest cost
- moderators have to be employed and their work organized

Outsourced moderation - companies

Some companies provide moderation services. You can hire a company whose employees will moderate the images uploaded by your users.

Pros:
- no need to employ moderators or organize their work

Cons:
- high cost
- you rely on moderators you have no control over

Third party moderation by Internet users
This approach is similar to the one described above, but the company does not employ permanent moderators. Instead, moderation is performed by Internet users who happen to be online at the moment; they are usually rewarded for moderating a certain number of images.

Pros:
- lower cost than with the previous solutions

Cons:
- the cost is still higher than for the next approaches
- no guaranteed quality
- no control over who sees your users' content


Moderation after user report
Some websites don't moderate images as they come in. Instead, they wait for users to report inappropriate content; only then do moderators assess it.

Pros:
- relatively low cost

Cons:
- you can never be sure how fast inappropriate content will be reported; until then, it remains on your website and can be viewed by your users
- some inappropriate images will probably never be removed

Moderation by artificial intelligence (AI)
Automated image moderation services are now available: the moderation is performed by artificial intelligence. Because the quality of moderation differs from solution to solution, you should focus on the quality of results. Today, xmoderator offers the best results.

Pros:
- very low cost
- immediate reaction
- scalability: multiple images can be moderated simultaneously

Cons:
- possibly poorer moderation quality (depending on the platform)

Mixed approach
A mixed approach involves employing more than one solution simultaneously. Combining solutions improves moderation quality by cross-checking each decision.
Virtually any solutions can be combined:
- In the case of human moderators, an image can be assessed by several people.
- AI moderation can be combined with human moderation. In this setup, an image first undergoes automated moderation; images classified as inappropriate are then assessed by a human. AI systems return a score indicating the level of inappropriateness, so by setting a threshold you control how many images a human has to review.
- You can also combine the results of several AI systems, for example with a vote on whether the image should be rejected or accepted. If an image is moderated by three AI services and one gives the green light while the other two flag it, the image is rejected.
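The two combinations above can be sketched in a few lines of code. This is a minimal illustration, not a real integration: the threshold value, the scores, and the per-service cutoff are made-up assumptions, and real AI moderation services each have their own APIs and score scales.

```python
# Hypothetical sketch of a mixed moderation pipeline.
# Assumption: each AI service returns an inappropriateness score in [0, 1].

REVIEW_THRESHOLD = 0.4  # made-up value; lowering it sends more images to humans

def needs_human_review(score: float) -> bool:
    """Route an image to a human moderator when the AI score is high enough."""
    return score >= REVIEW_THRESHOLD

def majority_vote(flags: list[bool]) -> bool:
    """Reject the image if most of the AI services flagged it."""
    return sum(flags) > len(flags) / 2

# Example: three hypothetical AI services score the same image.
scores = [0.2, 0.7, 0.9]
flags = [s >= 0.5 for s in scores]  # each service's own reject decision
rejected = majority_vote(flags)     # two of three flagged it, so it is rejected
```

The threshold is the main tuning knob of the AI-plus-human combination: it trades moderation cost (human workload) against the risk of inappropriate images slipping through.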

Pros:
- the best quality, fewest errors

Cons:
- higher cost compared to a single solution