THE CAMPAIGN

Don’t Delete Art: A Gallery of Art Censored by Social Media Platforms

Digital gatekeepers control the world’s largest social media platforms and have enormous power to determine what content can freely circulate and what should be banned or pushed into the digital margins. Not only is content removed because of overly restrictive and sometimes unclear community guidelines, but, unbeknownst to users, material vaguely defined as “objectionable” is made to disappear from search and explore functions and from hashtags. All this has a dire effect on the work of emerging artists, of those living under repressive regimes and, in general, of all who have no museum or gallery representation. Work can be erroneously removed and whole accounts deleted, with thousands of followers lost. With no possibility of appeal, an artist can feel fearful and powerless and opt to censor themselves. To show some of the exemplary work whose circulation is currently blocked by social media platforms – whether due to faulty algorithms and an inadequate appeals process or due to ill-considered community guidelines – we have launched this online gallery of Art Censored by Social Media Platforms.

With this gallery, we call on social media companies to adopt a set of principles guiding the regulation of art online and allowing art to circulate freely in the online environment.

Notice and Appeal Principles for Social Media Platforms

Don’t Delete Art has laid out the principles below as practices that should be adopted by all social media platforms.

Special protections for artistic content:

  • Platforms should take steps to ensure that artist accounts are not repeatedly and unduly silenced, and that content is not restricted more broadly than necessary.

Notifications:

  • All users should be notified every time content is removed or “downranked.” This includes content removed from platform-specific features that give posts higher visibility, such as search, hashtags, explore feeds, and others.

  • Automated notices should include the following information:

    • details about the specific content removed;

    • details about how the specific content was identified as being in violation of platform guidelines (such as artificial intelligence, user reports, or government action);

    • a description of the exact actions taken against the account or against the visibility of its posts through hashtags, etc.;

    • reasons for the actions taken, including the specific rule violated;

    • where relevant, acknowledgement of specific involvement by state actors in flagging or ordering an action against the account;

    • notification if there is risk of permanent account deletion.

  • Notifications should be accessible even if a user’s account is suspended or terminated.

Appeals:  

  • Every notification of removal or downranking should also contain clear information on how to appeal the decision.

  • Appeal should be available even if a user’s account is suspended or terminated.

  • The appeals process needs to include:

    • an opportunity to present additional information to be considered in the review;

    • review by a person or a panel that was not involved in the initial decision;

    • notification of the result, with a clear explanation, within no more than seven days;

    • the availability of a final review by an independent external oversight mechanism.

Both the notices and the appeals process should be:

  • in the nationally recognized languages of the countries in which the platforms operate;

  • available in the company’s Terms of Service; and

  • accessible even if a user’s account is suspended or terminated.

Further suggestions regarding artistic content:

  • Platforms should take steps to make sure artist accounts are not repeatedly silenced: one option is to verify artist and arts organization accounts and then subject them to a different level of algorithmic scrutiny.

  • Platforms should be consistent in their implementation of content moderation rules. For example, if they allow photographic nudity in a “clear artistic context” for one medium or post, they should do so for all mediums and posts that demonstrate a clear artistic context.

  • Platforms should not censor artistic expression for the sole reason that it contains nudity. While there may be problems associated with establishing consent or making sure no illegal material is circulated, the human nude has always been one of the central subjects of art. Platforms should develop mechanisms to ensure that imagery present in the world’s museums can also be seen and shared on social media.