Is NSFW AI Compatible with Artistic Freedom?

Whether NSFW AI can coexist with artistic freedom is a tough debate. Critics charge that content moderation tools, particularly AI-driven ones, frequently mistake artistic expression for adult material. In a 2021 poll, for example, digital artists reported running into wrongful takedowns with notable frequency. This fuels the worry that platforms such as Instagram and Facebook deploy NSFW AI at a very large scale, scanning millions of images each day and automatically removing those that might contain nudity or provocative scenes, even when those elements are an intended part of artistic expression rather than explicit sexual content.

Nuanced context, the very thing from which artistic freedom springs, is probably too much to demand of AI. Many NSFW AI datasets are composed of thousands or millions of labeled examples, yet the labelers annotate according to existing notions that are often biased toward Western standards. As a result, content originating from non-Western artistic traditions is disproportionately impacted. A 2019 controversy over the removal of nude sculpture photographs from social media is a case in point, revealing how poorly moderation algorithms perform at identifying work that straddles the boundary between art and obscenity. This limitation leads to the question of whether AI is capable of fairly evaluating creative expression at all.

Alluding to the possibility of algorithms restricting human creativity rather than augmenting it, Elon Musk has famously stated that "AI is the greatest risk we face as a civilization." Artists worry that NSFW AI ultimately encourages censorship in the name of safety, silencing art that dares to challenge societal norms. Developers have implemented contextual analysis tools to improve the algorithms, but these do not fully resolve the problem, because judging art is inherently subjective. Figure 1 highlights this difficulty: when AI models are asked to classify suggestive content as artistic versus erotic, accuracy falls just below 80%.

Adding a layer of human supervision to support AI decision-making also carries considerable cost and operational drawbacks. A hybrid model, in which automated decisions handle clear cases while humans review the rest, can require roughly 25% more content moderation budget, enough to strain most smaller platforms. The tension surfaces when creators and platforms want to use such tools, but not at the expense of the art itself. DeviantArt, with over 61 million registered users, illustrates how large platforms are reaching a tipping point between enforcing community guidelines and maintaining artistic freedom of expression.
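The hybrid model described above can be sketched as a simple routing rule: the AI acts alone only when it is confident, and borderline cases (where art is most often misclassified) are escalated to human reviewers. This is a minimal illustrative sketch; the function names, score semantics, and thresholds are hypothetical, not any platform's actual API.

```python
# Hypothetical sketch of hybrid AI + human content moderation routing.
# Thresholds and names are illustrative assumptions, not a real system.

def route_image(nsfw_score: float,
                auto_remove_at: float = 0.95,
                auto_allow_at: float = 0.20) -> str:
    """Return a moderation action for one image.

    nsfw_score: the model's estimated probability (0..1) that the
    image is NSFW. Confident cases are resolved automatically; the
    ambiguous middle band is sent to a human reviewer.
    """
    if nsfw_score >= auto_remove_at:
        return "auto_remove"
    if nsfw_score <= auto_allow_at:
        return "auto_allow"
    return "human_review"

# Example queue of model scores for four uploaded images.
scores = [0.98, 0.10, 0.55, 0.70]
actions = [route_image(s) for s in scores]
print(actions)
```

Widening the middle band sends more images to humans, which drives the extra moderation cost the article mentions, while narrowing it saves money but increases wrongful takedowns of artistic work. The thresholds encode exactly that trade-off.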

Nonetheless, NSFW AI continues to improve. Tech companies are exploring context-sensitive algorithms that weigh a post's history, artistic format, and apparent intent within the broader platform. These approaches are encouraging but expensive and time-consuming to implement. Until AI can read creative intent reliably, the artistic community may continue to find this kind of technology a double-edged sword.

Curious readers wishing to place NSFW AI within the wider intersection of art and content moderation can delve further at this nsfw ai link.
