Naked on the Net: How Algorithms Strip Our Images


The days when humans decided whether an image was appropriate are long gone. Today, algorithms scan our photos for every square centimeter of skin. And the technique is frighteningly simple, as I'll show you in this article.

Reading time: 2 min.

Recently, I came across a formula for automatic skin detection in photos. JavaScript, really just a few lines of code, and it was done. You can try it yourself below in this post. Even I was surprised when I realized how simple the underlying technology actually is.

Complete Simplified Skin Detection Code
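To give you an idea of what such a formula looks like, here is a minimal sketch of a per-pixel rule. The thresholds are illustrative values of a classic RGB heuristic from the computer-vision literature, not the exact numbers any platform uses:

```javascript
// Simplified skin detection: decide per pixel, from RGB alone,
// whether it "looks like" skin. Thresholds are illustrative.
function isSkin(r, g, b) {
  return (
    r > 95 && g > 40 && b > 20 &&                  // bright enough, not too blue
    Math.max(r, g, b) - Math.min(r, g, b) > 15 &&  // enough spread between channels
    Math.abs(r - g) > 15 &&                        // red clearly apart from green
    r > g && r > b                                 // red is the dominant channel
  );
}
```

Because this rule only looks at color, a beige wall or a wooden floor can trip it just as easily as actual skin.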

Of course, major platforms like Instagram & Co. bring out much heavier artillery. Their AI-powered systems don't just recognize skin tones; they also scan for nipples, classify body regions, and have facial recognition on hand anyway.

Modern content moderation systems are far smarter than a simple color filter. They recognize body shapes regardless of color, analyze textures and contours, and thanks to deep learning, they've been trained on millions of images in all imaginable variations.

Even semantic segmentation allows them to understand the entire image context. AI is no longer fooled by this. The same technology that can detect skin cancer in medicine is being repurposed here to censor artistic expression.

Easily detected skin …
… but not when the room is colored in shades of skin

Wait a minute. Have you noticed that we now live in a world where algorithms decide whether a photo is "decent"?

The irony: The more platforms filter, the more they actually criminalize normal behaviors. Even harmless snapshots become potentially suspicious material. We all become suspects.

Completely harmless photos get blocked because too much skin is visible. A portrait on the beach? Could be suggestive. Nude photography with censorship bars? Watch out!

The question is: What came first? The compulsion for excessive control or the "offensive" material? And who makes these decisions? And who is being protected?

The system reminds me frighteningly of times we thought we had overcome.

Constant surveillance, automated suspicions, preemptive obedience — ring any bells? The Stasi would have been delighted with today's technical possibilities.

As a photographer, I experience daily how this automation restricts artistic freedom. It would be so simple: understand context, apply human judgment, distinguish art from provocation. But algorithms can't do that — no matter how complex they are.

With my simple JavaScript tool, I can show you how easy skin detection already is. Areas the algorithm classifies as "critical" are marked in red; that is basically all there is to it. As mentioned, the major platforms use far more sophisticated systems, but the basic principle remains the same.
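A simplified sketch of that red-marking step: walk an RGBA pixel buffer (as returned by a canvas `getImageData().data` call) and overwrite every pixel a simple color heuristic flags as skin with solid red. The heuristic shown here is an illustrative RGB rule, and painting flagged pixels pure red is an arbitrary choice:

```javascript
// Illustrative per-pixel skin heuristic (thresholds are assumptions).
function isSkin(r, g, b) {
  return r > 95 && g > 40 && b > 20 &&
    Math.max(r, g, b) - Math.min(r, g, b) > 15 &&
    Math.abs(r - g) > 15 && r > g && r > b;
}

// Mark "critical" pixels in an RGBA buffer (4 bytes per pixel) in red.
function markSkin(data) {
  for (let i = 0; i < data.length; i += 4) {
    if (isSkin(data[i], data[i + 1], data[i + 2])) {
      data[i] = 255;   // R
      data[i + 1] = 0; // G
      data[i + 2] = 0; // B
      // alpha (data[i + 3]) stays untouched
    }
  }
  return data;
}
```

In the browser you would feed this the `data` array of an `ImageData` object and then write the result back with `putImageData`.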

Test it yourself with the tool here in the article. You may be surprised by what the algorithm classifies as "questionable."

Simplified Skin Detection

The tool works completely locally in your browser. Your images don't leave your computer when you try the tool.
