Just now, Norton, one of the big players in internet security, has once again categorized my website as pornography.
That's about as accurate as calling a zen garden a sandbox with a rake fetish.

Funny thing is, the law is actually crystal clear on this. The German Federal Court laid it out back in 1969 (and yes, I see the irony in that year): Pornography is defined as content that "pushes sexual acts to the foreground in a crude, sensational manner while disregarding other human aspects." The judges added that it must "exclusively or predominantly aim at arousing lustful interest in sexual matters."
Hold up. Crude? Sensational? Apparently, this algorithm was trained by someone who blushes at the sight of a banana. My work is the exact opposite: It's about subtle suggestion, about body language that conceals more than it reveals, about confident and desirable women who own their allure — not about crude exhibition. It's about imagination, not in-your-face exposure.
But the artificial intelligence behind these systems knows only two states: "family-friendly" or "gates of hell". It's like categorizing all food as either "edible" or "poison" — and then labeling a Michelin-starred restaurant as a poison kitchen.
Sure, I could file a formal complaint. The chances of success? About as likely as a robot understanding the difference between a hug and a headlock.
What can I do but laugh about it? I'll keep doing what I do best: photographing people in their natural expressiveness. With respect, artistic integrity, and the firm belief that true eroticism happens in the mind, not in a database of yes/no questions.