Who decides what inappropriate content is?

OK, so maybe googling “Japanese school uniform bondage” wasn’t the best idea I’d had. I needed a bit of inspiration (no, really) for a short story I was working on and turned to the world’s favourite search engine. Needless to say, a veritable cornucopia of oriental ladies in varying scenes of distress and discomfort floated past my eyeballs.

Only when I reached the bottom did I find a curious message:

“Some results omitted as suspected child pornography”.

Good, was my initial reaction. But then I thought about this a little longer and wondered what the definition of “child pornography” was and who got to decide if results should be filtered.

My point isn’t that child porn shouldn’t be blocked – it should be, and those who create and consume it thrown to the wolves. It’s a question of how decisions are being made about what is and isn’t acceptable. Did someone inspect those filtered results and decide they were indeed child porn (and, one assumes, promptly report them to the authorities – in which case, why even post the message)? Or was it some algorithm that made the decision? An algorithm that is unlikely to be refined further, because few users are likely to click “show filtered results” when such a troubling message appears.

I don’t have answers; I just hope we’re not throwing the proverbial baby out with the bathwater. I hope we’re not becoming over-dependent on technology to decide what we should – and should not – see.
