If you're not in the habit of frequently Googling things like "boobs" during the workday, you may have missed the Internet's mini-meltdown over the engine's new search result censorship for all its pervy American users.
Luckily, some enterprising masturbator on Reddit was quick to capture the fact that Google's "SafeSearch" filter seemed to be stuck on for his account -- and that his image search for a pretty explicit sex act returned nothing more unsafe-for-work than a few well-placed lollipop-fellation photos. RIP OFF.
For those of you who haven't been a bored 13-year-old on the Internet for a while, allow me to explain how Google's automatic search filtration used to work. Until a day ago, Google let users set their own preferences for "SafeSearch": strict, moderate, or fun and fancy-free. A decade ago, when SafeSearch was still being perfected, most middle schoolers whose parents had installed their desktops firmly in the living room preferred the "moderate" option. This more-or-less prevented the classic "I searched for the War of 1812 and got a married couple doing it doggy-style, honest!" conundrum but also allowed a tween with imagination to get thoroughly worked up over some PG-13 cleave. Everybody won!
Of course, being a relic of the early-to-mid-2000s, SafeSearch also assiduously blocked things like the BBC Education website and BruceSpringsteen.net while totally letting unmitigated vadge slip through under the homograph "beaver." Whatever, it was 2003. We all did things we regret.
These days, most adults I know don't bother with SafeSearch, if only because putting any kind of filtration on one's web searches tends to futz with results in irritating ways. Personally, I'm unlikely to be Googling things in public that could possibly lead to the old throw-yourself-at-the-monitor maneuver -- and even if I were, those thumbnails are so small that they'd be no big deal. Even so, I understand the appeal of having the option to more heavily censor my searches, should I so choose.
But yesterday, Google decided that it would be better for all its US users if it just took away that option altogether.
It was a weird move, largely because it wasn't a global initiative -- users in the UK could, if they chose, still bring up pages of unfettered crotchitude without having to awkwardly specify that they wanted it to be X-rated. Meanwhile, if US users wanted to find images with an "explicit" rating, they were going to have to look damn hard for them. Instead of searching, say, "peen" to see pixelated nudity, one would have to search "naked peen" and face their perviness head-on, so to speak.
Also weirdly, Google didn't announce anything until almost the end of the day, as if no one would notice that their morning routine of searching for "dick" had been hijacked by a bunch of photos of this guy.
When they did make a statement, it was to assure users that it wasn't as if their SafeSearch was stuck on -- rather, they said, they wanted to make really sure that what you actually wanted was explicit content. No more cutting spank-bank corners with this search engine, y'all!
I know it seems kind of dumb to get worked up over this. It is, after all, probably saving thousands of 14-year-old eyes from the horror of their friends linking them to "blue waffle." But it makes me nervous all the same.
For one thing, I am a near-Silicon Valley resident who, despite my harrumphing about tech!boys, trusts our Dark Rainbow Lord's algorithms. The more of a monopoly Google has on the search engine market, the more power it'll have over what content is released to its millions of users, and on what timeline. Sometimes I forget that Google isn't actually a neutral, benevolent force whose only motive is equitably distributing information. It's a company that's trying to make money, whether that means censoring porn from China or allowing companies to buy higher placement in their search results.
More significantly, the definition of the term "explicit" is so murky these days that Google may have accidentally sliced out a whole chunk of its image content without even meaning to. We all remember when "bisexual" was considered taboo, right? What about when the website TV Tropes started self-censoring in order to stop being Google-slapped for their discussions of rape tropes? Meanwhile, search terms like "corpse" are still turning up some pretty nightmare-inducing images. It's not exactly a straightforward filtration process.
Sure, Google may be preventing us from seeing surprise!dick, but it also seems to be blocking images from websites that, say, use the word "fuck" too often.
Don't believe me? Check this difference in the UK's existing results for a certain xoJane writer (and, apparently, a blonde Australian actress whom I clearly have to wrestle someday):
No more mermaid makeouts for you, Kate Conway! Correct me if I'm wrong, but a photo of a fully clothed woman pretending to kiss a Christmas ornament isn't exactly porn. But as far as Google's concerned, I might as well have been participating in a real-life threeway.
Coming from a company like Google, which does strive to project an image of neutrality, it seems a bit like legislating morality.
It's not as if I'm personally invested in being able to see nudes on the Internet. There are plenty of other search engines if I really wanted to pepper my day with front-butt. But I still don't think companies should take it upon themselves to make it harder for adults to peruse their chosen material.
Am I overreacting? Are you guys less prone to searching NC-17 things on your smartphones and therefore don't give a flying fuck what Google does with its results? Let me know in the comments. In the meantime, I have to go clear my entire Internet history.
Kate is 2 explicit 4 Google at @katchatters.