Don’t read the comments…don’t read the comments…don’t read the comments…

It’s become the mantra of every sane internet user, but too many of us can’t help but look. Whether it’s nasty remarks on a YouTube video, a tweetstorm on social media, or any of the millions of hate-filled, rhetoric-laced political conversations taking place at this very moment, significant portions of the internet have become a wasteland of online trolls and harassment. And with First Amendment considerations and a sheer lack of manpower at any of the web platforms where the ugliness ensues, there’s not much that anyone can do to put a stop to it.


But Google intends to try. Google’s Jigsaw division has come up with Conversation AI, a machine-learning tool that uses “trigger words” to shut down hate speech and make the internet a happy place again. Unfortunately, a report from Mic.com reveals that users have already found a way to thwart the tool’s capabilities by creating “code words” for the racial, ethnic, religious, lifestyle, and gender groups they wish to target [NOTE: not linking to the profoundly important article out of respect for these groups and a desire to not spread the “code words”].
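The weakness being exploited here can be sketched with a toy example. To be clear, this is an illustration of keyword-based filtering in general, not Jigsaw’s actual model, and every term below is a generic placeholder rather than a real slur or code word:

```python
# Toy illustration of why trigger-word filtering is easy to evade.
# A naive filter flags messages containing blocklisted terms, but an
# agreed-upon code-word substitution slips right past it.

BLOCKLIST = {"badword"}  # hypothetical placeholder, not a real term


def flag_message(text: str, blocklist: set = BLOCKLIST) -> bool:
    """Return True if any blocklisted token appears in the message."""
    tokens = text.lower().split()
    return any(tok.strip(".,!?") in blocklist for tok in tokens)


# A message using the literal trigger term is caught...
print(flag_message("saw that badword again"))   # True

# ...but the same intent hidden behind a code word is invisible,
# because the filter only knows the surface string, not the meaning.
print(flag_message("saw that codeword again"))  # False
```

Any fixed list of strings can be routed around this way, which is why the arms race favors the trolls: updating the blocklist just prompts a new round of substitutions.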

AI automation has had some problems when it comes to social media users. Who can forget Microsoft’s AI experiment, “Tay”? The teenaged chatbot was taught to be a neo-Nazi on her very first day online, and Microsoft swiftly pulled her plug. The Twitter account, which ran solely off of interactions with other social media users and included an unfortunate “repeat after me” command, had spiraled into the quagmire within hours.

While the efforts at combating online trolls are admirable, it appears that there’s no end in sight. What makes the internet so powerful and so useful is the very fact that billions of people have access…and opinions. It would be nice to find a Star Trek-esque utopia of compassion and understanding every time you connected online, but that’s obviously not the case.