Artificial Intelligence Is Eating Our Language
When it comes to using A.I. to enforce inclusive language, be careful what you wish for
While I was working at a fairly evolved digital organization, my colleagues developed a custom bot within Slack, the business messaging app. Most people who use Slack for communication know you can build bots inside the application to set reminders, schedule meetings, or even tell jokes.
Our team, however, decided to explore how an automated bot could encourage more inclusive language. Though the team was diverse, we rarely interacted in ways that allowed for non-verbal communication, which is vital for establishing shared meaning. I found this out firsthand during one such experiment, when we created a "Guys Bot."
The premise of the Guys Bot was simple. When someone typed the masculine word guys in chat (as in “Good morning guys!”), the bot would send an automated message to the original poster. The message would state, “Excuse me, it looks like you said guys. Maybe consider more inclusive language like friends, pals, or teammates.”
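The article doesn't show the bot's actual code, but a keyword-nudging bot along these lines can be sketched with Slack's Bolt for Python framework. Everything below, from the environment variable names to the regex and the choice of an ephemeral reply, is an illustrative assumption rather than the team's real implementation.

```python
import os
import re

from slack_bolt import App

# Assumed configuration; the real bot's setup isn't described in the article.
app = App(
    token=os.environ["SLACK_BOT_TOKEN"],
    signing_secret=os.environ["SLACK_SIGNING_SECRET"],
)

# Match "guys" as a whole word, case-insensitively, anywhere in a channel message.
GUYS_PATTERN = re.compile(r"\bguys\b", re.IGNORECASE)


@app.message(GUYS_PATTERN)
def nudge_toward_inclusive_language(message, client):
    """Reply only to the original poster with a gentle suggestion."""
    client.chat_postEphemeral(
        channel=message["channel"],
        user=message["user"],
        text=(
            "Excuse me, it looks like you said guys. "
            "Maybe consider more inclusive language like friends, pals, or teammates."
        ),
    )


if __name__ == "__main__":
    # Start a local web server to receive Slack events (Socket Mode is another option).
    app.start(port=int(os.environ.get("PORT", 3000)))
```

Using an ephemeral reply here is a guess at the design: it keeps the nudge between the bot and the poster, consistent with the description of a message sent to the original poster rather than to the whole channel.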
This experiment prompted a debate at our virtual water cooler about preferred alternatives to the word guys when addressing a mixed-gender group of people. Friends, teammates, and mates were all frontrunners.