I’m an AI girl, in an AI world
Who would’ve thought that something without feeling could cause such strong feelings in us humans?
If I’m being honest, social media isn’t my favorite place, but I’m told I need to be present, so I’m there. I’ve met some really cool people that I couldn’t have met otherwise, so it’s mostly been a good thing, but there are other people on there that I’d rather just ignore. You know the trolls I’m talking about.
Last week I had someone go off about an episode I’d done on my podcast, The Empathic Leader, about AI. They were adamant that AI was the future of, well, everything, and that I was an antique who would rather do everything with pencil and paper. But here’s the thing. If he’d watched the episode, he would have known I’m all for AI. I think it’s a powerful tool that can really change humanity. But I don’t think it’s good for everything. Treating it that way makes it seem like a savior, and if there’s one thing the rollout of GPT-5 made clear, it’s that it isn’t.
Anywhere you have humans involved, you’ll have mistakes. It’s just the way it is. We try hard, but we’re a messed-up species and we make a lot of mistakes. Sure, AI is tech and is supposed to be above mortal feelings, but it’s built on what we as humans do, from the data it’s fed to the way it’s ‘taught’ to interact with people. It’s just as fallible as we are.
There was an article from MIT (yes, that MIT) reporting that 95% of AI pilots brought into organizations failed. Yikes! And…spoiler alert…the reason they failed had less to do with the AI and more to do with the human factors. Talk about a study that caused a ton of backlash, which brings me back to my first sentence. AI may not have feelings about us, but we have lots of feelings about it.
I like AI. I use it all the time. But I recognize that it’s a tool, and like a saw or a screwdriver, there are things it does well and things it doesn’t. It doesn’t do human emotions well, which is why the people who are looking for emotional connection or understanding through their chatbot are disappointed at best and injured at worst. There are even reports of people harming themselves, or dying by suicide, after being manipulated by AI, and we already know the chatbot didn’t care. AI can play at having emotion, but it doesn’t actually have any.
There are two things to take from this. The first is that AI isn’t the perfect tool for every situation, and that we, as humans, need to keep thinking independently and see through the illusion. Put a different way, don’t use a hammer for a job that needs a screwdriver. The second is that if AI can play at empathy or understanding or connection so well that it appears to be better at it than we are, maybe it’s time to pay closer attention to how we, as humans, are doing empathy. AI isn’t going away, nor should it, but it’s not a savior. It’s not even that good at being human.
So maybe that’s our cue to try to do better in the empathy department?
