I once had a not-great experience with a coach who made me think, “This could easily be replaced by AI.” It never felt like we connected: the coach came off more like a dispenser of questions and tools.
And so when I read this Guardian article, “Google DeepMind testing ‘personal life coach’ AI tool”, I wondered: How many coaches will have to step up their game?
As a former software engineer, I’m generally in favour of automating away the tedious, repetitive stuff so that we humans are freed up to do the more creative, meaningful stuff. If it’s just information, tips or “advice” you’re after, sure, let AI help you. (And that’s not really coaching anyway.)
Let’s look at this quote from that article:
One example of a prompt a user could one day ask the chatbot was a challenging personal question about how to go about telling a close friend who is having a destination wedding that you cannot afford to go.
AI can generate plenty of options, including ones you’d never have thought of and can genuinely learn from. Maybe that’s helpful enough for you.
If you were my coaching client, we’d probably take a different route. I might say, “So this is someone you consider a close friend, and you find it challenging to talk with them about what you need. What’s that about?” Among the many things we could explore: whether there’s a genuine communication-skills gap, a strained relationship with yourself, a distorted sense of responsibility, a natural reaction to a friend known to employ guilt trips, or the weight of cultural expectations. I might also wonder what this tells us about the other relationships in your life and where else it might be constricting you. There are so many potential areas for inquiry and growth, each with its own set of possible solutions.
There’s also this point in the article that struck me:
... AI has already been successfully used in other environments for many years in areas which are sensitive and where humans actually prefer to ask an AI for advice.
Of course, there’s a time and place for everything, and I certainly see cases where I’d prefer AI myself, for reasons of accuracy or access. But if the problem is “I feel too much shame/embarrassment/discomfort to talk to another human about this” or “People are judgmental/invalidating/untrustworthy”, I’d hope that we could address these very legitimate concerns, both individually and collectively: raise our relational intelligence and improve our ability to have difficult conversations, rather than come up with more ways to avoid humans.
Going back to the example with the close friend: during our coaching, you might well feel a little uncomfortable, because yes, it’s personal. Maybe you’re not used to being that vulnerable. Maybe a part of you would still wonder whether I’d really be okay with you just being you. The thing is, you’d be working through this very human issue with another human. My job is to run a safe lab, and I think you’d like it here.