
Men's Mental Health Crisis: Is AI the Answer?

Updated: Sep 5, 2023


If you’ve been on Twitter in the past few weeks, you may have seen the feminist Caitlin Moran asking her male followers what they find difficult about being a man. I’m well aware that Twitter can be a pretty grim place at times, but the floodgates she opened with this question were astonishing. Men responded with touching replies about the pain of not being able to admit vulnerability; never having the opportunity to give birth; masculine competitiveness; wanting to grow their hair long without it being a ‘thing’; Piers Morgan; and even Top Gear. One of my personal favourite replies (from @_shenton) was:

‘Men aren't conditioned to talk to each other about our lives. We meet up, have a four hour conversation about who would win in a fight between Inspector Gadget and RoboCop, but never get our worries off our chests.’

Although there is an inevitable truth to this, it feels to me like the conversation around men’s mental health is gradually beginning to defrost. Organisations like CALM (the Campaign Against Living Miserably) and the rather lovely and clever Movember (November moustache-growing to raise awareness) are doing their bit. There is a conversation where there wasn’t one before.

But there’s a reason that CALM describes men’s mental health as a crisis. Let’s look at the statistics for suicide. For men under 45, suicide is the leading cause of death. In 2015, 75% of suicides were by men (although interestingly, women attempt suicide at roughly the same rate; they are simply less likely to die as a result). The Office for National Statistics started recording deaths by suicide in 1981, and since then suicides by women have decreased, while the number of men dying by suicide has stayed the same.


In the face of stats like these, it’s natural to seek answers. We so desperately need them. Given Untapped’s work combining emotionally intelligent support with AI technology, I was interested to see an article in the national press a couple of weeks ago about the power of AI - and chatbots in particular - to address the acute challenge in men’s mental health. Research suggests that while both men and women will open up to a chatbot, for men especially the anonymity and lack of judgement promised by a bot is a powerful incentive to talk about their inner worlds. The research found that the men surveyed were an astonishing 300% more likely to open up to a bot than to a human. Respondents attributed this to the fact that a bot won’t judge them, or cause them to feel shame for the perceived weaknesses they are showing. This tallies with research into finance ‘advice bots’, which has found that people are more honest with a robot than with human financial and investment advisors - again citing lack of judgement as the main reason.


I’m intrigued. Could it be that, as some are claiming, we have found the answer we so badly need? Wanting to get some male points of view, I asked a couple of male friends to try out Woebot for me. Woebot is a widely available app providing mental health support and interventions using long-established Cognitive Behavioural Therapy techniques. I’ve tried him out myself (and yes, for anyone acquainted with Woebot’s cute ways, it is inevitable that the app quickly becomes a personified force in your life). He’s charismatic and well-pitched, and the reflection space and quick interventions on offer are lovely. It’s not therapy, but neither does it pretend to be.


So, to the great experiment. One of my male guinea pigs commented:

‘I was very skeptical at first, wondering if an app could provide any kind of empathy, or even if engaging in a dialogue with my phone would make me more disengaged with the world. I'm pleasantly surprised...Overall, as someone who lives on my own and can get lost in my own thoughts, I think it's really helpful. I wouldn't use it as a substitute for a real therapist (and they tell you at the start that it is not that) but as an AI 'friend' it's fun.’

This tallies with my own Woebot experience: a pleasant, even fun, tool with some very useful reflective qualities.


But, given that my experiment was limited to people with no self-reported mental health problems at present, I was interested to delve deeper into the male experience of mental health, and what sources of support might work. I spoke to Vinay Patel, a playwright who has also written some profoundly honest and moving pieces about his own experience of depression. Vinay has a large social media following, and an Instagram account which heavily features the two cats who have helped him move forward. He agreed that AI might provide an important alternative to the self-presenting that is necessary in the current mental health system:

“I can see the arguments for using AI to help with men’s mental health, especially in the early stages of catching problems. It feels really hard to go to a doctor and suggest that you might be feeling depressed and then be made to answer a checklist of questions. It creates a sense of vulnerability that’s not easily entered into. However, knowing you’d be talking to AI can make it feel a lot more like filling in a form online - anonymous, no judgement.”

It’s this lack of judgement offered by a chatbot that research suggests men so badly seek. Vulnerability is still such a taboo for some men that the possibility of discussing it with a bot rather than another human provides an enticing entry point into mental health support. This is a remarkable finding, and one we should indeed be exploring. If we can produce the right apps and get them to market, tools designed to help men in particular to reflect and open up about their mental health could change, even save, lives. They could be a vital lifeline for men in crisis.

But it also strikes me that this is an astounding finding in another sense. It seems that many men in our culture feel such acute shame in admitting to their vulnerabilities that talking to a bot is what is needed to open up. I spoke to Vinay about his experience of asking for help via more traditional methods. He was sanguine:

“I think, for men, admitting to any sort of long-term mental weakness is still really difficult. There’s a better understanding of the need to talk about mental wellbeing amongst the men I know, for sure, but it’s still brought up mostly in the context of a joke or something that will eventually be sorted out. I like to think of myself as a relatively empathetic human being - it’s a big part of my job - but I still took a long, long time to seek professional help. In the end though, it was going to group CBT sessions that ultimately made me feel less alone and able to handle what I was going through. Nothing was more gratifying than meeting people far more successful or seemingly better put together who also felt destabilised. It made what I was going through feel not just human but common too. I don’t think I’d have been able to even approach healing without that connection.”

For Vinay, it was establishing a shared human experience and making connections to others through a group that helped him move forward from his depression. And while our current chatbots can do a very good impression of a human - Woebot’s sunny California ‘personality’ being a great example - the key factor for many in opening up to them is the knowledge that, ultimately, they are bots dressed in human clothes, doing a really decent impression.


This knowledge may help us open up initially, but when it comes to the deeper connection that Vinay (and many therapeutic clients over the past 100 years) has found so healing, we need to look to human-to-human contact. While AI chatbots might provide a way in and a key lifeline for men in crisis, longer-term healing needs the human touch: to help us integrate the learning that leads to growth, and to heal emotionally through building a relationship with a therapist.


Given this, I do worry about any rush to proclaim that AI is particularly helpful for men. While the statistics suggest that AI can be a very helpful intervention in getting men to open up initially, on its own this framing risks reinforcing the idea that men are inherently less able to open up to another human (than women, for example). We clearly have a cultural problem with allowing men to be vulnerable, but if we are to serve both genders well, it strikes me as important to combine tools like AI with hard work to break apart the long-held misconception that men’s emotional abilities are somehow ‘lesser’ than women’s. It is by using technical tools alongside our innate human capacity for emotional connection that we can create real change. If we can combine AI tools with open conversation and campaigning about men’s ability - even their right - to be vulnerable with other humans, whether male or female, then we will be on the road to longer-term healing.


Written by Claire Lamont
