It’s unfashionable to encourage anyone to use AI for mental health, and that probably comes from a good place. As of this writing, LLMs are still new, and we’re stumbling around trying to figure out _what we should do_ with them. They’re reminiscent of the World Wide Web in the late 90s, with our page counters and invitations to sign guestbooks. Meanwhile, mental health is a serious subject, and improper care can have devastating consequences.
> [!note]+ Try my prompt
> [Here is my main prompt I use day-to-day](https://codepen.io/ajkochanowicz/pen/jOQZRgM), and you can edit it with a quick form to personalize it to you.
> I often tweak and improve this and will keep this resource up to date as well.
*In case you're wondering, no, I did not write this article with AI. But that's exactly what a well-prompted AI would say, isn't it?*
Folks are quick to point out that there's no replacement for a real human being on the other end serving as a therapist. But why do we keep pointing to this single resource path for mental health? My friends aren't therapists either; should I stop confiding in them? I think cookies are good for my mental health now and then. Are cookies therapists?
Like the old Geocities landing pages of the early Internet, I think we're just in that awkward phase where we, as a society, haven't yet developed compelling use cases for AI-powered mental health interactions that tackle specific, concrete areas. I think I've found a few personally, so rather than get on a soapbox, I wanted to share them in case they help anyone else.
## Use cases
### Adam University
As mentioned in the customizable prompt above, my go-to is Adam University, a group of experts ready and waiting to share their expertise. ChatGPT decides who will show up to answer based on the nature of my question.
You don't necessarily need a prompt to get advice, but there is something special about advice tuned to how actual people, living or dead, would approach a problem. If you don't believe me, try it out for yourself. Also note that I've **only had success with this prompt using GPT-4**, which is only available to paying users at the time of this writing.
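If you'd rather run this against the API than the chat interface, the panel-of-experts idea reduces to a system prompt plus your question. The sketch below only assembles the request payload (send it with whatever client you use); the system prompt text is my paraphrase of the concept, not the actual prompt from the CodePen link, and the helper name is my own:

```python
# Sketch of an "Adam University"-style chat request. The system prompt
# below is a paraphrase of the idea, NOT the author's actual prompt.

SYSTEM_PROMPT = (
    "You are Adam University, a faculty of experts, living or dead. "
    "For each question, decide which experts are best suited to answer, "
    "introduce them briefly, and respond in their voices."
)

def build_request(question: str, model: str = "gpt-4") -> dict:
    """Assemble a chat-completion payload; send it with your client of choice."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    }

# Example: build (but don't send) a request.
payload = build_request("How should I handle burnout at work?")
```

Keeping payload construction separate from the network call makes the prompt easy to version and tweak over time, which is how I treat the CodePen version as well.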
### Attitude Eraser
There is something self-deceiving about our own anger: it finds a way to slip into our speech and writing even when we're convinced we've scrubbed out every trace of it.
I use the Attitude Eraser to outsource what I want to communicate to an LLM that is emotionally detached from the situation.
> [!note]+ Try my prompt
> Here is my [Attitude Eraser](https://codepen.io/ajkochanowicz/pen/qBQxzdB) prompt.
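Outside the chat UI, the same idea boils down to wrapping your draft in a rewrite instruction. Again, the instruction text below is my paraphrase of the concept, not the actual prompt from the CodePen link:

```python
# Sketch of an "Attitude Eraser" wrapper: ask the model to rewrite a
# draft with the emotional charge removed. The instruction text is a
# paraphrase of the idea, NOT the author's actual prompt.

ERASER_INSTRUCTION = (
    "Rewrite the following message so it communicates the same facts and "
    "requests, but with all anger, sarcasm, and passive aggression removed:"
)

def erase_attitude_prompt(draft: str) -> str:
    """Build the full prompt to paste into (or send to) the model."""
    return f"{ERASER_INSTRUCTION}\n\n---\n{draft}\n---"

# Example: wrap an angry draft before handing it to the model.
prompt = erase_attitude_prompt("Fine. I'll fix YOUR bug AGAIN, I guess.")
```

The delimiters around the draft help the model treat it as material to rewrite rather than instructions to follow.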
### Social Cues
Those of us who struggle to read social cues may not realize how common that struggle really is; misreading the room or missing cues entirely happens to people all the time. Here are some examples of how ChatGPT can help when you have time to respond offline:
#### Disguised request
> User: My friend is staying with me and keeps asking me, "it's a little cold in the house, isn't it?" I don't understand why they keep saying this. Can't they feel for themselves how cold or hot it is? It seems like they constantly need to check in with me to get my opinion.
> ChatGPT (abridged): They might be uncomfortable with the temperature but don't want to explicitly ask you to adjust it. By phrasing it as a question, they might be hinting that they want the temperature to be adjusted.
#### Faux pas, asking too much
> User: My colleague had a family emergency recently. When they came back to the office, I asked if everything was okay and what happened exactly. Every time I ask, they say "I'll tell you later." Why do we have to talk about it later? We weren't even busy at the time.
> ChatGPT (abridged): Some people are private by nature and may not feel comfortable sharing personal details, especially if the situation involves family matters. They may not want to discuss it in the office, even if it's quiet at the moment.
#### Contextual ambiguity
> User: This is the thread of a conversation with my team on Slack (pasted here). My boss replied and asked me, "What do you think can you give it a go?" I don't want to look stupid by asking what they mean but I'm not following what's going on. Can you tell me what she might mean by this?
> ChatGPT (abridged): Your boss is referring to the prototype drawings from the engineering team. By "give it a go" they mean they want you to submit them to the leadership team and ask for a formal evaluation. The engineers want to know if the specs will be feasible before they commit to their ideal solution.
## Final Thoughts
In the end, using AI for mental health care isn't about replacing therapists, friends, or cookies. It's about providing another tool in our mental health toolbox that can offer its own unique benefits. The use cases I've listed above are just a few examples of how AI can help us manage our mental health and navigate our interpersonal relationships in a more effective and compassionate way.
Ultimately, AI's role in mental health is yet to be fully realized. But as we continue to experiment, adapt, and learn, we can find more ways to incorporate these powerful technologies into our lives in a meaningful and beneficial way. Just as the internet transitioned from the days of Geocities to the incredibly versatile and powerful resource it is today, AI is undergoing its own evolution. It's up to us to harness its potential for good and to help shape its role in our society.
The journey is ongoing, but the promise is certainly there. As we move forward, I encourage everyone to stay open to exploring these applications: try out the prompts I've shared, see what works for you, and share your experiences. We're all learning, experimenting, and growing together.
Remember, AI can be our ally in understanding ourselves and the world around us. Let's continue to innovate, explore, and use this powerful tool as we chart our course through the landscape of mental health. After all, we're just getting started.
> [!note]+ Continue the conversation
> If you've found any of the AI mental health use-cases helpful, or have any of your own to share, I'd love to hear from you. Connect with me on [Mastodon](https://mastodon.online/@nicegoingadam). Let's grow, evolve and navigate this journey together.