With the rise of many apps geared towards mindfulness and attempting to impact mental health, what are your thoughts on using technology to impact mental health?
I think with apps like Headspace, there's a hard balance between clearing your mind and still being on a device, which is the opposite of focus.
I think it’s great; it opens up a lot of opportunities for people who may not have access to doctors’ appointments, whether for financial reasons, distance, or because they have trouble leaving their house.
Becca, that’s a great point which I hadn’t considered, especially for people who might have anxiety about leaving their homes. The level of access to some level of immediate help is greater than at any other time in history.
With the ubiquity of mobile devices (on us all the time, on all the time), mental health tracking or mindfulness apps like Headspace are easy lifts when starting a new method: download, review, and use. They fit very well into our mobile lives, filling the gaps between the other things we do. That can also be a detriment, since they require some dedicated time.
I think the idea of using technology to “disconnect” is a bit weird at face value, but it’s going to be a powerful tool when used correctly, given some of the challenges associated with accessing mental health professionals (cost, location, fear of sharing with another person, etc.). I think a technology-based tool will be a great first step into the space for a lot of folks.
Overall, I think there is a huge benefit to using technology in mental health apps, for all of the reasons listed above. One thing that does worry me is self-diagnosis, and the risk of that spiraling into overdiagnosis.
I think there’s a lot of potential for where apps could go, but they require a bit of self-awareness and commitment from the user. In the case of Headspace, I actually downloaded it a few weeks ago but never got started with the sessions because I’ve felt too busy with school and work, and so I feel like I have better things to do (which I think is ironic, considering Headspace is supposed to motivate you to set time apart to clear your head).
While tech can help bridge gaps in services and bring helpful tools to a broader audience than may have access to professional services, it can’t replace human connection. These tools succeed when they’re focused on supplementing offline services rather than replacing them.
It’s really interesting that there seem to be two camps here: one in which it’s great that technology provides access to resources that someone may not otherwise have, and another in which people don’t seem entirely comfortable that it’s via an app or phone.
In that same thread, why do some feel like using technology is “at odds” with the goals of mental health apps?
I think the tech isn’t ever going away; we’re going to use it. So are there some best practices for making the digital experience more impactful? Is, as Becca pointed out, not making it an isolating experience (keeping humans involved) a principle people should follow?
I think of myself as being in both camps. On one hand, accessing these apps via a phone can be distracting, but on the other hand, if people feel uncomfortable about meditating or coping with mental health issues in public, the phone provides a smart disguise.
I think it’s because technology, in my mind, could be the root cause of, or could exacerbate, numerous mental health issues.
It feels like a lot of technology right now is designed to suck you in and keep you distracted. Companies are motivated by engagement time, so technology generally feels counterproductive to sorting out what’s going on in a user’s mental landscape.
Really interesting point, Philip. So does it feel like maybe the technology is less authentic because there is a potential lack of trust in the app, even if it says it’s for a good cause?
I’m not even sure if it’s distrust in that particular app. More like there’s a carryover effect from other apps that leads to a distrust of all apps. Like if a social media app that I use frequently always feels like it’s sucking my time and just throwing information my way that I care very little about but still feel so drawn to, what’s gonna make this app different? What’s my motivation for just adding another thing to my list of things to do right now? There’s an investment barrier for me because so few apps feel like they add value to my life so I don’t feel like this one is really going to be any different.
It sounds like there are two different paths we’re taking here with regard to technology. One is using technology by yourself to “disconnect” from what’s going on, and the other is using technology to more easily connect with others. It seems like we have differing ideas about each path’s strengths and weaknesses.
I think there are different ways in which people can cope with mental health issues. We’ve discussed a lot of apps like Headspace that provide people the ability to meditate and stay present. There is also a human need to talk with other humans about the experiences we are facing in life – it is interesting to see how technology can support this need.
Changing gears just slightly, but another point around technology is just on the proliferation of knowledge and community around mental illness — being able to go online and not only better understand mental illness, either for yourself or others in your life, but also find large communities of peers helps bring a once taboo subject into the light.
There is a lot of research showing that peer-based methods are extremely effective in helping those with mental health issues.
That’s an interesting point, Nick. I think this definitely gives rise to questions about the implications of having a mental health issue and the “sensitivity” of these issues to the design of such online communities.
Aarti, I think finding these communities also provides a moment of “oh, you? Me too!” When you find the others, suddenly it’s not taboo, because there are others.
Using technology can bring together people who struggle with similar things and help them feel that they’re not alone. It also allows them to communicate anonymously or “safely” at a computer, and to communicate more thoughtfully, because it doesn’t require an immediate verbal response the way a conversation might.
Devil’s advocate comment: there’s a definite and clear advantage to anonymity or an online profile. I’m just wondering if anonymity could potentially perpetuate any remaining stigma.
There’s also, of course, the issue of trolls in online communities, whose anonymity allows for a sort of terribleness not as frequently seen in “IRL” communities.
Where do you think technology and apps are headed next specifically with respect to mental health?
I think VR could be useful for some areas of mental health or therapy. I mainly think of people who have phobias or anxiety difficulties: basically, simulating moments to practice in and ease fear or stress.
I see VR as a way to help individuals visualize themselves in serene locations…something that meditation apps do through words.
That’s a great idea. I use Calm to guide my meditations, and being able to see the scene and feel immersed would be amazing, instead of having to look down at my screen to see it.
There are certain methods of treatment, such as EMDR, that are prime candidates for AR/VR.
I would love to see an app that lets users get in touch with a trusted friend when they encounter a triggering situation or feel themselves going into a headspace where they’d benefit from some external support. Right now there are apps that alert an emergency contact if the user hasn’t checked in after their walk home at night; maybe an app where you tap a button when you feel a panic attack or something coming on, so a friend knows they need to get in touch.
As far as next applications, I could definitely see something like mavenclinic.com showing up, but for getting in touch with mental health professionals on demand.
There might be something to haptic feedback as a way to treat mental health issues. Certain touch feedback could be helpful, by analogy with treatments that already exist (e.g., massage, aromatherapy, acupuncture, weighted blankets).
In tech now we have mental health trackers, text reminders, phone calls, and video calls. If I had a bet to place, it would be that remote mental health care will become far more powerful.
For acute cases, this also seems like an area of potential for VR — the immersiveness could help create a break from problematic thinking.
Really interesting points about acute cases, but I wonder: does that still rely too much on someone self-identifying an acute case, versus technology being able to predict or anticipate one…
Do others see any opportunities for personal assistants like Google Home or Alexa to help impact mental health since technically they are always listening? What if Google Home could sense when someone might be having a panic attack and could trigger a guided meditation – helpful or creepy?
Carl, maybe if it was a separate application…personal assistants like Google Home and Alexa often exist in our living rooms and bedrooms... I think it might be a bit uncomfortable if the same applications we use to command our TVs and listen to music were used for this purpose. The other thing to think about is that being an active listener to someone who has mental health issues means being able to empathize with what they’re going through…. There are also some nonverbal cues that are important.
Does anticipatory technology become too much of an intrusion?
I think that might depend a bit on how accurate that design is—is this thing calling a therapist every time my heart rate goes up because I’m doing exercise? That would feel super intrusive and inconvenient. Or is it legitimately able to filter out the noise and only give you what you need when you need it? I think I would really appreciate that.
I think that, in general, relying too much on anything can become a crutch and stop being helpful. With some mental health issues, learning techniques you can practice on your own is really key. There is always the chance that things break, you lose internet, something malfunctions, etc. You need techniques that don’t rely on all of that.
There’s an inherent danger there that’s different from more low stakes tasks — AI predicting you might want to add laundry detergent to your grocery list, if it fails, doesn’t really matter that much. If AI gets emergency support wrong the consequences could be more substantial.
The other issue is that people with mental health issues have slightly different needs…it might be difficult to suggest or implement one outcome that works for all.
Alright, great stuff from everyone—think we’ll wrap it up there! Thanks!