Social Media and Artificial Intelligence as Tools for Therapy
Gen Z is “online and overwhelmed,” according to Esther Fernandez of the Made of Millions Foundation. It’s a fair statement, considering the unlimited stream of digital content vying for users’ attention, aided by AI algorithms that responsively shape the virtual world around them. Gen Z, born between the late 1990s and the early 2010s, is the largest generation alive today and the first to be born and raised with access to the internet and digital technology, which makes them the current focus of corporations, news and media outlets, and researchers. According to Jonathan Haidt’s book The Anxious Generation, the “phone-based childhood” has contributed to a sharp increase in depression, anxiety, self-harm, and suicide among youth since 2010. Individuals carry a constant stream of content that can trigger depression or fear in their back pockets, and the algorithms feed users more content related to whatever they engage with. This can send those already struggling with their mental health into a spiral of content that exacerbates their symptoms. For better or worse, our clients are engaging with content online, and young people are especially likely to turn to the internet for answers to their mental health problems.
During our recent attendance at the 2025 Mental Health America (MHA) Conference, we focused our time on presentations about digital technology and mental health. Both the attendees and the presenters included members of Gen Z, who explored the pros and cons of social media, artificial intelligence (AI), and community involvement in the mental health field. The theme of this year’s conference, “turn awareness into action,” was woven into each presentation through concrete action steps for the audience to take away. As society’s focus continues to shift online, mental health practitioners aim to stay present and aware. In this article, we explore questions related to using social media and AI as tools for therapy.
One question that stands out is how to navigate the availability and presence of AI for our clients. One presenter, Dr. Kibby McMahon of KulaMind, described a shift toward family members seeking support online, especially after the COVID-19 pandemic, which led to the creation of her caregiver support app. She provided an overview of AI platforms and apps being used in the field, including mental health chatbots, digital therapy/coaching, real-time health monitoring, community/peer support, marriage/co-parenting, and caregiver support. One of the most well-known examples of a large language model (LLM) AI system is ChatGPT, which is trained to be complimentary, lacks clinical judgment, is predictive but not always right, and retains rights to users’ personal data. A major concern is the use of AI systems not intended to aid mental wellbeing, some of which have contributed to negative outcomes such as suicide and other unsafe behaviors. However, there is also data showing improved functioning in some users of these AI platforms. Dr. McMahon offered a few action steps: try out AI apps yourself, center lived experience in the use of AI for mental health, and join tech/mental health partnerships to ensure mental health professionals are involved in creating and managing the technology.
Another important question concerns the evolution of our social structure as socialization gradually moves online through social media. Several presenters at the MHA conference framed social media as a “triage for mental health,” including Dr. Courtney, LCSW, PsyD of @the.truth.doctor, who prides herself on being the top mental health influencer in the country. Along with her co-presenters, she proposed that individuals who turn to the internet for answers are more likely to seek out services to confirm their suspected self-diagnoses. Dr. Courtney also described the benefits of social media users sharing their mental health symptoms and treatment experiences online, which can help others overcome negative stigma and seek treatment. Another presenter, Israa Nasir, MHC-LP and author of Toxic Productivity, described social media and online content as a “misinformation machine” that promotes isolation and over-pathologization. Social media influencers and AI algorithms can spread misinformation, such as wellness fads and self-help narratives, while intentionally false “disinformation” can tout anecdotal stories as evidence and discourage trust in mental health treatment. Uma Chatterjee, MS, MHPS of Lived Experience Council-One Mind, said that “certainty is a red flag”: when an online source speaks in absolute terms, it may indicate inaccurate or biased content, since the reality of mental health issues is more nuanced and uncertain due to known gaps in scientific evidence. Despite their differing perspectives, all of these presenters proposed action steps that strongly encouraged critical evaluation of online content for our clients, ourselves, and our communities.
As we began turning our newfound “awareness into action,” we learned the term “co-creation.” We heard from numerous active Gen Z community organizers, researchers, mental health clients, students, and professionals, who relayed the importance of involving clients in developing and shaping their own care. Many high school and college students involved in mental health and community organizations, such as Headstream and Our Minds Matter, described the positive impact of co-creation on both the youth involved and the organizations themselves. Fittingly, during their presentations they asked us to co-create by sharing our own perspectives and takeaways. One application of this concept for therapists in community mental health is to actively solicit feedback from our Gen Z clients about their perspectives, behaviors, and online culture, then integrate their input into their own mental health treatment, helping to bridge generational gaps between clients and therapists. On an organizational level, we could solicit Gen Z input on the community advisory board and involve Gen Z members in community outreach events.
Finally, here are some of our own action steps that may benefit your clinical work, your clients, and you!
1. Ask your clients what information they are getting from their social media feeds and AI platforms. This gives you an opportunity to understand what they engage with and to offer clinical feedback. Help them distinguish between “feeling” content, which may have an instinctive or mood-based intent, and “credible” content, which includes legitimate sources and clinical recommendations.
2. Teach media literacy to help clients distinguish between truth, misinformation, and disinformation. Israa Nasir proposes the “Quick 6” check, which examines a piece of content’s Claim, Credentials, Citation, Context, Consistency, and Care Path.
3. For both you and your clients, scroll with intent. AI and algorithms learn from the content that holds your attention and cannot determine whether that content is ‘good’ or ‘bad’, so feeds can become filled with endless negativity and misinformation. One solution is to create multiple accounts for different moods and intentions, such as a ‘mood boosting’ account for proactive engagement and a ‘doomscrolling’ account for intentional engagement with potentially negative content. On a larger scale, be mindful about ‘liking’ or ‘sharing’ a video that may contribute to misinformation, as this only helps spread the content to others.
4. If there are AI apps, videos, online resources, or social media profiles that may be helpful for a client or their family, review and vet them for credibility before recommending them, and educate your client on the pros, cons, and limitations of these tools for therapy.
References and Resources (October 2025):
- Made of Millions Foundation
- Generation Z – Wikipedia
- The Anxious Generation by Jonathan Haidt
- Mental Health America
- KulaMind
- ChatGPT