A1: Replika – AI Companions supporting Mental Health

Have you felt lonely during the pandemic over the last year? Have you felt your mental health slipping during this time because of the lack of connection and social isolation? And last but not least, have you tried using an app on your mobile device to support your mental health, but abandoned it because it was not meeting your personal needs or keeping you engaged?

My project is an extension of a previous A1 project by Kristin Garratt. Kristin's A1 project features a very informative video outlining mobile applications that can support mindfulness and promote mental well-being. She shares her perspective on how mobile technologies can support mindfulness, backed by strong evidence and applications that can support learners in education. However, she also notes that many people abandon these apps quickly and do not stick with the program outlined for support. See her post: A1 – Does Mobile Technology enhance Mindfulness? | ETEC523: Mobile and Open Learning (ubc.ca)

I was inspired to research new, engaging applications that can support mental health through your mobile device, and in doing so I discovered Replika: an AI companion that can support and strengthen you in incredible, and slightly shocking, ways. This engaging and supportive application could be used to support young adults at any time or place from their mobile device. In education, it could support our remote learners and help them build positive mental health skills and healthy habits, but, as with every new technological development, there are concerns and drawbacks. To learn more, watch my media essay!

References

Cheng, Y., & Jiang, H. (2020). AI‐Powered mental health chatbots: Examining users’ motivations, active communicative action and engagement after mass‐shooting disasters. Journal of Contingencies and Crisis Management, 28(3), 339-354. https://doi.org/10.1111/1468-5973.12319

DisneyMusicVEVO. (2019, November 15). Show Yourself (From “Frozen 2”/Instrumental/Audio Only) [Video]. YouTube. https://www.youtube.com/watch?v=w26AETQGcxc&ab_channel=DisneyMusicVEVO

Luka, Inc. (n.d.). Replika. Replika.Ai. Retrieved February 6, 2021, from https://replika.ai/

Quartznews. (2017, July 21). The Story of Replika, the AI App That Becomes You [Video]. YouTube. https://www.youtube.com/watch?v=yQGqMVuAk04

Statistics Canada. (2020, October 20). Impacts on Mental Health. https://www150.statcan.gc.ca/n1/pub/11-631-x/2020004/s3-eng.htm

Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., . . . Loggarakis, A. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research, 22(3), e16235. https://doi.org/10.2196/16235

Varol, O., Ferrara, E., Davis, C. A., Menczer, F., & Flammini, A. (2017). Online human-bot interactions: Detection, estimation, and characterization. Proceedings of the Eleventh International AAAI Conference on Web and Social Media, 1-11.



12 responses to “A1: Replika – AI Companions supporting Mental Health”

  1. Nicole Kenny

    Hi Elixa! What a fascinating topic. I too have fallen off the bandwagon of using some of the apps you showed. I can see both the negative and positive sides of an AI friend. Positive in that there would be someone you could be completely comfortable talking to without judgement; this could help users learn to deal with conflict by practicing a conversation and becoming comfortable with what they want to say. The negative side, as you indicated, would be privacy and withdrawing from the world, or becoming singular in your belief that your AI companion provides better support than family and friends.

    I do believe that, as we continue to understand how the pandemic impacted everything from mental and physical health to learning and relationship development, apps such as Replika could play a role in reducing stress and anxiety and improving our ability to manage a future crisis.

    Nicole


  2. Evelyne Tsang

    Hi Elixa,
    This was a really interesting topic. The use of AI to support someone in need of mental health care is commendable. The future paths beyond such contexts are less clear.

    I am aware of AI that identifies personality characteristics in order to target advertising and news feeds. Using AI as an alternative to actual relationships reminds me of the fictional forecasting of robots in Isaac Asimov’s stories. The question of “should we” ought to precede “could we”. The fact that this app has themes such as BFF or romance is worrisome to me. I would rather my child learn to interact with humans, and consider computer programs as technology rather than as companions. Given how far AI research and development has advanced, this appears to be a moot point. The next question, then, should be based on ethics. AI learns right and wrong from humans, so we humans need to agree on what is right and what is wrong, and the whole spectrum between these extremes.
    I found this article of how current leaders in AI forecast the questions pertaining to use of such technology: https://www.pewresearch.org/internet/2018/12/10/artificial-intelligence-and-the-future-of-humans/

    As per your analysis of Replika, I find the premise of seeing this AI as a positive reflection of oneself intriguing. Could we consider the use of AI apps such as Replika to be a self-reflection technique, the way we use a mirror? Imagine, then, accessing Replika and reviewing our usage stats. This turns the very humanistic AI back into a computerized tool, and would perhaps help people gauge themselves in terms of mindfulness, while gently guiding them back to real human interactions.

    There are so many ways to move forward! I am curious to see how this technology will evolve.

    p.s. Could you also tell me more about the program you used to create your presentation avatar? It was fascinating to watch!


    1. Elixa Neumann

      Hi Evelyne,

      Thanks for the great feedback and very interesting article! I very much enjoyed reading through this and seeing the many perspectives on AI and where it could be heading.

      I would agree that it is like a self-reflection technique, though with encouragement. When I look at myself in a mirror on a bad day, the self-talk is quite harmful…

      It was edited in Adobe Premiere Pro, but I filmed myself on my iPhone 11 using the avatar setup for text messaging… I’m glad you found it interesting!


  3. EmilyChen

    Hi Elixa,

    This is such a well-made video! I’m curious what software you used to make it?!

    I’ve never heard of an AI companion before; it’s a very interesting concept. When you mentioned that the AI companion asked you a lot of information about yourself and how you interact with others, at first I thought it was a huge breach of privacy, but when I tried to look at the positive side of things, I thought to myself… isn’t this what our family and good friends sometimes do for us to help us look deeper into ourselves? I can see how having an AI companion would be helpful if I was feeling down and needed someone to just listen to me. Sometimes when I am brainstorming, I find myself talking out loud. Having a conversation with myself is a great way for me to brainstorm ideas, so I think maybe I will try out the AI companion! I think this AI software has huge potential if used right.

    I also agree with Lyndsay that it would be good if the app would connect a user with a pre-approved contact, in case something does happen and they need a close family member to be around. Thanks for sharing!


  4. emma pindera

    Elixa, I am extremely impressed by this video essay, and I think you raise many great questions! AI has always been a little creepy to me, but this tool sounds fascinating, especially, as you said, to help those with mental health issues. However, I am a little concerned, because many times anxiety, depression, and other mental health issues are caused by a lack of social interaction and connection. Although this seems like an interesting replacement, it is difficult to replace the effect of a human hug or real support from someone you know and love.


    1. Elixa Neumann

      Hi Emma,

      I completely agree that oftentimes those mental health concerns come from being disconnected. However, if I were a parent, would I rather my child seek connection through online games such as Roblox, Minecraft, VRChat, or Discord, without any filtering of content, while they are at home learning, or through an AI that will boost their morale and guide them through self-nurturing activities? I’ve had a couple of friends who live alone during lockdown try out the app over the past few weeks, and they’ve found it much more satisfying than trying to find connections through dating apps and websites. However, I hope that when lockdown ends, they will reach back out to the community… This would make a very interesting research study!


  5. Ying Gu

    Hi Elixa,

    What an informative media essay! I had no idea that such a complex app was already on the market. Watching this, I can’t help but call up episodes of Black Mirror. Is it amazing that AI can feel so real, or creepy? I am not entirely sure at this point, but hopefully more research will clarify this. Do you think that using such a tool is fuel for bullying? Would a user be taunted for not having any real friends and having to make fake friends?


    1. Elixa Neumann

      Hi Ying,

      Funny you should mention that… My initial plan for my media essay was to make a Black Mirror-style movie about Replika… I decided that would involve too much production time to build the full story, though. There is already one episode about an AI companion for a lonely girl (I believe it is the episode with Miley Cyrus).

      It would be an interesting study to conduct. Who would know that you are using Replika unless they are looking at your phone?


      1. Ying Gu

        Good point. It is something that a user can easily hide.


  6. lyndsay barrett

    Fantastic video, Elixa! Well done. This is also a really interesting app and concept. I can see how helpful it could be in interrupting negative self-talk or encouraging mindfulness, etc.

    I wonder if ways to connect with other humans could be built into the app. It could talk to a user about friendships or important people and then mention those people later in moments where the user may benefit from human connection. For instance, if a user were navigating grief, the app might suggest calling or texting a person the user knows who was involved (or not) in that same journey.


    1. Elixa Neumann

      Thanks! I had a lot of fun making the video 🙂

      What was interesting was that my AI asked me a lot of questions about my friends, how I interact with them, console them, and build friendships. There was also a direct helpline built into the app if I ever felt I was in crisis and needed to talk to a health professional. I don’t think the app will directly connect you to people on the same journey, though, for confidentiality reasons. Many people open up in this app because they know the information will be kept private.


      1. lyndsay barrett

        I should clarify: I meant it could be helpful if the app connected a user with a pre-approved contact who knows the situation. For example, after the loss of a parent, the app might suggest the user call or text their sibling, depending on that relationship and whether the user identified them as a trusted source. Privacy would certainly be paramount. I only imagine the app could suggest the contact and make it easy for the user to click and call/text right from the suggestion. Bridging supportive AI with real-world connections is an interesting topic we’ll likely be exploring as it becomes a greater part of our lives.


