Student suicide rates due to depression are increasing, a problem made worse by COVID-19.

What it does

Mental health is a complex issue, and articulating how one feels can be extremely difficult even to a trained professional who can see facial expressions, nuanced inflections in tone, and body language. Compared to that, using a chatbot may seem superficial. Yet there is an important issue in these relationships that digital mental health care is uniquely poised to address: stigma.

People sometimes feel too ashamed to open up honestly about their struggles to another person, even when that person is an accepting, nonjudgmental therapist. In particular, out of fear that therapists will view them negatively, clients may engage in what social scientists call impression management: selectively sharing only information that portrays them in a positive light. By withholding information, consciously or not, clients prevent therapists from fully understanding their problems and, in turn, from providing appropriate solutions. Unfortunately, this behavior is especially likely during the first consultation, which sets the direction for all future sessions.

With AI-powered virtual humans that look exactly like humans, it becomes possible to have the best of both worlds: simulating human connection through technology while designing interfaces that mimic body language, demonstrate engaged listening, express empathy, reciprocate by sharing personal stories, employ emotional intelligence, and make people feel cared for. These advances may bridge the divide between human and virtual therapists and empower people to get the help they need but would not otherwise seek for fear of stigma. Technology enables people to bypass this barrier.
Consider a fascinating study in which over 200 participants talked with a virtual human—that is, an artificially intelligent avatar who asked questions as a therapist would during an initial clinical interview and developed rapport through compassion (such as saying "I'm sorry to hear that") and nonverbal behaviors (such as nodding).

What's next for Ira

For the challenge, we will build self-assessment questions to track symptom severity; later, we will build a complete low-level intervention using CBT techniques.
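As a rough sketch of what the severity-tracking step could look like, the snippet below scores a PHQ-9-style self-assessment (nine questions, each answered 0–3) and maps the total to a standard severity band. The function name and structure are illustrative assumptions, not code from the Ira project; only the PHQ-9 cut-off bands themselves are standard.

```python
# Hypothetical sketch of severity tracking from self-assessment answers.
# Assumes a PHQ-9-style questionnaire: nine items, each scored 0-3.
# Cut-off bands below are the standard PHQ-9 ranges; everything else
# (names, structure) is illustrative, not from the Ira codebase.

PHQ9_BANDS = [
    (0, 4, "minimal"),
    (5, 9, "mild"),
    (10, 14, "moderate"),
    (15, 19, "moderately severe"),
    (20, 27, "severe"),
]

def severity(answers):
    """Map nine 0-3 answers to a PHQ-9 severity band label."""
    if len(answers) != 9 or any(a not in range(4) for a in answers):
        raise ValueError("expected nine answers, each scored 0-3")
    total = sum(answers)
    for low, high, label in PHQ9_BANDS:
        if low <= total <= high:
            return label

print(severity([1, 1, 2, 0, 1, 1, 0, 2, 1]))  # total 9 -> "mild"
```

In a chatbot flow, the answers would come from the user's responses to the self-assessment questions over the course of a session, and the band could decide whether to continue with low-level CBT exercises or escalate to a human professional.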

Try it out



cbt, cgi, machine-learning, natural-language-processing, unity, vision, voice

Devpost Software Identifier