New artificial intelligence app ‘reconstructs’ negative thoughts to treat depression

I can forgive people for being depressed these days. The future touted by world leaders is dire.

The UN Human Development Report 2021–2022 states that living standards have declined in 9 out of 10 countries around the world since 2020, with multiple crises “hitting us at the same time, interacting to create new threats,” making people feel insecure and unable to trust one another.

According to the United Nations, the solution is to recognize and treat “the global mental health crisis that undermines human development, and the polarization that deepens our divisions and makes collective response more difficult.”

Echoing the United Nations, Klaus Schwab, founder of the World Economic Forum (WEF), said the world today faces “unprecedented crises.” To address the “global mental health crisis,” the WEF’s UpLink program, a platform for supporting companies that advance the United Nations Sustainable Development Goals (SDGs), presented artificial intelligence (AI) as a cure for depression and dissent.

A company called Wysa spoke at the recent WEF Davos summit, demonstrating a phone app that uses AI to provide psychological counseling.

“This is a person who walks into the app and starts talking about how they are feeling, not necessarily about depression,” said Jo Aggarwal, Wysa’s India-based CEO, while showing an example of an AI text-therapy session. “The AI helps reconstruct what this person is thinking, and they open up. People open up to AI three times faster than to a human therapist.”

According to Aggarwal, the Wysa app now has about 5 million users in more than 30 countries. “232 people have written to me that they are only alive today because they found this app.”

According to Wysa, many companies, including Accenture and SwissRe, have chosen to use its app. So have schools.

“Teenagers were our first cohort,” Aggarwal said. “About 30% of our users are under the age of 25. We have a cutoff of 13+.”

Numerous attempts have been made to test and improve the program.

“We built it iteratively for three years,” Aggarwal said, adjusting the program as users raised concerns. Younger users in particular worried about the “power gap” created by the app: “I don’t want to reconstruct my negative thoughts, because they’re my only control in this situation.”

“So we worked with clinicians to change what else the app could say,” she said.

Preparing children’s minds

The program dovetails with another UN effort to shape children’s minds in support of the SDGs, called Social and Emotional Learning (SEL). SEL is now part of most public and private school curricula in the United States and other countries.

In a report titled “SEL for SDGs: Why we need social and emotional learning to reach the Sustainable Development Goals,” the UN claims that children suffer from “cognitive dissonance” when what they see around them contradicts the progressive ideologies presented by teachers, or when concepts such as systemic racism or intersectionality prove self-contradictory.

The United Nations says that for children, “dissonance is unpleasant” and aversively arousing “because contradictory perceptions prevent effective and consistent action.” In other words, cognitive dissonance might lead children to question UN-endorsed concepts and to reconsider taking action in support of the SDGs.

“The dual potential for dissonance undermines development goals by enabling compromise and inaction,” the report says, proposing “two concrete ways to manage dissonance and achieve the SDGs: emotional resilience and prosocial behavior.”

The psychological problems the United Nations sees don’t just trouble children. According to the WEF, they also undermine the productivity of “human capital.”

According to the World Health Organization (WHO), 12 billion working days are lost every year to depression and anxiety, costing the global economy about $1 trillion. The WHO adds that 15% of the global workforce has a mental disorder, with “bullying and psychological violence (also known as ‘mobbing’)” among the causes.

According to Wysa, global mental health is deteriorating at an alarming rate. One in eight people now suffers from a mental health disorder, diagnoses of major depressive disorder are up 25%, and 42% of surveyed employees report that their mental health has declined recently, with one-third saying they struggle with feelings of sadness and depression.

The WEF-supported pandemic lockdowns seem to be the biggest culprit. However, declining living standards caused by fuel and food shortages resulting from the WEF’s net-zero carbon emissions campaign are also important factors.

Brain data risks

Regarding the pros and cons of AI therapy, a report in Psychology Today notes the advantages that patients can get treatment anytime and at lower cost, and that “machine learning could lead to the development of new kinds of psychotherapy.”

On the downside, patients may worry that their data will be used for marketing, including targeted advertising, espionage, or other malicious purposes. There may even be concerns that the data could be hacked or held for ransom.

A WEF report titled “4 Ways Artificial Intelligence Can Improve Mental Health Therapy” says one of the ways AI is “helping” is by monitoring a patient’s progress, tracking what it calls “change talk active” and “change talk exploration” statements, in which the client contemplates how to move forward and make a change.

“Not hearing such statements during therapy is a warning sign that the therapy is not working,” the WEF wrote. The AI can also explore the language used by successful therapists, opening up opportunities for training other therapists in the field.

Questions from WEF attendees during the Wysa presentation included whether AI therapy apps could be programmed to promote specific “values, such as service and community,” and whether they could monitor patients and assess how they are suffering.

Aggarwal replied, “People started to lose their sense of safety when we analyzed their voices.”

“If you use their data to say, based on their phone calls, that they didn’t seem to sleep very well last night, they start to feel less safe,” she said. “So we stripped away all the cool AI and kept only what was needed to make this feel private and safe.”

Speech recognition may be added in the future, provided it can be done in a way the app’s owners deem “clinically safe.”

Aggarwal said Wysa has been working to create a truly equitable app, so that “people in sub-Saharan Africa can have the same access that people who work at Goldman Sachs have.” Some languages, such as French and German, have a “commercial track” for using the app, while others, such as Hindi, have a “non-commercial track.”

Aggarwal explained that she herself struggles with depression, which inspired her to create an app that can help others.

“I wanted something that could teach me how to restructure negative thoughts, all the evidence-based techniques, that I could feel supported by,” she said. “When you think of the AI, don’t think of it as a separate entity; think of it as a resource for processing things in your own head.”