How Artificial Intelligence Uses Voice and Vital Signs to Predict Couples' Fights


Suppose your relationship is on the rocks. You’ve been trying to work through your problems together in couples therapy, but you are starting to wonder whether the effort will be worth it. Will things get better or worse?

The recommendation may seem obvious: stop for a second and listen to your partner. Really listen. When we talk to each other, our voices carry all sorts of information that can reveal the answer. Subtle shifts in intonation, the pauses between phrases, the volume at which you speak – all of this conveys hidden signals about how you really feel.

Most of this we learn intuitively, and we use it to adjust the meaning of our words. Think about the difference between asking “What are YOU doing?” and “What are you DOING?”

This shift in emphasis is one of the most obvious ways of giving meaning to our speech. But there are many more layers we add without realizing it.

There is a way to extract this hidden information from our speech. Researchers have even developed artificial intelligence that can use it to predict the future of couples’ relationships. In tasks like this, the technology is already more accurate than professional therapists.

Therapists vs. algorithms

In a recent study, researchers followed 134 couples with relationship difficulties. Over two years, each couple took part in two 10-minute therapy sessions. Each partner chose a topic about the relationship that they considered important, and the two discussed it together. The researchers also recorded whether the couples’ relationships improved or worsened and whether they were still together two years later.

Trained therapists watched videos of the recordings. By assessing how the couples talked to each other, what they said, and how they appeared as they spoke, the therapists made a psychological judgment about the likely outcome of each relationship.

The researchers also trained an algorithm to analyze the couples’ speech. Earlier research gave the team clues about which features of human communication were likely to matter, such as intonation, how long each person spoke, and how individuals took turns talking. The algorithm’s job was to work out exactly how those features were linked to the strength of the relationship.

The algorithm relied purely on the sound recordings, without considering visual information from the videos. It also ignored the content of the conversations – the words themselves. Instead, it detected features such as cadence, volume, and how long each participant spoke.
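To make that concrete, here is a minimal sketch of how such prosodic features might be extracted from audio and fed to an off-the-shelf classifier. It assumes the librosa and scikit-learn Python libraries; the feature set and the model are illustrative placeholders, not the study’s actual pipeline.

```python
# Minimal illustration (not the study's pipeline): extract simple prosodic
# features from a recording and train a generic classifier on them.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def prosodic_features(wav_path):
    """Summarize pitch, loudness and voicing for one audio file."""
    y, sr = librosa.load(wav_path, sr=16000)
    # Fundamental frequency (intonation), via the probabilistic YIN tracker
    f0, voiced_flag, _ = librosa.pyin(y, fmin=75, fmax=400, sr=sr)
    f0 = f0[~np.isnan(f0)]
    # Frame-level energy as a rough loudness measure
    rms = librosa.feature.rms(y=y)[0]
    return np.array([
        f0.mean() if f0.size else 0.0,   # average pitch
        f0.std() if f0.size else 0.0,    # pitch variability (intonation range)
        rms.mean(),                      # average loudness
        rms.std(),                       # loudness variability
        float(np.mean(voiced_flag)),     # rough proxy for how much speaking there is
    ])

# Hypothetical usage: one feature vector per recorded conversation, with labels
# (1 = still together at follow-up, 0 = not) gathered as the study describes.
# X = np.vstack([prosodic_features(p) for p in recording_paths])
# model = RandomForestClassifier(n_estimators=200).fit(X, outcomes)
```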

Surprisingly, the algorithm also picked up features of speech that lie beyond human perception. These features are almost impossible to describe because we are not normally aware of them – such as spectral tilt, a complex mathematical property of the speech signal.

“Using a lot of data, we can find patterns that are indescribable to human eyes and ears,” says Shri Narayanan, an engineer at the University of Southern California in the United States who led the study.

After being trained on the couples’ recordings, the algorithm proved marginally better than the therapists at predicting whether couples would still be together. Its accuracy rate was 79.3%.

The therapists, who had the advantage of being able to understand the content of couples’ speech and observe their body language, reached 75.6% accuracy.

“Humans are good at decoding a lot of information,” says Narayanan. “But we cannot process all the information that is available.”

‘Leaking’ emotions

The idea is that we “leak” more information about our thoughts and emotions than we as humans are able to pick up on. And algorithms are not restricted to decoding only the voice features people consciously use to convey information. In other words, there are other “hidden” dimensions in our speech that artificial intelligence can access.

“One of the advantages of computers is the ability to find patterns and trends in large amounts of data,” says Fjola Helgadottir, a clinical psychologist at Oxford University. “Human behavior can give insight into the underlying mental processes,” she says.

“Machine learning algorithms, however, can do the hard work of sorting through it all, finding the pertinent information and making a prediction about the future.”

An algorithm that predicts whether or not your relationship is doomed to fail may not sound like the most appealing idea – especially since it is right only about 80% of the time. Such a prediction could itself change the course of your relationship and how you feel about your partner.

But deciphering the information hidden in the way we speak – and in how our bodies function – can be used to improve our relationships.

Argument thermometer

Theodora Chaspari, a computer engineer at Texas A&M University in the United States, has been developing an artificial intelligence program that can predict when conflicts are likely to escalate in a relationship. Chaspari and her colleagues collected data from unobtrusive sensors – such as a fitness bracelet – that 34 couples wore for a day.

The sensors measured sweat, heart rate and voice data, including tone of voice; the researchers also analyzed the content of what the couples said – whether they used positive or negative words. In total, 19 of the couples experienced some level of conflict during the day they wore the sensors.

Chaspari and her colleagues used machine learning to train an algorithm on the patterns associated with the arguments the couples reported having. After being trained on these data, the algorithm was able to detect conflicts in other couples using only the sensor data, with an accuracy of 79.3%.
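As a rough illustration of that training step – assuming a simple windowed feature table and an off-the-shelf model, since the team’s actual features and code are not described here – it might look something like this:

```python
# Rough illustration only: detect conflict windows from wearable-style features.
# The feature layout mirrors the article's description (sweat, heart rate, voice,
# word sentiment), but the data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# One row per time window per couple:
# [skin conductance, heart rate, pitch variability, loudness, positive-word score]
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)  # 1 = conflict reported in this window, 0 = none

model = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
model.fit(X, y)  # final model trained on all windows
```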

Now the team is developing predictive algorithms that it hopes to use to detect signs of a possible fight and warn the couple before the fight takes place.


The way it works is this: you’ve had a terrible day at work and a stressful meeting, and you’re on your way home. Your partner’s day was equally difficult. By monitoring both of your sweat levels, heart rates and the way you have each been talking over the last few hours, the algorithm would calculate the likelihood of a fight breaking out between you.

“At this point, we can intervene to resolve the conflict in a more positive way,” says Chaspari.
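The intervention step could then be as simple as thresholding that estimated probability. Below is a hypothetical sketch, reusing a trained model like the one above; the threshold value and the notification callback are illustrative assumptions, not part of the published work.

```python
# Hypothetical alerting step: nudge the couple only when the estimated
# probability of conflict for the latest sensor window crosses a threshold.
ALERT_THRESHOLD = 0.7  # illustrative value, not taken from the study

def maybe_send_alert(model, latest_window, notify):
    """latest_window: 1-D NumPy feature vector for the most recent time window.
    notify: callback that delivers a message to the couple's phones."""
    p_conflict = model.predict_proba(latest_window.reshape(1, -1))[0, 1]
    if p_conflict >= ALERT_THRESHOLD:
        notify(f"Tension seems to be rising (estimated risk {p_conflict:.0%}). "
               "This might be a good moment to take a short break.")
    return p_conflict
```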

This could be done simply by sending a message to the couple before an argument breaks out, says Adela Timmons, a psychologist on the project, based at the Center for Clinical and Quantitative Psychology for Children and Families at Florida International University in the United States.

“We believe we can make our treatments more effective if we can deliver them in people’s real lives, where they need them the most,” she says.

Pre-conflict intervention

The traditional model of therapy cannot meet this goal of immediate intervention. Typically, a session lasts an hour once a week, during which patients recall what has happened since the last session and discuss the problems that have arisen.

“The therapist cannot be there when someone really needs the support,” says Timmons. “There are many steps in the traditional process where any intervention may end up being less effective.”

But an automated alert based on continuous monitoring of people’s physiology and speech could fulfill the dream of real-time therapeutic intervention. It could also allow for a more standardized form of treatment, says Helgadottir.

“No one really knows what happens within the four walls of a therapy room,” says Helgadottir, who developed an evidence-based platform using AI to treat social anxiety. “Sometimes the most effective techniques are not being used because they require more effort on the part of the therapist. On the other hand, the clinical components of AI therapy systems can be completely open and transparent.

“They can be designed and reviewed by leading researchers and professionals in the field. Computers have no days off, and it makes no difference whether 1, 100 or 1,000 users are benefiting at the same time.”

However, there are risks. There is no guarantee that a message on your cell phone warning you or your partner that an argument is likely will not “add fuel to the fire.” The timing of the intervention is crucial.

“We probably do not want to intervene during a conflict,” says Timmons. “If people are upset, they will not be very receptive to requests on their phones to calm down. But if we can catch people just as an argument is beginning to gain momentum, before they have lost the ability to regulate their behavior – that is the ideal point for intervention.”

There are many technological hurdles to overcome before such an application can be deployed. The team needs to refine its algorithms and test their effectiveness on a wider variety of people. There are also big questions about privacy.

A breach of a device that stores data about your relationship with your partner would put a lot of sensitive information at risk. And what would happen to the data if there were an alleged crime, such as domestic violence?

“We have to think about how we would handle those situations and ways to keep people safe by protecting their privacy,” says Timmons. “These are broader social issues that we will continue to discuss.”

If this model of therapy proves truly successful, it could also open the door to similar ways of improving other kinds of relationships – in the family, at work, or between physicians and patients. The more of our body’s systems are monitored – from the movements of our eyes to muscle tension – the more can be revealed about the future of our relationships.

There may be many more layers of meaning, beyond our speech and basic physiological reactions, that machines can decode better than we can.
