“We saw a sharp increase in mass shootings in the US in recent years, and that can cause increases in the need for mental health services,” says Yang Cheng, assistant professor of communication at North Carolina State University and first author of the study in the Journal of Contingencies and Crisis Management. “And automated online chatbots are an increasingly common tool for providing mental health services—such as providing information or an online version of talk therapy.

“But there has been little work done on the use of chatbots to provide mental health services in the wake of a mass shooting. We wanted to begin exploring this area, and started with an assessment of what variables would encourage people to use chatbots under those circumstances,” Cheng says.

The researchers conducted a survey of 1,114 US adults who had used chatbots to seek mental health services at some point prior to the study. They gave study participants a scenario in which there had been a mass shooting, and then asked them a series of questions pertaining to the use of chatbots to seek mental health services in the wake of the shooting.

The survey was nationally representative, and the researchers controlled for whether participants had personal experience with mass shootings.

The researchers identified several variables that drove people to chatbots to address their own mental health needs. For example, participants valued how fast and easy chatbots were to access, and they thought chatbots would be good sources of information. The researchers also found that participants felt it was important for chatbots to be humanlike, because they wanted the chatbots to provide emotional support.

But the researchers were surprised to find that an even stronger motivation for using chatbots was to help other people who were struggling with mental health issues.

“We found that the motivation of helping others was twice as powerful as the motivation of helping yourself,” Cheng says.

Read the full article about mental health by Matt Shipman at Futurity.