Social Forums Offer Data Insights for AI-Driven Cannabis Rehabilitation

Researchers from CSU-Pueblo’s Institute of Cannabis Research are using discussion forums and artificial intelligence to create social support for rehabilitation.
Since the turn of the 21st century, through the information super-highway, we have made people halfway across the world our “Friends,” digitized our thoughts, feelings, and ideas, and put an immense amount of data online. Now ingrained in nearly all our daily activities, social media has, without a doubt, changed the way we live.
In addition to simple social functions, many people are turning to online support groups in place of more traditional face-to-face support groups for self-help and rehabilitation. Funded by the Institute of Cannabis Research at CSU-Pueblo, Yuan Long, an associate professor at the Hasan School of Business, and her colleague Kuangyuan Huang, together with a team of student researchers, began investigating how online social networks can impact those seeking help with quitting cannabis or any other substance.
In an interview with Cannabis Tech, Long stated, “The project stemmed from the ICR as an academic project to explore what kind of social support a quitter needs, analyze the data, and develop it for a practical meaning.”
AI Harnesses the Power of Our Words
Language is emotive, and certain words trigger particular emotional responses. Throughout literary history, authors have learned to manipulate the written word to stimulate an emotional response in the reader. In fact, emotional connotation is the backbone of modern content marketing schemes.
As such, when users participate in online discussion forums like Reddit and Quora, the type of language they use may indicate the emotional state of the person behind the keyboard. Using a word bank that categorizes words by positive or negative connotation, Long and her team compiled and analyzed the types of support people searched for in online forums and the impact of the responses they received.
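The team’s actual lexicon and pipeline are not published here, but the word-bank idea can be sketched in a few lines of Python. The word lists and the scoring function below are purely illustrative assumptions, not the researchers’ code:

```python
# Illustrative word banks: real lexicons (e.g., LIWC-style categories)
# are far larger and curated by researchers.
POSITIVE = {"better", "proud", "support", "hope", "glad"}
NEGATIVE = {"insomnia", "anxious", "relapse", "struggle", "worse"}

def sentiment_score(message: str) -> int:
    """Return (# positive words) - (# negative words) in a message."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    pos = sum(1 for w in words if w in POSITIVE)
    neg = sum(1 for w in words if w in NEGATIVE)
    return pos - neg

# A forum post with two positive-connotation words scores +2.
print(sentiment_score("Day 10 and I feel so much better, really proud"))  # → 2
```

Tracking such scores across a thread over time is one simple way to observe whether an initiator’s emotional tone rises after receiving responses.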
First, they defined the different roles within the discussion forums. The initiator is the person who started the thread in the first place, while those who commented in the thread are the respondents.
While it may seem logical to conclude that positive responses result in an increase in positive emotion, the team found that any response triggered a rise in emotion from the initiator. Whether positive or negative, the simple act of acknowledgment from respondents alone caused an increase in emotion.
Types of Support for Quitting Cannabis
Through this ongoing research, Long and her team collected data from online social networks and analyzed more than 1,000 messages in which people shared their experiences, sought help, or supported one another in public forums. Using qualitative and quantitative research methods, the researchers identified two main categories of support exchanged in these online groups.
- Informational Support – support that comes in the form of a suggestion, reference, or anecdotal story. For example, if the initiator posts that he/she is suffering from insomnia during withdrawal, the respondents may answer with suggestions that helped them, links to research about blue light at bedtime, or a recipe for an herbal tea. These are all examples of informational support.
- Emotional Support – sometimes support isn’t about fixing the problem, but just learning how to cope. In these instances, the response may not provide any practical information, but it can still encourage and uplift the initiator. An emotional support response regarding battling insomnia during withdrawal might include, “Hang in there, it does eventually get better!”
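To make the two categories concrete, here is a deliberately naive keyword heuristic for tagging a reply as informational or emotional support. The cue lists and the function are hypothetical; real classification would use far richer features (and, as Long suggests, machine learning):

```python
# Hypothetical cue words for each support category described above.
INFO_CUES = {"try", "suggest", "link", "research", "recipe", "study"}
EMOTION_CUES = {"hang", "proud", "believe", "better", "courage"}

def support_type(reply: str) -> str:
    """Tag a reply as 'informational', 'emotional', or 'unclear'."""
    words = {w.strip(".,!?").lower() for w in reply.split()}
    info = len(words & INFO_CUES)
    emo = len(words & EMOTION_CUES)
    if info == emo:
        return "unclear"
    return "informational" if info > emo else "emotional"

print(support_type("Hang in there, it does eventually get better!"))  # → emotional
print(support_type("Try this herbal tea recipe before bed"))          # → informational
```

A trained model replacing this heuristic is exactly the kind of component the AI chatbot Long envisions would need, so it can decide whether a question calls for practical advice or encouragement.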
While Long admits the research is ongoing, as the amount of time humans spend in the virtual world of social media continues to increase, the number of people seeking help online will likely continue to grow as well. Long envisions artificial intelligence and machine learning as a catalyst that could help people through addiction and withdrawal.
“Ideally, we’d like to design a mobile application, a computer bot with AI technology behind it,” she predicted. Elaborating, she continued, “By analyzing the questions and answers, we can teach the computer how to answer the different questions and generate an emotionally appropriate answer.”
With voice assistants like Alexa, Google Assistant, and Siri continuing to learn how to tell us jokes, order our groceries on demand, and even call for emergency assistance, it’s clear we already rely on AI for informational support. Foreseeing a day when AI learns how to provide emotional support is really not a stretch of the imagination.