Artificial intelligence can now more accurately detect whether you’re depressed by analyzing the sound of your voice, thanks to new research by University of Alberta computing scientists.
Using two standard benchmark sets of audio recordings ranging from five to 50 minutes long, Ph.D. student Mashrura Tasnim and professor Eleni Stroulia developed a method that combines several machine-learning algorithms to recognize depression more accurately from acoustic cues.
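The paper's code is not reproduced here, but the core idea of combining several machine-learning algorithms into one decision can be sketched with a simple soft-voting ensemble. Everything below is a hypothetical illustration, not the authors' implementation: `soft_vote`, the threshold of 0.5, and the per-model probabilities are all assumptions for the sake of the example.

```python
from statistics import mean

def soft_vote(probabilities, threshold=0.5):
    """Combine several models' depression probabilities for one
    voice recording by averaging them (soft voting) and applying
    a decision threshold. Returns 1 (depressed) or 0 (not)."""
    return int(mean(probabilities) >= threshold)

# Hypothetical outputs of three different classifiers,
# each trained on acoustic cues from the same recording.
probs = [0.62, 0.48, 0.71]
label = soft_vote(probs)  # averages to ~0.60, so label is 1
```

Averaging probabilities rather than taking a hard majority vote lets a confident model outweigh two uncertain ones, which is one common reason ensembles of this kind outperform any single algorithm.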
The researchers built on past studies suggesting the timbre of our voice contains information about our mood.
The ultimate goal is to develop helpful applications from the technology, Stroulia explained.
“A realistic scenario is to have people use an app that will collect voice samples as they speak naturally. The app, running on the user’s phone, will recognize and track indicators of mood, such as depression, over time,” she said.
“Much like you have a step counter on your phone, you could have a depression indicator based on your voice as you use the phone.”
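The step-counter analogy above can be sketched as a rolling indicator: each time the app scores a voice sample, the score joins a fixed-size window whose average is the current reading. The class name, window size, and 0-to-1 score scale are illustrative assumptions, not anything described in the study.

```python
from collections import deque

class MoodTracker:
    """Hypothetical sketch of an on-phone indicator: keep the most
    recent per-sample depression scores (0..1) in a fixed window
    and report their average, much as a step counter aggregates
    recent activity."""

    def __init__(self, window=7):
        # deque with maxlen silently discards the oldest score
        # once the window is full.
        self.scores = deque(maxlen=window)

    def add_score(self, score):
        self.scores.append(score)

    def indicator(self):
        # Average of the scores currently in the window,
        # or None before any sample has been scored.
        return sum(self.scores) / len(self.scores) if self.scores else None

tracker = MoodTracker(window=3)
for s in [0.2, 0.4, 0.9, 0.7]:   # hypothetical per-call scores
    tracker.add_score(s)
# Window now holds the last three scores: 0.4, 0.9, 0.7.
```

A rolling window keeps the indicator responsive to recent change while smoothing out any single noisy sample.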
About 11 percent of Canadian men and 16 percent of Canadian women will experience major depression at some point in their lives, according to the Government of Canada. And 3.2 million Canadian youth aged 12 to 19 are at risk for developing depression, according to the Canadian Mental Health Association.
Such a tool could help people reflect on their own moods over time or be used to work with mental health service providers, the researchers said.
“This work, developing more accurate detection in standard benchmark data sets, is the first step,” noted Stroulia.
The study, “Detecting Depression From Voice,” was presented at the Canadian Conference on Artificial Intelligence earlier this year.
Mashrura Tasnim et al., "Detecting Depression from Voice," Advances in Artificial Intelligence (2019). DOI: 10.1007/978-3-030-18305-9_47
Researchers improve AI that can tell from your voice if you’re depressed (2019, July 12)