Steve Ardire on a Foolproof Way to Diagnose Mental Health


Trailblazer: Steve Ardire, CEO

The trail he's blazing: an AI-driven mental health assessment tool that uses tonal and emotional analysis to give therapists accurate, real-time patient insights. This article is a summary of the podcast interview.

Are you so overwhelmed with Covid-19 stress that you're considering getting help from a mental health professional? AI expert Steve Ardire believes the quality of traditional mental healthcare leaves much to be desired. His company is introducing a new AI-driven mental health assessment tool that will give industry professionals what he believes they need: accurate and objective data.

“Mental health tools don’t get deep enough to validate a clinician’s subjective analysis.”

Ardire’s solution couldn’t be more timely. Half of Americans say the coronavirus crisis is harming their mental health, and depression rates have tripled. In this interview, Ardire explains the limitations of emerging telehealth services and why clinicians need accurate assessments now more than ever.

Q: What are the limitations of telehealth services?

A: Whether you visit a doctor via telehealth or in-person, they only have a brief overview of your medical history to go on. You’re typically not getting an in-depth assessment. They will likely do a quick diagnosis and may prescribe a drug.

"Patients are often burdened with providing detailed explanations"

Video chat services are not helping the situation. Patients are often burdened with providing detailed explanations and must spend time writing and sending messages. And the generic feedback they receive may be of little help.

Q: Why are mental health problems hard to detect on video chat?

A: Mental health problems are hard to detect to begin with. There are several dimensions of mental illness. The DSM model is the gold standard for assessing them, but it tries to define mental illnesses in neat little boxes. Most clinicians would agree that few patients fit these models, because many symptoms overlap among different illnesses.

"Human behavior is very messy. Most of the activity in the brain is non-conscious."

A patient with anxiety could also have depression and suicidal tendencies. So it’s hard to create an accurate assessment over video chat. It is too subjective and there are too many variables.

Q: What mental health problems are doctors struggling to diagnose?

A: The most challenging clinical disorders include borderline personality disorder and manic depression. Manic depression, also referred to as bipolar disorder, follows a pattern of varying moods, with alternating episodes of mania and depression.

This is tough to treat. There are also people with varying degrees of this problem, which can result in anxiety and depression.

Q: What's wrong with current mental health tools?

A: Human behavior is very messy. Most of the activity in the brain is non-conscious. AI has its limitations, too. According to a McKinsey study, only 16% of companies have taken AI beyond the pilot stage. AI is very difficult to apply, especially for mental health assessments.

"Current mental health tools can't perform causal analysis."

A start-up I know of is using AI for sales enablement with psychographic and behavioral data. In fact, there are many AI platforms offering predictive risk assessments. These tools may increase accuracy, but they only answer the who, what, where, when, and how of a question – not the why.

The why is the hardest to achieve. It represents causal reasoning – in other words, common sense. Current mental health tools can't perform causal analysis. They handle the what and how, hoping that's enough to put a patient on the road to recovery. But they don’t go deep enough to validate their subjective analysis.

Q: Is there a better way to assess mental health?

A: We’ve developed something we call Contextual AI. It’s based on the idea that the human voice is rich in rhythm and pitch. When someone is anxious, stress can be detected through intonation. Our goal is to assess mental health through a combination of the spoken word and emotions.

"We've created an AI engine that can see what people are thinking in real-time."

Psychologists and clinicians are not trained to identify someone’s hidden micro-expressions. A good example is how it’s possible to fake your way through a lie detector test. That’s why video chat and a doctor's semi-objective opinion have limited effectiveness.

If there were a machine capable of parsing the spoken word and reading a person’s emotions, you could cut the guesswork. We've created an AI engine that can see what people are thinking in real time. It will enable more personalized and accurate treatment.

Q: How does it work?

A: Our belief is that multi-modal contextual AI is the most effective approach. Other solutions use tonal analysis only, but we’ve combined tonal and emotional analysis and included semantic classifiers that identify emotional state.
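To make the multi-modal idea concrete, here is a toy sketch of how a tonal signal and a semantic (text) classifier might be fused into one score. None of the function names, features, or weights below come from Ardire's actual product; they are assumptions chosen purely for illustration.

```python
# Hypothetical sketch of multi-modal fusion. The feature choices
# (pitch variability, keyword matching) and the 0..1 scaling are
# illustrative assumptions, not the product's real method.
from statistics import pstdev

def tonal_stress_score(pitch_hz):
    """Toy tonal feature: higher pitch variability -> higher score."""
    if len(pitch_hz) < 2:
        return 0.0
    # Map the standard deviation of pitch onto a 0..1 scale (assumed range).
    return min(pstdev(pitch_hz) / 50.0, 1.0)

# Hypothetical distress vocabulary for the toy semantic classifier.
NEGATIVE_MARKERS = {"hopeless", "worthless", "exhausted", "alone", "afraid"}

def semantic_distress_score(transcript):
    """Toy semantic classifier: share of words matching distress markers."""
    words = transcript.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in NEGATIVE_MARKERS)
    return min(hits / len(words) * 10, 1.0)

def combined_assessment(pitch_hz, transcript, tonal_weight=0.5):
    """Weighted fusion of the two modalities into a single 0..1 score."""
    return (tonal_weight * tonal_stress_score(pitch_hz)
            + (1 - tonal_weight) * semantic_distress_score(transcript))
```

A real system would replace both toy features with learned models, but the fusion step — weighting independent per-modality scores into one assessment — is the core of any multi-modal approach.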

"Telehealth companies will be better equipped to help anyone with an internet connection."

Our mental health assessments include social media questions, too. This is important for today’s youth who must address problems like cyber-bullying. Traditional mental health assessments don’t include this rich data.

We’re working with Vanderbilt's Neuroimaging and Brain Dynamics Lab to validate our Group Reflections product through fMRI. This will reveal how a patient’s brain is working while they are asked open and closed questions. The tool will cut the guesswork: mental health professionals will have deeper insights to better help their patients, and telehealth companies will be better equipped to help anyone with an internet connection. The goal is to put America and the world on the road to better mental health.

Learn about trailblazing entrepreneurs. Get podcast interviews sent to you by email. Subscribe at: TechnologyTrailblazers.Club.

Derek Little

CEO and Chief Podcasting Officer of TechnologyTrailblazers.Club
