— are increasingly being looked to as a way to help screen for, or support, people who are dealing with isolation, or mild
with a patient. It's an area garnering lots of interest, in part because of its potential to overcome the common financial and logistical barriers to care, such as those Ali faced. There is, of course, still plenty of debate and skepticism about the capacity of machines to read or respond accurately to the whole spectrum of human emotion — and the potential pitfalls when the approach fails.
Tekin says there's a risk that teenagers, for example, might attempt AI-driven therapy, find it lacking, then refuse the real thing with a human being. "My worry is they will turn away from other mental health interventions saying, 'Oh well, I already tried this and it didn't work,'" she says.

Still, proponents point to simpler uses. Someone dealing with stress in a family relationship, for example, might benefit from a reminder to meditate. Or apps that encourage forms of journaling might boost a user's confidence by pointing out where they make progress. It's best thought of as a "guided self-help ally," says Athena Robinson, chief clinical officer for Woebot Health, an AI-driven chatbot service.
There are plenty of people to chat to on the internet, you just have to know where to look. They’ll stay up with you all night.
The borg hive mind is not far off now
This sounds like the beginning of a bad scifi movie.
Lmao
Patient: “I just have this overwhelming sense of sadness.” Chatbot: “I’m sorry, could you repeat your ailment? Type 1 for depression, Type 2 for anxiety, Type 3 for something else.”
Nope
Is swearing considered talking? Because I do that to my computer all the time.
'Talk to a machine. That'll help....'
Just stop with this stuff. No.
What if it's computers that are driving you crazy?