An app that tells you when you’re depressed

Bloomberg

A Facebook message pops up on my phone screen. “What’s going on in your world?” It’s from a robot named Woebot, the brainchild of Stanford University psychologist Alison Darcy.
Woebot seems to care about me. The app asks me for a list of my strengths, and remembers my response so it can encourage me later. It helps me set a goal for the week — being more productive at work. It asks me about my moods and my energy levels and makes charts of them.
“I’ll help you recognize patterns because … (no offense) humans aren’t great at that,” Woebot tells me with a smirking smile emoji. So Woebot knows when I felt anxious and when I felt happy. But who else might know? Unlike a pedometer, which tracks something as impersonal as footsteps, many mental-health apps in development rely on gathering and analyzing information about a user’s intimate feelings and social life.
“Mental-health data is some of the most intimate data there can be,” said Adam Tanner, a fellow at Harvard University’s Institute for Quantitative Social Science.
Chatbots have existed since the 1960s — one was named after “Pygmalion” heroine Eliza Doolittle — but advances such as machine learning have made the robots savvier. Woebot is one of an emerging group of technological interventions that aim to detect and treat mental-health disorders. They’re not for everyone. Some people may prefer unburdening themselves to a human, and many apps are hindered by bugs and dogged by privacy concerns. Still, the new technologies may fill gaps in current treatment options by detecting symptoms earlier and acting as coaches for individuals who might otherwise never seek counseling.

WARNING SIGNS
Clinicians and privacy experts are welcoming these inventions with one hand while holding up warning signs with the other. Technology might be a powerful tool to improve treatment, but an emotional problem, once it becomes known, can affect insurance coverage, ruin chances of landing a job or color an employer’s perception. With possible changes coming to health-care law, it’s unclear whether pre-existing mental-health conditions could once again be used to charge people more for insurance or to deny them coverage. Privacy concerns aside, the promise of collecting such data is the ability to render a holistic picture of a person’s mental state, one more accurate than the infrequent assessments conducted in a doctor’s office.
