Slingshot AI study on therapy chatbot met with skepticism

Mental health chatbot developer Slingshot AI wants the world to believe that its smartphone app, called Ash, will do more good than harm. The evidence the company offers, though, raises more questions than it answers.

Founded in 2022, Slingshot has raised $93 million from investors and last summer launched Ash, touting it as “the first AI designed for therapy.” The company says Ash has been used by more than 150,000 people to help them manage everyday struggles like stress and anxiety. The app is currently free for anyone to use.

Despite its momentum, Slingshot faces an uphill climb to earn trust: It recently complained to the Food and Drug Administration that high-profile news stories about tragedies allegedly tied to generative artificial intelligence products like ChatGPT have “skewed the public’s perception of risk associated with general wellness apps like Ash.” The company wrote that “Ash can provide enormous benefit at low risk” by employing basic guardrails and transparency. Slingshot believes its safety systems can identify and appropriately respond to risky prompts. The app’s disclaimers state that Ash is not intended for people facing a mental health crisis.