The Internet seemed to be blowing up earlier this week on Amazon Prime Day, when big deals (and some not-so-big ones) were offered at the online shopping hub. One of the biggest was on the tabletop gadget known as Amazon Echo, whose voice-recognition capabilities let you ask things like "Hey, Alexa, tell me XXX!"
Yep, you can count me as one who took advantage of that deal and bought myself an Amazon Echo.
This gives me a chance to try the voice-enabled diabetes tools concept that more and more in our patient community seem to be embracing, especially those in the do-it-yourself #WeAreNotWaiting subset.
My plan is to soon hook up my Nightscout data-sharing to be able to ask directly, "Hey Alexa, what's my blood sugar trend?" So instead of having to look at my Dexcom receiver or Pebble Watch with Nightscout data (yes, I still use my Pebble and it does still work), I'll be able to just shout out a question and get the answer.
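For the tinkerers wondering what that hookup involves: Nightscout exposes recent CGM readings through a REST API (entries carry an `sgv` glucose value in mg/dL and usually a `direction` trend string like "Flat" or "FortyFiveUp"), so the heart of an Alexa skill is just turning the latest entry into a spoken sentence. Here's a minimal sketch of that response-building step -- the function name and the exact wording are my own illustration, not code from any official skill:

```python
# Map Nightscout trend "direction" strings to spoken phrases.
TREND_PHRASES = {
    "DoubleUp": "rising fast",
    "SingleUp": "rising",
    "FortyFiveUp": "rising slightly",
    "Flat": "steady",
    "FortyFiveDown": "falling slightly",
    "SingleDown": "falling",
    "DoubleDown": "falling fast",
}

def speak_bg(entries):
    """Turn a list of Nightscout entries (newest first) into a sentence
    an Alexa skill could speak back. Each entry is a dict with at least
    an 'sgv' (glucose in mg/dL) and usually a 'direction' trend string."""
    if not entries:
        return "I couldn't find any recent glucose readings."
    latest = entries[0]
    trend = TREND_PHRASES.get(latest.get("direction", ""), "trending unknown")
    return f"Your blood sugar is {latest['sgv']} and {trend}."

# Example with made-up data shaped like a Nightscout API response:
print(speak_bg([{"sgv": 112, "direction": "Flat"}]))
# -> Your blood sugar is 112 and steady.
```

In a real skill, the entries list would come from a fetch of your Nightscout site's `/api/v1/entries.json` endpoint inside the skill's request handler.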
Ah, the beauties of living in the First World and having access to such tech toys!
Whether this voice tech will become a staple or is just a craze is yet to be determined, but as of now it's a niche solution that gives some of us one more option to interact with our diabetes data (for better or worse).
Alexa Diabetes Challenge Finalists Announced
In case you missed it, just this week the "Alexa Diabetes Challenge" finalists were announced after the challenge launched a few months back. It's focused on developing new talking D-tech for those with type 2 diabetes specifically. This Merck-sponsored contest supported by Amazon Web Services and New York-based innovation consultancy Luminary Labs is a multi-stage competition offering $125,000 in prize money for new voice-enabled solutions to "improve the lives of those with T2D."
A total of 96 submissions came in from across the US, from research institutions and software companies to startups and healthcare providers (HCPs). We asked if there were any patients, aka PWDs (people with diabetes), who submitted ideas, and Luminary Labs told us this:
"Yes, we did receive submissions from teams that were either led by or had team members who are living with type 2 and type 1 diabetes. Since the criteria called for a type 2 focus, the submissions did not speak to specifically targeting type 1 patients. However, some discussed the potential of their solution to be adapted to T1."
Anyhow, a panel of judges narrowed the list down to five finalists who'll move on to the final round:
- DiaBetty, University of Illinois: A virtual diabetes educator and at-home coach that is sensitive and responsive to a patient’s mood. It provides patients with context-dependent, mood sensitive, and emotionally aware education and guidance, enhancing patient skills for self-management.
- My GluCoach, HCL America, Inc.: A holistic management solution, developed in partnership with gaming company Ayogo, that blends the roles of voice-based diabetes teacher, lifestyle coach, and personal assistant to serve the individual and specific needs of the patient. It leverages health pattern intelligence from sources such as patient conversations and wearable and medical devices.
- PIA: Personal Intelligent Agents for Type 2 Diabetes, Ejenta: A connected care intelligent agent that uses NASA-licensed AI technology integrated with IoT device data to encourage healthy habits, detect at-risk behaviors and abnormalities, and alert care teams. Of all the finalists, Ejenta's co-founder is the only one who specifically noted a personal D-connection: a daughter with type 1 diabetes.
- Sugarpod, Wellpepper: A multimodal solution that provides specialized voice, mobile, video, and web interactions to support patient adherence to comprehensive care plans. It offers education, tips, and tracking tools, including a smart foot scanner, which uses a classifier to identify potential abnormalities.
- T2D2: Taming Type 2 Diabetes Together, Elliot Mitchell, Biomedical Informatics PhD Student at Columbia University, and team: A virtual nutrition assistant that uses machine learning to provide in-the-moment personalized education and recommendations as well as meal planning and food and glucose logging. Its companion skill authorizes caregivers to connect with a patient’s account to easily engage from afar.
Of course without knowing more specifics of these ideas or the people behind them, it's tough to judge how effective they might be for daily use in the real world. But naturally, Amazon Web Services has high hopes.
“Voice technology like Amazon Alexa can dramatically improve user experience by providing the ability for people to interact with devices at a more personal level," says Steve Halliwell, Director of Healthcare and Life Sciences at Amazon Web Services, Inc. "These finalists showcase how one day people may use Alexa skills and fully integrated AWS services to create new healthcare scenarios."
The five finalists now move on to the next round, dubbed the Virtual Accelerator -- a process offering them expert mentorship as they continue to develop their proposals. Luminary Labs, the consultancy organizing this challenge, will host an Innovators' Boot Camp for the finalists at Amazon's Seattle headquarters at the end of July. From there, the finalists will present their concepts to the judges in New York City in September, and a winner will be selected for the $125,000 grand prize in October.
Will Talking Tech Take Over?
Voice-enabled diabetes tools are of course critical for expanded accessibility for PWDs with visual or physical impairments, and we're already starting to see that implemented; in June, One Drop announced that its slick chrome meter is now integrated with Amazon Echo for tracking BGs, food, and physical activity.
But for the rest of us, as I alluded to back in March when first writing about talking D-tech, it's cool in a sense... but it's also very niche right now and likely to remain that way for some time.
Really, it's unclear whether voice-activation on the whole is the wave of the future, or just trendy in the moment.
Right now, I don't find Alexa any more actionable than Siri or Google. Sure, you can ask it for a carb count or a BG number. It can blink reminders, but people may not notice the blinking lights, and the voice recognition isn't perfect -- so you, the person living with diabetes, must still engage and enunciate. Sometimes it feels like just one more chore we're being asked to handle as patients. An expensive toy?
On the other hand, maybe in 10-20 years everything will be run by voice-recognition technology: our home appliances, our cars, and yes, even our health and medical treatments.
We'll just have to wait and see what the future holds...
(Hey Alexa, what's the future of Talking D-Tech look like?)