Crowdsourcing the AHA Chain of Survival with Amazon’s Alexa

Researchers have taught Amazon’s Alexa and Apple’s Siri to recognize the unique auditory signature of agonal breathing that heralds early cardiac arrest


By Kelly Grayson

“Hal, call 911 and tell Lucas to start CPR.”

“I’m sorry, I’m afraid I can’t do that, Dave. Would you rather I open the pod bay doors instead?”

Researchers have taught Amazon’s Alexa and Apple’s Siri to recognize the unique auditory signature of agonal breathing that heralds early cardiac arrest, thus further outsourcing to artificial intelligence the first link in the AHA Chain of Survival. (Image/Brothers Park)

“Fine, I’ll just have Siri call 911. Nancy Magee has the Pulse Point app, and she’s just in the next room. She’ll be in here doing compressions on me in the next 30 seconds.”

“Without your iPhone, Dave, you’re going to find that rather difficult.”

Then I woke up.

On a side note, I learned it’s best not to have 2001: A Space Odyssey playing in the background when I research articles. And if you’re under age 50 and have no idea what I’m talking about, kids, hit Netflix and fill that gaping hole in your pop culture education.

Recently, I told you how the Series 4 Apple Watch has incorporated fall detection and arrhythmia recognition into its suite of capabilities. Now, researchers have taught Amazon’s Alexa and Apple’s Siri to recognize the unique auditory signature of agonal breathing that heralds early cardiac arrest, thus further outsourcing to artificial intelligence the first link in the AHA Chain of Survival.

I just hope that whatever AI detects my cardiac arrest hasn’t read my early Facebook posts and decided that the human race is better off without me.

Smart speakers get smarter

Using recorded agonal breathing audio from actual cardiac arrests, a set of audio filters to screen out white noise, pet noises, air conditioning and the like, and extensive sleep-lab recordings of hypopnea, snoring, central sleep apnea and obstructive sleep apnea, researchers were able to train their smart speakers to recognize agonal breathing with 98.62% accuracy from up to three meters away.
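The researchers’ actual model and training pipeline aren’t published here, but the general recipe they describe — windowed audio in, frequency-domain features out, a classifier trained to separate agonal breathing from ordinary sleep sounds — can be illustrated with a toy sketch. Everything below (the synthetic waveforms, the feature binning, the nearest-centroid classifier) is invented for illustration and is far simpler than what the study used:

```python
import numpy as np

def spectral_features(window, n_bins=16):
    """Coarse log-magnitude spectrum of one audio window -- a stand-in
    for the richer features a real agonal-breathing detector would use."""
    mag = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    bands = np.array_split(mag, n_bins)          # pool into coarse bands
    return np.log1p(np.array([b.mean() for b in bands]))

class NearestCentroid:
    """Minimal classifier: label a window by its closest class centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Synthetic stand-ins: "agonal" = irregular low-frequency bursts,
# "normal" = a steady breathing-like tone. Real systems train on
# recorded 911-call audio, not toys like these.
rng = np.random.default_rng(0)
sr = 4000
t = np.arange(sr) / sr                            # one second of audio

def gasp():       # sparse, randomly gated low-frequency burst
    return np.sin(2*np.pi*60*t) * (rng.random(t.size) < 0.1) \
           + 0.05*rng.standard_normal(t.size)

def steady():     # smooth periodic tone plus a little noise
    return 0.5*np.sin(2*np.pi*250*t) + 0.05*rng.standard_normal(t.size)

agonal = [spectral_features(gasp()) for _ in range(40)]
normal = [spectral_features(steady()) for _ in range(40)]

X_train = np.array(agonal[:30] + normal[:30])
y_train = np.array([1]*30 + [0]*30)               # 1 = agonal-like
X_test  = np.array(agonal[30:] + normal[30:])
y_test  = np.array([1]*10 + [0]*10)

clf = NearestCentroid().fit(X_train, y_train)
acc = (clf.predict(X_test) == y_test).mean()
print(f"held-out accuracy on toy data: {acc:.2f}")
```

On these deliberately easy synthetic classes the toy model separates the two sounds almost perfectly; the hard part of the real study was doing it amid snoring, hypopnea, and household noise, which is exactly what the filtering and sleep-lab training data were for.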

The researchers recognize the limitations of their study, namely that roughly half of cardiac arrests display no agonal breathing, and that their research relied on a relatively small sample size of cardiac arrest recordings.

Still, more data will likely result in improvements in diagnostic accuracy across a broad variety of environments and demographics, and the technology already exists to crowdsource that second link in the Chain of Survival: early CPR. With smartphone apps like Pulse Point and Pulse Point Verified Responder, bystander CPR is now possible not only in public spaces, but also in residences. It won’t be long before some tech geek manages to marry the arrhythmia-detecting features of the Apple Watch with Siri’s ability to recognize agonal breathing, automatically dial 911, unlock the front door and flash the porch lights for the responders, and then queue up your CPR compressions playlist and the latest ACLS algorithms.

Now all we need is for Tony Stark to build an automated CPR device that uses his nanobots to assemble around your still-warm corpse, start compressions, and shock you with an Arc Reactor-powered defibrillator array.

New from Medtronic and Stark Industries, it’s the Lucas XV, Iron Man Edition!


Copyright © 2019 ems1.com. All rights reserved.