The Human OS

Users sit in two WHILL NEXT wheelchairs at Haneda Airport in Japan

Self-Driving Wheelchairs Debut in Hospitals and Airports

Autonomous vehicles have a new member in their ranks: the self-driving wheelchair. This summer, two robotic wheelchairs made headlines, one at a Singaporean hospital and another at a Japanese airport.

A man wearing a robotic exoskeleton stands in front of the fridge; the Amazon Echo sits on the counter nearby.

How a Paraplegic User Commands This Exoskeleton: "Alexa, I'm Ready to Walk"

For a person who’s been navigating the world in a wheelchair, it’s a pretty powerful thing to be able to issue the commands: “Alexa, I’m ready to stand up,” and “Alexa, I’m ready to walk.”

Bionik Labs’ ARKE is the latest robotic exoskeleton that enables paraplegic people to rise to their feet and walk using their paralyzed legs. And it’s the first to integrate the hardware with the Amazon Echo platform, allowing exoskeleton users to control the device with simple voice commands addressed to Amazon’s Alexa, the virtual assistant used in home automation. 

ARKE is still a prototype; the company hopes to bring it to market within the next few years. Company cofounder and COO Michal Prywata says the ARKE’s voice-control system will set it apart from the few exoskeletons that are already available to consumers, because it will make the ARKE easy to use in the home. “There’s a huge opportunity for us in the home exoskeleton market, which hasn’t been tapped into yet,” he says.

With the voice commands, users can trigger the exoskeleton’s basic actions—standing up and sitting down, walking and stopping—and tweak parameters like stride length.

They can also ask for information by saying, for example: “Alexa, what’s my battery status?” And because Amazon’s platform uses natural language processing, users don’t need to speak a precise sequence of words. A user could also say, “Alexa, how much power is left in the battery?” and get the same information. Prywata sees that flexibility as a big advantage: “The user doesn’t have to think about it too much,” he says.
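
To make the plumbing concrete, here is a minimal sketch of what an Alexa skill backend for such a device could look like. The intent names, the ExoskeletonLink class, and the battery reading are invented for illustration; this is not Bionik Labs' actual implementation, only the standard Alexa request and response shape.

```python
# Hypothetical sketch of an Alexa skill backend for an exoskeleton.
# Intent names and the ExoskeletonLink class are invented for illustration;
# this is not Bionik Labs' actual implementation.

class ExoskeletonLink:
    """Stand-in for whatever link (e.g., Bluetooth) carries commands to the device."""
    def send(self, command: str) -> None:
        print(f"-> exoskeleton: {command}")

    def battery_percent(self) -> int:
        return 82  # placeholder reading

exo = ExoskeletonLink()

# Alexa's cloud service resolves many phrasings ("what's my battery status?",
# "how much power is left?") to a single intent before it reaches this code.
INTENT_ACTIONS = {
    "StandUpIntent": lambda: exo.send("stand_up"),
    "SitDownIntent": lambda: exo.send("sit_down"),
    "WalkIntent":    lambda: exo.send("walk"),
    "StopIntent":    lambda: exo.send("stop"),
}

def lambda_handler(event, context):
    """Entry point receiving a request in the standard Alexa JSON format."""
    intent = event["request"]["intent"]["name"]
    if intent == "BatteryStatusIntent":
        speech = f"Battery is at {exo.battery_percent()} percent."
    elif intent in INTENT_ACTIONS:
        INTENT_ACTIONS[intent]()
        speech = "Okay."
    else:
        speech = "Sorry, I didn't catch that."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The key point is that the natural-language work happens on Amazon's side: many phrasings resolve to one intent before the request ever reaches the device, which is why users don't have to memorize exact wording.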

A robotic exoskeleton is positioned on a bench in a kitchen; the Amazon Echo device is on the counter behind it
Photo: Bionik Media

The other exoskeletons on the market are the Ekso GT, which is intended only for use in hospitals and rehab clinics; the ReWalk, which is approved for both clinical and at-home use; and the Indego, also approved for both the clinic and the home. These devices are programmed and controlled by various types of buttons and interfaces, or have sensors that detect when users shift their weight forward to trigger a stand-up or a step. The ARKE also includes such sensors to trigger steps.
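
For a sense of how a weight-shift trigger might work, here is a minimal sketch; the sensor reading and threshold values are assumed for illustration, and no vendor's actual firmware looks like this.

```python
# Minimal sketch of a weight-shift step trigger. The center-of-pressure
# reading and the threshold are invented for illustration.

FORWARD_SHIFT_THRESHOLD_CM = 3.0  # how far the center of pressure must move

def should_trigger_step(baseline_cop_cm: float, current_cop_cm: float) -> bool:
    """Return True when the user has leaned far enough forward."""
    return (current_cop_cm - baseline_cop_cm) >= FORWARD_SHIFT_THRESHOLD_CM

# Example: standing baseline at 0 cm; user leans 3.5 cm forward.
if should_trigger_step(0.0, 3.5):
    print("trigger step")
```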

Bionik Labs’ existing products are upper body robotics used primarily in stroke rehab, and they’re intended only for the clinic, not the home. The company plans to bring its ARKE exoskeleton to rehab clinics, but not initially in the United States. “We haven’t targeted the United States because ReWalk and Ekso have been there for quite some time already on the clinical side,” says Prywata. “So Asia is our target market there.” 

But Prywata says the at-home market is still wide open in the United States, because “no one’s done it properly yet.” He says 60 percent of the company’s R&D is now devoted to a new exoskeleton optimized for home use, which would be cheaper and lighter. Such a device would be intended not only for people with paralysis due to a spinal cord injury or stroke, but also for people with mobility problems stemming from disorders like multiple sclerosis.

Exoskeletons could even become everyday helpers for the elderly, he says. “An exoskeleton for the aging population could have much broader use: anyone that has some sort of disability in walking that’s severe enough to require a walker or two canes,” Prywata says. 

A close-up of the top portion of a robotic exoskeleton, showing its backpack-like straps
Photo: Bionik Media

To integrate the ARKE with the Amazon Echo platform, the Bionik Labs team got everything they needed from Amazon’s software developer kit. But to take the next step—to get medical regulators to approve their exoskeleton as a commercial product suitable for at-home use—the team will likely need direct input from Amazon, Prywata says.

The whole system, including the voice-control features, will need to meet certain medical safety standards, and Prywata doesn’t think Amazon’s Alexa has previously been approved by medical regulators. “I’m assuming this is the first time a medical device has been integrated into their platform,” he says. To be certain, maybe he should ask Alexa. 

An illustration shows a transparent human head in profile with a brain tumor highlighted in red.

IBM Watson Makes a Treatment Plan for Brain-Cancer Patient in 10 Minutes; Doctors Take 160 Hours

In treating brain cancer, time is of the essence.

A new study, in which IBM Watson took just 10 minutes to analyze a brain-cancer patient’s genome and suggest a treatment plan, demonstrates the potential of artificially intelligent medicine to improve patient care. But although human experts took 160 hours to make a comparable plan, the study’s results weren’t a total victory of machine over humans.

The patient in question was a 76-year-old man who went to his doctor complaining of a headache and difficulty walking. A brain scan revealed a nasty glioblastoma tumor, which surgeons quickly operated on; the man then got three weeks of radiation therapy and started on a long course of chemotherapy. Despite the best care, he was dead within a year. Both Watson and the doctors analyzed the patient’s genome to suggest a treatment plan, but by the time tissue samples from his surgery had been sequenced, the patient had declined too far to benefit.

IBM has been outfitting Watson, its “cognitive computing” platform, to tackle multiple challenges in health care, including an effort to speed up drug discovery and several ways to help doctors with patient care. In this study, a collaboration with the New York Genome Center (NYGC), researchers employed a beta version of IBM Watson for Genomics.

IBM Watson’s key feature is its natural-language-processing abilities. This means Watson for Genomics can go through the 23 million journal articles currently in the medical literature, government listings of clinical trials, and other existing data sources without requiring someone to reformat the information and make it digestible. Other Watson initiatives have also given the system access to patients’ electronic health records, but those records weren’t included in this study.

Laxmi Parida, who leads the Watson for Genomics science team, explains that most cancer patients don’t have their entire genome (consisting of 3 billion units of DNA) scanned for mutations. Instead they typically do a “panel” test that looks only at a subset of genes that are known to play a role in cancer.

The new study, published in the journal Neurology Genetics, used the 76-year-old man’s case to answer two questions. First, the researchers wanted to know if scanning a patient’s whole genome, which is more expensive and time-consuming than running a panel, provides information that is truly useful to doctors devising a treatment plan. “We were trying to answer the question, Is more really more?” says Parida.

The answer to that question was a resounding yes. Both the NYGC clinicians and Watson identified mutations in genes that weren’t checked in the panel test but which nonetheless suggested potentially beneficial drugs and clinical trials. 

Secondly, the researchers wanted to compare the genomic analysis performed by IBM Watson to one done by NYGC’s team of medical experts, which included the treating oncologist, a neuro-oncologist, and bioinformaticians.

Both Watson and the expert team received the patient’s genome information and identified genes that showed mutations, went through the medical literature to see if those mutations had figured in other cancer cases, looked for reports of successful treatment with drugs, and checked for clinical trials that the patient might be eligible for. It took the humans “160 person hours” to formulate recommendations, while Watson got there in 10 minutes. 

However, while Watson’s solution was first, it might not have been best. The NYGC clinicians identified mutations in two genes that, when considered together, led the doctors to recommend the patient be enrolled in a clinical trial that targeted both with a combinatorial drug therapy. If the patient had still been healthy enough, he would have been enrolled in this trial as his best chance of survival. But Watson didn’t synthesize the information together this way, and therefore didn’t recommend that clinical trial. 
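
To make the two workflows concrete, here is a toy sketch of the annotation logic, including the final combination step the doctors performed and Watson evidently did not. Every gene name, drug, and trial ID below is an invented placeholder; neither Watson's nor NYGC's actual pipeline is public in this form.

```python
# Toy sketch of a variant-annotation workflow: match mutated genes against a
# knowledge base of drugs and trials. All entries here are invented examples.

MUTATED_GENES = {"GENE_A", "GENE_B", "GENE_C"}

# Single-gene associations, standing in for what literature mining surfaces.
SINGLE_GENE_HITS = {
    "GENE_A": ["drug_X (case reports)"],
    "GENE_B": ["trial NCT-0001 (single agent)"],
}

# Combination rules: a pair of mutations jointly pointing at one trial.
# This synthesis step is what the doctors did and Watson reportedly missed.
COMBINATION_HITS = {
    frozenset({"GENE_A", "GENE_B"}): ["trial NCT-0002 (combinatorial therapy)"],
}

def recommend(mutated_genes):
    recs = []
    for gene in sorted(mutated_genes):
        recs += SINGLE_GENE_HITS.get(gene, [])
    for pair, hits in COMBINATION_HITS.items():
        if pair <= mutated_genes:  # both mutations present together
            recs += hits
    return recs

print(recommend(MUTATED_GENES))
# ['drug_X (case reports)', 'trial NCT-0001 (single agent)',
#  'trial NCT-0002 (combinatorial therapy)']
```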

While it’s tempting to view the study as a competition between human and artificial intelligence, Robert Darnell, director of the NYGC and a lead researcher on the study, says he doesn’t see it that way. “NYGC provided clinical input from oncologists and biologists,” he writes in an email. “Watson provided annotation that made the analysis faster. Given that each team addressed different issues, this comparison is apples to oranges.”

Doctor Robert Darnell sits behind his desk at the New York Genome Center
Photo: IBM
Robert Darnell, a lead researcher on the study, says doctors need AI tools like Watson to keep up with the data deluge in medicine.

IBM’s Parida notes that the cost of sequencing an entire genome has plummeted in recent years, opening up the possibility that whole-genome sequencing will soon be a routine part of cancer care. If IBM Watson, or AI systems like it, are given swift access to this data, there’s a chance they could provide treatment recommendations in time to save the lives of people like the brain-cancer patient in this study.

Darnell says he hopes IBM Watson will become a routine part of cancer care because the amount of data that clinicians are dealing with is already overwhelming. “In my view, having doctors cope with the avalanche of data that is here today, and will get bigger tomorrow, is not a viable option,” he says. “Time is a key variable for patients, and machine learning and natural-language-processing tools offer the possibility of adding something qualitatively different than what is currently available.”

This study was part of a collaboration between IBM and the NYGC announced in 2014, which set out to study the genomics of a few dozen brain-cancer patients. Darnell says the team is now working on a paper about the outcomes for 30 patients enrolled as part of that larger study. 

It’s worth noting that not everyone is sold on the value of IBM Watson for health care: A recent Wall Street analyst report declared that the Watson effort is unlikely to pay off for shareholders. Even though it called Watson “one of the more mature cognitive computing platforms available today,” the report argued that Watson’s potential customers will balk at the cost and complications of integrating the AI into their existing systems. 

The report also called attention to a fiasco at the MD Anderson Cancer Center in Texas, in which an IBM Watson product for oncology was shelved—after the hospital had spent US $62 million on it. 

A photo illustration shows a laptop-sized device that monitors sleep patterns mounted to a window in a home office

AI-Enabled Device Emits Radio Waves to Wirelessly Monitor Sleep Patterns at Home

Around 50 million people in the U.S. suffer from sleep disorders. In order for physicians to diagnose these disorders, patients must spend a night in a sleep lab hooked up to electrodes and sensors, which can be unpleasant and nerve-racking.

MIT researchers have now come up with a way to wirelessly capture data on sleep patterns from the comfort of a patient’s home. Their laptop-sized device bounces radio waves off a person, and a smart algorithm analyzes the signals to accurately decode the patient's sleep patterns.
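
As a rough sketch of the signal-processing idea, one can turn the reflected radio signal into a spectrogram and label windows by their low-frequency, breathing-band power. Everything here, from the sample rate to the threshold rule, is an assumed stand-in; the MIT system actually relies on a trained machine-learning model.

```python
# Rough sketch: turn an RF reflection signal into spectrogram features and
# label sleep windows. Sample rate, band, and threshold rule are invented
# stand-ins; the real system uses a trained neural network.
import numpy as np
from scipy.signal import spectrogram

FS = 100.0  # assumed sample rate (Hz) of the demodulated RF signal

def sleep_windows(rf_signal: np.ndarray):
    """Yield (time, label) pairs from a crude band-power rule."""
    freqs, times, power = spectrogram(rf_signal, fs=FS, nperseg=1024)
    # Breathing shows up as slow periodic chest motion (~0.1-0.5 Hz band).
    band = (freqs >= 0.1) & (freqs <= 0.5)
    breathing_power = power[band].sum(axis=0)
    for t, p in zip(times, breathing_power):
        # Toy rule: stronger low-frequency power -> "asleep".
        yield t, "asleep" if p > np.median(breathing_power) else "awake"

# Example with synthetic data: a slow 0.25 Hz "breathing" oscillation.
t = np.arange(0, 60, 1 / FS)
signal = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)
for time, label in list(sleep_windows(signal))[:3]:
    print(f"{time:6.1f}s  {label}")
```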

A photo shows a close-up of a researcher holding a test tube containing the DNA strand used to insert malicious code into a sequencing program.

Researchers Embed Malware Into DNA to Hack DNA-Sequencing Software

Researchers at the University of Washington have shown that changing a little bit of computer code in DNA sequencing software can make a computer vulnerable to malware embedded in a strand of DNA.

In a related analysis, the group evaluated the security of 13 software programs commonly used for DNA analysis, and found 11 times as many vulnerabilities as are present in other types of software.

Lee Organick, a doctoral student at the University of Washington with a background in synthetic biology, says she hopes their results raise awareness among bioinformatics researchers about the poor security of this software [PDF].
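
The underlying trick is that DNA is just a data channel: a sequencer turns bases back into bits, which the analysis software then parses. Here is a minimal sketch of one common two-bits-per-base mapping; the UW team's exact encoding may differ.

```python
# Minimal sketch: decode a DNA string into bytes, as a sequencer-plus-software
# pipeline effectively does. The A/C/G/T-to-bits mapping is one common
# convention; the UW researchers' exact encoding may differ.

BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def dna_to_bytes(seq: str) -> bytes:
    """Pack each base into 2 bits, 4 bases per output byte."""
    assert len(seq) % 4 == 0, "pad to a multiple of 4 bases"
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        out.append(byte)
    return bytes(out)

# Eight synthesized bases decode to two bytes of ASCII; a malicious payload
# aimed at a buggy parser rides the same way.
print(dna_to_bytes("CGGACGGC"))  # b'hi'
```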

A man wears a VR headset to play a mind-controlled game in virtual reality.

Startup Neurable Unveils the World's First Brain-Controlled VR Game

Imagine putting on a VR headset, seeing the virtual world take shape around you, and then navigating through that world without waving any controllers around—instead steering with your thoughts alone.

That’s the new gaming experience offered by the startup Neurable, which unveiled the world’s first mind-controlled VR game at the SIGGRAPH conference this week. 

In the Q&A below, Neurable CEO Ramses Alcaide tells IEEE Spectrum why he believes thought-controlled interfaces will make virtual reality a ubiquitous technology.

Neurable isn’t a gaming company; the Boston-based startup works on the brain-computer interfaces (BCIs) required for mind control. The most common type of BCI uses scalp electrodes to record electrical signals from the brain, then uses software to translate those signals into commands for external devices like computer cursors, robotic limbs, and even air sharks. Neurable designs that crucial BCI software.
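
As a generic illustration of what that software layer does, the sketch below extracts a band-power feature from a short EEG window and maps it to a command. The sample rate, frequency bands, and decision rule are invented; Neurable's actual pipeline is proprietary machine learning.

```python
# Generic sketch of a BCI software layer: extract a band-power feature from
# an EEG window and map it to a command. Thresholds and rates are invented;
# Neurable's actual pipeline is proprietary.
import numpy as np

FS = 250  # assumed EEG sample rate (Hz)

def band_power(window: np.ndarray, lo: float, hi: float) -> float:
    """Power in the [lo, hi] Hz band via a simple FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1 / FS)
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

def decode(window: np.ndarray) -> str:
    # Toy rule: stronger alpha (8-12 Hz) than beta (13-30 Hz) -> "select".
    return "select" if band_power(window, 8, 12) > band_power(window, 13, 30) else "idle"

# Example: one second of synthetic 10 Hz "alpha" activity decodes to "select".
t = np.arange(0, 1, 1 / FS)
print(decode(np.sin(2 * np.pi * 10 * t)))
```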

Anthrax under the microscope

AI Makes Anthrax Bioterror Detection Easier

In the wake of the 9/11 terrorist attacks, separate terror incidents involving letters laced with anthrax killed five Americans and sickened 17 in what the FBI describes as the worst biological attacks in U.S. history. Detection of such lethal anthrax spores could get a speed boost from artificial intelligence that has learned to identify the telltale patterns of the dangerous bacteria within microscope images.

Photo shows a smiling teenage girl with an award ribbon from a science fair.

Teenage Whiz Kid Invents an AI System to Diagnose Her Grandfather's Eye Disease

When 16-year-old Kavya Kopparapu wasn’t attending conferences, giving speeches, presiding over her school’s bioinformatics society, organizing a research symposium, playing piano, and running a non-profit, she worried about what to do with all her free time.

It was June 2016, the summer after her junior year in high school, and Kopparapu was looking for a new project that would use her computer science skills. Her thoughts quickly turned to her grandfather, who lives in a small city on India’s eastern coast.

In 2013 he began showing symptoms of diabetic retinopathy, a complication of diabetes that damages blood vessels in the retina and can lead to blindness. Eventually he was diagnosed and treated, but not before his vision deteriorated. Still, he was lucky: Although treatments such as medication and surgery can stop or even reverse eye damage if the disease is caught early, most patients never receive care.

Two images of a human retina, one with healthy blood vessels and the other showing signs of diabetic retinopathy
Image: National Eye Institute
Kopparapu's system can distinguish between a healthy retina (left) and one showing signs of diabetic retinopathy (right).

Kopparapu knows the statistics by heart: Of 415 million diabetics worldwide, one-third will develop retinopathy. Fifty percent will be undiagnosed. Of patients with severe forms, half will go blind in five years. Most will be poor.

“The lack of diagnosis is the biggest challenge,” Kopparapu says. “In India, there are programs that send doctors into villages and slums, but there are a lot of patients and only so many ophthalmologists.” What if there were a cheap, easy way for local clinicians to find new cases and refer them to a hospital?

That was the genesis of Eyeagnosis, a smartphone app plus 3D-printed lens that seeks to change the diagnostic procedure from a 2-hour exam requiring a multi-thousand-dollar retinal imager to a quick photo snap with a phone.
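
Eyeagnosis reportedly relies on a convolutional neural network trained on retinal images. The sketch below shows the general shape of such a binary classifier in PyTorch; the architecture, layer sizes, and class labels are assumptions for illustration, not Kopparapu's actual network.

```python
# General shape of a binary retinal-image classifier, sketched in PyTorch.
# Layer sizes and class names are assumptions for illustration, not
# Kopparapu's actual network.
import torch
import torch.nn as nn

class RetinaNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, 2)  # healthy vs. retinopathy

    def forward(self, x):          # x: (batch, 3, 224, 224) fundus photo
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = RetinaNet()
fundus = torch.randn(1, 3, 224, 224)           # stand-in for a phone photo
probs = model(fundus).softmax(dim=1)
print(dict(zip(["healthy", "retinopathy"], probs[0].tolist())))
```

In practice, a pretrained backbone fine-tuned on labeled fundus photos would replace the tiny feature stack shown here.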

Dean Kamen, dressed in a black button-down shirt, speaking at a lectern

Dean Kamen Opens Organ-Building Institute

On Friday afternoon, New Hampshire Governor Chris Sununu bounded onstage in Manchester like a high school quarterback at a pep rally. “You guys excited?” boomed Sununu. “You should be! This is awesome.”

Sununu was one of a parade of state and federal dignitaries lavishing praise and congratulations on inventor Dean Kamen at the official launch of BioFabUSA, a public-private partnership meant to bring together the technologies needed to create human organ factories. “We are here today on the birth of an entire new industry,” exclaimed Sununu.

Illustration: ACS Sensors

3D-Printed "Earable" Sensor Monitors Vital Signs

Fitness-tracking wristbands and bracelets have mostly been used to count steps and monitor heart rate and other vital signs. Now engineers have made a 3D-printed sensor that can be worn on the ear to continuously track core body temperature for fitness and medical needs.

The “earable” also serves as a hearing aid. And it could be a platform for sensing several other vital signs, says University of California Berkeley electrical engineering and computer science professor Ali Javey.

Core body temperature is a basic indicator of health issues such as fever, insomnia, fatigue, metabolic functionality, and depression. Measuring it continuously is critical for infants, the elderly, and those with severe medical conditions, says Javey. But wearable sensors available today in the form of wristbands and soft patches monitor skin temperature, which can change with the environment and is usually different from core body temperature.

Body temperature can be measured using invasive oral or rectal readings. Ear thermometers measure infrared energy emitted from the eardrum and are easier to use than more invasive devices. That’s the route Javey and his colleagues took for their earable sensor, reported in the journal ACS Sensors.
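
For a sense of the underlying physics, an infrared thermometer infers temperature from the thermal radiation a surface emits. The sketch below simply inverts the Stefan-Boltzmann law; real ear thermometers use calibrated, sensor-specific models, and the emissivity and flux values here are illustrative.

```python
# Simplified sketch of infrared thermometry via the Stefan-Boltzmann law.
# Real ear thermometers use calibrated, sensor-specific models; the
# emissivity and the measured flux below are illustrative values.

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.98     # human tissue is close to an ideal blackbody

def temperature_from_flux(radiant_flux_w_m2: float) -> float:
    """Invert M = emissivity * sigma * T^4 to get temperature in kelvin."""
    return (radiant_flux_w_m2 / (EMISSIVITY * SIGMA)) ** 0.25

# Example: ~513 W/m^2 corresponds to roughly body temperature (~310 K, 37 C).
t_kelvin = temperature_from_flux(513.0)
print(f"{t_kelvin:.1f} K = {t_kelvin - 273.15:.1f} C")
```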
