Wearables Bring a Host of Tricky Legal Questions, Says Tech Attorney

Can your employer force you to wear a tracker? Can you replace your natural limbs with prosthetics?


A man in a plaid shirt reclines in a chair, pointing to a wearable on his wrist.
Photo: Getty Images

Every new technology needs a good lawyer. For wearables, that’s tech attorney Kraig Baker, a Seattle-based partner in the firm Davis Wright Tremaine. When companies need to think through issues like data privacy for wearable trackers or the capabilities of smart textiles and sports devices, he’s the guy to call. 

Baker talked with IEEE Spectrum about the legal questions that will come up as wearables—including “embeddables” and “ingestibles”—become intimate parts of our lives. He sees these technologies as the latest step in a long trend toward human-computer integration. First, desktop computers gave way to laptops, bringing computing into our homes; now laptops are giving way to mobile phones, bringing computing into our pockets and “our lives in motion,” he says. Next, wearables will bring computing onto and into our bodies. 

“The thing that's unique about wearables: You have something on your body that's collecting a very intimate kind of data.”

IEEE Spectrum: Thinking about the legal issues of wearables, the first things that come to mind are data privacy and security. How do these issues play out in wearables? 

Kraig Baker: In general, consumers are comfortable with people using their data in ways they expect. If my fitness tracker uses my data to make suggestions about my workouts, I may be interested in that. On the other hand, if I find out that it’s taken all my running routes and used them in a way I didn’t expect, I may get angry. 

We’ve seen that repeatedly with data privacy. But the thing that’s different and unique about wearables: You have something on your body that’s collecting a very intimate kind of data. Companies like Amazon are not collecting information about my sleeping habits or the salt in my sweat. 

Spectrum: So when you talk about wearables, you’re thinking way beyond fitness trackers, right?

Baker: People are using exoskeletons to help them do things; people will have hearing aids that do live translation of foreign languages. The FDA just approved an ingestible medical device that you swallow. And researchers are working on medical devices that embed trackers in your body. The intrusiveness and the scope of what they can do will expand over time. 

Spectrum: How are wearables being used by employers? 

Baker: This is one of the biggest areas of growth for wearables. A company might decide to put monitors on their truckers to see if they’re tired, or to see if their pilots have alcohol in their systems. Construction workers might want to wear exoskeletons so they’re not subject to overuse syndrome (repetitive strain injuries). 

A headshot of Kraig Baker.
Kraig Baker is a tech attorney and partner in the firm Davis Wright Tremaine.
Photo: Courtesy of Kraig Baker

Spectrum: What legal issues come up when wearables are used on the job? 

Baker: The biggest issues are transparency and informed consent. The consent issue is tricky: Can you have informed and reasonable consent if the employer just says, “This is my policy”? In terms of what the employer can demand, it probably varies on a state-by-state and country-by-country basis.

The other big issue is ensuring a narrow use of the data collected—that it’s only being used in the way employers say they’ll use it. That’s something I call the honey pot problem: Now that the employer has this trove of data, how do we ensure they don’t use it for something impermissible?

Spectrum: The U.S. military has funded a lot of research into wearables, including projects on electrical stimulation of the nervous system to speed up learning and to improve the performance of warfighters and intelligence analysts who stare at screens all day. Do you think these issues of consent will play out in the military first? 

Baker: You have fewer rights in the military; they can just tell you to wear it. But the bigger issue is long-term health impacts and how that relates to conditions of employment. If the military is trying to create a hypervigilant fighter pilot out on a special ops mission, can they require the pilot to use certain devices, even though they know—or ought to know—that there are long-term physical or mental consequences to using those devices?

Spectrum: We’ve been talking about wearables used on the job. But can employers also use wearables to supervise the activities of employees when they’re off the clock?

Baker: We’ve seen this question come up in pro athletics. Some contracts have clauses like, “You can’t ride a motorcycle during the off-season.” But can they make you wear a monitor all the time and then ask, “You were up till two in the morning. Were you tossing and turning because you’re nervous about the big game, or were you down in the casino playing blackjack?”

Spectrum: Are unions starting to raise these issues when they negotiate contracts? 

Baker: This has come up in the collective bargaining in pro sports leagues. Can you mandate that this is worn, and if you can, who owns the data? If the team owns the data, how can they use it? If the player is traded halfway through the season, does the data go with the player or stay with the old team, and can they use it for competitive advantage?

For other unionized workers, it’s certainly not the first negotiation point; they’re more concerned about pension, retirement, medical benefits. But this notion of being tracked, particularly by something I wear on my body, makes people uncomfortable. Employees want to feel they have autonomy. If you tell me I have to sit in this office or at this desk, I understand that as a typical condition of employment. But if you say, I have to wear this thing...

Spectrum: Do you expect to see a lot of pushback? 

“I think wearables for augmentation bring the biggest, most interesting, and most legally scary questions.”

Baker: It depends. What is the narrative that develops around this? Maybe we’ll find out that two or three companies overreached, that they tracked their employees and discovered personal information that they wouldn’t want shared. And then people won’t trust the technology. Or maybe it will be, “Oh my gosh, this technology just saved our lives because this chip disclosed that we had a drunk pilot.” In which case, people say, “That’s terrific.” You never know what will go viral.

Spectrum: We’ve been talking mostly about whether people can be compelled to use wearable technologies. Will we also see cases where people are demanding the right to use wearables? 

Baker: I think wearables for augmentation bring the biggest, most interesting, and most legally scary questions. Maybe in order to augment my performance now, I’m willing to use a device that will take time off my life later. Under what circumstances am I considered a rational actor who can make my own choices?

I heard a talk by the director of MIT’s biorobotics lab (Hugh Herr). He has prosthetics because he was in a climbing accident many years ago. He now has four or five sets of prosthetic legs, and he can climb better with his new legs. What do we do with the first person who says, “I want to be a cyborg, I want someone to cut me off at the knees because then I’ll be able to perform better at my job or my passion”? Maybe it will be someone who can be a better concert pianist because they can use their brain to direct their prosthetic hands to operate in new ways. I think this is going to be an enormous issue.

Spectrum: Are the laws and regulations behind the technology? Will these questions all get sorted out in the courts? 

Baker: That’s the way the law always is with technology. Think of the senators quizzing Mark Zuckerberg on what Facebook is. They showed a fundamental misunderstanding of the technology. And Facebook has been around for a while, and 2.9 billion people are using it. How are they going to think about things like, “I want to augment my brain with computer chips” or, “I installed a neural net in my brain so I can control a bunch of drones”—this really edge case, science fiction, comic book kind of stuff? But it is conceivable that someone could do it in the next 10 to 15 years. 

There isn’t a legal framework for thinking about these things. The law is behind, and it will continue to be. But that’s okay—it keeps me employed. 
