Yesterday’s Apple keynote is over, but the tech revolution it ushered in is just starting. No, not the HBO streaming deal; ResearchKit, which could change how we approach healthcare apps.

Photo Credit: NEC Corporation of America cc

As mobile health apps have grown progressively more popular, the stakes for the cybersecurity behind them have never been higher, as Lawrence J. Tabas and Jenna K. Shedd note in a post for Health Law Gurus:

Mobile health apps (also known as mHealth apps) are increasingly popular with consumers. As of 2014, there were more than 100,000 mobile health apps available on iOS and Android platforms, and total revenue from mobile health apps is expected to increase to $26 billion by the end of 2017, according to a research2guidance report. Mobile health apps have the potential to revolutionize the health care industry by engaging patients in their health care and facilitating communications between patients and their physicians…Mobile health apps contain large amounts of data, the majority of which is personal and sensitive information about the app’s user.

Apple has long touted HealthKit as a sort of future for digitizing and sharing healthcare information. Now they’ve expanded on it with ResearchKit, which lets users participate in research studies from their phones. Since an iPhone is chock full of finely tuned sensors and processors, users can opt in right from their device and perform tasks as simple as walking to measure their gait or saying “Ahh” to test for Parkinson’s. So far there are five apps, researching asthma, breast cancer, cardiovascular disease, and diabetes in addition to Parkinson’s.
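For a sense of what those tasks look like in practice, here’s a minimal sketch of a gait measurement built on ResearchKit’s predefined “short walk” active task. The identifier and parameter values are hypothetical examples, not taken from any of the five study apps, and exact signatures may vary between framework versions:

```swift
import ResearchKit

// A sketch of a ResearchKit "active task" that measures gait using the
// iPhone's motion sensors. Identifier and values are hypothetical examples.
let gaitTask = ORKOrderedTask.shortWalk(
    withIdentifier: "gaitTask",
    intendedUseDescription: "This walk test measures your gait and balance.",
    numberOfStepsPerLeg: 20,   // walk 20 steps out and 20 steps back
    restDuration: 10,          // then stand still for 10 seconds
    options: []
)

// The app presents this controller and, via its delegate, receives the
// recorded accelerometer and pedometer results once the walk is done.
let taskViewController = ORKTaskViewController(task: gaitTask, taskRun: nil)
```

From there, the study app would submit the collected results to the researchers; the heavy lifting of instructions, countdowns, and sensor recording comes with the predefined task.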

It’s a potentially powerful attempt to harness the legion of iPhone users for a greater cause, and to compound that good, Apple is making the ResearchKit framework open source.

HealthKit and ResearchKit are another step toward the Internet of Things future, and a big one: given that the FTC hasn’t set any legal precedent for regulating the Internet of Things, Apple’s standard will carry weight for how app creators, and healthcare apps in particular, build their own. So it’s a good thing they seem to be doing it pretty much by the book, or at least by the standards the FTC has laid out: notice and choice, data minimization, and data security.

Apple’s HealthKit offers a more intensive procedure for allowing data to be transmitted

Users are given plenty of options for what kind of data they send, and those choices are laid out clearly for the user to pick from (as you can see in the photo). In fact, Apple seems to have been interested in building notice-and-choice principles into HealthKit from the get-go. At Apple’s Worldwide Developers Conference last year, HealthKit developer Justin Rushing laid out the thinking behind the framework’s sharing abilities:

We want to encourage our users to only give apps access to information they feel comfortable sharing. To support this we let users give your app permission on a per object type basis. That way if all your app needs is step-data, they don’t need to give it access to what may be more sensitive kinds of data.

…In HealthKit we let you see if you’ve been granted sharing, or write access, to a particular type, but you can’t see if you’ve been granted read access. This is because for some kinds of information knowing that a user blocked your app can be just as private as knowing the data itself. For example, if a user blocks you from seeing their blood pressure, it could indicate that the user is diabetic, and we don’t want to leak this information.
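In code, that per-type model looks roughly like the sketch below, using HealthKit’s authorization API (the specific data types here are arbitrary examples chosen to mirror the quote):

```swift
import HealthKit

let healthStore = HKHealthStore()

// Permission is requested per object type: here, write access to step count
// and read access to systolic blood pressure (arbitrary example types).
let stepType = HKObjectType.quantityType(forIdentifier: .stepCount)!
let systolicType = HKObjectType.quantityType(forIdentifier: .bloodPressureSystolic)!

healthStore.requestAuthorization(toShare: [stepType], read: [systolicType]) { success, error in
    // `success` only means the request itself went through; it does NOT
    // reveal whether the user actually granted read access.
}

// Sharing (write) status can be checked...
let writeStatus = healthStore.authorizationStatus(for: stepType)

// ...but HealthKit deliberately offers no way to check read access: queries
// against a denied type simply behave as if no data exists, so an app can't
// learn that a user blocked it.
```

That asymmetry is the design choice Rushing describes: the denial itself can be as sensitive as the data.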

In their report, the FTC was noncommittal about what an ideal model for notice and choice would look like, recognizing that there is “no one-size-fits-all approach.” In fact, much of their discussion centers on how difficult widespread implementation would be, and they allowed companies some wiggle room:

Recognizing concerns that a notice and choice approach could restrict beneficial new uses of data, staff has incorporated certain elements of the use-based model into its approach. For instance, the idea of choices being keyed to context takes into account how the data will be used: if a use is consistent with the context of the interaction–in other words, it is an expected use–then a company need not offer a choice to the consumer. For uses that would be inconsistent with the context of the interaction (i.e., unexpected), companies should offer clear and conspicuous choices.

But Apple, rather than offering only the extremes of sending all information or opting out entirely, has created a precedent for giving the user complete control over the finer points of the data being transmitted.

Data security is, obviously, a major concern around any sort of data transmission, but healthcare data in particular is lucrative for hackers. As one New York Times article put it, “security experts estimate there’s only two types of companies left in the United States: those that have been hacked, and those that don’t know they’ve been hacked,” and in 2014 a whopping 42 percent of those cyber attacks were on health providers. With the growing adoption of Apple Pay, Apple has certainly put its eggs in the cloud basket, so one can only hope they’ve got the data security to back it up.

As far as data minimization goes, Apple has explicitly said that they won’t see any of the information transmitted through ResearchKit. But given the sheer amount of data that HealthKit and ResearchKit will be monitoring, there’s an understandable fear that the health data collected on a user could be used against them in the future. Take, for instance, the young woman in Calgary whom Forbes reported on, who will wear a Fitbit to back up her personal injury case against a former employer:

What’s intriguing (and a little creepy) is that cases like [the woman’s lawyer Simon] Muller’s could open the door to wearable device data being used not just in personal injury claims but in prosecutions. “Insurers will want it as much as plaintiffs will,” says Muller, for assessing sketchy claims.

Insurers wouldn’t be able to force claimants to wear Fitbits as part of an “assessment period,” like Muller’s client, but they could request a formal court order from whoever holds the data to release it to them, says Dr. Rick Hu, co-founder and CEO of Vivametrica. “We would not release the information,” he adds. Insurers could instead request it from a law firm or even from Fitbit directly.

“It’s always evolving with technology,” says Muller. “A number of years ago we saw courts requisition Facebook [for] information. If you’ve been wearing the Fitbit monitors it’s likely you’ll see court applications to compel disclosure of that data.”

Whether ResearchKit or HealthKit data will be used as evidence (beyond its use in scientific research), only time will tell. But Apple has the potential to do something big here, and in more ways than one. So here’s to hoping they don’t screw it up.