Google’s Plan to Crunch Health Data on Millions of Patients Draws Fire

Tech behemoth Google is reportedly using artificial intelligence to slice and dice personal healthcare details on millions of Americans. That has some researchers diagnosing the company with HIPAA violations and prescribing regulatory controls as a remedy. And at least one federal regulator has opened a probe into the project.

Google has inked a partnership with Ascension, one of the nation’s largest non-profit health systems with operations in 21 states. Under the terms of the deal, which was first mentioned on a second-quarter earnings call, the internet giant said that it is providing cloud infrastructure and the G Suite productivity suite to the organization. However, Google has come under fire for a third part of the agreement, which involves “piloting tools that could help Ascension’s doctors and nurses more quickly and easily access relevant patient information, in a consolidated view.”

First reported by the Wall Street Journal earlier this week, the work is code-named “Project Nightingale.” And according to internal Google documentation that the outlet obtained, that third part of the partnership involves Google gathering complete health histories and personal contact information for patients spread across 2,600 hospitals, doctors’ offices and other facilities. The idea is to apply AI to the data in hopes of recommending better treatment plans, flagging potential problems (like drug-interaction issues), cross-referencing relevant medical events, suggesting the enforcement of narcotics policies, recommending the replacement or addition of doctors, executing billing changes and more.

According to the documents, the process works like this: When a patient has an appointment, doctors and nurses examine the patient and enter the data into the cloud, and that data is also fed into the Project Nightingale system.
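The documents reportedly stop short of describing the mechanics, but the flow as outlined — clinician input landing in cloud storage, which in turn feeds an analysis layer that can flag issues like drug interactions — can be pictured with a minimal sketch. Everything below (the record fields, the interaction table, the function and class names) is hypothetical and for illustration only; it is not Project Nightingale code.

```python
# Minimal, hypothetical sketch of the flow described in the documents:
# a clinician records a visit, the record lands in cloud storage, and an
# analysis step flags potential drug interactions. All names and rules
# here are invented for illustration; this is not Google's or Ascension's code.
from dataclasses import dataclass, field

# Toy interaction table -- real systems use curated clinical databases.
KNOWN_INTERACTIONS = {frozenset({"warfarin", "ibuprofen"}): "bleeding risk"}

@dataclass
class VisitRecord:
    patient_id: str
    medications: list[str]
    notes: str = ""

def flag_interactions(record: VisitRecord) -> list[str]:
    """Return a human-readable flag for every known pairwise interaction."""
    meds = [m.lower() for m in record.medications]
    flags = []
    for i, first in enumerate(meds):
        for second in meds[i + 1:]:
            reason = KNOWN_INTERACTIONS.get(frozenset({first, second}))
            if reason:
                flags.append(f"{first} + {second}: {reason}")
    return flags

@dataclass
class CloudStore:
    records: list[VisitRecord] = field(default_factory=list)

    def ingest(self, record: VisitRecord) -> list[str]:
        """Store the clinician's input and run the analysis step on it."""
        self.records.append(record)
        return flag_interactions(record)

# Usage: the clinician's entry is ingested and any flags surface immediately.
store = CloudStore()
print(store.ingest(VisitRecord("patient-123", ["Warfarin", "Ibuprofen"])))
```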

The issue for some is that Google is reportedly doing this without the patients themselves being aware of the situation. “Neither patients nor doctors have been notified. At least 150 Google employees already have access to much of the data on tens of millions of patients, according to a person familiar with the matter and the documents,” the WSJ reported.

Google, for its part, published a blog post shortly after the article appeared. In it, Tariq Shaukat, president of Industry Products and Solutions at Google Cloud, confirmed the partnership. However, he was light on the details of exactly what AI-related activities are involved in Project Nightingale, saying only, “We aim to provide tools that Ascension could use to support improvements in clinical quality and patient safety.”

Shaukat also stressed that the Health Insurance Portability and Accountability Act of 1996 (HIPAA) does allow medical providers to share protected health information (PHI) data with business partners without express consent by the data subjects, under what’s known as a Business Associate Agreement (BAA). A BAA stipulates that data cannot be used for any other purpose than for providing the specific services encompassed by the agreement. Google added that “patient data cannot and will not be combined with any Google consumer data” and that it would not be used for non-healthcare-related purposes.

Also, in an FAQ, Google laid out the specifics of its data-security plans for Project Nightingale:

“Data is logically siloed to Ascension, housed within a virtual private space and encrypted with dedicated keys. Patient data remains in that secure environment and is not used for any other purpose than servicing the product on behalf of Ascension. Specifically, any Ascension data under this agreement will not be used to sell ads. There are access logs for any individual who might come in contact with PHI in the process of helping Ascension configure and test tools, to ensure all policies are followed. Finally, these systems are included in Google’s annual compliance audits for ISO 27001 certification and SOC2/3. These are procedures in which external auditors check that we have the systems and processes in place to guarantee access control, data isolation, logging and auditing.”
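For readers wondering what controls like “dedicated keys” and per-access logging look like in practice, here is a minimal sketch of the same ideas: one vault per tenant with its own encryption key, and a log entry written on every read. It is an illustration only, assuming a generic symmetric-encryption library (the Python cryptography package’s Fernet); it is not Google’s actual architecture or APIs.

```python
# Hypothetical illustration of the controls Google describes: per-tenant
# ("logically siloed") records encrypted with a dedicated key, plus an
# access-log entry for every read. Not Google's implementation.
from datetime import datetime, timezone
from cryptography.fernet import Fernet

class TenantVault:
    def __init__(self, tenant_id: str):
        self.tenant_id = tenant_id
        self._key = Fernet.generate_key()      # dedicated key per tenant
        self._fernet = Fernet(self._key)
        self._records: dict[str, bytes] = {}   # record_id -> ciphertext
        self.access_log: list[dict] = []

    def store(self, record_id: str, phi: bytes) -> None:
        """Encrypt and keep a record inside this tenant's silo."""
        self._records[record_id] = self._fernet.encrypt(phi)

    def read(self, record_id: str, accessor: str) -> bytes:
        """Decrypt a record, logging who touched it so auditors can check policy."""
        self.access_log.append({
            "tenant": self.tenant_id,
            "record": record_id,
            "accessor": accessor,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return self._fernet.decrypt(self._records[record_id])

# Usage: store a record, read it back, and inspect the audit trail.
vault = TenantVault("ascension")
vault.store("patient-123", b"dx: hypertension")
print(vault.read("patient-123", accessor="clinician@example.org"))
print(vault.access_log)
```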

Researchers, Feds Weigh In

Despite Google’s assurances on the HIPAA and privacy front, some are raising serious concerns about the implications of the partnership with Ascension. For instance, Roger Severino, director of the Office for Civil Rights at the Department of Health and Human Services, told the WSJ in a statement that the division has opened an inquiry that “will seek to learn more information about this mass collection of individuals’ medical records to ensure that HIPAA protections were fully implemented.”

“Yes, this is absolutely a HIPAA violation,” Adam Kujawa, director of Malwarebytes Labs, told Threatpost. He also raised the specter of insider threats, where rogue employees see an opportunity in all of that rich data. “The medical information could be sold by…employees to advertisers or cybercriminals who deal with massive data collection, likely for a pretty penny considering the value and full details included in the data. It boggles my mind how something like this could happen and why it would happen between two massive companies.”

Fausto Oliveira, principal security architect at Acceptto, told Threatpost that despite Google’s assurances, consumers in reality have no control over how the data will be used.

“From a private consumer point of view, what warranties and rights does an individual have that his data is not going to be shared with third parties that may then use that data for non-health-related purposes?” he said. He added that Google still hasn’t answered certain other questions, such as: If there are clerical errors that could impact insurance costs, how does an individual ensure that the data in Google’s possession is corrected? And how is the data anonymized, if at all?

The lack of data-subject control is top-of-mind from a privacy perspective, Kujawa added.

“We need some regulations on how individuals can have at least some control over their own data and how it’s used by third parties,” he said. “The fact that all of this information can be collected, analyzed and shared for the sake of making somebody money (regardless of the virtue of the intent) is clearly a sign that our current system of data privacy is completely broken, especially if it is legal for companies to do this because of a lack of actual prevention of abuse of the data or a lack of safeguards to ensure we aren’t tricked into handing over our data with pages of lawyer-speak (like EULAs).”

Eric Silverberg, founder of LGBTQ-led Perry Street Software, said he’s concerned about whether the protections for patient data are robust enough.

“As a leader of a community that has faced health discrimination in the past, we are deeply concerned by reports that Google is using its platform monopoly to surreptitiously aggregate health information on users without explicit consent,” he said via email. “It is easy to imagine how HIV and STD status could be used to deny health coverage to LGBTQ Americans, as has been the case in years past. Google should halt this program immediately until it has been fully explained to regulators, policymakers and users. It is because of reckless data use decisions like this that Perry Street Software, publishers of two of the largest gay dating apps in the world, severed its advertising business relationship with Google in 2018.”

Oliveira also pointed out the danger should the data be hacked.

“If there was a video exposing the project, how long before consumer data appears in a pastebin putting the consumers at risk?” he said. “Health data contains sensitive information about an individual that can be used to influence voter districts. It opens the door to abuse of the healthcare information such as race and sexuality by prospective employers and potentially by threat actors (blackmail or reputational attacks as an example). And assume for a moment that this data is exfiltrated in part or in its totality, criminals worldwide will jump on this information to perpetrate all kinds of mischief and crime.”

Google pointed Threatpost to its blog post in response to a request for comment.
