In a post last week I compared Apple’s new mHealth App Store rules with our classic regulatory models. I noted that the ‘Health’ data aggregation app and other apps using the ‘HealthKit’ API that collect, store or process health data would seldom be subject to the HIPAA Privacy and Security Rules. There will be exceptions, for example, apps linked to EMR data held by covered entities. Equally, the FTC will patrol the space looking for violations of privacy policies, and most EMR and PHR apps will be subject to federal breach notification regulations.
Apple has now publicly released its App Store review guidelines for HealthKit, and they make for an interesting read. First, it is disappointing that Apple has taken its cue from our dysfunctional health privacy laws and concentrated its regulation on data use rather than collection. A prohibition on collecting user data other than for the primary purpose of the app would have been welcome. Second, apps using the framework cannot store user data in iCloud (which does not offer a BAA), raising the question of where it will be acceptable for such data to be stored. Amazon Web Services? Third, while last week’s leaks are confirmed and there is a strong prohibition on using HealthKit data for advertising or other data-mining purposes, the official text has a squirrelly coda: “other than improving health, medical, and fitness management, or for the purpose of medical research.” This needs to be clarified, as does the choice architecture. Continue reading →
On September 9 Apple is hosting its ‘Wish We Could Say More’ event. In the interim we will be deluged with largely uninformed speculation about the new iPhone, an iWatch wearable, and who knows what else. What we do know, because Apple announced it back in June, is that iOS 8, Apple’s mobile operating system, will include an app called ‘Health’ (backed by a ‘HealthKit’ API) that will aggregate health and fitness data from the iPhone’s own internal sensors, third-party wearables, and EMRs.
What has been less than clear is how the privacy of this data is to be protected. There is some low-hanging legal fruit. For example, when Apple partners with the Mayo Clinic or EMR manufacturers to make EMR data available from covered entities, those arrangements fall squarely within the HIPAA Privacy and Security Rules, triggering the requirements for Business Associate Agreements, etc.
But what of the health data being collected by the Apple health data aggregator or other apps that lies outside of protected HIPAA space? Fitness and health data picked up by apps and stored on the phone or on an app developer’s analytic cloud fails the HIPAA applicability test, yet may be as sensitive as anything stored on a hospital server (as I have argued elsewhere). HIPAA may not apply but this is not a completely unregulated area. The FTC is more aggressively policing the health data space and is paying particular attention to deviance from stated privacy policies by app developers. The FTC also enforces a narrow and oft-forgotten part of HIPAA that applies a breach notification rule to non-covered entity PHR vendors, some of whom no doubt will be selling their wares on the app store. Continue reading →
The stakes were high in Sutter: under the California statute, medical data breach claims trigger (or should trigger!) nominal damages of $1,000 per patient. Here four million records were stolen.
The plaintiffs first argued that the defendant had breached a section prohibiting unconsented-to disclosure. The not unreasonable response from the court was that this provision required an affirmative act of disclosure by the defendant, which was not satisfied by a theft.
A second statutory provision argued by the plaintiffs looked like a winner. This section provided, “Every provider of health care … who creates, maintains, preserves, stores, abandons, destroys, or disposes of medical information shall do so in a manner that preserves the confidentiality of the information contained therein.” Continue reading →
Art Caplan has a new opinion piece on NBC News on the controversy over the case of Jessie Herald, who was offered a plea bargain that involved sterilization in exchange for a reduced sentence. From the piece:
Jessie Lee Herald was facing five years or more in prison after a crash in which police and prosecutors said his 3-year-old son was bloodied but not seriously hurt. But Herald cut a deal. Or more accurately, the state agreed to reduce his sentence if he would agree to be cut. Shenandoah County assistant prosecutor Ilona White said she offered Herald, 27, of Edinburg, Virginia, the opportunity to get a drastically reduced sentence if he would agree to a vasectomy. It may not be immediately clear what a vasectomy has to do with driving dangerously and recklessly. It shouldn’t be. There is no connection.
Art Caplan has authored a new opinion piece on Bioethics.net on the issue of “chipping” human beings. From the piece:
There has been a great deal of finger-pointing, second-guessing and recrimination over the decision by the President to exchange five former Taliban leaders for the American soldier, Bowe Bergdahl. “You’ve just released five extremely dangerous people, who in my opinion … will rejoin the battlefield,” Senator Marco Rubio, R-Fla., a likely presidential candidate, told Fox News. Senator John McCain, R-Ariz., told ABC News and many other outlets that he would never have supported the swap had he known exactly which prisoners would be exchanged, given their former high-ranking roles in battling the U.S. in Afghanistan.
Put aside for a second whether the five Taliban leaders that were flown to Qatar for Bergdahl are now too old and too long removed from Taliban affairs to resume anything close to their old roles. Presume, instead, they will eagerly resume where they left off prior to their capture, attacking Americans and others they see as hindering Taliban goals for Afghanistan. Is it possible that the U.S. did something to these men before letting them go in the swap—surreptitiously implanting them with microchips so that they could be tracked or traced?
The President’s Council of Advisors on Science and Technology (PCAST) has issued a report intended to be a technological complement to the recent White House report on big data. This PCAST report, however, is far more than a technological analysis—although as a description of technological developments it is wonderfully accessible, clear and informative. It also contains policy recommendations of sweeping significance about how technology should be used and developed. PCAST’s recommendations carry the imprimatur of scientific expertise—and lawyers interested in health policy should be alert to the normative approach of PCAST to big data.
Here, in PCAST’s own words, is the basic approach: “In light of the continuing proliferation of ways to collect and use information about people, PCAST recommends that policy focus primarily on whether specific uses of information about people affect privacy adversely. It also recommends that policy focus on outcomes, on the “what” rather than the “how,” to avoid becoming obsolete as technology advances. The policy framework should accelerate the development and commercialization of technologies that can help to contain adverse impacts on privacy, including research into new technological options. By using technology more effectively, the Nation can lead internationally in making the most of big data’s benefits while limiting the concerns it poses for privacy. Finally, PCAST calls for efforts to assure that there is enough talent available with the expertise needed to develop and use big data in a privacy-sensitive way.” In other words: assume the importance of continuing to collect and analyze big data, identify potential harms and fixes on a case-by-case basis possibly after the fact, and enlist the help of the commercial sector to develop profitable privacy technologies. Continue reading →
In a recent blog post I discussed the benefits and potential drawbacks of a new “EU Regulation on clinical trials on medicinal products for human use,” which was adopted by the European Parliament and Council in April 2014. Parallel to these legislative developments, the drug industry has responded with its own initiatives providing for varying degrees of transparency. Medical authorities, too, have been very active in developing their transparency policies.
In the US, the FDA has proposed new rules that would require disclosure of masked and de-identified patient-level data. In the EU, during 2013 the EMA organized a series of meetings with its five advisory committees to devise a draft policy for proactive publication of, and access to, clinical-trial data. In June 2013 this process resulted in the publication of a draft policy document titled “Publication and access to clinical-trial data” (EMA/240810/2013).
In a letter to the EMA’s executive director Dr. Guido Rasi, dated 13 May 2014, the European Ombudsman, Emily O’Reilly, has now expressed concern about what seems to be a substantial shift of policy regarding clinical trial data transparency. Continue reading →
Privacy is never easy to think about. This week it became harder. Two pieces framed my week. First, Eben Moglen’s essay in The Guardian (based on his Columbia talks from late last year) took my breath away; glorious writing and stunning breadth combined to deliver a desperately sad (but not entirely hopeless) message about government and corporate overreaching in data collection and processing.
A wry speech posted by software developer Maciej Ceglowski also helped frame my thoughts. He wrote, “The Internet somehow contrives to remember too much and too little at the same time, and it maps poorly on our concepts of how memory should work.” There’s the problem in a nutshell. Ceglowski alludes to the divide between how human (offline) memory operates (it’s “fuzzy” and “memories tend to fade with time, and we remember only the more salient events”) and the online default of remembering everything. Government and Google and, for that matter, Big Data brokers tell us that the online rules now apply across the board and ‘that’s just peachy’ because we’ll have better national security, better searches, or more relevant advertising. But that’s backwards. Continue reading →
A resident of Spain allegedly owed back taxes, triggering attachment proceedings. The local newspaper published the details of an upcoming auction of his property in early 1998. At some point the issue was settled. However, the matter was not forgotten: the newspaper was online, and a Google search of the gentleman’s name returned this history. He complained to the Spanish data protection agency (AEPD) that he had a right to have older, irrelevant information erased and that Google should remove the links. The AEPD agreed, and Google sued for relief. The Spanish High Court referred the interpretation of the Data Directive (95/46) to the European Court of Justice in 2010, and in 2013 the Advocate General issued an advisory opinion supportive of Google’s position. Somewhat surprisingly, the European Court of Justice has now taken the opposite view (Case C‑131/12, Google Spain SL v. AEPD, May 13, 2014). Continue reading →
Emerging Issues and New Frontiers for FDA Regulation
Monday, October 20, 2014
We are currently seeking abstracts for academic presentations/papers on the following topics:
Stem cell therapies
Genetic (and biomarker) tests
Comparative efficacy research
Drug resistant pathogens
Mobile health technologies
Other related topics
Abstracts should be no longer than 1 page, and should be emailed to Davina Rosen Marano at firstname.lastname@example.org by Tuesday, June 3, 2014. Questions should also be directed to Davina Rosen Marano.
We will notify selected participants by the end of June. Selected participants will present at the symposium, and will be expected to submit a completed article by December 15, 2014 (after the event) to be considered for publication in a 2015 issue of FDLI’s Food and Drug Law Journal (FDLJ). Publication decisions will be made based on usual FDLJ standards.
“aims to remedy the shortcomings of the existing Clinical Trials Directive by setting up a uniform framework for the authorization of clinical trials by all the member states concerned with a given single assessment outcome. Simplified reporting procedures, and the possibility for the Commission to do checks, are among the law’s key innovations.”
Moreover, and very importantly, the Regulation seeks to improve transparency by requiring pharmaceutical companies and academic researchers to publish the results of all their European clinical trials in a publicly accessible EU database. In contrast to earlier stipulations, which only obliged sponsors to publish the end-results of their clinical trials, the new law requires full clinical study reports to be published after a decision on – or withdrawal of – marketing authorization applications. Sponsors who do not comply with these requirements will face fines.
These groundbreaking changes will enter into force 20 days after publication in the Official Journal of the EU. However, the Regulation will apply only six months after a new EU portal for the submission of clinical-trial data and the above-mentioned EU database have become fully functional. Since this is expected to take at least two years, the Regulation will apply in 2016 at the earliest (with an opt-out choice available until 2018).
We live in a time when our personal information is increasingly publicly available on the internet. This personal information includes our names and phone numbers, things we’ve written and things we’ve done, along with a good deal of information that only exists because we interact with others on the internet – thoughts that we might not otherwise have externalized, or that we certainly would not have saved for others to read.
If all of this information is publicly available, all of this information can be gathered. Already advertisers analyze our behaviors to better target products to us. It is not hard to imagine a not so distant future where the government analyzes this data to determine whether we have a DSM mental disorder. By looking at the online behaviors of those already diagnosed – the way the syndrome affects their usage patterns, the sites they visit, and how they interact with others online – it is likely that one could find statistically significant usage patterns that distinguish individuals with a diagnosis from those without. The available data could then be mined to identify other individuals who exhibit the usage pattern, allowing for presumptive diagnosis.
Where does one start with AOL CEO Armstrong’s ridiculous and unfeeling justifications for changes in his company’s 401(k) plan? Cable TV and Twitter came out of the blocks fast with the obvious critiques. And the outrage only increased after novelist Deanna Fei took to Slate to identify her daughter as one of the subjects of Armstrong’s implied criticism. Armstrong has now apologized and reversed his earlier decision.
As the corporate spin doctors contain the damage, Armstrong’s statements likely will recede from memory, although I am still hoping The Onion will memorialize Armstrong’s entry into the healthcare debate (suggested headline, “CEO Discovers Nation’s Healthcare Crisis Caused by 25 Ounce Baby”). But supposing (just supposing) your health law students ask about the story in class this week. What sort of journey can you take them on?
For privacy advocates, the last week delivered something of a gut-check when the UK’s splendidly descriptive Health and Social Care Information Centre announced a bonanza for big data companies; the NHS’s care.data program, here, will make anonymized clinical data broadly available to researchers and commercial interests with few limitations, here.
For once, however, the US attitude to the growing big data phenomenon has appeared more robust. Writing on the White House Blog, here, Presidential counselor John Podesta announced he will be leading “a comprehensive review of the way that ‘big data’ will affect the way we live and work; the relationship between government and citizens; and how public and private sectors can spur innovation and maximize the opportunities and free flow of this information while minimizing the risks to privacy.” Results are promised in 90 days.
For health lawyers, however, the most interesting recent development has been the FTC’s denial of LabMD’s motion to dismiss, here. The LabMD complaint involves the data security practices of a clinical testing laboratory. The FTC alleged “unfair . . . acts or practices” under Section 5(a)(1) of the FTC Act. One of LabMD’s arguments for dismissal was that the specific HIPAA and HITECH statutes dealing with the health privacy and security obligations of covered entities blocked the FTC from enforcing its more general authority. According to the FTC:
Nothing in HIPAA, HITECH… reflects a “clear and manifest” intent of Congress to restrict the Commission’s authority over allegedly “unfair” data security practices such as those at issue in this case. LabMD identifies no provision that creates a “clear repugnancy” with the FTC Act, nor any requirement in HIPAA or HITECH that is “clearly incompatible” with LabMD’s obligations under Section 5.
LabMD is an important development. I have argued at length, here, that big data activities outside of HIPAA-protected space have illustrated the gaps in data protection because of the manner in which the US has regulated discrete vertical industries. LabMD suggests that the FTC is prepared to fill in the gaps.
I recently posted this draft on SSRN. Feedback much appreciated. Here is the abstract:
Fragmentation and lack of coordination remain among the most intractable problems facing health care. Attention has often alighted on the promise of health care information technology (HIT), not least because IT has had such a positive impact on many other personal, professional and industrial domains. For at least two decades the HIT-panacea narrative has persisted even as the context has shifted. At various times we have been promised that patient safety technologies would solve our medical error problems, that electronic transactions would simplify healthcare administration, and that insurance and clinical data would become interoperable courtesy of electronic medical records. Today the IoM is positioning HIT at the center of its new “continuously learning” health care model, which is in large part aimed at solving our fragmentation and coordination problems. While the consensus judgment that HIT can reduce fragmentation and increase coordination has intuitive force, the specifics are more complicated. First, the relationship between health care and IT has been both culturally and financially complex. Second, HIT has been overhyped as a solution for all of health care’s woes; it has its own problems. Third, the HIT-fragmentation solution presents a chicken-and-egg problem — can HIT solve health care’s fragmentation and coordination problems, or must health care problems such as episodic care be solved prior to successful deployment of HIT? The article takes a critical look at both health care and HIT with those questions in mind before concluding with some admittedly difficult recommendations designed to break the chicken-and-egg deadlock.
Who is responsible for the stewardship of the body? What about the body’s tissue? Or its blood? While many people suggest that each individual owns their own body, these questions can become considerably more complicated when viewed through the lens of a bio-banking effort. When blood and tissues are “inside” or “part of” our body, the answer may appear clear, but once they are removed from the larger whole, the question of ownership can become substantially more challenging. One of Wednesday’s pre-conference programs explored these issues in considerable depth, featuring perspectives from a range of relevant actors.
These issues generate unprecedented opportunities for healthcare innovators and entrepreneurs to design solutions that can effectively address widening disparities between healthcare supply and demand, particularly within vulnerable and underserved areas.
Presuming doctors, their helpers or your neighbors are going to look, ethical standards or not, shouldn’t patients be told if someone does? I think so. I think the transplant candidate had the right to know that he tweeted himself right out of a shot at a liver transplant. And you need to realize that information you put up on social media sites may wind up being used by your doctor, hospital, psychologist, school nurse or drug counselor.
Right now there are no rules or even suggestions to guide doctor-patient relationships over the Internet. Both now have new ways to look at one another outside the office or exam room. If they are going to continue to trust one another then we need to recalculate existing notions of medical privacy and confidentiality to fit an Internet world where there is not much of either.
In July, the Lancet covered Turkey’s development and implementation of universal health coverage extensively in an article and in supplementary comments. The main article, written by those who are directly involved in the development of the health systems reform (including the former Health Minister), presents a success story. Within Turkey, however, the success of the reform has been disputed. Two points in particular are being repeated by the Turkish Medical Association (TTB), doctors, and journalists: the negative effects of the reform on (1) the quality of health care personnel and (2) privacy.