Please join the Food and Drug Law Institute and the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School for an academic symposium on cutting-edge legal and regulatory issues facing FDA. Leading academics will present papers on mobile health, stem cells, personalized medicine, and other novel medical product issues, as well as food regulation. Papers will be available to registered attendees in advance, and will be published in an upcoming issue of the Food and Drug Law Journal.
Registration is now open online. A limited number of free seats are available to Harvard affiliates. For more information or to request a seat, please email us at firstname.lastname@example.org.
The Health Subcommittee of the House Energy & Commerce Committee held a hearing last week on the FDA’s proposed draft guidance regarding laboratory-developed tests (LDTs), as part of its “21st Century Cures” initiative. The hearing, which can be viewed online (here and here), featured representatives from the FDA, industry, and research organizations. And although the various panelists offered differing views on the propriety of the FDA’s decision to begin exercising its regulatory authority over LDTs, there seemed to be more agreement than disagreement among the panelists.
Most interestingly, as Representative Henry Waxman pointed out toward the end of the hearing, “[no]body on the panel [is] arguing that there shouldn’t be a very careful scrutiny of these tests. It seems like the question is who should do it: CLIA or the FDA.” Representative Waxman’s subsequent colloquy with Harvard Medical School Professor Christopher Newton-Cheh on this point particularly helped to differentiate the historical roles of CMS and the FDA in this space. But even those panelists who opposed the FDA’s involvement seemed supportive of expanding CMS’ authority under CLIA to conduct clinical validity analyses. (Anyone interested in the administrative law aspects of this issue should know that problems of shared regulatory jurisdiction have recently received increased scholarly attention, with Jody Freeman and Jim Rossi providing a particularly thorough treatment of the issue in their recent article, Agency Coordination in Shared Regulatory Space.) Continue reading →
There was an article a couple of weeks ago in the New York Times about “engineered foods,” and how “a handful of high-tech start-ups are out to revolutionize the food system by engineering ‘meat’ and ‘eggs’ from pulverized plant compounds or cultured snippets of animal tissue.” The author discussed some of the business models, interviewed some of the entrepreneurs, and contemplated some of the implications of “Food 2.0.” The concerns she noted involved the nutritional impact of these foods, and the possibility of resource-intensive production.
But what sort of screening will these food products go through before they enter the food supply? How will FDA vet these new foodstuffs for toxicity or allergenic properties? It won’t. Manufacturers can self-affirm that food products are safe, based on their own studies, and market them on that basis with no prior FDA approval. This is the GRAS (generally recognized as safe) route to marketability. A manufacturer could also choose to submit to the formal food additive approval process, which is extremely time-consuming, but considering the breadth of the GRAS exception, it probably won’t. Many more GRAS notifications are filed per year than food additive petitions. Continue reading →
Reversing its previous deference to corporate speech interests, the U.S. Court of Appeals for the D.C. Circuit came down in favor of consumer protection in a July 29 decision. In American Meat Institute v. U.S. Dept. of Agriculture, the court upheld a federal government regulation requiring meat companies to disclose the countries of origin for their products. If your beef comes from Argentina or Canada, you will know that from its label.
More importantly, the court gave the Food and Drug Administration greater freedom to reduce tobacco use in the United States. In explaining its reasoning, the court repudiated the logic of an earlier decision by the court that rejected the FDA’s graphic warnings for cigarette packs. According to the meat labeling opinion, the cigarette warning decision did not allow sufficient leeway for the government to mandate warnings or other informational disclosures to consumers.
Perhaps the U.S. Supreme Court will restore the D.C. Circuit’s previous balance, but for now, the tide has turned in favor of the public’s health.
Last week I blogged about recent publications concerning the global battle against anti-microbial resistance (AMR). I did not mention a recent paper published in the June 2014 issue of Nature, which describes how European and U.S. researchers and authorities are increasingly considering clinical research in unconventional areas to fight AMR. The news report “Phage therapy gets revitalized” by Sara Reardon concentrates on the use of viruses (bacteriophages) to battle bacteria. The idea is not new, but apart from some applications in the former Soviet Union, it was never established as a major research area elsewhere. In particular, the paper examines the European Phagoburn project, funded by the European Commission, which is the first large, multi-centre clinical trial of phage therapy for human infections. It involves a phase I-II trial of using viruses to treat bacterial infections following burns. The European Union (EU) is contributing €3.8 million (US$5.2 million) to the Phagoburn study, demonstrating that it is taking the approach seriously. Meanwhile, the US National Institute of Allergy and Infectious Diseases announced in March 2014 that it regards phage therapy as one of seven key areas in its strategy to fight antibiotic resistance.
So far, Western practice has concentrated on treating complex or unidentified infections with broad-spectrum antibiotics. These antibiotics typically eliminate multiple types of bacteria, including those that have beneficial effects on the human organism. Besides resulting in direct negative consequences for patients, e.g. gastrointestinal disorders, these “atomic bomb” approaches can create biological niches where resistant “bad bugs” can prosper. This is why scientists are turning toward more targeted approaches, and this is where phage therapy comes into play. Like “guided missiles,” phage therapy has the ability to kill just a single species, or even strain, of bacteria. Quoting the US virologist Ryland Young and Mzia Kutateladze, head of the scientific council at the Eliava Institute in Tbilisi (Georgia), the Nature report explains that nature offers an almost unlimited source of different phages and that so far no identical phages have ever been found. For this reason it is fairly simple to identify a particular phage for a bacterial target. Should the bacterium become resistant to that particular phage, researchers can modify the viral cocktails used for treatment by adding or substituting phages. At the Eliava Institute such updates occur – according to the report – approximately every eight months, and the scientists are not always fully aware of the precise combination of phages in the cocktail.
In light of these advantages, the recent interest of US and EU stakeholders in phage therapy comes as no surprise. However, the scientific and legal challenges confronting these projects are complex. After all, we are talking about viruses here, which triggers alarm bells with regard to public perception, safety concerns, and the regulation of relevant research. It also appears questionable whether – or under what circumstances – regulatory authorities would be willing to grant market approval for such a rapidly changing product, as they do, for example, for influenza vaccines. Another significant problem for the development of new phage therapies, also addressed in the paper, lies in the reluctance of pharmaceutical companies to invest in the field. The potential obstacles to more private involvement in phage therapy are many, ranging from considerable risks of failure, reputational damage, and unforeseeable side-effects to insufficient certainty with regard to intellectual property protection and guarantees of a profit.
In January, the Food and Drug Administration (FDA) approved the use of the PillCam COLON 2 as a minimally-invasive means of viewing the colon, a development that is sure to be welcomed by U.S. patients who currently undergo an estimated 14 million colonoscopies each year. While the approval represents a major step forward, the PillCam is unlikely to supplant current procedures just yet.
The colon has traditionally been examined via optical colonoscopy, a procedure perceived by many as uncomfortable and embarrassing, in which a 5-6 foot long flexible tube is inserted through the rectum as part of an examination that can take 30 to 60 minutes. Air must be pumped in through the rectum in a process called “insufflation.” Sedatives and pain medication are generally used to help relieve discomfort. In contrast, the PillCam COLON contains a power source, a light source, and two tiny cameras encapsulated in an easy-to-swallow pill that produces no pain, or even sensation, as it moves through the colon. Reflecting the absence of discomfort, one report from a clinical researcher noted that a few patients have insisted on X-rays to confirm that the device had passed in their stool (FDA Consumer). The pill takes about 30,000 pictures before passing naturally from the body, which usually occurs before the end of its 10-hour battery life.
The safety record of capsule endoscopy, the category to which the PillCam COLON belongs, so far appears to compare favorably with the alternatives. Capsule endoscopy may be less likely to produce accidental colonic perforations or other serious complications, which occur in less than 1% of traditional colonoscopies despite the best efforts of the treating physician. Tears of the colon wall can in turn “rapidly progress to peritonitis and sepsis, carrying significant morbidity and mortality.” (Adam J. Hanson et al., Laparoscopic Repair of Colonoscopic Perforations: Indications and Guidelines, 11 J. Gastrointest. Surg. 655, 655 (2007)). Splenic injury or other serious complications also occur rarely with optical colonoscopies. Unlike “virtual colonoscopy,” which uses computed tomography (CT) to peer into the body, capsule endoscopy does not involve bombarding the body with radiation. A leading study published in the New England Journal of Medicine reported no serious adverse events among 320 subjects given the PillCam COLON, and concluded that use of the device was “a safe method of visualizing the colonic mucosa through colon fluids without the need for sedation or insufflation.” Continue reading →
Richard A. Epstein is a professor of law at NYU Law School, a Senior Fellow at the Hoover Institution, a Senior Lecturer at the University of Chicago and a visiting scholar with the Manhattan Institute’s Center for Legal Policy. His forthcoming book is “The Classical Liberal Constitution,” from Harvard University Press.
On November 22, 2013, the Food and Drug Administration flexed its regulatory muscle by sending a warning letter to a genetic-testing company that goes under the stylish name of 23andme. The object of FDA scorn was a diagnostic kit that the tech company, backed by Google and Johnson & Johnson among others, sold to customers for $99. The kit contained an all-purpose saliva-based test that could give customers information about some 240 genetic markers, which relate to a wide range of traits and disease conditions. The FDA warning letter chastised 23andme in no uncertain terms for being noncooperative and nonresponsive over a five-year period in supplying the information that the FDA wanted in order to evaluate its product as a Class III device under the Medical Devices Act.
Legal Regulation of 23andme
There is no doubt that the FDA is on solid legal ground. This case is not like Regenerative Sciences, LLC v. United States, where the FDA asserted that physicians’ use of certain stem-cell procedures for joint disease involved the use of a drug that required FDA approval before it could be used. In an earlier essay for the Manhattan Institute, I argued that this classification was in fact both legally incorrect and socially mischievous. Here, however, those legal arguments are not available to 23andme, because the current definition of “medical devices” covers not only devices intended for use on the human body but also those used for the diagnosis of disease. The Class III classification means that this device has to receive premarket approval from the FDA, which in turn requires that it be shown to be safe and effective for its intended use. Getting approval under this standard is arduous business, because approval must be obtained for each test separately; 240 tests thus require 240 approvals. The costs are prohibitive, and the delay enormous.
The FDA Warning Letter is significant both for what it says and for what it does not say. Continue reading →
At Regulatory Focus earlier this week, Alexander Gaffney wrote about what he characterized as “a torrent of studies” that FDA is conducting or has proposed conducting on prescription drug promotion, and, in particular, on direct-to-consumer advertisements. The studies include, among others, a survey study aimed at sussing out “the influence of DTC advertising in the examination room and on the relationships between healthcare professionals and patients”, a study exploring similarities and differences in the responses of adolescents and their parents to web-based prescription drug advertising, and a study that will use eye tracking technology to collect data on the effect of distracting audio and visuals on participants’ attention to risk information.
Gaffney speculates that “the proposed studies could indicate coming changes in FDA’s regulatory approach toward advertising[.]” Another possibility is that the studies are part of an effort by FDA to build up the evidence base supporting its current regulatory approach. In a Tweet commenting on Gaffney’s article, Patricia Zettler–a Fellow at Stanford Law School’s Center for Law and the Biosciences who was formerly an Associate Chief Counsel for Drugs at FDA’s Office of Chief Counsel–asks whether the data generated by the studies could help insulate FDA from First Amendment challenges. Continue reading →
Stem cells have been an endless source of fascination and controversy since Dolly the sheep was cloned in 1996. This month’s announcement of a cloned human embryo from a single skin cell came on the heels of Sir John B. Gurdon and Dr. Shinya Yamanaka’s receipt of the 2012 Nobel Prize in Physiology or Medicine for their work with induced pluripotent stem cells. Pluripotent stem cells can be embryonic or induced. Embryonic stem cells (ESCs) can generally be obtained from human embryos or by cloning embryos through somatic cell nuclear transfer (SCNT), as was done for Dolly. Gurdon and Yamanaka demonstrated that pluripotent cells may also be formed by reprogramming adult cells to an embryonic state, resulting in induced pluripotent stem (iPS) cells without having to use eggs or cloning, or destroy embryos. However derived, pluripotent cells are capable of differentiating into virtually any cell type in the human body. This imbues them with great promise for scientific breakthroughs and medical advances, but also raises serious ethical, legal and safety concerns about their use.
Less controversial are “multipotent” adult stem cells (ASCs) which do not involve embryos or raise as many safety concerns as pluripotent cells. ASCs are found throughout the body. Their ability to differentiate is more limited than pluripotent cells but is vast nonetheless. The NIH’s clinicaltrials.gov site lists some 4500 ASC trials as compared with 27 for embryonic stem cells and 21 for induced pluripotent stem cells. Recent announcements of new stem cell treatments usually involve ASCs, such as last month’s news that a toddler born without a trachea received a new one made from her own adult stem cells. It is therefore no surprise that ASCs have captured the attention of researchers, investors, physicians, patients and – increasingly – regulators, both here and abroad.
A growing number of physicians routinely offer their patients treatments involving ASCs that can be performed in the office. These autologous adult stem cells, used to treat a variety of conditions, are harvested from the patient, processed, and returned to the same patient. It is no surprise that moving ASCs from laboratories to physician offices raises complex questions of law. We consider one of the more pressing ones: to what extent can the FDA regulate a physician’s ability to treat a patient with that patient’s own stem cells? In the coming months, the D.C. Circuit Court of Appeals will hear oral arguments on this very issue in United States v. Regenerative Sciences.
On May 22, 2013, the Senate Health, Education, Labor and Pensions (HELP) Committee unanimously approved S. 959, “The Pharmaceutical Compounding Quality and Accountability Act,” and S. 957, “The Drug Supply and Security Act” (now incorporated into S. 959 as an amendment). Congressional efforts to enact comprehensive legislation to improve drug safety and secure the nation’s drug supply chain have lingered for over a decade. The lack of federal uniformity has allowed a patchwork of state legislation to emerge, attracting the less scrupulous to those states with the lowest security. The issue finally gained traction among HELP Committee members when 55 people died and 741 more became ill after contracting fungal meningitis from contaminated steroid injections made by the New England Compounding Center (NECC). Committee member Sen. Pat Roberts (R-KS) stated that given prior reports of problems with NECC, this tragedy could have been averted but for a “shocking failure to act” by NECC, state and federal regulators, and Congress.
As NECC’s role in the meningitis outbreak came to light, gaps in regulatory oversight did, too. The federal Food, Drug, and Cosmetic Act (FDCA) currently recognizes only two categories of pharmaceutical manufacturers: commercial pharmaceutical companies and compounding pharmacies. To qualify as the latter under federal law, the entity must make individual or small-batch, patient-specific drugs, and do so only with a physician’s prescription for that patient. Compounded drugs must either be unavailable in the commercial market or needed in commercially unavailable doses or combinations. The FDCA exempts such compounders from the pre-marketing requirements applicable to commercially manufactured drugs. Thus, federal law clearly covers commercial pharmaceutical manufacturers, and state law just as clearly oversees and licenses pharmacies; but as the NECC case demonstrates, there is nothing clear about the responsibility for inspecting, licensing, or otherwise overseeing compounders that do not fill prescriptions on a per-patient basis.
Instead of compounding in response to individual prescriptions, the New England Compounding Center made large batches of drugs for institutional buyers such as hospitals. Many of its drugs were commercially unavailable, but some were knock-offs of marketed FDA-approved drugs – a practice that is clearly unauthorized. NECC’s business model was certainly not unique; neither was the limited and erratic response of state and federal regulators to complaints about the facility’s unsafe manufacturing practices. Congress knew that large-scale compounders existed, and it knew of the concerns about their safety. Several members of the Senate HELP Committee had worked on curative legislation for over ten years, but made few inroads until the NECC crisis prompted the HELP Committee to shift from park into drive.
Manufacturers assert that they have no obligation to provide consumers with notice through labeling when ingredients created through innovative technologies are introduced into consumer products designed for human consumption. On the other hand, consumers take the position that they have the right to know what ingredients are in these products, especially when ingredients are novel and the risks associated with exposure to them are unknown. Recent events suggest that this problem may be developing a life cycle that savvy manufacturers should be watching. The first in what may be a series of examples of this life cycle is the conflict over the labeling of genetically modified plant ingredients in food.
When industry ignored this consumer preference, a market was created for products that are “GMO-free.” Thus, the practice of “GMO-free” labeling was born. The growing consumer labeling movement also triggered repeated attempts to pass labeling laws. While these efforts have been unsuccessful to date, they are gaining traction – for instance, it cost industry $40 million to block California’s Proposition 37, which called for mandatory labeling, last fall. With more legislative proposals cropping up (a ballot initiative in Washington State and legislative proposals in Connecticut, Vermont, New Mexico and Missouri), a growing consumer boycott of some organic or “natural” brands owned by major food companies, and a popular, recently introduced mobile app from Fooducate that allows consumers to check for GMO content in a growing number of products, industry may be seeing the writing on the wall. Just this year, Ben & Jerry’s Ice Cream has decided to remove GMO ingredients from its supply chain. And the Meridian Institute, which organizes discussion of major issues, convened a meeting in Washington last month that included executives from PepsiCo, ConAgra and about 20 other major food companies, as well as Wal-Mart and advocacy groups that favor labeling. See here. Many are predicting that voluntary labeling may be right around the corner.
It appears that this life cycle of manufacturers’ refusal to disclose innovative ingredients with unknown risks and consumers’ reactive self-help measures may be repeating itself in the context of the use of nanotechnology in consumer products.
While reading some of the great articles from the health section of the New York Times over the holidays it struck me that such articles, in their need to be concise and accessible, often give only passing treatment to regulatory concepts that can be fundamental to the story. Accordingly, I thought it might be useful to write a series of posts digging down a bit deeper into some of the regulatory foundations of health stories that percolate up to public attention through the news. In this post I’ll begin by looking at an interesting point relating to drug efficacy standards raised by an article about a newly expensive (but decades-old) drug.
In Andrew Pollack’s “Questcor Finds Profits, at $28,000 a Vial” we read that a drug called Acthar, first approved by the FDA in 1952 and used primarily to treat rare infantile spasms, has in recent years become a very expensive and (for its maker) lucrative treatment for conditions ranging from multiple sclerosis to rheumatologic conditions. The article is worth a read for its thoughtful discussion of drug pricing, but it also makes passing reference to some important regulatory concepts that bear further examination. One issue that particularly stood out to me was Pollack’s statement that Questcor, Acthar’s manufacturer, has been able to market the drug for a variety of uses “without being required to prove that the drug actually works” because it was “essentially grandfathered” into an anachronistic efficacy standard by being “approved for use in 1952, before the [FDA] required clinical trials . . . .” On first read, that sounds fairly alarming, so I thought it might be worthwhile to unpack the law around such “grandfathered” drugs a little. While it is true that FDA did not require proof of effectiveness for new drugs until lawmakers included this requirement in the Drug Amendments of 1962, it isn’t the case that pre-1962 drugs simply get a free pass on proving effectiveness. The truth, as one might expect, is somewhat more complicated. Continue reading →
How to Survive a Plague is a moving chronicle of the onset of the AIDS epidemic as seen through the lens of the activists who mobilized to identify and make available the effective treatments we have today. Beginning at the start of the epidemic, when little was known about the HIV virus and even hospitals were refusing to treat AIDS patients out of fear of contagion, the film follows a group of leaders in the groups ACT-UP and TAG. Using existing footage interspersed with current-day interviews, it tells the story of how patients and concerned allies pushed the research community to find a way to treat what was then a lethal disease.
The film’s portrayal of the U.S. Government, specifically then-President George H. W. Bush and high-ranking officials in the Food and Drug Administration, is damning. As hundreds of thousands of people became infected with HIV and the death toll rose, prejudice against marginalized groups (especially gay men and IV drug users) contributed to a lack of urgency about the need to learn how to stop the spread of the virus and how to treat the opportunistic infections that killed people with full-blown AIDS. In contrast, footage of demonstrations, meetings, and conferences highlights the courage of the activists who risked and endured discrimination, beatings and arrests to bring attention to the need for more research.
But How to Survive a Plague is more than a documentary about the power people have to make change when they join together to demand action. It also is a provocative commentary about unintended consequences. I saw the film while attending the annual Advancing Ethical Research Conference of Public Responsibility in Medicine and Research (PRIM&R). In that context, I was especially interested in the way How to Survive a Plague highlights an interesting ethical issue in clinical research. Namely, the problem of protecting people so much from research risks that the protection itself causes harm. Continue reading →
I’m sure many of us are talking about the contaminated steroid injections that have spread fungal meningitis, caused by Exserohilum rostratum, across the country. The CDC, which as usual is doing an excellent job of providing clear and current information, reports that as of “October 17, 2012, a total of 47 patients have laboratory-confirmed fungal meningitis.” They offer some reassuring information—that “this form of fungal meningitis is not contagious”—and some scary information—that there are 257 cases and ten deaths in 15 states, and that incubation periods last up to a month.
In 2007, motivated by concerns that pharmaceutical companies were not sharing negative data about what had been learned in clinical trials, Congress established enhanced reporting requirements.
A series of articles published in January 2012 in the British Medical Journal demonstrates that data reporting remains deeply problematic, especially for industry-sponsored trials. (The articles can be found here and are very much worth reading).