Yves here. Fortunately, this post does not use the Wuhan Institute of Virology as the hook for this story but instead focuses on activist Edward Hammond, who in 2004 started to question biolab safety and got unsatisfactory answers.1 The short version is that the oversight mechanism, institutional biosafety committees, is weak. And this should come as no surprise, since these biosafety minders report to the lab funder, the NIH. No wonder they are set up to be in “See no evil” mode.
By Michael Schulson, a contributing editor for Undark whose work has also been published by Aeon, NPR, Pacific Standard, Scientific American, Slate, and Wired, among others. Originally published at Undark
In 2004, an activist named Edward Hammond fired up his fax machine and sent out letters to 390 institutional biosafety committees across the country. His request was simple: Show me your minutes.
Few people at the time had heard of these committees, known as IBCs, and even today, the typical American is likely unaware that they even exist. But they’re a ubiquitous — and, experts say, crucial — tool for overseeing potentially risky research in the United States. Since 1976, if a scientist wants to tweak the DNA of a lab organism, and their institution receives funding from the National Institutes of Health, they generally need to get express safety approval from the collection of scientists, biosafety experts, and interested community members who sit on the relevant IBC. Given the long reach of the $46-billion NIH budget, virtually every research university in the U.S. is required to have such a board, as are plenty of biotechnology companies and hospitals. The committees “are the cornerstone of institutional oversight of recombinant DNA research,” according to the NIH, and at many institutions, their purview includes high-security labs and research on deadly pathogens.
The agency also requires these committees to maintain detailed meeting minutes, and to supply them upon request to members of the public. But when Hammond started requesting those minutes, he found something else. Not only were many universities declining to share their minutes, but some didn’t seem to have active IBCs at all. “The committees weren’t functioning,” Hammond told Undark. “It was just an absolute joke.”
The issue has gained fresh urgency amid the Covid-19 pandemic. Many scientists, along with U.S. intelligence agencies, say it’s possible that SARS-CoV-2, the virus that causes Covid-19, emerged accidentally from a laboratory at the Wuhan Institute of Virology, or WIV — a coronavirus research hub in China that received grant funding from the NIH through a New York-based environmental health nonprofit. Overseas entities receiving NIH funding are required to form institutional biosafety committees, and while grant proposals to the NIH obtained by The Intercept mention an IBC at the Wuhan institution, it remains unclear what role such a committee played there, or whether one was ever really convened.
An NIH spokesperson, Amanda Fine, did not answer questions about whether the Wuhan institute has had a committee registered with the agency in the past. In an email, she referred to a roster of currently active IBCs, which does not list WIV. Other efforts by Undark to obtain details about meetings of the Wuhan lab’s IBC were unsuccessful. But so, too, were initial efforts to obtain meeting minutes from several IBCs conducting what is supposed to be both routine and publicly transparent business on U.S. soil. Undark recently contacted a sample of eight New York City-area institutions with requests for copies of IBC meeting minutes and permission to attend upcoming meetings. Most did not respond to initial queries. It took nearly two months for any of the eight institutions to furnish minutes, and some did not provide minutes at all, suggesting that in many cases, the IBC system may be as opaque and inconsistently structured as when Hammond, who eventually testified before Congress on the issue in 2007, first began investigating.
Indeed, recent interviews with biosafety experts, scientists, and public officials suggest that IBC oversight still varies from institution to institution, creating a biosafety system that’s uneven, resistant to public scrutiny, and subject to minimal enforcement from the NIH. Hammond and other critics say these problems are baked into the system itself: As the country’s flagship funder of biomedical research, the NIH, these critics say, shouldn’t also be charged with overseeing its safety.
For its part, NIH has argued that as an agency intimately involved in reviewing the complex details of biomedical research, it is well-suited to manage the network of committees ostensibly set up to help ensure the safety of that research. And the IBC system, the NIH says, is just one part of a multi-faceted biosafety apparatus. “They play an incredibly important part,” said Jessica Tucker, acting deputy director of the Office of Science Policy at NIH, “in this interplay of local and federal oversight.”
In some jurisdictions, including the research-heavy corridors of Boston and Cambridge, Massachusetts, the addition of local policies and oversight structures has provided a comparatively clear view of the potentially hazardous biomedical science undertaken there. But the wider network of IBCs remains far more opaque, and insights into how well they operate, or even whether they operate at all, remain unacceptably difficult to discern, Hammond and other critics say — perhaps even more so as a rising crop of for-profit companies offer IBC services to clinical research sites for a fee.
In recent interviews with Undark, biosafety professionals variously described Hammond as “kind of an asshole” and “like a bulldozer” — though those same experts also acknowledged that he has identified real issues. “A lot of what he’s saying makes sense,” said David Gillum, the chief safety officer for Arizona State University and a past president of ABSA International, the flagship professional organization for biosafety specialists in the U.S. Many people in the biosafety community, Gillum said, would agree that “the NIH, if it’s conducting the research — maybe they shouldn’t be self-policing.”
Altering the DNA of microbes and other organisms can bring incalculable social benefits, including new insights into pathogens, new tools for synthesizing drugs, and the development of lifesaving vaccines. Much of it poses little, if any, risk. But it can also, in some cases, involve potential hazards: A pathogen might escape, a lab worker or research subject might be harmed, or a genetically altered organism might spill into the wild without appropriate vetting.
Lab accidents involving pathogens do happen, though most are minor. In rare cases, laboratory workers suffer serious harm or die. Occasionally, incidents can have even broader consequences: Many scientists believe a flu pandemic in 1977 may have originated from an accident at a Soviet lab — though researchers have suggested other explanations in recent years. People sometimes hijack research for nefarious ends, too: The perpetrator of the 2001 anthrax attacks in the U.S., which killed five people, was almost certainly a federal laboratory worker with access to the bacteria and lab equipment.
In response to such risks, the U.S. has developed a range of methods to improve biosafety, which applies to accidents, and biosecurity, which applies to intentional misuses of the technology. In addition to IBCs, some institutions with large research operations employ biological safety officers, whose jobs include inspecting labs, advising researchers on safety practices, preparing materials for IBC review, and, sometimes, serving as IBC members. Research with some pathogens and toxins requires additional review from federal agencies — including background checks for employees and rigorous specifications for lab spaces. Those requirements are backed by the force of law, and are administered by the Centers for Disease Control and Prevention and the U.S. Department of Agriculture.
In the biosafety system, the IBC is a kind of local court, overseeing the implementation of the 149-page NIH Guidelines for Research Involving Recombinant or Synthetic Nucleic Acid Molecules. “We get thousands of emails every year with questions,” the NIH’s Tucker said. “So that is usually the entry point for discussions with IBCs about challenges that they may be facing.”
Much of this system emerged in the 1970s, after Paul Berg, Janet Mertz, and other researchers at Stanford University developed a technique to insert pieces of foreign DNA into E. coli bacteria. Scientists had a new power that could be used to engineer organisms with novel properties. But with that power came risk. “There is serious concern that some of these artificial recombinant DNA molecules could prove biologically hazardous,” a panel of prominent scientists, chaired by Berg, wrote in Science in 1974. Among other scenarios, they worried about the escape of bacteria engineered to resist antibiotics.
The next year, the National Academy of Sciences convened a meeting, chaired by Berg, at the Asilomar Conference Center in California. Despite some calls to include laboratory technicians, custodians, and other members of the public, the Asilomar participants were mostly senior scientists, along with a few lawyers and public officials. The discussion laid out a roadmap for biosafety in the U.S. Notably, an official summary of the proceedings did not include the word “regulation.” The NIH Guidelines, issued in 1976, are just that — guidelines, rather than regulations with the force of law. Institutions that accept NIH funding agree to follow them; if any of those requirements aren’t met, NIH can demand changes, and, at least in theory, pull funding.
“In essence, the goal was self-governance,” wrote Susan Wright, a research scientist emerita in the history of science at the University of Michigan, in a 2001 paper on Asilomar and its legacy. The guidelines allow institutions to largely police themselves, with the IBC exercising oversight for most research. When Sen. Edward Kennedy, a Massachusetts Democrat who passed away in 2009, proposed a bill that would hand the power of regulating genetic engineering to an independent commission, Wright said, major scientific organizations rallied to defeat the proposal.
The prospect of NIH oversight did not immediately reassure residents of Cambridge, Massachusetts, a major scientific research hub. In 1976, amid public alarm about a newly proposed virus and genetics laboratory at Harvard, the city council held a hearing on recombinant DNA research. At a packed meeting, some council members were skeptical that the NIH was equipped to handle the issue. “We’re gonna find ourselves in one hell of a bind,” said councilmember David Clem, “because we are allowing one agency with a vested interest to initiate, fund, and encourage research, and yet we are assuming that they are non-biased and have the ability to regulate that and, more importantly, to enforce the regulations.”
The city went on to pass its own biosafety regulations, enforcing compliance with NIH standards. In the years that followed, however, few other municipalities followed suit.
[Video: At a packed city council hearing in Cambridge, council members discuss the safety of recombinant DNA research with scientists, including the molecular biologist Maxine Singer, who attended as a representative of the NIH. Credit: MIT/J. Christopher Anderson]
Edward Hammond founded The Sunshine Project, a bioweapons watchdog group, in 1999, along with a German colleague. They figured the subject would stay relatively obscure. Then the September 11 attacks happened, followed by the anthrax scare. In response, the George W. Bush administration and Congress poured billions of dollars into preparing for a bioterrorist attack. The number of laboratories studying dangerous pathogens ballooned.
When Hammond began requesting minutes in 2004, he said, he intended to dig up information about bioweapons, not to expose cracks in biosafety oversight. But he soon found that many institutions were unwilling to hand over minutes, or were struggling to provide any record of their IBCs at all. For example, he recalled, Utah State was a hub of research into biological weapons agents. “And their biosafety committee had not met in like 10 years, or maybe ever,” Hammond said. “They didn’t have any records of it ever meeting.”
Other sources from the period after the 9/11 attacks suggest that institutions were often flouting the NIH Guidelines. In 2002, 2007, and 2010, a group of researchers conducted surveys of hundreds of IBCs. Of the IBCs that responded, many were failing to train their members, and many were conducting expedited reviews of research without full committee input — both violations of NIH requirements. In the 2010 survey, nearly 30 institutions reported that they had no formal process to ensure that relevant experiments even received an IBC review.
The NIH has sometimes cracked down on institutions. In 2007, a young insect geneticist, Zach Adelman, joined the IBC at Virginia Tech. Not long after, the NIH determined the IBC was not functioning properly, and made the committee members go back and re-evaluate all relevant research on campus.
If these labs are required to keep minutes and show them on demand, perhaps that is why they are being set up in foreign countries so that they are not subject to such demands.
Priority ordering checks out: profit comes first, and everything else is just a mild annoyance. Rebecca Moritz is angling to be Bill Gates’ next wife, the head of the WHO, or to run the NIH in some shape, way, or form. Anyone know whether she or her institution enjoys a sizable revenue stream from licensing opportunities to biotech firms? Employment opportunities? Stock grants?
One thing I have noticed is that the culture today is distinctly different from that of 20 years ago. I have worked in BSL-2 and BSL-3 labs in the past, and the attitude today is far more risk-averse and risk-conscious than what Hammond is describing.
I am not sure how universal this is, but there is more attention to safety at the managerial level. Twenty to 25 years ago, training was very haphazard and there was little record-keeping to ensure staff were up to date. Now there is required training, refreshed every 12 months, and lab access is strictly controlled, usually through electronic access.
I have also noticed that staff are more safety-conscious: several times in the last few years I have seen people refuse to do experiments because they considered the safety precautions inadequate. So in my experience the current culture is different from the one described by Hammond, although I would agree that a macho culture was prevalent in the past.
Lab culture is like management culture: highly dependent on the individuals in charge.
I have worked in labs around the world, and adherence to paperwork, SOPs, signage, and electronic safeguards has little to do with a baseline culture of rule-obedience and everything to do with the likelihood and severity of getting in trouble.
Regarding footnote 1: that study has authors with clear conflicts of interest, and several hospitals and the infamous WIV are at the centre of the heatmap of clusters. See https://m.youtube.com/watch?v=BYJX8_vWei4 for the long-form version.
No other coronavirus has an FCS, and the virus genome has a 19-nucleotide sequence otherwise found only in pre-pandemic Moderna patents; see the Frontiers in Virology paper. Does it matter? If humans created and caused this mess, then this article is part of the solution. Otherwise, mind the bat.
There is a huge number of bat viruses, and only a tiny handful are being studied in labs.
See, for instance, how the authors of the paper below have to speculate, which is an indirect admission of how limited the sampling is:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6356540/
Any info on the Ukrainian biolabs and what they were doing?
This was very interesting. Biolab oversight is something I took for granted, but always with unease. I had no idea it was so sloppy. The real danger of our irresponsibility was front and center during the anthrax attacks: nobody could trace the stuff with certainty, and the final results have never been published. If our brave new world is going to have a professional labor force for things like biolab inspection, that sounds like a very good thing to me. I’m really sick of talking heads, pols, and bureaucrats who don’t know anything but still stand up in front of a camera and pretend that they do and that everything is under control. Clearly, it has not been. One glaring point of failure is the military biowarfare labs. A feature, not a bug. The deadliest incubators. They will never be openly networked along with their spinoffs at the universities; there will always be stuff that is not made public. That’s disconcerting. One thing we could do is train dogs, rats, and other animals to suss out places we think are fudging the truth. Or canaries in the coal mine.
Very informative and alarming. I’d also nominate for concern the new breed of lunatics known as ‘bio-hackers.’ Until recently these people contented themselves with such inanities as surgically implanting Bluetooth-equipped ID chips so they could avoid the inconvenience of paying for a soda at their workplace with real money. Now, however, they’ve branched out into DIY genomic modifications, up to and including using CRISPR (whether they pay royalties to MIT is uncertain ;-). I listened some months ago to an NPR science program that profiled one of these guys quite favorably. These people are inspired by the sort of cyber-libertarianism that you find in Silicon Valley and other places where blockchain disciples gather. The big difference is that crypto, at worst, separates a lot of fools from their money (and maybe sets off the next Global Financial Crisis). Biohackers combining DNA from different organisms, including themselves, could conceivably brew up a new virus that ends us all.