White House Launches Effort To Revise Biotechnology Regulation

While I find the timing suspect, on Thursday John Holdren, Director of the White House Office of Science and Technology Policy (OSTP), announced (along with other senior White House staff) that the Administration will be reviewing the Coordinated Framework for the Regulation of Biotechnology, the policy that designates agency responsibilities for overseeing the introduction of biotechnology products into the environment (H/T Grist). The Framework was first developed in 1986 and last revised in 1992, so a review is clearly overdue.

Holdren’s announcement accompanied a memorandum to the Environmental Protection Agency, the Department of Agriculture, and the Food and Drug Administration. The memorandum (along with Holdren’s blog post) outlines the elements of the review process:

  • Updating the Coordinated Framework (with public input) to clarify the biotechnology product areas (not processes) for which each agency will be responsible. This will include how to handle situations where more than one agency may be responsible.
  • Developing a long-term strategy (with public input) to ensure that the Federal regulatory process will be better prepared for emerging biotechnologies.  This would include horizon scanning exercises and additional support of so-called ‘regulatory science.’
  • An independent examination of the future landscape of biotechnology.  The National Academies have already been engaged to start this analysis.

This all sounds great, but some aspects of it give me pause. First, the announcement comes the afternoon before the July Fourth holiday weekend. It screams news dump: a release timed so that very few people become aware of the effort.

Additionally, while the revisions and the strategy will involve public input, Holdren asks people interested in additional information to register. If this weren’t part of an announcement that seems timed to minimize public attention, I might not think much of it. But I can see the Administration limiting its subsequent publicity on this project to the people who register. If they are going to try to hold listening sessions around the country (the first one will take place this fall), I think they should spread their message far and wide.

Finally, I guess I’m still a bit chagrined by other efforts to revise (or develop) regulations related to science and technology research. The effort to revise the Common Rule for human subjects research stalled out after a big public comment push in 2011. And the push on scientific integrity policies seems to have faltered mainly because of a lack of coordinated follow-through from OSTP.

I’d love for that not to happen with the revisions to the Coordinated Framework, but I’m not optimistic, especially with roughly 18 months left in this Administration.

Just Because The Outbreaks Aren’t In The News…

Recent developments should reinforce the notion that media coverage does not track the incidence of disease.

While the measles outbreak in California hasn’t been in the news since April, when state officials declared it over, measles hasn’t been eliminated from the country. As of June 26, the Centers for Disease Control and Prevention had recorded 178 cases of measles in 2015, just 117 of them connected to the California outbreak (which started in late 2014). And today Washington state health officials reported a death from measles, the first reported death from the disease in the United States since 2003. Though measles was once declared eliminated in the U.S., its reemergence reflects both the persistence of the disease and continued resistance to vaccination.

Unfortunately, the 2014 Ebola outbreak has yet to yield a viable vaccine, and while it has dimmed from American attention, it continues to affect western Africa. Cases continue in Sierra Leone and Guinea, and the disease has re-emerged in Liberia more than three months after that country’s last reported case. Meanwhile, the person appointed by the Obama Administration to coordinate the nation’s response to Ebola left that position four months ago.

So, remember that just because we’ve stopped paying attention to a problem doesn’t mean it has been solved. It just no longer bothers us enough to do something about it.

Committee Offers Guidelines For Promoting Transparency and Openness

Also in the Policy Forum section of this week’s edition of Science is a longer paper on how journals and scientific organizations might promote transparency and openness in the research they publish and support. The Transparency and Openness Promotion Committee developed the guidelines, which are also available on the website of the Center for Open Science, one of the parties involved in the project. The committee, organized by representatives from the Center for Open Science, Science magazine and the Berkeley Initiative for Transparency in the Social Sciences, met in November 2014.

Signing on to these guidelines means an organization or journal expresses support for them and commits to reviewing them for possible adoption.

The guidelines cover eight kinds of standards: citation, data, analytic methods, research materials, design and analysis reporting, preregistration of studies, preregistration of analysis plans, and replication. For each of these standards categories, there are three levels of disclosure. Per the guidelines, “Level 1 recommends citation standards, Level 2 requires adherence to citation standards, and Level 3 requires and enforces adherence to citation standards.”

Each journal or organization can decide for itself which level is appropriate for each standard. Scientific norms vary by field where citation and sharing practices are concerned, and there are also matters of infrastructure and expense to consider.

So this project has come to the beginning of the middle.  Over the next several months the many signatories should be reviewing their practices and determining which standards they will adopt and at what level.  If your field has signed on, see what you can do to ensure that a commitment is made to follow these guidelines.

National Academies Aim To Prime Pump For Self-Correction Discussion

In this week’s issue of Science, the Policy Forum section includes an essay from several senior researchers and research administrators discussing the challenges of improving incentives for ensuring high integrity in research. The group was convened by the National Academies and the Annenberg Retreat at Sunnylands.

The essay covers a number of concerns about vetting research results that have been heard before (publishing negative results, need for additional mentoring, independent validation/replication, etc.) and the initiatives several journals and institutions are taking to improve those processes.  But one particular item caught my attention, and that of others: distinguishing between retractions due to fraud or misconduct and those needed for other reasons.  From the essay:

“[V]oluntary withdrawal of findings by a researcher eager to correct an unintended mistake is laudatory, in contrast to involuntary withdrawal by a duplicitous researcher who has published fraudulent claims. Alternative nomenclature such as “voluntary withdrawal” and “withdrawal for cause” might remove stigma from the former while upping it for the latter.”

In other words, the authors suggest folks aren’t so inclined to report unintended mistakes because of the stigma attached to the word “retraction.” Whether “withdrawal for cause” carries a more negative stigma than “retraction” is unclear to me.

Changing the nomenclature may help, but as the essay also notes, the infrastructure for checking research results has not matured at a rate comparable to either the increase in scientific research output or the increasing ease of committing scientific fraud and other misconduct.

What might be more effective, but possibly more challenging, is implementing this goal from the essay: “We believe that incentives should be changed so that scholars are rewarded for publishing well rather than often.” I think this is an excellent goal, but two sets of stakeholders are locked into the notion of research quantity as a proxy for quality. Not only is it embedded within the university reward structure, it is also integrated into policymakers’ discussions of scientific research support. With Nobel Prize counts frequently cited (often as a scientific equivalent of ‘mine’s bigger than yours’), efforts to encourage fewer publications are going to be looked at a little oddly by those who hold the purse strings.

A National Academies report due later this year should give more details on some of the ideas broached in this essay. Hopefully it will also prompt the dialogue the authors want.

Commissioner Moedas Wants A Funding Council For Innovation

As part of a conference on the European Research Area (ERA), Commissioner for Research, Science and Innovation Carlos Moedas gave a speech (H/T ScienceInsider). In his remarks, he discussed starting a second era for the ERA and the ‘Innovation Union.’

Commissioner Moedas is concerned about how effectively European Union member states have been commercializing the results of their research. It’s not a problem unique to Europe. Finding sufficient funding to turn research results into commercially viable products is always a challenge, in part because the perceived investment risks often dissuade potential investors.

Institutional support can help facilitate commercialization, and one means the Commissioner would like to have at the ready is a European Innovation Council (possibly) modeled after the existing European Research Council.

I include the (possibly) because while I understand the urge to duplicate a successful organization for a slightly different purpose, a funding organization for research is not necessarily going to be as effective for funding innovation.  If the goal is to make it easier for private companies to invest in promising research, regulatory changes may be more effective than a funding council geared toward supporting potential innovators and/or innovative companies.  Hopefully this kind of examination will take place (if it hasn’t already) between now and 2017, when this possible Council will be discussed during the mid-term review of the Horizon 2020 research programme.

There are other items worth following in the Commissioner’s speech. Two I will be particularly interested in watching are his proposal for a European Research Integrity Initiative and the idea of developing a research data repository for the EU. (I don’t think putting it in the cloud is necessarily the greatest idea, at least not without serious access control provisions.)

National Library Of Medicine Urged To Lead More On Data

Last week National Institutes of Health (NIH) Director Francis Collins approved a report on a new strategic vision for the National Library of Medicine (NLM) (H/T ScienceInsider). The Director requested the report at the beginning of the year, and it arrives not quite three months after the retirement of NLM Director Donald Lindberg, who had led the Library since 1984.

NLM is responsible for a number of programs related to medicine and health-related data. PubMed is perhaps the best known outside of the biomedical community, but NLM maintains many other health-related databases, as well as physical artifacts and other records. But the report calls for NLM to be more of a leader in biomedical information across NIH, the federal government, and internationally. This will include an expansion of NLM activities in data science and biomedical informatics, and the Library will need to be systematic and deliberate in how it expands while continuing to deliver quality service to its many stakeholders.

The report contains specific suggestions about how NLM could expand its offerings and improve its services. For instance, it suggests that the Big Data to Knowledge program be located in the NLM. But with the Library needing a new Director, many of these actions will likely wait until that person is on board and can determine the adjustments to resources and personnel needed to implement these recommendations.

The National Park Service As A Science Agency

In this week’s edition of Science, the editorial covers the upcoming centennial of the National Park Service (NPS) and how it has supported science. Written by Editor-in-Chief Marcia McNutt and Gary Machlis, science adviser to the National Park Service, the editorial describes how the NPS has served as a field study location for researchers in many fields, and encourages scientists to continue using the parks for research.

The NPS scientific apparatus includes a number of different tools, spanning a wider variety of fields than you might expect. This likely isn’t news to ecologists or researchers in related fields, but I think the NPS centennial could be a wonderful opportunity to demonstrate that science and technology matter to many different government agencies.

I want to close by re-emphasizing some of the recommendations in the editorial. The research data generated in the park system should be shared as widely as practical. Researchers, including citizen scientists, should be encouraged to use park sites for study. You can both recreate and investigate in the National Parks.