February 20 – Edited to reflect Dr. Patil’s proper title. I suppose I was projecting when I erroneously wrote that he would be a Deputy Chief Technology Officer for Data Privacy.
Earlier today Federal Chief Technology Officer Megan Smith announced that the Administration has named Dr. DJ Patil as Deputy Chief Technology Officer for Data Policy and Chief Data Scientist. Patil has been a data scientist for most of his career, which includes work in academia, government and the private sector. He is credited as one of the people who coined the term data scientist. He comes to the White House from RelateIQ, where he was Vice President of Product.
Much like the Chief Technology Officer position, the White House gives the Chief Data Scientist a broad set of responsibilities. From the announcement:
“As Chief Data Scientist, DJ will help shape policies and practices to help the U.S. remain a leader in technology and innovation, foster partnerships to help responsibly maximize the nation’s return on its investment in data, and help to recruit and retain the best minds in data science to join us in serving the public. DJ will also work on the Administration’s Precision Medicine Initiative, which focuses on utilizing advances in data and health care to provide clinicians with new tools, knowledge, and therapies to select which treatments will work best for which patients, while protecting patient privacy.”
Patil will also work on the government’s efforts in open data and data science. The initial press surrounding the announcement has focused on the data scientist part of Patil’s job title.
What could prove interesting is how much time he will spend on the data policy part of that title.
In the last week two federal agencies, the National Aeronautics and Space Administration (NASA) and the Agency for Healthcare Research and Quality (AHRQ), released their open access policy plans (H/T OA Tracking Project, techdirt and SPARC). They follow the Department of Energy, which announced its plans last August.
The AHRQ and NASA plans both rely on the PubMed Central database for depositing publications that come from research the agencies fund. PubMed Central is the repository used for compliance with the National Institutes of Health's Public Access Policy. (It is not the same system that underlies the new Department of Energy system.) Each agency would expect the research publications to be available in PubMed Central within 12 months of publication.
But the new open access effort of the Obama Administration addresses both research publications and research data. PubMed Central is not set up as a digital data repository, and most of the effort to set up new systems has been focused on research publications. The AHRQ policy departs from this by including digital data. The policy does not establish an AHRQ repository, but would require the agency to contract with a repository to store the research data covered by the policy. Grantees would have to submit a data management plan (increasingly a requirement of federal research grants) and work with agency personnel to address the matter.
The NASA policy on digital data does not involve a digital repository of its own – at least at the moment. It will instead serve as a central point of information for accessing digital data stored elsewhere. Grantees would outline their digital data access strategy in a data management plan. The agency will make a registry available with metadata and access instructions for datasets generated by its funded research. NASA indicates in its plan that it will consider developing a research data commons (in consultation with other Federal agencies), but gives no timetable for that decision.
It is worth noting that NASA went so far as to commission an independent assessment of PubMed Central, the Energy Department’s PAGES system, and the publisher-encouraged CHORUS system before making its decision on how to handle research publications. That suggests to me how seriously the agency is approaching the matter of open access to scientific information: NASA recognizes the infrastructure investment involved in complying with this policy, and it wants to make a wise investment. I do hope other agencies are following suit.
This year’s big superhero movie, at least based on expectations, will be Avengers: Age of Ultron. Ultron, at least in the Marvel movies, is an artificial intelligence program that makes its own robotic body while wreaking havoc on the Avengers. While it may be the biggest robot movie of the year, it won’t be the only one.
Also in the blockbuster camp is the newest Terminator film, coming out in July. Subtitled Genisys (yes, that’s how they are selling it), the film appears to be revisiting many of the events of the first film. That suggests a complicated plot with characters trying to do what was done in the first film, but having to deal with the events of all the intervening films (and television series) and their consequences.
Robots will likely be found throughout the new Star Wars film, scheduled for a Christmas-time release. But as with the other blockbusters with robots, I’m not anticipating a lot of time spent on how these characters can provide insight on our own lives. But there are other robot movies this year that could do that.
Chappie premieres in the United States on March 6. The next film from Neill Blomkamp (District 9, Elysium) focuses on the first robot that can think and feel. Chappie is a police droid (continuing a theme from the Robocop remake and 2014’s Almost Human). Based on the director’s short film Tetra Vaal, Chappie captures how the authorities react to the emergence of artificial intelligence.
In April Ex Machina will be released widely in the U.S. (it is already screening in a few countries). The film focuses on a programmer who must evaluate how human a breakthrough artificial intelligence is. The antagonist for the robot in this film is its creator, rather than the ‘authorities’ as in Chappie.
I’ll include Automata in this list, though the film has already been in theaters and was released on DVD in November.
While the confirmation challenges continue, at least one high-level science and technology appointee is stepping down. Margaret Hamburg, Commissioner of the Food and Drug Administration, will step down at the end of next month (H/T ScienceInsider). Even with slightly less than two years remaining in the Obama Administration, there is a chance Dr. Hamburg’s successor may not be confirmed before a new President is sworn in. (I am encouraged by indications that a nominee has already been identified.)
Hamburg’s tenure is one of the longest in the agency’s history. When she took over, the agency was struggling with ethical challenges over drug approvals. The agency is currently in the middle of regulatory changes for food safety oversight and medical device approvals, and her tenure has not been without criticism. There were complaints over the agency’s inaction on compounding pharmacies (which are not presently regulated by the FDA), and Hamburg was overruled by Health and Human Services Secretary Sebelius over Plan B emergency contraception (though the courts eventually sided with the FDA).
Coming next month from Harvard University Press is Science Policy Up Close, an edited collection of writings from former presidential science adviser John Marburger. Edited by Robert Crease, the book covers much more of Marburger’s life than his tenure as the longest serving science adviser (current occupant John Holdren should take that distinction in early 2016).
The first half of the book covers Marburger’s extensive service prior to joining the George W. Bush Administration. Marburger chaired the organization that oversaw construction of the Superconducting Super Collider (SSC) project, and chaired the Shoreham Commission, a New York State panel in the early 1980s that assessed the safety and other concerns surrounding the Shoreham nuclear power plant.
I have seen one review of the book so far, in Nature (you should be able to access a ReadCube version without a subscription). Written by Sir Peter Gluckman, the chief science adviser to the Prime Minister of New Zealand, the review reads as though Sir Peter is as frustrated by what is not covered as he is respectful of what is in the book. Regrettably, Dr. Marburger passed away in 2011, so he may not have had much input into this project.
I do not yet have a copy of the book, and hope to get my hands on it at some point. I am interested to see how Marburger’s explorations of the science of science policy are addressed – it would appear to be in the last chapter of the book. Assuming I can tackle my reading piles (measured in fractions of my height), expect a post on the book sometime this year.
In what appears to be part of its response to the problems federal labs had this summer controlling pathogens, the Centers for Disease Control and Prevention (CDC) is looking to hire a person responsible for lab safety agency-wide, with an emphasis on pathogens. Reuters, which broke the story, indicates this was recommended after an agency review following this summer’s incidents. Adding to the urgency, last week a CDC technician may have been exposed to Ebola.
While the hiring search is handled by the CDC head of Infectious Diseases, CDC microbiologist Leslie Dauphin will serve as acting lab safety head. She will likely expand safety training and explore other avenues for improving safety.
Disasters.data.gov was launched in mid-December (H/T Executivegov.com). The site is intended to serve as a repository of open data sets related to disasters (which can include severe winter weather along with the usual suspects). Several tools designed for emergency preparedness are linked from the site.
There are two ways in which people can help build out the site:
Innovator Challenge – The site is looking for short descriptions of answers to this question: ‘How might we leverage real-time sensors, open data, social media, and other tools to help reduce the number of fatalities from flooding?’
Call To Action – The site wants to host open data sets for information that has, traditionally, not been made available to the public. Most of the data sets currently on the site are federal in origin, so there is a bigger need for information at a more local scale. This could include evacuation routes, information on what businesses remain open during disasters, or other disaster-related data.
If you are involved in disaster preparedness and response (and aren’t already familiar with the site), there are other ways you can get involved.