National Science Board Ends Year With Epic Fail On Public Comment Request

ETA – January 3

The National Science Foundation has repaired the bad link in its announcement and extended the deadline for comments to January 18.


Original Post – December 31

Yesterday the National Science Board (NSB) circulated this news announcement asking for comment on a report on data policies.  Comments are requested by January 10, continuing a particularly annoying trend in this administration of very short public comment periods.  The icing on this particular cake is not just that the announcement was made on the Friday of a holiday weekend, but that the link to the report in the news announcement was broken.  Thanks to Peter Suber and Cliff Lynch for the correct link.

The report reflects the work of the NSB Task Force on Data Policies, which was charged in February 2010 with refining and improving National Science Foundation (NSF) data policies related to digital research data.  (The Office of Science and Technology Policy will shortly close a two-month public comment period on public access to digital data generated by federal research.)  While I don’t think the NSB is anti-public, a review of the recommendations found in the report suggests that the Board is still quite focused on the needs and interests of the scientists and engineers who receive NSF support.

And because I’m still a little annoyed, the highlighted language in this recommendation particularly struck me:

“Provide leadership to Federal agencies and other national and international stakeholders in the development and implementation of digital research data policies, including the promotion of individual scientific communities to establish data sharing and management practices that align with NSF data policies.”

I would think the Office of Science and Technology Policy should be/is providing leadership to Federal agencies, but given the subdued public presence of that office, I can see why the NSB might think the NSF should fill the leadership vacuum.

Happy New Year, everyone!

An Immodest Proposal – Digitizing the Government

Whatever heat and light once circulated around the We The People petition service has dissipated, but a recent petition there deserves greater attention.  Called YesWeScan, the effort is supported by the Center for American Progress, a liberal think tank and one of many organizations doing yeoman’s work in encouraging greater government transparency and public access to information.

The We The People petition is quite short, and to the point.

“The administration should create a group that will answer–within 1 year–the question “what would it take to scan .gov?” What are our federal holdings, what would it take to digitize them, how much would it cost, what are the economic and non-economic benefits?”

The effort is not actually new.  If you dig around on the YesWeScan site, you can see that there have been previous efforts to scan major federal holdings.  This one would be much larger.  Strictly speaking, scanning .gov is not really what they’re looking to do here.  While I doubt there’s a comprehensive index of everything on .gov (they’re only now working on a review of .gov), this digitization would focus on everything the government has that isn’t yet digitized.  Which is likely more than you imagine.

Unfortunately, the petition requires over 24 thousand additional signatures (as of this posting) by January 20 in order to get an official response.  Regardless of that outcome, the six questions posed by the project should be answered.

The Why Behind the Retraction Matters At Least As Much As The What

The year began with a key retracted paper used in anti-vaccination efforts being labeled a fraud.  That such a development is not mentioned in a year-end recap of scientific retractions – one that includes an autism article – strikes me as shortsighted.  While it is mentioned in a more widely-published recap of retractions, most of that list strikes me as something that would pop up in a soundbite-ready Congressional criticism of scientific research.

But whatever list you’re reading, the focus is usually on what gets retracted, and rarely notes that retractions are an important part of the scientific process.  I don’t think an increase in retractions – if that is what’s happening – is a problem.  As the folks at Retraction Watch have put it quite well:

“science doesn’t stop when researchers publish a paper. But what also seems true is that once a paper is published, lots of people — authors and editors, in particular — are often reluctant to say just what’s happened next, particularly if it casts the study or the journal in a negative light.”

If the concern is over these end-of-year filler pieces, I think worries about publicizing retractions are overblown.  If the concern is more personal – about the impact on individual scientists and institutions – I can understand it better.  I just think it causes more harm than good not to talk about what is essentially post-publication peer review.

Harvard Researchers Publish Extensive Facebook Study; Privacy Concerns Forgotten?

Three Harvard sociologists have an article coming out in the Proceedings of the National Academy of Sciences on the influence of tastes in a cohort of university students over their four years at…wait for it…Harvard.  They used Facebook data to conduct their research.  But when the data was initially released in 2008 (a condition of the federal funding for the project), the university wasn’t identified.  The researchers mistakenly assumed that the data could remain anonymized.  They were quickly proven wrong.

This study also raised a bit of a ruckus about six months ago due to this and other concerns that the privacy of the student data was not effectively protected.  Amongst the concerns was that student assistants at Harvard had access to the data as part of their work supporting the sociologists.  The students whose profiles were used were not informed of the collection or the use of their data for research purposes.  Now, the data collection did begin before Facebook developed a protocol for researchers interested in using its data.  Officials at the service indicate that the project, as the Harvard researchers conducted it, would likely not happen the same way if it started today.

That does not necessarily mean that the concerns unique to social network research have been effectively addressed.  I am typically in favor of transparency, open access, and otherwise making data widely available.  That does not mean the rights and concerns of human subjects in such research can be overlooked.  That there’s little discussion of the issue in light of the research article concerns me.

NCATS Now A Reality – Will The Struggle Prove Fruitful?

With the now-traditionally late signing of the new Fiscal Year budget (not quite 3 months late this time), we have a formal reorganization at the National Institutes of Health (NIH).  The National Center for Advancing Translational Sciences is official and the National Center for Research Resources is no more.  (Creating a new center or institute requires greater Congressional action unless an existing center or institute is also removed.)

The idea behind the new center (NCATS) is to help ease the process of taking biomedical research and developing the relevant drugs, devices, or other forms of treatments for clinical use.  Now, there is a concern that additional government efforts in this area will crowd out either the fundamental research that NCATS is helping transform for clinical use, or private sector efforts in translational medicine.

However, perhaps the larger concern for NIH moving forward is to deal with the frustrations felt in some quarters as the process for developing NCATS and shifting programs from the National Center for Research Resources unfolded.  (The early announcement of the center’s formation didn’t help.) Should NCATS stumble in its first efforts, like this joint effort with the Defense Advanced Research Projects Agency and the Food and Drug Administration, it would not surprise me to see additional criticism come forward from aggrieved quarters.

Science and Technology Guests on Late Night, Week of December 26

5/9/12 – Edited to correct the spelling of Jon Ronson’s name.

12/26/11 – Original Post

As you might guess, lots of repeats this week (The Comedy Central shows are pre-empted).  So the science and technology content is, essentially, leftovers.

Tuesday – Jon Ronson’s appearance on Conan, where he discusses his book The Psychopath Test.

Wednesday – Joshua Topolsky’s appearance on Jimmy Fallon’s program from earlier this month as his resident technology expert.

Thursday – Science Bob’s latest appearance on Live With Kelly runs again.

Friday – You can hear Alan Alda re-tell his Marie Curie stories on The Late Late Show with Craig Ferguson.

States Looking at Textbook Alternatives

Complementing Nature Publishing Group’s entry into online texts, several states have been developing systems to support online resources for their higher education students (H/T The Chronicle of Higher Education – check the comments section as well).  Whether it’s online courses or online textbooks, states are recognizing that, at least for textbooks, platform independence helps encourage participation.  It certainly seems to make it easier to keep costs down.

Merry Christmas, everybody.

Anti-Open Access Bill Sneaks Into Congress

Amidst the recent Congressional contretemps over payroll tax increases and intellectual property bills that might “break the Internet,” Representative Darrell Issa introduced H.R. 3699, The Research Works Act.  This bill appears to put a stake in the ground against the National Institutes of Health open access policy and would run against the bills supporting open access for federally funded research.  How it might affect the ongoing open access review at the Office of Science and Technology Policy is unclear (the deadlines for comment there have been extended, slightly).  It might depend on which finishes first, that process or this bill.

Such efforts aren’t exactly new, but this bill takes a less complicated tack than its companion from the previous Congress.  H.R. 3699 simply mandates private-sector publisher consent (as well as the consent of the author(s) and their employer(s)) before any federal program would share such ‘private-sector research’ work.

This would arguably end the current NIH open access guidelines, which require grantees to deposit a copy of their federally funded work in PubMed Central and otherwise make the research available after a set amount of time.

What’s a little surprising is that the sponsor of this bill, Representative Darrell Issa of California, has generally been in favor of greater access to information, including in areas of scientific research.  It would seem in this case that the Congressman was more responsive to the income streams of academic publishers than to the interests of disseminating federally funded information.  Public access bills have not gotten far in the Congress over the last few years (on either side of the issue), and I don’t expect that to change.  Of course, feel free to express your displeasure with the bill at your leisure.

Yes Virginia, There is Some Progress on Scientific Integrity Policies

Office of Science and Technology Policy (OSTP) Director John Holdren marked the passage of his latest deadline for scientific integrity policies with a blog post.  In that post he noted that 20 federal entities have produced final or draft policies that were submitted to his office.

Of those twenty entities, five have finalized policies:

  • Department of the Interior
  • National Aeronautics and Space Administration (NASA)
  • National Science Foundation (NSF)
  • National Oceanic and Atmospheric Administration
  • Intelligence Community

I have posted about draft and/or final policies for all of the above, save the intelligence community.  Should I find a publicly available version of that policy, I will post an analysis here.  I do need to review the final NSF policy, and will post on that soon.

The OSTP blog post notes that 13 other entities have submitted near-final policies, and you can expect to see at least some of them sent out for public review soon:

  • Department of Agriculture
  • Department of Commerce
  • Department of Defense
  • Department of Education
  • Department of Energy
  • Department of Health and Human Services
  • Department of Homeland Security
  • Department of State
  • Department of Justice
  • Department of Labor
  • Department of Transportation
  • Veterans Administration
  • US Agency for International Development
  • National Institute of Standards and Technology

Additionally the Environmental Protection Agency will submit a revised draft policy to OSTP in the next several days.

For those counting, that makes nineteen.  Presumably the twentieth is the Office of Science and Technology Policy itself, but I don’t know for sure.

You can now see a few of these policies via the OSTP resource library.  Why this was just a recent addition escapes me.

Automation Warps Your Perspective – Airplane Edition

Popular Mechanics has an excellent analysis of the Air France 447 crash, courtesy of the recently recovered flight recorders.  The plane crashed into the Atlantic in June 2009 during a Rio to Paris flight, and the crash was initially chalked up to a technical problem with the air-speed sensors during a bout of serious weather.  The flight recorder data suggest a different story, one where the disengagement of the autopilot contributed to a series of consistent pilot errors.

The automation angle comes in when the analysis covers why the errors were made.

“Aside from the loss of airspeed indication, everything is working fine. Otelli reports that many airline pilots (and, indeed, he himself) subsequently flew a simulation of the flight from this point and were able to do so without any trouble. But neither Bonin nor Roberts has ever received training in how to deal with an unreliable airspeed indicator at cruise altitude, or in flying the airplane by hand under such conditions.”

The last phrase stands out for me.  Some pilots who have no training in flying the airplane by hand under certain conditions are allowed to fly cross-ocean trips.  They are accustomed to having an automated system, certainly when the weather is nasty.

Yes, there are other issues involved in the crash, including some bad choices by the pilots that could have happened regardless of the automation.  But airplanes are now sufficiently sophisticated that pilots expect them to limit what the airplane can do.

“The vast majority of the time, the computer operates within what’s known as normal law, which means that the computer will not enact any control movements that would cause the plane to leave its flight envelope. “You can’t stall the airplane in normal law,” says Godfrey Camilleri, a flight instructor who teaches Airbus 330 systems to US Airways pilots.”

“But once the computer lost its airspeed data, it disconnected the autopilot and switched from normal law to “alternate law,” a regime with far fewer restrictions on what a pilot can do. “Once you’re in alternate law, you can stall the airplane,” Camilleri says.”

It seems that the disengagement of the autopilot did not register with the pilots.  They may have recognized that the computer was no longer flying the airplane, but since the autopilot is also the flight computer, losing the autopilot means much more.  To borrow the language of the article, the law had changed, but they didn’t know it.  When pilots rely on the computer’s choices, the consequences of losing those choices need to be more explicit.  Tough to do when all that’s needed – normally – is to flip a switch.
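The coupling the article describes – losing airspeed data silently changes which rules govern the controls – can be sketched as a trivial state machine.  This is a toy illustration with invented names, not actual Airbus logic; it only shows how one sensor failure can simultaneously disconnect the autopilot and remove the envelope protection the pilots were counting on.

```python
# Toy sketch (invented names, not actual Airbus logic): a flight computer
# that drops from "normal law" to "alternate law" when airspeed data
# becomes unreliable, losing stall protection in the process.

class FlightComputer:
    def __init__(self):
        self.law = "normal"      # control law currently in force
        self.autopilot = True    # whether the autopilot is engaged

    def update(self, airspeed_valid: bool) -> None:
        # Losing valid airspeed disconnects the autopilot AND changes
        # the law -- two consequences of one failure, which is the
        # surprise the article highlights.
        if not airspeed_valid and self.law == "normal":
            self.autopilot = False
            self.law = "alternate"

    def stall_protected(self) -> bool:
        # Under normal law the computer refuses control inputs that
        # would stall the plane; under alternate law it does not.
        return self.law == "normal"

fc = FlightComputer()
assert fc.stall_protected()          # envelope protection in force
fc.update(airspeed_valid=False)      # airspeed sensors become unreliable
assert not fc.autopilot              # autopilot has disconnected...
assert fc.law == "alternate"         # ...and the law has changed
assert not fc.stall_protected()      # the plane can now be stalled
```

The point of the sketch is that the mode change is implicit: nothing in the interface forces the pilot to acknowledge that `stall_protected()` now returns a different answer.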