The Morning Jolt Could Be Taken Literally

I missed it when the press ran with it earlier this year, but thanks to the Star Trek website, I’m up to speed on Air Force efforts to substitute electricity for caffeine.  The intent is to use mild (and the emphasis should be on mild) electric shocks to keep airmen and women alert and attentive.

Efforts are still quite preliminary, but research scientists interviewed have claimed that noninvasive stimulation of the right areas of the brain has enabled subjects to respond better cognitively than a control group that had been awake for a comparable amount of time.  The techniques are adapted from electrical stimulation procedures used for some psychiatric conditions.  (The levels of electricity are much, much lower than what had been used in so-called electroshock therapy.  At 1-2 milliamperes, the shocks are just perceptible.)

One challenge the researchers still have to address is determining which areas of the brain need stimulation in order to achieve the increased alertness and cognitive function.  The research has focused on subjects who work in persistent data monitoring, and researchers are optimistic that a working device may be achievable in five years or so.  Whether or not something like this could (or should) be available commercially is a question for another day.

Bioethics Commission Isn’t Waiting For Next Meeting To Issue BRAIN Report

The Presidential Commission for the Study of Bioethical Issues has a lot to say about the BRAIN Initiative.  So much so that its report will run to at least two volumes.  The Commission released Volume One of Gray Matters: Integrative Approaches for Neuroscience, Ethics, and Society today.  It’s the seventh report of the Commission since it was formed in late 2009.

The report was prompted by a request from President Obama to “identify proactively a set of core ethical standards – both to guide neuroscience research and to address some of the ethical dilemmas that may be raised by the application of neuroscience research findings.”  The recommendations in Volume One of the report are focused on achieving a more explicit integration of ethics into neuroscience research throughout the life of a research program.  There are four main recommendations:

  1. Integrate ethics early and explicitly throughout research
  2. Evaluate existing and innovative approaches to ethics integration
  3. Integrate ethics and science through education at all levels
  4. Explicitly include ethical perspectives on advisory and review bodies

Of specific relevance to the BRAIN Initiative is the need to include professionals with expertise in ethics on advisory boards and similar bodies overseeing research in this area.

Volume Two will focus more on the social and ethical implications of neuroscience research, topics likely to appear on the agenda of the Commission’s next meeting.  As a hint of what may be in that report, the Commission notes four examples that demonstrate the need to better integrate ethics throughout the course of neuroscience research:

  1. Neuroimaging and brain privacy;
  2. Dementia, personality, and changed preferences;
  3. Cognitive enhancement and justice; and
  4. Deep brain stimulation research and the ethically difficult history of psychosurgery.

The Commission did not give a deadline for when Volume Two would be ready, but it may provide some insight on that front during its June meeting in Atlanta.

The Canadians Are Looking For A Few Good Science Policy Sessions

The Sixth Canadian Science Policy Conference (CSPC) is scheduled for October 15-17 in Halifax, Nova Scotia.  As I’ve said before, this event is unique in its focus and cross-section of attendees.  I can find no comparable U.S. event where researchers and practitioners participate as colleagues to discuss and define the important issues in national science, technology and innovation policy.

While the event is in October, the call for panel proposals is still open.  The organizers are looking to have several different kinds of panels.  Besides what may be considered the traditional conference panel of a moderated discussion amongst experts, you can submit panels organized around green papers or case studies.  You might also organize a session that is a series of quick individual presentations (like the TED conferences, though with more underlying rigor), or one focused on directly engaging the audience in a learning activity.

Selection criteria aren’t out of the ordinary.  Panels will be selected based on their alignment with conference objectives and specific themes:

  1. Canadian Science and Technology Strategy: Looking Towards 2020
  2. Advancing Canadian Economic Development with S&T
  3. Science and Risk in an International Context
  4. Innovation in Partnerships

Diversity of representation and the quality of the panel (both in content and organization) will also be considered.

Registration is required in order to submit a proposal, and submissions are due no later than June 6.  Even if you aren’t selected, consider attending the conference this October.  Good luck!

It Was Sooner Than I Thought – PCAST Releases Big Data Report

On Tuesday, I posted about a conference call of the President’s Council of Advisors on Science and Technology (PCAST), a call that took place yesterday.  During that call PCAST heard from the leads of its working group on a big data report, and the Council approved the report pending final edits.  In those situations, reports have followed relatively quickly, typically within a month.

Earlier today PCAST released the report.  The quick turnaround made me wonder how any concerns or serious questions raised in the conference call could have been addressed.  Had the Council simply approved the report during its April 4 meeting, releasing it today would not have seemed odd.

Titled Big Data and Privacy: A Technological Perspective, the report comes in the shadow of the big data report released by the working group assembled by John Podesta to coordinate the big data review for the administration.  There are five main recommendations in the PCAST report:

  • Policy attention should focus more on the actual uses of big data and less on its collection and analysis.
  • Policies and regulation, at all levels of government, should not embed particular technological solutions, but rather should be stated in terms of intended outcomes.
  • With coordination and encouragement from OSTP, the NITRD (Networking and Information Technology Research and Development) agencies should strengthen U.S. research in privacy-related technologies and in the relevant areas of social science that inform the successful application of those technologies.
  • OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection, including career paths for professionals.
  • The United States should take the lead both in the international arena and at home by adopting policies that stimulate the use of practical privacy-protecting technologies that exist today.  It can exhibit leadership both by its convening power (for instance, by promoting the creation and adoption of standards) and also by its own procurement practices (such as its own use of privacy-preserving cloud services).

Comparing this big data report to the administration’s report released today requires an additional post.  However, this topic runs smack into my day job, and such a post will likely show up on the blog I post to for work.  Suffice it to say that the PCAST report has a narrower, more technical focus than the administration’s, though both are concerned with the possible benefits, and the potential for abuse, that come with the growth of large data sets that are more easily collected and analyzed.

PCAST May Be Close To Issuing Report On Big Data

The President’s Council of Advisors on Science and Technology (PCAST) will meet via conference call tomorrow (Wednesday), April 30, from 11 to 11:30 a.m. Eastern time.  According to the agenda, the meeting will focus on the PCAST report about big data and privacy, which was also a topic of discussion at the April 4 PCAST meeting (not presently available in the meetings archive, but the webcast is online).  At that meeting the PCAST working group was still developing the report.  But the larger group conducting the administration’s big data review is expected to release its report soon, so PCAST may have worked quite hard this month to get things ready.

We’ll know more after tomorrow’s call.  If the report is not ready, the next PCAST in-person meeting is scheduled for May 9.  As this effort was given a 90-day timetable starting in mid-January, there will be pressure to have things released soon.

This report will be part of a collection of products coming from the Administration’s Big Data Review, each reflecting a different perspective on how to engage with large sets of data and how that might affect individual privacy.  For instance, PCAST, with its focus on science and technology, will likely approach research on big data differently than the Council of Economic Advisers or the Department of Commerce, which are also part of the administration’s review.  Student learning was a topic of discussion when the working group presented at the April 4 meeting, something I don’t think was high on the list for the other groups.

I’m highly skeptical that the pending reports will signal the end for the Administration on the matter of big data.  I won’t be able to make informed speculation until the reports come out, but John Podesta indicated when announcing the review that he anticipates his group’s report would serve as “the foundation for a robust and forward-looking plan of action.”  Then we can try and guess what the Administration might do to implement such a plan.

STEM Premier – A LinkedIn For STEM?

I spent today at the USA Science and Engineering Festival focused more on the booths than the performance stages (though the main and Einstein performance stages still couldn’t handle the crowds).  My impression of this festival (compared to its predecessors) is that things have shifted a bit more towards a trade show atmosphere.  There weren’t as many hands-on activities in evidence compared to 2012 (or even 2010).  Without knowing why that seems to have happened, I can’t reasonably opine on whether it’s a problem or not.

One theme I am glad to see made more explicit in this edition of the festival is the permeation of STEM skills and needs beyond the narrow perspective that often dominates policy discussions on the topic.  Scientists and engineers can have a variety of different educational and training backgrounds, but for decades it’s been shorthand in Washington that a scientist has a Ph.D., and engineers often do as well.  It’s a point of view that may have had value in the middle of the last century, but today is just too narrow to effectively address all the applications of science and technology in the world.

One of the many things I saw and heard these last two days was STEM Premier.  The booth had the aura of a headhunter (in the employment recruiting sense) and that made sense once I visited the website.  While the closest comparison I could think of is to LinkedIn, STEM Premier is not a social network in the same way as LinkedIn.  STEM Premier is a clearinghouse for information on STEM education and training, and provides a means for students, recent graduates, universities and employers to learn about opportunities, and to make connections for jobs, education and training.

In short, the service gets at a pretty big communication problem that many people are tangling with.  Nearly all science and engineering Ph.D. holders end up in STEM, but those Ph.D. holders are a much smaller proportion of STEM workers.  The more people work to reinforce this message, the easier I think it will become to find people interested in STEM work and connect them with schools and employers that can provide the right opportunities.  As that computer company used to say, we need to Think Different.

News and Notes: GoldieBlox Settles With Beasties, White House Seeks Public Tech Input

Late last year, GoldieBlox, a manufacturer of construction toys geared toward girls, reworked a song by The Beastie Boys for a commercial (no longer available online, for reasons that will become clear).  Trouble is, GoldieBlox didn’t ask first, and that, combined with the group’s strong aversion to having their music used in advertisements, led to legal action.  A pre-emptive lawsuit by the toy company didn’t help matters.

A settlement was reached in the case.  Specifics were not forthcoming in the legal document, but GoldieBlox posted an apology to the main page of its website (a copy is available – H/T Rolling Stone – for future reference).

The White House has recently issued requests for public comment on various data policies.  One is about the review of big data (really large sets of data collected on individuals) and privacy announced in January.  The Office of Science and Technology Policy is handling part of that project, for which it issued a request for information (responses are due by March 31).  But if you’d rather not go to that level of detail, the White House has a more informal input form on its website to get your thoughts on how various groups use big data.

The White House is also looking for feedback on some of its digital presence.  This includes an update to the website privacy policy (which will take effect on April 18).  You may want to review this before completing the big data form linked to above.  Also under review are the White House digital content practices.

Finally, in late February the White House posted on its blog about the progress of agency open government plans.  Building on the government’s second Open Government National Action Plan released last December, agencies will be working on revising their plans.  Feedback is welcomed, and you can check an agency’s plan by placing /open after the agency website address.
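For the curious, here is a minimal sketch of that URL pattern in Python (standard library only).  The agency domains listed are illustrative assumptions on my part, not ones named in the post.

```python
import urllib.request

# Illustrative agency domains; substitute any federal agency site of interest.
AGENCIES = ["nasa.gov", "epa.gov", "energy.gov"]

for domain in AGENCIES:
    # Open government plans are posted at the /open path on agency sites.
    url = f"https://{domain}/open"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print(f"{url}: HTTP {response.status}")
    except Exception as err:
        print(f"{url}: unreachable ({err})")
```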

Scientific Collections Memorandum May Hint At Future Scientific Data Policies

Yesterday Office of Science and Technology Policy (OSTP) Director John Holdren released a Policy Memorandum on scientific collections maintained by the federal government.  The memo defines scientific collections as:

“sets of physical objects, living or inanimate, and their supporting records and documentation, which are used in science and resource management and serve as long-term research assets that are preserved, cataloged, and managed by or supported by Federal agencies for research, resource management, education, and other uses.”

The memo was prompted by language in the 2010 reauthorization of the America COMPETES Act (the same law that nudged OSTP to start working on policies for expanding public access to federally funded research data and scientific publications).  It also continues work started in 2005 to institutionalize thinking about scientific collections across the government.

Per the memorandum, agencies that own, maintain, or otherwise financially support scientific collections will have six months to develop policies on those collections consistent with the memo, relevant federal law and the following additional government directives:

  • The 2010 OSTP memo on scientific collections
  • The 2013 OSTP memo on access to federally funded scientific research
  • The 2013 Executive Order on making open and machine readable data the new default for government information

Per the guidelines described in the memo, agency policies will need not only to cover the management, accessibility and quality of the collections, but also to establish procedures for coordinating with the Smithsonian (a logical lead agency on such matters), developing appropriate standards for digital files associated with the collections, and handling the de-accession, transfer and/or disposal of agency collections.

What attracted my attention in all of this was the heavy emphasis on giving these physical collections as much of a digital presence as practical.  I am left to wonder whether or not this will influence future policies and procedures for maintaining digital repositories that may not have tangible elements – namely research data.

OSTP has not been hard and fast in enforcing its deadlines.  And it has either been very lax in following through once agency policies have been established, or has kept its actions far from public view.  Neither is encouraging, and the latter, if true, undercuts much of the goodwill that policies like this one could build amongst the public and the scientific stakeholders interested in the outcomes and outputs of federally funded scientific activity.

First 2014 Golden Goose Award Connects Black Holes And Web Browsers

(February 20 – Edited to correct the deadline for award nominations.)

During the recent meeting of the organization formerly known as the American Association for the Advancement of Science (AAAS), the organizers of the Golden Goose Award held a symposium.  As part of the event the first award recipient of the 2014 cycle was announced.  (Nominations remain open until April 18.)  The recipient will be recognized in September at the 3rd Golden Goose Awards Ceremony.

Larry Smarr was recognized for his efforts in computing.  Presently a Professor of Computer Science and Engineering at the University of California, San Diego, Smarr was trained as an astrophysicist.  Working on gravitational physics research, Smarr recognized that the United States was behind in the availability of computing resources for academic research.  Through a proposal to the National Science Foundation (NSF), Smarr led the formation of the National Center for Supercomputing Applications (one of four centers started from that initial proposal).  He served as its founding director and, while at NCSA, created a software development group focused on applications for researchers.  From this group came Mosaic, the precursor to many of today’s widely used web browsers.

While it’s this web browser connection that supports the Golden Goose recognition, the work of Smarr and his colleagues in forming the centers speaks to another critical, if under-recognized, part of the science and technology enterprise.  Research infrastructure is one of NSF’s goals, but it’s not, to use the vernacular, sexy.  Scientists and technologists gather what recognition they get from discoveries and breakthroughs, not from keeping the sensors, laboratories and other tools of the trade in working order.  But without that effort, how could the scientists and engineers seek out those discoveries and breakthroughs?  We need that effort, and I hope it doesn’t get overshadowed by the recognition of unexpected connections that drives the Golden Goose Award.

Whether Stem Cells Can Be Patented May Be The Least Of Commercial Challenges

Two recent actions involving the Food and Drug Administration (FDA) suggest to me that the matter of whether stem cell patents are valid may not be so critical.

The D.C. Circuit Court of Appeals ruled last week that the FDA has jurisdiction over stem cells cultured for therapeutic use (H/T The Scientist).  This decision upheld a lower court ruling that considered the act of culturing the cells more than ‘minimal manipulation,’ and therefore subject to FDA drug oversight regulations.

(IANAL, but I think this decision could be used to strengthen the case that stem cell patents – at least for cultures of said cells – would be valid.  After all, if there was more than minimal manipulation, wouldn’t that be sufficiently transformative to make the cultures no longer products of nature?  Again – I Am Not A Lawyer.)

Aside from the legal matters, there appears to be a big regulatory mismatch that will hinder commercialization of stem cell treatments.  In the February 6 edition of Cell Stem Cell, researchers note (H/T The Scientist) that differing regulations between the National Institutes of Health (NIH) and the FDA may reduce the number of stem cell lines that could be used in clinical practice.

FDA regulations require that stem cell donors be screened for various diseases (so that treatments derived from those cells cannot infect others).  NIH regulations – not focused on commercial applications of stem cell research – do not have this requirement.  It is possible, as the article from The Scientist notes, for the FDA to approve some treatments without such screening, but some alternative measure will likely be needed to mitigate the risk of infection.