The Supremes Wax Hypothetical On Executions, May End Up Moot

Yesterday the Supreme Court heard arguments in Glossip v. Gross, a case brought by condemned men in Oklahoma objecting to the state’s method of capital punishment.  The argument hinges on the state’s use of midazolam as the sedative in its three-drug cocktail of an anesthetic, a paralytic, and a drug to stop the heart.  Death penalty opponents argue that midazolam does not adequately sedate the condemned inmate, allowing the inmate to suffer enough pain from the subsequent drugs to qualify as cruel and unusual punishment.

While oral arguments rarely indicate exactly how the justices will rule, they can be suggestive.  Reports from the Court indicate that the justices’ questions focused primarily on two threads.  The liberal justices were concerned about the effects of midazolam, while four of the conservative justices focused on what they consider an end run around the Court.  As I’ve tracked on this blog, manufacturers and others have put drugs used in executions out of reach of states, forcing them to choose alternative drugs and, in some cases, expand their execution options.

The justices’ questions about these tactics suggest they are upset at their decisions (about the constitutionality of the death penalty) being subverted by what could be seen as technical, rather than political, means.  While I’d argue that the campaign to get execution drugs pulled is definitely political, I can see how the justices who support the death penalty view this effort as a turf battle.

Again, the oral arguments are more of a guide than a predictor of the Court’s vote (much less the specifics of its opinions).  But it would seem that this case is unlikely to change the status of the death penalty.  And while some justices object to the movement to make certain drugs inaccessible for executions, there is little they can do.  In March an Illinois manufacturer of midazolam requested that the State of Oklahoma return its supply of the drug (for a refund, of course).  The company, Akorn Pharmaceuticals, will also take steps to make sure its supplies of midazolam and hydromorphone (a narcotic pain reliever some states have expressed interest in using) cannot be used for executions.  Similar letters have been sent to other states.  Should other manufacturers follow suit, it may not matter what the Court says when it rules in this case.

NIH Reiterates Opposition To Funding Gene Editing For Human Embryos

In light of Chinese researchers reporting their efforts to edit the genes of ‘non-viable’ human embryos, National Institutes of Health (NIH) Director Francis Collins issued a statement (H/T Carl Zimmer).

(For what it’s worth, the research indicated a very low success rate in editing the gene.)

The statement mentions the various legal and regulatory prohibitions on funding the kind of research the Chinese conducted.  In this case, the editing was of a gene responsible for a particular blood disorder.  But the changes to the gene would be heritable by the descendants (if the embryos in question were viable), and that is the source of concern.

From the Director’s statement (CRISPR/Cas9 is the editing technique in question):

“NIH will not fund any use of gene-editing technologies in human embryos. The concept of altering the human germline in embryos for clinical purposes has been debated over many years from many different perspectives, and has been viewed almost universally as a line that should not be crossed. Advances in technology have given us an elegant new way of carrying out genome editing, but the strong arguments against engaging in this activity remain. These include the serious and unquantifiable safety issues, ethical issues presented by altering the germline in a way that affects the next generation without their consent, and a current lack of compelling medical applications justifying the use of CRISPR/Cas9 in embryos.”

While Collins also notes the federal laws and regulations that restrict funding, I do not expect the statement to be the end of the discussion around the gene-editing research reported in China (which is probably continuing).  I suspect many would find the use of non-viable embryos in this research acceptable, even if it punts on the questions of consent to changes for future generations and of the safety of the techniques on viable embryos.  After all, stem cell lines have been derived from non-viable embryos.  I think the need to (eventually) work with these technologies on viable human embryos makes the stem cell comparison problematic, but that likely won’t matter in the policy debates to come.


NIH Seeking Information To Help Form Million Member Cohort

Part of the Precision Medicine Initiative is the establishment of a research cohort.  Intended to be a million strong, the voluntary cohort would contribute a breadth of data to help extend the understanding of diseases.

As part of this process the National Institutes of Health (NIH) is seeking information from the public about the cohort.  Comments are due on May 7.  Specific questions the NIH is interested in are:

A. General topics on the development and implementation of this large U.S. cohort.
1) The optimal study design and sample size for a large U.S. precision medicine cohort.
2) Data to be collected at baseline and follow-up, including mode of collection and frequency and length of follow-up.
3) Potential research questions that could be uniquely or more efficiently and effectively pursued in a large U.S. precision medicine cohort.
4) Any other suggestions for NIH to consider in the development and implementation of such a research cohort.

B. Suggestions for existing or potentially new research entities (a health care system, research network, cohort study or consortium, or other entities such as longitudinal studies using digital-based platforms) that might be combined into a large U.S. cohort.  Providing the following information would be useful when suggesting research entities.
1) The capability of the existing or potentially new research entity to efficiently identify and follow 10,000 or more participants who are likely to consent to providing their medical and other health-related data, biospecimens, and genomic data for broad research use, including in sub-group analysis that could help assess various treatment effects and outcomes.  It would also be useful to provide the rationale that potential participants are likely to consent, as well as experience with and ability to participate in central IRB and a master contract agreement to streamline enrollment of the precision medicine cohort.
2) The capability for the research entity to provide individual-level participant data, particularly those from electronic health data (including both electronic health record and payer data), that can be integrated into a standard format to create a combined large longitudinal precision medicine cohort.
3) The capability for the research entity to track and retain the participants for several years of follow-up.  The race/ethnic composition, sex, and age distribution of participants from the research entity likely to consent, by standard U.S. Census categories, would also be helpful. The NIH especially seeks information about studies of populations underrepresented in research and those with phenotypes or disorders of high public health and human impact. Additional information that would be of use includes: for health care systems, the current patient turnover rate and efforts that can be made to capture longitudinal data from clinical visits outside of the system and to continue to follow participants who leave the system entirely; and for ongoing cohort studies, the retention rate to date.

This strikes me as uncharted territory, depending on how many recruits can be found to participate in this cohort.  Should it truly be a million strong, there will be new questions that we don’t necessarily have answers for right now: how to manage the study (or studies), how to maintain the anonymity and privacy of the medical information, and how to analyze such massive amounts of data.  There’s a lot that can go wrong here, with unclear benefits on the horizon.  Personally, I consider this a risk worth taking.  Hopefully it will be remembered that there are risks involved.


Science and Technology Guests on Late Night, Week of April 27

Since the nature of the show demands it be included in this post every week it’s on, I’ll lead with StarTalk.  Tonight’s interview guest is Interstellar director Christopher Nolan, while comedian Eugene Mirman and cosmologist Janna Levin will join Neil deGrasse Tyson in front of the audience.  I will note that last week’s episode with George Takei, comedian Leighann Lord and astrophysicist Charles Liu is now available in podcast form from your provider of choice.  Presumably this will happen with the other television episodes.

On to the other programs this week.  With Avengers: Age of Ultron premiering in the U.S. later this week, the stars are out in force promoting the film.  Ultron, at least in this film, is an emergent artificial intelligence that takes humanoid form.  James Spader, who portrays him, was on with Kelly and Michael this morning.

Another smaller publicity blitz comes from the actors in Silicon Valley, an HBO program following young tech entrepreneurs.  Zach Woods will be on Conan Tuesday night, Thomas Middleditch will be on The Tonight Show this Wednesday, and Kumail Nanjiani will be on with James Corden Thursday night.

This week the daytime programs have more than the usual number of science and technology guests.  On Wednesday, Morgan Freeman will be on The View to promote the start of the newest season of Through the Wormhole.  On Friday, The Talk’s technology reporter, Chi-Lan Lieu, stops by, and Kunal Nayyar of The Big Bang Theory will be on with Kelly and Michael.

Author David George Gordon will appear on The Late Late Show Tuesday night.  His books focus on the natural world, and include a bug cookbook and a field guide to researching Sasquatch.  While this could be full of science goodness, I’m not sure how James Corden will play it.

Of note in recently aired science and technology content, The Nightly Show covered the California drought in its Wednesday night program.  And not just because William Shatner put forth a plan for a water pipeline.

#SciFiSciPol – Ex Machina Addresses Robots And Gilded Cages

I finally saw Ex Machina, which recently opened in the United States.  It’s a minimalist film, with few speaking roles and a plot revolving around an intelligence test.  Of the robot movies out this year, it has received the strongest reviews, and it may take home some trophies during the next awards season.  Shot in Norway, the film is both lovely to watch and tricky to engage with.  I finished the film not quite sure what the characters were thinking, and perhaps that’s a lesson from the film.

Unlike the robots in Chappie and Automata, the intelligent robot at the center of Ex Machina is not out in the world.  Ava has been developed, presumably alone, by a technology CEO meant to evoke people like Elon Musk, Steve Jobs and/or Mark Zuckerberg.  The CEO brings in one of his employees to test how ‘human’ Ava is, and the film engages with what it means for someone – human or human-made – to have ‘strong’ intelligence.

I don’t expect viewers to come out of this film with definitive answers about artificial intelligence.  I think the film was more interested in having the audience engage with the questions and at least poke at their assumptions.  But I’m not a psychologist, I’m a policy analyst.

How does this film address science and technology policy?  There is a throwaway line about surveillance via cell phone, but the most direct engagement with policy matters concerns how the robots are developed.  Locked away in a private research lab, the robots lack autonomy, and the existence of this work appears to be unknown outside of the facility.  Without getting into the details, the resolution of the film suggests to me that keeping the project under wraps would not be well received by the public.  There is an implication that the need for autonomy is connected to fully formed intelligence, regardless of whether a being is organic or robotic.

I recommend this film.  It’s far from the kind of action-oriented film that most of this year’s robot movies are, and I think that choice makes it easier to introduce and explore the ideas around what it means to be intelligent.  The film also becomes an interesting mystery that managed to surprise me at least once.  Even if you’re not interested in thinking about how to manage artificial intelligence (if so, why are you here?), I believe you’ll find the film compelling enough to enjoy it.

Celebrate The Silver Anniversary Of The Hubble Deployment

Twenty-five years ago today the Hubble Space Telescope was deployed from the Space Shuttle Discovery.  You could note the anniversary by diving into the rabbit holes of Hubble imagery, but I prefer to enjoy the tribute videos.  Hubble represents the longevity of space missions (often in spite of themselves), and the persistent interest in looking as far as the electronic eye can see.

Hubble is also an excellent inspiration, as these two videos suggest.  The first is “Go Hubble!,” another entry from this year’s Science Rap Academy, and uses the music of Meghan Trainor’s song “All About That Bass” to good effect.

Adam Cole of NPR’s Skunk Bear prepared his Hubble tune based on Iggy Azalea’s “Trouble.”  It gets into more technical detail, and discusses the problems with Hubble’s mirror that were the main focus of the first servicing mission in 1993.

The Hubble will be around for at least a few more years.  Its successor, the James Webb Space Telescope, is scheduled to launch in 2018.  The Hubble Space Telescope stands a good chance of reaching 30.

NIST Chooses PubMed To Help Comply With Public Access Policy

Earlier this month the National Institute of Standards and Technology (NIST) released its open access policy for federally funded research results and research data (H/T GCN). This is to comply with the 2013 Office of Science and Technology Policy directive covering all agencies with annual research budgets over $100 million.

The policy will be implemented in stages, starting with two of the agency’s journals, and then expanding to all intramural agency research by October of 2015.  By October 2016 the policy will be completely implemented, and any NIST-funded research articles will need to be deposited in the PubMed Central repository within 12 months of publication (fields can petition the agency for a longer or shorter embargo period).

Research data will be handled indirectly.  NIST has developed an enterprise data inventory to list NIST-funded research datasets.  The agency will not store those datasets itself, but will provide sufficient metadata and other information to allow the public to access them.  This is consistent with the approach favored by many of the agencies that have already released their open access plans.

Open Access Bills Stage A Return Engagement

For the fifth consecutive Congress bills have been introduced to extend open access to government-funded research results.  In the last month three bills have been introduced, resembling bills introduced in previous years.

For the 114th Congress there are House and Senate versions of the Fair Access to Science and Technology Research Act (FASTR).  Essentially the same bills were introduced in the 113th Congress with the same sponsors and the same terms.  Previous editions of these two bills were under a different name – the Federal Research Public Access Act.  The bills are assigned to the House Oversight and Government Reform Committee and the Senate Homeland Security and Government Affairs Committee, respectively.  The bills would require open access to federally funded research articles within six months of publication.

Another open access bill in this Congress is the latest edition of the Public Access to Public Science Act.  It too was introduced in the previous Congress, and has the same sponsors this time around.  Compared to the FASTR bills, this act covers a smaller set of agencies (those under the jurisdiction of the House Science, Space and Technology Committee), and hews closer to the requirements of the February 2013 policy memo from the Office of Science and Technology Policy on open access.  Like the FASTR bills, this legislation requires open access for covered articles within six months of publication.

After nearly 10 years of legislative efforts to expand open access, I’m not optimistic that these bills will be much more successful than their predecessors.  A major difference in this Congress is that agency public access policies are in the process of final review and/or implementation.  (The difference in embargo periods between these bills and agency access policies is not likely a bill-killer, at least by itself.)  That might help get these bills out of committee, but I think it will take a stronger effort by their legislative champions to get them to the President’s desk.

Then there’s the matter of open access to research data, which is not covered by these bills.  Baby steps, I suppose.

Chief Technology Officer Bill Back, Now With New Sponsor

When President Obama decided to have a Chief Technology Officer (CTO), he did not make the position permanent.  At the moment, should his successor opt not to appoint one, the CTO position would cease to exist.

Since 2009, two bills have been introduced to put the CTO position into law.  Had either passed, it would have written the specific requirements of the position into law and established a separate office for the CTO.  While this would make the position outlast a single presidential administration, it would also remove a flexibility in the position that is reflected in the backgrounds and portfolios of the three CTOs appointed by the Obama Administration.

The bills essentially make the CTO position into the head of information technology purchasing and implementation in the government.  That’s an important function, but there already is a federal chief information officer.  Ideally, at least from where I type, the CTO would be on par with the President’s chief science adviser.  Focusing the position on information technology runs the risk of making the job the government’s tech support, instead of a position focused on how to utilize all kinds of technology to support the government and serve the public.

There is now a third bill attempting to formalize the CTO position.  While the first two were sponsored by Representative Gerry Connolly (D-Virginia), the latest bill is sponsored by Representative Barry Loudermilk (R-Georgia), and co-sponsored by members of the House Science, Space and Technology Committee, where Loudermilk serves as chair of the Oversight Subcommittee.  The House Science Committee was not involved with the previous bills, which were considered by the Oversight and Government Reform Committee.  The newest bill has been referred to both committees for consideration.

The latest bill makes the CTO position optional, but would require anyone appointed CTO to also serve as the Associate Director for Technology and Innovation at the Office of Science and Technology Policy.  (The first CTO in the Obama Administration, Aneesh Chopra, served in both positions.  His successors did not.)  The bill does focus the CTO position on information technology responsibilities, but it thinks a bit more broadly than IT.  The CTO would also handle information exchange between the Congress, the Executive Branch, and the public; agency records transparency; and technological interoperability.

I appreciate the greater breadth of portfolio in this bill, but I’d rather not see the CTO subsumed by the OSTP.  An equal partner is more to my liking, but your mileage may vary.  I don’t expect this bill to have a better fate than its predecessors, but it’s certainly not too soon to see what President Obama’s successors might do about a CTO.  Where does your candidate stand on the issue?

White House Demo Day Announced

Earlier today the Chief Technology Officer, Megan Smith, announced the first White House Demo Day to take place later this year.

As described on the Demo Day website, the idea appears to be something like the White House Science Fair for entrepreneurs.  Put another way, successful entrepreneurs will come to talk about their ideas and how they managed their success.  The White House is particularly interested in hearing from entrepreneurs in underrepresented groups.  Nominations are being accepted (scroll down) until this Friday, April 24.

(A 72-hour window for nominations may be a record-short comment period for this White House.  The cynical part of my analytical faculties makes me think that most of the entrepreneurs are already lined up, and this is just for show.  But I can’t prove that.)

The Demo Day announcement follows on last week’s Tech Meetup, an event where organizers of activities like maker fairs, coding camps, and other gatherings of local technology talent met at the White House to share best practices and discuss how to replicate their efforts in other places.