While Arizona may be the focus of domestic attention on legislation targeting homosexuals, Uganda's president recently signed into law a measure far more draconian. The punishments for various levels of homosexual activity range from seven years to life in prison.
Part of the Ugandan president’s rationale for signing the bill is that an 11-member committee reviewed the medical research on homosexuality, and according to the president, found that the origins of same-sex orientation were behavioral and not genetic.
As you might expect, some members of the committee objected to how the president characterized the report, and two members have resigned in protest. ScienceInsider reports the first draft from the committee did not suggest that homosexuality had no genetic basis. To further clarify the committee’s position, a second draft has been released that also removes language on regulation the committee feels could be misinterpreted (or distorted).
So, in future debates on using science to cover a political agenda, Uganda stands an excellent chance of being the ultimate example invoked in lengthy online discussions. If you find yourself in such a discussion, I'd recommend you extricate yourself immediately should Uganda be mentioned. It's not likely to end well.
On January 30 and 31 the United Nations Science Advisory Board held its first meeting in Berlin. UN Secretary-General Ban Ki-moon spoke to the group, emphasizing the need for science advice and focusing on the organization's development goals. From his remarks:
“We need science to understand our environment, protect it and to use it wisely. We need to understand the many economic and demographic forces are at play in our changing world. And we need to tackle the big issues – hunger and food security, growing inequalities, disaster prevention, urbanization, sanitation and sustainable energy for all.”
It’s been tough to find more details about what the board did during its first meeting. For instance, it’s still not clear who is chairing the board. Perhaps additional details, or even minutes, will be available at some future date. It’s nice to see the United Nations with a scientific advisory board, but I do hope that more information is forthcoming.
This week’s issue of ResearchResearch has a cover story (H/T Penny Sarchet) on how the government science advisory ecosystem may be changing under the leadership of the new Government Chief Science Adviser (GCSA), Sir Mark Walport. These changes won’t be accompanied by an increase in staff or budget.
At a minimum, the Government Office for Science (GO-Science) will have to deal with Walport’s increased presence in the office. While his predecessors usually continued to spend time in their previous postings while serving as GCSA, Sir Mark is present five days a week.
However, an oft-discussed possibility of appointing a Chief Social Scientist may remain on the drawing board. Per Sarchet, the idea will be kept ‘under review.’ That the government spokesman she talked with cited the What Works centres and economists in science advisory positions suggests that any appointment is a long way off.
There are a couple of additional items in the article that grabbed my attention. One was an acknowledgement that the relationships between the GCSA and other government officials are at least as important as the science budget in determining the influence of science and scientific advice in policymaking. That informal influence is important to remember. Observing the U.S. science advisory enterprise can be frustrating, as it’s tough to assess those relationships without being inside the system. The U.S. Office of Science and Technology Policy (OSTP) does not have a large budget, so the relationship between the OSTP Director and the agencies with major research activities is perhaps the critical determinant of whether cross-government initiatives (like the scientific integrity push) get anywhere.
The other item I found interesting is a stark difference between Sir Mark and his U.S. counterpart. While OSTP is right there with the other science agencies in arguing for the federal government’s science funding budget, Walport does not consider such advocacy as part of his job. In that sense, Walport recognizes that his first constituency is the Government, rather than the scientific community that sometimes sees a GCSA (or a Presidential science adviser) as their (un)elected representative.
I am looking forward to finding out more about the GO-Science reorganization, and wish Sir Mark and his colleagues all the best in what can be a frustrating job on the best of days.
Last week the Department of Justice and the National Institute of Standards and Technology (NIST) announced the members of the newly formed National Commission on Forensic Science. The Commission will hold its first meeting next month in Washington, D.C.
Members were selected from a pool of over 300 applicants, and are intended to represent a diversity of perspectives in law enforcement, research, and forensic laboratories. The commission will focus on studies and recommendations to improve quality assurance in forensic science and forensic science laboratories. Its recommendations to the Justice Department will include how best to manage the interactions between forensic sciences and the judicial system. The Justice Department will coordinate Commission activities with NIST, and officials from both agencies will serve as co-chairs and co-vice chairs of the Commission.
At the moment there is no Commission website. Normally the FACA Database would be the place to look for advisory committee information when there is no dedicated website. However, I cannot access it as of this writing. Hopefully this will be resolved in advance of the Commission’s first meeting in early February.
On December 18 the President’s Council of Advisers on Science and Technology (PCAST) issued its final report of 2013, a letter report focused on massive open online courses (MOOCs). An infographic is available highlighting both the major (general) recommendations and some statistics explaining why PCAST finds MOOCs and similar innovations of interest – as means to increasing access to higher education. PCAST considers the issue important enough to make this report the first of a series on similar education innovations.
MOOCs are different in some ways from previous innovations in online education (like OpenCourseWare or Khan Academy). For one they provide much more than course material. The ability to assess student progress and learning through the length of the course is much greater than what was previously available through more conventional correspondence or online opportunities. The distance between the student and the classroom can be effectively shrunk in ways that weren’t previously available. That is beneficial regardless of how large or open the course might be.
There has only been a sufficient number of courses and students in MOOCs to generate two years’ worth of data to analyze. Additionally, the cost recovery mechanisms (while cheaper, the courses are certainly not cost-free) have yet to be worked out. As a result, the major recommendations of the report are to effectively monitor the situation but to let current providers continue to experiment (and possibly innovate) with MOOCs. We don’t yet know enough to determine which courses are best suited to this kind of teaching (are labs a practical MOOC activity?), or at what point the benefits of in-person education are lost to the MOOC student.
PCAST recommends that the government be ready to support research on these courses and the demonstrated learning outcomes. It can also support communities of research and practice to facilitate the gathering and exchange of information that can help students and teachers achieve what they want (and need) from online education.
The Presidential Commission for the Study of Bioethical Issues meets in Washington tomorrow. Continuing a topic introduced in its last meeting, the Commission will spend all of the public portion of its meeting on the BRAIN initiative. The President officially requested that the Commission identify a core set of ethical standards to guide neuroscience research and address possible ethical implications of applications that emerge from those research findings.
The BRAIN initiative is a major public-private partnership announced this past April by the Obama Administration. Federal agencies involved are the National Science Foundation, the National Institutes of Health (NIH), and the Defense Advanced Research Projects Agency. There is an advisory group hosted at the NIH. While federal agencies were a focus of the previous Commission meeting on this topic, private sector and international neuroscience activities will be discussed at tomorrow’s meeting.
Today the Presidential Commission for the Study of Bioethical Issues released its report on incidental findings. This project started at the beginning of the year, and focuses on findings that were beyond the aims or scope of a particular test or procedure. The ethical consequences of such findings can be most keenly felt when they involve human subjects and/or medical patients (see this article on The Atlantic’s website for a specific medical case). Such findings may not be information that a subject/patient is seeking, which adds an additional wrinkle to determining how best to approach the person while respecting their right to choose and meeting one’s obligations to provide appropriate and effective treatment.
But incidental findings can turn up in many different places, which is why the report, called Anticipate and Communicate, has a subtitle of Ethical Management of Incidental Findings in the Clinical, Research and Direct-to-Consumer Contexts. It’s also why this isn’t the first Commission report to address incidental findings. They were mentioned in the Commission report on genome sequencing (and probably not heeded by anyone at 23andMe, based on FDA action taken against that company).
Not that brief blog posts are the most reliable source of deep insight, but in an instance of covering a report that deals a lot in context, I have to emphasize that you really should read the report and not simply this post. As the report notes (Executive Summary, page 2):
“Discovering an incidental finding can be lifesaving, but it also can lead to uncertainty and distress without any corresponding improvement in health or wellbeing. For incidental findings of unknown significance, conducting additional follow-up tests or procedures can be risky and costly. Moreover, there is tremendous variation among potential recipients about whether, when, and how they would choose to have incidental findings disclosed. Information that one recipient regards as an unnecessary cause of anxiety could lead another recipient to feel empowered in making health-related decisions.”
The tl;dr – I’m not comfortable just summarizing the report findings and leaving it at that. It’s a complex issue that cannot easily be generalized. The Commission takes a general approach in its report, outlining how an ethical analysis of incidental findings can be conducted, and encouraging those with expertise in the technology and scientific knowledge related to a particular context to do what they can to anticipate possible incidental findings (not all of them can be anticipated) and communicate those possibilities to the people being tested, as well as the public that may be tested at some point in the future (or have something revealed about them through the testing of others).
On November 22, the President’s Council of Advisers on Science and Technology issued a letter report on cybersecurity. It is concerned with providing cybersecurity in a frequently changing threat environment. As the overarching recommendation reads:
Cybersecurity will not be achieved by a collection of static precautions that, if taken by Government and industry organizations, will make them secure. Rather, it requires a set of processes that continuously couple information about an evolving threat to defensive reactions and responses.
The other recommendations address the government’s own information technology practices, information sharing across the private sector and the government, and auditing cybersecurity practices in the private sector. This report follows up on a February 2013 classified briefing provided by PCAST, so the recommendations are perhaps more for public consumption at this point.
Also of note are two new faces on PCAST. Ernest Moniz had to step down when he became Secretary of Energy, and David Shaw and Ahmed Zewail are no longer on the Council. The new members recently appointed by the President to replace them are Susan Graham, an emerita professor of electrical engineering and computer science at the University of California, Berkeley; and Michael McQuade, vice president at United Technologies Corporation. McQuade has also worked for 3M and Eastman Kodak and has a physics background.
The Woodrow Wilson International Center for Scholars recently conducted a survey as part of a report on the growing do-it-yourself (DIY) Biology movement. The newly released report (H/T ScienceInsider) from the Synthetic Biology Project is apparently the first of its kind to track what activities the community is involved with.
The report authors are interested in countering existing stories about the DIYBio community that don’t match with what their research (and survey data) have demonstrated. The brief takeaway, from their perspective – the threat posed by this research (and these researchers) has been overstated in the press.
While the authors are careful to note that their work is a current snapshot of the field, I am concerned that the press coverage connected to this report may oversimplify what’s going on. In other words, the new stories will be about how the old stories oversold the magnitude of what is going on in the field and the possible threats of what could be taking place.
I think the most productive recommendations from the report will be those focused on how to grow, support and manage research in this area moving forward. As a DIY community is not necessarily connected to existing institutions, having the capacity to educate interested researchers and provide them spaces to work is not guaranteed. Such resources could also ease the burden of monitoring and guiding the research moving forward. A previous report from the Synthetic Biology Project suggests to me that at least some community self-regulation would be useful in the future, as federal action is coming slowly.
Either way, this survey needs to be the first of several, and not the end of a discussion.
Breaking with recent patterns, the next public meeting of the President’s Council of Advisers on Science and Technology (PCAST) is scheduled for a Thursday. The Council will meet in Washington on November 21, in an even briefer than normal half-day session. As usual, there will be a live webcast, which will be archived shortly after the meeting. Visit the webcasts page on November 21 to catch the meeting starting at 9 a.m. Eastern.
If the current agenda (effective November 4) is suggestive, PCAST may have two reports to release very soon. The meeting will cover a letter report (typically fewer than 10 pages) on Education Information Technology, and a lengthier report on Cybersecurity.
The wild card (at least for me) is the session on Privacy and Technology with Nicole Wong. While her official title may be Deputy Chief Technology Officer, she is considered the first Chief Privacy Officer in the White House. I suspect this may be where PCAST members get engaged with surveillance matters, if they haven’t already done so in a more private setting. But that’s just a guess.