Australia Considering Tinkering With Its Census

The Sydney Morning Herald reports on the head of the Australian Bureau of Statistics (ABS) arguing in favor of changing its census (H/T The Conversation).  The census is currently taken every five years (as Canada used to do), and there is interest in changing that to every 10 years.  Unlike in Canada, where the head of Statistics Canada resigned over the change of the census from mandatory to voluntary, the ABS chief has defended the shift.  David Kalisch has argued that the census is only a modest input into national statistics.

Given the challenges Canada has faced since it went to a voluntary survey, I understand, and share, the caution some have expressed about making significant changes to a data tool.  If the shift to every ten years is accompanied by more regular population surveys (much like in the United States), it is possible that Australia could strengthen its data gathering and analysis capabilities.  But the caution Canadians have encouraged should be well considered.  Cutting costs appears to be part of the motivation behind these changes, and I have a hard time reconciling that interest with increasing the number of surveys taken over time (a less frequent census plus more regular population surveys).  The people of Australia – policymakers, researchers, and other concerned citizens – need to hold the ABS to account and demand to see how the agency plans to maintain, if not increase, the value of the information it collects.  The Save Our Census campaign indicates that some already are.

Raising Awareness Without Raising Unnecessary Procedures

Recently actress and director Angelina Jolie Pitt went public with her decision to have her ovaries and Fallopian tubes removed as a precaution against ovarian cancer.  This follows her double mastectomy in 2013 to address her risk of breast cancer.

Her willingness to discuss these decisions is admirable, and hopefully her example can prompt all of us to think and talk about cancer and cancer risk.  But, as we keep learning 43 years into the ‘war’ on cancer, the disease, and the decisions about how to address it, remain complex and challenging.  Jolie Pitt has been clear in noting all of the factors that led to her decisions and the careful thought and discussions she had before making them.  However, press coverage has focused on the choices more than on what informed them.  (In other words, if you are considering following Jolie Pitt’s lead, please read her pieces in full, and talk to your family and doctors, before proceeding.)

Before I discuss the risks and consequences of significant preventive procedures, two items are worth noting.  Jolie Pitt is in a position to spend the time and money to obtain high-quality medical care that many people cannot.  While the genetic testing that informed her mastectomy decision is likely cheaper today (since Myriad Genetics’ patent-based competitive advantage has been weakened), it is still expensive, as are the numerous procedures involved in a double mastectomy and the removal of ovaries and Fallopian tubes.

For what it’s worth, I’m in the midst of a similar, but much less serious, medical situation.  I may not get to the point where something needs to be removed, but the possibility is no longer abstract for me.  I mention this now as it has come up since I last posted on the risks and consequences of preventive screenings.

It cannot be repeated enough – Jolie Pitt has a family history of both breast and ovarian cancer, and the latter was the cause of her mother’s death.  As a result, cancer screenings would be recommended for her earlier, and perhaps more frequently, than for a woman of her age with no family history of cancer.  Hers is a special case, and not representative of an ‘average’ breast or ovarian cancer patient.

So, About Those Scientists On Trial In Italy…

Important disclaimers – I am not a lawyer, in either this country or Italy.  I also don’t speak the language, so I am relying on secondary sources.

ScienceInsider has reported on the decision to acquit six of the seven people convicted of manslaughter in connection with the 2009 L’Aquila earthquake.  They had been convicted over the poor way they communicated the risk of possible earthquakes in the lead-up to the magnitude-6.3 quake that killed 309 people.

The case is not, and never was, about the prediction of earthquakes or a misunderstanding of the underlying science.  But that was the easy message, and the one that got through, at least outside of Italy.  I could have been more effective in communicating that in the many posts I made on the topic, and I apologize for that.

Back to the latest developments.  The appellate court that acquitted six of the seven defendants (all of the scientists were acquitted, while the public official remains sentenced to six years) ruled that only the public official could be faulted for the reassurances that led some people to remain indoors.  The scientists, according to the appellate court, should not have been judged by any regulatory responsibilities they had, but by how well they complied with the accepted science of the time.  And, according to the court, the notion that a cluster of small earthquakes can indicate a larger one to come was not a commonly accepted scientific theory until after the L’Aquila quake.

That last statement seems like it could be subject to debate for years to come.  That debate might play out – at least in part – at the next level of appeals, as the chief prosecutor can appeal this latest decision to the top Italian appeals court.  So this may still not be over.

Counting The Impact Of How A Government Counts

Back in 2010, the Canadian government opted to make the long form portion of its 2011 census voluntary.  Researchers who use the data in their work and policymakers who use it to make decisions were concerned about how a voluntary survey would affect the resulting data.

As expected, early analysis suggests that the lower quality data will lead to higher spending.  Those costs might not be borne by the national government, but by the provinces, local authorities and other parties that have used this data to track changes in their jurisdictions.  Without this data, they must pay to replace it, and because of the lower quality, they are paying more for less.  Smaller jurisdictions have been harder hit by this change, as response rates have been lower in rural areas, and smaller governments are less likely to have the resources to fill those data gaps.

The Canadian Chamber of Commerce has called for the long form to be mandatory in the next census (2016), and a bill was introduced in Parliament to that effect last week.  With the Conservatives still holding a majority, it seems likely to fail.  Those in the United States may not find themselves concerned, as we still have five years until the next census.  However, the Census Bureau administers the American Community Survey, and there have been efforts to curtail that in the past.  It may happen again.

The Last Mile Isn’t Just About Broadband

Noting the upcoming 10th anniversary of the Indian Ocean tsunami, Nature analyzes the tsunami monitoring system that emerged following the devastation.

The short of it – there are still challenges at the end of the message chain.  The three regional centers were effective in communicating warnings and data to countries, but getting the message to people away from the major cities was still a struggle.  Countries can be strategic in determining which areas may be more susceptible to tsunami effects, and focus their efforts on those areas.  But the investment in infrastructure is still significant, and maintaining these networks carries a non-trivial ongoing cost.  Much as the infrastructure in the U.S. made it easier to manage the Ebola cases diagnosed in that country compared to the areas hardest hit in Africa, the communications infrastructure in Hawaii and other more developed coastal areas makes it easier for tsunami warnings to be heard.

What people do with the message when (or if) they get it is a separate question.  Rational action in the face of natural disaster seems less correlated with level of development than one might expect, but I’d love to see any studies that address this.

DARPA Wants To Fight A Bug

The Defense Advanced Research Projects Agency (DARPA) often uses challenges to stimulate research in difficult areas.  At least some of the current work in self-driving cars can be traced back to several of DARPA’s Grand Challenges in autonomous ground vehicles.

The latest challenge appears to be the first that DARPA has issued outside of engineering and/or information technology.  Last week it announced the CHIKV Challenge for teams to develop methods to track and predict the emergence of a virus (H/T ScienceInsider).  The competition is focused on the Chikungunya virus, which has appeared in the Western Hemisphere for the first time in decades.  It is mosquito-borne, and any challenge solutions proven successful could be used for other viruses, especially those carried by mosquitoes.

The competition starts on September 1 and runs through February 1 of next year.  The contest involves predictions of disease spread across the Western Hemisphere.  Entrants must submit their methodology, along with an indication of data sources and related models, by September 1.  Over the following months, teams will submit accuracy reports indicating how well (or badly) their predictions match the actual spread of the virus, and describing their predictions for the balance of the competition period.
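To give a flavor of what a prediction methodology might involve, here is a minimal sketch of a discrete-time SIR (susceptible-infectious-recovered) compartmental model, a common starting point for forecasting disease spread.  To be clear, this is not an actual challenge entry or DARPA’s specified approach; the parameter values (beta, gamma, the population figures) are illustrative assumptions, not fitted estimates.

```python
# A minimal sketch of a discrete-time SIR forecast.
# All parameter values below are illustrative assumptions, not fitted estimates.

def sir_forecast(s0, i0, r0, beta, gamma, weeks):
    """Forecast weekly infectious counts with a simple SIR model.

    s0, i0, r0 -- initial susceptible, infectious, recovered counts
    beta       -- transmission rate per week (assumed)
    gamma      -- recovery rate per week (assumed)
    """
    n = s0 + i0 + r0                      # total population stays constant
    s, i, r = float(s0), float(i0), float(r0)
    forecast = []
    for _ in range(weeks):
        new_infections = beta * s * i / n  # contacts between S and I
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        forecast.append(i)                 # predicted infectious count that week
    return forecast

# Example: a hypothetical population of 1,000,000 with 50 initial cases,
# forecast over the roughly six-month competition window (26 weeks).
print(sir_forecast(s0=999_950, i0=50, r0=0, beta=1.4, gamma=1.0, weeks=26))
```

An actual entry would go well beyond this – fitting parameters to surveillance data, folding in mosquito abundance and travel patterns – but comparing forecasts like these against reported case counts is essentially what the accuracy reports would capture.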

The top six teams will receive cash prizes (unless they are part of a federally funded research and development center).  DARPA hopes to follow in the footsteps of the Centers for Disease Control and Prevention, which held a comparable competition on predicting the timing, peak and intensity of influenza during the 2013-2014 season.

Oso Landslide Analysis Leads To Competing Theories And Possible Adaptation

In March a landslide in Oso, Washington, destroyed a neighborhood, killing 43 people.  This week two scientific analyses were issued (H/T ScienceInsider).  On Tuesday the Geotechnical Extreme Events Reconnaissance team (GEER, sponsored by the National Science Foundation) released its report.  Earlier today Science reported on the unpublished analysis from the U.S. Geological Survey and researchers at the University of Washington.  A notable difference between the two reports concerns how the slide unfolded.

The GEER team, which is set up to do quick analyses of natural disasters, theorizes that the slide happened in two phases.  The first slide was augmented by the collapse of a portion of the mountain when underlying support gave way.  One of the USGS researchers explained their theory of the slide (which was significantly larger than the smaller slides that frequent the area) as more compressed in time.  They believe the second spike in the seismic data does not mark a major event, and that an upper portion of the mountain broke off much sooner.  It comes down to debates over the proper analysis of seismic data.

What the GEER report highlights is the absence of systematic assessment of landslide potential when planning construction.  Given what has been achieved for building in areas prone to earthquakes, it’s a little surprising that similar efforts have not taken place for areas with higher potential for landslides.  The failure to use detection systems and to take advantage of historical data is similarly surprising.  Presumably the USGS report, whenever it’s released, won’t be far apart from the GEER report in its recommendations.  We’ll have to wait and see.

What you might not want to wait on is checking whether your nearest slide-prone area is taking advantage of new detection and monitoring systems.  To have the tools and not use them strikes me as tragic, especially given the catastrophic nature of most slide losses (losing one house is a catastrophe – to that family).