“Sea no evil”

Posted June 5th, 2012 by Sylvia S Tognetti and filed in Interfaces of science and policy
[Embedded video: The Colbert Report, "The Word – Sink or Swim"]


Apparently expecting the future to be like the past, North Carolina GOP legislators last week circulated a bill that begged for, and just received, the Colbert Report treatment. The bill not only dictates that estimates of sea-level rise (SLR) be based solely on historical trends since 1900, and that they may not take into account scenarios of accelerated SLR; it also forbids state public entities from making policies based on any estimates not produced by the only state agency the bill authorizes to make them – the NC Division of Coastal Management, which may do so only at the request of the Coastal Resources Commission, using said method. Eli Rabett sniffed out the back story, with lots of links, and more added in the comments.

The bill, apparently based on the advice of one John Droz, was prepared in response to a report of a state science panel that recommended adopting a 1 meter (39″) rise in sea level by 2100 for purposes of policy and planning. Droz, a retired real estate developer with a degree in physics, not a single peer-reviewed publication on climate or any other topic, and a role as science advisor to NC-20 (an advocacy group representing coastal counties), “says he consulted with 30 sea-level experts, most of them not named” in his critique of the science panel’s estimates. Which is not how science works.

As a consequence of this “fight against sea level rise predictions,” if the bill were to pass, state agencies would be unable to plan for and respond to actual sea-level rise, even when using federal funds. According to the Charlotte Observer, a $5 million FEMA-funded study being carried out by the NC Division of Emergency Management would only be able to report a projected SLR range of 3.9 to 15.6 inches, rather than 1 meter.
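To see why restricting projections to a linear historical trend matters so much, here is a minimal sketch. The rate and acceleration values are illustrative assumptions, not the study’s actual parameters: a constant historical rate of roughly 2 mm/yr extrapolated to 2100 yields only a few inches, while adding a modest acceleration term, as semi-empirical SLR projections do, gets close to the science panel’s 1 meter figure.

```python
MM_PER_INCH = 25.4

def linear_slr_inches(rate_mm_per_yr, years):
    """Bill's mandated method: extrapolate the historical rate linearly."""
    return rate_mm_per_yr * years / MM_PER_INCH

def accelerated_slr_inches(rate_mm_per_yr, accel_mm_per_yr2, years):
    """Add a constant-acceleration term on top of the historical rate."""
    rise_mm = rate_mm_per_yr * years + 0.5 * accel_mm_per_yr2 * years ** 2
    return rise_mm / MM_PER_INCH

years = 2100 - 2012  # 88-year planning horizon

# Illustrative inputs: ~2 mm/yr historical rate; acceleration chosen so the
# total lands near 1 m (39"), the science panel's recommended figure.
print(round(linear_slr_inches(2.0, years), 1))         # ~6.9 inches
print(round(accelerated_slr_inches(2.0, 0.21, years), 1))  # ~38.9 inches
```

The same starting rate produces wildly different planning numbers depending on whether acceleration is allowed into the calculation, which is exactly what the bill forbids.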

Which will lead to even higher costs as the state attempts to stabilize beaches that are already eroding as a result of coastal development patterns.

[Long promised and long overdue post revisiting Post-Normal Science in Post-Normal Times, coming soon….]


Structuring science for policy

Posted January 29th, 2010 by Sylvia S Tognetti and filed in Interfaces of science and policy

Complex problems, like climate change or the decline of honeybees, ultimately come down to debates about causation that tend to be highly contentious because, with multiple potentially contributing factors, uncertainties can never be fully eliminated. When stakes are high and decisions are urgent, the analytical difficulties are inevitably compounded by value judgments. A new paper by Laura Maxim and Jeroen van der Sluijs, nicely summarized by Kate Ravilious on the Environmental Research Web, provides an approach for analyzing these kinds of debates that could serve as a structure for transparent assessment of any type of complex environmental problem.

In this paper, they apply the approach to a case study of the decline of honeybees in France, which appeared to be associated with the use of the insecticide Gaucho as a sunflower and maize seed dressing. Several other potential factors were also widely cited in the public discourse, such as imported queens, unfavorable climate during flowering, insufficient pollen, diseases and viruses, inadequate or illegal use of pesticides, and changes in sunflower varieties. Using a set of criteria for causality, they developed a set of questions regarding the relationship between each of these potential factors and the signs of the problem, which included a 30–70% loss of honey and lethal and sub-lethal signs in the bees during flowering (e.g., mortality, paralysis, loss of orientation, apathy, shivering, and other abnormal behaviors). For each question and each potential causal factor, stakeholders were asked to score the convincingness of the evidence on a scale of 1–10, based on standards used in US courts, and to provide justification for their scores. The stakeholders interviewed included two representatives each from the Bayer Institute of Crop Science, AFSSA (the French food safety authority), and the French Ministry of Agriculture, along with five public scientists and 20 beekeepers who had experienced the problem in their own apiaries.
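The scoring step can be pictured with a small sketch. The factor names and score values below are hypothetical stand-ins, not the study’s data: each stakeholder assigns a 1–10 convincingness score to each candidate factor, and the scores are then aggregated and ranked.

```python
from statistics import mean

# Hypothetical 1-10 convincingness scores from five stakeholders per factor;
# the actual study collected per-question scores with written justifications.
scores = {
    "Gaucho seed dressing": [9, 8, 9, 7, 8],
    "imported queens":      [2, 3, 1, 2, 2],
    "unfavorable climate":  [3, 4, 2, 3, 3],
}

# Aggregate convincingness per factor and rank candidates, highest first.
ranked = sorted(scores, key=lambda f: mean(scores[f]), reverse=True)
for factor in ranked:
    print(f"{factor}: {mean(scores[factor]):.1f}")
# Gaucho seed dressing: 8.2
# unfavorable climate: 3.0
# imported queens: 2.0
```

Even this toy version makes the method’s appeal visible: the scores and their justifications leave an auditable trail showing why some candidate causes drop out.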

Among the results: an association between five of the eight potential factors and the lethal and sub-lethal effects that had been observed could be ruled out, because the proposed mechanisms were either not biologically plausible, not verified in the field, or unknown. The authors note that, although the scientific details of these potential factors were never addressed, they were widely cited as “plausible” in the public discourse. They also note two distinct storylines. One, defended by the beekeepers and the public scientists, and based on both field observations and scientific studies of the impacts of imidacloprid (the active ingredient in Gaucho), was that Gaucho was the main contributor to the loss of honeybees in areas with seed-dressed sunflower and maize crops. The second, defended by Bayer and AFSSA, and based on research that did not reproduce the observed effects, was that other factors were to blame; its defenders also referred to honeybee losses in general, rather than to the particular sunflower and maize areas where crops had been treated with Gaucho.

In other words, the results “showed that in public discourses, some expert actors can present as being plausible hypotheses which are not scientifically validated and thus downplay a correct understanding of the problem by their listeners. Not all experts are equally attentive to the robustness of the scientific support for the hypothesis that they evoke.”

To anyone who has followed the public discourse on climate change, this story will sound familiar. The approach offers a more formalized and systematic way of asking the kinds of questions often raised in that context, usually in a more diffuse manner that can be hard to track for anyone who has other things to do. Although the results are unlikely to persuade those whose aim is to delay action by sowing confusion, the approach can provide some much-needed transparency for those who are perplexed by contradictory messages about the science of climate change, and who are truly interested in good-faith negotiation of policies that rest on science.

Taking a holiday from reality

Posted May 6th, 2008 by Sylvia S Tognetti and filed in Interfaces of science and policy

Why not a total gas tax holiday? As I said previously, good luck trying to reinstate it in the fall. And as Stephen Colbert explains, it is always summer somewhere, and soon it will be summer everywhere! If this thing flies, it will just prove his point: that willingness to go against the experts proves one is ready to be president in this country. And mine: that experts still have a thing or two to learn about communicating with people and informing policy. But with a little help from the blogosphere and more than one late-night comedian, they seem to be doing better this time than when Bill Clinton raised the gas tax by a nickel in 1993 and then lost the Democratic majority in Congress, so we shall see… If nothing else, this lunacy will perhaps get a few more voters to consider the trade-offs we are all making, or are forced to make, since we all get stuck with the policies of whoever is elected, not to mention the consequences.

Communicating complex science for policy

Posted December 6th, 2007 by Sylvia S Tognetti and filed in Interfaces of science and policy

Besides the PNT, I also write an occasional e-bulletin on payments for watershed services, which is now another blog! Since communication of science for policy is an underlying theme of the PNT, I have cross-posted the latest one below. There have been a number of other good posts on the subject recently that I haven’t blogged about because I haven’t had time to add any thoughts, but for now I will just call your attention to a post by Andrew Revkin, which, among other things, discusses the tendency to “normalize” a bad situation. More on that soon, I promise. In the meantime, the most recent Flows bulletin:

Forests and water: Communicating complexity and shaping policy

Whether or not the absence of trees causes flooding or water shortages is a question that persists, perhaps because it produces overly generalized answers that fit easily into existing preconceptions. It also fits easily into policy frameworks and stories that paint the world in black and white. Depending on the latest scientific publication, newspaper headlines can proclaim trees to be a menace that is advancing the desert – or failing to regulate floods. But single scientific studies generally address only fragments of a larger puzzle, and few if any experts endorse the one-size-fits-all approach that the media implies (Nambiar 2006).

These kinds of generalizations also support rigid land use policies and conveniently eliminate nuances that are better addressed with a more flexible, place-based approach, which is necessary to manage an ecosystem. The tremendous interest in payments for watershed services is driven in part by the popular appeal of this generalized model, in which the flow of water that links upstream practices to downstream consequences also provides scientific justification for a market-based approach to conservation. As an added benefit, payments for watershed services would also contribute to poverty alleviation in marginal upper watershed areas. In practice, there are often trade-offs between meeting these diverse objectives, and implementation is never as elegant as the model.

A set of ICRAF (2006) policy briefs synthesizing two decades of research in this field asserts that what matters is not the presence or absence of trees, but the types of trees and where they are located. Also important is what happens to land after forests are removed (Bruijnzeel, 2007). These factors all have implications for the amount of water trees consume and the extent to which they control erosion. It is also important to keep in mind the pathways of water and sediment flow, some of which have created today’s fertile land. Rivers may be muddy because of landslides, erosion of banks during peak flows, or sediment from roads and paths, rather than because of open fields.

Mosaics of mixed land use – combining forestry, agroforestry and upland cropping – are typical of traditional upper watershed systems and can support denser populations than forested areas. However, they generally don’t fit into the discrete classifications found in land use policies, in which land is designated for either forested or agricultural use. As a result, farmers are often excluded from access to traditionally used land areas, causing conflict with states.

Controversies about forest and water relationships are deeply rooted, going back at least as far as the late 1800s and the promotion of settlement in the arid American West. Following what had been a wet period, Ferdinand Hayden claimed that, if trees were planted across the Great Plains, “aridity would give way to well-watered fertility” and rain would follow the plow (Worster, 2001). Based on the results of an extensive survey, John Wesley Powell doubted these claims in his prophetic 1878 Report on the Lands of the Arid Region of the United States. He had, however, observed an association between increased streamflow and upland deforestation, which became a justification for more centralized authority over land use and resource management (Worster 2001). Eventually this resulted in policies of state control over forests to assure the steady flow of water for irrigation and other downstream uses, and for efficient management of timber resources (Hays 1959). It also reinforced existing European land use policies rooted in the feudal period, and became a model for colonial and exclusionary resource management and state ownership of forests elsewhere in the world (Fay and Michon, 2003).

Given this historical context, scientists can no longer play the role of disinterested bystanders. Instead, they need to engage interactively with the public and be aware of the potential uses of their findings in the policy arena. According to Jasanoff (2007), interactive engagement by scientists can help the public think critically about science and bring a healthy skepticism to its claims, instead of accepting it as an arbitrary set of well-established facts. As with climate change, greater public appreciation of the scientific process can help reduce manipulation of the facts in the policy arena, where scientific uncertainty is often cited as justification for arbitrary or delayed decisions.

Given the inherent uncertainties of watershed processes, particularly in the context of highly diverse upland environments, participatory processes are essential for assessing the science and establishing policy-relevant facts. Place-based assessments can also support more nuanced messages that enable mutual learning and more flexible approaches to management. As a more interactive approach to communication, this mutual learning can help broaden the frame of reference for decision-making and enable consideration of trade-offs between the various kinds of ecosystem services and the multiple ways they support human well-being.

References, further information, and new resources are listed below the jump.
