Every fall, I teach a graduate seminar in risk communication at the University of Michigan. I’ve been revising the class reading list over the summer, both to add newer material and to remind myself of the key messages I try to communicate.
One of these key messages comes up most cleanly in the class session about communication and perception of climate change risks. Put simply, the message is this:
People don’t understand how complex systems work. Even highly educated people often don’t understand how complex systems work.
They just think they do.
It would be one thing if the first reaction of most people to thinking about a complex risk was, “wow, that looks complex!” We know that risks that seem particularly uncertain or unknowable make us more concerned. Thus, you would think that really, really complex risks like climate change would make us really, really concerned.
But, they don’t for many people. And I don’t think that this is because people aren’t presented with facts or evidence. It’s because people’s mental model of weather and climate is much simpler than the actual system is, and their conceptual misunderstanding leads them to draw dramatically different conclusions from the evidence they see and hear.
Years ago, John Sterman (Professor at MIT’s Sloan School of Management) popularized a (now computerized but once manual) game for MBA students called The Beer Game that is designed to help them understand the dynamics of complex systems. What it does in particular is show them how incredibly hard it is to have good intuitions about what to do about complex systems.
The game is conceptually simple: You’re one of the key players in the beer distribution process (e.g., a factory, a retailer, or a beer distributor). If, say, you’re the distributor, you order beer from suppliers, but it takes a while to actually arrive. You try to keep enough beer on hand to meet varying demand from customers. Simple, right?
Wrong.
The MBA students (who are all, of course, very intelligent and motivated) usually start with a simple model in mind: If there’s more demand, order more beer. Less demand, order less. The problem is that they don’t anticipate the delays in supply. These delays mean that demand today has to be filled from orders placed a week ago, and today’s orders will only help fill demand a week from now.
As a result, if there’s a really high spike in demand (think Super Bowl), students tend to react to the short-term shortage by ordering lots more beer from the supplier. The problem is that the spike in demand is over by the time that the (now really large) orders arrive, creating oversupply. In response, students tend to stop placing any new orders, which then creates undersupply a week later. Cyclical patterns of over- and under-response are common. More importantly, the longer the delays between orders and the arrival of supply, the worse people are at managing the system.
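You can see this oscillation in a few lines of toy code. This is a deliberately simplified sketch of the ordering logic described above, not Sterman’s actual game: one distributor, a three-week delivery delay, and a naive rule of “reorder what was demanded, plus whatever it takes to get back to my target stock.” All the numbers are illustrative.

```python
# Toy sketch of a Beer Game-style distributor (illustrative, not Sterman's model).
from collections import deque

DELAY = 3              # weeks between placing an order and its arrival
TARGET = 20            # inventory the distributor tries to keep on hand
inventory = 20
pipeline = deque([4] * DELAY)          # orders already in transit

# Steady demand of 4 cases/week, with a one-week spike (the "Super Bowl")
demand = [4] * 4 + [12] + [4] * 15

history = []
for d in demand:
    inventory += pipeline.popleft()    # this week's delivery finally arrives
    inventory -= d                     # fill customer demand (negative = backlog)
    # Naive rule: replace this week's demand and close the gap to target,
    # ignoring everything already in the pipeline.
    order = max(0, d + (TARGET - inventory))
    pipeline.append(order)
    history.append(inventory)
```

Running this, a single one-week spike produces exactly the pattern described above: inventory dips, then the delayed over-orders pile in and overshoot far above target, then ordering stops entirely and inventory sags again. The bug, such as it is, is in the ordering rule: it reacts only to what is visible today and ignores what is already in transit.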
Why do I bring up the Beer Game in the context of climate change? Because there are lots of analogies between MBA students’ failures to understand the complex system of supply, demand, and (most importantly) delays, and the general public’s failure to understand the complex dynamics of climate change and weather.
In 2008, Sterman published a very nice article in Science, which I use in my class, titled “Risk Communication on Climate: Mental Models and Mass Balance.” In it, he argued that most people have a naive mental model of climate that is pretty simple: The more bad stuff we put in the atmosphere, the warmer it (might) get. Stop putting it in, and things get better pretty quickly.
The implications of this model are straightforward: If you can change the climate fast, you only need to intervene when the current situation becomes intolerable. As soon as it does, changing behavior results in improved outcomes.
Only one problem: Greenhouse gases and other drivers of climate change aren’t like that. They last. As a result, climate changes have momentum. Thus, even if the world were to stop adding greenhouse gases today, it would take decades before the changes induced by what has already occurred would stabilize.
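The “stock” logic behind this paragraph can be sketched as a bathtub that drains very slowly. The numbers below are toy values chosen for illustration (not climate-model output): an excess concentration above the pre-industrial baseline, and a small fraction of that excess absorbed each year once emissions stop entirely.

```python
# Toy stock-and-flow sketch (illustrative numbers, not a climate model):
# even if emissions stop today, the accumulated excess drains away slowly.
excess = 130.0         # ppm above a pre-industrial baseline (assumed value)
removal_rate = 0.01    # fraction of the excess absorbed per year (assumed value)

years_to_halve = 0
while excess > 65.0:   # how long until the excess is merely cut in half?
    excess *= (1 - removal_rate)
    years_to_halve += 1
```

Even with emissions at zero from day one, a 1%-per-year drain needs on the order of seventy years just to halve the accumulated excess. That is the momentum: the warming driver is a stock, not a flow, so stopping the inflow does not stop the problem.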
Just like a beer distributor who doesn’t account for the delay between orders and delivery, people generally don’t account for the delay between human actions and observable outcomes. As a result, the complex system leaves them feeling safe even when it is not. Many people believe that even if climate change is occurring, we still have time to make changes to prevent harm. Failure to anticipate the momentum of the system, however, means that they overestimate how quickly changes in behavior could result in improvements.
It is worth noting that there are experts who really understand these types of feedback and feed-forward delays. Engineers and scientists who work in control systems deal with such issues on a daily basis. Their training and experience have taught them that they have to anticipate longer-term effects when they plan short-term responses.
For the rest of us, however, I wonder whether it might be possible to generate better understanding of complex system risks if we drew analogies to real-world things that have similar momentum properties. Most people get the idea that it takes a long time for a train or a large boat to come to a stop. They understand that letting up on the gas pedal when driving a car on the highway doesn’t mean that you’ll immediately come to a stop. Maybe, if we discussed complex system risks in terms of real-world analogies instead of quantitative data, those risks would make more sense.
For example, consider that the climate isn’t like an agile sports car making sharp turns on a winding road. It’s more like a fully loaded 18-wheel truck driving on ice. Just like the truck, whether the climate system is moving fast or slowly, it has a lot of momentum, and it’s not going to stop any time soon.
Analogies are very powerful communication tools. On the plus side, they can help people draw on their everyday experience to understand risks and systems with which they have no direct experience. The challenge with using analogies, however, is that the analogy has to focus our attention on the relevant complexity without confusing us. You might have noticed that I intentionally described the analogical 18-wheel truck as driving on ice. The reason was that I didn’t want you thinking about the fact that most trucks have brakes (because our climate system doesn’t have any).
The climate can’t stop on the proverbial dime. So maybe letting up on the accelerator a bit earlier than we think we need to might be a good idea. And maybe talking about climate risks through the language of real world models of momentum will help people understand why.
Brian J. Zikmund-Fisher is an Assistant Professor of Health Behavior & Health Education at the University of Michigan School of Public Health and a member of the University of Michigan Risk Science Center and the Center for Bioethics and Social Sciences in Medicine. He specializes in risk communication to inform health and medical decision making.