In a 2006 Downing Street seminar, Robert Cialdini had one message that struck particularly deeply, and led to quite a bit of nervous laughter around the State Dining Room. He talked about how often policymakers, in his opinion, not only missed the power of social norms and influence but inadvertently used them in ways that actually backfired.
He argued that too often, in their haste to impress upon colleagues or the public the gravity of a particular issue, policymakers would inadvertently reinforce the very behavior they were trying to discourage. For example, a campaign concerning knife crime would end up communicating the message that most young people carried knives, or a campaign to reduce tax evasion or benefit fraud would inadvertently communicate that everyone was at it. Such campaigns, or speeches, were the policy equivalent of putting a ‘no ball games’ sign on a high brick wall—well-meaning, but almost bound to prompt the thought ‘well, now that you mention it, that is a great surface for a ball game….’
In the years since that seminar I have lost count of the number of examples of Robert Cialdini’s ‘big mistake’ that I have seen. In most cases the effect has not been evaluated, but we know enough about the power of descriptive social norms to be troubled. Examples include: posters telling immigration officers that some of their colleagues have been caught and punished for selling work visas (‘Never thought of that—I wonder how much they made’); signs in doctors’ surgeries about the number of people who missed their appointments in the last month (‘… so I’m not the only one’); and national campaigns bemoaning the low number of women on top company boards (‘well, we’ve got a woman on our board of twelve, so that’s pretty good, then’).
Unfortunately, certain institutions in society also inadvertently make, or amplify, the ‘big mistake.’ The media itself is one of these, albeit echoing our own human interest in stories about crime, threat or deceit. No wonder that the majority of the public, in the UK and elsewhere, continue to feel that crime is rising (at least at a national level), despite every reliable measure documenting its fall over the last 20 years. Similarly, welfare and regulatory systems often inadvertently signal that most people are not to be trusted, since they are routinely built around the assumption, and tests, that people are cheating and breaking the law. The evidence on social norms suggests that such signaling is likely to increase levels of cheating.
Yet it is often possible to flip these campaigns and effects around. Take the example of getting more women on company boards, an issue widely championed by campaigners and indeed Prime Ministers, but one that often embodies a clear example of the ‘big mistake.’ The usual centerpiece of campaigns to get more women on boards is a statistic along the lines of ‘isn’t it shocking that only 25 per cent of board members are women?’ (fewer in some countries). It is shocking, but it is also a message likely to inadvertently normalize the situation. If, on the other hand, such campaigns made the equally valid point that ‘90 per cent of companies have women on their boards,’ the signaling would be very different. Following discussions with Iris Bohnet, an expert on gender inequality, and Emily Walsh, special adviser to the UK’s Business Secretary, parts of the UK’s campaign to encourage more women on to boards have indeed been reframed in this way.
Similarly, most industrialized countries are wrestling with the problem of obesity, and campaigns that reflect these particular social norms back at the public are likely to backfire. Indeed, there is specific evidence that obesity is ‘contagious’: when those around you get fat, so do you, and your sense of what is healthy shifts with it. This raises difficult issues, including occasionally the temptation to ‘nudge’ the presentation of numbers to stop the situation getting worse. For some years, the weight curves given to US parents to plot their child’s development have not been the actual weight curves of US children. Rather, they are generally distributions of ‘healthy weight,’ driven by the concern that showing parents the actual weight curves of US children would exacerbate the problem of obesity even further.
Cialdini’s ‘big mistake’ provides a clear example of why governments and businesses can benefit from learning even just a little about behavioral insights. Even if you have no interest in actively using nudge approaches you should at least want to know where you have inadvertently stumbled into deploying them against yourself.
Excerpted from Inside the Nudge Unit: How Small Changes Can Make a Big Difference, WH Allen. Copyright © 2015 David Halpern. Reprinted with permission. Available in the United States December 15, 2015.
David Halpern is the Chief Executive of the Behavioral Insights Team and a Board Director. He has led the team since its inception in 2010. Prior to that, David was the founding Director of the Institute for Government, and between 2001 and 2007 he was the Chief Analyst at the Prime Minister’s Strategy Unit. Before entering government, David held tenure at Cambridge and posts at Oxford and Harvard. He has written several books and papers on areas relating to behavioral insights and well-being, including Social Capital (2005), The Hidden Wealth of Nations (2010), and most recently Inside the Nudge Unit (2015).
Further Reading and Resources
- Halpern, D. (2015). Inside the Nudge Unit: How Small Changes Can Make a Big Difference. London, UK: WH Allen.
- Behavioral Insights Team. (2015). Update Report 2013-2015.
- Behavioral Insights Team. (2010). MINDSPACE: Influencing Behavior through Public Policy.
- Nesterak, M. (2014). Nudging the UK: A Conversation with David Halpern. The Psych Report.
- Schwartz, B. (2014). Why Not Nudge? A Review of Cass Sunstein’s Why Nudge. The Psych Report.
- Nesterak, E. (2015). How Do We Solve the Last Mile? The Psych Report.