The blog Overcoming Bias pointed to an article in Reason Magazine, “More Information Confirms What You Already Know.”
The article cites a study by the Cultural Cognition Project at Yale Law School, “Affect, Values, and Nanotechnology Risk Perceptions: An Experimental Investigation,” which sought to assess attitudes toward new technology but has broader implications:
[R]esearchers polled 1,850 Americans about their attitudes toward nanotechnology. Eighty-one percent of those polled had heard nothing at all (53 percent) or “just a little” (28 percent) about nanotechnology. Nevertheless, after being offered a bare bones two-sentence definition of nanotech, 89 percent of respondents had an opinion on whether the benefits (53 percent) of nanotech would outweigh the risks (36 percent). So how could people who know nothing or almost nothing about a new technology have an opinion about its safety? Pre-existing world views, of course. “The driving force behind these snap judgments, we found, was affect: the visceral, emotional responses of our subjects, pro or con, determined how beneficial or dangerous they thought nanotechnology was likely to be,” write the authors.
The researchers, relying on work by social scientist Aaron Wildavsky, divided Americans into four cultural groups with regard to risk perception: hierarchists, individualists, egalitarians and communitarians. Hierarchists trust experts but believe social deviancy is very risky. Egalitarians and communitarians worry about technology but think that social deviancy is no big deal. Individualists see risk as opportunity and so are optimistic about technology.
“Egalitarians and communitarians, for example, tend to be sensitive to claims of environmental and technological risks because ameliorating such risks justifies regulating commercial activities that generate inequality and legitimize unconstrained pursuit of self-interest,” claim the researchers. “Individualists, in contrast, tend to be skeptical about such risks, in line with their concern to ward off contraction of the sphere of individual initiative. So do hierarchists, who tend to see assertions of environmental technological risks as challenging the competence of governmental and social elites.”
Not surprisingly, the researchers found that people who were concerned about environmental risks such as global warming and nuclear power were also concerned about nanotechnology. However, the Yale Cultural Cognition researchers made another, more disheartening discovery. In their poll they gave a subset of 350 respondents additional facts (about two paragraphs) about nanotechnology to see if more information would shift public risk perceptions. It did, but not toward consensus: the more information people had, the more they retreated to their initial positions. Hierarchists and individualists thought nano was less risky, while egalitarians and communitarians thought it was more risky.
“One might suppose that as members of the public learn more about nanotechnology their assessments of its risk and benefits should converge. Our results suggest that exactly the opposite is likely to happen,” note the researchers. What seems to be happening is that individuals use information to affirm their pre-existing cultural identities rather than evaluate risks in purely instrumental terms. Think now of the scientists, technologists and, yes, regulators who have to try to bridge these diverse cultural values. More specifically, they have to figure out how to persuade communitarians and egalitarians that technology somehow affirms their values. And this is no easy task.
Overcoming Bias elaborated:
This does not bode well for public deliberations on new technologies (or political decisions on them), since it seems to suggest that the only thing that will be achieved in the deliberations is a fuller understanding of how to express already decided cultural/ideological identities in regards to the technology. It does suggest that storytelling around technologies, in particular stories about how they will fit various social projects, will have much more impact than commonly believed. Not very good for a rational discussion or decision-making, unless we can find ways of removing the cultural/ideological assumptions of participants, which is probably pretty hard work in deliberations and impossible in public decisionmaking.
The implications doubtless extend way beyond technology and explain why we have such lousy policies on so many fronts. People don’t first agree on facts and then discuss the implications in terms of their various frames of reference; instead, they interpret the facts themselves in light of their value frameworks. And if you can’t agree on facts, you can’t get anywhere very useful in a negotiation.
Let’s relate this notion to one of the topics of this blog, subprimes. Some people believe that subprimes have been good for lower income people and have given them access to housing they couldn’t have had otherwise. Others believe that the product has done so much damage to so many people that it outweighs whatever good might have been done (and informed cynics note that subprimes took share from old FHA loans, and that many subprime borrowers were not getting credit they could not have obtained otherwise).
Now one could get a good bit more granular on the subject of the impact of subprimes. Who really did get them? How many were first-time homebuyers (as opposed to prime borrowers who, due to defaults, refinanced into subprimes)? How many of these are in trouble or (due to resets) are likely to get into trouble? How much smaller would the market have been if there had been only FHA loans? How many of the troubled borrowers would have been eligible for FHA loans?
The answers to these and at most ten other questions would give you a very good basis for analyzing what went wrong with subprimes and what (if anything) should be done about it. But the Yale research suggests that this sort of study will not advance the cause of truth. The resulting information will (in most cases) simply be interpreted in light of one’s existing position.
Maybe we should all give up and become lobbyists…