Sarah Perry’s monumental essay on Cooperative Ignorance makes a compelling point: when interacting with other human beings (as opposed to indifferent nature), it is often beneficial to be deliberately ignorant or otherwise constrained in one’s options. In this overview I’ve compiled a few highlights; read the whole thing when you can make the time. First off, Perry cites J. David Velleman for an obvious example of the strength of ostensibly lacking options in negotiations:
The union leader who cannot persuade his members to approve a pay-cut, or the ambassador who cannot contact his head-of-state for a change of brief, negotiates from a position of strength; whereas the negotiator for whom all concessions are possible deals from weakness. If the rank-and-file give their leader the option of offering a pay-cut, then he may find that he has to exercise that option in order to get a contract, whereas he might have gotten a contract without a pay-cut if he had not had the option of offering one. The union leader will then have to decide whether to take the option and reach an agreement or to leave the option and call a strike. But no matter which of these outcomes would make him better off, choosing it will still leave him worse off than he would have been if he had never had the option at all.
In the absence of pig-headed union members, a suitable replacement is strategic ignorance of any options one could or should exercise. This takes the common form of plausible deniability among politicians and managers. Linsey McGoey describes the usefulness of incompetent experts for disclaiming possible knowledge of a pharmaceutical scandal:
A curious feature of knowledge alibis is that experts who should know something are particularly useful for not knowing it. This is because their expertise helps to legitimate claims that a phenomenon is impossible to know, rather than simply unknowable by the unenlightened. If the experts didn’t know it, nobody could.
Selecting such anti-experts and the topics one wants to avoid researching is itself a mental effort, which McGoey dubs negative knowledge: “an awareness of the things we have no incentive or interest in knowing about further.” Negative knowledge can be useful even for oneself, as in the case of quitting smoking. Perry cites a study by Carrillo & Mariotti showing that people greatly overestimate the risk of getting lung cancer from smoking. This overestimation is beneficial: the actual risk is still substantial (5-10%), but the imagined risk (~40%) is far scarier and so compels smokers to reduce or quit the habit. Similarly, people will choose to remain deliberately ignorant of some drug’s pleasurable effects so as to avoid its consumption, or of some career’s financial risks so as not to endanger their commitment.
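The disciplining effect of the overestimate can be sketched as a toy threshold model. This is my own illustration, not anything from Perry or from Carrillo & Mariotti: the quit threshold is a made-up parameter, and only the two risk figures come from the text above.

```python
# Toy model (illustrative only): a smoker quits if their *perceived* lifetime
# lung-cancer risk exceeds a personal "scary enough" threshold.
# Risk figures are the ones cited in the essay; the threshold is hypothetical.

ACTUAL_RISK = 0.075      # midpoint of the 5-10% actual risk cited above
PERCEIVED_RISK = 0.40    # the inflated risk smokers reportedly imagine
QUIT_THRESHOLD = 0.20    # hypothetical: the risk level that motivates quitting

def quits(believed_risk: float, threshold: float = QUIT_THRESHOLD) -> bool:
    """A smoker quits when the risk they believe in crosses their threshold."""
    return believed_risk > threshold

print(quits(ACTUAL_RISK))     # False: the true risk alone wouldn't motivate quitting
print(quits(PERCEIVED_RISK))  # True: the overestimate does the disciplining work
```

The point of the sketch is that accurate knowledge (0.075) would leave the smoker below the threshold, while strategic ignorance of the true figure keeps the belief at 0.40 and the behavior in check.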
Perry then synthesizes strategic ignorance with belief-based group cohesion to arrive at the titular cooperative ignorance. This describes common deliberate ignorance of any facts or arguments that might shake a member’s faith in his group’s beliefs and thereby render his ritualistic in-group signaling suspiciously slow or weak – the essence of George Orwell’s crimestop. Ignorance protects members from rejection by the group, and the group itself from decohesion through doubt and criticism. This sounds sinister, but it is to some degree a necessity for any stable group capable of protecting its interests and members.
The more “checks” a group has available to detect deception by a member, ideological or otherwise, the better its chances of catching it reliably. So a member must be able to pass as many checks as possible: crying or laughing or becoming angered when necessary, performing rituals without groaning too much, and demonstrating appropriate cognition as evidenced by speech and action. As an example of such cognitive leakage, the time it takes to make a decision or produce a response is itself a signal of what’s going on mentally. An insincere person or liar, or one playing another strategy that requires more processing time, will take longer to respond.
Finally, cooperative ignorance on specific subjects can be deliberately imposed by sufficiently powerful governments or other organizations. Someone still benefits here, but it is no longer necessarily the ignorant themselves. Perry’s closing thoughts on that subject present a highly recognizable picture of contemporary media campaigns and academic careers…
A powerful method to deal with curiosity is to manipulate the results of research by proffering a biased sample: ensuring that when research does occur, the early stages of research confirm the group fantasy and discourage further research. Benevolent governments or organizations might ensure that research on smoking quickly turns up only negative information: short, easy-to-digest information with headings like “Get the Facts!” that obscure the true risk in favor of reporting on harms alone. […] If reading and learning in general are to be discouraged because they are bad for us, a government might fill its schools with boring, irrelevant, tedious literature.
A related method is to disguise a tabooed area of reality as something else. When a curious human attempts to research “medicine” or “education” or “philosophy,” he will be harmlessly diverted into a safe zone of research that conveniently goes by the name of the dangerous area, and never know the difference. […] certain people might be given high status, legitimacy, and attention, on the expectation that this makes them experts who should know something. However, as a condition for their high status and legitimacy, they must be persuaded to only reveal beneficial information, and not defect from the cooperative ignorance pact. Such people can make cooperative ignorance agreements much stronger.