Friday, August 27, 2010

A cultural Uncertainty Principle

This is a kind of extension of that "attitude of epistemic humility" discussed in an earlier post. That stemmed from an article by Jim Manzi on the difficulty of evaluating the effectiveness of economic policies, given the impossibility of assessing counterfactuals -- i.e., of knowing what would have happened without the policy, or if a different policy had been in place. That's a fairly narrow limitation, however, and I've already suggested ways in which such uncertainty could at least be reduced (e.g., by examining different policy approaches to the same problem in different jurisdictions). But Manzi's piece itself refers back to his own earlier and more general article in City Journal, "What Social Science Does -- and Doesn't -- Know" (subtitled "Our scientific ignorance of the human condition remains profound.").

In it, he brings up another kind of complication, one that arises first in medicine and then in the social sciences, which he calls "causal density":
... as experiments began to move from fields like classical physics to fields like therapeutic biology, the number and complexity of potential causes of the outcome of interest—what I term 'causal density'—rose substantially. It became difficult even to identify, never mind actually hold constant, all these causes.
Which is a good point, though it's certainly been noted and complained about previously. And in thus merging social science with medicine, it obscures a much more problematic source of trouble, which is this: behind every hypothesis that's tested in a scientific experiment there lies a theory of some sort, though not always an explicit or well-formulated one, nor even necessarily a conscious one. In the latter case, of course, we're talking about unconscious assumptions that can affect the whole design and outcome of the experiment or study. In the physical sciences, such assumptions are usually the result of insufficient care or analysis (leaving Kuhn aside for now), and there is a strong culture dedicated to exposing just such faults. But in the social sciences, these kinds of assumptions are very commonly interlaced with social and political values, and attempting to expose them can run into much more complex and deeply rooted political issues and conflicts, rather than simply scientific or rational ones.
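
To see what causal density does to inference even before politics enters, here's a toy simulation of my own (not Manzi's; the numbers are made up purely for illustration). Each subject's outcome depends on a treatment with a true effect of 1.0 plus some number of independent background causes; as that number grows, the same randomized experiment yields ever noisier estimates of the effect:

    import random

    def run_trial(n_subjects, n_causes, effect=1.0):
        """One randomized trial: the outcome is the treatment effect (if
        treated) plus the summed influence of n_causes background factors."""
        treated, control = [], []
        for i in range(n_subjects):
            background = sum(random.gauss(0, 1) for _ in range(n_causes))
            if i % 2 == 0:                      # alternate assignment
                treated.append(effect + background)
            else:
                control.append(background)
        return sum(treated) / len(treated) - sum(control) / len(control)

    random.seed(0)
    for n_causes in (1, 25, 100):
        estimates = [run_trial(200, n_causes) for _ in range(500)]
        mean = sum(estimates) / len(estimates)
        sd = (sum((e - mean) ** 2 for e in estimates) / len(estimates)) ** 0.5
        print(f"{n_causes:3d} background causes: "
              f"estimated effect = {mean:.2f} +/- {sd:.2f} (true effect 1.0)")

With one background cause the estimate is tight; with a hundred, the spread swamps the very effect being measured -- and that's in a clean randomized design, with none of the complications discussed below.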

So this is the first source of inherent uncertainty in this area: in touching upon our social position in the world, social science cannot avoid the political issues involved in its research -- it is inherently politicized. And unfortunately, as we all know, it's politicized largely in one direction -- toward the liberal-left (see, e.g., "The Social and Political Views of American Professors" [PDF], by Gross and Simmons, 2007, as just one among many indicators). All too often, this results in "studies" that show mainly what the studier wants them to show, though of course they usually go through the motions of a "balanced" enquiry. Such studies are what Richard Feynman referred to as "cargo cult science": "they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential", which he called "scientific integrity", and which manifests itself in a serious commitment to finding the holes, flaws, weak points, over-statements, and falsifications in both one's own and others' work. But that's precisely what's so difficult when dealing with issues that touch upon one's deepest convictions.

And there's another, even deeper, source of social science uncertainty. Knowledge or understanding is an important -- in many ways, central -- aspect of what we are, and it has a decisive effect on how we behave. But, even allowing for the inherent ideological bias mentioned above, social science is itself knowledge, and hence it affects -- i.e., changes -- the behavior of its object of study. There is therefore an Uncertainty Principle at work in attempts to make a science of human behavior -- not as precise as Heisenberg's, certainly, but analogous in the way in which the observed is unavoidably affected by the observer. It's an effect that has shown up repeatedly in economics, where predicted behavior patterns are internalized and taken into account, with the resultant behavior only explicable in hindsight. It lies behind the "unintended consequences" that perpetually operate to limit the designs of would-be social engineers.
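
As a concrete (and entirely hypothetical) sketch of that feedback, consider commuters choosing between two routes, where a forecaster publishes which route will be faster tomorrow and most drivers act on the forecast -- thereby overturning it. Economists will recognize a kinship with the Lucas critique; the model and its parameters here are my own invention:

    N_DAYS = 8
    HEED_RATE = 0.6          # fraction of drivers who follow the forecast

    def travel_time(share):
        """Congestion: travel time grows with the share of traffic on a route."""
        return 20 + 40 * share

    share_a = 0.5            # initial fraction of drivers on route A
    for day in range(1, N_DAYS + 1):
        # Forecast tomorrow's faster route from today's traffic split.
        forecast_a = travel_time(share_a) < travel_time(1 - share_a)

        # Drivers who heed the forecast pile onto the recommended route;
        # the rest repeat yesterday's choice.
        share_a = (1 - HEED_RATE) * share_a + (HEED_RATE if forecast_a else 0)

        actual_a = travel_time(share_a) < travel_time(1 - share_a)
        print(f"day {day}: forecast says {'A' if forecast_a else 'B'}, "
              f"actually faster: {'A' if actual_a else 'B'}")

Run it and the published forecast comes out wrong day after day: announcing the prediction moves the crowd and destroys the conditions the prediction was based on. Only in hindsight does each day's traffic make sense.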

Social science, therefore, faces multiple sources of uncertainty that the physical sciences do not -- in order of increasing severity or intractability, they are:
  • the difficulty of assessing counterfactuals;
  • what Manzi refers to as the "causal density" of social phenomena;
  • the inherent politicization of social questions;
  • and the Cultural Uncertainty Principle: the inescapable and unpredictable effect that social knowledge has on social behavior.
All of these need to be borne in mind whenever we encounter the "studies show" line so often used to justify the technocratic manipulation of social outcomes.
