Experts for life
A reader, as well as Roger Simon, alerted me to this article by Victor Davis Hanson. It makes some excellent points, as usual, and is well-written, as usual. If you're not familiar with his work, you might want to put him on your reading list. The article is about the failure of many "experts"--on both sides--to predict recent events.
One thing I don't understand: is being an expert like being "President for life"? That is, once you are anointed, appointed, elected, whatever, to "expert" status, is there nothing that can impeach you? Like, maybe, being wrong; like, maybe, being wrong over and over and over again?
I have noticed that a bad track record on analyzing or predicting events is brushed over and ignored, and that experts keep on prognosticating and experticizing (yes, I know it's not a word, but I think it should be). They are rarely, if ever, called on it--that would probably make the experts angry, and would reduce the field from which the cable news stations can draw. And then what would they do?
It reminds me of a related question I've often wondered about: does anyone check up on psychics? How many of their yearly predictions actually come true--the ones that aren't totally vague, that is? Perhaps people just don't want to know--it's a lot more fun to believe. And a lot more lucrative for the psychics.
5 Comments:
I am sceptical of psychics and I wonder how accurate their predictions are over a period of years. It has always amazed me that intelligent people can put their faith in people who so often turn out to be charlatans. Montel Williams has Sylvia Browne. Can Oprah be far behind? :)
An expert is someone who happens to know more than the reporter on a given subject and has a PhD. It doesn't matter what subject the degree is in. A psychic is the same thing, without the degree.
A couple of years ago there was an academic conference on the role of intellectuals after 9/11, at which the most useful thing said was the English professor Sander Gilman's quip that it might be a good thing if intellectuals played less of a role at this juncture than they did in earlier crises. Gilman argued that intellectuals had mostly gotten things wrong in the course of the twentieth century and that therefore we ought to reconsider the value of intellectualism or theoretical expertise, as opposed to an older ideal of practical intelligence.
As an academic myself, my study of British intellectual culture in the late 19th and early 20th centuries has led me to a newfound modesty about expertise that I think it would be good for more intellectuals to have. Most of the best-regarded intellectual figures in late-Victorian and Edwardian Britain got most things wrong most of the time. Their intellectual batting averages weren't worse than those of most people in most times and places; it's just that, generally speaking, it's hard to be right about most things most of the time in the way that intellectuals claim to be.
What would be salutary is more accountability on intellectuals' parts for the claims they make when those claims are proven wrong by the emergence of contrary evidence.
That has now happened with regard to some of the claims made by experts in the Bush administration about such questions as Iraq's possession of chemical weapons. All to the better.
What clearly hasn't happened is any real acknowledgement by critics of the Bush administration of the many ways in which they have been wrong since 9/11.
A representative example would be Noam Chomsky's claim that the intervention in Afghanistan would result in three million casualties -- a false contention which hasn't resulted in any loss of faith in Chomsky on the part of those invested in the notion of his expertise. All to the worse.
In the end, it's best to take intellectuals' predictions with a grain of salt, since they're usually wrong. A helpful test in assessing experts' credibility might be to ask whether, or how much, money one would bet on things turning out as they predict.
Does this mean Michael Moore is being knocked off the pedestal of expertise?
"A helpful test in assessing experts' credibility might be to ask whether or how much money one would bet on things turning out as they predict."
This is actually not far from the Pentagon's much-maligned terrorism betting-pool idea of 2003. Though that particular plan was unfortunate because of its subject matter (betting on assassinations and terrorist attacks was never going to be politically feasible, and they should have known that), the basic idea of using markets to harness group knowledge is, I believe, still quite promising, and offers a new and relatively untested alternative to expert knowledge. This guy has offered a little introduction.