Sunday, September 2, 2007

Why CIA Experts are no better than the Average Person

A great lecture on Intuition: The Marvels and Flaws by Daniel Kahneman

Some Notes:
How Skills are acquired: To become a chess master:
- 10,000 hours of practice (same to become a good violinist)
- a person would have acquired somewhere between 50,000 and 100,000 discrete configurations of pieces that are meaningful...
- you need fast feedback on success and failure

- We are enormously suggestible… we can be manipulated

- Military and Strategic Experts, when it comes to medium- and long-term predictions, 'cannot predict better than the average reader of the New York Times'

My Index to the Talk: Philip E. Tetlock / Two Systems of Thought: Intuition & Rational Thought / How Doctors Make Decisions / Associative Coherence / Malcolm Gladwell / Theory of the Rational Agent / Framing Effects / Gary A. Klein / Anchoring / How We Think About History / Heuristic Thinking / Chess Master Intuition / Blink / Amos Tversky

Related:
Why Hawks Win by Daniel Kahneman and Jonathan Renshon
Prospect Theory
Bridging the Gaps
The man who wasn't there
Rethinking thinking:
A quick tour of the key observations made by these psychologists would make even Mr Spock’s head spin. For example, people appear to be disproportionately influenced by the fear of feeling regret, and will often pass up even benefits within reach to avoid a small risk of feeling they have failed. They are also prone to cognitive dissonance: holding a belief plainly at odds with the evidence, usually because the belief has been held and cherished for a long time. Psychiatrists sometimes call this “denial”.

And then there is anchoring: people are often overly influenced by outside suggestion. People can be influenced even when they know that the suggestion is not being made by someone who is better informed. In one experiment, volunteers were asked a series of questions whose answers were in percentages—such as what percentage of African countries is in the United Nations? A wheel with numbers from one to 100 was spun in front of them; they were then asked to say whether their answer was higher or lower than the number on the wheel, and then to give their answer. These answers were strongly influenced by the randomly selected, irrelevant number on the wheel. The average guess when the wheel showed 10 was 25%; when it showed 65 it was 45%.
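As a rough illustration of how strong that pull is, here is a small Python sketch. The simple linear anchoring rule, guess = (1 - w) * prior + w * anchor, is my own toy assumption, not anything from the article or the original experiment; only the two (anchor, average guess) pairs come from the figures quoted above.

```python
# Toy linear-anchoring model: guess = (1 - w) * prior + w * anchor.
# The (anchor, average guess) pairs are the figures quoted in the article;
# the linear rule itself is just an illustrative assumption.

anchors = (10, 65)       # numbers shown on the wheel
avg_guesses = (25, 45)   # average answers reported for each anchor (%)

# Solve the two equations for the anchor weight w and the implied prior:
#   25 = (1 - w) * prior + w * 10
#   45 = (1 - w) * prior + w * 65
w = (avg_guesses[1] - avg_guesses[0]) / (anchors[1] - anchors[0])
prior = (avg_guesses[0] - w * anchors[0]) / (1 - w)

print(f"implied anchor weight w   : {w:.2f}")    # ~0.36
print(f"implied unanchored answer : {prior:.1f}%")  # ~33.6%
```

Read this crudely, the irrelevant number on the wheel accounts for roughly a third of the final answer.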

Experiments show that most people apparently also suffer from status quo bias: they are willing to take bigger gambles to maintain the status quo than they would be to acquire it in the first place. In one common experiment, mugs are allocated randomly to some people in a group. Those who have them are asked to name a price to sell their mug; those without one are asked to name a price at which they will buy. Usually, the average sales price is considerably higher than the average offer price.
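One common explanation for the mug result is loss aversion from prospect theory (linked below): owners weigh giving the mug up more heavily than non-owners weigh gaining it. The sketch below is a hedged toy simulation of that idea; the loss-aversion multiplier of 2 and the valuation range are illustrative assumptions, not data from the actual experiments.

```python
import random

random.seed(0)

LOSS_AVERSION = 2.0  # assumed loss/gain weight ratio, a rough prospect-theory-style figure

# Mugs are allocated at random: half the group become owners (sellers),
# half stay non-owners (buyers). Everyone's underlying value for a mug is
# drawn from the same distribution, so any gap between selling and buying
# prices comes purely from weighing the loss of a mug more than the gain.
group = [random.uniform(2.0, 6.0) for _ in range(1000)]
owners, non_owners = group[:500], group[500:]

sell_prices = [LOSS_AVERSION * v for v in owners]  # giving the mug up is coded as a loss
buy_offers = non_owners                            # getting a mug is coded as a gain

avg = lambda xs: sum(xs) / len(xs)
print(f"average selling price: {avg(sell_prices):.2f}")
print(f"average buying offer : {avg(buy_offers):.2f}")  # roughly half the selling price
```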

Expected-utility theory assumes that people look at individual decisions in the context of the big picture. But psychologists have found that, in fact, they tend to compartmentalise, often on superficial grounds. They then make choices about things in one particular mental compartment without taking account of the implications for things in other compartments.

There is also a huge amount of evidence that people are persistently, and irrationally, over-confident. Asked to answer a factual question, then asked to give the probability that their answer was correct, people typically overestimate this probability. This may be due to a representativeness heuristic: a tendency to treat events as representative of some well-known class or pattern. This gives people a sense of familiarity with an event and thus confidence that they have accurately diagnosed it. This can lead people to “see” patterns in data even where there are none. A closely related phenomenon is the availability heuristic: people focus excessive attention on a particular fact or event, rather than the big picture, simply because it is more visible or fresher in their mind.

Another delightfully human habit is magical thinking: attributing to one’s own actions something that had nothing to do with them, and thus assuming that one has a greater influence over events than is actually the case. For instance, an investor who luckily buys a share that goes on to beat the market may become convinced that he is a skilful investor rather than a merely fortunate one. He may also fall prey to quasi-magical thinking—behaving as if he believes his thoughts can influence events, even though he knows that they can’t.

Most people, say psychologists, are also vulnerable to hindsight bias: once something happens, they overestimate the extent to which they could have predicted it. Closely related to this is memory bias: when something happens people often persuade themselves that they actually predicted it, even when they didn’t.

Freud, finance and folly

Prospect Theory: An Analysis of Decision under Risk
by Daniel Kahneman and Amos Tversky
Indignation: Psychology, Politics, Law
"The Church of Economics Has Admitted and Even Rewarded Some Scholars Who Would have been Considered Heretics in Earlier Periods"
Everybody's an Expert
The Power of Intuition
Behaviourists at the gates
Neoclassical Theory Versus Prospect Theory: Evidence from the Marketplace

Behavioral Economics: 2 different streams of thought

Multimedia:
Conversations with History interview (podcast)
Explorations of the Mind: Well-Being
Maps of Bounded Rationality
Paradox of Choice
A discussion with John Smutniak, economics correspondent of The Economist and author of a survey on risk
