Wednesday, March 9, 2011

Basic Information About How the Brain Works and How Easy It Is to Manipulate Us All....



Because of bugs in the way our brains work, it is easy to manipulate us....

For example, social proof is the well-known principle stating that many people will believe something if most other people believe it.

In other words, we have a herd instinct. So if marketers, advertisers or propagandists convince us that most people believe something (even if they actually don't), we'll tend to believe it too.
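
To see the mechanism in miniature, here is a toy sketch in Python. Every number in it - the population size, the share of genuine believers, the inflated "reported" majority and the conformity thresholds - is invented purely for illustration and does not come from any study.

    import random

    def simulate_social_proof(n_agents=100, real_believers=10,
                              claimed_share=0.6, rounds=20, seed=42):
        """Toy model: an agent adopts a belief once the share of believers it
        *perceives* exceeds its personal conformity threshold. Marketers or
        propagandists can inflate the perceived share simply by claiming that
        'most people' already believe."""
        rng = random.Random(seed)
        believes = [i < real_believers for i in range(n_agents)]
        # Each agent requires a different perceived majority before conforming.
        thresholds = [rng.uniform(0.3, 0.7) for _ in range(n_agents)]
        for _ in range(rounds):
            actual_share = sum(believes) / n_agents
            perceived_share = max(actual_share, claimed_share)  # the inflated claim
            believes = [b or perceived_share > t
                        for b, t in zip(believes, thresholds)]
        return sum(believes) / n_agents

    # Only 10% genuinely believe at the start, but the claimed 60% majority
    # ends up tipping almost the entire simulated population.
    print(f"Final share of believers: {simulate_social_proof():.0%}")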

Psychiatrists and behavioral economists tell us that the same is true of investing: our brains are hardwired to "follow the herd" of other investors. And since most investors who play the markets lose money, following the herd is exactly what you don't want to do.

A study by Barber and Odean (Journal of Finance, 2000) found that the average household with an account at a large discount brokerage underperformed an "efficient market" benchmark by roughly 15 basis points per month, and that is based on gross returns, before expenses. The study also concluded that the average individual investor would have been better off not trading at all.
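
For a sense of scale, here is a quick back-of-the-envelope calculation (a Python sketch; the $100,000 balance and the ten-year horizon are arbitrary assumptions, and market returns are ignored for simplicity) showing how a 15-basis-point monthly drag compounds:

    # Rough arithmetic on the underperformance figure cited above.
    # 15 basis points = 0.15% = 0.0015 per month (assumed constant for simplicity).
    monthly_drag = 0.0015

    # Compounded over twelve months, the shortfall versus the benchmark:
    annual_drag = 1 - (1 - monthly_drag) ** 12
    print(f"Annual underperformance: {annual_drag:.2%}")  # roughly 1.8% per year

    # Applied to a hypothetical $100,000 portfolio for ten years (ignoring
    # market returns entirely), the drag alone erodes:
    remaining = 100_000 * (1 - monthly_drag) ** (12 * 10)
    print(f"Lost to the drag over ten years: ${100_000 - remaining:,.0f}")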

Indeed, many savvy traders speak of "trading against the dumb money", meaning trading against the majority of people who follow herd instincts and lose money.

Why is the herd so bad at investing? Well, as Paul Farrell wrote last year:

In Mean Markets & Lizard Brains, former Goldman trader Terry Burnham says our primitive [or "lizard"] brain was designed to help our ancestors hunt for food, daily survival stuff.

But “by its very nature, investing requires us to be forward looking, to anticipate events. Our lizard brains, however, are designed to look backward. Thus, the lizard brain causes us to be optimistic at market peaks (after rises) and to be pessimistic at market bottoms (after falls).” So whether it’s optimism or pessimism, greed or fear, your emotions do your investing, not reasoning and logic—and you can’t trust them.

***

Burnham’s summary: “We need to precisely restrain our instincts in order to make money. Unlike neural games of chance, or ancestral problems like gathering and hunting, financial success means suppressing our ‘gut’ instincts.”

Top investors like Kyle Bass like to quote Charles Mackay on this point:

Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one.

Speaking of senses, Muzafer Sherif, one of the founders of the field of social psychology, showed that the opinions of one's group affect one's own physical sensations. As Wikipedia notes:

In an otherwise totally dark room, a small dot of light is shown on a wall, and after a few moments, the dot appears to move. This effect is entirely inside-the-head, and results from the complete lack of "frame of reference" for the movement. Three participants enter the dark room, and watch the light. It appears to move, and the participants are asked to estimate how far the dot of light moves. These estimates are made out loud, and with repeated trials, each group of three converges on an estimate. Some groups converged on a high estimate, some low, and some in-between. The critical finding is that groups found their own level, their own "social norm" of perception. This occurred naturally, without discussion or prompting.

When invited back individually a week later and tested alone in the dark room, participants replicated their original groups' estimates. This suggests that the influence of the group was informational rather than coercive; because they continued to perceive individually what they had perceived as members of a group, Sherif concluded that they had internalized their original group's way of seeing the world. Because the phenomenon of the autokinetic effect is entirely a product of a person's own perceptual system, this study is evidence of how the social world pierces the person's skin, and affects the way they understand their own physical and psychological sensations.
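
To get a feel for how those group norms emerge, here is a toy simulation in Python. The parameters - three participants, starting guesses between one and ten inches, and a 50% pull toward the group average on each trial - are invented for illustration; they are not Sherif's actual procedure or data.

    import random

    def sherif_group(n_participants=3, n_trials=10, pull=0.5, seed=None):
        """Toy model of norm formation in the autokinetic experiment: each
        participant announces an estimate (in inches) of how far the light
        'moved', then shifts partway toward the group average every trial."""
        rng = random.Random(seed)
        estimates = [rng.uniform(1, 10) for _ in range(n_participants)]
        for _ in range(n_trials):
            group_mean = sum(estimates) / n_participants
            # Each person drifts toward the announced group average.
            estimates = [e + pull * (group_mean - e) for e in estimates]
        return [round(e, 2) for e in estimates]

    # Different starting guesses settle on different stable norms - some high,
    # some low, some in-between - just as the passage above describes.
    for seed in (1, 2, 3):
        print(f"group {seed}: final estimates {sherif_group(seed=seed)}")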

And fear makes people stupid and prone to bend over backwards to justify the actions of the powerful:

Sociologists from four major research institutions investigated why so many Americans believed that Saddam Hussein was behind 9/11, years after it became obvious that Iraq had nothing to do with 9/11....

The researchers found, as described in an article in the journal Sociological Inquiry (and re-printed by Newsweek):

  • Many Americans felt an urgent need to seek justification for a war already in progress
  • Rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.
  • "For the most part people completely ignore contrary information."
  • "The study demonstrates voters' ability to develop elaborate rationalizations based on faulty information"
  • People get deeply attached to their beliefs, and form emotional attachments that get wrapped up in their personal identity and sense of morality, irrespective of the facts of the matter.
  • "We refer to this as 'inferred justification, because for these voters, the sheer fact that we were engaged in war led to a post-hoc search for a justification for that war.
  • "People were basically making up justifications for the fact that we were at war"
  • "They wanted to believe in the link [between 9/11 and Iraq] because it helped them make sense of a current reality. So voters' ability to develop elaborate rationalizations based on faulty information, whether we think that is good or bad for democratic practice, does at least demonstrate an impressive form of creativity.

An AlterNet article yesterday, discussing the Sociological Inquiry study, helps us understand that fear is the key to people's active search for excuses for the actions of the big boys:

Subjects were presented during one-on-one interviews with a newspaper clip of this Bush quote: "This administration never said that the 9/11 attacks were orchestrated between Saddam and al-Qaeda."

The Sept. 11 Commission, too, found no such link, the subjects were told.

"Well, I bet they say that the commission didn't have any proof of it," one subject responded, "but I guess we still can have our opinions and feel that way even though they say that."

Reasoned another: "Saddam, I can't judge if he did what he's being accused of, but if Bush thinks he did it, then he did it."

Others declined to engage the information at all. Most curious to the researchers were the respondents who reasoned that Saddam must have been connected to Sept. 11, because why else would the Bush Administration have gone to war in Iraq?

The desire to believe this was more powerful, according to the researchers, than any active campaign to plant the idea.

Such a campaign did exist in the run-up to the war...

He [one of the study's authors] won't credit [politicians spouting misinformation] alone for the phenomenon, though.

"That kind of puts the idea out there, but what people then do with the idea ... " he said. "Our argument is that people aren't just empty vessels. You don't just sort of open up their brains and dump false information in and they regurgitate it. They're actually active processing cognitive agents"...

The alternate explanation raises queasy questions for the rest of society.

"I think we'd all like to believe that when people come across disconfirming evidence, what they tend to do is to update their opinions," said Andrew Perrin, an associate professor at UNC and another author of the study...

"The implications for how democracy works are quite profound, there's no question in my mind about that," Perrin said. "What it means is that we have to think about the emotional states in which citizens find themselves that then lead them to reason and deliberate in particular ways."

Evidence suggests people are more likely to pay attention to facts within certain emotional states and social situations. Some may never change their minds. For others, policy-makers could better identify those states, for example minimizing the fear that often clouds a person's ability to assess facts ...

The AlterNet article links to a must-read interview with psychology professor Sheldon Solomon, who explains:

A large body of evidence shows that momentarily [raising fear of death], typically by asking people to think about themselves dying, intensifies people's strivings to protect and bolster aspects of their worldviews, and to bolster their self-esteem. The most common finding is that [fear of death] increases positive reactions to those who share cherished aspects of one's cultural worldview, and negative reactions toward those who violate cherished cultural values or are merely different.

And once people form a belief, it can be almost impossible to get them to change it ... even when confronted with contradictory information.

As NPR noted last July:

New research suggests that misinformed people rarely change their minds when presented with the facts — and often become even more attached to their beliefs.

***

A new body of research out of the University of Michigan suggests ... that we base our opinions on beliefs and when presented with contradictory facts, we adhere to our original belief even more strongly.

The phenomenon is called backfire, and it plays an especially important role in how we shape and solidify our beliefs on immigration, the president's place of birth, welfare and other highly partisan issues.

***

It's threatening to us to admit that things we believe are wrong. And all of us, liberals and conservatives, you know, have some beliefs that aren't true, and when we find that out, you know, it's threatening to our beliefs and ourselves.

***

This isn't a question of education, necessarily, or sophistication. It's really about preserving the belief that we initially held.

As investors, consumers and citizens, we have no chance - and will become easy prey for those trying to sell us snake oil - unless we arm ourselves with a basic understanding of how our brains work....