April 24, 2006: Who Are You Calling Biased?



When the wonderful folks at Random House1 asked if I would be willing to do a blog for this website, I had to look the word up.

Don’t worry. I Googled it. I’m not a Luddite who writes books on yellow legal pads. I get email on my cell phone and can install a video card without setting anything on fire. It’s just that my work requires me to spend a lot of time reading, a lot of time writing, and a lot of time talking (I hear my wife in the background yelling “And a lot of time watching Star Trek!”) which hasn’t left me much time to wander around in cyberspace and learn about blogging. Maybe if they had called it something that doesn’t sound so much like throwing up.

Anyway, I’m delighted to be taking my maiden voyage and I welcome you on board. Feel free to unlock your tray table and lean your seat into the knees of the guy in back of you.

I thought I’d start by sharing with you an Op-Ed piece I wrote for the New York Times last Sunday. (I learned this year that the “op” in Op-Ed doesn’t stand for “opinion.” It stands for “opposite,” as in “opposite the editorial page.” Maybe you knew. I didn’t.) In this piece I describe what social psychologists and behavioral economists have discovered in the last decade about how people see and resolve conflicts of interest. Other scholars have already done this quite nicely (Don Moore and George Loewenstein at Carnegie Mellon University have several good papers on this topic), but no one had done it in the pages of the world’s greatest newspaper2, at least not recently, so I took a shot.

I’m O.K., You’re Biased
The New York Times, April 16, 2006

Verizon had a pretty bad year in 2005, but its chief executive did fine. Although Verizon’s earnings dropped by more than 5 percent and its stock fell by more than a quarter, he received a 48 percent increase in salary and compensation. This handsome payout was based on the recommendation of an independent consulting firm that relied on Verizon (and the chief executive’s good will) for much of its revenue. When asked about this conflict of interest, the consulting firm explained that it had “strict policies in place to ensure the independence and objectivity of all our consultants.”

Please stop laughing.

The person who made this statement was almost certainly sincere. Consultants believe they can make objective decisions about the companies that indirectly employ them, just as legislators believe that campaign contributions don’t influence their votes.

Doctors scoff at the notion that gifts from a pharmaceutical company could motivate them to prescribe that company’s drugs, and Supreme Court justices are confident that their legal opinions are not influenced by their financial stake in a defendant’s business, or by their child’s employment at a petitioner’s firm. Vice President Dick Cheney is famously contemptuous of those who suggest that his former company received special consideration for government contracts.

Voters, citizens, patients and taxpayers can barely keep a straight face. They know that consultants and judges are human beings who are pulled by loyalties and pushed by animosities, and that drug reps and lobbyists are human beings who wouldn’t be generous if generosity didn’t pay dividends. Most people have been around people long enough to have a pretty good idea of what drives their decisions, and when decision-makers deny what seems obvious to the rest of us, the rest of us get miffed. Sell our democracy to the highest bidder, but don’t insult our intelligence.

So who’s right—the decision-makers who claim objectivity or the citizens who roll their eyes? Research suggests that decision-makers don’t realize just how easily and often their objectivity is compromised. The human brain knows many tricks that allow it to consider evidence, weigh facts and still reach precisely the conclusion it favors.

When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.

Research suggests that the way we weigh ourselves in the bathroom is the way we weigh evidence outside it. Two psychologists, Peter Ditto and David Lopez, told subjects that they were being tested for a dangerous enzyme deficiency. Subjects placed a drop of saliva on a test strip and waited to see if it turned green. Some subjects were told that the strip would turn green if they had the deficiency, and others were told that the strip would turn green if they did not. In fact, the strip was just an ordinary piece of paper that never changed color.

So how long did subjects stare at the strip before accepting its conclusion? Those who were hoping to see the strip turn green waited a lot longer than those who were hoping not to. Good news may travel slowly, but people are willing to wait for it to arrive.

The same researchers asked subjects to evaluate a student’s intelligence by examining information about him one piece at a time. The information was quite damning, and subjects were told they could stop examining it as soon as they’d reached a firm conclusion. Results showed that when subjects liked the student they were evaluating, they turned over one card after another, searching for the one piece of information that might allow them to say something nice about him. But when they disliked the student, they turned over a few cards, shrugged and called it a day.

Much of what happens in the brain is not evident to the brain itself, and thus people are better at playing these sorts of tricks on themselves than at catching themselves in the act. People realize that humans deceive themselves, of course, but they don’t seem to realize that they too are human.

A Princeton University research team asked people to estimate how susceptible they and “the average person” were to a long list of judgmental biases; the majority of people claimed to be less biased than the majority of people. A 2001 study of medical residents found that 84 percent thought that their colleagues were influenced by gifts from pharmaceutical companies, but only 16 percent thought that they were similarly influenced. Dozens of studies have shown that when people try to overcome their judgmental biases—for example, when they are given information and told not to let it influence their judgment—they simply can’t comply, even when money is at stake.

And yet, if decision-makers are more biased than they realize, they are less biased than the rest of us suspect. Research shows that while people underestimate the influence of self-interest on their own judgments and decisions, they overestimate its influence on others.

For instance, two psychologists, Dale Miller and Rebecca Ratner, asked people to predict how many others would agree to give blood for free or for $15, and people predicted that the monetary incentive would double the rate of blood donation. But when the researchers actually asked people to give blood, they found that people were just as willing to do it for nothing as they were for a $15 reward.

The same researchers measured people’s attitudes toward smoking bans and asked them to guess the attitudes of others. They found that smokers vastly overestimated the support of nonsmokers for the bans, as did nonsmokers the opposition of smokers to the bans—in other words, neither group was quite as self-interested as the other group believed.

Behavioral economics bolsters psychology’s case. When subjects play laboratory games that allow them to walk away with cash, self-interest dictates that they should get all the cash they can carry. But scores of experiments show that subjects are willing to forgo cash in order to play nice.

For instance, when subjects are given a sum of money and told that they can split it with an unseen stranger in any proportion they like, they typically give the stranger a third or more, even though they could just as easily have given him nothing. When subjects play the opposite role and are made the recipients of such splits, they typically refuse any split they consider grossly unfair, preferring to walk away with nothing than to accept an unjust distribution.

In a recent study, the economists Ernst Fehr and Simon Gächter had subjects play a game in which members of a team could earn money when everyone pitched in. They found that subjects were willing to spend their money just to make sure freeloaders on the team didn’t earn any. Studies such as these suggest that people act in their own interests, but that their interests include ideals of fairness, prudence and generosity.

In short, doctors, judges, consultants and vice presidents strive for truth more often than we realize, and miss that mark more often than they realize. Because the brain cannot see itself fooling itself, the only reliable method for avoiding bias is to avoid the situations that produce it.

When doctors refuse to accept gifts from those who supply drugs to their patients, when justices refuse to hear cases involving those with whom they share familial ties and when chief executives refuse to let their compensation be determined by those beholden to them, then everyone sleeps well.

Until then, behavioral scientists have plenty to study.

Now, when you publish anything in the world’s greatest newspaper3 you get a lot of email, and my email this week has been of three kinds. First, behavioral scientists have written to thank me for telling the public about the kind of work we do. Second, the public has written to ask me how anyone as ignorant as I am could possibly be allowed to (a) do research, (b) teach at Harvard, (c) write for the New York Times, and (d) live. For example, this week I learned from a fellow named Dave O. that psychologists are wasting the taxpayers’ dollars by studying what people think when, as it turns out, Dave already knows what people think. Apparently, we should just ask Dave. Dave adds for good measure: “Your column literally reeks with a liberal political bias. Cheney, Judges and Businessmen—should I do a study to find out?”

The last time I wrote an Op-Ed was on the occasion of George Bush’s inauguration, and liberals quickly filled my inbox with hate-mail, accusing me of being a shameless apologist for the vast right wing conspiracy. This time it was the conservatives who filled my inbox, accusing me of being an unpatriotic freeloader who derogates the government while taking its grant money. All I can say is that being booed from both sides of the aisle leaves me feeling remarkably fair and balanced.

Oh yes, the third email. The third email was a cordial note from Mr. Peter Thonis, who is the Chief Communications Officer at Verizon. He wanted me to know that the CEO of his company did not actually earn as much money as I’d said (or as the New York Times had reported a few days earlier). I replied by thanking Mr. Thonis for his email and explaining that while I appreciated being corrected on these facts, the precise amount of money the CEO earned wasn’t exactly the point I’d been making. My point was that the CEO’s compensation (whatever it was) had been determined by an independent consulting firm that wasn’t really independent.

I never heard back from Verizon, but maybe they didn’t get my message. Can you hear me now?

For those interested in learning more about the research described here, you can read:

• Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions. Journal of Personality and Social Psychology, 63, 568-584.

• Pronin, E., Gilovich, T., & Ross, L. (2004). Objectivity in the eye of the beholder: Divergent perceptions of bias in self versus others. Psychological Review, 111, 781-799.

• Miller, D. T., & Ratner, R. K. (1998). The disparity between the actual and assumed power of self-interest. Journal of Personality and Social Psychology, 74, 53-62.



Posted by Dan Gilbert on April 24, 2006


1This line originally read “When my slave-driving publishers…,” but they monitor every word I write and they edit it before posting. However, they don’t seem to read these footnotes, so please look down here for the truth. Everything else you read on this page is sheer propaganda. And you better smile and nod while you’re reading because if you are connected to the internet they can see you through your computer. I’m telling you, man, these guys own the world.

2 Right. Sure. Like I really wrote that line. Give me a break. The newspaper guys and the book guys are obviously all the same guys. They might all be the same guy for all I know.

3 What kills me is that they really think you are falling for this.
