unduly influenced by outside suggestion

Referees favour home teams in judgment calls, particularly those that happen at a crucial stage in a game. If a batter chooses not to swing at a baseball pitch, the pitch is more likely to be called a strike if the home team is pitching. This tendency is most extreme in close games. In soccer, referees are more likely to award penalties to the home team and to hand out fewer punishments to home players for offences.

Are referees deliberately biased? The authors (of Scorecasting) think not. Instead, they blame the fact that referees, like the rest of us, tend subconsciously to rely on crowdsourcing, picking up on the mood of the crowd when making their decisions.

“Anchoring” is the name economists give to people’s tendency to be unduly influenced by outside suggestion. Take away the crowd and the home bias shrinks, as it did a few years back when 21 Italian soccer matches were played without supporters following incidents of crowd violence. In these games the home bias declined by 23% on fouls called, by 26% for yellow cards and by a remarkable 70% for red cards, which remove a player from the game and have a particularly big impact on the result.

From "The Referee's an Anchor" in The Economist

Proactive Language

Reactive Language → Proactive Language
There’s nothing I can do. → Let’s look at our alternatives.
That’s just the way I am. → I can choose a different approach.
He makes me so mad. → I control my own feelings.
They won’t allow that. → I can create an effective presentation.
I have to do that. → I will choose an appropriate response.
I can’t. → I choose.
I must. → I prefer.
If only. → I will.

A serious problem with reactive language is that it becomes a self-fulfilling prophecy. People become reinforced in the paradigm that they are determined, and they produce evidence to support the belief. They feel out of control, not in charge of their life or their destiny. They blame outside forces - other people, circumstances, even the stars - for their own situation.

Stephen Covey, The Seven Habits of Highly Effective People

Seeking the Best is a Trap

We have this sense that there is an objective best, and in virtually no area of life is that true. It’s not even that, “Well, there’s the best for me, and then there’s the best for you.” It isn’t even clear that there is a best for me. There’s a whole set of things that are probably more or less equivalent.

If you have this mindset that says, “I have to get the best,” it’s so hard to figure out what that is that you end up looking in panic around you at what other people are choosing as a way to help you figure out what is the best. I think it’s partly because they are struggling to define the best, and they can’t do it on their own, so they’re madly checking out other people’s decisions as a way of figuring out what really is the best. It’s extremely destructive.  

Barry Schwartz quoted in Vox

The green fig tree

I saw my life branching out before me like the green fig tree in the story. From the tip of every branch, like a fat purple fig, a wonderful future beckoned and winked. One fig was a husband and a happy home and children, and another fig was a famous poet and another fig was a brilliant professor, and another fig was Ee Gee, the amazing editor, and another fig was Europe and Africa and South America, and another fig was Constantin and Socrates and Attila and a pack of other lovers with queer names and offbeat professions, and another fig was an Olympic lady crew champion, and beyond and above these figs were many more figs I couldn't quite make out.

I saw myself sitting in the crotch of this fig tree, starving to death, just because I couldn't make up my mind which of the figs I would choose. I wanted each and every one of them, but choosing one meant losing all the rest, and, as I sat there, unable to decide, the figs began to wrinkle and go black, and, one by one, they plopped to the ground at my feet.

Sylvia Plath, The Bell Jar

why Facebook survived

While Facebook was just getting on its feet in 2004, a similar social network called Campus Network (or CU Community) was ahead of it and more advanced. Slate explains why only one survived.

Why did Facebook succeed where Campus Network failed? The simplest explanation is, well, its simplicity. Yes, Campus Network had advanced features that Facebook was missing. But while Campus Network blitzed first-time users with everything at once, Facebook rolled out its features incrementally. Facebook respected the Web's learning curve.

Campus Network did too much too soon. Neither site, of course, can claim to be the first social network—Friendster and MySpace already had large followings in 2003. But both Facebook and Campus Network had the crucial insight that overlaying a virtual community on top of an existing community—a college campus—would cement users' trust and loyalty. Campus Network figured it out first. Facebook just executed it better.

While people want to make their own choices, research shows that too many options create problems. We become overwhelmed. There is no substitute for simplicity and clarity. Whether on purpose or by accident, Facebook was built by asking what users would do with the site rather than by showing off what its creators could do. Only one of those approaches shows respect for the audience.

Stephen Goforth

Throwing Good Money after Bad

Imagine a company that has already spent $50 million on a project. The project is now behind schedule and the forecasts of its ultimate returns are less favorable than at the initial planning stage. An additional investment of $60 million is required to give the project a chance. An alternative proposal is to invest the same amount in a new project that currently looks likely to bring higher returns. What will the company do? All too often a company afflicted by sunk costs drives into the blizzard, throwing good money after bad rather than accepting the humiliation of closing the account of a costly failure.

(This) fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects. I have often observed young scientists struggling to salvage a doomed project when they would be better advised to drop it and start a new one. Fortunately, research suggests that at least in some contexts the fallacy can be overcome. (It) is taught as a mistake in both economics and business courses, apparently to good effect: there is evidence that graduate students in these fields are more willing than others to walk away from a failing project.

Daniel Kahneman, Thinking, Fast and Slow

a better guide to future success

Psychologist Gerd Gigerenzer argues that much of our behaviour is based on deceptively sophisticated rules-of-thumb, or “heuristics”. A robot programmed to chase and catch a ball would need to compute a series of complex differential equations to track the ball’s trajectory. But baseball players do so by instinctively following simple rules: run in the right general direction, and adjust your speed to keep a constant angle between eye and ball.

To make good decisions in a complex world, Gigerenzer says, you have to be skilled at ignoring information. He found that a portfolio of stocks picked by people he interviewed in the street did better than one chosen by experts. The pedestrians were using the “recognition heuristic”: they picked companies they’d heard of, which was a better guide to future success than any analysis of price-earnings ratios.

Ian Leslie writing in The Economist

Behind Door #3

Remember the old television show Let’s Make a Deal? Monty Hall would give contestants, typically dressed in outrageous costumes, a choice of three doors. The contestant would receive whatever was behind the door they selected. One of the doors had a great prize behind it. Pick that door and you get a valuable gift like a car or a vacation. But behind the other two doors were gag gifts. It might be a rooster or a lifetime supply of paper clips.

There was always one extra twist to the show: once you picked a door, before revealing what was behind it, Monty would do you the favor of opening one of the remaining two doors to show one of the gag gifts. At that point, he'd let you switch doors if you wanted to, or you could stick with your original choice.

What's the right move? Our instinct tells us to stick to our guns. But you should go against that instinct and switch. Why? The chance that your first pick was the right door is only one in three, and Monty opening a gag door doesn't change that. So the other unopened door hides the great prize two times out of three, and switching doubles your odds of winning.
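A quick simulation bears this out. The sketch below is not from the show or from Chen's work; it simply plays the game many times under the rules described above, and the door numbering, the play helper, and the trial count are arbitrary illustrative choices. Staying wins about a third of the time, switching about two-thirds.

import random

def play(switch):
    """Play one round of the three-door game; return True if you win the prize."""
    prize = random.randrange(3)   # the door hiding the car
    pick = random.randrange(3)    # the contestant's first guess
    # Monty opens a door that is neither the contestant's pick nor the prize door.
    opened = random.choice([d for d in range(3) if d != pick and d != prize])
    if switch:
        # Move to the one door that is neither the original pick nor the opened door.
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

trials = 100_000
stay = sum(play(switch=False) for _ in range(trials)) / trials
swap = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stay wins:   {stay:.3f}")   # hovers around 1/3
print(f"switch wins: {swap:.3f}")   # hovers around 2/3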

But there’s more afoot here than just winning a prize on a TV game show.

Economist M. Keith Chen says this phenomenon has been overlooked in some of the most famous psychology experiments. He claims the Monty Hall Problem exposes a logical flaw in the idea of choice rationalization: the notion that once we reject something, we tell ourselves we never liked it anyway. Psychologists say we do this because it spares us the pain of thinking we made the wrong choice. Chen believes it's not the act of picking that makes people suddenly prefer one option over another. He claims the preference was there all along; it was simply so slight that it wasn't obvious until the other possibilities were cleared away. You can read his own explanation here.

Stephen Goforth