Here's what I know for sure

There are three depths of knowing.

  1. Hearsay: You’ve heard of the president. You’ve heard of Mt. Everest.

  2. Introduction: You’ve been introduced to the president. You’ve visited Mt. Everest.

  3. Intimacy: You’re a good friend of the president. You’ve climbed Mt. Everest.

Understanding comes when you wrestle with these questions:

  1. What is the surest thing to you? 

  2. What would be the hardest thing to doubt?

Stephen Goforth 

Hybrid Entrepreneurship

“There’s this myth that you have to go all in on a project or initiative to be successful, when it’s actually better to do a personal real options approach,” says Nathan Furr.

Nathan and Susannah Furr, authors of The Upside of Uncertainty: A Guide to Finding Possibility in the Unknown, were introduced to the concept after interviewing Ben Feringa, recipient of the 2016 Nobel Prize in Chemistry for his work on molecular machines. The Furrs asked Feringa if he faced uncertainty on his road to a scientific breakthrough.

“He laughed and said, ‘It was all uncertainty,’” recalls Nathan Furr.

Feringa told the Furrs that he encourages his students to have at least two projects going, one certain and one uncertain. “Striving for certainty will lead you down false paths or lead you to commit too long to projects that won’t work, or to uninteresting projects that will work,” he explained.

Stephanie Vozza writing in Fast Company

Black Swans

We have a natural tendency to look for instances that confirm our story and our vision of the world.

Seeing white swans does not confirm the nonexistence of black swans. There is an exception, however: I know what statement is wrong, but not necessarily what statement is correct. If I see a black swan I can certify that all swans are not white! If I see someone kill, then I can be practically certain that he is a criminal. If I don’t see him kill, I cannot be certain that he is innocent. The same applies to cancer detection: the finding of a single malignant tumor proves that you have cancer, but the absence of such a finding cannot allow you to say with certainty that you are cancer-free.

We can get closer to the truth by negative instances, not by verification.

Nassim Taleb, The Black Swan

Stomping of the foot (before storming out of class)

I'll never forget the student who charged out of one of my first philosophy classes. The professor had challenged the student's view of religion and the young man stomped his foot, turned red, yelled, and left the room.

Why such an emotional outburst? Perhaps his beliefs were built on a weak foundation. A little rhetoric from an authority figure threatened to topple the structure. When we accept the conclusions of other people, never figuring out the "why" for ourselves, we lay a weak foundation. Should we intentionally avoid opposing viewpoints? It turns out we naturally steer clear of conflict.

Researchers at the University of Illinois at Urbana-Champaign found that the less certain you are about what you believe, the more likely you are to stay away from opposing viewpoints (and to freak out when you run across an opposing opinion). After reviewing nearly 100 studies, they concluded that people tend to minimize their exposure when they are less certain and less confident in their own position. In fact, we're nearly twice as likely to completely avoid differing opinions as we are to give consideration to different ideas. For the close-minded, the percentage jumps even higher: three out of four times, the close-minded person will stick to what supports their own conclusions. Details of the study appear in Psychological Bulletin.

Stephen Goforth

The price of avoiding uncertainty

In order to manage the avalanche of information that our senses are absorbing at all times, our brains work to find patterns, simplify information, and look for clarity. That allows us to be able to make decisions and act. But sometimes in the rush to make order of the world, our brains jump to unwarranted conclusions — taking in the myriad of information around us and deducing something that just isn't quite right.

A high need for closure isn't necessarily a bad thing. You may just be the type of person who likes to make plans and avoid surprises. However, the need for closure can lead to two major pitfalls in decision making, says Holmes.

The first is what psychologists call the "urgency effect," which is basically the tendency to jump to conclusions. The second is the "permanence effect," a stubborn tendency to stick with your beliefs and not change your mind, even in the face of contradictory evidence. Both of these effects result from your brain trying to avoid feelings of uncertainty.

If you have a high need for closure, research suggests you should be careful making decisions, especially in times of fatigue or stress.

Ana Swanson writing in the Washington Post