Certainty

Dealing with the unknown is immensely difficult, so we tend to find meaning in declaring things certain.

We must constantly trust everything around us, but we make decisions to prioritize trusting one thing over another.

We “believe” when we consider something to be reality, even without direct evidence.

  • We all believe (i.e., implicitly trust) we’ll have enough food for our next meal.
  • If you have a job, you believe you’ll have it tomorrow.
  • You believe you will wake up tomorrow morning.

Beliefs are how we settle matters we can’t truly know but feel we must, which is nearly every time we lack direct control over something yet still must make decisions.

We start to believe when we value one thing as true over another thing. As children, we start with 100% certainty about various things, but our beliefs become a spectrum as we gain understanding:

  1. A deterrence-based fear of worse alternatives.
  2. A broad belief that we understand something well enough that it’s very likely true.
  3. Believing the thing enough to rely on it.
  4. Concluding the thing is indisputable.
  5. Identifying close enough with the thing to act boldly and publicly about it, even to the point of confrontation.

A huge portion of what we believe rests on the authority we ground our beliefs in:

  • If an authority is physically present, we can only trust it as far as reality and consequences hold up.
  • Holding to the authority of the ideas themselves is a strictly philosophical justification.
  • We end up believing ideas based on other ideas, which usually goes back to one of our impressions or cultural teaching.

We can only know a very small set of facts without needing any belief:

  1. By thinking at all, you prove that you are a thinking thing.
  2. Pain, in whatever form it takes, is real.
  3. Feelings exist, in whatever form they express as.

Everything else, from the idea of a perfect circle to what we understand behind others’ language, has degrees of uncertainty that we gloss over with habitual expectations.

Both fools and wise people become more certain as they gain understanding. The major difference is the scope of where they claim that certainty. Wise people claim certainty on specific things and expand outward, while fools start with a broad claim that applies to many more things than it should.

We make commitments and predictions about what we are certain of. Those commitments and predictions (and how well we follow through on them) demonstrate our true beliefs, sometimes contrary to what we claim to believe.

  • You expect a noise when you turn the key in your car.
  • The screen on your phone responds when you touch it.
  • People say something back when you say, “hello”.

Bad Information

Belief is a product of a functioning mind, but many things can undermine it: we’re often deceived or under-informed, people betray us, and we miscalculate.

While we misjudge all the time, the self-correcting nature of our minds generally fixes those misunderstandings immediately, often without our even noticing.

Sometimes we’ll place our trust in a bad idea. When that happens, we often get “stuck” on it and substantiate it with a variety of hard-to-disprove reasons.

One of the methods that locks us into wrong beliefs is hypnosis: any repetitive action that reaffirms a subconscious thought. By repeating something over and over (e.g., a mantra), we start believing things as if we understood them, even when we don’t. The more specific the repetition, the more we feel it, and the more potent it becomes.

Generally, uncertainty takes more work to maintain as an understanding, while certainty takes more work to act on.

Decisions

In spite of this shoddy framework, we must still make decisions with the information we have.

We can only trust things that we see as the same as or better than other things. Since we understand ourselves better than we understand anyone else, we use our self-perception as the starting point for practically everything else.

Over our lives, we start creating a purpose-based hierarchy of what we can trust:

  • A person will trust their spouse more than anyone else, but will trust themselves over their spouse on trade-specific knowledge.
  • A lawyer is trustworthy to manage a legal situation, but not necessarily on moral matters.
  • Generally, strangers carry more unknowns than friends do, and their intentions are less clear.

We decide with a conviction proportional to the clarity of our understanding. To actually create results, we must stop thinking about the matter at some point to avoid constantly second-guessing ourselves.

We don’t always remember what we learned or its context, so we trust much of what we do based on past experience:

  • If a chair breaks, it’s an unlikely event, so we find another chair.
  • A mother sometimes fails a child’s expectations, but the child still trusts their mother because of other things that were good.
  • If someone loses a job, they’ll look for another one because they’ve seen other people with jobs that fulfilled their purposes.

Crisis of Faith

Sometimes, we make bad decisions we think are good. We’ll invest a ton of resources in them but won’t get the results we expected. This is largely irrespective of how well we’ve reasoned, and comes mostly from applying past experience where it no longer fits.

We suffer a type of cognitive dissonance where we trust two incompatible things at once. A crisis of faith is when we become aware of that conflict because an expectation failed with the one that we trusted more.

We all tend to carry such inner conflicts, but many people with trauma will mix their trauma into what they believe about others’ lives. Some political systems take advantage of this.

Eventually, a person must decide: accept what they perceive, or believe something is “testing” their faith. Either way, they’re rebuilding their story, either by changing the logical conclusion (aka “changing their mind”) or adding/removing premises.

When our faith is tested, but we still hold on, we call that “hope”. There’s a fine line between sensible hope and blind hope.

If that person starts denying reality to “prove” something, they’re removing premises. Once this becomes habitual, they resort to magical thinking: interpreting the world as what they imagine instead of what it is.

If we choose to listen to the feeling of distrust instead, we’ll disbelieve what we were taught. We enter a type of “agnosis”, living in disbelief about a broad range of related concepts. Sometimes it spans a single subject; other times it consumes everything we’ve ever known.

Since we trust our habits of thought and action, the only way to prevent a crisis of faith from starting is to continually revisit old habits and beliefs.


Application

We must believe things when we’re not fully certain. The volatile nature of life means leaders and artists venture into less certain places than the rest of society. So, influence loosely correlates with the ability to trust.

Everyone, even leaders, is subject to changing their views. This can create large-scale consequences for everyone, especially when a core view changes.

Conviction requires focus, so smarter people (who often see the complexities) are often lousy at sticking to convictions, which can make them terrible leaders.

The only way to prepare our minds for maximum understanding is to open them up to all possibilities. While it’s uncomfortable, all we have to do is convert every inner statement into a question, creating an inner conflict about it.

Hypnosis is all around us (especially through our past trauma, public media, and politics), but we can use it for our purposes. By repeating something specific, we can channel our subconscious (e.g., “I will find a new job by the end of March.”). It’s the secret to most success.

Magical thinking exists in proportion to how self-trusting someone is, which often comes with intelligence. Thus, there are tons of magical thinkers in fields full of intelligent people (politics, academia, computer programming) who don’t acknowledge the elements of reality they can’t know:

  • Many doctors believe the body’s healing processes to be strictly mechanical and often neglect the psychosomatic power we possess through our happiness and beliefs.
  • Most scientists are religiously atheist, but claim to be non-religious.
  • When large groups have a procedure or policy they deem necessary, they’ll become corrupted while trying to maintain it.

Hope is based on a purpose, so if someone feels hopeless, they’re failing to see how a thing can accomplish a purpose. Understanding that purpose is key to knowing whether something is truly hopeless.

To find someone’s actual opinion, ask for their advice on fixing a problem, or ask what they think everyone else believes.

MLMs, gambling, and lotteries appear ubiquitous because people are deceived into expecting tremendous wealth. In reality, people engage in them because they hope for tremendous wealth, and find meaning in a journey that’s almost never actually fulfilled.

While extremists are typically the most outspoken people in a group, they often don’t understand the thing they’re promoting. They’re typically acting out of trauma and magical thinking; devout believers in a thing don’t really need to convince others to feel it’s true. Disagree with them and closely observe what they disagree over.

It’s extremely difficult to change a person’s opinions. The only way to do this is to influence them with a compelling story about an alternative opinion, then wait.