Dealing with the unknown is immensely difficult, so we tend to find meaning in declaring things to be certain.
We must constantly trust everything around us, but we make decisions to prioritize trusting one thing over another.
We “believe” when we consider something to be reality, even without direct evidence.
- We all believe (i.e., implicitly trust) we’ll have enough food for our next meal.
- If you have a job, you believe you’ll have it tomorrow.
- You believe you will wake up tomorrow morning.
Beliefs are how we settle things we can’t truly know but feel we ought to, which covers nearly every case where we have no direct control over something but still must make decisions.
We start to believe when we hold one thing to be true over another. As children, we start with 100% certainty about various things, but our beliefs become a spectrum as we gain understanding:
- A deterrence-based fear of worse alternatives.
- A broad belief that we understand something far enough that it’s very likely.
- Believing the thing enough to rely on it.
- Concluding the thing is indisputable.
- Identifying with the thing closely enough to act boldly and publicly about it, even to the point of confrontation.
A large portion of what we believe comes from the authority we rest our beliefs on:
- If an authority is physically present, we can only trust it as far as reality and consequences hold up.
- Holding to the authority of the ideas themselves is a strictly philosophical justification.
- We end up believing ideas based on other ideas, which usually goes back to one of our impressions or cultural teaching.
We can only know a small set of facts without needing any belief:
- By thinking at all, you prove that you are a thinking thing.
- Pain, in whatever form it takes, is real.
- Feelings exist, in whatever form they take.
Everything else, from the idea of a perfect circle to what we understand from others’ language, has degrees of uncertainty that we gloss over with habitual expectations.
Both fools and wise people become more certain as they gain understanding. The major difference is the scope of where they claim that certainty. Wise people claim certainty on specific things and expand outward, while fools start with a broad claim that applies to many more things than it should.
We make commitments and predictions about what we are certain of. Those commitments and predictions (and how well we follow through with them) demonstrate our true beliefs, regardless of what we claim to believe:
- You expect a noise when you turn the key in your car.
- The screen on your phone responds when you touch it.
- People say something back when you say, “hello”.
Bad Information
Belief is a product of a functioning mind, but many things can betray our trust. We’re often deceived or under-informed, people betray us, and we miscalculate.
While we misjudge all the time, the self-correcting nature of our minds generally fixes those misunderstandings immediately, often without our knowledge.
Occasionally, we’ll place our trust in a bad idea. When that happens, we’ll often get “stuck” on it, and will substantiate it with various hard-to-disprove reasons:
- Conceit or hubris about reputation or knowledge.
- Believing a group’s promises that they can solve our issues.
- Addictions that ravage our ability to think rationally because we over-identify with them.
- Trusting in the unverifiable unknown, such as in many cults and religions.
- Belief in statistics or, on the other end, statistical unlikelihood (i.e., luck).
One of the methods that locks us into wrong beliefs is hypnosis, which is any repetitive action that reinforces a subconscious thought. By repeating something over and over (e.g., a mantra), we start believing things as if we understood them, even when we don’t. The more specific the repetition, the more we can feel it, and the more potent it becomes.
Generally, uncertainty takes more work to maintain as understanding, while certainty takes more work to act on.
Decisions
Despite this shoddy framework, we still must make decisions with the information we have.
We can only trust things that we see as equal to or better than the alternatives. Since we understand ourselves more than anyone else, we use our self-perception as the starting point for practically everything else.
Over our lives, we start creating a purpose-based hierarchy of what we can trust:
- A person will trust their spouse more than anyone else, but will trust themselves over their spouse on trade-specific knowledge.
- A lawyer is trustworthy to manage a legal situation, but not necessarily on moral matters.
- Typically, there’s more we can’t know about strangers than about friends, even when friends’ intentions aren’t entirely clear.
We decide with a conviction proportional to the clarity of our understanding. To actually create results, we must stop thinking about the matter at some point to avoid constantly second-guessing ourselves.
We don’t always remember what we learned and its context, so we trust much of what we do from experience:
- If a chair breaks, we treat it as an unlikely event and simply find another chair.
- A mother sometimes fails a child’s expectations, but the child still trusts their mother because of other things that were good.
- If someone loses a job, they’ll look for another one because they’ve seen other people whose jobs fulfilled their purposes.
Crisis of Faith
Occasionally, we make bad decisions we think are good. We’ll invest a ton of resources toward them, but won’t get the results we were expecting. This happens irrespective of how well we’ve reasoned, and comes mostly from applying past experience to situations it doesn’t fit.
We suffer a type of cognitive dissonance where we trust two incompatible things at once. A crisis of faith is when we become aware of that conflict because an expectation failed with the one that we trusted more.
This conflict tends to stay internal, but many people with trauma will mix that trauma into what they believe about others’ lives. Some political systems take advantage of this.
Eventually, a person must decide: accept what they perceive, or believe something is “testing” their faith. Either way, they’re rebuilding their story, either by changing the logical conclusion (aka “changing their mind”) or adding/removing premises.
When our faith is tested, but we still hold on, we call that “hope”. There’s a fine line between sensible hope and blind hope.
If that person starts denying reality to “prove” something, they’re removing premises. Once this becomes habitual, they will resort to magical thinking, which is interpreting the world to be what they imagine instead of what it is. Magical thinkers tend to share a few traits:
- They trust their preconceived beliefs and their affiliated groups more than their own perceptions.
- They believe reality is dictated more by saying and doing specific things than by cause and effect.
- If something is outside their understanding, it doesn’t exist or must be purged.
- In their conflicts with others, they presume they’re 100% correct, and will rarely shift their thoughts even when they’ve been logically refuted.
If we choose to listen to the feeling of distrust, we’ll disbelieve what we were trained to believe. We’ll enter a type of “agnosis”, where we live in disbelief about a broad range of related concepts. It can sometimes span a single subject, but at other times it can consume everything we’ve ever known.
Since we trust our habits of thought and action, the only way to prevent a crisis of faith from ever starting is to continually revisit old habits and beliefs.
Application
We must believe things when we’re not fully certain. The volatile nature of life means leaders and artists venture into less certain places than the rest of society. So, influence loosely correlates with the ability to trust.
Everyone, even leaders, is subject to changing their views. This can create large-scale consequences for everyone, especially when a core view changes.
Conviction requires focus, so smarter people (who can often see complexities) are frequently lousy at sticking to convictions, which can make them terrible leaders.
The only way we can prepare our minds for maximum understanding is by opening them up to all possibilities. While it’s uncomfortable, all we have to do is convert every inner statement into a question to create an inner conflict about it.
Hypnosis is all around us (especially through our past trauma, public media, and politics), but we can use it for our purposes. By repeating something specific, we can channel our subconscious (e.g., “I will find a new job by the end of March.”). It’s the secret to most success.
Magical thinking exists proportionally to how self-trusting someone is, which frequently comes with intelligence. Thus, there are tons of magical thinkers in fields with intelligent people like politics, academia, and computer programming who won’t acknowledge additional elements of reality that they can’t know:
- Many doctors believe the body’s healing processes to be strictly mechanical and often neglect the psychosomatic power we possess through our happiness and beliefs.
- Most scientists are religiously atheist, but claim to be non-religious.
- When large groups have a procedure or policy they deem necessary, they’ll become corrupted while trying to maintain it.
Hope is based around a purpose, so if someone feels hopeless, they’re failing to see how a thing can accomplish a purpose. Understanding that purpose is key to understanding if something is hopeless.
To find someone’s actual opinion, ask for their advice for resolving a problem, or what they think everyone else believes.
MLMs, gambling, and lotteries appear ubiquitous because people seem to be deceived into expecting tremendous wealth. In reality, people engage in them because they hope for tremendous wealth and find meaning in a journey that’s rarely fulfilled.
While extremists are typically the most outspoken members of a group, they often don’t understand the thing they’re promoting. They’re typically acting out of trauma and magical thinking, whereas devout believers in a thing don’t really need to convince others to feel it’s true. Disagree with them and closely observe what they disagree over.
It’s extremely difficult to change a person’s opinions. The only way to do this is to influence them with a compelling story about an alternative opinion, then wait.