Tuesday, May 04, 2021
unbalanced
I exited slowly from religious belief (though it was common to call your emotional connection to God a "relationship, not a religion"). A major reason I didn't move any faster was that it took time to accept that my entire perspective needed an overhaul. Obviously, year after year, I either learned about or experienced flaws in my former beliefs. But my rigid perspective acted as a comforting frame around the flaws. It was as if my brain automatically placed each flaw under a protective glass dome and calmly set it aside, like a collector of insect specimens. A flaw wasn't treated as anything more than a harmless curiosity to ponder for a little while, perhaps to feebly show that followers don't dodge the hard questions. It wasn't permitted to genuinely disturb one's bedrock assumptions. To the contrary, it was an opportunity to further solidify assumptions by coming up with a rationalization...and of course the rationalization could be crude. It didn't need to be able to persuade anyone else.
As problematic as this mode of thinking is, some clever people who are stuck in it may offer up a reasonable-sounding justification. "I see the flaws you've mentioned. I don't claim that those are nonexistent or easy to explain. However, I have reasons of my own for why I do follow my beliefs. These two collections of clues are in competition. And that just means that each of us is balanced between thinking that my beliefs are accurate and thinking that they're inaccurate. The fragile balance is like a pencil standing on its eraser or unsharpened end. It could tip either direction. I fall one way, and I follow my beliefs. You fall the other way, and you don't. Both you and I can present evidence for our stances, and the evidence we present is more than enough for each of us to be satisfied with opposite conclusions. The only difference is a choice that each of us makes about which collection of evidence speaks to us more as individuals."
This approach is far more appealing than insults. It's tactful. It doesn't shut down conversation immediately. It encourages mutual tolerance, because no side's evidence is said to be superior. It portrays the sides more like people with differing preferences. By doing so it echoes the common-sense advice that attacking anyone for having tastes different from yours is mean and unsophisticated (yet typical in internet venues).
Unfortunately, it has a shortcoming that's hard to overlook: it's a deception. The reasons and the flaws that anyone happens to count as "evidence" aren't all on equal footing. A supposed clue, whether it's a reason to accept an idea or a flaw in it, must do more than exist. It must be weighed in order to see how much it deserves to affect the total balance. This weighing refers to asking essential questions about the clue and facing the answers fearlessly:
- How was it obtained? A story about an event in someone's life is vulnerable at first to incorrect interpretations of sensations, then vulnerable to revisionist human memory, and finally vulnerable to the gap between the storyteller's intended meaning and the listener's understanding. Another source of clues is what someone "feels" to be right, but the motivations underneath this feeling need to be examined. It's very possible that the clue is really an expression of the feeler's deep wishes, thoughtless instincts, or narrow preconceptions. The thoughts that make people relax or make them queasy vary greatly between cultures and time periods. Statements about how things are "meant to be" are actually prone to widespread disagreement and evolution.
- How is its accuracy determined? Until a statement has been checked and supported, its level of accuracy cannot be assumed. After all, it takes extremely little effort to spit out an inaccurate statement. The expectation should be that accurate statements are rarer, and any given statement is more likely to belong in the inaccurate pile. And the clue's accuracy could have limits because the methods for obtaining the clue have limits. Nobody should ever be penalized for forthrightly stating that a clue's accuracy isn't absolute!
- What is the context it came from? A clue is less convincing if it emerged from someone whose consciousness was in an abnormal state. Brains in abnormal states are known to produce total fantasies and display impaired judgment.
- Is it self-consistent? A clue is less trustworthy when it can't be reliably reproduced, or is frequently found among facts that throw doubt upon it ("just...overlook all the times when that other thing happens!"), or even contains a logical contradiction.
- Is it plausible? A clue requires a greater amount of support if it demands a perfect coincidence, a sequence of improbable events, or a thinly stretched thread of arguments that have few observable connections to, well, anything. For example, explaining a dubious claim's lack of effect on reality with a dubious excuse doesn't inspire confidence.
- Is it distinctive? A clue doesn't speak for itself. The more effort someone needs to put in to clarify how a clue can be viewed as a significant contribution to their side, the weaker the clue. Or, if it could easily be twisted to support multiple points of view, then it's not an excellent contribution to any single one. That said, ambiguity isn't necessarily avoidable—we live in complex realities where ambiguities abound.
- Is the source credible? Someone earns more consideration if they avoid sloppy generalizations, go into gory details about the work they did to find the clue, and have shown that they value truth more than "winning". If they have a habit of pompously spewing whatever fiction they have a "gut feeling" about or whatever statement will benefit them if a sufficiently large group eats it up, then it's best to cover one's ears and eyes and sprint away. Communicating with such a person is futile.
- Does it clash with clues that carry a lot of weight? Real clues coexist in harmony with other real clues. When an idea is well-verified, it should prompt questions about the clues that don't fit with it. When one fresh clue disagrees with an idea that's been confirmed repeatedly, the idea isn't the one in immediate danger of being thrown out; the clue is.
After the clues have been sorted by "weight", the perception that the sides are well-balanced quickly falls apart. Instead it's easier to realize that each side's ability to endlessly suggest clues doesn't lead to an evenly matched contest. Quality matters. A little mound of many wafer-thin clues isn't enough when the clues on the other side are much more substantial. In fact the outcome is a pronounced tilt, not a balance.
This shift to analyzing clues more closely happened at larger scales than individuals' philosophies. It took place in a variety of subjects and had massive effects. Theories about the motions of physical objects were tied to measured experiments and mathematics, not creative philosophizing. Chemical reactions were openly shared instead of performed in hidden rooms and written in arcane books. Historians distinguished between primary and secondary sources and remembered that people are often biased or enjoy telling tales that have been...dramatized. Medical treatments were thoroughly vetted in large trials. Journalists adopted standards instead of spreading unverified rumors.
At the same time, some people don't appreciate the value of this weighing. A tilt toward one side might not impress such a person at all. They may blatantly choose to override the tilt and follow the beliefs they want. That's not what I did after I learned to weigh clues honestly. However, I'm grateful when anyone at least admits it rather than bluffing that their beliefs are backed by equivalent evidence.
Postscript: There's a position on the materialistic naturalism side that also appears to assert that evidence itself hasn't resolved the debate. "I haven't yet been presented with sufficient reasons to think that there's a supernatural realm. Meanwhile, as a general principle it's impossible to prove that anything definitely doesn't exist. Therefore, I still need to be alert for the required evidence if it does arrive. Until then, I'll be neutral, which in practice means that I won't prematurely speculate that there's a supernatural realm."
This second case doesn't arouse the same antagonism from me because it has a crucial difference. It assigns the burden of proof appropriately. It recognizes that, echoing the comments above about accuracy, falsehoods vastly outnumber truths. There's a sea of incompatible possibilities, so most of them must turn out to be false. Ideas aren't scarce. Each one should need to earn more attention than the numerous others available. Opting to not hastily accept a belief is wise, if one's goal is to accept as few falsehoods as one can. Simply put, skepticism is a shrewd default.
Another characteristic of this difference is the complexity of the belief that's followed or declined. In the first case, the belief being followed consists of a whole system of interrelated concepts, not to mention a stack of astounding stories. That's a lot to go along with. It's a stretch to argue that there are enough clues to furnish satisfying rationales for every concept in the system. The decision that's presented is an extensive one, as if the only two options are not believing at all or believing in A+B+C+D+E+F+G+H...
In the second case, the belief being declined is relatively minimal. The two options are logical opposites; there is or isn't a supernatural realm. Furthermore, "supernatural realm" is left vague on purpose. Ideas about what's in the realm or the rules by which it operates are separate from bare existence. The second case acknowledges that convincing someone of the realm's existence is step one of many. Once someone has demonstrated that a First-Cause god exists or that human souls exist, then the tasks of demonstrating the myriad details of a particular belief system come next—keeping in mind that these details differ dramatically in the huge range of belief systems.