tag:blogger.com,1999:blog-298763142024-03-14T04:13:41.459-04:00Rippling BrainwavesInformation dropping infrequently from my brain into a vast ocean of network packets.Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.comBlogger480125tag:blogger.com,1999:blog-29876314.post-48039675512765149462022-09-10T08:25:00.000-04:002022-09-10T08:25:40.310-04:00footprints<p>I've published a lot of entries here about the seismic shift in thinking I went through when I journeyed out of supernatural beliefs. That shift involved confronting a lot of philosophical hairsplitting and arguments—of varying quality. I must admit that some could seem too abstract to matter. But it's still worthwhile to counter the faux-intellectual justifications that people wave around, even as they're committed to their beliefs for different reasons altogether.</p><p>Despite what other faults philosophical discussion may have, it's managed to establish one conclusion beyond all doubt: <i>metaphors are fantastic for communicating these ideas</i>. I'll heed this lesson and offer one:</p><blockquote><p>On one glorious mid-morning in summer, the owner of an all-too-costly house on a small hill near the sea decided to take a short walk along the beach. She made her way down by the water. To her surprise she found a set of fresh footprints already there. The trail of footprints started off in the distance from the left and continued to the right as far as the eye could see. She noticed that the footprints were like her own, but she hadn't gone for a walk here for more than a week. This shook her to the core, because she thought of herself as someone who had specifically paid a premium for special access to a secluded private stretch of beach. 
So...she needed to explain how these footprints could have happened.</p></blockquote><p>The first explanation for the surprise footprints is the one anyone would leap to: another neighborhood resident took their own extended walk earlier in the morning. This explanation is obviously reasonable. But at the same time it might disturb the homeowner in the story, because it challenges her desire for privacy. </p><p>Fortunately for her, people could easily come up with far more explanations. For example:</p><p></p><ul style="text-align: left;"><li>The laws of physics have changed over time and diverge between places. Different laws of physics were in effect when the footprints were laid down, and these footprints only survived to the present day because of this. </li><li>The footprints are a hoax carried out by a frighteningly competent yet invisible conspiracy. This conspiracy is motivated by a scramble for grant money to investigate footprints. Or perhaps the evil corporations in the conspiracy are hoping to provoke people into paying for beach security or upkeep.</li><li>At the time that Earth was created from out of nothing in the blink of an eye, footprints were also simultaneously created on the beach. Perhaps the footprints were included to test the faith of anyone who'd dare to believe that the Earth wasn't created from out of nothing in the blink of an eye.</li><li>The footprints were miraculously formed by a sea-witch, whose goal is to draw people away from the truth. Or perhaps the sea-witch merely touched the soul of the homeowner so that she was blinded to the truth and thought that she saw footprints (or forgot that she made the footprints herself).</li><li>The footprints were made by a spiritual being (albeit with human-like feet) that made a short visit to our mundane plane of existence. Or perhaps there was a visit from an earthly mystery creature unknown to biologists. 
Or perhaps a creature from far space traveled a tremendous distance for the sole purpose of taking a stroll on exotic Earth.</li></ul><p></p><p>The point of comparing the first explanation to the alternatives isn't to necessarily claim that any in the group are flatly <i>impossible</i>. It's to underline that, at least to those who <i>aren't</i> trying to drag in their preexisting assumptions or aims, the alternatives deserve far more skepticism than the first. Could someone work hard to defend their preferred alternative as a valid "theory" just like any other? Could they concoct chains of reasoning that make their preferred alternative seem a bit more plausible for other reasons than the footprints themselves ("presuppositions")? </p><p>Well...yes. Regardless, the first explanation continues to be the one that requires the least stretching to fit the plain footsteps that were seen. I came to a similar realization as I changed my perspective about the supernatural beliefs I was raised to follow. All the straightforward observations of reality I learned about were like multiple sets of footprints. Each set was unfabricated and unbiased and pointed in the same direction. Meanwhile the difficult project to integrate these observations into my former beliefs began to seem like outlandish avoidance of the inference that <i>human footprints are usually left by human feet</i>. </p><p>I kept returning to the undeniable truth that I wasn't simply arguing with slick philosophies written by combative loudmouths who hated the viewpoint I identified with; I was in some sense strenuously arguing with the direct implications of the <i>objective data</i>. I was twisting my thoughts to steer clear of what mere footprints were "saying".</p><p>Furthermore, I had an ingrained queasiness for the theologically-liberal solution of fully adapting supernatural beliefs to fit reality. 
In my mindset <i>at that time</i>, following it would've involved dropping and/or rewriting significant chunks of my beliefs. I wasn't ready for that...and I couldn't get past having a lingering distaste for it. As I mentioned in past blog entries, I was taught to wholeheartedly follow beliefs that were <i>accurate</i>, not beliefs that were "inspirational". (Let me note that in the present I'm much less opposed to the liberal solution, although I'm still not interested in putting any energy or time into following it myself.)</p><p>The opposite strategy, preserving the beliefs and <i>mentally rewriting the facts</i>, also didn't occur to me. As quaint as it might sound now, I didn't assume that I could easily discard the well-supported facts I disliked. Experience since then has shown that all that's needed is to reframe them as lies told by journalists and societal elites...who are the Enemy of the nation. Or I could do the same by reframing them as lies told by people with socioeconomic power who are either consciously or subconsciously enforcing the oppressive status quo of inequality between classes and/or races and/or sexes. The next step after that would have been to latch onto infamous "alternative facts" instead, which come from sources that are careful to amplify only the bits of dodgy information that can be bent to fit the target's imagination. </p><p>Nope, I was stuck. I accepted the strange notion that findings can be independent of people and their desires, even to the point of justifying a broad change in their thoughts. Eventually, my thoughts did indeed change. The apt metaphor for this phase is another well-known one for de-conversion: I began to understand that I was looking through the "goggles" of belief. The better I became at cautiously removing the goggles and taking a second look at observed reality, the more I had to confess that those beliefs were distorting instead of magnifying what I saw. Or it might have been worse than that. 
The goggles had misled me to interpret the footprints as the traces of something that had never really been there at all throughout the various scenes of my life.</p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-24858521032546779932022-03-25T10:57:00.000-04:002022-03-25T10:57:37.652-04:00evasiveness was an essential ingredient in the message of LostThe TV show <i>Lost</i> evoked a wide range of reactions. One of these was the common complaint that the show's writing was excessively <i>evasive</i>, especially during the show's second and third seasons. "I'm tired of the way that <i>Lost</i> never stops to explain everything that's going on." Fortunately, the creative minds (I will never refer to people as "creatives"!) working behind the scenes were able to negotiate the endpoint, which allowed them to plan and pace their episodes more evenly.<div><br /></div><div>Of course, the project of keeping the show going was a crucial reason why some degree of evasiveness was always going to be needed. It would've hurt the show tremendously if its plot came to a halt and had a character—probably a mythological figure suddenly appearing out of nowhere by a burning campfire—deliver a monologue with answers for every question. That would've been a <i>lecture</i>, not a show. And in the case of <i>Lost</i>, the unsettling lack of knowledge was actually one of its major emotional engines (complete with quivering strings on the soundtrack). Eliminating that would've drained its effect as surely as eliminating the "will they/won't they" question has drained the effect of many shows that have a central couple. 
Out of the many shows that copied the <i>Lost</i> formula, some were much quicker to reveal the explanation of the mystery in the premise...and then those shows spun their wheels in place trying to go somewhere interesting after the big reveal.</div><div><br /></div><div>Nevertheless, as I watch the show again from my present mindset, I'm convinced that evasiveness was more than a practical requirement. It was <i>essential</i> to one of the very ideas that the show played with: <i>faith</i>. The writers have admitted that the concept of faith was one of their inspirations. The characters have conversations about it several times, and their attitudes toward it change over time. </div><div><br /></div><div>To put it bluntly, <i>Lost</i> couldn't portray faith as thoroughly as it did and yet <i>not be evasive</i>. Faith needs evasiveness in order to be maintained and to be labeled "faith". If faith in the show was intended to be meaningfully comparable to the supernatural faiths of real people, then it had to <i>share</i> those faiths' evasiveness about precisely how reality is affected in any way by the things in those faiths. Evasiveness was a show ingredient, not an accidental oversight.</div><div><br /></div><div>Why did the show need to be evasive about...</div><div><ul style="text-align: left;"><li>...faith in the personification of "The Island"? So that it could imitate the evasiveness about faith in the personification of a "Good Force". </li><li>...the source and meaning of various visions that characters have? So that it could imitate the evasiveness about the source and meaning of various visions that religious followers have. </li><li><i>...why</i> numbers are bad luck or significant in some other way? So that it could imitate the evasiveness about <i>why</i> numbers matter in numerology. </li><li>...how or why people are healed—or not—by their faith? So that it could imitate the evasiveness about how or why religious followers are healed—or not—by faith. 
</li><li>...the connections between paranormal "scientific" occurrences and the island faith? So that it could imitate the evasiveness about the connections between paranormal "scientific" occurrences and religious followers' notions about the supernatural. </li><li>...the connections between popular religions and the island faith? So that it could imitate the evasiveness about the connections between popular religions, which would be part of explaining how it's logically possible for all the popular religions to be equally accurate.</li><li>...the unseen actions of the competing supernatural figures on the island? So that it could imitate the evasiveness of religious followers about the unseen actions taken by their competing supernatural figures. </li><li>...the details in the plans and goals of those figures, as well as how those plans and goals directed events? So that it could imitate the evasiveness about the details in the plans and goals of religious followers' supernatural figures, as well as how those figures' plans and goals direct real-world events. </li><li>...the definition of a good or bad person in the eye of the island? So that it could imitate the evasiveness about how to definitively sort people into good or bad according to religious followers.</li><li>...whether any one character is lying or simply incorrect about island secrets at any moment? So that it could imitate the evasiveness about whether someone is sincere and knowledgeable about what they say about supernatural things.</li><li>...the so-called rules about what people (and human-shaped beings with uncanny powers) are allowed to do? So that it could imitate the evasiveness of religious followers about exact behavioral rules, which they debate amongst themselves endlessly.</li><li>...abnormal abilities or mental links that some people have? 
So that it could imitate the evasiveness regarding the abnormal abilities or mental links that some religious followers claim to have, such as clairvoyance and divine guidance.</li><li>...the ancient past of the island's inhabitants who left behind exotic ruins and other structures? So that it could imitate the evasiveness of religious followers about admitting that ancient societies' "false religions" were every bit as vibrant as current religions. </li></ul><div>Having said all this, I must be clear that I wasn't one of the people who objected to <i>Lost's</i> perceived evasiveness. I don't demand that fictional stories be upfront about everything at all times. I believe the show's writing staff have been honest about their motivation to leave some things unsaid so that the audience could have differing interpretations, similar to the way that songwriters may refuse to decisively explain their lyrics. Questions are fascinating; the unknown lurking below the surface is thrilling and dangerous. (<i>Lodge 49</i> too has a vibe of people pursuing an amazing realm which seems perpetually a little out of reach...although it's far more laid-back about it.)</div></div><div><br /></div><div>That's why I'm in a position that's curiously opposite to any complainers who are also religious followers. When it comes to stories that anyone claims to be nonfictional, I echo their complaints. Evasiveness is an appalling quality for supposed truth-tellers to have. I'm stumped as to why they don't apply the same attitude and criteria toward the stories that they consider to be simultaneously nonfictional and dreadfully important. Why should the faith stories someone was raised in be granted blanket exceptions—considering that the same people often have no qualms whatsoever in denying the same exceptions to the faith stories they <i>weren't</i> raised in ("how could <i>they</i> believe such nonsensical things?")? 
<i>Lost</i> can be evasive because, in the end, it's a TV show. Real-life ideas about how the universe works should not be.</div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-9995418477641960762022-03-08T16:14:00.001-05:002022-03-08T16:14:58.541-05:00mission inadvisable<p>It's an excellent idea to step back from time to time and approach a belief as a total <i>outsider </i>would. Otherwise, there's a strong tendency to slip into thinking a belief is reasonable merely due to comfort and familiarity. "Common sense" is often no more than a label for "the culture I grew up in". Unexamined assumptions invade and settle into someone like plant roots tunneling into cracks in rocks.</p><p>In the religious environment I emerged from, one of these assumptions was the <i>mission to evangelize</i>. Every follower had this mission, at least theoretically. Followers were expected to try their hardest to turn more people into followers. And this mission was for everyone's benefit: newly turned followers received transformed lives and afterlives, but they were also useful additions to the existing group of followers too. The whole concept seemed like a quite natural consequence of being a follower.</p><p>Yet now, after both literally and figuratively removing myself from that setting, I can look at this mission with fresh eyes. Through my present viewpoint, it appears...just...poorly thought-out. Its reliance on religious followers is so very <i>inadvisable</i>. If one doesn't assume upfront that it's <i>their</i> mission, then there's a truckload of superior alternative strategies for gathering new followers.</p><p>To be more specific: after granting the point that it's necessary for people to be followers before a god may show mercy to them, then using existing followers is an <i>awful</i> option for that god to take. 
It has vast powers and it would (supposedly) be <i>overjoyed</i> to have more followers to show mercy to. What could it be doing instead to lure more people into following?</p><p></p><ul style="text-align: left;"><li>Distributing some more recent holy writings would be a good start. The current set has become overshadowed by insolvable controversies over the <i>intended modern meaning</i> of numerous sections. At the same time, debates have arisen over what to do with the sections that "appear" to suffer from the incorrect perspectives and barbaric social mores that were in effect at the time of writing. For that matter, the simple project of translating the writings into other languages has had its own share of controversies. Publishing an updated edition of the existing set of holy writings, and clearing up which writings in the set probably shouldn't have been included in the first place, would be good at a minimum.</li><li>A god could speak out more frequently to prospective followers and even to self-admitted enemies. "Speak" refers to expressing audible words to multiple people at once, rather than popping up in hazily-remembered dreams or nonverbal impulses to attend a service. Although some wouldn't welcome the message, a lot of them would probably relish hearing from the one actual god—the experience would give them a solid reason to believe rather than keep them in undecided, half-hearted suspense.</li><li>It could carry out undeniable, inexplicable acts of goodness in the world. What route to popularity could be more endearing? What could possibly be a simpler way of announcing that it cares for people? Of course the good acts will need to be claimed so nobody credits any other gods that they have ideas about.</li><li>A less showy but still deeply appreciated gesture would be for it to communicate its staggering knowledge of a wide variety of domains. 
At this point in human history, people who have expertise in domains other than theology have somehow gotten the strong impression that neither this god nor any other is involved in any way in their domains. Some have stated that their expertise indicates that <i>no god</i> of any real relevance can exist at all. It should resolve such conflicts and patiently go into detail about at what point their expertise went astray. Its sharing of knowledge would also have the side effect of immediately advancing society as a whole; who knows what inventive people could manage to do with the new knowledge.</li><li>Most obviously, it could dispatch evangelizing beings who are far better suited than fallible finite followers. It created everything from scratch, so creating such beings would be no obstacle. They could have abilities to teleport, project loudly enough to reach entire crowds, shrug off dangers of all kinds, execute the occasional miracle, know every language, memorize the entire set of holy writings. Furthermore, they could complete the mission without the weight of a long history of self-righteousness and hypocrisy, which followers need to try to deflect in order to get in the door. Some people won't have a conversation on the topic until they get a sincere answer to the question, "Why has a good god been represented on Earth by <i>people like that?</i>"</li></ul><p></p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-24293270182434403602022-02-09T16:21:00.000-05:002022-02-09T16:21:21.625-05:00the mystery con<p>Sometimes the urge rises to simply call something what it really is. <i>Mystery is a con</i>. Arguably the more impressive trick isn't to convince otherwise-shrewd people to believe in "truths" that lack confirmation (wishful thinking does most of that work); it's to convince them that the lack of confirmation is <i>thrilling and sophisticated and broad-minded</i>. 
Rather than honestly saying "I can't do a good job of <i>thoroughly showing why</i> you should accept my statements about the supernatural domain", the strategy of the mystery con is to say "My statements are far too amazing and otherworldly to be confirmed the way you would confirm any other far-fetched statement." The more that the speaker leaves to the listener's imagination, the more that the listener's <i>own brain</i> can complete the rest of the picture in whatever way they happen to prefer.</p><p>And it's doubly effective because, like any excellent con, it tugs on the listener's feelings. First, it emphasizes the speaker's special access to knowledge and the feeling of authority that comes with that: "Because I won't tell you <i>how I know</i>, you have no choice but to trust me as your <i>primary source</i> of information." Second, it stirs the inborn fascination that people have for the unknown. By nature the unknown is more interesting than the known. Through the glamour of mystery, the refusal to be upfront about a statement's base of evidence appears less like <i>sly cheating</i> and more like suggesting that there are wonders out there—on the other side of the limits of credibility, that is.</p><p>Third, it provides the invigorating experience of <i>novelty and escape</i>, because the statements of everyday life don't give off the same stink of mystery. It's nowhere to be found in the mundane statements that come up in the process of getting real tasks done or observing real characteristics of real things. The bluff that "Yes, there's more to the universe and it's very exciting!" can only be kept up if the speaker never quite answers reasonable questions about why anyone should be firmly convinced of their claims. </p><p>Fourth, the mystery con boosts the pride of the listener. One would expect a feeling of embarrassment about believing in statements with inadequate support; no one wants to be the fool. 
Yet a statement that's been painted with the color of intriguing mystery works in the opposite way. The listener is enabled to boast that they themselves are extra-special, or that their beliefs have extra-special layers, because they are chained to extra-special mystery statements—statements that go over the heads of dull normal people who merely use time-tested and well-established means to sniff out deception. (To be forthright and say the emperor has no clothes is to show that you aren't an extra-special multilayered person, of course.)</p><p>On the other hand, the lesson to be drawn from the mystery con isn't that <i>uncertainty</i> is repulsive. The attitude to take shouldn't be excessive in either direction. It shouldn't be <i>reverent</i> in the manner that the mystery con encourages, but neither should it reflect distaste. Being clear-eyed about uncertainty comes from admitting the undeniable limitations of the data and methods used to gather knowledge. Uncertainty <i>just is</i>. Idolizing the mystery of not-knowing is off the mark, but idolizing absolute certainty is too.</p><p>Furthermore, this difference in attitude contributes to an important difference in the approach to <i>counter</i> uncertainty. If someone isn't ruled by their craving for certainty, then they're far less tempted to do what countless humans have done ever since they mastered language: <i>make something up</i> to fill the gaps in knowledge and use the word "mystery" to bat away questions about their inventions. Both placing flimsy statements in the gaps and taking such statements at face value have been common practices for literal millennia. </p><p>By contrast, the two corrective practices of 1) obtaining knowledge through painstaking work and 2) demanding that speakers go into detail about the work they did to get their knowledge, weren't all that common...and to an appalling extent aren't nearly common enough in the present day either. 
The purpose of the mystery con is to <i>distract</i> listeners from relying on these corrective practices and to give the older practices more credit than was ever deserved.</p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-61751901547114077492021-10-21T20:20:00.000-04:002021-10-21T20:20:42.210-04:00socialized truthIt's too easy to accuse the other side of being mindless. I can say that back when my life and thoughts revolved around sincere belief in supernatural concepts, thinking in depth about those concepts wasn't that strange. People would at least admit that it was good to love God with your mind and to learn more than the basics of the religion rather than be shallow. Of course, this part of the subculture sits side by side with a strong anti-intellectual bent of "stop analyzing and have faith instead". And fearful hostility toward publicly funded universities has only grown as the <i>political</i> divide between college attendees and the rest of society becomes starker than ever. <div><br /></div><div>Within that group, the strategy is to <i>channel</i> critical thinking into approved directions, similar to how there were approved "Christian" contemporary songs, approved "Christian" movies, approved "Christian" games, etc., etc. One approved direction for contemplation was to "engage with society" by focusing on specific targets. <i>Scary postmodernism</i> was a popular one: a caricature of postmodernism that claimed all beliefs were completely produced by, not merely colored by, cultural context. In a sense the traditional notion of "truth" didn't exist at all in scary postmodernism. It was often linked with total moral relativism, which was another approved target. </div><div><br /></div><div>The general line of argument was to contrast scary postmodernism to the timeless and objective statements of religion. 
Religion was a fixed point, neither coming from the surrounding society nor changing itself to fit in. Attempting to poke holes in scary postmodernism was a smug intellectual pastime that appeared in Christian media. For once, the shoe was on the other foot: those postmodernists were the ones who weren't thinking coherently. Followers of religion obtained solid truth through time-tested methods. They weren't controlled by the ever-changing whims of popular sentiment. 
</div><div><br /></div><div>They vote as a bloc according to nonnegotiable political platforms, but the political opinions they hold, as well as the categorization of opinions into negotiable and nonnegotiable, come from what <i>other people</i> tell them is core to their identity. As for a wide variety of topics that are entirely <i>separate</i> from their supernatural beliefs, strangely their thoughts about those are still directed by <i>other people</i> who have supernatural beliefs that match theirs. (For instance, their thoughts about the best course of action to safeguard their own health and their community's health might come from their group rather than people who know what they're talking about.)</div><div><br /></div><div>Overall, the overwhelming pattern is that their thinking is transparently steered by their social context. "I think about it this way because that's just <i>what we think</i>." Most followers of supernatural beliefs aren't hermits or prophets. They may not even be especially countercultural, really, if their beliefs are in fact held by a large subset of the people around them. Although they may loudly object that no one tells them what to think, I'm inclined to assume that they haven't worked very hard at deeply examining the beliefs that they say are precious to them. What an amazing coincidence that everybody in their context happened to "independently" reach the exact same conclusions over and over again...</div><div><br /></div><div>On the other hand, their reliance on socialized truth is far from unique. Each individual has limits. Everybody needs to get truths from others. The vital distinction is the manner in which someone <i>relates</i> to socialized truth. As always the question to never stop asking is "<i>How</i> does the speaker know what they say they know?" The problem with cultures that explicitly endorse <i>faith</i> is that it encourages the mental habit of brushing over that important question! 
Socialized truth should be <i>considered and chosen</i>, not thoughtlessly absorbed. Trust in the speaker's credibility is essential, but credibility isn't a total substitute for the speaker explaining the work they did before they spouted something. To the contrary, the speaker's reluctance or inability to explain the work they did is a hint to <i>lower</i> trust in their babbling. If they worked hard to back up their statements then they should be proud to explain!</div><div><br /></div><div>There are helpful signs either that someone is deliberate about socialized truth or that they're the opposite: nothing more than a yo-yo yanked to and fro by a culture they identify with. The signs I'm listing here don't form a complete list; they're only some of my favorites that echo the themes I tend to return to. First, a bad sign is when the socialized truth is conspicuously <i>vague</i>. "Truths" that seem to be avoiding verifiable details should raise suspicion about whether these supposed truths are undergoing any scrutiny before being accepted. Generalizations should rest on top of specifics, not motivate frantic searches for justifications to shore up a generalization that <i>came first.</i></div><div><br /></div><div>Second, <i>lack of nuance</i> can be a sign of an undeserving socialized truth. The reality of many people is that they could claim multiple social groups, and the groups might not be in strict agreement. Reconciling the competing truths of their groups would take effort. They'd need to compare the "truths" coming from each and the basis for the truths. The end result would be a more complicated view with many sides and compromises. "As a _____ I think this, but as a ______ I also combine it with that." </div><div><br /></div><div>The lazier albeit less confusing alternative would be to make a blanket choice of a singular group and then enthusiastically accept whatever that group proposes. 
Then the socialized truths pushed by someone's other social groups have no moderating effect. The villains of that singular group can do no right, and the heroes of that singular group can do no wrong. Therefore another sign is when there's virtually no difference between <i>all</i> of someone's thoughts and the common thoughts of the group they idolize. One can be partially <i>affiliated</i> or allied with a group without reflexively echoing it in every way. Is someone thoughtfully choosing to identify with a group because some of its socialized truths are good matches, or is the group <i>rewriting</i> someone's ideas with socialized truths? "I once thought ____ but now I realize this is something our enemies think, not us."</div><div><br /></div><div>Third, when a socialized truth <i>fits perfectly</i> with preconceptions, that might be a reason to doubt. Just as the full reality of a person's social position is a complex mixture of group cultures, the full reality of all facts is a complex mixture. The expectation should be that some facts fit well with a preconception, some are merely compatible with it, and some clash with it. Each situation isn't identical with another, so the facts of the situations might or might not be identical. The speeds of falling feathers and hammers on the moon are famously different than the speeds observed at the gas-wrapped surface of Earth. Disagreements between facts and preconceptions can point to the need for deeper understanding. Socialized truths that solely <i>confirm</i> could be coming from someone who either intentionally <i>selects</i> facts to flatter and strengthen the group's beliefs...or twists facts to be more suitable...or passes along the unsupported rumors they like...or fabricates stories from start to finish.</div><div><br /></div><div>Fourth, extremely rapid shifts in beliefs can be a sign of an improper acceptance of socialized truths. 
If there are <i>established reasons</i> for holding one belief, then uprooting that belief and replacing it would be a process. Those reasons would need to be carefully judged side by side with reasons to reject the belief. But if a socialized truth was adopted without much analysis, then replacing it is far easier. The one reason it was adopted was that it was belched out by the group; when that group belches out something else then that one reason for the old "truth" no longer exists. An individual who dances to the group's tune will switch dances immediately when the tune switches. (Cue the obligatory reference to <i>Nineteen Eighty-Four</i>: we've always been at war with Eastasia.)<br /><br />That said, I wouldn't recommend beliefs that never shift at all. These could also be signs of an unreasonable group loyalty that controls someone's thinking. Some groups' cultures have foundational precepts, especially if a group is supposedly defined by commitment to such things. Even to ponder the limits or flaws of these precepts is forbidden. Doing that is viewed as a loathsome <i>betrayal</i> of the group. In countless times and places, people's own innermost thoughts have been effectively corralled by their desire to not be seen as a "traitor". Although it's been said that the one freedom that cannot be violated is the freedom of innermost thoughts, the tyranny of socialized truth does so all the time—and with the consent of people who can't be bothered to think for themselves.</div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-64137445584460217182021-09-05T10:34:00.000-04:002021-09-05T10:34:48.166-04:00seeing the love<p>Discussion about the love of some particular religion's god—or the absence of such—generally centers on the problems of evil and suffering. 
Why would a miracle-working and generous god permit <i>so much</i> evil and suffering in the past and present, including a lot that serves seemingly no purpose? It's an excellent question. And it's inspired many interesting (attempted) justifications.</p><p>It's also not the topic of this blog entry. Highly committed religious followers raise other questions when they continually insist through sermon and song that their god is loving, because many would say that they aren't merely repeating a piece of foundational doctrine. They'd say (testify) that their god is always loving in terms of what it does day by day in their very lives. It's active now. Its love continues past the long-ago actions it took according to stories passed down through tradition.</p><p>But when they list examples, they fail to sway listeners who don't have the same loyalty to the underlying beliefs. The examples might be minor, coincidental, and possible to explain without invoking supernatural intervention. If this is pointed out to them, they may respond, "Love isn't something that can be slid under a microscope. To believe in love sometimes requires faith and trust. Faith allows you to see a beneficial occurrence for what it <i>really means</i>, which is a loving act of my god. I cannot prove that my god is filled with overflowing love, but why do you say this is strange? I also cannot prove beyond all doubt that the people who are dearest to me feel love in their hearts. Faith is an essential part of the picture."</p><p>I'm pretty sure that for them, their response feels well-considered and even somehow natural. However, this is yet another illustration of the way that discarding supernatural beliefs reverses one's outlook. Seen from my current perspective, it's almost <i>nonsensical</i>. It verges on insulting to the people who actually are loving to me. How does it take <i>faith</i> to believe in the love of people who have done so many good things for me, directly and in plain view? 
When they've been a comforting presence in bad times, not through a cryptic email but through sitting nearby and patiently listening, or through assisting with getting food and performing other tasks? When they've laughed at my jokes and tried to cheer me up when I'm sad? When they've offered the insights they learned from their mistakes, to prevent me from making the same ones? When they gladly spend time with me, not only on big occasions but on a casual whim?</p><p>To repeat something I've written before in other contexts, the main point isn't that love is defined by what someone <i>gets out of it</i>. The point is whether or not a concept displays a strong if not undeniable link to the detectable differences it makes in reality. It's not an ambiguous "sign" of something that demands a skewed viewpoint to comprehend it. ("One of my life-long friends has invited me to a cookout. <i>What could it possibly mean?</i>") It might not be visible to the eye but it will certainly be easily traceable, like someone making a payment on the recipient's account. Picking up the pattern is like connecting numbered dots, not like stretching threads between tacks on a big board. It's <i>possible</i> for there to be someone who expertly manipulates various things, sight unseen, in order to eventually produce a loving outcome. This possibility requires a huge mental leap, though.</p><p>Unfortunately, there is a flipside to perceptible loving acts. Unloving acts by someone who loves you can be just as perceptible. And the category of unloving includes both cruelty and indifference. Would it be more understandable to say that it <i>does require</i> faith to believe in an individual's love regardless of their cruel and/or indifferent acts? It's true that real people are a mix of characteristics with varying moods and motivations. There's no question unloving acts make it harder to believe in their love. 
It's also harder to trust that they'll act in a loving manner again in the future.</p><p>Nevertheless, I wouldn't stoop to using the word "faith" for it. Because reality is complicated, there are a lot of factors and moving parts. That's why conclusions often aren't perfect in every circumstance. A cause that's very strong can still be subject to competing causes, at least from time to time. The result is that the full set of evidence is likely to be mixed. Statistical analysis comes into play. Genuine strong love might be reflected in committing loving acts "significantly more" than acts of cruelty and indifference. I wouldn't say that it's faith to infer love from a heap of obviously loving acts despite a few unloving acts. It's more like noticing that two fair dice have a sum of 5 more frequently than a sum of 12. </p><p>To properly apply this analogy to the influence of an <i>omniscient and omnipotent</i> actor, it shouldn't be necessary to point out that a complete view implies tallying up a mountain of facts about reality's ups and downs. Unlike a person, an omniscient actor has countless more opportunities to act in ways that are loving, indifferent, and cruel. Of course, I'd maintain that by far the greatest share of hypothetical "acts" fall into the category of inaction. <i>If there were</i> something that wasn't acting, then indifference would be the explanation that isn't strained. Perhaps it's true that, with a focus that's narrowed and aimed, someone can sometimes feel that "someone up there is watching out for me". I'd say that when someone uses a focus that isn't so narrowed and aimed, this feeling is outweighed by the sheer number of times that it isn't applicable. And I'm not referring to the large important evils and sufferings that are usually brought up in the philosophical problem of evil and suffering; I'm merely referring to an individual's own experience.
</p><p>One additional defensive analogy between personal love and the love of a god deserves some attention. It goes like: "I think your view of love is superficial. Part of maturity is recognizing that love has well-defined <i>boundaries</i>. Love can actually hamper someone when it doesn't allow them to make decisions, face consequences, and learn. The loved person also can't reach their full potential if love doesn't allow them to face and defeat challenges without immediate help. Boundaries establish that people believe in each other's capabilities and that they're entitled to a realm of independence or non-interference (autonomy). In the same way, my god is obligated to hold back from taking action a lot of the time. Its superior love goes hand in hand with its wise boundaries."</p><p>As analogies go, it's not that bad. Yet it seems to me that it doesn't quite work for another reason that's commonly associated with a mature concept of love: <i>open communication</i>. Boundaries that aren't openly communicated are misleading. Nobody knows for sure where they stand and how to correctly interpret each other's acts. If someone <i>distinctly says</i> they're not doing something loving because it would violate boundaries, then I agree that it's a reasonable course to take. If someone simply doesn't act, and they don't <i>express clearly</i> that it's because of boundaries, then I disagree. If it's for the purpose of teaching a memorable "lesson", then specific communication matters so that the right lesson is learned (i.e. not the lesson that the teacher is undependable). A god that honors valid boundaries <i>without</i> communicating the meaning behind its refusals to act is a god that's, once again, suspiciously reminiscent of an indifferent god rather than a loving god with boundaries. </p><p>Furthermore, the huge range of possible actions for such a god remains a problem. 
If the analogy of love that honors boundaries is implicitly parent-like—noting but setting aside the psychoanalytic comparison of gods to parental figures—then this analogy has another failure: parents have sensible limits on how far they will extend the decision to not act. The god that would be compatible with what we experience would be a god of <i>archaic brutality</i>. Would we say that it's a loving balance of help and boundaries when the loved individual's choices are allowed to result in, for instance, losing a leg or a hand? Would we say that it's a loving balance when their choices are allowed to result in diabetes or cancer? Would we say that it's a loving balance when their choices are the right ones but those choices give someone else the opportunity to exploit them? Would we say that any kind of boundary or lesson explains why someone's house is ripped apart by a natural disaster? (Supposed divine judgments on subjectively "sinful regions" notwithstanding...) </p><p>Forget about a love that can't even be seen without taking a leap of faith, projecting hidden influences onto commonplace events, and ignoring numerous counterexamples. Give me a love between people, concretely demonstrated and unmistakable, any day. I understand why the most admirable followers of supernatural beliefs comment that their beliefs are communicated most effectively by being their god's hands and feet on Earth. I'm not convinced of its existence, but if I were then "its" most convincing acts of love on an ongoing daily basis would be the acts of <i>people</i> who care.</p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-1600307972999758772021-08-15T22:23:00.000-04:002021-08-15T22:23:37.394-04:00knowledge breachI often mention the frankly awe-inspiring ability of people to <i>compartmentalize</i>, because even now I think it's not always appreciated enough. 
They can separate their views of reality into compartments, consciously or not, neatly or not, and then they can treat their compartments much differently. Part of the reason I emphasize this is to explain that <i>I wasn't</i> totally irrational when I followed supernatural beliefs, and that's also true of a lot of people who still do. Thanks to compartmentalization, they may live in ways that mostly seem sensible, or they may hold a variety of opinions that mostly seem sensible, yet they assert that once upon a time a god was human.<div><br /></div><div>At the same time, as I've previously described, compartmentalization gradually quit working for me. That's why I ended up discarding the beliefs I had followed. There was a <i>breach</i>. It wasn't a data breach but a knowledge breach. My knowledge about reality kept invading the supposed "knowledge" provided by my former beliefs. It poured in faster and faster until it swept away the counterfeits. </div><div><br /></div><div>I had grown thoroughly convinced of the fundamental principle that an idea is <i>real</i> to the extent that its predicted impacts are confirmed by actions, observations, and reasoning. Furthermore, the very definition of realness is this principle. The breach happened as I questioned <i>why</i> I had been raised to have a compartment that was exempt from this superb principle—and then I realized that the need for the compartment was due to the failure of the beliefs inside it to meet the standard. In retrospect, of course the least credible beliefs were the ones accompanied by the loudest appeals to faith. </div><div><br /></div><div>But that was my experience. Lately I've been noticing that knowledge breaches occur in the other direction too. Rather than a breach that exposes unverified knowledge to attack, there are breaches that expose the verified sort. The principle behind these is simply the distorted mirror image of the other principle. 
I'd characterize it as a lengthy ramble: "Knowledge gained through human senses and reason is fallible and unimpressive. Impartial and systematic investigation is an impossible myth. It mocks the all-important contributions made by what we <i>feel</i> to be right and what we've been taught through time-honored tradition and communication with the divine. Everybody lies all the time and prizes their goals more than objective truth. Their goals might be profit, political power, or flat-out hatred of goodness. Anyone who says they can demonstrate the accuracy of their ideas deserves no more consideration than someone who uses their common-sense to rant. Plus, as long as it's possible to cite a different poorly-run 'study', an anecdote on a website, or someone who's an expert in an entirely unrelated field, then that means there's equal proof for the conclusions I'm comfortable with. Confirmation bias is something I need to watch out for? Whatever, I've never heard of it. Every idea that's contrary to my wishes or deeply-held intuitions is being spread by evil widespread conspiracies, and the goal of those conspiracies is to destroy everything I care about. Experts and organizations made up of experts are wholly devoted to these conspiracies, no matter what they may say. Naturally, everyone who says they're similar to me but has good reasons to disagree with me is a traitor and a fraud. Trust is based solely on whether someone grew up in the correct culture and whether they're committed to the correct groups, causes, and figureheads. <i>There are more of us than them</i>, which implies that they only win through deception. Anything they claim, we'll automatically accept the opposite. Our scriptures literally use sheep as a metaphor for us, and that's a good thing."</div><div><br /></div><div>That ramble was intended as an exaggerated extreme...but sad to say it's not that far off from the mindset of some people. The point is the knowledge breach that it can cause. 
Like I did, they probably have compartments in which they selectively do or don't define an idea's realness by its confirmed impacts. The difference is the path they're going down. I wondered why I was allowing the reality of some of my beliefs to be defined under laxer philosophical rules. In their minds they're wondering why they're allowing the reality of some of their beliefs to be defined under the illegitimate rules of the culture they consider themselves at war with. </div><div><br /></div><div>Perhaps truths in the supernatural domain are completely decided by the irrefutable sayings and writings of one's culture, or by the dictatorial authorities within that culture, or by the truths someone feels, or by judging the degree that something is in subjective harmony with one's preexisting assumptions and aims. If these are the methods for absolute truth in the domain that controls everything they think and do, then they may be easily convinced to reapply these faulty methods in other compartments as well. At that point the original dividing line is blurred and the high-quality knowledge they did have is exposed to the breach. Falsehoods masquerading as truths move in. The lack of skepticism spreads. It metastasizes like cancer.</div><div><br /></div><div>Unfortunately, the risk of this is plain to see. This knowledge breach is like a keypad lock programmed with the numbers 1-2-3-4. Anyone who uses the relevant combination of inputs could use the breach to cynically push "knowledge" in for their own purposes. I hate to see it happen. When the consequences are worse health, wasting money, acting against someone's own interest, feeling fear and anger toward others or even toward imaginary foes, paying all attention to short-term effects, and on and on, discussions about the definition of realness start to not seem so trivial after all. 
</div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-55360312480585650272021-08-09T20:02:00.000-04:002021-08-09T20:02:22.646-04:00calling a temptation cease-fireI've been critiquing some of the attitudes and opinions that I've frequently seen in people who structure their lives around supernatural beliefs. Sometimes these are far different from mine, but sometimes there are similarities. For instance, I think it's clear that we share a common experience of <i>facing temptations.</i> One of the major pros of leaving the beliefs behind is that plenty of the "wrong" thoughts and actions simply don't seem wrong to me now, and so some of these often trivial "temptations" have stopped being problems.<div><br /></div><div>Nevertheless, some temptations apply to me still. It might be the temptation to overeat an unhealthy snack or surrender to an unpleasant habit. It might be the temptation to procrastinate about a task that must be done sometime. It might be the temptation to assume the worst about someone else's motivations. It might be the temptation to litter. In any case, the temptation is for a temporary and/or minor benefit that comes with a substantial cost. Worse, the cost might only take effect in the future or gradually over time, or it might affect others more than oneself. </div><div><br /></div><div>The crucial differences start to appear when the basic idea of temptation is put into the larger context of how someone views reality. Within the culture that I left, temptation is consistently seen as a deadly serious <i>battle</i>. It's an attack from an antagonist such as the evilness someone is born with, the pleasures of the world, or diabolical spirits. Because of that, the struggle is treated as a test of inner strength. It's an arm-wrestling match between the temptation and force of will. 
The eight-year-old who has the impulse to steal a candy bar is in the middle of a cosmic theater of war in which good and evil are firing heavy artillery.</div><div><br /></div><div>When I consider this picture in my present frame of mind, I'm struck by how counterproductive it is. Regardless of what the temptation is about, handling it as a hard-fought battle isn't a wise strategy. I've become more familiar with alternatives. I'd argue that it's better, when the temptation first forms, to not charge at it full-speed. Instead, leave it be and let it pass.</div><div><br /></div><div>I don't refer to pretending the temptation doesn't exist. On the contrary, I mean that it is to be seen fully, without flinching away from it, as the <i>bare thought</i> that it is. It's a "bare" thought in the sense that it doesn't need to be inflated into something big and scary. It doesn't need to be focused on or connected up to anything else. Does it produce a <i>reaction</i> of desire in the person tempted? Of course. And neither is it useful to pretend that this reaction doesn't exist. Bare thoughts of all kinds evoke reactions constantly. The fact that a thought evokes a reaction doesn't imply that the thought merits <i>even more</i> of a reaction (a counter-reaction and then a counter-counter-reaction, and so forth). </div><div><br /></div><div>Admittedly, there are challenging subtleties to this approach. First is emphasizing that it's not the same as simplistically <i>ignoring</i> temptation. To focus on banishing the thought, forcefully replacing it, or coming up with counterarguments, is to <i>prolong</i> the thought and "play its game". The recommendation is to recognize it, observe its futile efforts to be provocative, and endure its temporary effects. Simultaneously its pull is only observed too, in the sense that it's felt without being linked to corresponding action. It's like tugging on a rope tied around a tree trunk. 
</div><div><br /></div><div>The main thing is to steadily break the mentality that if a temptation thought <i>isn't</i> fought, then it's only a matter of time before the tempted person will <i>act</i> on it. Fear of the experience of having the thought, and the self-defeating determination to ignore it at all costs, are tied to the <i>assumption</i> that the sole path to not perform the particular behavior is to strenuously never have the thought. However, a thought is a thought; it's not the act and it's no more than brain activity. Someone who's sitting quietly and experiencing a temptation is still only sitting quietly.</div><div><br /></div><div>On the other hand, I wouldn't say that it involves <i>embracing</i> the temptation thought either—granting it a stamp of approval or relishing it. Merely <i>watching</i> it calmly until it fades doesn't imply that someone is welcoming the thought. Refusing to feed the thought is to withhold either strong acceptance or strong rejection from it, until it starves by itself or its repetitiousness just grows uninteresting (or annoying). This is why it works better when it's applied as early as possible, when the temptation pops up, before it's progressed to dominating mental attention. Someone doesn't think "This is just a bare thought; I can enjoy it for a bit as long as I don't take action". They do think "This is just a bare thought; I see the desires it provokes but I'm not under the dictatorship of something that appears and vanishes without hurting me in a lasting way". </div><div><br /></div><div>Another motto to describe it is that <i>temptation is normal</i>. It's no cause for dread once someone is comfortable with that fact. The existence of an unfulfilled desire isn't a never-ending torture! If it's genuinely left alone and not dwelt upon, it will likely diminish in minutes. If it arises again in an hour's time then so be it. It will again likely diminish in minutes at that time too.
(This doesn't work for something like hunger pangs, naturally.) Depending on personality, keeping a sense of humor about the whole thing might be good advice. <br /></div><div><br /></div><div>Furthermore, a coping strategy for temptation pairs well with<i> prevention</i>. Shrinking temptation down to the size of a bare thought is good, but stopping the thought from forming at all is better. One of the reasons it's a relief to think of a temptation as a thought is that thoughts can have plain origins like anything does. If a temptation thought regularly comes about in a specific situation, then that situation can be avoided. Being less worried about the experience of temptation <i>when it happens</i> isn't a reason to be careless about <i>letting it happen</i> in the first place. </div><div><br /></div><div>Prevention is immensely useful, but it comes with its own layer of subtlety. The situations to prevent have both external and internal contributions. A person's surroundings can matter but so can the condition of the person. Tempting thoughts could be frequently seen alongside boredom, fatigue, rage, sadness, isolation, or something else. These general conditions might not be preventable—sometimes life gets hard—but it's smart to recognize that these can affect the number and the strength of tempting thoughts. And then someone will know to <i>expect</i> the increase when the conditions return. </div><div><br /></div><div>The final subtlety is that letting temptation pass is far from a perfect strategy. There will probably be instances in which someone sees the temptation thought for what it is...and proceeds to choose to obey it. (Any strategy at all can give someone "room" to analyze a decision, but it won't make the decision for them.) Afterward, the tempting thought's return will stir up ruminations on the past and the associated shame and regret. 
</div><div><br /></div><div>But responding to it by ruminating on the past is <i>yet another </i>method to fuel it. Obviously, the only time someone can act is in the present. A failure in the past isn't an absolute prediction for how someone will respond now. Perhaps someone has obeyed a specific temptation numerous times, with little hesitation. Such a trend doesn't inspire hope, but perhaps <i>this case today </i>could very well be the start of a <i>new</i> trend. This case could be the turning point if someone chooses it to be. The amount of control someone has to change their future is limited, but that amount of limited control is infinitely greater than the zero control they have to change their past. </div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-28368862487346461982021-07-17T20:30:00.000-04:002021-07-17T20:30:02.019-04:00tossing out changeI've written that people frequently—and perhaps strategically—don't respond to the actual statements made by those who discarded their supernatural beliefs; instead they simply respond to what they <i>guess or wish</i> we'd say. So it's fair for me in turn to put a spotlight on the favorite <i>words</i> of followers of supernatural beliefs. <i>Permanence </i>is an underlying pattern of many of these words: "forever", "never", "timeless", "faithful", "always", "never-ending", "everlasting", "enduring", "solid", "unceasingly", "unfailing" (and "unchanging", needless to say). <div><br /></div><div>Once someone has shaken off this mindset, its weaknesses stand out. The sort of perfect permanence expressed by these words starts to appear so plainly <i>unrealistic</i>. The more that people have examined the universe in depth, the more apparent it is that change itself is fundamental. From the largest collections of matter to the smallest, individual energy fields and particles don't remain in the same positions. 
Even massive stars and black holes are predicted to go through different phases. Mountains erode. Atmospheric gas concentrations shift. Although important quantities are conserved at specific scales, conservation isn't a rule against change. Instead, it restricts which changes can coexist.</div><div><br /></div><div>Someone might laugh and retort that they're still who they've always been. But this is an illusion: human bodies replace individual cells constantly (and also, er, accumulate wrinkles and spots and such). Work is necessary for anything to stay the same. A thing that doesn't appear to change a lot is generally going through changes of maintenance and repair <i>to reverse</i> changes of decay. Like a person on a treadmill, it's running to hold position. Similar metaphorical truths apply to relationship commitments and personalities. People do change over time in how they think and behave, and their goals and preferences can too. Commitments between people are <i>repeated recommitments</i> as they choose to persist regardless of changes in circumstances and personalities. </div><div><br /></div><div>To demand that something <i>never change</i> is to refuse to face something <i>as it really is over time</i>. I'd say that it's beneficial to acknowledge one's own honest reactions about change, whether the reactions are positive or negative. But there is a point at which clinging too much to the past or trying to resurrect it is a waste of the finite time and resources someone has in the present. </div><div><br /></div><div>Change is central and inescapable. So how is it that followers of supernatural beliefs can confidently proclaim that the very real beings and otherworldly realms in their beliefs absolutely never change? From the vantage point of someone on the outside looking in, the claim doesn't seem right. Its characteristics are like a flashing red warning light. Questions are raised. What are these beings and realms made of?
What are the different rules of motion and composition and why are the rules so different? How do the different rules fit into or violate the rules of everything else that's real? And to return to the main point, how is it that these rules work without change being fundamental?</div><div><br /></div><div>Of course, there are well-known things that don't change: concepts and information <i>created and communicated by minds</i>. Mathematical definitions. Written stories and songs (that have been accurately copied and stored). Theoretical abstractions. It's true that some groups of followers are more than willing to agree that the contents of their beliefs exist purely as historical myths that are useful as inspiring metaphors. And some cerebral followers might be willing to state that the contents of their beliefs are no more than the stark conclusions or axioms of a philosophical logical system. </div><div><br /></div><div>However, this comparison isn't a workable solution for many common followers. The more that they say that their preferred deities are as timeless as an integer or a catchy tune, the more that they're placing their deities directly next to human creations...which are understood and preserved through human mental efforts. After all, a supernatural concept can indeed be timeless because, well, a concept can be whatever someone imagines it to be. </div><div><br /></div><div>It doesn't help their case that they regularly gather to <i>insist</i> to themselves and each other that the objects of their beliefs have timeless qualities. If these objects are intrinsically timeless, then outsiders have good reason to wonder why people clearly pour so much mental effort into reinforcing the idea. It's almost as if the timelessness of the concepts is a reflection or projection of the unending <i>craving</i> for timelessness and not a discovered independent reality...
</div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-14419346199505813162021-05-09T15:40:00.002-04:002021-05-09T15:40:36.277-04:00feeling things outIt's obvious that in-depth intellectual debates are often fruitless at convincing people to discard their supernatural beliefs. An analogy for this is attempting to play tennis with someone who responds to a gentle serve by leaping out of the ball's path, picking it up after it lands, and hurling it away from the court ("Hah! You weren't ready for <i>that</i>!"). If pressed, they might state that they follow their beliefs because of how the beliefs make them <i>feel</i>. They aren't pondering what <i>facts</i> they would expect to see if their beliefs were accurate and then judging how closely those expectations are met by the facts on hand. They judge by their emotional expectations: "How could the beliefs I follow be inaccurate when they line up with my emotions so well?"<div><br /></div><div>One answer is that the beliefs have been designed to do this. They took modern form after the tremendous effort of councils and prophets and theologians. Alternative forms with jarring inhuman qualities simply didn't win the arguments. The second answer is that the beliefs have <i>evolved</i> too. The more that they could pull on their followers' heart-strings, the more that they were likely to be accepted, faithfully kept, and passed on. The more that they failed to connect with common human feelings, the more likely they were to fail to compete and then die out. As others have observed, countries without official supernatural beliefs actually boost this evolution because beliefs can lead to mutated competing variants instead of a vigorously enforced monopoly or monoculture.</div><div><br /></div><div>On the other hand, to some degree the ability of supernatural beliefs to match followers' emotions isn't because of the beliefs at all. 
When a belief of any kind is <i>conventional</i>, it's integrated into a context. It plays its particular part. It satisfies needs that the context evokes, so of course it seems fitting. In a context in which family ancestors are heavily revered, a belief that grants lasting afterlife and oversight to those ancestors will "feel right". In a context in which individual freedom is a top value, a belief that emphasizes the importance of a personal conversion decision will "feel right". In a context in which people's hunger for thrilling mystery isn't well-served, a belief in paranormal phenomena will "feel right". </div><div><br /></div><div>Even apart from specific content and favorable contexts, supernatural beliefs tend to have a general advantage for validating sentiment: constant certainty. They provide larger-than-life inspirations and targets that aren't <i>messy</i>. A supremely evil thing merits uncomplicated condemnation, and supreme good merits total adoration. The (alleged) support of an utterly powerful ally is a reason to remain calm despite circumstances. Statements that cannot be false (...or disproven...) give an unbending structure that can encourage trust. Detailed laws about morality offer up categories of acts and thoughts that deserve total disgust. Missions to save the world gratify cravings for purpose. </div><div><br /></div><div>Essentially, while emotions paint huge importance <i>onto</i> normal many-sided existence, supernatural beliefs can directly present an existence that in itself has huge importance and is made out of one-sided ideas. No wonder the diligent gathering of verified <i>impartial</i> knowledge can't fit emotions as well. 
But it's a superb fit for one: the drive to embrace reality <i>as it is</i> and escape the oppressive thoughts that don't.</div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-64637294315726066852021-05-04T11:26:00.000-04:002021-05-04T11:26:07.185-04:00unbalanced<p>I exited slowly from religious belief (though it was common to call your emotional connection to God a "relationship, not a religion"). A major reason I didn't move any faster was because it took time to accept that my entire <i>perspective</i> needed an overhaul. Obviously, year after year, I either learned about or experienced flaws in my former beliefs. But my rigid perspective acted as a comforting <i>frame</i> around the flaws. It was as if my brain automatically placed each flaw under a protective glass dome and calmly set it aside, like a collector of insect specimens. A flaw wasn't treated as anything more than a harmless curiosity to ponder for a little while, perhaps to feebly show that followers don't dodge the hard questions. It wasn't permitted to genuinely disturb one's bedrock assumptions. To the contrary, it was an opportunity to further solidify assumptions by coming up with a rationalization...and of course the rationalization could be crude. It didn't need to be able to persuade anyone else.</p><p>As problematic as this mode of thinking is, some clever people who are stuck in it may offer up a reasonable-sounding justification. "I see the flaws you've mentioned. I don't claim that those are nonexistent or easy to explain. However, I have reasons of my own for <i>why I do</i> follow my beliefs. These two collections of clues are in competition. And that just means that all of us are <i>balanced</i> between thinking that my beliefs are accurate or inaccurate. The fragile balance is like a pencil standing on its eraser or unsharpened end. It could tip either direction. I fall one way, and I follow my beliefs. 
You fall the other way, and you don't. Both you and I can present evidence for our stances, and the evidence we present is more than enough for each of us to be satisfied with opposite conclusions. The only difference is a <i>choice</i> that each of us makes about which collection of evidence speaks to us more as individuals."</p><p>This approach is far more appealing than insults. It's tactful. It doesn't shut down conversation immediately. It encourages mutual tolerance, because no side's evidence is said to be superior. It portrays the sides more like people with differing preferences. By doing so it echoes the common-sense advice that <i>attacking</i> anyone for having other tastes than yours is mean and unsophisticated (yet typical in internet venues). </p><p>Unfortunately, it has a shortcoming that's hard to overlook: it's a deception. The reasons and the flaws that anyone happens to count as "evidence" aren't all on equal footing. A supposed clue, whether it's a reason to accept an idea or a flaw in it, must do more than <i>exist.</i> It must be <i>weighed </i>in order to see how much it deserves to affect the total balance. This weighing refers to asking essential questions about the clue and facing the answers fearlessly: </p><p></p><ul style="text-align: left;"><li><i>How</i> was it obtained? A story about an event in someone's life is vulnerable at first to incorrect interpretations of sensations, then vulnerable to revisionist human memory, and finally vulnerable to the gap between the storyteller's intended meaning and the listener's understanding. Another source of clues is what someone "feels" to be right, but the motivations underneath this feeling need to be examined. It's very possible that the clue is <i>really</i> an expression of the feeler's deep wishes, thoughtless instincts, or narrow preconceptions. The thoughts that make people relax or make them queasy vary greatly between cultures and time periods. 
Statements about how things are "meant to be" are actually prone to widespread disagreement and evolution.</li><li><i>How</i> is its accuracy determined? Until a statement has been checked and supported, its level of accuracy cannot be assumed. After all, it takes extremely little effort to spit out an inaccurate statement. The expectation should be that accurate statements are rarer, and any given statement is more likely to belong in the inaccurate pile. And the clue's accuracy could have limits because the methods for obtaining the clue have limits. Nobody should ever be penalized for forthrightly stating that a clue's accuracy isn't absolute! </li><li>What is the context it came from? A clue is less convincing if it emerged from someone whose consciousness was in an abnormal state. Brains in abnormal states are known to produce total fantasies and display impaired judgment.</li><li>Is it self-consistent? A clue is less trustworthy when it can't be reliably reproduced, or is frequently found among facts that throw doubt upon it ("just...overlook all the times when that other thing happens!"), or even contains a logical contradiction. </li><li>Is it plausible? A clue requires a greater amount of support if it demands a perfect coincidence, a sequence of improbable events, or a thinly stretched thread of arguments that have few observable connections to, well, anything. For example, explaining a dubious claim's lack of effect on reality with a <i>dubious</i> <i>excuse</i> doesn't inspire confidence.</li><li>Is it distinctive? A clue doesn't speak for itself. The more effort someone needs to put in to clarify how a clue can be viewed as a significant contribution to their side, the weaker the clue. Or, if it could easily be twisted to support multiple points of view, then it's not an excellent contribution to any single one. That said, ambiguity isn't necessarily avoidable—we live in complex realities where ambiguities abound. </li><li>Is the source credible? 
Someone earns more consideration if they avoid sloppy generalizations, go into gory details about the <i>work</i> they did to find the clue, and have shown that they value truth more than "winning". If they have a habit of pompously spewing whatever fiction they have a "gut feeling" about or whatever statement will benefit them if a sufficiently large group eats it up, then it's best to cover one's ears and eyes and sprint away. Communicating with such a person is futile. </li><li>Does it clash with clues that carry <i>a lot</i> of weight? Real clues coexist in harmony with other real clues. When an idea is well-verified, it should prompt questions about the clues that don't fit with it. When one fresh clue disagrees with an idea that's been confirmed repeatedly, the idea isn't the one in immediate danger of being thrown out; the clue is.</li></ul><p></p><p>After the clues have been sorted by "weight", the perception that the sides are well-balanced quickly falls apart. Instead it's easier to realize that each side's ability to endlessly <i>suggest</i> clues doesn't lead to an evenly matched contest. Quality matters. A little mound of many wafer-thin clues isn't enough when the clues on the other side are much more substantial. In fact the outcome is a pronounced tilt, not a balance. </p><p>This shift to analyzing clues more closely happened at larger scales than individuals' philosophies. It took place in a variety of subjects and had massive effects. Theories about the motions of physical objects were tied to measured experiments and mathematics, not creative philosophizing. Chemical reactions were openly shared instead of performed in hidden rooms and written in arcane books. Historians distinguished between primary and secondary sources and remembered that people are often biased or enjoy telling tales that have been...dramatized. Medical treatments were thoroughly vetted in large trials. Journalists adopted standards instead of spreading unverified rumors. 
</p><p>At the same time, not everyone appreciates the value of this weighing. A tilt toward one side might not impress such a person at all. They may blatantly choose to <i>override</i> the tilt and follow the beliefs they want. That's not what I did after I learned to weigh clues honestly. However, I'm grateful when anyone at least <i>admits</i> it rather than bluffing that their beliefs are backed by equivalent evidence.</p><p><b>Postscript:</b> There's a position on the materialistic naturalism side that also appears to assert that evidence itself hasn't resolved the debate. "I haven't yet been presented with sufficient reasons to think that there's a supernatural realm. Meanwhile, as a general principle it's impossible to prove that anything definitely <i>doesn't</i> exist. Therefore, I still need to be alert for the required evidence if it does arrive. Until then, I'll be neutral, which in practice means that I won't prematurely speculate that there's a supernatural realm."</p><p>This second case doesn't arouse the same antagonism from me because it has a crucial difference. It assigns the burden of proof appropriately. It recognizes that, echoing the comments above about accuracy, falsehoods vastly outnumber truths. There's a sea of <i>incompatible</i> possibilities. So most of the group <i>must</i> turn out to be false most of the time. Ideas aren't scarce. Each one should need to <i>earn</i> more attention than the numerous others available. Opting to not hastily accept a belief is wise, if one's goal is to accept as few falsehoods as one can. Simply put, skepticism is a shrewd default. </p><p>Another characteristic of this difference is the complexity of the belief that's followed or declined. In the first case, the belief being followed consists of a whole <i>system</i> of interrelated concepts, not to mention a stack of astounding stories. That's a lot to go along with. 
It's a stretch to argue that there are enough clues to furnish satisfying rationales for <i>every</i> concept in the system. The decision that's presented is an extensive one, as if the only two options are not believing at all or believing in A+B+C+D+E+F+G+H... </p><p>In the second case, the belief being declined is relatively minimal. The two options are logical opposites; there is or isn't a supernatural realm. Furthermore, "supernatural realm" is left vague on purpose. Ideas about what's in the realm or the rules by which it operates are <i>separate</i> from bare existence. The second case acknowledges that convincing someone of the realm's existence is step one of many. Once someone has demonstrated that a First-Cause god exists or that human souls exist, <i>then</i> the tasks of demonstrating the myriad details of a particular belief system come next—keeping in mind that these details differ dramatically in the huge range of belief systems.</p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-13034678137438360252021-04-05T19:28:00.000-04:002021-04-05T19:28:03.055-04:00reality is in the details<p>The saying goes that the devil is in the details. But I would add that <i>reality</i> is in the details. This is borne out by day to day existence. By contrast, <i>thoughts</i> can leave out details—in fact this is an essential strategy for sketching out the basic outlines of a complex whole without getting lost partway through. Details cannot be left out forever, though. When thoughts are confronted by the reality of the universe outside one's skull, the overlooked details often take revenge. Hasty plans are ruined. Naive hopes are dashed. Again and again, the details of reality are the tests that eliminate flimsy thoughts. 
The thoughts that mesh the best with reality keep the details in.</p><p>This reasonable and experienced point of view is the opposite of the frequent advice to not "get hung up on the details". Offering that innocent-seeming advice can quickly deflect any question...rather than the ordeal of attempting an answer. Even so, the above comments show why it's inherently flawed. It's not as convincing as intended. For the more that someone is nonchalant about the details of the position they're arguing for, the more they deepen the impression that their position is <i>independent of reality</i>. </p><p>This isn't an insult; it's simply the immediate consequence. Real things have many nonnegotiable details in many categories: characteristics, histories, forms, appearances, patterns, locations, interactions with other things, etc. Furthermore, the thing's details are the exact "handles" for grasping its existence. The different existences (substances) of, say, a marshmallow and a bowling ball, are observed through differing details. The more that details don't matter, the more that the nature of a thing appears to be closer to that of a hazy thought than a solid reality.</p><p>(Naturally, there are all sorts of creative and nuanced versions of deities that embrace the quality of being more like human thoughts than real things. In these, "God" isn't an anthropomorphic mind that ponders and takes action. It exists more like an abstract ideal such as Order or Oneness. I'd guess that most who say "don't get hung up on the details" aren't going that route.)</p><p>On the other hand, the advice to drop the details could have several pluses. First, it's probably an <i>earnest</i> reflection of how the speaker really treats their own beliefs. It isn't a ploy. It isn't someone pretending that their beliefs are a logically constructed system of propositions. It isn't the pretense that they arrived at their beliefs after carrying out a long intellectual study. 
Instead, it's what someone would say after they've <i>already decided</i> what to think. When someone is committed to a stance, they don't need to know the details. They've already signed up, so they're uninterested in double-checking the fine print. Unsurprisingly this attitude is far less appealing to someone who <i>isn't</i> using a belief as a basic assumption or granting it "the benefit of the doubt"—when a belief is under the magnifying glass, the full details do make a difference.</p><p>The second plus of dropping the details is that it's diplomatic. The fewer details that someone insists on, the easier it is to potentially reach a common ground, and the less hardheaded they appear to be. Yet once more this quality has an inherent flaw. If the details of a real thing can easily be viewed differently by different people, then that thing suspiciously resembles <i>subjective</i> thoughts: preferences, wishes, fantasies. Wording is crucial. To say that a detail can be whatever <i>you like</i> or however <i>you see it</i> is to undercut the objectivity of the thing the detail applies to. If the details of a thing can be poured into people's minds, and the details expand to fill each mind's unique shape, then that thing must not have much of a shape of its own. It's worth remembering that these "details" aren't mere interpretations of a thing but its fundamental attributes.</p><p>Admittedly, this result can be dodged with enough imagination. <i>Retreating</i> into paradoxes and mysticism has a long tradition. Religious followers could opt to make themselves very slippery indeed. They may bluntly claim the equal accuracy of many contradictory details. (They may need to do this anyway after they've been forced to harmonize incompatible doctrines.) The key is to propose that a thing is so <i>special</i> that contradictions are united in it: it's too huge or beyond understanding or many-sided. Its form of "realness" is unlike normal realness. 
It doesn't obey the usual rules. It can't be analyzed or translated into language. This time around, the inherent flaw is so blatant that it hardly needs stating: this level of specialness amounts to the demand to remove the thing from the dangerous realm of logic and debate. The advice to give up on asking a particular question, i.e. not get hung up on the details, mutates into the more drastic advice to give up on the entire <i>mode of thought</i> that the question sprang out of. </p><p>In effect, the suggestion is to compartmentalize the beliefs and apply a lower standard. The scary aspect of this suggestion isn't its strangeness; it's the complete <i>ordinariness </i>of it. Like the brain's cerebral cortex coexisting with the amygdala, the deliberative frame of mind coexists with competing frames of mind that operate along different lines. It consumes more conscious attention and develops at a slower rate. Extracting, collecting, and judging details is harder than accepting a detail-free statement at face value. It's also more intuitive for some personalities than others. These challenges highlight the <i>preciousness</i> of plain details in the search for objective reality. The alternative can't compete: a pile of superficial and/or ambiguous decrees made by authorities who cannot be contested.</p><p>The final plus of the advice to drop the details is that it might be a sign of the faint level of loyalty that the religious follower has. And that would be the happiest outcome—from my perspective. If a follower treats their "belief" as nothing more than a creation of their native culture, then of course the details aren't of vital importance to them. (Anthropologists and historians know that putting <i>belief</i> at the center of religious practice isn't a universal or constant social norm anyway.) They may openly state that they pick out the bits that give them inspiration or reassurance and discard the rest. 
Or they may value their belief purely as raw material for drawing analogies. Given that it doesn't rule them or impair their comprehension, some may cheerily consider themselves quite "secular" otherwise...perhaps to the point of declaring "I'm an atheistic ______ ." Whether they're idly repeating words that mean nothing to them, or undergoing empty rituals to <i>feel</i> connected to their traditions, we de-converted tend to let them be. They might even concede that they too would want the beliefs to firmly stand upon bold details—but only if they were trying to equate the beliefs with reality in the first place!<br /></p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-38976207232427103582021-03-27T08:28:00.000-04:002021-03-27T08:28:17.433-04:00no, personal gain is not the main pointSoon after finishing the previous post, I found myself easily anticipating one of the reactions it could provoke. Clearly I'm able to slide right into that well-worn supernatural mindset like a pair of old loafers. The reaction I imagine comes out sounding like, "Pfft. I can see what happened to you. You think that you became disillusioned, but you really were discouraged. You weren't showered with blessings and floating on a cloud, so then you retaliated by throwing your arms up and storming out. God didn't march to the beat of your drum. The cost of living a virtuous life wasn't adequately paying off, so you cut your losses. Mysterious doctrines weren't presented to you in an obvious manner that you could test through experiments. What a shame that religion wasn't simple and easy enough to fit your egocentric demands."<div><br /></div><div>This reaction shows something about how religious followers approach belief: it's a matter of <i>character</i> not analysis (in fact, they may advise someone to stop thinking so hard and "choose to just believe"). 
They're eager to shift blame from the content of the belief to the vices of the unbeliever. They're content to assume that <i>personal gain</i> drives people to dismiss their beliefs. </div><div><br /></div><div>Moreover, the flaw of greediness fits their ingrained preconceptions about everybody outside their group. <i>Of course,</i> they say to themselves, it's only natural for someone who's not living by the light of truth to view beliefs as a means instead of an end. Reasoning too much about the beliefs' outcomes is nothing more than fixating on what someone can <i>get</i> by having the beliefs. It indicates that someone is on the wrong track entirely. That's the secret of how anyone who once said they believed could go on to fail to be convinced by the beliefs' accuracy.</div><div><br /></div><div>Unfortunately, in addition to oddly shaming the de-converted for the error of taking their beliefs' claims too <i>seriously</i>, this reaction misses the main point. The aim of contrasting the list of grandiose claims to the claims' tight restrictions isn't to whine about how little the claims amount to in practical terms. It's to thoroughly establish the pattern of the restrictions: each one is verrrrrrrry similar to the kinds of restrictions that would be necessary for beliefs that spring out of cognitive biases and communal/ritual reinforcement. True, the restricted claims are still about overlaps between reality and the supernatural realm...but the overlaps are so curiously subtle that someone might reasonably suppose that the overlaps aren't there at all. Or the overlaps are dependent on the lenient mindset of the believer or on the thoughts and actions of other people who conform to the beliefs—thereby <i>making</i> the beliefs real through human rather than divine intervention. 
</div><div><br /></div><div>The actual observed fulfillments of the amazing claims fall short because they don't provide <i>objective signs of the supernatural</i>, not because they're minor ("thank you for a close parking space, ruler of the cosmos"). A supernatural realm of wondrous claims that merely touches ordinary life in ordinary ways is hard to distinguish from, well, ordinary religions that have been created by humanity for millennia. Some have placed greater emphasis on what the religion supposedly does for you, and some have placed greater emphasis on what you must do without expecting a lot in return. Neither strategy successfully stands up to scrutiny.</div>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-52622033480562471892021-03-21T19:14:00.000-04:002021-03-21T19:14:23.122-04:00restrictions may apply to 15 claims<p>Every once in a while I'm abruptly reminded—accidentally—of the vast differences between my materialistic naturalism and the supernatural beliefs followed by many people I know...and by me too in the past. Of course, the size of this gulf doesn't imply that I crossed it with one stupendous leap. My long journey was a sequence of one small step after another. The change was so gradual that at the end it took some additional self-evaluation to simply <i>realize</i> where I'd come to.</p><p>It's rare for most religious followers to closely examine us de-converted people (and listen to what we plainly say). They're much more apt to think of total outsiders as their opponents. As they see it, their beliefs are unbeatable. Anyone with a thorough understanding would be convinced. Logically then, in the case of anyone who <i>isn't</i> convinced, the conclusion is that their understanding is faulty or incomplete. Total outsiders are opposed because they don't grasp the full truth and the whole story. 
They would be followers too if only the message were communicated in the right terms and then they surrendered to its charms. </p><p>Just by existing, de-converted insiders derail this line of reasoning. We decline to follow the beliefs that we were regularly taught for years and years. We not only learned but <i>practiced</i> the beliefs within a community, so our view is neither second-hand nor shallow. We were committed, yet we dropped that commitment after it didn't survive further consideration and self-honesty. Greater familiarity wasn't enough to keep us content. It contributed to our ultimate disbelief! We saw up-close that <i>restrictions may apply</i> to the numerous claims that we heard (sometimes via artistic forms). A brisk and incomplete list of the restrictions will emphasize that, while no single restriction to a claim could be convincing enough to overturn someone's core mentality, the sheer number piles up too high to be ignored forever. </p><p></p><ul style="text-align: left;"><li>Claim: God will never abandon you. Restriction may apply: Not only may you never see any concrete sign that God continues to be a personal companion of yours, there may not have been any concrete sign that it <i>ever</i> was.</li><li>Claim: Make bold petitions to God and whatever you ask will be given to you. Restriction may apply: Your request might be ridiculous or premature, so it will be rejected for good reason. Or it might not fit into the grand unknown plan of the universe; after all, every mortal has "their proper time" to succumb to death. No matter what, you'll be left guessing about what God's reaction to the request actually was. </li><li>Claim: Your beliefs will give you joy in the midst of life's troubles. 
Restriction may apply: For the joy to reliably trigger, you might need to spend an extended time training your brain to reflexively obsess about an almighty being, whose smile you can't see, or perhaps the reward of an afterlife, which you cannot see for yourself beforehand.</li><li>Claim: Jesus was an idealized version of you. Restriction may apply: <i>Anyone</i> who lived at that time and place, and raised in a highly different culture, didn't significantly resemble you in behavior, appearance, or general outlook.</li><li>Claim: God's activity in human affairs will be obvious to you. Restriction may apply: Seeing God's caring intervention everywhere you look will depend on the mental lens that you view events through. By approaching every situation with high expectations, the smallest clue that might be construed as God's fingerprint can be magnified into solid evidence.</li><li>Claim: God will heal the sick. Restriction may apply: Sicknesses that are vulnerable to skilled physicians will be healed by their hard work. Of course, God can still have been assumed to play an unseen role in that...somehow. </li><li>Claim: God controls everything. Restriction may apply: Tragedies with no apparent meaningfulness will happen. Societies will be governed by oppressors. Religious organizations will be led by people who can be motivated in petty ways. </li><li>Claim: There is one set of religious beliefs that's genuine. Restriction may apply: Religious beliefs have multiplied into a bewildering variety of sets, and subsets, and subsets of subsets. Each one contains one or more details in contradiction with the others and there's no objective method to decide among the group.</li><li>Claim: The single firm foundation for ethics is the unchanging commands of God. Restriction may apply: The moral stances of the commands might seem outdated. Specific examples might require contorted interpretations that attempt to explain the <i>intended</i> moral for the modern age. 
Alternatively, the multiple commands might be exchanged for the simplistic sentiment, "Maybe just try caring about someone else for a change, huh? If you do that then feel free to pretend these other commands are unimportant relics and make up the rest for yourself. Always let your conscience be your guide."</li><li>Claim: Every wrongdoing is forgivable thanks to the <i>unearned</i> mercy of God. Restriction may apply: Forgiveness is granted through the arduous task of being a bona fide follower, rather than a mere pretender who parrots the right words. This task demands the giving of time, money, and effort. Sincerity is a prerequisite. Obedience is not sufficient; you must <i>love</i> your spirit lord. </li><li>Claim: God will provide inner strength to do right instead of wrong. Restriction may apply: Inner strength is something you must laboriously cultivate by thinking regretfully about your past evil actions (confessing helps with that), growing accustomed to strictly policing your spontaneous thoughts, and choosing a new duller lifestyle that prevents common temptations. As with followers of <i>any</i> ideology, the human flair for compartmentalization might lead to the outcome that someone has extreme inner strength for some repulsive evils and yet they have zero resistance to their favorite evils.</li><li>Claim: Official religious documents are accurate and as trustworthy as God. Restriction may apply: Careful and impartial investigation of the universe might appear to strongly disagree with official religious documents. The disagreement can be overlooked as long as the corresponding parts of the documents are classified as <i>metaphorical and poetic.</i> Fortunately, if the documents descended from legends, the original writers probably would've readily admitted that they couldn't know for sure whether the legends had been embellished in countless retellings. 
In addition, the original process of deciding which documents became official might have been a difficult struggle between competing followers, some of whom would've said that the documents that didn't become official were more accurate. </li><li>Claim: Divine guidance will be provided when someone earnestly seeks it. Restriction may apply: The process of seeking might be a demanding one of prolonged prayer sessions and fasting. The guidance received might be minimal or a nonverbal feeling of "peace" about a tricky decision. Experienced followers will warn that divine guidance should be compared with other sources such as the aforementioned official religious documents, or someone's peers and authorities. As with every claim, the fulfillment of it might come from one or more people who are being moved around like uninformed <i>pawns</i> on God's massive four-dimensional chessboard.<br /></li><li>Claim: God's presence is sensed directly during times of mass singing or praying or times of quiet contemplative solitude. Restriction may apply: The ease of sensing God's presence might vary depending on the follower's imagination, suggestibility, personality, mood, skill of visualization, and the level of distraction in their environment. It might be necessary to brush aside the fact that people can enter similar emotional states in nonreligious circumstances. Consistently getting the best results might require associating the psychological state with symbolic external cues, in the manner that Pavlov documented.</li><li>Claim: The part of people that provides a sense of identity and makes decisions is a nonphysical soul that persists after the body stops functioning. Restriction may apply: No satisfying answer will be offered to the classic philosophical question of how to strictly define the boundaries and interactions between the nonphysical and physical domains. Meanwhile, experts in all areas except theology manage perfectly well without the concept of a soul. 
(A more interesting tactic would be to reject this claim and argue the doctrine that the whole bodies of the faithful dead will be resurrected/restored prior to entering heaven in the unspecified future.)</li></ul><p></p><p>I'm aware that none of this is revolutionary news. And it's most clearly applicable to my former culture of typical U. S. evangelical Protestantism, a religious category that's losing relevance daily through its own efforts (albeit not without the political equivalent of a kicking and screaming tantrum). Nevertheless, I have faith in the worthwhileness of a passing reminder about exactly why the "sales pitch" wore thin for many of us and still occasionally grates on us now.</p>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-66050768991062521262020-06-09T19:25:00.000-04:002020-06-09T19:25:00.629-04:00unwinnable conditions in the game of defining free willThe sad truth is that games can slip into <i>unwinnable</i> <i>conditions</i>: game situations that cannot lead to an eventual victory. Many one-player card games routinely become unwinnable through an unlucky order of cards in a stack...or short-sighted mistakes in how the player moved the cards around. Some two-player games with multiple pieces, such as chess and checkers, become unwinnable by either player once they're left with next to nothing on the board. <a href="https://en.wikipedia.org/wiki/Adventure_game" target="_blank">Adventure-style computer games</a>, which depend on collecting and cleverly using items to escape danger, become unwinnable through losing (or overlooking) items that <i>will be</i> surprisingly crucial to survival.<br />
<div>
<br /></div>
<div>
Philosophical debates have a tendency to become unwinnable too. This happens when precious yet poorly defined ideas are somehow expected to meet clashing requirements. The result is that the corresponding debates cannot be <i>won</i>, if "winning" means hammering out a self-consistent version of the idea that's satisfactory to everyone in the debates. But the proper response isn't giving up. It's returning the debate to a winnable condition by dropping or reshaping some requirements. </div>
<div>
<br /></div>
<div>
The idea of <i>free will</i> is a prime example, since discussions about it frequently end in stalemates. And it's particularly appropriate here because game-playing itself is an analogy for it. Regardless of how relatively trivial a game might be, the players carry out the widely recognized aspects of free will. They analyze the circumstances, then select from a set of actions to attain goals. (Of course, topics have been usefully compared to games for a long time. Wittgenstein described language games. Game theory is a branch of mathematics that's constantly referenced in a range of contexts.)</div>
<div>
<br /></div>
<div>Some common unwinnable conditions of the effort to define free will are worth examining. To echo the statement from earlier, the aim of doing so isn't to abandon free will but to pinpoint one or more troublesome requirements. The most fundamental of them should be attacked first: that free will needs to be capable of violating the physical laws that are followed by things that don't have free will. In a word it needs to be unabashedly <i>miraculous</i>. The equivalent in gameplay is spontaneously disobeying the game's rules—which is not the same as when all players agree upfront to slight modifications.<br />
<br />
This places the debate into an unwinnable condition because the best evidence available <i>simply hasn't</i> uncovered this kind of violation. <i>At the scale</i> of the person-centered decisions customarily placed in the category of free will, the rules are firm and uncontroversial. Atoms aren't created or completely destroyed, although some decay. Energy only changes form, and it tends to become more diluted when it does. Velocity stays the same unless something acts to change it—including a velocity of zero. Objects cannot be accelerated to the speed of light. Opposite electric charges attract. If free will doesn't follow the known rules or patterns, then it cannot be reasonably integrated into the <i>rest</i> of reality. And presumably it could contradict any precise definition assigned to it.<br />
<br />A believable idea of free will should stay within the same broad boundaries that allow for the behavior of galaxies, continents, algae, clouds, platypi, etc. Furthermore, there's an upside: reliable consistency. This is crucial for effective short-term decisions and also long-term plans. Acquired items don't vanish. Notebook pages filled with ink don't switch back to blankness. Projectiles descend at an expected speed. Food temperature doesn't suddenly diverge from the environment it's in. Arguably, without impartial rules to govern the consequences of actions, free will wouldn't be worth a lot in practice. The rules are tools as well as limits. People are participants in rule-governed existence, not spectators.</div>
<div>
<br /></div>
<div>Admittedly, committing crimes against physical laws is extreme. Most people probably don't lump together free will and sorcery. By comparison, a seemingly moderate compromise is imagining the power to <i>just disconnect</i> from potential influences. Through the application of this power, decisions could be rendered perfectly independent, separated from external things, internal pressures, or the past. Decisions in this state could <i>not possibly</i> be manipulated by anything.</div><div><br /></div>
<div>Nevertheless, this requirement's drastic solution has two problems. The first problem is that it leaves decision-making with a hollow core. It's certainly good that nothing can indirectly yank around decisions like pulling the strings of a puppet. But if the decision isn't determined by anything, then it's uninformed. It's cut off from the context that provides meaning to <i>why</i> action A was selected instead of B. If it's made for no deeper reason, then it's due more to chance than thoughtfulness. High unpredictability isn't synonymous with willful freedom. The problem isn't eliminated by merely using a fancy source of randomness, such as the precise probabilities of quantum mechanics. A heavily randomized decision is still the opposite of taking responsibility. It's more like the game act of rolling dice than the act of choosing to skip an optional dice roll or not. (Or it's like flipping a coin to choose to roll the dice?...)<br />
<br />The second problem with free will hinging on the power to disconnect is the sheer <i>implausibility</i>. Modern knowledge shows, in addition to the previously mentioned physical rules at the human scale, a grand web of cause and effect between the multitude of particles and energy fields of the universe. Therefore it's frankly <i>bizarre</i> to picture a tiny strand wandering off whenever the urge strikes. How could it possibly do that? Why should it have such authority over other strands joined to it? Breaking cause and effect at will qualifies as a superpower. Demanding it isn't a winning approach for arguing that free will is realistic.</div><div><br /></div><div>An alternative is therefore necessary. <i>Adaptation</i> fits the role. It's when actions vary based on influences. The definition is vague because the actions fall into a host of categories. Adaptation might consist of taking the influence's impact and neutralizing/dampening, or amplifying, or aggressively countering, or preventing, or redirecting. In general terms it's perception followed by corresponding action. </div><div><br /></div><div>Importantly, adaptation in the vague sense isn't a unique characteristic of human consciousness. It appears in many things of differing complexity. Single-celled organisms react depending on events in their minute but brutal ecosystems. Computer systems jump between Wi-Fi routers. Hedgehogs roll up into balls. Metaphorically, a spring could be said to adapt, because its "actions" vary based on the stress of squeezing. Obviously, its adaptation is far different from, say, people adapting to their circumstances by revising their life insurance. The part that matters is that <i>feedback</i>, the primitive building block of adaptation, isn't unusual or ghostly at all.</div><div><br /></div><div>Moreover, advanced multi-layered adapting can come quite close to gaining effective control over decisions. 
If multiple influences are observed, understood, and consciously managed, then the influences are <i>mastered</i>. No single influence <i>dictates</i> the decision. The decision-maker isn't metaphysically separated from influences, but they don't need to be. All the paths from influences to the decision can be, er, adapted: continuing to exist but now much more complex. Within a person, these adapted paths are literal. The signals from the influences take more complicated routes through the brain's network of cells. </div><div><br /></div><div>For one person, the path from stimulus to rage (or panic or gloom, etc.) could be extremely simple and predictable. For another who has adapted by noticing the first signs of oncoming rage, and who has spent significant time weighing the pluses and minuses of how they act, the path traveled by the stimulus could be twisty and stop in any of several outcomes. For the latter, the "natural" or quicker path hasn't been vaporized by psychic force. It's been circumvented through the development and activation of competing paths. And since everything is material, the process of overriding costs actual energy. It's like two crowds attempting to outshout each other. With this in mind, free will can be more of an ideal to strenuously pursue than an inborn capability. </div><div><br /></div><div>One last feature deserves mention. As a temporary adaptation of behavior or thought is repeated over time, it's reinforced. This is an aspect of how brains evolved to learn. And with enough reinforcement, the corresponding brain path functions as the relatively "natural" or quicker one. So adaptation may become a method of <i>self-change </i>and even a type of <i>self-determination.</i> It has the caveat that it's gradual and difficult. Uttering the magic spell "I am different" doesn't instantly engrave adaptations into the self, any more than uttering a magic spell instantly etches a design into stone. 
It calls for visionary thinking and persistence, a combination which few things are capable of. To become an expert at a game, nothing helps as much as the "work" of playing countless times, making good and bad moves, in order to replace beginner instincts with seasoned ones. </div><div><br /></div><div>Excessive focus on self-governance has a subtle risk. It might evoke a requirement that makes a definition of free will unwinnable: treating <i>denial</i> as too pivotal. To suggest that rejecting an impulse reflects free will, but fulfilling an impulse doesn't, runs into trouble. First, if denial isn't serving the purpose of a desired result, then it's not inherently meaningful. That purpose might be nothing beyond "proving that I'm the sort who doesn't do that" or "getting some enjoyment out of punishing myself", but it's present. Denial isn't a self-sufficient motivation. Second, the statement can be flipped. A decision-maker who's absolutely ruled by denial isn't convincingly freer than a decision-maker who's absolutely ruled by fulfillment. </div><div><br /></div><div>Third, some cases don't involve a strong element of denial in the outcome. Yet the decision-making process that led to fulfillment could be as rigorous as a process that led to denial. In either decision, the influences could've been thoroughly mastered (...or not). Picking an influence to obey can be a free choice. Free will isn't viewing decisions as stark intellectual puzzles, after pushy motivations have been completely drained out. Relative preferences at the very least should be factored in. If someone were playing a game, nobody would maintain that a craving for victory is incompatible with a reasonable style of play.</div><div><br /></div><div>Clearly, a kind of free will can function without dragging along unsupported assumptions. 
That might not stop especially stubborn debaters from insisting on unwinnable conditions anyway, by raising a final requirement that to them seems equally clear. As they would say: for people to be <i>truly</i> free, they must make use of something apart from particles to make decisions. After all, a particle doesn't have the freedom to deviate from the predictions of a skilled onlooker. Predictable particles adding up to a person with unpredictable free will doesn't make common sense.</div><div><br /></div><div>The retort won't surprise anyone who's heard this before. Despite their particles, people are in fact complicated. Particles connected in certain ways <i>can</i> rapidly "add up" to more unpredictability than one in isolation. Once a particle is connected to its surroundings, then predicting the particle depends on predicting the effect the surroundings will have on it...and that means predicting the movements of these surroundings. Then those surroundings in turn cannot be predicted without predicting the <i>outer</i> surroundings that affect the closer surroundings. The mandatory edges of the predictions keep getting pushed farther and farther. </div><div><br /></div><div>For instance, the full motion of a particle in one of the bones in one of the fingers of a person typing an email isn't predictable unless the next letter to be typed is predicted. The next letter depends on the word being typed. If the email is about an amusing memory gleaned from the previous day, then the word being typed depends on the person's mental reconstruction of the past. Or they may jump up from the keyboard after the doorbell rings, and then the motion of the one particle can't have been predicted without also predicting the doorbell ringer. </div><div><br /></div><div>The essential problem is there are too many particles interacting and too many independent motions to track. 
Transforming these inputs into accurate particle predictions would consume an impractical amount of space and time. And the dense simulation might start failing to match reality less than five minutes later, when the person receives a text asking for help...sent from someone on the other side of town. Sure, human free will without something apart from particles <i>could</i> be calculated, but it's pure fantasy. And it's irrelevant next to the immense realm of possible actions for people made of particles, as they sift through all of their intricate concerns and justifications and construct endlessly creative variations. Given that deceptively limited games have turned out to have fascinating depths of strategy, there's no need to worry about feeling restricted by the few real walls around human decisions.</div><div><br /></div><div>(Addendum: A short while ago I read <i>How Physics Makes Us Free </i>by J. T. Ismael. Parts of this blog entry are almost certainly surface-level restatements of ideas I picked up in it. But the book covers interesting topics such as the formation of coherent self-voices, entropy in human experience, and the network model of causes and effects. It also teaches words like diachronic and modal and immanent, with a glossary in the back.)</div>
Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-58551228193911620342019-09-02T09:18:00.000-04:002019-09-02T09:18:57.080-04:00mid-realSome common questions demand surprisingly nuanced answers in practice. For instance: "How real is this thought?" My unremarkable position is that a thought's realness is equal to its connections to actions whose outcomes affect and are affected by realities. These actions may be relatively mental, such as reviewing a mathematical proof, and/or relatively passive, such as observation. But at times the action is more or less the opposite of abstract: perhaps I claim that the thought of a tall, wide, concrete wall on my near left side is real because I can lean over and extend my left arm until I touch it.<br />
<div>
<br /></div>
<div>
This definition can be visualized as <i>semantic distance</i>. How much of a leap is there between the thought and the action outcomes that back it up? Is the thought a bit of a reach...thinly stretched...tenuous? Are there many links in the chain? Is it mostly a plain description or does it sneak in assumptions and speculation? If it's a fruitful substitution of one concept for another, then what qualities are added or subtracted? Depending on the wildly varying answers to these fine-grained considerations, the semantic distance could be like an inch or 500 miles...<i>or in-</i><i>between</i>.<br />
<br />
The consequence is that a thought's realness must also sometimes lie between real and unreal: <i>mid-real</i>. To deny this possibility would be inconsistent. Yet as exotic as it seems, a mid-real thought isn't that unusual. It's not mid-real like a ghost. To start with, it could be an ordinary thought that summarizes, such as the mean age of the people at a family reunion. It's very possible that <i>nobody's</i> age is equal to the mean age of everybody. Nevertheless, it manages to represent the group as a whole through a method that's transparent and clear-cut.<br />
<br />
Someone may point out that summarizing thoughts are obviously real because they're not mere fantasies. The problem with that argument is that not only fantasies but metaphorical communication in general is capable of conveying real meanings; these meanings are simply emotions and analogies rather than bare facts. The dramatic thought "I couldn't take another step" is more expressive of current foot pain frustration than the reality of complete leg exhaustion. The thought is therefore mid-real. Its obvious exaggeration is less connected to the superficial meaning of the sentence than to the communicator's real state of mind. The semantic distance is a longer scenic journey. Grand fictional narratives can be mid-real in a loose sense also. Whether it's a talking animal fable or a dramatization of true events, the accompanying message or theme that it carries is precisely the part of it that's more real.<br />
<br />
However, narrowing focus onto purely literal thoughts isn't a surefire escape from mid-real status. Too many complications crop up. For example, cause and effect don't always run in a straight line; many factors can affect one thing simultaneously. In these situations, the naive thought "my favorite single factor is enough to thoroughly explain the state of the complex thing" is neither strictly real nor strictly unreal. It's just mid-real because the factor is surrounded by others, and its own very real effect is partial. I'm reminded of thoughts about diet correctness. Both a person's body and the items they ingest are highly complex. It's tempting to pick a dietary factor and embrace the thought that it's the <i>one real cause</i> of desirable (...or undesirable) physical welfare. Then the thinker is freed from the burden of admitting that the factor's level of control might be greatly modified by the presence or absence of other factors: how the food is prepared, what else is ingested at the same time, personal sensitivities, how much of the food is eaten, and the time of day. But the topic isn't hopeless. Well-grounded (albeit boring) dietary recommendations abound. One or more minor improvements are better than none.<br />
<br />
While the messiness of realities can lead to certain thoughts being mid-real, the messiness of actions can too. The truth is that actions' outcomes are routinely imperfect. Besides the inherent limits of all feasible actions, there might be some kind of unpredictable interference, or natural degrading of tools (our own aging sense organs qualify), or an outcome that mimics another. No matter the root causes, honesty calls for the language of <i>probability and range</i>. "The reality is 60% likely to be in the narrow range 300 to 800, and 95% likely to be in the wide range 50 to 1200." Regardless of the amount of precision—or the "numbers" being gut feelings—the result is that the connected thoughts are colored mid-real. Fortunately, mid-real thoughts are valuable anyway for planning. Flexible plans can be constructed to succeed based on the whole range. A mid-real all-or-nothing situation is less manageable, but knowledge of the probability is at least good information.<br />
<br />
Lastly, thoughts can be mid-real <i>all by themselves</i>. Some thoughts are essentially mid-real. Due to the thought's construction, the supposedly supporting actions actually can't settle the question of its realness. The issue might stem from the thought being too vague, or self-contradictory, or accommodating of any conceivable circumstance, or demanding circumstances that probably can't ever be achieved. Mid-real thoughts of this kind may be comforting or fun to play with, but allowing them a status beyond mid-real would be inconsistent with the definition applied here. Admittedly, the thoughts' creators and followers may not <i>mean</i> for the thoughts to be mid-real. Maybe they haven't noticed the gradual <i>dilution</i> of the thought they champion; now, for whatever reason, they're continuing to treat the thought the same although it's been emptied of firm assertions. Then again, that might be the characteristic that attracts them. They cannot be shown to be <i>conclusively wrong</i>. The downside is that they cannot be shown to be conclusively right either.<br />
<br />
I realize I'm obligated to confront one particular objection. What if the practical, genuine existence of mid-real thoughts is only a distraction from the main question of realness rather than a valuable clue about which answers are serious? Did I start out by skipping too casually over the terms "thought", "action", "outcome", and of course "realities"? If the subtle difficulties of mid-real thoughts were escaped, what would the ultimate source of realness be? Unfortunately, my response is to concede that I offer no escape. Those terms are indeed circular (interdependent). Thoughts are events in the brain. The judgment of connections between thoughts and actions happens in the brain but so does the judgment of which realities are in harmony with which outcomes. Thoughts are approximations of realities but realities are only represented in the brain by thoughts. ("I'm 70% sure I see the back of Bill's head across the room" is a thought.) The hypothetical actions to execute in order to verify thoughts <i>originate</i> as thoughts. The thoughts can't be infallibly real because the actions need to tie them to realities. Even the realities can't be infallibly real because realities are in the form of sometimes-incorrect thoughts. The actions can't be infallible because of the flaws that have been explained. No single component is enough to be an ultimate source of progress toward realness. The path requires a team effort. Though, on occasion, the teamwork might resemble a melee as realities force thoughts to change or shatter. </div>
<style type="text/css">
p.p1 {margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica}
</style>Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-20572704662234533642019-01-19T10:37:00.002-05:002019-01-19T10:37:56.959-05:00in the eye of the decoderThe saying goes that beauty is in the eye of the beholder. Well, what if a symbol's meaning is in the eye of the decoder?<br />
<br />
Significant philosophical disagreements stem not only from holding differing ideas but also from holding differing <i>degrees of interpretation </i>of the same ideas. That's why beliefs frequently come in strong/hard versions versus weak/soft versions. Surprisingly, the distinction between the strong and weak interpretation of "meaning is in the eye of the decoder" indicates a lot about someone's whole philosophy. If two people are divided on this, they're probably divided on more too. <br />
<div>
<br /></div>
<div>
The weak interpretation should be undeniable: meaning is <i>affected</i> by the decoder fairly often. They're part of the context, and any single symbol's perceived meaning can be greatly modified by whatever context it's in. For instance, the meaning of "boot" to UK listeners can be unlike its meaning to US listeners. Or words that are offensive to one society can be relatively less offensive in another. Or a film that enchants one viewer can be dumbfounding to another; meanwhile its writer and director might have had another intent altogether.</div>
<div>
<br /></div>
<div>
However, in my view—and in the view of countless other thinkers now and in the past—the relationship runs deeper. I side with the stronger interpretation: the meaning <i>is literally in</i> the decoder (...just not in the eye). It's an <i>event</i> of a decoder. This event is the meaning's real essence (locus). And it probably occurs a multitude of times in a multitude of decoders, unless the meaning truly is novel. The corollary is that a symbol without a decoder cannot have meaning.<br />
<br />
<a href="https://en.wikipedia.org/wiki/Linear_B" target="_blank">Linear B</a> is an illuminating case. It's a puzzling and old form of writing that was discovered by archaeologists. It wasn't well-understood at first, and several insights led to success some years afterward. In order to be consistent with my view, I'm forced to assert that all the tablets of Linear B didn't have real meanings for an extended period. Bluntly put, once the original writers/readers all died there <i>weren't</i> meanings for centuries, until the breakthroughs that enabled scholars to translate the symbols. Of course, they had excellent reasons to expect that they <i>could</i> <i>reconstruct</i> coherent meanings eventually. The source markings had the characteristics of language elements; they repeated in lengthy sequences but not in an unchanging or a random pattern.<br />
<br />
Despite how it appears, distinguishing "they found the meaning" and "they reconstructed a meaning" isn't pointless hair-splitting. If someone objects to the idea that the meaning itself <i>exists purely</i> in the decoder at the time of decoding, then they face a deluge of sensible follow-up questions. Where else is the meaning? How is it created and destroyed? Does it move or metamorphose or duplicate? Why can it be perceived differently?<br />
<br />
Because these questions revolve around the <i>metaphysics</i> of the objector's alternative notion of meaning, their answers reveal whatever stuff they prefer to <i>tack onto </i>physical reality. Again, this is a striking contrast to simply accepting that meaning is an event of the decoder. Then everything involved can be matter alone and standard physical phenomena. The meaning consists of the state of the decoder's matter after the task of decoding has changed it.<br />
<br />
In effect, symbols are like the steps to follow to shape the decoder into an internal arrangement that embodies the meaning. At a low level they function like pressures, sometimes quite subtle, on the motions/physics of particular segments. Having the ability to elaborately shift in response to symbols is how something <i>qualifies</i> to be a decoder. Even decoding is <i>transcoding</i>, in which the result's new code is the inner code for meaning used by the decoder's substance.<br />
<br />
The crux, previously mentioned, is that the symbols are matter, the path that the symbols take to the decoder is a path taken by matter or energy, e.g. waves of sound, and the consequences of the received symbols in the decoder happen in matter. This overall picture has obvious appeal to people whose views leave out popular supernatural concepts such as souls and eternal realms. It's relatively less common to stubbornly combine an irreligious stance with a metaphysical understanding of meaning. One possible fusion, which echoes panpsychism, is that matter in general has a "mind property" in addition to its detectable properties.<br />
<br />
On the other hand, regardless of the number of issues avoided when meaning doesn't have an extra-special kind of existence, the important issue of telling apart subjectivity and objectivity becomes a little problematic. How can meaning ever be objective at all if it's the <i>subjects'</i> matter? The challenging answer is that it's certainly not <i>by default</i>. Greater levels of objectivity are <i>progressively earned </i>through diligent work to connect meanings to objects rather than only subjects.<br />
<br />
Thus the meanings that are most objective are precisely those that have been most thoroughly backed by such work. Ideally the full details of the work are then communicated and recorded so that everyone may judge how much objectivity has been earned. The meaning of "the nation of Suriname is north of the nation of Uruguay" is considered highly objective because recent maps are plentiful, trustworthy, and in unwavering consensus. For this reason the action of viewing a South America map verifies that the meaning is tied into objective reality, regardless of how many subjects the meaning is formed in at any moment. If a meaning can be reliably applied in relevant actions, then it's objective enough. It doesn't <i>need</i> a mystical external abode. The metaphorical eye of the decoder suffices.</div>
Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-19651927102114254062018-11-19T22:19:00.001-05:002018-11-19T22:19:51.073-05:00lotto balls and snap judgmentsThe mindfulness meditation craze has led to a few odd misconceptions about its goals. I'm convinced that <i>imprecise wording</i> is partly to blame. Like other subtle mental states, mindfulness can be easier to describe by emphasizing <i>what it's not</i>. So teachers note that it <i>isn't</i> analyzing any one item in depth, and it <i>isn't</i> reacting emotionally—positively or negatively—to any one item. They may refer to the meditator's role as "bare awareness" of the flow of present experience. Apart from re-centering attention when it becomes entangled, the only thought process is receptive observation. And the only attitude is undisturbed neutrality. Unfortunately, enthusiastic beginners may confuse these instructions' overall intent. They may jump to the conclusion that these are absolute laws for a mindful life: thinking and judging are no-nos.<br />
<br />
But the slightest investigation into Buddhist tradition reveals that mindfulness meditation coexists with strong reverence for reasoning and morality. Obviously, wisdom and right living are values of equal or greater importance. Given this context, the proper aim <i>can't</i> be to extinguish every form of mental activity that supposedly falls into the vague categories of "thinking" or "judging".<br />
<br />
As I understand it, the target of mindfulness is more specific: <i>lotto ball</i> thinking. A lottery's mixing machines rapidly churn masses of numbered balls until several emerge from the chaos in single file. Brains are like mixing machines, but the objects are nerve impulses that travel and combine and fizzle out along myriad pathways. A deluge of signals comes in from the senses and the body itself. It's no surprise that surprising thoughts pop out of the cerebral cortex's vortex like lotto balls, sometimes for seemingly little reason.<br />
<br />
Meditation helps people to recognize the existence, the extent, and the nature of lotto ball thoughts. However, I for one wouldn't claim that lotto ball thoughts are <i>inherently</i> useless or harmful. Some are flashes of creative brilliance. Even the most distasteful could be beneficial for discovering uncomfortable truths about self-destructive thought patterns.<br />
<br />
The point is the ongoing insight that lotto ball thoughts <i>just happen naturally</i>. Meditation is practice for seeing them for what they are and then responding to them deliberately and productively. Without it, lotto ball thoughts have a greater chance of pushing and pulling the thinker in various contradictory directions, prompting them to develop a sour mood, encouraging their selfishness, distracting them with trivia, etc. The tyranny of lotto ball thoughts is the enemy; thinking isn't.<br />
<br />
Similarly, I'd argue that mindfulness has the specific target of <i>snap</i> judgments. Snap judgments are either instinctive or embedded by long-term associations. They follow immediately on the heels of the judged item. They're so basic that writers frequently call them attractions or aversions. They're perceived as powerful because they're raw and deeply rooted. Needless to say, they're not obligated to make much sense.<br />
<br />
Once more, I for one wouldn't claim that the lesson to be learned is that snap judgments are <i>inherently</i> incorrect. (Did I mention that I'm neither a mainstream nor secular Buddhist?) Instead, meditation is for practicing the often difficult task of refusing to act on these snap judgments or to dwell/ruminate on them. Then these will gradually fade, like the electricity-saving screen of a device that's merely stared at rather than interacted with.<br />
<br />
The effort to defang snap judgments doesn't mean pretending that pleasant experiences aren't pleasant or that awful experiences aren't awful. It doesn't mean indifference about the world and society. Arguably, it <i>enables</i> sophisticated and coherent judgments to take the place of error-prone gut feelings. At the same time it assists with effectively <i>carrying out</i> whatever principles have been chosen. High-minded ideals can hardly be followed while someone is really ruled by the whims of the moment.<br />
<br />
I hope it's clear that mindfulness isn't an alternative to rationality or a moral center. It's an admirable tool for pursuing both!Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-30884343725155759792018-05-18T22:24:00.001-04:002018-05-18T22:24:49.078-04:00the column of sievesIn the midst of controversial debates, it's too common for either side to demand a "smoking gun" proof from the other. They insist on an item of evidence that's easily understood, beyond all doubt, and strikingly dramatic. They need to be <i>impressed</i> before they'll budge an inch. (Of course, whether or not their demand is sincere is a separate question.)<br />
<br />
The strategy works in the heat of a debate because grand indisputable tests are relatively rare. Tests that are actually <i>workable</i> tend to have built-in limitations. Ethical reports of such tests attach sober <i>probabilities</i> to the corresponding conclusions. Imperfection is normal. Tests have holes. The genuine reality could "slip through". A favorable result <i>might</i> still be wrong.<br />
<br />
Nevertheless, the big picture is more hopeful. Uninformed people often fail to grasp the total value of an <i>array</i> of imperfect tests. Although one test has known weaknesses in <i>isolation</i>, combining it with other tests can make a vast difference. If one test's holes are like the holes in a sieve (...a sieve with fewer holes than a real sieve would have...), then an array of favorable tests is like a line of sieves arranged in a column. Through this arrangement, anything that slipped through one sieve's holes would probably not slip through another's as well.<br />
<br />
The rules of probability correspond to this analogy. As long as the tests are statistically <i>independent</i>, i.e. running one test doesn't affect another, the likelihood is very low that <i>all</i> the favorable results of the array of tests are simultaneously false. Consider five successful tests, each with a one in six chance of being a false success. Having every success be false would be comparable to rolling a die five times and getting a six every time.<br />
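To make the arithmetic concrete, here's a minimal sketch. The per-test error rates are invented for illustration, matching the one-in-six example above:

```python
# Probability that ALL of several independent favorable tests are
# simultaneously false. Independence lets the per-test error rates
# simply be multiplied together.
error_rates = [1/6] * 5  # five tests, each with a 1-in-6 chance of a false success

all_false = 1.0
for p in error_rates:
    all_false *= p

print(f"Chance that every success is false: {all_false:.5f}")
# (1/6)**5, i.e. about 1 in 7776 -- like rolling five sixes in a row
```

Note that the bare multiplication is only valid under the independence assumption described above; tests sharing "identical holes" would violate it, which is exactly why that objection matters.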
<br />
Unfortunately, the persuasiveness of an array of tests is a much more difficult <i>story</i> to tell. The focus is abstract and awkward. In place of a lone heroic test to admire, there's a team of mediocrity. It's broad and messy, like reality. Sleek certainty is easier to find in fantasies.<br />
<br />
And the situation is generally less clear than this simplified portrayal. The mathematics are more complex and the conclusions more arguable. Perhaps the test results aren't all successes. Or opponents may suggest that the probability that a test is wrong should be greater; they may assert that it's far more flawed than it was said to be. Or, if they're more subtle, they may try to claim that the tests in the array just have identical holes ("pervasive blind spots"), so multiple tests aren't an improvement over one.<br />
<br />
Attacks on the details would be progress compared to the alternative, though. Better that than a childish sweeping rejection of tests that can't be faultless. Perhaps the usual tests are suspect, but the recommendation is to round up all these usual suspect tests anyway. Four honest approximations reached through unconcealed methods deserve more credit than an "ultimate answer" reached through methods which are unknown or secret or dictatorial or unrepeatable.Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-82351612859023922082018-02-27T22:30:00.001-05:002018-02-27T22:30:07.965-05:00verifiable definitions for everybodyIt's a frequent mistake to try to elevate one culture's notions into universal laws. That's why I don't mind considering the charge that the philosophical stance I encourage is a "purely Western invention". If it had no chance at broader relevance or appeal, I'd rather know than not.<br />
<br />
Nevertheless, at least the core principle isn't a Western-only concept. It's binding an idea's meaning and accuracy to the "shadow" it should cast on outcomes, i.e. realities which are exposed via human actions. (These actions might be mental and/or passive, such as calculating or unbiased observing.) The important instructions to follow this principle properly aren't especially Western either: to be <i>diligent, honest, and fair</i> throughout the tasks of gathering up and evaluating the outcomes that <i>supposedly</i> reveal the corresponding idea's meaning/accuracy; to be alert to any outcomes that are equally supportive, or <i>more</i> supportive, of <i>competing</i> ideas; and most difficult of all, to recognize the <i>absence</i> of the idea's shadow on outcomes that really should've been affected <i>if</i> the idea were meaningful and accurate.<br />
<br />
I think this stance is widely applicable because of the widely occurring human <i>problems</i> that it was created to address. When one person is trying to narrowly determine what another person is talking about, comparing the idea to specific actions and outcomes helps. "I'm talking about the position you would reach by traveling to latitude and longitude coordinates X and Y in decimal degrees...or the position you would see on a map by finding the intersection of them." A second problem might be trying to decide whether another person is using differing ideas to express a meaning that's close to identical. A third problem might be trying to estimate an idea's overall feasibility by estimating the feasibility of the actions associated with it. A fourth problem might be trying to deflate another person's deception (or ignorant self-delusion) through noticing that their idea lacks adequate rationales that anyone can check. Wherever and whenever there are groupings of people whose symbolic communication empowers them to refer in detail to things that <i>aren't here right now</i>, they're empowered to carelessly refer in detail to things that <i>aren't so</i>.<br />
<br />
Actually, speaking from where I sit, it's debatable how<i> widely</i> this stance is firmly embraced within any of the cultures that are said to have a Western heritage. It's in a continual contest with other cultural currents, some of which have the backing of social pressure and a formidable history. The effect is that people within these cultures have also long used alternative "methods" such as superstition or the opaque rulings of unchallenged authority figures. True, the exact <i>content</i> of the shoddy ideas changes over time, because fashions—and blind spots—change. But the mechanism is depressingly consistent. For instance, anxiety about counteracting bad spirits gives way to anxiety about counteracting bad energy (or counteracting minute quantities of bad toxins?). I'm forced to confess that even many of the members of my own culture aren't in favor of the stance I preach.<br />
<br />
A more general truism is at work here: sorting both ideas and cultures by a single Western/non-Western <i>split</i> is too coarse-grained to reliably predict people's responses. There are significant differences among all the cultures which are said to have a Western heritage. The result of these differences is that my stance is more welcomed in some of them than in others. And it's more welcomed in some <i>subcultures</i> more than others.<br />
<br />
Just as receptiveness varies throughout the group of Western cultures, it varies throughout non-Western cultures as well. In the same way that everybody placed on one side cannot be assumed to be completely open to it, everybody placed on the other side cannot be assumed to be completely closed to it. Despite eager attempts both to neatly assign an idea to one side and then to emphasize a rift between the sides, large numbers of people on either side have always been willing and able to absorb the bits they like from the ideas that reach them. Voluntary exchange of useful ideas has been going on for as long as the voluntary exchange of goods has. The cultural divide might be a good representation of the accidental misunderstandings that can happen so easily...but it's hardly an <i>impassable</i> barrier in practice.<br />
<br />
My expectations of finding common ground don't stop there. I suspect that "they" might find that portions of the idea feel familiar. Worthy ideas have the tendency of springing up independently in several times and places, although the triggering situations and the forms of expression are of course unique. In this sense, I highly doubt that the origins of stances <i>like</i> mine have always been in the cultures in the Western pigeonhole. As I stated earlier, the most abstract debates about meaning and accuracy still have natural motivations. People in non-Western cultures must have developed their own versions of the triangle of ideas/actions/outcomes...though perhaps less formally or less fully. If so then their reaction will be "Oh, you're describing a view that has a few strong resemblances to ___ in my culture" instead of "You're describing a view which is entirely alien to anything that I've known before".<br />
<br />
The risk of focusing on culture is to miss the other levels in which opposing ways of thinking clash. Clashes at the level of culture are obviously pertinent, but so are the clashes at the level of the individual's everyday struggles with their own thoughts. The truth is that my upbringing in a Western-categorized culture <i>wasn't</i> sufficient to stop me from restricting the introspective reach of this "very Western stance" for years. Otherwise it would've clashed with the unsound beliefs that I shielded from its standards. The turning point came when I aimed it at my base assumptions; I stopped treating it as an optional <i>tool</i> suitable solely for limited areas of knowledge. As tough as it might be to introduce a way of thinking to someone, it's not as tough as the next challenge of convincing them to value a rigorously examined, intellectually coherent life. The end goal isn't to convert every part of their culture to be more "Western" (how boring), it's to equip them to better analyze the parts of their own many-sided cultures for themselves and to free their minds if they wish. Some Westerners like me have had the <i>same</i> task.Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-76609285621906490832018-01-06T11:01:00.001-05:002018-01-06T11:01:15.102-05:00Force-curious but not Jedi?By springing surprise after surprise on its fanatical audience,<i> Star Wars: The Last Jedi</i> has triggered an avalanche of differing reactions. I for one was neither passionately against nor in favor of it. My personal interest in anything related to Star Wars has been lukewarm for years. I liked it enough while I was at the theater, but I didn't have the urge to go again. I'm less disturbed by its own plot twists than I am by the seemingly slim possibilities that it left for the <i>next</i> movie. What will the heroes do now to save themselves and defeat their foes? What mysteries remain unsolved? 
Perhaps the third Star Wars movie of recent years will be subtitled "The Search for Lando".<br />
<br />
The movie managed to stun me with a few jaw-dropping scenes, though. All of these were playing with one revolutionary concept: <i>the</i> <i>Jedi voluntarily becoming extinct</i>. I need to hastily add that the dialogue pointedly clarifies, however, that the hypothetical extinction would apply to the Jedi "religion" alone. The Force itself could never become extinct. As a result, individuals who have Force awareness and/or abilities would still be around, too. Without Jedi scholarship or apprenticeship to guide them, their belief status would be Force-curious but <i>not Jedi</i>. They'd be non-Jedi or "Nons" for short.<br />
<br />
A Star Wars universe of Nons offers plenty of fuel for speculation. Rather than benefit from the findings (and mistakes!) of their predecessors, each generation of them would fumble anew at understanding elementary topics and performing novice feats. They might opt to pick and choose from a hundred contradictory understandings of the Force. With their meager knowledge, they'd try to judge for themselves whether particular actions were "light-side" or "dark-side". Their ignorance of the dangers would inevitably result in a few—maybe more than a few—using the Force for selfish aims and eventually having a destructive effect on everything around them. In addition, some of them would be motivated by their loyalty to the specific groups they identify with to use the Force purely to advance the glory of their group instead of the common (galactic) good.<br />
<br />
Their contact with the Force would be outside of time-tested paths. They wouldn't be weighed down by the irritating restrictions of petty Jedi taboos. They'd be free of the drudgery of antiquated teachings and heavy-handed institutions or councils or elders. On religion surveys they'd check the box for "no affiliation". If asked for more details, they'd say that they're <i>spiritual but not religious</i>. If pestered to explain what they're <i>spiritual about</i>, they'd say that they can sorta sense an invisible Force of benevolence when they reach out with their feelings. And they can lift rocks or janitorial tools through telekinesis.<br />
<br />
I suppose that this is well-aimed at the current cultural context. My guess is that many would be open to seriously considering that informal laid-back Nons going back to the basics of transcendence would be an improvement over haughty, intrusive, inflexible, child-indoctrinating Jedi who oversee big budgets and vast initiatives in a mega-Temple. Organizations, religious or otherwise, are loathed for how they can be abused to empower terrible leaders and control the members.<br />
<br />
That's not all. In some, this attitude also operates at a deeper philosophical level. They loathe committing fully to definite statements of <i>their</i> <i>own beliefs</i>. In the spiritual realm, and almost everywhere else, it's thought to be more acceptable for everyone to be on their separate individualized journeys. Clear and verifiable thinking is devalued. Or, worse, it's purposely shunned to ensure that no opinion can ever be viewed as more accurate than another. That's why it's optional to confidently ground one's abstract beliefs in concrete actions, facts, duties, etc.<br />
<br />
I doubt I'm spoiling the movie by revealing that, in the end, it didn't overturn one of Star Wars' core parts. It only proposed and discussed the shocking concept of a complete handover from Jedi to Nons. Nevertheless, it shows the pluses and minuses that stand out <i>to me</i> when I listen to the mushy words of the spiritual but not religious. If they're always able to think <i>whatever</i> they wish about "spirituality", then the odds are excellent that they're merely imagining a fluid external "cause" that they fill with their wholly subjective experiences. By its nature this phantom cause couldn't have an innate form or shape to take into account. But if they object to this impolite characterization and insist that their spooky spirit is as real as the Force is within the Star Wars fictional universe, then why does It exist by vaguer rules than <i>every other</i> real thing? Unlike those other real things, why can't they or anyone speak as conclusively about It in ways that are compatible with neutral confirmation...or disproof?Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-50616759611339571082017-12-09T09:26:00.002-05:002017-12-09T09:26:14.827-05:00hunting for licenseFollowers of materialistic naturalism like myself have a reputation for scoffing at wishful thinking. We're pictured as having an unhealthy obsession with bare inhuman facts. Despite that, I'm well aware that one of my own ideals is closer to a wish than to a reality a lot of the time. I refer to the recommended path to accurate thoughts. It starts with collecting all available leads. After the hard work of collection comes the tricky task of dispassionately sorting and filtering the leads by trustworthiness. Once sorting and filtering are done, then the more trustworthy leads form the criteria for judging among candidate ideas. 
The sequence is akin to estimating a landmark's position after taking compass bearings from three separate locations, not after a single impromptu guess by eye.<br />
<br />
<div>
If I'm reluctantly conceding that this advice isn't always put into practice, then why not? What are people doing in its place? We're all creatures who by nature avoid <i>pain and loss</i>, including the pain of alarming ideas and the loss of ideas that we hold. That's why many people substitute the less risky objective of seeking out the leads which would <i>permit</i> them to retain the ideas they cherish for reasons besides accuracy. They're after information and arguments to give them <i>license</i> to stay put. As commentators have remarked again and again, the longest-lasting fans of the topics or debates of religious apologetics (or the dissenting counter-apologetics) are intent on cheering their side—not on deciding to switch anytime soon.</div>
<div>
<br /></div>
<div>
Their word choices stand out. They ask whether they <i>can believe</i> X without losing respect, or whether they <i>must believe</i> Y. Moreover, they significantly <i>don't ask</i> which idea connects up to the leads with less effort than the others...or which idea introduces fewer shaky speculations than the others. The crux is that their darling idea isn't absurd and it also isn't manifestly contrary to one or more indisputable discoveries. They may still care a bit about its accuracy relative to competing ideas, but by comparison this quality is an afterthought. They're gratified as long as the idea's odds exceed a passable threshold. They can honestly envision that the idea could be valid in some sense. The metaphor isn't searching for a loophole but snatching up every usable <i>shim </i>to fix the loose fit of their idea within the niches that need filling. The undisguised haphazardness of it at least ensures that it's adaptable and versatile.<br />
<br />
Of course, at root it's a product of <i>compromise </i>for people who are trying to navigate all the forces which push and pull at them. It soothes them about the lower priority they're consciously or unconsciously assigning to accuracy. By scraping together an adequate collection of leads to make an idea viable, they're informing everyone, including themselves, that their selection of the idea isn't unreasonable. If the pursuit of authenticity were a game, they'd be insisting that their idea isn't out of bounds.<br />
<br />
Irritatingly, one strange outcome of their half-cured ignorance might be an overreaction of blind confidence. The brashest of them might be moved to declare that their idea is more than just allowable; it's a first-rate "alternative choice" that's as good or better than any other. By transforming it to a matter of equal preference, they can be shameless about indulging their preference.<br />
<br />
In isolated cases it really could be. But the relevant point is that, successful or not, the strategy they used was backwards. They didn't go looking honestly for leads to the top idea. All they wanted was greater license to keep the idea they were already carrying with them while they looked. In effect, thinking in these terms motivates more than a mere <i>bias</i> toward noticing confirmations of their idea: they're sent on the <i>hunt</i>. Needless to say, they can't expect <i>anybody else</i> to be as enchanted by their hunt's predictable findings.</div>
Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-62410874667320219892017-12-01T10:42:00.001-05:002017-12-01T10:42:34.243-05:00trading postsFrom time to time I'm reminded that it's misguided to ask, "Why is it that faulty or unverified ideas not only survive but thrive?" In my opinion the opposite question is more appropriate: "What has prevented, or could prevent, faulty or unverified ideas from achieving <i>even more</i> dominance?" The latter recognizes the appalling history of errors which have seized entire populations in various eras. The errors sprout up in all the corners of culture. The sight of substandard ideas spreading like weeds isn't exceptional; it's <i>ageless</i>.<br />
<br />
The explanations are ageless too. One of these is that the ideas could be sitting atop a heaping pile of spellbinding <i>stories </i>of dubious origin. After a story has drawn its audience toward an unreal idea, it doesn't vanish. Its audience reproduces it. It might mutate in the process. When it's packaged with similar stories, its effect multiplies. Circles of people go on to trade the stories eagerly, because when they do they're trading mutual reassurance that they're right. There's a balance at work. Possessing uncommon knowledge is thrilling, especially when it's said to be both highly valuable and "suppressed". But the impression that at least a few others subscribe to the same arcane knowledge shields each of them from the doubt that they're just fantasizing or committing an individual mistake.<br />
<br />
As is typical, the internet intensifies this pattern of human behavior rather than creates it out of nothing. It's a newer <i>medium</i>, albeit with a tremendous gain in convenience and visibility compared to older forms. In the past, the feat of trading relatively obscure stories depended on printed material such as newsletters or pamphlets or rare books. Or it happened gradually through a crooked pipeline of conversations that probably distorted the "facts" more and more during the trip. Or it was on the agenda of meetings quietly organized by an interested group that had to already exist in the local area.<br />
<br />
Or a story could be <i>posted</i> up somewhere for its intended audience. This method especially benefited from the internet's speed and wide availability. Obviously, a powerful computer (or huge data center full of computers) with a memorable internet address can provide a spectacular setting for modern electronic forms of these posted stories—which have ended up being called "posts". Numerous worldwide devices can connect to the published address to rapidly store and retrieve the posts. Whether the specific technology is a bulletin board system, a newsgroup, an online discussion forum, or a social media website, the result is comparable. Whole virtual communities coalesce around exchanging posts.<br />
<br />
Undoubtedly, these innovations have had favorable uses. But these also have supplied potent petri dishes for hosting the buildup of the deceptive, apocryphal stories that boost awful ideas. So when people perform "internet research", they may trip over these. The endlessly depressing trend is for the story that's more irresistible, and/or more smoothly comprehended, to be duplicated and scattered farther. Unfortunately <i>that</i> story won't necessarily be the most factual one. After all, a story that isn't held back by accuracy and complex nuance is freer to take any enticing shape.<br />
<br />
It might be finely-tuned in its own way, though. A false story that demands <i>almost no</i> mental effort from its audience might provoke disbelief at its easiness. It would have too close of a resemblance to a superficial guess. That's avoided by a false story that demands a nugget of sustained but not strenuous mental effort—or a touch of inspired cleverness. It more effectively gives its audience a chance of feeling smug that now they're part of the group who're smart and informed enough to <i>know better</i>.<br />
<br />
I wish I knew a quick remedy for this disadvantage. I guess the superior ideas need to have superior stories, or a mass refusal to tolerate stories with sloppy substantiation needs to develop. Until then the unwary public will be as vulnerable as they've ever been to the zealous self-important traders of hollow stories and to the fictional "ideas" the stories promote.Art Vandalayhttp://www.blogger.com/profile/08432367996173233599noreply@blogger.com0tag:blogger.com,1999:blog-29876314.post-17661504930814102222017-11-27T19:45:00.005-05:002017-11-27T19:45:56.196-05:00no stretching necessaryI'm often miffed at the suggestion that <i>my</i> stance is particularly extremist. According to some critics, materialistic naturalism is an excessive interpretation of reality. It's a <i>stretch</i>. They submit that its overall judgment is reaching far beyond the tally of what's known and what isn't. It's displaying its own type of dogmatism: clinging too rigidly to stark principles. It's declaring a larger pattern that isn't really present.<br />
<br />
This disagreement about who's stretching further is a useful clue about fundamental differences. It highlights the split in our assumptions about the proper neutral starting point of belief. When two stances have little overlap, the temptation is to hold up a weak mixture of the two as the obvious default, i.e. the most minimal and impartial position. Like in two-party politics, the midpoint between the two poles receives the label of "moderate center". As the poles change, the center becomes something else. I gladly admit that according to their perceived continuum of beliefs about the supernatural domain's level of activity and importance, mine lies closer to one of the clear-headed ends than to something mushier.<br />
<br />
From my standpoint, their first mistake is imposing the <i>wrong continuum</i>. Applying it to me is like saying that the length of my fingernails is minuscule next to a meter stick. Although a continuum <i>can</i> be drawn to categorize me as having an extreme stance, the attempt doesn't deserve attention until the continuum itself is defended. It's not enough to decree that people are more or less radical depending on how much their stance differs from yours. For a subjective estimation to be worth hearing, the relative basis for the estimation needs to be laid out fairly. Part of that is acknowledging the large role played by culture. Rarities in one culture might be commonplace in a second. Many times, perhaps most of the time, the customary continuum of "normal" belief isn't universal but a reflection of the setting.<br />
<br />
The good news is that the preferable alternative isn't strange or complicated: it's the same kind of continuum of belief that works so wonderfully in myriad contexts besides this one. This kind begins with the stance that people have before they've <i>heard</i> of the belief. This beginning stance is perfect uncertainty about the belief's accuracy. It's at 0, neither positive nor negative.<br />
<br />
Yet until the scales are tipped for high-quality reasons, the advisable approach is to <i>continue</i> thinking and acting on the likely guess that the belief isn't to be trusted. This is a superb tactic simply because unreliable beliefs are vastly <i>cheaper</i> to develop than reliable beliefs—the unreliable should be expected to outnumber the reliable. In the very beginning, before more is known, to make an unjustified leap to a strongly supportive stance about the belief...would be a stretch.<br />
<br />
That's only one point of the continuum. But the rule for the rest is no more exotic: the intensity of belief corresponds to the intensity of validation. Information raises or lowers the willingness to "bet" on the belief to serve some purpose. The information accumulates, which implies that new information doesn't necessarily replace older information. Each time, it's important to ask whether the belief is significantly better at explaining information than mere statistical coincidence.<br />
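The updating rule this paragraph describes can be sketched numerically. A minimal illustration of Bayes' rule, where the starting prior, the likelihood values, and the `update` helper are all invented for the example rather than taken from anything above:

```python
def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: revise confidence in a belief H after new evidence E.

    p_e_given_h: how likely the evidence is if the belief is accurate.
    p_e_given_not_h: how likely it is as mere statistical coincidence.
    """
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

# Start near uncertainty but tilted against the belief, since unreliable
# beliefs are cheaper to produce and so should outnumber reliable ones.
belief = 0.1

# Evidence accumulates; each update builds on the last rather than
# replacing it. The third pair is evidence that's no likelier under the
# belief than under coincidence, and it leaves the stance unchanged.
for p_if_true, p_if_coincidence in [(0.8, 0.5), (0.9, 0.6), (0.7, 0.7)]:
    belief = update(belief, p_if_true, p_if_coincidence)
```

The point of the sketch is only the shape of the rule: intensity of belief tracks intensity of validation, and evidence explained equally well by coincidence moves nothing.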
<br />
A popular term for this kind of continuum is <i>Bayesian</i>. Or, to borrow a favorite turn of phrase, it could be called the kind that's focused on <i>fooling ourselves less</i>. It's a contrast to the <i>myth-making</i> kind of continuum of belief, in which stances are <i>chosen</i> based on a belief's inherent appeal to the individual believer or on its cultural dominance. At the core, Bayesian continua are built around the ideal of studiously <i>not</i> overdoing acceptance of a belief. This is why it's a futile taunt to characterize a Bayesian continuum stance as a fanatical overreach. The continuum of how <i>alien</i> a stance feels to someone is entirely separate. For that matter, when the stance is more unsettling largely as a <i>result</i> of staying strictly in line with the genuinely verified details, the reaction it provokes might be an encouraging sign that pleasing the crowd isn't its primary goal. If someone is already frequently looking down to check that they're on solid ground, they won't be disturbed by the toothless charge that they've stepped onto someone else's definition of thin ice.<br />
<br />
The accusation has another self-defeating problem: the <i>absolutist</i> closed-mindedness that it's attacking isn't applicable to most people of my stance. All that's needed is to actually listen to what we say when we're politely asked. Generally we're more than willing to admit the distinction between impossible and improbable beliefs. We endorse and follow materialistic naturalism, but we simultaneously allow that it <i>could</i> be missing something frustratingly elusive. We <i>could</i> be shown that it needs to be supplemented by a secondary stance.<br />
<br />
But by now, after a one-sided history of countless missed opportunities for supernatural stuff to make itself plain, and steadily shrinking gaps for it to be hiding in, the rationale would need to be stunningly dramatic. It would need to be something that hasn't come along yet, such as a god finally speaking distinctly to the masses of planet Earth. Corrupted texts, personal intuitions, and glory-seeking prophets don't suffice. The common caricatures of us are off-target. We aren't unmovable. We'd simply need <i>a lot more</i> convincing to prod us along our Bayesian continua. (The debatable exceptions are hypothetical beings defined by mutually contradictory combinations of characteristics; logic fights against the existence of these beings.)<br />
<br />
There is a last amusing aspect of the squabble over the notion that materialistic naturalism stretches too far to obtain its conclusions. Regardless of my emphatic feelings that intermediate stances aren't the <i>closest</i> match to the hard-earned knowledge that's available, I'm sure that I'm not alone in <i>preferring</i> that more people followed these in place of some others. While I disagree that their beliefs are more plausible than mine, deists/pantheists/something-ists upset me less than the groups whose gods are said to be obsessed with interfering in human lives. I wouldn't be crushed if the single consequence of <i>losing</i> were more people drifting to the supposed "middle ground" of inoffensive and mostly empty supernatural concepts.<br />
<br />
Because outside of staged two-person philosophical dialogues, it's a short-sighted strategy to argue that my stance <i>presumes too much</i>. It'd only succeed in flipping people from mine to the arguer's stance after they added the laughable claims that <i>theirs</i> somehow presumes less than mine, <i>and</i> that it presumes less than deism/pantheism/something-ism...

supertankers and Segways (2017-11-11)

Back when my frame of mind was incorrect yet complacent, the <i>crucial</i> factor wasn't impaired/lazy intelligence or a substandard education. It wasn't a lack of exposure to influences from outside the subculture. The stumbling block was an extensive set of comforting excuses and misconceptions for not sifting my own supernatural ideas very well for accuracy...combined with a nervous reluctance to do so. It was an unjustified and lax <i>approach</i> to the specific ideas I was clutching.<br />
<br />
I wasn't looking, or not looking intently enough, at the ideas' supporting links. Like the springs around the edge of a trampoline, an idea's links should be part of the judgment of whether its definition is stable and sturdy under pressure. If it's hardly linked to anything substantial, or the links are tenuous, its meaningfulness deserves less credit.<br />
<br />
This metaphor suggests a revealing consequence of the condition of the ideas' links. When the links are tight and abundant, the ideas are less likely to change frequently or radically. An idea that can be abruptly reversed, overturned, rearranged, etc. behaves like an idea that's poorly <i>rooted</i>. Perhaps its origin is mostly faddish hearsay. If it can rapidly turn like a Segway for the tiniest reason, it's not showing that it's well-connected to anything that stays unchanged day by day.<br />
<br />
However, if it turns in a new direction gradually like a huge supertanker-class ship, it's showing that its many real links were firmly reinforcing its former position/orientation. Changes would require breaking or loosening the former links, or creeping slightly within the <i>limits</i> that the links permit. By conforming to a lengthy list of clues, an explanation places the same demands on all modifications or substitutions of it. This characteristic is what separates it from a <i>tentative</i> explanation. Tentativeness refers to the upfront warning that it isn't nailed down or solidified. The chances are high that it might be adjusted a lot in the near future in response to facts that are currently unknown.<br />
<br />
Although revolutionary developments are exciting, such events call for probing reexamination of the past and maybe more than a little initial doubt. There might be a lesson about improving methods for constructing and analyzing the ideas' links. <i>Why</i> were the discarded ideas followed before but not now? How did that happen?<br />
<br />
Amazing upheavals of perspective should be somewhat rare if the perspective has a sound basis. <i>Anyone</i> can claim to have secret or novel ideas that "change everything we thought we knew". The whole point is to specifically <i>not</i> grant them an easy exclusion from the interlinking nature of knowledge. Do their ideas' links to other accurate ideas—logic/proofs, data, calculations, observations, experiments, and so on—either outweigh or invalidate the links of the ideas that they're aiming to replace?<br />
<br />
If their ideas are a swerve on the level they describe, then redirecting the bulk of corroborated beliefs ought to resemble turning a supertanker and not a Segway. Of course, this is only an expectation for the <i>typical manner</i> in which a massive revision takes place. It's not a wholesale rejection of the <i>possible need</i> for it. From time to time, the deceptive influence of appealing yet utterly wrong ideas can last and spread for a long time. So it's sensible that the eventual remedy would have an equally large scale. Paradigm shifts serve a purpose.<br />
<br />
But they must be exceedingly well-defended to be considered. When part of an idea's introduction is "First forget all you think you know", the shrewd reaction isn't taking this unsolicited advice at face value. It's "The stuff I know is underpinned by piles and piles of confirmations and cross-checks. How exactly does your supposed revelation counter all that?" The apparent "freedom" to impulsively modify or even drop ideas implies that they were just dangling by threads or flapping in the wind to start with.