Critical thinking and where people err on both sides of the conspiracy fence

Discussion in 'Conspiracies' started by Petra Liverani, Mar 19, 2023.

  1. Petra Liverani Registered Member

    Messages:
    46
    Are you aware of rules of critical thinking that guide you and do you see those rules broken by others?

    These are two guideposts and two rules that guide my thinking. What do you think of them? I'm what many others would perceive as a hard-core conspiracy theorist, and yet I don't see myself that way at all, as I think my thinking is 100% evidence-based (even if I'm mistaken in my interpretation sometimes) and follows the rules of critical thinking.

    Guideposts
    1. Every relevant piece of information will at least support if not favour the correct hypothesis
    It is useful to constantly bear in mind that the nature of reality is that every single relevant piece of information will at least support, if not favour, the correct hypothesis. Any relevant item selected at random will at least be consistent with the hypothesis, if not favour it. If not, the hypothesis isn't correct. Sometimes apparent anomalies might seem to contradict the correct hypothesis, but on closer inspection they will be revealed to be only seeming anomalies, not real ones.

    2. Internal consistency and consistency with expectations
    Where all the evidence is both internally consistent and consistent with expectations, unless a good reason is put forward for doubt, we should accept an hypothesis as correct.

    Rules
    Rule 1: Aim to prove your hypothesis wrong
    This rule applies generally to the validity of the hypothesis you hold.

    When I came across the statement by Kary Mullis, the Nobel-prize winning inventor of the PCR technique, in an interview with Gary Null, "The scientist aims to prove their hypothesis wrong," I thought, "Bingo! That's what I do." If ever what I believe is challenged by anyone or anything, I review my hypothesis against the challenge to see if it still holds. I also go out of my way to investigate the opposing arguments sufficiently to ensure I can respond to them … and if I can't respond with a good argument, I change my mind or at least "park" the challenge for later review. Richard Feynman effectively said the same thing as Kary Mullis in his 1974 commencement address at Caltech, entitled "Cargo Cult Science".

    Rule 2: Confine your analysis to the most relevant and unarguable-with data in the first instance
    This rule applies to the best way to approach evidence in order to get to the truth.

    If the nature of reality is that every single piece of evidence will at least support, if not favour, the correct hypothesis, then, where there is a reasonable amount of unarguable-with data and all of it supports your hypothesis if not favours it over any other, it's going to be rather difficult for another hypothesis to be correct.

    People get carried away with claims on subject matter about which they have insufficient understanding, for which there is not good evidence, and that do not align with all the evidence. They also focus on irrelevant information, which creates confusion and clouds the issue. Even if certain facts are unarguable-with, are they necessarily the most relevant? Considering the most relevant and unarguable-with data first sets you on a good path to the truth. In essence it's Occam's Razor, shaving away the unnecessary.
     
  3. James R Just this guy, you know? Staff Member

    Messages:
    39,397
    Petra:

    Welcome to sciforums.

    I have some thoughts on this.
    Interesting. I think the biggest trap with conspiratorial thinking is confirmation bias. The danger is that you like the idea of the conspiracy, for whatever reason, then you spend most of your time looking for evidence that seems to confirm it, rather than looking for evidence that might disprove it.

    Conspiracy theories are often constructed in such a way that the most important information is inaccessible - often deliberately withheld by the alleged "conspirators". In those circumstances it might be impossible ever to disprove that there is a conspiracy. In other words, the conspiracy theory has a self-protection mechanism built in, to make it unfalsifiable. If that's the case, we should be very wary about believing it, in my opinion.
    My main question here is: how do you decide what is and isn't a "relevant" piece of information, in any given case?

    The danger lies in confirmation bias, again. When you find evidence that tends to support the hypothesis, you file it away in the "evidence for" box. But when you find evidence that tends to refute the hypothesis, maybe you just say "that's irrelevant", and quickly forget about it.

    Perhaps you have criteria for determining relevance. I'd like to know what they are.
    It can sometimes be hard to tell the difference between a real anomaly and a seeming anomaly. If you're biased towards confirming your hypothesis, you're more likely to write off real anomalies (i.e. facts that actually tend to refute the hypothesis) as seeming anomalies (e.g. you might go out of your way to find obscure or ad hoc reasons to "explain away" the anomaly).
    There is an assumption in there that, most of the time, things will turn out to be what we expect, isn't there? But is that necessarily true? What if different people expect different findings, and the evidence is both internally consistent and consistent with both sets of expectations? Whose hypothesis should we accept as correct?

    My own view is that we shouldn't accept any hypothesis as correct just because it is self-consistent and consistent with our expectations; we should accept a hypothesis as correct when we have sufficient evidence that it is correct. Of course, what counts as "sufficient" will often vary on a case-by-case basis.
    Feynman also said (perhaps in the address you mention) that the first thing to do is to make sure you're not fooling yourself, because you're the easiest person you can fool. People tend to invest emotionally in their own pet hypotheses. Wanting something to be true, especially if it's a strong desire, often leads to error and a blinkered view of things.
    There's that word "relevant" again. You need to be careful that you don't just dismiss inconvenient data as irrelevant.
    I don't disagree with a lot of this. However, I might mention that Occam's razor is not a guide to what might be true. It's just an instruction to look for the simplest possible explanation for a thing and to prefer that over a more convoluted explanation that does just as good a job (but not a better one). What it doesn't say is that simple explanations are more likely to be true than more complex ones; that's not obviously true.
     
  5. Petra Liverani Registered Member

    Messages:
    46
    Thanks for your reply, James.

    I'm afraid I question your claim of "inaccessible" important information, as most events about which so-called "conspiracy theories" form are psychological operations, or psyops, which have as part of their MO the rule of giving the game away with obviousness, often in the base narrative itself but most definitely with obviously deliberate anomalies. In fact, I'd say the two reasons I became interested in psyops are:
    1. They follow a formula which makes them easy to work out
    2. The same formula has been going on for centuries at least without the majority catching on, which I find fascinating

    So would you say you follow particular rules of critical thinking, James, and if so what are they?
     
  7. Tiassa Let us not launch the boat ... Valued Senior Member

    Messages:
    37,884
    I was looking through the earlier version you posted at Logically Fallacious↱, and, so, is this still Covid conspiracism? Or are you going somewhere more general with it, this time?
     
  8. Petra Liverani Registered Member

    Messages:
    46
    My belief is that if people use the correct rules of critical thinking they can only reach the correct conclusion, assuming there is sufficient evidence to show that something is one thing or another. So an approach one can take (which I've only thought of recently) is, before engaging in discussion, to establish the correct rules of critical thinking; then those involved in the discussion can hold the other/s to account if they aren't following those rules. I have no plan to discuss any particular event at this point, just to establish the rules of critical thinking. If people cannot agree on the rules then before you even start on the subject you've got a problem, right?
     
  9. Tiassa Let us not launch the boat ... Valued Senior Member

    Messages:
    37,884
    Depends on what you mean by a problem.

    If you wait for everyone to agree on the rules before starting, well, who all is everyone, because at some point that's enough people that it just won't happen. Setting that aside, it's one thing to agree on the rules, but, having seen you apply these rules to your take on Covid, I am reminded of the likelihood that people will disagree about what attends or violates those rules.
     
  10. Petra Liverani Registered Member

    Messages:
    46
    I think we can take it step by step here. "Everyone" is simply all those engaged in a particular discussion, no? Let's see if we can agree on the rules first and proceed from there. Do you wish to involve yourself in a discussion about an event some people regard as a conspiracy? If so, I invite you to name the rules you think you follow in trying to determine what something is or isn't.
     
  11. Tiassa Let us not launch the boat ... Valued Senior Member

    Messages:
    37,884
    You're in a room where people cannot or will not agree on the rules; moreover, in certain ways, such rules are viewed as dangerous. It is unlikely that people who feel threatened by the prospect of reasonable and supportable discourse are suddenly going to come around and line up for your rules of critical thinking. And that's just a practical reality.

    It is also true that in order for you to convince anyone that your approach to critical thinking is reliable, you will eventually need to apply it, as such, and the available record, such as the previously-noted post at Logically Fallacious, or the indulgent rant at Crucial Learning↱, tells us something about what that reliability looks like.

    Like the heresy bit.

    Do you know why Trimble wanted to address beliefs and not behavior? Because there are objectively right and wrong answers in the world, and that is inconvenient to him: "If you think that those who hold these opinions are wrong to believe you are misinformed," he argues, "then you are part of the problem." And as a general rule, it is true in many subjects and discussions. But it is also true that objective, reliable facts do exist in the world, and sometimes someone is wrong. That's why "beliefs, not behavior". Trimble wags, "You’re contributing to your own inability to 'deal with,' as you put it, people who have a different opinion." And he even does the clarification about "not defending dangerous beliefs or behavior", and guards against suggestions of relativism "that every viewpoint is equally justified", or "that merely holding a perspective makes it rational or valid", and even "that one's behavior is excusable because he or she believes it is". But after separating himself from all that, he explains that what he is suggesting is, "Regardless of what the facts are or the truth is or whose opinion is more carefully developed or less dangerous, you will not be able to disagree meaningfully until you decide to own the problem." And what problem is that? That someone is contributing to their own inability to deal with people who have different opinions. And how are they doing that? By believing someone is misinformed.

    Sometimes, though, there are right and wrong answers. Trimble is correct, however, that we cannot be certain those others are misinformed; in some cases, they are perfectly well informed, but just choose to disinform.

    The part where you might find some sympathy here is where the "simple formula" is to pander to Covid conspiracists like they're new. I've encountered similar pitches for religion, child abuse, white supremacism, lunar conspiracy, sexual violence, and what's wrong with Jews, at least. And that's what such wags are for; inasmuch as Trimble thinks we live in a "post-truth" era, his would-be advice column actually promotes that circumstance.

    And that's what you threw in with. As a question of reliability, consider that you opened with declaring that your experience tells you "there is no such thing as a reliable source". And while it is true "that you can provide all the evidence you want and it will not change people’s minds", Trimble demonstrates truisms are not in and of themselves inherently reliable. Again, we find a question of application.

    For instance, I'm not sure what to tell you about normalizing Hitler, but maybe a couple rewrites of that paragraph could make it more meaningful. And, also, just as a generally observable behavioral expectation, vis à vis credibility, declaring oneself among the elect isn't helpful. Many? Most? Typically speaking? Those who are "of the relatively few who is always open to the evidence regardless" tend to be seeking something extraordinary. And maybe it's not Hitler, but the bit about the moon landing is actually exemplary of what Trimble suggests in pandering to conspiracists, so it's true that people can see what that looks like. And when you turn that waxing romantic toward "heretical claims", well, people can see that application, too, i.e., where your sort of "critical thinking" leads, and what it means that you think you do what scientists do.

    But, yeah, sure, now do the heresy bit. Who knows, maybe it will be fun. But choosing a religious term instead of a reasonably accurate descriptive word will only ask people to wonder why; it's part of figuring out what you mean by the word. Moreover, consider the relationship between religious sentiment and the idea of a "post-truth" era. And, regardless of how you feel about conspiracism as heresy, not all heresy is conspiracism. History reminds us that some heresy was actually the correct answer.

    Even setting aside the dubiousness of self-perceived specialness, or the incompleteness that sometimes comes with trying too hard, we might look at how your rules of critical thinking are applied, and see what they bring.

    And, yeah, if that's what the rules mean, finding that agreement will likely be very, very difficult.
     
  12. Petra Liverani Registered Member

    Messages:
    46
    I'm afraid I'm having a little trouble following your argument, Tiassa. I feel as though we're staying rather tediously in the realm of the abstract and I think at this point we need to go concrete.

    It was a mistake for me to say, "If people cannot agree on the rules [of critical thinking] then before you even start on the subject you've got a problem, right?," because, in fact, I believe the rules of critical thinking are pretty straightforward and common sense, and I wouldn't actually expect disagreement on them. However, I think it's important to establish them with whomever you're in discussion, so that each person can hold the other/s to account if they show signs of not following the rules all have agreed to. James said he didn't disagree with the rules I put forward; he simply cautioned about what people do wrong, such as engaging in confirmation bias.

    A rule we could add to my two rules is: no logical fallacies, which applies, of course, by default. Obviously, an argument that rests on a logical fallacy is not a valid argument.

    So, Tiassa, please tell me what you think of my rules and what your rules are if they're different. Let's get a bit concrete.

    1. Aim to prove your hypothesis wrong
    Look at all the arguments opposing your chosen hypothesis and ensure you can respond to any challenges, etc.

    (If this rule is good enough for at least two Nobel-prize winning scientists, it's a little difficult to argue with, plus it's simple common sense, right?)

    2. Confine analysis to the most relevant and unarguable facts in the first instance
    Of course, whether something is an unarguable fact might itself be argued about, but some facts are definitely irrefutable. For example, you cannot argue that a particular statistic is not shown on a government website if it's right there for everyone to see. Whether the statistic represents reality is another matter altogether, but you cannot argue that the statistic is shown on the website. Similarly, relevance may be argued about, but often the use of irrelevant information would fall into obvious logical fallacy, I think.

    3. No logical fallacies

    Please just give me your rules of critical thinking or criticise mine so we can move onto a meaningful discussion.
     
  13. Tiassa Let us not launch the boat ... Valued Senior Member

    Messages:
    37,884
    Again, it's in how you apply those criteria.

    Still, though, while aiming to prove one's own hypothesis wrong is a succinct expression of a useful principle, no rule actually requires that one have any particular hypothesis in order to think critically.

    Confining analysis is a strange notion in part because of its popularity in political discourse, and in that context is actually antithetical to critical thinking. The difference, of course, would seem to be a matter of application.

    Inasmuch as part of what critical thinking does is help someone avoid logical fallacies, "no logical fallacies" is pretty obvious.

    One need not know the explicitly correct answer in order to know that something put before them is incorrect. In this context, what is the hypothesis? Or, to the other, no rule actually requires any particular hypothesis in order to think critically.

    For instance, here are two questions many Americans my age have forgotten, even though we were told repeatedly, having to do with reading and writing: What is this writing? Who is the audience?

    To the one, many people don't ask these questions; to the other, many people depend on other people not asking those questions.

    Here, just play along for a moment. Hold out your hands in front of you, palms up. Imagine the trays of a balance scale.

    1) Gesture with left hand, and say aloud, "One side tells me one thing."

    2) Gesture with right hand, and say aloud, "The other side tells me something else."

    3) Move hands up and down like trays of a balance scale, and say aloud, "It's hard to tell who's right!"

    Critical thinking helps resolve, or even forestall that confusion. Sometimes there are right and wrong answers such that the only equivalent thing about what two different people tell us is that someone said it.

    Your criteria seem reasonable suggestions toward rational discourse, but critical thinking will affect what that hypothesis is, as well as identification of the most relevant and unarguable facts.

    That is to say, critical thinking will affect how your criteria are applied.
     
  14. RainbowSingularity Valued Senior Member

    Messages:
    7,447
    the goldfish poking the dolphin asking "do you want to be a real fish or don't you?"
     
