Monday, July 9, 2012

Logical Fallacies and Scientific Method

Cracked had a very nice article on logical fallacies -- ones we all make as a matter of course.  It also had some good illustrations and suggestions.  Aside from the fact that it was a humor magazine that published such a nice article on rational thought, I was struck by the fact that each of the points mentioned is one that the practice of science has addressed.

The 5 natural fallacies mentioned are:
5. We're Not Programmed to Seek "Truth," We're Programmed to "Win"
4. Our Brains Don't Understand Probability
3. We Think Everyone's Out to Get Us 
2. We're Hard-Wired to Have a Double Standard
1. Facts Don't Change Our Minds

Let's take a look at what the scientific method does to combat these:
 
5. We're Not Programmed to Seek "Truth," We're Programmed to "Win"

In science, the 'win' is changed to be the successful pursuit of 'truth'.  Out-talking someone in a public debate, out-wording them in an internet argument, or just browbeating them until they leave, is not a win.  Hence the lack of interest from scientists in 'debate'.  Putting forth an idea that is seen, eventually, to match reality better than what came before is the win.

4. Our Brains Don't Understand Probability

Therefore, we go through the occasionally ugly math to nail down the probabilities of our results.  We just can't trust our intuitions about probability -- our brains don't naturally handle it well.  We go back and work through that math, and then after publishing, many people read the news article and say 'everybody knew that already'.  While everybody may have thought it in the first place, we do the work because it's also common for a reader to see the same article and say 'that's absurd, everybody knows it isn't so'.  We may not agree on whether it was obviously true, or true at all, but we can agree on whether 2 * 3 = 6.
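A classic illustration of how badly intuition handles probability (my example here, not the article's) is the birthday problem: how likely is it that two people in a group share a birthday?  Most people guess you'd need a group of well over a hundred before the odds reach even, but working through the arithmetic gives a very different answer.  A minimal sketch, assuming 365 equally likely birthdays:

```python
def shared_birthday_probability(n: int) -> float:
    """Probability that at least two of n people share a birthday.

    Computed as 1 minus the probability that all n birthdays are
    distinct: (365/365) * (364/365) * ... * ((365-n+1)/365).
    """
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (365 - k) / 365
    return 1.0 - p_all_distinct

for n in (10, 23, 50):
    print(f"{n:2d} people: {shared_birthday_probability(n):.3f}")
```

With only 23 people the chance already passes 50%, and with 50 people it is above 95% -- exactly the kind of result that a reader's intuition rejects until the math is laid out.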

3. We Think Everyone's Out to Get Us

This is in the sense, as the article notes, that "If you're smart and savvy, you know not to trust anyone. This is why we can excuse ourselves for using shady or flat-out dishonest tactics to win an argument. We're sure the other guy is doing much, much worse."

So how do you work towards truth if everybody is out to get you?  One part is that when someone is found lying in their work, they're out of the field. Contrast that with, say, business or politics.  Another is to enlist the help of other people who are knowledgeable in the topic and have them read the new work to ensure that there aren't any obvious mistakes or frauds.  Peer review.  Bad papers still make it into the scientific literature, but it improves the chances that what you're reading is not too badly flawed.  When the peer review process fails, the editors in charge usually take it very seriously.  When's the last time a corporate president resigned because a vice president let a salesman lie about their product?

A different, major, part of the method is that experiments must be repeatable.  A fake good enough to get past a reviewer is not enough.  Somebody, somewhere, must be able to repeat your experiment and get sufficiently similar results.  For preference, someone should actually do so, but funding agencies don't like paying two or more groups to run the same experiment -- a failing in funding agencies and those who allocate funds for research.  But if the experiment is not even in principle repeatable, if the answer is 'trust me', you're in trouble.

2. We're Hard-Wired to Have a Double Standard
My science example is different from the article's, but the same principle is involved.  It is natural to consider evidence in favor of your position to be better than the evidence against it.  It is so natural that it's also natural to simply ignore the evidence against your position outright, and only look at the evidence that supports it.  Even if you have to make it up, or use as your source someone who did.

To combat this natural double-standard, in science, unlike politics/business/law*/..., you are supposed to present the evidence without regard for whether it supports your position or not.  And if you fail to present evidence that is against it, you're in trouble (#3, #5).

1. Facts Don't Change Our Minds
Being able to do this is what I called the central skill of a scientist.  It is so important because it is so unnatural to us humans.  Quoting some pieces of the original article:
....  Let's go back to the beginning for a moment, and the theory that people figured out how to build arguments as a form of verbal bullying rather than a method of spreading correct information. That means that there are actually two reasons somebody might be arguing with you: because they actually want to get you to think the right thing, and because they're trying to establish dominance over you to lower your status in the tribe (or office or forum) and elevate their own. That means there's a pretty severe cost to being on the wrong side of an issue completely separate from the issue itself. ....
So During Your Next Argument, Remember ...You won't remember this. You're hard-wired to remain entrenched, and the Internet makes it worse because your political beliefs are pasted all over Facebook and wherever else you post your opinions. Backing down means going back on all that. It means letting down your team. Every inch of your psychology will fight it.
Do read the original article in full.

* Law has its own standards on proof and approach to truth.  And it must.  My wife is a lawyer, so we've had some fun talks about the differences and where they came from.  One place where it differs most strongly from science is how it handles the natural double standard.  In science, we take the side of making practitioners do the highly un-natural thing of avoiding the double standard.  Law, at least in common law countries like the US (and UK, ...), takes the opposite approach -- if everyone is predisposed to some level of double standard, and side-taking in their arguments, let's take it out to the extreme -- each side presents the best possible case for its own position, and only that.  Then have a judge or jury assess who made the better case.  The people deciding which is the stronger case are not the ones who make the case in the first place, so (the design hopes) they won't be subject to the double-standard problem.  In science, the same people who would be deciding which case is stronger are the ones making (some of) the cases.

The article helped me understand a conflict I'd encountered.  On one hand, doing science is very natural.  We are all disposed to learning how the universe around us works, starting from birth.  On the other hand, what I see in internet discussions bears a strong resemblance to the 5 fallacies discussed above, even when the topic is scientific ("Is CO2 a greenhouse gas?").  Even though trying to find out more about the universe is natural, the methods that evolved over the last few thousand years to help us do so have required us to do it differently than we would reflexively choose.

3 comments:

Anonymous said...

Robert, your two links to Cracked don't work.

t_p_hamilton

Robert Grumbine said...

Thanks. Fixed. Blogger did some weird things to the links behind my back.

Anonymous said...

I'm reminded of a passage from Galileo's Dialogue Concerning the Two Chief World Systems:

If what we are discussing were a point of law or of the humanities, in which neither true nor false exists, one might trust in subtlety of mind and readiness of tongue and in the greater experience of the writers, and expect him who excelled in those things to make his reasoning most plausible, and one might judge it to be the best. But in the natural sciences, whose conclusions are true and necessary and have nothing to do with human will, one must take care not to place oneself in the defense of error; for here a thousand Demostheneses and a thousand Aristotles would be left in the lurch by every mediocre wit who happened to hit upon the truth for himself.