Sunday, March 13, 2011

Being Wrong - Kathryn Schulz
  • Unlike earlier thinkers, who had sought to improve their accuracy by getting rid of error, Laplace realized that you should try to get more error: aggregate enough flawed data and you get a glimpse of the truth (a quick sketch of this idea follows below)
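A minimal sketch (not from the book) of Laplace's point: each individual measurement is flawed, but the average of many flawed measurements closes in on the true value. The "true" value and noise level here are made-up illustration numbers.

```python
# Sketch of Laplace's insight: aggregate enough flawed measurements
# and the average converges on the truth. TRUE_VALUE and NOISE are
# arbitrary numbers chosen for illustration.
import random

random.seed(42)
TRUE_VALUE = 100.0   # the quantity we are trying to measure
NOISE = 15.0         # each observation is off by a random error

def measure():
    """One flawed observation: the truth plus random error."""
    return random.gauss(TRUE_VALUE, NOISE)

for n in (1, 10, 100, 10_000):
    estimate = sum(measure() for _ in range(n)) / n
    print(f"{n:>6} measurements -> estimate {estimate:7.2f} "
          f"(off by {abs(estimate - TRUE_VALUE):.2f})")
```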
  • John Ross - in 1818 he went looking for the Northwest Passage, but turned back without finding it because an optical illusion made a mountain range appear to block the way. His second-in-command, William Parry, did not see the same illusion and made this known to the expedition's backer. A year later Parry went back, also saw the mirage, but decided to sail right through it. The phenomenon is known as a superior mirage: it shows things that do exist, but much closer than they are. The mountain range was about 200 miles away instead of the roughly 25 miles it appeared to be. Essentially, Ross's sense of vision deceived him.
    • Failures of perception capture the essential nature of error
    • For the most part, people accept as true anything they see with their own eyes
    • Ex: if you watch the stars long enough, it is easy to interpret their movement to mean that we are at the center of the universe. Our senses don't necessarily give us correct information.
    • Philosophers came up with a model of the rift between our minds and the world. Sensing is actually two different operations (normally not separable). The first is sensation, in which our nervous system responds to a piece of information from our environment. The second is perception, in which we process that information and make it meaningful. Perception is the interpretation of sensation.
    • The mechanisms that form our perceptions operate almost entirely below the level of conscious awareness. Our obliviousness to the act of interpretation leaves us insensitive to the possibility of error.
  • Sensing incorrectly is often a side effect of a perceptual system that is functioning correctly but operating under abnormal conditions.
  • Anosognosia - denial of impairment: patients insist they can move limbs that aren't there, or that they can see when they are blind. The brain mistakes an idea in the mind for a feature of the real world. => This shows us that there is no form of knowledge that cannot, under certain circumstances, fail us.
    • The fallibility of knowledge is gravely disappointing for humans because we really, really love to know things.
    • In sum: we love to know things, but ultimately we can't know for sure that we know them; we are bad at recognizing when we don't know something, and we are very, very good at making stuff up.
    • The feeling of knowing something is incredibly convincing and satisfying, but it is not a good way to gauge the accuracy of our knowledge
    • The way we actually remember is by rebuilding the memory afresh every time. The vividness and ease of recall might be side effects of the rebuilding process itself.
      => unfortunately, Plato's wax tablet is an excellent description of how remembering feels.
    • If we don't know something and know that we don't know it, we can admit it. However, if we don't know something and don't realize that we don't know it, our brains make things up to fill in the gaps. This is called confabulation, and we are excellent at it.
    • In the end, everything we 'know' is actually a belief.
  • We have implicit and explicit beliefs. However, we only recognize implicit beliefs when we are wrong about them in some way.
    • This is because these beliefs are models of our world, and only once a model fails do we try to create a new one.
    • We do this because at some point in our evolution, forming models of the world was necessary for survival (e.g., I can eat these red berries but not the blue ones; that rustling in the bushes is either a rabbit I can eat or a bear that will kill me)
    • However, this same mechanism which makes us form theories about important things also makes us theorize about everything.
  • 'Cuz it's true' constraint - we confuse our models of the world with the world itself
    • Ignorance assumption - other people simply don't have the facts; we do, and we have interpreted them correctly
    • Idiocy assumption - they have the facts, but don't have the brains to comprehend them
    • Evil assumption - they are smart enough and have the facts, but have deliberately turned their backs on the truth (prevalent in politics and religion)
    • This is the grim side of our passion for inventing theories. Like toddlers, we are quick to take our own stories for the infallible truth, and dismiss as wrongheaded or wicked anyone who disagrees. This also makes it difficult to accept our own fallibility because if we assume people who are wrong are ignorant, idiotic, or evil, we prefer not to confront this possibility in ourselves.
  • Evidence - ideally, we should not believe things without sufficient evidence
    • In practice, we routinely believe things based on paltry evidence; reasoning from incomplete evidence is the engine that drives human cognition
    • Inductive reasoning - we draw conclusions based on whether the evidence we do have supports one conclusion better than another.
    • This makes us 'leap' to conclusions, sometimes to the wrong conclusions.
    • Inductive bias (Confirmation bias) - we give more weight to evidence that confirms our beliefs than evidence that contradicts them.
    • This also causes us not to look for or accept evidence that contradicts us
  • Society - many of our beliefs are a matter of fate (where we are born, our family)
    • Instead of vetting information we simply vet the source, and then trust their information.
    • Asch study - subjects see a card with a single line and another card with three lines, and must say which of the three matches (the answer is obvious). Asked individually, people err about 1% of the time; in a group where the other members answer incorrectly, the subject's error rate jumps to 37%! A follow-up study showed that when the group answers first, subjects' brains function differently - the group's judgment actually changed how people saw the lines. It took only three confederates to produce this effect, yet if even one person in a large group gives the correct answer, real subjects start giving it as well.
    • We also tend to disregard beliefs from people we disagree with or who are unfamiliar.
    • Dissent from within a community is especially potent: it takes only one person from the group voicing a different conclusion to help others reach it too, since an insider is much harder to disregard.
  • Certainty - conviction that we cannot be wrong - mental state of being without doubt
    • Certainty is fed by the feeling of knowing, our sensory perceptions, the 'cuz it's true' constraint (the conviction that our beliefs are grounded in the facts), and the biases we bring to bear when we assess evidence for and against those beliefs. Our communities and convictions are mutually reinforcing, so we can't question our beliefs without risking the support, status, and sense of identity that come with belonging to a community.
    • This type of certainty can be 'murderous,' but it is also evolutionarily advantageous: when escaping danger, you can't afford to stop and doubt whether you'll make it.
    • Spinoza - disbelieving is a two-step process: we initially accept an idea and only subsequently reject it. => Researchers set up an experiment in which they told people false statements and interrupted them immediately afterward. Because subjects never had a chance to mentally evaluate and reject the false idea, they reported believing it.
  • Being Wrong
    • We cling to beliefs until something better comes along; we are rarely 'between beliefs'.
    • If we are between beliefs about something big, something our model of the world is built on, we can feel completely lost (panic, anguish, rage, and the fear that we lack the resources to find our way in the world again). With that model of the world gone, our understanding of how the world works is gone too, until we figure out a new one.
    • Being wrong about any type of belief makes you feel something, and it is usually this feeling that we try to avoid. Being able to admit you're wrong depends on your ability to tolerate that emotion.
