I have, for some years now, been reading articles and watching lectures by Dr. Steven Novella. Some of these quotes are from articles and lectures, but most of them are from Dr. Novella’s course called “Your Deceptive Mind: A Scientific Guide to Critical Thinking Skills”.
In my opinion this course is the best introductory course to critical thinking and the philosophy of science. Of the many free online lectures by Dr. Novella, these two are the best in my opinion: The Skeptical Neurologist (2011) and How to Think Like a Skeptical Neurologist (2014).
Dr. Novella is a clinical neurologist and assistant professor at Yale University School of Medicine. Dr. Novella received his M.D. from Georgetown University and went on to complete residency training in neurology at Yale. He is active in medical education at every level, teaching patients, the public, and medical students, and providing continuing education for medical professionals; he also performs clinical research.
Dr. Steven Novella, M.D. is also the president and cofounder of the New England Skeptical Society, a nonprofit educational organization dedicated to promoting the public understanding of science. He is also the founder and senior editor of Science-Based Medicine, a medical and health blog with contributions from dozens of physicians and scientists. Science-Based Medicine is dedicated to promoting the highest standards of both basic and clinical science in medical practice.
Without further ado, here are 37 quotes from Dr. Steven Novella:
“There is nothing magical about science. It is simply a systematic way for carefully and thoroughly observing nature and using consistent logic to evaluate results. So which part of that exactly do you disagree with? Do you disagree with being thorough? Using careful observation? Being systematic? Or using consistent logic?” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“The thing that I think separates skeptics from non-skeptics, more than anything else, is a thorough knowledge of the mechanisms of self-deception. One of the key things of being a skeptic is the humility to know that we can’t rely on anything that we think we know.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“There is a tendency to grossly underestimate the human potential for self-deception, even among skeptics. We all want to view ourselves as rational beings, but that is the greatest self-deception of all. If you think you are not biased and cannot rationalize a completely erroneous belief, then that bias can undo all of your critical thinking skills.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Science is the foundation of critical thinking; it involves the methods for testing our beliefs about the natural world. The strengths of science are that it is transparent, rigorous, systematic, and quantitative. In other words, science is a system of methods that seeks to compensate for the failings of human thinking, perception, and memory.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“You cannot trust any individual study. When people say “a study showed,” I don’t care what a study showed; you can’t interpret “a study,” you have to interpret the literature, the whole course of the literature. What patterns does it show? What’s the relationship between the rigorousness of the methodology and how big the outcome is? Is it positive or negative?” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“We are very influenced by stories, none more so than our own. So when patients say “but this worked for me,” that is like a game ender; you can’t even get through to a person anymore when they say, “but I got better when I took this treatment, so I don’t care what you are saying about science and evidence, I believe my own experience.” But the core lesson there is: no, you should not believe your own experience. You can’t believe your own experience, because it is so flawed and biased.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Confirmation bias: A cognitive bias to support beliefs we already hold, including the tendency to notice and accept confirming information while ignoring or rationalizing disconfirming information.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Science also needs to be understood as a human endeavor. Therefore, science is imperfect and messy, has many false steps, and is plagued with bias and error. Fortunately, science is also self-correcting, which is perhaps its strongest feature.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Cognitive biases affect the way we argue and the way we think. Our minds will tend to take a path of least resistance unless we make a specific high-energy effort to step out of these processes and think in a more clear and logical manner.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“While we have some inherent sense of logic, we are overwhelmingly emotional creatures. We have the capacity for logic, but logic and critical thinking are skills. We’re not born as master critical thinkers—just as we’re not born as violinists. Both are skills that need to be developed and practiced over many years.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“The worst cognitive bias is the one you are not aware of.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Anecdotes are a way of remembering hits and forgetting misses and seeing patterns in a vast, perhaps unappreciated, set of data.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“If you are basing your claims on anecdotal experience, then any treatment will seem to work for anything and everything” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Interestingly, confidence in a memory is not a good predictor of the accuracy of a memory. We tend to naively assume that if we are very confident in a memory—if it feels vivid to us and if it can be easily recalled—it must therefore be accurate, but the research does not support this.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“There is also a tendency to rely upon anecdotal evidence and testimony. Anecdotes are uncontrolled, or ad-hoc, observations. They are not systematic; they are, therefore, plagued with confirmation bias and recall bias. Pseudoscientists will often heavily rely upon this type of evidence because, essentially, they could make it say whatever they want it to say.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Much of what we remember and believe is flawed or simply wrong. Our brains seem to constantly generate false observations, memories, and beliefs—and yet we tend to take the truth of our experiences for granted.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“With a thorough understanding of logic and cognitive biases, however, we have the opportunity to engage in metacognitive reasoning. We can consciously put into place a metacognitive self-check on our own reasoning. We engage our frontal lobe executive function to filter and inhibit our more primitive cognitive impulses. We can try to transcend our biases by being truly aware of them. Even with the understanding of metacognition, when logic and evidence leads us to uncomfortable conclusions, this will create cognitive dissonance. Recognizing that dissonance and how it will motivate you empowers you to engage in metacognition – to choose an adaptive, rational resolution rather than rationalizing a convenient answer.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“The critical thinking approach to argumentation is to value the process of developing your arguments and reaching conclusions; a critical thinker should be willing to change any conclusion when new information or a better argument is presented” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“To avoid cognitive dissonance, we should focus on the process instead. In other words, if we don’t tie ourselves firmly to a conclusion, then we won’t feel any emotional dissonance when new data is encountered that shows that the conclusion is wrong” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“It’s OK to make grandiose claims as long as you have the evidence to back them up” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Scientists believe that anecdotes are a very dubious form of evidence because the variables are not controlled and the observations are not systematic. The reason why that is a problem is because anecdotes are a way of subconscious data mining and are subject to confirmation biases, memory effects, and other cognitive biases.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Logical fallacies are arguments in which the conclusion does not have to be true if the premises are true. The generic term for this type of invalid logic is the non sequitur, which literally means “it does not follow”—or the conclusion does not follow from the premises. Essentially, logical fallacies are our mechanisms for rationalizing conclusions.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Logical fallacies give us the ability to rationalize. When we are defending a conclusion that isn’t true, we can’t use valid logic and true premises in order to support it; we have to use either assumptions or false premises, or we need to use invalid logic, in order to construct an argument to defend an incorrect conclusion. That is what logical fallacies are: essentially our mechanism for rationalizing our own conclusions.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“There are several lessons to be learned from the story of Lord Kelvin. The first is that scientific authority can never rest in one individual—no matter how famous or successful their career.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Another heuristic is the escalation of commitment. Once we have committed to a decision, we tend to stick to that commitment. We feel like we have invested in it, and therefore, that feeling biases all later judgments about that commitment. We’re overly influenced by what we have already committed to, even if further commitment is a losing proposition—including money, time, or even soldiers’ lives.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“An example of a confirmation bias is called the toupee fallacy. Some people believe that they can always tell when a man is wearing a toupee because when they notice a man wearing a toupee, they take that as confirmation of their ability. However, they’re not accounting for the fact that they don’t notice when they don’t recognize a toupee—that data is completely missing from their dataset.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“A choice-supportive bias is a bias in which once we make a decision, we then assess that decision much more favorably. This is a way of relieving some of the anxiety or angst over whether we made the right decision. When we buy something, we therefore have a tendency to rate what we purchased much more favorably than we did prior to deciding that that’s what we were going to purchase. In essence, we’re trying to justify a decision that we already made.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Another very powerful bias is what psychologists call the fundamental attribution error, which is an actor-observer bias, or a tendency to explain the actions of others according to their personality traits while downplaying situational factors. However, we explain our own behavior with situational factors and downplay personality traits. For example, if someone trips while walking down the sidewalk, we’re likely to conclude that he or she is a clumsy person. If we trip when walking down the sidewalk, however, we will blame it on an external factor, such as a crack in the sidewalk.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Wishful thinking is another bias toward favorable ideas that are emotionally appealing to us regardless of the logic and evidence. This is also called an optimism bias. For example, this motivates people to seek highly implausible—even magical—treatments for their ailments over warnings that such treatments do not work. In this case, their desire for the treatment to work overwhelms their logic. The lottery industry is largely based on this bias.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Observational studies do not control many variables, although you can try to account for as many variables as you can think of. Therefore, observational studies are always subject to unknown variables—those that you haven’t thought of or that cannot be accounted for. Observational studies generally can only demonstrate correlation; they cannot definitively establish cause and effect.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“The primary weakness of anecdotes as evidence is that they are not controlled. This opens them up to many hidden variables that could potentially affect the results. We therefore cannot make any reliable assumptions about which variable (for example a specific treatment) was responsible for any apparent improvement.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“When someone looks at me and earnestly says, “I know what I saw,” I am fond of replying, “No you don’t.” You have a distorted and constructed memory of a distorted and constructed perception, both of which are subservient to whatever narrative your brain is operating under.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“In any rigorous study, the scientists that are recording the outcome of the results should be blinded to whether what they’re looking at is in the intervention or in the control group. Failing to blind them can introduce significant subconscious researcher bias into the results.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“Multilevel marketing schemes prey on our innumeracy. To maintain the pyramid structure of a multilevel marketing scheme, geometrically more people would have to be recruited. For example, salespeople are encouraged to increase the number of people they recruit, who then sell for them. Sometimes they’re required to recruit a certain minimum number of people in order to make back their investment. Then, the recruited people have to recruit the same number of people to make back their investment. These progressions are inherently unsustainable, and we underestimate how quickly a community, city, region, or even the world can be saturated with any multilevel marketing program.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“We tend to overestimate the probability of something happening if we see it, which relates to the availability heuristic. We tend to latch onto examples that are available to us and that we are aware of.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“The notion of life energy, for example, is a prescientific idea, but it forms the basis of many so-called alternative therapies, such as therapeutic touch, acupuncture, straight chiropractic, and even homeopathy. Often, pseudoscience involves grandiose claims based on preliminary or flimsy evidence. This is sometimes called the Galileo syndrome for the frequent tendency to compare oneself to Galileo. In other words, far-reaching claims that overturn entire segments of well-established science are extrapolated from very little research or small bits of evidence.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine
“We don’t recall memories as much as we reconstruct and update them, altering the information every time we access it. Our brains also fill in gaps by making up information as needed. Additionally, a host of logical flaws and cognitive biases plague our thinking, unless we are specifically aware of and avoid those fallacies. In this course, you will explore logical fallacies and cognitive biases in detail, learning how they affect thinking in often subtle ways.” Dr. Steven Novella, Assistant Professor of Neurology at the Yale School of Medicine