Medicine’s Fundamentalists

Norman Doidge, a contributing writer for Tablet, is a psychiatrist, psychoanalyst, and author of The Brain That Changes Itself and The Brain’s Way of Healing.

‘The phrase, “all else being equal,” is crucial, because so often all else is not equal. Simply repeating “RCTs are the gold standard of evidence-based medicine” implies to the naive listener that if it is an RCT then it must be a good study, and reliable, and replicable. It leaves out that most studies have many steps in them, and even if they have a randomization component, they can be badly designed in a step or two, and then lead to misinformation. Then there is the very uncomfortable fact that, so often, RCTs can’t even be replicated, and so often contradict each other, as anyone who has followed RCTs done on their own medical condition often sadly finds out. …’

Read here:

We seem to read frequently that scientific evidence in medicine is no good unless it comes from randomized clinical trials. This is not really the case, and has never been the case. Norman Doidge, psychiatrist, psychoanalyst, and author of The Brain That Changes Itself and The Brain’s Way of Healing, explains how medical science really works, and fails to work.

There are big issues discussed in the article pertaining to research designs, controls, confounds, randomization, statistics, bias, incentives, and deception, which anyone using scientific evidence to make a judgment or buttress an argument should understand. All scientific evidence is underdetermined. One study does not establish a case; it only points in a certain direction.
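The point that one study only points in a direction can be made concrete with a toy simulation (not from the article; the effect size, sample size, and crude z-test are illustrative assumptions): run the same hypothetical RCT many times on a real but modest effect and count how often it reaches statistical significance.

```python
import random
import statistics

random.seed(42)

def simulated_rct(true_effect=0.3, n_per_arm=50):
    """One hypothetical trial: treatment shifts the outcome by true_effect SDs.

    Returns True if the trial would be declared 'significant' at the 5% level.
    """
    control = [random.gauss(0, 1) for _ in range(n_per_arm)]
    treated = [random.gauss(true_effect, 1) for _ in range(n_per_arm)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = ((statistics.variance(control) + statistics.variance(treated))
          / n_per_arm) ** 0.5
    return abs(diff / se) > 1.96  # crude two-sided z-test

# Run 1000 identical trials of an effect that genuinely exists.
results = [simulated_rct() for _ in range(1000)]
print(f"{sum(results) / 10:.1f}% of identical trials were 'significant'")
```

With these assumed numbers only roughly a third of the trials come out "positive," so two honestly conducted RCTs of the very same true effect will frequently appear to contradict each other through sampling variability alone.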

An earlier article by Doidge was on HCQ and the campaign against it. Doidge disputed Fauci’s claim that HCQ did not work because the evidence in its favor was only anecdotal, and the studies had not been done according to the randomized controlled trial paradigm, the alleged “gold standard.”

The production, evaluation, and interpretation of scientific evidence is far more complicated than that. Claiming that RCTs are the only method of gaining scientific understanding shows a limited knowledge of the history of science. This has been pointed out by Dr. Harvey Risch in several places, in his defence of HCQ for early-stage treatment and prophylactic use for Covid-19. So, you seem to be implicitly aligning yourself with Fauci’s irrational position.

Below are other references:

Understanding and misunderstanding randomized controlled trials

Angus Deaton and Nancy Cartwright

Social Science & Medicine, Volume 210, August 2018, Pages 2-21

and also this: Evidence for Health Decision Making Beyond Randomized, Controlled Trials

Thomas R. Frieden, M.D., M.P.H.

More from Dr. Norman Doidge:

click here

“But drug companies are big businesses, and when they bring a drug to market, they do studies that display an aptitude for not asking questions they don’t want the answer to. Relatively little attention is paid to documenting even short-term side effects in studies. How little? A recent review of 192 randomized control trials, in seven different areas of medicine, showed most randomized control trials for drugs (61%-71%) didn’t deal adequately with short-term drug toxicity, and those that dealt with it devoted the same amount of page space, in the published articles, as was taken up by listing the author’s credentials. And of course we can only learn of the long-term side effects decades later.”

You might also want to read my essays “Randomized Controlled Trials” and “Randomized Control Trials and Experimental Evidence.”

When to remain silent

“Knowing when to draw inference, and when to remain silent (epoché) – is the key to fortress wisdom.” –https://theethicalskeptic.com/2015/04/08/a-new-ethic/

Contempt prior to investigation

“There is a principle which is a bar against all information, which is proof against all arguments and which cannot fail to keep a man in everlasting ignorance – that principle is contempt prior to investigation.” –attributed to the English philosopher Herbert Spencer

Effective Skepticism

Humble

  • Aware of limitations in own beliefs
  • Not arrogant

Independent in thought

  • Not a worshiper of the status quo
  • Able to avoid social pressures and group-think
  • Understanding that consensus does not imply truth

Not intellectually combative

  • Not attacking the views of others
  • Not trying to win
  • Not mocking the views of others
  • No ad hominem attacks

Not dogmatic

  • Not a pseudo skeptic
  • Non-theological
  • Non-authoritarian
  • Agnostic in all things
  • No such thing as “the science is settled”

Open to Ideas

  • But not credulous
  • Capable of wonder
  • Suspending judgment
  • Not closed to new ideas
  • Open to new perspectives
  • Striving to put aside biases
  • Not rejecting without consideration of evidence
  • Not pathologically skeptical

Reflective

  • Trying to understand
  • Questioning self
  • Questioning others
  • Evaluative
  • Cautious in reaching conclusions
  • Using evidence based thinking

Willful ignorance

Herbert I. Weisberg. John Wiley & Sons, June 23, 2014. 452 pages.

An original account of willful ignorance and how this principle relates to modern probability and statistical methods

Through a series of colorful stories about great thinkers and the problems they chose to solve, the author traces the historical evolution of probability and explains how statistical methods have helped to propel scientific research. However, the past success of statistics has depended on vast, deliberate simplifications amounting to willful ignorance, and this very success now threatens future advances in medicine, the social sciences, and other fields. Limitations of existing methods result in frequent reversals of scientific findings and recommendations, to the consternation of both scientists and the lay public.

Willful Ignorance: The Mismeasure of Uncertainty exposes the fallacy of regarding probability as the full measure of our uncertainty. The book explains how statistical methodology, though enormously productive and influential over the past century, is approaching a crisis. The deep and troubling divide between qualitative and quantitative modes of research, and between research and practice, are reflections of this underlying problem. The author outlines a path toward the re-engineering of data analysis to help close these gaps and accelerate scientific discovery.

Willful Ignorance: The Mismeasure of Uncertainty presents essential information and novel ideas that should be of interest to anyone concerned about the future of scientific research. The book is especially pertinent for professionals in statistics and related fields, including practicing and research clinicians, biomedical and social science researchers, business leaders, and policy-makers.

https://play.google.com/store/books/details?id=L-JvBAAAQBAJ&rdid=book-L-JvBAAAQBAJ&rdot=1&source=gbs_atb&pcampaignid=books_booksearch_atb

Evidence and understanding

Change is ceaseless over time; obvious or imperceptible, but always there. We can break it down as punctuation points, events, as discrete or continuous as we may find convenient. This aids our own understanding, our limited grasp of the world. Events leave traces in their passing, evidence that they have occurred. We look to the evidence, the events, things and their relationships to the world in order to understand the puzzles of existence.

Our perceptions can provide evidence for innumerable things, most of them unnoticed. We evaluate and interpret the traces left in the world by events in order to understand.

Claims of Knowledge

The problem is belief

Claims of knowledge depend on belief, justified beliefs according to some. Is it possible to have correct and justified beliefs? This is a question that has bedevilled philosophers for millennia, and yet is relevant to practical day to day concerns.

Our current understanding is limited and undoubtedly incorrect in numerous ways. Over time, our understanding will change, perhaps for the better, but perhaps for the worse. Maybe in some respects we have it correct, at least in small and practical ways. When looking at many complex issues, our understanding becomes a lot more suspect.

Perhaps we are best served by regarding all beliefs as tentative, by reserving judgment and looking for evidence. We are better off realizing that we do not know than believing that which is incorrect. The latter is frequently far more damaging.

We can see an example in religion. Atheism and deism are both claims to knowledge, knowledge which I do not think we are or will ever be in a position to be certain about. Clearly the claims of the atheists and the claims of the deists cannot both be correct. It is better to be agnostic, to be non-dogmatic, and to suspend judgment. See Understanding through existing beliefs at https://ephektikoi.ca/2020/04/26/understanding-through-existing-beliefs/

It helps to discard the delusion that we stand on some privileged position of knowledge, having a direct conduit from the omniscient one.

Tracking Truth: Knowledge, Evidence, and Science

Sherrilyn Roush

Abstract

This book develops and defends a new externalist, reliabilist theory of knowledge and evidence, and develops a new view about scientific realism. Knowledge is viewed as a tracking theory that has a conditional probability rather than counterfactual formulation, and the property of closure under known implication is imposed on knowledge. It is argued that the tracking theory of evidence is best formulated and defended as a confirmation theory based on the Likelihood Ratio. These tracking theories of knowledge and evidence fit together to provide a deep explanation of why having better evidence makes one more likely to know. The new tracking theory of knowledge is argued to be superior to all currently known externalist rivals. It provides a distinctive explanation of why knowledge is more valuable than mere true belief, and explains why knowledge is power in the Baconian sense. Finally, the book argues that confirmation theory is relevant to debates about scientific realism, and defends a position intermediate between realism and anti-realism based on a view about what having evidence requires.
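The “Likelihood Ratio” in Roush’s confirmation theory is the standard Bayesian comparison P(E|H) / P(E|not-H). A minimal sketch of how it works (the numbers are hypothetical, not from the book): evidence supports a hypothesis to the degree that it is more probable under that hypothesis than under its negation, and the ratio multiplies the prior odds.

```python
def posterior_probability(prior, p_e_given_h, p_e_given_not_h):
    """Bayesian update expressed through the likelihood ratio.

    prior            : P(H) before seeing the evidence
    p_e_given_h      : P(E | H)
    p_e_given_not_h  : P(E | not-H)
    """
    likelihood_ratio = p_e_given_h / p_e_given_not_h
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Hypothetical test: 90% sensitivity, 5% false-positive rate, 1% base rate.
print(f"{posterior_probability(0.01, 0.90, 0.05):.3f}")  # prints 0.154
```

Even fairly strong evidence (a likelihood ratio of 18 here) leaves the posterior modest when the prior is low, which illustrates why having better evidence, in Roush’s sense, matters so much to what we can claim to know.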

https://oxford.universitypressscholarship.com/view/10.1093/0199274738.001.0001/acprof-9780199274734

Perception is constructive

“In online discussions, this is usually due to someone misreading the headline and not reading the study. Other times, it is due to expectation bias, the human tendency that can cause people to see or experience what they expect, despite the fact that it is not there. Such things are possible because the nature of our perception is constructive. We don’t see the world as it is; our brain constructs a world for us to experience. Because it often does so based on what it expects to be there, we will often literally see just what we expect to see. One can easily read a study and literally see something that is not there.” — David Kyle Johnson Ph.D. https://www.psychologytoday.com/ca/blog/logical-take/202007/yes-masks-work-debunking-the-pseudoscience

Communication

I am going to use the neutral term communication to cover the transmission of information from one person to another. This may be spoken or written information, or perhaps communications transmitted by other means. We do not have a special term for information that is correct, although we have special terms for information that is incorrect, misinformation, and for information that is deliberately deceptive, disinformation.

Below is a mind map showing these relationships. After the diagram, I will expand a bit on these ideas.

1 Information

Sometimes communication contains true assertions. This can happen in at least two ways:

  1. Logical truth
  2. Coincidental truth

Some true assertions are arrived at correctly, logically, by reasoning from evidence. If the premises are true and the logic is valid, then the conclusions are supported by reason.

Some true assertions are arrived at by chance, guess work, or whim. They are not arrived at from true premises and logical reasoning. Although the conclusions are true, this is only a coincidence.

2 Misinformation

Sometimes communication contains false assertions, but not outright lies. This can happen in at least these ways:

  1. Poor logic
  2. Mistaken evidence
  3. Incorrect premises
  4. Clinical delusion
  5. Confabulation

Poor logic means that even if the premises are true, the conclusions do not follow because the logic is shaky. If the premises are false, you will not have a logically supported conclusion either.

On the other hand, with coincidentally correct assertions, it does not matter if either the premises or the logic are shaky; you get a true conclusion that is nonetheless unjustified.

Evidence requires interpretation. Sometimes this is straightforward, but frequently it is not. So, conclusions based on suspect interpretations of evidence are of course also suspect.

Incorrect premises cannot produce correct assertions through logic. Sometimes our premises are articulated explicitly, and sometimes they are left implicit and ambiguous. In either case, if they are wrong we cannot reach a correct conclusion by reasoning from them.

Clinical delusion will result in an inability to think coherently and consistently. It is not likely to lead to logically supported assertions.

Confabulation is another pathological condition, in which people come up with fact-free explanations for events in order to interpret life experiences and answer questions. It is not deliberate lying, but it results in inadvertent falsehoods.

3 Disinformation

Disinformation is deliberate deception. It comes in at least three varieties:

  1. Propaganda
  2. Recreational lies
  3. Lies and evasions of convenience

Governments, organizations, businesses and individuals engage in deliberate deception all too often, in order to gain an advantage at the expense of others, or to cover their own misadventures. It is called propaganda when done by governments and organizations, advertising when done by business, and lies when done by the rest of us. In any case, it generally puts the recipient of the disinformation at a disadvantage, and sabotages trust if found out.

Recreational lies and hoaxes are common among people who take delight in duping or gas-lighting others. It takes a certain amount of malevolence, narcissism, Machiavellianism, sociopathy or even psychopathy to engage in this activity. Online trolls are one aspect of this. Hoaxers are another. There are just far too many who delight in spreading bull excrement.

Lies and evasions of convenience are common. People do this both deliberately and reflexively, to get themselves out of immediate trouble or socially awkward situations, or to avoid giving offence. Some of these are classed as “white lies.” They can often backfire. Children do this all of the time to avoid negative consequences. Adults may not be that much different. Such lies are often discovered, reduce trust, and may anger others. Short-term gain and long-term pain may be the real consequence. In situation comedies, it has been the norm to build entire episodes around such lies and their outcomes.

Motivated reasoning

From Wikipedia; see https://en.wikipedia.org/wiki/Motivated_reasoning:

“Motivated reasoning is a phenomenon studied in cognitive science and social psychology that uses emotionally-biased reasoning to produce justifications or make decisions that are most desired rather than those that accurately reflect the evidence, while still reducing cognitive dissonance. In other words, motivated reasoning is the “tendency to find arguments in favor of conclusions we want to believe to be stronger than arguments for conclusions we do not want to believe”.[1] It can lead to forming and clinging to false beliefs despite substantial evidence to the contrary. The desired outcome acts as a filter that affects evaluation of scientific evidence and of other people.[2]

“Motivated reasoning is similar to confirmation bias, where evidence that confirms a belief (which might be a logical belief, rather than an emotional one) is either sought after more or given more credibility than evidence that disconfirms a belief. It stands in contrast to critical thinking where beliefs are approached in a skeptical and unbiased fashion.”

My response is: are any beliefs not motivated, not subject to confirmation and disconfirmation bias? We are more emotionally invested in some beliefs than in others of course.

This emotional attachment seems to me to be the basis for the idea of cognitive dissonance, a discomfort when confronted with ideas that challenge our beliefs. Many people parrot the term cognitive dissonance, but I suspect they have not looked at the underlying studies by Leon Festinger. See https://en.wikipedia.org/wiki/Leon_Festinger

As I have said elsewhere, we can only reason from the basis of existing beliefs, and it could not be otherwise. These beliefs serve as the grounds for further belief, and are resistant to revision if they link to other beliefs in an extensive network, and also are held with emotion and conviction. https://ephektikoi.ca/2020/04/24/clarity-or-murk/, https://ephektikoi.ca/2020/04/30/the-fundamental-problem-is-belief/

Also, from https://ephektikoi.ca/2020/04/29/interpreting-the-world/:

“We don’t understand the world as much as interpret it. Things and events come to our attention, are perceived, and interpreted; sometimes more or less correctly; frequently quite incorrectly. We understand things in the context of our current beliefs, values, biases and emotional investment. It could not be otherwise. Some types of events lead to a more correct understanding. Some types of events will probably always be beyond our ability to comprehend.”

— Ephektikoi

I know that my views are subject to these processes, despite my best intentions. However, those lacking emotional intelligence also reason badly; we cannot be Mr. Spock from the fictional Star Trek series and still function rationally. See: https://en.wikipedia.org/wiki/Emotional_intelligence