
The Nobel in Economics and Medicine?

Once again, the Nobel Prize for economics, not the one for science and medicine, has immense influence on the practice of medicine.

Every day, in fact.

This year, Richard Thaler, a behavioral economist at the University of Chicago, won for his work on human biases and temptations.

The famous writer Michael Lewis (Moneyball) has a nice essay on Thaler’s work here.

Like that of Kahneman and Tversky, the work of behavioral psychologists and economists directly relates to clinical medicine because it describes human decision making.

Thaler made lists of irrational decisions. For example, we often make choices that undermine long-term well-being, such as eating sugary foods despite the risk of obesity. This led him and his colleague Cass Sunstein to the concept of choice architecture.

This concept, applied to employee savings programs, led to an increase in the savings rate of workers.

Thaler calls it a sort of libertarian paternalism.

That’s sort of what doctors do, isn’t it? We are experts in medical science; patients are experts in their goals, and the best medical decisions come when we help align care with a person’s goals.

Libertarian paternalism: freedom of choice with expert nudges.

What’s critical for doctors to understand, and something I wrestle with every day in the office, is that humans are not maximizers, or logical, or even all that sensible. Doctors feel decisions. So do our patients.

Kahneman and Tversky described how people respond differently when a choice is framed as a loss than when it is framed as a gain. As Lewis writes: “tell a person that he had a 95 percent chance of surviving some medical procedure and he was far more likely to submit to it than if you told him he had a 5 percent chance of dying.”

In my earlier years as a physician, I would emphasize the 95% chance things would go well. Now, as I have aged and seen more of what Nassim Taleb calls iatrogenics (medical harm), I find myself more often feeling the potential harm. (I’m reading Taleb’s book Antifragile. It’s really good.)

Then I think to myself: how I feel about this decision determines the framing. And that will surely influence the patient. Gosh. This behavioral psychology stuff is damn important.

If I am too pessimistic, patients might lose out on the gain. If I am too optimistic, patients can be exposed to needless harm.

Take the decision to implant an ICD for prevention of sudden cardiac death.

The accepted benefit in selected patients is an absolute risk reduction of about 7%. That’s pretty good. It corresponds to an NNT (number needed to treat) of about 14, meaning you have to implant 14 ICDs to save one life. We can’t know which 13 patients won’t need the device, or which one will have his life extended.
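For readers who like to see the arithmetic, here is a minimal sketch of how the NNT falls out of that 7% figure; the function name and the rounding are mine, purely for illustration.

```python
# Minimal sketch: NNT is the reciprocal of the absolute risk reduction (ARR).
# The ~7% ARR is the figure quoted above; everything else is illustrative.

def number_needed_to_treat(absolute_risk_reduction: float) -> float:
    """Return the number needed to treat for a given absolute risk reduction."""
    return 1.0 / absolute_risk_reduction

arr = 0.07  # ~7% absolute risk reduction in selected ICD patients
print(round(number_needed_to_treat(arr)))  # ~14 implants per life saved
```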

But all treatments come with potential harms. A recent look at real-world data suggests ICD complications occur in about 7% of recipients.

How do I frame this? What if the day before this visit, I had a patient suffer a massive complication from an ICD? Or perhaps the day before, a person was saved from death with an appropriate shock.

 

Medical decisions are not so easy, because humans, as Thaler has found, are complicated. Thinking about how we humans think is one of the most interesting aspects of this job.

JMM

2 replies on “The Nobel in Economics and Medicine?”

So I am not a statistician, but I have had an ICD since 2011. Are you saying that it is a wash? The 7% benefit balances the 7% complication rate? And is the complication rate front loaded into the time of implantation, such that if you get past that, the benefit might outweigh the residual risk?
