Automated Decisions in Health Care

by Barbara A. Olevitch, Ph.D.

Author: Dr. Olevitch is a psychologist with a background in clinical research and practice who has devoted herself to writing about “end of life” issues from a Jewish perspective.

A feature story by Eliza Gray, published in Time magazine on June 22, revealed that many employers are using lengthy personality tests to choose their employees. The questions do not necessarily measure the candidates’ abilities to do their proposed jobs. Instead, they determine whether the candidates resemble the employees who are already doing the job successfully.

This is a very popular way of making decisions – get an “expert” to make up a list of questions, get the candidates to input their information, and then, BINGO, an automatic decision!

So it is not so surprising, given our current culture, that an article published on January 23, 2015, in the Telegraph announced, “A test to determine if elderly patients will die within 30 days of being admitted to hospital has been developed by doctors to give them the chance to go home or say goodbye to loved ones.”

More details about this “test” can be found in a Medscape article published on the same day. It is based on an article in BMJ Supportive and Palliative Care. The authors searched for “predictors” of death and came up with a checklist.

A predictor is any item of information – fair or unfair – that you can combine with other information to produce a score for predicting something else.
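
To see how such a checklist works mechanically, here is a minimal sketch in Python. The items and weights are purely hypothetical – invented for illustration, not taken from the published tool.

    # Hypothetical checklist score – the items and weights below are invented
    # for illustration and are not taken from any published instrument.
    CHECKLIST = {
        "age_over_65": 1,
        "admitted_in_past_year": 1,
        "admitted_from_nursing_home": 1,
    }

    def risk_score(patient):
        """Sum the weights of every checklist item the patient matches."""
        return sum(weight for item, weight in CHECKLIST.items() if patient.get(item))

    # A patient who ticks two boxes scores 2, regardless of what is actually
    # wrong with him or whether treatment would help.
    print(risk_score({"age_over_65": True, "admitted_in_past_year": True}))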

Before we consider the individual items that these authors used, we must first ask whether the job of a health care facility is to treat or to predict. We go to an emergency room, not to a fortune teller. We want the best treatment, and we hope it will help us. Maybe it won’t, but we want to try. We want to encounter gifted physicians and nurses, not clever computer whizzes teamed up with cautious accountants trying to decide whom to discourage from pursuing medical treatment.

By using this kind of “test,” they try to make it appear that they are making a scientific decision. The patient will have to hear about how he does on the “death test.” Only the patient who has the courage to ignore a pessimistic prognosis and insist that he wants medical treatment will get to meet the gifted physician who might possibly help him.

In any predictive “test,” the items can be fair or unfair. For example, if you are trying to predict how well a student will do in college, you might use his high school grade point average as a predictor. Most people would consider this to be fair.

But what if you use his skin color? Or what neighborhood he grew up in? Is that fair? Most people would say that it is not fair. Even if it did have some predictive value, it wouldn’t be fair to use it.

So now let’s consider some of the items on the “death test.” One item was whether the patient had been hospitalized in the past year! Is this fair? The number of admissions tells us exactly nothing about the patient’s medical condition! He could have had multiple unrelated problems, or he could have been admitted for the same problem multiple times. Perhaps he wasn’t treated correctly? Perhaps he was discharged too soon? The fact that he was hospitalized previously doesn’t even mean that he is unhealthy. Perhaps he was injured and recovered and was then injured again. When we create a seemingly objective death-likelihood “score,” we are bound to deny needed and appropriate medical care to many patients!

Another item on the Medscape list was “age over 65”! Is this fair?

Even in the corporate world, some people are objecting to using these inventories. According to Eliza Gray’s article, the senior engineers at Google didn’t want to use an algorithm for deciding upon promotions because the decision was too important. Their hiring executive said that such algorithms might weed out certain people who would be valuable to the company, and that they could have a major impact on the lives of individuals who might be denied a promotion.

In the health care arena, the impact is much greater. It can mean life or death.

If rationing health care is being proposed, the proponents should at least have the nerve to suggest such a thing openly and be prepared to encounter a lot of opposition. Let’s not be fooled when they try to disguise it as a scientific discovery! The person’s age is input into an elaborate computer program and the chances of predicting his death are slightly increased – big deal. This is not exactly a scientific discovery. The neighborhood gossip looking out the window could predict somebody’s death using the same information – “This guy called an ambulance again. He looks pretty old. I don’t think he’s coming back this time.”

It is true that if you use enough items like this to create a “score,” you can be right some of the time that somebody is going to die (especially if you convince him to go home instead of being treated)! But just because you can make a few correct predictions doesn’t mean that decisions based upon these predictions are medically defensible or ethically acceptable.
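
A quick back-of-the-envelope calculation, using assumed numbers, shows why “being right some of the time” proves very little: in a group where death is already common, even a crude rule racks up correct predictions.

    # Assumed numbers for illustration only: suppose 1,000 frail elderly
    # admissions, of whom 200 die within 30 days, and a crude rule flags the
    # 300 oldest-looking patients, happening to catch 120 of those deaths.
    flagged = 300
    deaths_among_flagged = 120

    precision = deaths_among_flagged / flagged
    print(f"{precision:.0%} of flagged patients die within 30 days")
    # Prints "40% ..." – which may sound impressive, yet 60% of those flagged
    # would have survived, and nothing in the rule says whether treatment
    # would have helped any of them.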

With this sort of “death test,” who needs a death panel?

We have to be alert to these kinds of developments. We only have the right to refuse medical care if the medical care is still being offered. If it is not truly being offered, if health care counselors are using more and more elaborate techniques to try to talk patients out of requesting care, and if it is beyond the ability of the average patient to withstand their tactics, then we no longer have patient autonomy.
