The “residents” — medical school graduates who staff the hospital for several years as they learn specialties such as emergency medicine, internal medicine and family medicine — were furious when it became clear that just seven of the more than 1,300 at the medical center were in the first round for vaccinations. Also affected were “fellows” (who work in the hospital as they train further in subspecialties), as well as nurses and other staff.
An email to pediatrics residents and fellows obtained by The Washington Post said that “the Stanford vaccine algorithm failed to prioritize house staff,” as these early-career doctors are collectively known.
Of course, “medical school graduates” should read “doctors” or “physicians.”
Stanford blaming “the algorithm,” in its medical version of Silicon Valley doublespeak, is such a hand-wavy 2020 way of passing the buck. An algorithm is just a set of rules for carrying out a task.
The algorithm in this case isn’t some artificial intelligence black box. It represents value judgments made by fallible humans. In this case, the folks in charge forgot that a person’s function in the system, more than their benefits package, should determine how quickly they receive a vaccine that may not only protect them but also prevent the spread of a potentially deadly disease to patients.
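To make that concrete, here is a minimal sketch of how a points-based priority scheme encodes value judgments. This is hypothetical and not Stanford’s actual algorithm; the attributes, weights, and example people are all invented for illustration. The point is that nothing here is “neutral”: whoever picks the weights decides who goes first.

```python
def priority_score(person, weights):
    """Sum weighted attributes into a single priority score.

    The weights ARE the policy: rewarding seniority over patient
    exposure is a human value judgment, not a property of math.
    """
    return sum(weights[attr] * value for attr, value in person.items())

# Illustrative people described by 0/1 attributes (hypothetical).
resident     = {"age_over_65": 0, "seniority": 0, "patient_facing": 1}
senior_admin = {"age_over_65": 0, "seniority": 1, "patient_facing": 0}

# Two candidate weightings applied to the same people.
seniority_weights = {"age_over_65": 2, "seniority": 3, "patient_facing": 1}
exposure_weights  = {"age_over_65": 2, "seniority": 1, "patient_facing": 5}

# Under seniority-heavy weights the administrator outranks the resident;
# under exposure-heavy weights the ordering flips.
print(priority_score(resident, seniority_weights),
      priority_score(senior_admin, seniority_weights))  # 1 3
print(priority_score(resident, exposure_weights),
      priority_score(senior_admin, exposure_weights))   # 5 1
```

Same people, same formula, opposite outcomes: the only thing that changed was which values the designers chose to reward.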
Those on the front line should be at the front of the line (unless we were to otherwise reasonably implement the age-based vaccination program used in many countries). Within a hospital system stratified by job roles, it’s the environmental and food services staff, nurses, CNAs, techs, transport, and all providers directly facing patients who should get it when supplies are constrained.
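The “front line first” rule above amounts to a simple tiered sort. The sketch below is illustrative only; the tier assignments and role names are assumptions made for the example, not any hospital’s actual policy. What matters is that trainees are explicitly in tier 1, so they cannot be forgotten the way a seniority-driven formula forgets them.

```python
# Lower tier number = earlier in line; patient-facing roles come first.
# Tier assignments here are illustrative, not a real hospital policy.
ROLE_TIER = {
    "environmental_services": 1,
    "food_services": 1,
    "nurse": 1,
    "cna": 1,
    "tech": 1,
    "transport": 1,
    "resident": 1,   # trainees face patients daily; they belong in tier 1
    "fellow": 1,
    "attending": 1,
    "remote_admin": 2,  # lower exposure, later tier
}

def first_round(staff, doses):
    """Return the staff who receive the first `doses` doses, by tier."""
    ranked = sorted(staff, key=lambda s: ROLE_TIER[s["role"]])
    return ranked[:doses]

staff = [
    {"name": "A", "role": "remote_admin"},
    {"name": "B", "role": "resident"},
    {"name": "C", "role": "nurse"},
]
print([s["name"] for s in first_round(staff, 2)])  # ['B', 'C']
```

With only two doses for three people, the patient-facing resident and nurse come first and the remote administrator waits, which is the whole point of tiering by exposure rather than by employment status.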
The idea that leadership could forget a huge class of people working in the hospital because those people aren’t considered normal employees is just another example of trainees being treated as faceless, cog-like cheap labor. It’s an unfairness baked into our training system, in which the only pathway to independent practice as a physician is to be treated as a second-class citizen, and that arrangement implicitly supports these sorts of slights.
So, when you make a mistake so blatant that you aren’t sure how to respond, don’t blame the algorithm. Blame yourself.