Radiology and the Private Equity Bait and Switch

With permission, I’m reposting a (very lightly edited) anonymous social media post from a young radiologist who joined a private practice that had recently been purchased by private equity:

I think I made a huge mistake in signing up for a job with a large private practice group that was bought by a big private equity group in the radiology space. There has been massive turnover of non-partner radiologists, a pay cut of more than 30% in collections, very close monitoring of productivity, poor leadership, and no concern for younger radiologists. Almost everything told to me at the time of my interview turned out to be incorrect, including work volume, projected compensation, and the reasons non-partners had left.

Several older partners are retiring as they made their millions in the buyout. The practice can’t hire fast enough as younger rads keep quitting or getting fired, so we’re chronically understaffed. I work extremely high volume (25k or more RVUs/year) and extremely busy call with a very low salary of $300k.

I fear there could be another sale of the practice in the future given how rapidly this private equity company is trying to acquire other practices, further driving down salary. We’d stay understaffed (many non-partners leave after the buyout), so volume will likely still be too high, especially for the salary given. Could go on, but feeling really stuck. Any recommendations on whether I should stick it out or quit?

This is the private equity bait and switch, and I don’t even mean just the premeditated awfulness of an operational model largely predicated on buying a business with the intent of squeezing the value-creating units (human beings) for more value through more work for less pay.

I mean that, in many cases, the natural history of these practices can result in a nonviable work environment through what should be an expected evolution of staffing changes.

Let’s walk through one way this can play out:

  • Partners get a buyout that doesn’t fully vest for a pre-specified period.
  • Some non-partners immediately leave the practice due to perceived or real insult, an insufficiently generous retention package, or the knowledge that long-term income will fall. Non-partners expect to pay sweat equity to become a shareholder, but they don’t want to work hard for less pay if there isn’t a meaningful long-term partnership at the end of it.
  • Non-partner employee exodus results in immediate understaffing. With the same work and fewer people, everyone is taking more call and needs to work harder to keep up with the lists. Even a desirable practice can’t necessarily hire people instantaneously.
  • The job is therefore less desirable and may have a hard time recruiting and retaining talented employees.
  • The partners finally get all of their money. Many may have planned to retire anyway at this juncture, but many more will certainly move on if the job now sucks, which it often does.
  • Even more understaffing occurs.

Most PE practices haven’t gotten past this timeline yet. Potential outcomes are re-selling to a larger fund, selling the practice back to the original physician shareholders, and/or significant operational changes. In a chronic understaffing situation, employee pay sometimes becomes more competitive in order to retain FTEs. In some cases, the now underperforming practice may lose imaging contracts, which has the unanticipated benefit of fixing the understaffing problem.

Now to be clear, these issues can arise in any practice. There are certainly examples of physician-owned groups squeezing employees during the partnership track with the promise of partnership only to churn them when the time comes. Corporations and PE groups absolutely do not have a monopoly on being jerks. However, a once-profitable business is automatically less “profitable” when a third party inserts itself to take a mandatory cut. The value-add of oft-touted “efficiency” gains can only take you so far, particularly in the face of downward reimbursement pressure.

And to be clear, someone just making you work harder and read more cases per day for the same pay isn’t the kind of efficiency gain any group should be proud of. While $300k is obviously a lot of money, the average radiologist reads in the ballpark of 10k RVUs per year depending on subspecialty. 25k RVUs is a really big number, which means that the anonymous poster is generating a ton of money for someone else on the back of their misery.
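For a rough sense of the mismatch, here’s a back-of-envelope sketch. Only the 25k RVUs, the $300k salary, and the ~10k average come from the discussion above; the dollars-per-RVU figure is a purely illustrative assumption, since actual collections vary widely by payer mix, contract, and year.

```python
# Back-of-envelope math for the numbers quoted above.
# NOTE: dollars_per_rvu is an illustrative assumption, not a figure from the post.
rvus_per_year = 25_000      # workload cited by the poster
salary = 300_000            # salary cited by the poster
average_rvus = 10_000       # ballpark average radiologist workload cited above
dollars_per_rvu = 45        # hypothetical professional collections per RVU

collections = rvus_per_year * dollars_per_rvu
print(f"Workload vs. average: {rvus_per_year / average_rvus:.1f}x")           # 2.5x
print(f"Approximate collections generated: ${collections:,.0f}")              # $1,125,000
print(f"Salary as a share of those collections: {salary / collections:.0%}")  # 27%
```

Even under these rough assumptions, the gap between what that workload plausibly generates and what the poster takes home is enormous.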

Their job is almost certainly not going to get better, and they should leave.

Lesson: Know the group dynamics and local market of any practice you consider joining.

Virtual Exams and Security Theatre

Many months into the pandemic, we’re all acquainted with the difference between a true public health measure and security theatre. Being outdoors instead of crowding inside? Meaningful intervention. Daily temperature screens? Theatre. We know that most people with the virus who are putting themselves around other humans will not be actively febrile but are still capable of spreading it. These measures are designed to make you feel better about engaging in an activity or to preserve the pretense of control in an ultimately uncontrollable scenario.

And so it is with remote testing.

For as long as there have been high-stakes exams, there has been high-stakes exam security. No student is a stranger to a live proctored exam, and we are all familiar with the commercial testing centers and their uncomfortable low-budget airport security facsimile. You would be forgiven for assuming that these measures were all to prevent cheating, and that is certainly part of the purpose, but individual dishonesty on a big exam ultimately isn’t the most pressing concern: it is the control of intellectual property. These exams cost money and time to create, and having the questions widely shared by some intrepid thief invalidates them and makes development more difficult and expensive.

Some organizations, like the National Board of Medical Examiners, increased testing capacity during the pandemic by expanding live proctoring to include selected medical schools. This made a lot of sense because medical schools give tests all the time and have the resources and space that can be easily utilized for exam administration.

Other organizations have looked to employ third-party online virtual proctoring solutions for exam security. An example of these services would be ProctorTrack, the company at the heart of the massive failure of the American Board of Surgery’s attempted virtual board examination this summer.

My board, the American Board of Radiology, has announced its intention to use a similar service, though they haven’t specified the details.

Third-party proctoring

I’m going to argue that expensive and invasive monitoring solutions like ProctorTrack sacrifice a lot in personal security and inconvenience for a modest benefit.

To proctor a high stakes exam, what you really need is a webcam turned on with both real-time and recorded video and audio of the examinee. You need to be able to watch their behavior as they take your exam, and you should be able to interact with them by audio if needed. This is enough to discourage and catch a casual cheater.

But what about the industrious premeditated antiestablishment cheater hell-bent on copying the test and then releasing it like WikiLeaks? Well, the solution these platforms have for that is a combination of electronic control and visual monitoring. That starts with control of your phone and control of your computer so that you can’t run non-sanctioned software and all of your actions are recorded. They usually also employ some sort of environment screening; “Roomscan” is what ProctorTrack calls its “AI-powered” version, supposedly able to detect security contraband.

I don’t know what nonsense data these companies use to train their algorithms, but let’s just be reasonable and agree that no 360-degree video sweep is going to see through semi-opaque objects, look under every surface, or find a hidden pocket sewn into the crotch of your pants.

So yes, if IP theft is a low-hanging fruit type of crime, then these measures raise the bar. But most people aren’t going to cheat, and anyone truly determined to likely still can. The downsides of security theatre are very real: personal insecurity and platform instability. The American Board of Surgery got to experience both as candidates began receiving Facebook friend requests from proctors and seeing unauthorized credit card charges almost as fast as exam administration was canceled.

Alternative solutions

I don’t mind if people agree to a third-party proctoring platform if it at least works, but I would argue this sort of invasiveness should be an option and not a requirement. A live and recorded video-proctored exam with software that limits the most egregious forms of screen recording, etc. (similar to that used by UWorld, for example) would be reasonable.

The ABR should also offer distributed in-person testing in a relatively safe, controlled environment such as a residency program, which would allow the ABR to rely on a combination of live proctoring by residency programs and/or ABR volunteers as well as its own remote first-party proctoring.

The Best Solution

The real solution, ultimately, is to have an exam that does not rely on gotcha questions to test raw medical knowledge. As long as the ABR focuses on multiple-choice questions coupled with an image or two, the exam will remain vulnerable to question theft and recalls.

Virtual proctoring or not, every medical MCQ exam has already created a robust industry of question bank products peddling glorified recalls. The solution won’t be found in the monitoring; it’s in the test development itself.

The lion’s share of a radiology certification exam should be the internet-enabled practice of radiology. Being able to accurately interpret real exams, as I’ve argued before, is by far the best testing format: high-fidelity simulation has not just the best face and construct validity but almost certainly the best content and predictive validity as well.

If it’s not a PACS, then it’s probably not a good test.

The Report of the ACR Task Force on Certification in Radiology

The report from the ACR Task Force on Certification in Radiology is out. This is the American College of Radiology’s formal take on how the American Board of Radiology is doing at exercising its monopoly on radiology certification.

It’s clear, concise, well-researched, and contains wonderfully diplomatic language. I admire the restraint (emphasis mine):

Fees have remained unchanged for initial certification since 2016 and MOC since 2015. We acknowledge there is a cost of doing business and reserves are necessary but increased transparency and cost effectiveness are encouraged.

This is in reference to finances like this. Such a gentle request.

Radiologists are also concerned that there is absence of scientific evidence of value.

An understatement certainly written like it was dictated by a radiologist.

We congratulate the ABR for modernizing its testing platform for MOC Part 3. The move to OLA is a responsive change from feedback. However, we are not aware of any theory or research that supports how the annual completion of 52 online multiple-choice questions (MCQ) demonstrates professional competence.

Ooph. Boom goes the dynamite.

MOC critique is tough.

On the one hand, OLA is better than a 10-year exam based on sheer convenience alone. It’s a trivial task, and therefore I know many radiologists don’t want to complain because they’re concerned that any changes would only make MOC more arduous or challenging (a valid concern). Organizations would much rather increase the failure rate to stave off criticism about a useless exam than actually get rid of a profit-generating useless exam (see USMLE Step 2 CS).

On the other hand, what a joke. There is literally no basis for assuming this token MCQ knowledge assessment reflects or predicts anything meaningful about someone’s ability to practice. Even just the face validity approaches zero. (Of course, this argument could also apply to taking 600+ multiple-choice questions all at once for initial certification).

Scores on standardized tests have been shown to correlate better with each other than with professional performance. Practical radiology is difficult to assess by MCQs, requiring a much greater skillset of inquiry and judgment.

This relates to the only consistent board certification research finding: standardized testing scores like Step 1 are the best predictors of Core Exam passage. People who do well on tests do well on tests. And while certainly smart hard-working people are likely to remain smart hard-working people, it remains to be seen if Step 1, in-service, or even Core Exam performance predicts actually being excellent at radiology versus being excellent at a multiple-choice radiology exam.

The obvious concern is that it’s the latter: that the ABR’s tests do not differentiate between competent and incompetent radiologists, and that we are largely selecting medical students based on their ability to play a really boring game rather than their ability to grow into radiologists.

Successful certification programs undertake early and independent research of assessment tools, prior to implementation. This is a vital step to ensure the accurate assessment of both learner competence and patient outcomes.

Subtle dig with the use of the word successful, but this is the crux:

Assessments are not bad. Good assessments are a critical component of the learning process. Bad assessments are bad because they provide incomplete, confusing, or misleading information, a problem compounded when a preoccupation with doing well on said bad assessment then distracts learners from more meaningful activities (look no further than Step 1 generated boards fever).

Medicine and radiology should not be limited by legacy methodology. Recognizing that learning and assessment are inseparable, the ABR has the opportunity to lead other radiology organizations, integrating emerging techniques such as peer-learning and simulation into residency programs. Assessment techniques are most effective when they create authentic simulations of learners’ actual jobs, although such techniques can be time-consuming and resource-intensive to develop.

Yes.

And I’ll say it again: diagnostic radiology is uniquely suited, within all of medicine, to incorporate simulation. Whether a case was performed in real life years ago or is fresh in the ER, a learner can approach it the same way.

Despite alternative certification boards, the market dominance of the ABMS and its member boards has been supported by a large infrastructure of organizations that influence radiologists’ practices. The ABR should welcome new entrants, perhaps by sponsoring products developed by other organizations to catalyze evolution, innovation and improvement to benefit patients.

Hard to imagine that alternate reality.

Although the ABR meets regularly with leadership from external organizations, such as APDR, the ABR could better connect with its candidates and diplomates by reserving some voting member positions on their boards for various constituencies.

As I discussed in my breakdown of the ABR Bylaws, there is a massive echo chamber effect due to the ABR’s promotion policy, which requires all voting board members to be voted in by the current leadership, usually from within the ranks of its hard-working uncompensated volunteers. This means that operationally, the ABR is completely led, at all levels of its organization, by people who believe in and support the status quo.

Meeting with stakeholders may act as a thermometer, helping them take the temperature of the room. The recent inclusion of Advisory Committees that give intermittent feedback and the perusal of social media commentary may give them the occasional idea. But all of this information is, by the ABR’s design, put into a well-worn framework.

The ABR is designed to resist change.

No one has a vote who wasn’t voted to have a vote by those who already vote.

And that’s a problem.

Imaging is the great equalizer

Imaging is the great equalizer. When we look deep into ourselves from the vantage of this fundamental level, with exterior barriers and labels removed, we might just see ourselves, other people, and our lives in a whole new light.

From Dr. Cullen Ruff’s Looking Within: Understanding Ourselves through Human Imaging, currently an Amazon Black Friday deal for a whopping $0.99 on Kindle.

When I see patients these days, it’s usually because I’m about to put a needle somewhere, but Ruff is old enough that he has decades of stories from an era where radiologists got (relatively speaking) a lot more patient facetime.

And yet, what a strange job we have, bypassing everything externally visible to study people’s insides.

WCICON 2021

The next Physician Wellness and Financial Literacy Conference (WCICON21) will be online from March 4-6, 2021. I’ll be there virtually to answer questions and give two talks, one about writing (worth CME) and one about student loans. It’s a great opportunity to use those CME funds that are feeling neglected during the pandemic. Registration is now open.

In related news, this week is the White Coat Investor’s “Continuing Financial Education Week,” which means that all courses, including Fire Your Financial Advisor, are 10% off and they’re throwing in the original WCICON Park City course for free. You can nail that deal through this link.