This is a work in progress, but I humbly submit a draft proposal for a new multimodality standardized radiology grading schema: Sigh-RADS.

Sigh-RADS 1: Unwarranted & unremarkable

Sigh-RADS 2: Irrelevant incidental finding to be buried deep in the body of the report

Sigh-RADS 3: Incidental finding requiring nonemergent outpatient follow-up (e.g. pancreatic cystic lesion)

Sigh-RADS 4: Off-target clinically significant management-changing finding by sheer chance.

Sigh-RADS 5: Even broken clocks are correct twice a day (e.g. PE actually present on a CTA of the pulmonary arteries).

Sigh-RADS 6: Known malignancy staged/restaged STAT from the ED


Update (h/t @eliebalesh):

Sigh-RADS 0: Inappropriate and/or technically non-diagnostic exam for the stated clinical indication.

Radiology Call Tips

It’s July, and that means a new generation starting radiology call. I’m not sure I’ve ever done a listicle or top ten, so here are fifteen.

The Images

  1. Look at the priors. For CTs of the spine, that may be CTs of the chest/abdomen/pelvis, PET scans, or *gasp* even radiographs.
  2. Look at all reformats available to you. On a CT head, for example, that means looking at the midline sagittal (especially the sella, clivus, and cerebellar tonsils) as well as clearing the vertex on the coronals.
  3. Become a master of manipulation. If your PACS can generate multiplanar reformats or MIPs, don’t just rely on what the tech sends as a dedicated series. Your goal is to make the findings, and you should be facile enough with the software to adjust the images and make efficient, confident decisions, such as angling the axials to deal with spinal curvature or tweaking images to line anatomy up when a patient is tilted in the scanner. MPRs are your tool to fight against confusing cases of volume averaging.


Your Reports

  1. Your reports are a reflection of you. I don’t know if your program has standard templates or whether those templates have pre-filled verbiage or just blank fields. There is nothing I’ve seen radiologists bicker about more than the “right” way to dictate. What is clear is that you should seriously try to avoid errors, including dictation/transcription errors as well as leftover inaccurate template verbiage. We are all fallible, and PowerScribe is a tool. Do whatever it takes to have reports as close to error-free as humanly possible.
  2. Seriously, please proofread that impression. Especially for mistakenly omitted words like “no.”
  3. Templates and macros are powerful, useful, and easily abused tools, just like dot phrases and copy-forward in Epic. I am all for using every tool you have, but you need to use them in a way that comports with your psychology and doesn’t make you cut corners or include inadvertently incorrect information.
  4. Dictate efficiently. If you are saying the same thing over and over again, it should be a macro. If you use PowerScribe, you can highlight that magical text and say “macro that” to create a new macro. (On a related note, “macro” is a shorter trigger word than “PowerScribe.”)
  5. More words ≠ more caring/thoughtful. As the quip often attributed to Mark Twain (originally Blaise Pascal) goes, “I didn’t have time to write a short letter, so I wrote a long one instead.” It’s easier to word-vomit than to dictate thoughtfully, and no one wants to read a long (or disorganized) report. Thorough is good, but verbose doesn’t mean thorough. It usually means unfiltered stream of consciousness. The more you write, the less they read.
  6. Never forget why you’re working. The purpose of the radiology report is to create the right frame of mind for the reader. Our job is to translate context/pretest probability (history/indication) and images (findings) into a summary that guides management (impression).
  7. Address the clinical question. This is especially true in the impression. If your template for CTAs was designed for stroke cases and says some variation of “No stenosis,” that impression would be inappropriate for a trauma case looking for vascular injury.
  8. Include a real history. Yes, there are cases where an autogenerated indication from the EMR is appropriate, but there are many more where that history is either insufficient or frankly misleading/untrue. You need to check the EMR on every case for the real history. Then, including a few words of that history is both the right thing to do and also very helpful for the attending who is overreading you.

Your Mindset

  1. Radiologists practice Bayesian statistics every day. This is to say: context matters. A subtle, questionable finding that would perfectly explain the clinical situation or that is more likely given the history should carry more psychological weight in your decision-making than one that would be completely irrelevant to the presentation. For example, a sorta-dense basilar artery is a very different finding in someone acutely locked-in than in somebody with a bad episode of a chronic headache.
  2. Work on your tired moves. We can’t all make Herculean calls at 4 am. When you’re exhausted and depleted, you rely on the skills you’ve overtrained to the point that they don’t require exceptional effort. For radiologists, this boils down to your search pattern. You need not just well-developed search patterns but also sets of knee-jerk associations and mental checklists of findings to confirm/exclude in different scenarios to prevent satisfaction of search (e.g., whenever you see mastoid opacification in a trauma case, look carefully for a temporal bone fracture).
  3. Everyone is a person. The patients, the clinicians, the technologists, and any other faceless person you talk to on the phone. It’s easy to feel distanced and disrespected sitting in your institution’s dungeon. But even you will feel better after a hard night’s work when you’re a good version of yourself and not just someone sighing loudly and picking fights with strangers.
  4. Music modulates the mood.
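The Bayesian point above can be made concrete with a toy calculation: the same subtle finding carries a very different post-test probability depending on the pretest probability set by the clinical context. A minimal sketch in Python, with entirely made-up sensitivity/specificity numbers purely for illustration:

```python
def posterior(pretest, sens, spec):
    """Bayes' rule: P(disease | positive finding)."""
    p_positive = sens * pretest + (1 - spec) * (1 - pretest)
    return sens * pretest / p_positive

# The same "sorta-dense basilar artery" (hypothetical test characteristics:
# sensitivity 0.7, specificity 0.9) in two different clinical contexts:
locked_in = posterior(pretest=0.5, sens=0.7, spec=0.9)          # high pretest
chronic_headache = posterior(pretest=0.01, sens=0.7, spec=0.9)  # low pretest
print(f"{locked_in:.2f} vs {chronic_headache:.2f}")  # → 0.88 vs 0.07
```

Same pixels, an order-of-magnitude difference in how seriously the finding should be taken.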

Should You Care About the ACR’s DXIT Exam?

Who & What

The Diagnostic Radiology In-Training (DXIT) is radiology’s in-service/in-training exam and it’s typically taken by R1, R2, and R3 residents midway through the year. Every(?) specialty has one.

It’s important to note that the ACR (the American College of Radiology) offers the DXIT exam, whereas the ABR (the American Board of Radiology) controls the longer multiple-choice nightmare that is initial board certification in the form of the Core and Certifying exams. These are different organizations that exist for different purposes that just happen to both provide multiple-choice tests for radiologists. The DXIT is one of many things the ACR does. Torturing you with their exams is literally all the ABR exists to do. The ACR is a democratic deliberative body. The ABR is not. They also don’t necessarily see eye to eye on the topic of certification.

Does the DXIT matter?

In the strictest sense, the DXIT simply does not matter. The ACR has no control over your future. The ACGME requires programs to perform a “summative assessment,” and the DXIT simply fulfills that need.

DXIT performance only matters if your program thinks it matters. There are no inherent consequences, good or bad, for your performance. Whether you get a high five, get put into a remediation plan, or just receive an email with your scores is up to your program.

It’s my anecdotal impression that most program directors don’t care all that much about the DXIT. If you score in the 95th percentile, you’ll probably get a pat on the back? I know my program used to give out an award every year to the resident with the highest score. (The only awards available to non-senior residents were for the highest in-service exam score and for research. Perhaps a very narrow set of behaviors to promote and recognize, but that’s just my take.)

So if you are wondering if it “matters” for you, the answer is probably not: Statistically, most people are not going to perform at the extremes, and even programs that do care are likely to care most about poor performance. All things being equal, programs would obviously prefer their residents perform better than average (and our residents certainly do by a large margin).

But if you’re somewhere in the middle of the pack (i.e. ~50th percentile), I suspect it won’t really matter for you in your program unless otherwise specified. Every group has an average. If your program thinks you need to be at the 70th percentile or some other arbitrary floor in order to not get dissed in your semiannual review, then they should tell you.

Interpreting performance

One of the key limitations in interpreting in-training performance is this: did you study? Presumably, one key factor influencing how much you study is how big of a deal your program makes about the in-service exam. Residents in those programs may therefore study more and outperform their national peers. Is this meaningful in the real world? Probably not? How does one compare the real-world performance of a resident who studies for two weeks and achieves the 70th percentile with one who doesn’t study and gets the 50th? I have no idea. High performance without dedicated review and low performance despite preparation are probably substantially more meaningful, but the test isn’t designed for any of that. It’s just a summative assessment.

However, if you are below average, and particularly if you fall in the bottom quintile, I would take the results more seriously. I want to be clear that such a score doesn’t mean you are a bad radiologist or a bad resident or that you do bad work. These are knowledge-based assessments that cover a broad and idiosyncratic swath of radiology concepts, and they are a poor proxy for real-world competence. However, standing between you-as-you-stand-now and you-the-independent-practitioner-of-radiology is a longer and more arduous exam that is the DXIT’s spiritual cousin. A low score is a wake-up call that you need to start or change the process you’re using to learn radiology: ramp up your radiology book learning earlier and/or begin seriously incorporating high-quality questions into your regular review (e.g., if you haven’t been using your program’s RadPrimer subscription, then start).

Why it really matters

So, I would argue the real reason the DXIT matters, if at all, is as a predictor of Core Exam passage and therefore a possible wake-up call that your personal status quo may not be sufficient.

The reality is that the failure rate for the Core Exam is around 10-15% and that everyone will study much, much more for the Core Exam than they do for the in-service exam. Therefore, the conclusion that people in the bottom quintile of the DXIT are at substantially higher risk of Core Exam failure is highly intuitive. The Core Exam isn’t curved, but it’s not a stretch to imagine that the people performing worst on one radiology multiple-choice exam might also be the group that performs worst on another radiology multiple-choice exam.

It’s obviously not perfect, but it’s probably the best thing we have access to at the moment.

There’s some data to back this up, and there are a few relatively recent papers that cover this:

This chart is from the first paper. Despite a relatively thick confidence interval, their conclusion was that “residents with scores below 20th percentile should be concerned.”

From the second paper:

In our study, higher percentile performance on the ACR ITE strongly predicted success on the Core Examination (P = .003322). [This was based on survey results in which the average percentile of the passing group (n = 251) was 58.0 and that of the failing group (n = 19) was 33.2.]

Residents should see the ACR ITE as a crucial step in their training and adequately prepare for it.

From the third paper:

DXIT and Core outcome data were available for 446 residents. The Core examination failure rate for the lowest quintile R1, R2, and R3 DXIT scores was 20.3%, 34.2%, and 38.0%, respectively. Core performance improved with higher R3 DXIT quintiles. Only 2 of 229 residents with R3 DXIT score ≥ 50th percentile failed the Core examination, with both failing residents having R2 DXIT scores in the lowest quintile.
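To put those quoted figures in perspective, a quick back-of-the-envelope calculation of the implied conditional failure rates:

```python
# Implied Core Exam failure rates from the third paper's quoted numbers.
lowest_quintile_r3 = 0.38   # failure rate, lowest-quintile R3 DXIT scores
above_median_r3 = 2 / 229   # failure rate, R3 DXIT >= 50th percentile

print(f"{lowest_quintile_r3:.1%} vs {above_median_r3:.1%}")  # → 38.0% vs 0.9%
print(f"Relative risk: ~{lowest_quintile_r3 / above_median_r3:.0f}x")
```

Roughly a fortyfold difference in failure risk between the bottom quintile and the top half.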

Interestingly, in both studies that tracked multiyear DXIT performance, the relationship between DXIT and Core performance was stronger for R1 and R3 residents and not statistically significant for R2 residents. Why might that be? If anything, one might assume that the R1 performance would be the least useful given that many R1 residents take the exam before completing many of the relevant rotations. (For example, I hadn’t had body CT, ultrasound, mammo, or peds, to name a few.) Perhaps the R1 results more reflect generic test-taking acumen, the R3 results partially capture early Core prep, and the R2 results are garbage because everyone is too tired from joining the call pool to bother preparing. Or perhaps the data are just messy garbage. Who knows?

Some possible takeaways

Fact: The statistical trends above are not destiny. They are information. What you choose to do with them is up to you.

If you feel like you studied for the DXIT but still performed in the lowest (or perhaps second lowest) quintile, then you need to look at your study methods. There’s a concept in educational literature that testing is learning. If you aren’t regularly testing yourself (e.g. question banks, flashcards), you are missing an opportunity for effective active learning as well as an important tool in the fight against the forgetting curve. Learning to practice radiology at the workstation and learning to master radiology testing are related but ultimately different tasks.

If you destroyed it, particularly as an R3, then your odds of failing the Core Exam probably approach zero (not that you are ready without studying per se (although maybe!?), but rather that with reasonable preparation you should be able to succeed by a clear margin, especially now that conditioning physics is no longer possible).


Unlike the ABR, the ACR is not so precious with its questions. Every year they release the test and its answers. See ‘previous exam sets’ on their In-Training Exam page.

For the flashcard fans among you, someone collected all those kindly released in-service exam questions from 2002-2021 into a big DXIT Anki deck, which, in addition to being an extremely efficient DXIT review, is also essentially a free, high-quality ~2500-question Core Exam qbank.

The ACR also made a slew of question-containing ebooks for Continuous Professional Improvement (CPI) free during the pandemic, and those are still available for download. The files are all ebooks (.epub), so you’ll need an e-reader app (e.g. Apple Books) to view them.

You should also at some point read the ACR Contrast Manual.

Take home

I wish we had better predictive tools/exams for radiology that trainees could use during dedicated prep time to estimate their likelihood of Core Exam passage, like the score estimation available for the USMLE exams. We don’t. For every person who fails the Core Exam, there are probably two who wildly overprepare.

Even though its interpretation is a little muddy, the DXIT is the best we have for now. It can give you a blurry snapshot of where you are versus where you likely want/need to be.

The Truth about Private Equity and Radiology with Dr. Kurt Schoppe

Have you ever talked to someone above you on the food chain—usually with the word manager, director, or vice president somewhere in their job title—and, after they departed, just stared blankly into the distance while slowly shaking your head and thinking, Wow, they really don’t get it. What a useless bag of skin?

Well, that’s the opposite of my friend Dr. Kurt Schoppe, a radiologist on the board of directors at (my friendly local competitor) Radiology Associates of North Texas and payment policy guru for the American College of Radiology where he works on that fun zero-sum game of CMS reimbursement as part of the RUC. He’s whip-smart and has a unique perspective: Before pursuing medicine, Dr. Schoppe was a private equity analyst.

Consider this transcribed interview a follow-up to my essay about private equity in medicine published a few months ago.

Here’s our (lightly edited) conversation:

Continue reading

Recommended Books for Radiology Residents

[This updated/revised article was originally published way back on December 21, 2013]

There are lots and lots of radiology books out there.

Rather than list oodles of options, I’ve made a short editorial selection for each section. There are obviously many good books, but your book fund is probably not infinite and you need to start somewhere.

First-year residents, in addition to Brant & Helms or Core Radiology, might start with these recommendations before buying any additional texts that they are unlikely to read at length during their first exposure to each section.

Continue reading