
Ben White


The 2019 ABR Core Exam Results, the Board Prep Arms Race, and Where It All Went Wrong

08.25.19 // Radiology

On August 15, the ABR released the 2019 Core Exam results, which included the highest failure rate since the exam’s inception in 2013: 15.9%.

(Side note: due to a “computer error,” the ABR decided to release the aggregate results before sharing individual results with trainees, resulting in entirely unnecessary extra anxiety. This itchy trigger finger release is in stark contrast to the Certifying Exam pass rates, which have never been released.)

 

Year | Percent Passed | Percent Failed | Percent Conditioned | Total Examinees
---- | -------------- | -------------- | ------------------- | ---------------
2016 | 91.1 | 8.5 | 0.4 | 1,150
2017 | 93.5 | 6.3 | 0.2 | 1,173
2018 | 86.2 | 13.0 | 0.8 | 1,189
2019 | 84.0 | 15.9 | 0.1 | 1,191

So what happened?

 

Option 1

One potential explanation is that current residents are less intelligent, less hard-working, or less prepared for the exam despite similar baseline board scores in medical school, similar training at their residency programs, and now very mature and continually improving board preparation materials. This would seem unlikely.

If it really does come down to resident “caliber” as reflected in minor variations in Step scores, then I would volunteer that we should be concerned that a minimally related test could be so predictive (i.e., what are we testing here? Radiology knowledge gained over years of training or just MCQ ability?).

Option 2

Another explanation is that—despite the magical Angoff method used to determine the difficulty/fairness of questions—the ABR simply isn’t very good at figuring out how hard their test is, and we should expect to see large swings in success rates year to year because different exams are simply easier or harder than others. This is feasible but does not speak well to the ABR’s ability to fairly and accurately test residents (i.e., their primary stated purpose). In terms of psychometrics, this would make the Core exam “unreliable.”

The ABR would certainly argue that the exam is criterion-based and that a swing of 10% is within the norms of expected performance. The simple way to address this would be to have the ABR’s psychometric data evaluated by an independent third party such as the ACR. Transparency is the best disinfectant.

Option 3

The third and most entertaining explanation is that current residents are essentially being sacrificed in petty opposition to Prometheus Lionheart. The test got too easy a couple years back and there needed to be a course correction.

 

The Core Prep Arms Race

With the widespread availability of continually evolving high-yield board prep material, the ABR may feel the need to update the exam in unpredictable ways year to year in order to stay ahead of “the man.”

(I’ve even heard secondhand stories about persons affiliated with the ABR in some capacity making intimations to that effect including admitting to feeling threatened by Lionheart’s materials/snarky approach and expressing a desire to “get him.” I wouldn’t reprint such things because they seem like really stupid things for someone to admit within public earshot, and I certainly cannot vouch for their veracity.)

If you’re happy with how your exam works, and then third parties create study materials that you feel devalue the exam, then your only option is to change (at least parts of) the exam. This may necessitate more unusual questions that do not make appearances in any of the several popular books or question banks. This is also not a good long-term plan.

This scenario was not just predictable but was the inevitable outcome of creating the Core exam to replace the oral boards. If the ABR thought people “cheating” on the oral boards by using recalls was bad, replacing that live performance with an MCQ test—the single most recallable and reproducible exam format ever created—was a true fool’s errand.

A useless high-stakes MCQ test based on a large and unspecified fraction of bullshit results in residents optimizing their learning for exam preparation. I see first-year residents using Crack the Core as a primary text, annotating it like a medical student annotates First Aid for the USMLE Step 1. Look no further than undergraduate medical education to see what happens when you make a challenging test that is critically important and cannot be safely passed without a large amount of dedicated studying: you devalue the actual thing you ostensibly want to promote.

In medical school, that means swathes of students ignoring their actual curricula in favor of self-directed board prep throughout the basic sciences and third-year students who would rather study for shelf exams than see patients. The ABR has said in the past that the Core Exam should require no dedicated studying outside of daily service learning. That is blatantly untrue, and an increasing failure rate only confirms how nonsensical that statement was and continues to be. Instead, the ABR is going to drive more residents into a board prep attitude that will detract from their actual learning. Time is finite; something always has to give.

If I were running a program that had recurrent Core Exam failures, I wouldn’t focus on improving teaching and service-learning, because at a system level those things are not only hard to do well but probably wouldn’t even help. The smart move would be to give struggling residents more time to study. And that is bad for radiology and bad for patients.

The underlying impression is that the ABR’s efforts to make the test feel fresh every year have forced them to abandon some of the classic Aunt Minnies and reasonable questions in favor of an increasing number of bullshit questions in either content or form in order to drive the increasing failure rates. Even if this is not actually true, those are the optics, and that’s what folks in the community are saying. It’s the ABR’s job to convince people otherwise, but they’ve shown little interest in doing so in the past.

There is no evidence that the examination has gotten more relevant to clinical practice or better at predicting clinical performance, because there has never been any data nor will there ever be any data regarding the validity of the exam to do that.

 

The Impossibility of True Exam Validity

The ABR may employ a person with the official title of “Psychometric Director” with an annual base salary of $132,151, but it’s crucial to realize the difference between psychometrics in terms of making a test reliable and reproducible (such that the same person will achieve a similar score on different days) and that score being meaningful or valid in demonstrating what it is you designed the test to do. The latter would be if passing the Core Exam meant that you were actually safe to practice diagnostic radiology and failing it meant you were unsafe. That isn’t going to happen. It is unlikely to happen with any multiple-choice test because real life is not a closed-book multiple-choice exam, but it’s compounded by the fact that the content choices just aren’t that great (no offense to the unpaid volunteers that do the actual work here). Case in point: there is a completely separate dedicated Cardiac imaging section, giving it the same weight as all of MSK or neuroradiology. Give me a break.

The irony here is that one common way to demonstrate supposed validity is to norm results with a comparison group. In this case, to determine question fairness and passing thresholds, you wouldn’t just convene a panel of subject matter experts (self-selected mostly-academic rads) and then ask them to estimate the fraction of minimally competent radiologists you’d expect to get the question right (the Angoff method). You’d norm the test against a cohort of practicing general radiologists.

Unfortunately, this wouldn’t work, because the test includes too much material that a general radiologist would never use. Radiologists in practice would probably be more likely to fail than residents. That’s why MOC is so much easier than initial certification. Unlike the Core exam, the statement that no studying is required for MOC is actually true. Now, why isn’t the Core Exam more like MOC? That’s a question only the ABR can answer.

I occasionally hear the counter-argument that the failure rate should go up because some radiologists are terrible at their jobs. I wouldn’t necessarily argue that last part, with the caveat that we are all human and there are weak practitioners of all ages. But this sort of callous offhand criticism only makes sense if an increasing failure rate means that the people who pass the exam are better radiologists, the people who fail the exam are worse radiologists, and those who initially fail and then pass demonstrate a measurable increase in their ability to independently practice radiology. It is likely that none of the three statements are true.

Without getting too far into the weeds discussing types of validity (e.g., content, construct, and criterion), a valid Core Exam should have content that aligns closely with the content of practicing radiology, should actually measure radiology practice ability and not just radiology “knowledge,” and should be predictive of job performance. 0 for 3, it would seem.

So, this exam is lame and apparently getting lamer with no hope in sight. And let’s not get started on the shameless exercise in redundant futility that is the Certifying Exam. So where did everything go wrong? Right from the start.

That’s the end of the rant. But let’s end with some thoughts for the future.

What the Core Exam SHOULD Be

To the ABR, feel free to use this obvious solution. It will be relatively expensive to produce, but luckily, you have the funds.

Diagnostic radiology is a specialty of image interpretation. While some content would be reasonable to continue in a single-best-answer multiple-choice format, the bulk of the test should be composed of simulated day-to-day practice. Unlike most medical fields, where it would be impossible to objectively see a resident perform in a standardized assortment of medical situations, the same portability of radiology that makes AIs so easy to train and cases so easy to share would be equally easy to use for resident testing.

Oral boards aren’t coming back. The testing software should be a PACS.

Questions would be cases, and the answers would be impressions. Instead of having a selection of radio buttons to click on, there would be free text boxes that narrow down to a list of diagnoses as you type (like when you order a lab or enter a diagnosis in the EMR); this component is what would make automated grading possible.
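A minimal sketch of how such a type-ahead answer box might work, with the free text narrowing against a fixed diagnosis list so grading can be automated. The diagnosis list, matching rule, and grading logic here are purely illustrative assumptions, not anything the ABR has specified:

```python
# Hypothetical diagnosis dictionary an exam platform might use.
DIAGNOSES = [
    "acute appendicitis",
    "acute diverticulitis",
    "colon cancer",
    "epidural hematoma",
    "subdural hematoma",
]

def suggest(prefix, diagnoses=DIAGNOSES):
    """Narrow the list as the examinee types (case-insensitive substring)."""
    needle = prefix.lower()
    return [d for d in diagnoses if needle in d]

def grade(entered, accepted):
    """Automated grading: credit if any entered diagnosis matches the key."""
    return bool(set(entered) & set(accepted))

print(suggest("hematoma"))  # ['epidural hematoma', 'subdural hematoma']
print(grade(["acute diverticulitis", "colon cancer"], {"colon cancer"}))  # True
```

Because every answer resolves to an entry from a controlled vocabulary rather than arbitrary prose, scoring (including the two- or three-diagnosis differential questions mentioned below) reduces to simple set comparisons.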

The exam could be anchored in everyday practice. It would present cases centered on the common and/or high-stakes pathology that we expect every radiologist to safely and consistently diagnose. We could even have differential questions by having the examinee enter two or three diagnoses for cases where such considerations are important (e.g., some cases of diverticulitis vs colon cancer). These real-life PACS-based cases could be tied into second-order questions about management, communication, image quality, and even radiation dose. But it should all center around how radiologists actually view real studies. It could all be a true real-world simulation that is a direct assessment of relevant practice ability and not a proxy for other potentially related measurables. Let’s just have the examinees practice radiology and see how they do.

The ABR has argued in the past that the Core exam cannot be ported to a commercial center, which is largely the fault of the ABR for producing a terrible test. But at least that argument would finally hold water if the ABR actually deployed a truly unique evaluative experience that could actually demonstrate a trainee’s ability. The current paradigm is silly and outdated, and radiology is uniquely positioned within all of medicine to do better. The exam of the future should not be rooted in the largely failed techniques of the past.

 

Core Exam Predictors

05.09.19 // Radiology

For those radiology residents agonizing over the coming Core Exam in June, there is a new article in JACR that discusses Core Exam predictors for success.

One takeaway? It doesn’t matter what you use to study.

Image-guided Epidural Blood Patches

04.17.19 // Radiology

If you’ve ever wanted to learn more about fluoroscopic-guided epidural blood patches, someone just had an article about that published in Applied Radiology.

Class Action Lawsuit Against the ABR

03.05.19 // Radiology

Radiology joined the ranks of physician-led class action lawsuits against the ABMS member boards last week when interventional radiologist Sadhish K. Siva filed a complaint on behalf of radiologists against the ABR for (and I’m paraphrasing) running an illegal anticompetitive monopoly and generally being terrible.

You can read the full 30-page suit if you’re interested. Legal writing is generally not of the page-turning variety, but there are still some great lines.

Regarding MOC (emphasis mine):

[The] ABR admits that no studying will be necessary for [the new MOC program] OLA and that ABR “doesn’t anticipate” incorrect answers “will happen often.” ABR also confirms on its website that “[t]he goal with all OLA content is that diplomates won’t have to study.” When a question is answered incorrectly, an explanation of the correct answer is provided so that when a similar question is asked in the future it can be answered correctly. Unsurprisingly, ABR admits it does “not anticipate a high failure rate.”

In short, to maintain ABR certification under OLA, a radiologist need only spend as little as 52 minutes per year (one minute for each of 52 questions) answering questions designed so as not to require studying, and for which ABR anticipates neither incorrect answers nor a high failure rate.

Because OLA has been designed so that all or most radiologists will pass, it validates nothing more than ABR’s ability to force radiologists to purchase MOC and continue assessing MOC fees.

Burn!

Though not called out in the lawsuit, this argument also applies to the Certifying Exam (a second, superfluous exam taken after the Core Exam, after graduating residency, and after already practicing independently as a radiologist). This may be in part because the angriest radiologists are the ones who paid for and then passed what should have been a 10-year recertification exam only to be told they had to start shelling out and doing questions right after. But the main reason is likely that the suit primarily asserts that the monopolistic behavior at play includes the ABR illegally tying mandatory MOC to its “initial certification product,” and the Certifying Exam—though suspect—is part of the initial certification process.

Interesting fact that I did not know about MOC & the insurance market:

In addition, patients whose doctors have been denied coverage by BCBS because they have not complied with MOC requirements, are typically required to pay a higher “out of network” coinsurance rate (for example, 10% in network versus 30% out of network) to their financial detriment.

It’s amazing how these organizations, which are completely unaccountable, have become such integral parts of so many different components of the healthcare machine from hospital credentialing to insurance coverage.

Speaking of that power:

The American Medical Association (“AMA”) has adopted “AMA Policy H-275.924, Principles on Maintenance of Certification (MOC),” which states, among other things, that “MOC should be based on evidence,” “should not be a mandated requirement for licensure, credentialing, reimbursement, network participation or employment,” “should be relevant to clinical practice,” “not present barriers to patient care,” and “should include cost effectiveness with full financial transparency, respect for physician’s time and their patient care commitments, alignment of MOC requirements with other regulator and payer requirements, and adherence to an evidence basis for both MOC content and processes.” ABR’s MOC fails in all of these respects.

And lastly:

[The] ABR is not a “self”-regulatory body in any meaningful sense for, among other reasons, its complete lack of accountability. Unlike the medical boards of the individual States, for example, as alleged above, ABR is a revenue-driven entity beholden to its own financial interests and those of its officers, governors, trustees, management, and key employees. ABR itself is not subject to legislative, regulatory, administrative, or other oversight by any other person, entity, or organization. It answers to no one, much less to the radiologist community which it brazenly claims to self-regulate.

Final burn!

Whether or not the suit will convince a jury that an illegal monopoly is at play, I don’t know. I can take a pretty confident educated guess as to what radiologists are rooting for. It’s pretty clear that while MOC can engender controversy, the ABR’s efforts can’t meaningfully impact the quality of radiology practiced by its diplomates or have a significant effect on patient care.

 

Stop Free-Dictating

02.07.19 // Radiology

There are many institutions/practices with well-defined “normal” templates for all types of studies, which help provide a reasonable approximation of a house style. A clinician (or the next radiologist) has a reasonable chance of knowing where to find the information in the report. The reader can see something in the impression and quickly find the longer description in the body of the report for more information.

Templates can be brief skeletal outlines or include more thorough components containing pertinent negative verbiage. A section for the Kidneys could say “Normal” or it could say, “No parenchymal lesions. No calculi. No hydronephrosis.” Some groups have diagnosis-specific templates that build off a generic foundation to better address specific concerns like renal mass characterization or appendicitis.

Either way, some form of templating is a helpful forcing function for creating a readable report. After all, radiology, for better or worse, is a field where the report is the primary product, and creating reports that are concise, organized, and readable should be a goal.

Some institutions and practices do not have these baseline templates. There are (often but not always older) attendings who seem to not only practice but respect the freewheeling old school transcriptionist style of reporting. A resident who doesn’t “need” a template is to be prized and congratulated.

This isn’t 100% wrong either. It’s a useful ability in the sense that it’s important to be able to summarize findings in cohesive English. It’s largely the same casemanship skill used during the hot-seat conferences that the recent Core exam generation of residents has largely lost, so I can appreciate this perspective. However, at least from a reporting perspective, free-dictating is suboptimal in the 21st century.

 

The purpose of the radiology report

The first attending I ever worked with in radiology was a neuroradiologist who posed a semi-rhetorical question on my first day. He used to ask:

What is the purpose of the radiology report?

The answer, he argued, was to create the right frame of mind in the reader.

I think this view is exactly right.

Defined in a narrow sense, this means that the reader should come away with the impression that you intend for them to have. If something is bad and scary, that should be clear. If something is of no consequence, that should also be clear. Items in the impression are there because we want those impressed on the minds of our readers, not just because we saw them.

With increasing patient access to radiology reports, we now have a second audience. While doing away with all medical and radiological jargon is probably misguided and unnecessary, we need to at least be cognizant of how our reports might read to a layperson (or non-specialist, for that matter). If we can be more clear and more direct, we have a greater chance of communicating effectively to all involved parties.

Templates make reports more organized and scannable. Not even debatable.

But while the primary intent of “frame of mind”-creation may relate to the significant radiological findings, it’s also about creating the right frame of mind about you, the radiologist. Thorough, thoughtful, organized, conscientious? Or rushed, disorganized, careless, apathetic?

There may be some perks to blinding readers with science and drowning them in long-winded descriptions of even benign and irrelevant incidental findings. At least you won’t look lazy! But for the less verbose among us, we can show we care by creating reports that reflect a systematic approach and a clear writing style. Templating creates digestible reports.

Lastly, as quality metrics rise in importance and resource utilization re-enters the arena as a responsibility of the radiologist, we also need our reports to be readable and indexable by computers. The easier our reports are to parse, the easier we can extract meaningful data about our findings, link these up with patient data from the EMR, and draw high-powered conclusions about patient impact, outcomes, and (of special importance to me) the utility of certain exams in specific clinical contexts.
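To illustrate why templated reports are so much easier for computers to index: with consistent organ-by-organ headers, a few lines of code can turn a report into structured data. The sample report and header convention below are hypothetical, not any particular institution’s template:

```python
import re

# A toy report following an organ-templated house style (illustrative only).
REPORT = """FINDINGS:
Kidneys: No parenchymal lesions. No calculi. No hydronephrosis.
Liver: Normal.
IMPRESSION:
1. No acute abnormality."""

def parse_sections(report):
    """Index 'Header: text' lines of a templated report into a dict."""
    sections = {}
    for line in report.splitlines():
        m = re.match(r"^(\w+):\s*(.+)$", line)
        if m:
            sections[m.group(1)] = m.group(2)
    return sections

parsed = parse_sections(REPORT)
print(parsed["Kidneys"])  # No parenchymal lesions. No calculi. No hydronephrosis.
```

A free-dictated narrative paragraph offers no such handles; against a template, even this crude regex can pull organ-level findings for linking to EMR data downstream.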

 

Dictation software is a tool, not a recorder

If you’re a resident somewhere and your institution doesn’t have power normals to frame out your reports, make some. If you find yourself saying the exact same things over and over again every single day, then you’re doing it wrong. It should either be part of the template or an auto-text macro (tip: in PowerScribe, highlight the text you want to save and say “macro that”). If nothing else, it will reduce your rate of transcription errors.

No one needs to reinvent the wheel on every case!

The ABR’s new Online Longitudinal Assessment (OLA)

02.02.19 // Radiology

It was super duper gratifying to receive my first OLA email from the ABR this past month. OLA (Online Longitudinal Assessment) is the ABR’s new longitudinal MOC (Maintenance of Certification) process, where diplomates take 52 questions every year instead of a big test every decade.

I took the Certifying Exam in October and received my passing result in November, so the month-long break prior to needing to “maintain” my brand new certification from the ABR feels just about right. Yes, a thousand folks need to maintain a piece of paper they haven’t actually received in the mail yet. I can appreciate why folks fresh off their q10-year MOC victory are irritated at needing to immediately participate in more MOC. Promises are being broken left and right. But, hey, money.

Adding insult to injury, as a neuroradiologist, I still have to sit for the exorbitantly expensive ($3,270) neuroradiology subspecialty exam this October. Which means that I need to maintain my first certification in between getting my second.

The final irritant in this system of paying $340/year (forever) is that the ABR, which is a nonprofit sitting on a war chest of ~$48 million, didn’t apply for (i.e. pay for) ACCME accreditation, so the hours spent doing OLA questions don’t count as official CME. (Update Feb 2020: Now they do, reducing your SA-CME burden from 25 to 15 hours over the 3-year period for MOC attestation)

 

The Actual OLA Experience

The current OLA paradigm is that 2 questions are released every week (104 a year) and “expire” after 28 days. So while you can log in and batch around 8 questions a month, logging in less often means letting some questions expire. But since you only need to answer 52 of the 104, you can afford to let nearly half of them lapse and get away with logging in almost bimonthly.
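A back-of-the-envelope simulation of that schedule, under the simplifying assumptions that questions release weekly in pairs, live for exactly four weeks, and every login clears whatever hasn’t yet expired:

```python
def questions_answered(login_interval_weeks, weeks=52, per_week=2, expiry_weeks=4):
    """Count questions answered in a year if you log in every k weeks
    and answer everything still available (simplified model)."""
    answered = 0
    for _login in range(login_interval_weeks, weeks + 1, login_interval_weeks):
        # Only questions released within the expiry window are still live.
        live_weeks = min(expiry_weeks, login_interval_weeks)
        answered += live_weeks * per_week
    return answered

print(questions_answered(4))  # 104: monthly logins catch everything
print(questions_answered(8))  # 48: strictly bimonthly falls just short of 52
print(questions_answered(7))  # 56: every 7 weeks ("almost bimonthly") clears it
```

In this toy model, a strict 8-week cadence nets only 48 questions, while stretching monthly logins to every 7 weeks still clears the 52-question requirement, which is consistent with “almost bimonthly.”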

I took my first 8 questions this week and got them all right. They were straightforward, reasonable, and relevant to practice (at least in neuroradiology). My initial impression is that OLA questions are more like what the Core exam should be. You get between 1 and 3 minutes per question, the website is pretty slick (at least on a desktop), and I did all 8 in around 5 minutes. Can’t complain there. This is clearly a better system and a more logical way to fulfill the spirit of MOC than taking an exam full of (even more) irrelevant material every decade.

You get to choose your practice profile and thus what types of questions you receive. I originally chose general diagnostic radiology and neuroradiology, but out of my first 8 questions, 7 were neuro and only 1 ended up being general, and the general question concerned GI fluoroscopy, which I detest, so I switched to 100% neuro. Maybe it’ll help with the subspecialty exam.

 

Things the ABR should improve:

  • Mobile experience. I’ve heard complaints about display issues on phones. You only get a minute for most questions, so it needs to work.
  • Lower the price. At the current rates, this is far more expensive than any commercial qbank. And that’s what this is. The ABR makes a lot of profit for a non-profit.
  • Increase question shelf-life. Why do questions expire after 28 days? So arbitrary. Let the radiologists hold themselves accountable. How about 90?
  • Get official CME accreditation. This feels like apathy and laziness. I know it’s not straightforward or cheap to be a CME-granting organization with the ACCME, but OLA is already an expensive process, and it would be far more reasonable if it counted for CME. (Update Feb 2020: They don’t give you hours per se, but they do reduce the obligation for SA-CME for MOC; you’ll still have to satisfy your state requirements)

And finally, how about you let everyone take the certifying and subspecialty exams using the OLA software instead of flying out to Chicago to waste their time?

Pitfalls of Private Equity Takeovers

01.28.19 // Medicine, Radiology

You may have heard about this absurd story in the NYTimes a few months ago: An academic journal pulled a legitimate article comparing practice characteristics of groups that take on private-equity funding and those that do not. Why? Because a PE firm put the squeeze on their editor, that’s why:

In an interview, Dr. Hruza [the incoming president of the American Academy of Dermatology and board-member of United Skin Specialists, the largest PE-backed derm practice in the country] said he did not ask that the paper be taken down. He did, however, confirm that he expressed his concerns to Dr. Elston, the editor, after it was posted. Two days later, Dr. Elston removed the paper.

From the reporting in the Times, this situation is absurd. If people have quibbles with the conclusions of a peer-reviewed article, then they should write a commentary. You don’t get to line-edit someone else’s manuscript.

Dermatologists account for one percent of physicians in the United States, but 15 percent of recent private equity acquisitions of medical practices have involved dermatology practices. Other specialties that have attracted private equity investment include orthopedics, radiology, cardiology, urgent care, anesthesiology and ophthalmology.

PE firms are following the money. However, their primary objective of extracting profit doesn’t necessarily equate with an understanding of how to actually run a successful, responsible, and sustainable medical practice.

Dr. Konda, [the paper’s lead author], said he first grew interested in the topic when several of his trainees went to work for private equity-backed practices and told him of clinical environments that emphasized profits at the expense of patient care.

 

With that preamble, check out this interview with radiologist and former PE analyst, Kurt Schoppe, MD on Radiology’s Nearest Threat, Commoditization, and the Misguided Notion That You Will Be Paid for Everything You Do.

 

Lots of excellent responses, but these three quotes give you a nice flavor of private-equity takeovers in broad strokes:

One of their favorite marketing lines is “physician-owned or physician-operated.” That’s really a misdirection because, frequently, they set up a holding company under which the physician group is a wholly owned subsidiary. Yes, the physician group is owned and operated by physicians, but it is not controlled by physicians because, as a wholly owned subsidiary, the parent corporation, or the holding company, is going to have absolute control. That holding company is not majority-owned by the physicians. The wording on the contracts is going to be such that the PE firm or the corporate entity is going to have control over the parent entity when it needs it.

…

What I’m getting at is no matter what the marketing says, no matter what they are telling people when they are selling services, these entities must make money for their owners/investor as their primary objective. Changing the economics of radiology group ownership is not fundamentally about the patients or saving money for the payers. They do these things to make money for their investors. This is not a negative judgement, it’s just a fact. If physicians want to sell their practice, if someone is only 4 or 5 years from retirement, and they only have a 4- or 5-year hold on their contract after they sell their group, well, that is just logical. From a purely personal economic point of view, it makes sense for them to sell, because they are not looking at a 15- to 20-year timeline.

…

The people who need to look out for this are the people in training, the people coming out of training, and the younger physicians in the group who have a 15-, 20-, 30-year timeline. If your goal when you came out of medical school was caring for patients, positively affecting the health care environment, or doing things for the greater good, I think you are better able to do that as a physician group in which you decide, as a group, how much money you need to make, what sacrifices you choose to make, and for whom you will charge less. If you cede control of your decision-making to a group that will only be motivated by its ability to make returns for its investors, you’ve put someone else in that conversation who does not necessarily share your values and ethics as a physician.

Anyone joining a hotbed field like dermatology or radiology needs to understand the business model of their chosen profession and evaluate the health of both the practice and the local market they’re considering joining.

While partners may get short-term windfalls in some buyout scenarios, non-partner employees are the primary profit source. Spending time on a partnership track without eventually making partner is a waste if the position becomes untenable and you need to start fresh somewhere else.

Review: Proscan’s MRI Online

01.09.19 // Radiology, Reviews

MRI Online (now Medality) is an advanced (MRI-focused) online radiology video platform offered by Dr. Stephen J Pomeranz, who is primarily a musculoskeletal radiologist. Just one dude. This is in contrast to most online offerings in radiology, which are typically recorded board reviews or CME lectures from the big popular courses at places like Stanford, Hopkins, Duke, etc.: multiple folks talking about multiple topics. Those production values tend to be relatively low because they’re typically recorded from normal in-person talks with the best of intentions (but without the best of audio engineering).

I was recently offered the chance to check out MRI Online. I had intended to spend time with it to help study for the certifying exam, but then I ended up not studying. That’s a separate story.

Anyway.

Content

There are several different kinds of content: “Mastery series” lectures, divided into digestible 5-10 minute chunks; “Lecture series,” more typical hour-long lectures (some of these are a bit older); “Courses on Demand,” recordings of in-person case reviews (my least favorite); and lastly, “Power Packs,” interactive PACS-integrated cases with questions and explanations (but no video).

Platform

MRI Online uses the Teachable platform, which is basically what every new course you’ve seen advertised on Facebook uses. Teachable is simple to use, especially well-suited for video courses, and produces a clean product, so it’s no mystery why.

There are pre- and post-tests available, but these tend to be short little multiple-choice deals (often text-only). Nothing special there. This is definitely not aiming to be a q-bank.

More importantly, Teachable videos have the ability to be sped up, so you can pick your pace accordingly.

What separates MRI Online from just about every other product out there is that the case review components are integrated with an online PACS. You can review the cases (scroll through stacks, multiple sequences, window/level, etc.), read them cold, and then essentially go through them with Pomeranz or with a written explanation. It’s interactive. It’s practical. It’s reflective of real practice. It’s basically like being a resident or fellow, except that you’re on your own pace, the cases are carefully curated, and your teacher isn’t too busy to teach. It’s pretty neat.

Pricing

Pricing is a bit of a mixed bag.

The in-training price is actually pretty reasonable ($50/month or $500/year). In particular, if you have plans to do an MSK mini- or real fellowship, going through MRI Online would be a great introduction and much less painful than Requisites. For cost reasons, I think any trainee is probably going to buy on a month-by-month basis when they have time rather than fork out for the year.

(Talk about responsive: the price for fellows used to be $100/month. When I pointed out that fellows don’t really make significantly more than residents, they dropped the price a week later.)

While there’s also a lot of content for neuro (and some prostate), I think most people probably wouldn’t need to buy more than a month if their focus is non-MSK. Proscan tells me they’re adding tons more non-MSK content this year, so I imagine that’s likely to change.

The price for folks out in practice gave me a bit more sticker shock at first: $150/month or $1500/year. That said, you do need CME, lots of practices do provide CME funds, and course reviews and conferences are generally even more expensive and not amenable to pajamas. MRI Online provides real ACCME CME credits, which for the price are actually a bargain depending on how hard you pound your subscription.

I wouldn’t pretend to have the ability to compare and contrast any of the huge number of course reviews that exist in radiology, but MRI Online is definitely better than a lot of conference talks I’ve gone to at RSNA, ASNR, WNRS, ABCD, and WXYZ.

Here’s where the usual negotiated discount/affiliate stuff comes in:

Code BEN10 gets you 10% off.

The annual subscription also includes a free MRI anatomy atlas as well as free attendance at a 3-day MSK MRI course held annually in Cincinnati. They tell me the vast majority of subscribers are annual, not monthly.

Free Samples

There’s a free online MSK mini-course with a sample of cases (that you would need to sign up to take).

There are also sample videos for each course (e.g. shoulder, hip) that you can watch without logging in, as well as sample cases for basically every course. You’ll get a history, review the cases in the diagnostic viewer, then answer a multiple-choice question about them. The explanations have annotated lesions and a relatively concise readable description.

They also provide a full free 7-day trial, which is a great option for trainees or for focused test prep.

Bottom line is that there are plenty of no-risk opportunities to check it out. There’s lots of totally free content and no bait-and-switch in sight. I wish more companies were this transparent.

Conclusion

MRI Online is actually an impressive and pretty expansive product, particularly for MSK, but also with hours of content for neuro and body. Beyond being a solid review, it’s something I’d definitely consider signing up for again if I changed practices and needed to expand my toolset.

Journey to the ABR Certifying Exam

11.28.18 // Radiology

If there is little information online about the ABR Core Exam, there is essentially none about the Certifying Exam. After several years, the only nuggets on the grapevine were that it was easy, nobody has ever failed, and you might as well do all your selected modules in the field of your fellowship.

All of that is probably true. But just as diagnostic imaging for pulmonary embolism in the ER is always indicated, more information is always better, right?

Q&A: Pros/Cons of Choosing Radiology

11.07.18 // Radiology

Answers to some frequently asked questions about being a radiologist:


How bad is the grind?

Depends.

Is there a race to the bottom?

Yes.

Do procedures add or detract from the grind?

Depends.

Do you begin to feel comfortable with radiology material during residency?

Yes.

How much studying do you need to do? Does that need follow you home every day?

Depends.

How exhausting is the work?

Mentally, quite. Physically, depends on your posture.

How easy is it to have a life outside of radiology/medicine?

Easy.


Hope that clears things up!
