ABR OLA MOC: The First-Year Experience

2019 was the initial offering of the ABR’s MOC of the future: Online Longitudinal Assessment (OLA). I wrote about it earlier this year, but to recap: all Diagnostic Radiology ABR diplomates, including those fresh off their Certifying Exam victory lap, were immediately thrust into the new paradigm. This amounts to answering a whopping 52 multiple-choice questions over the course of the calendar year in whatever subspecialty composition you prefer. Questions are released two per week and expire after a month.

It’s…fine? Sorta I guess?

The website works (mostly), and the questions are questions (undeniable). Some are pretty good, some certainly less so. People on the internet grumble about content relevance more than I personally would, but then again the minute I got a lame low-yield Core-style GI fluoro question I switched to 100% neuroradiology.

The ABR hasn’t released the passing thresholds yet, which is the most interesting facet of the whole ordeal: recall that the Core and Certifying Exams are “criterion-referenced” by magical Angoff committees that can infallibly determine what a “minimally competent” radiologist can do. The ABR just doesn’t seem to have that same confidence when it comes to MOC, presumably because they have no idea how many people would fail if they logically employed that exact same Angoff method, and failing an unknown number of already-dissenting practicing radiologists is a much bigger deal than embarrassing some more trainees.

Now before you say that each diplomate needs to answer 200+ questions to hit the psychometric validity threshold, note that the ABR could still tell people if they were on track to pass or fail based on their current performance. There are apparently plans to release preliminary feedback soon (which may do just that now that there is some real-world data to calibrate with), but all of us will need to do another few years of OLA to learn if we’re truly maintaining the magic.

In case you were wondering, I did get one question wrong (the software buries additional images in tabs you have to click through; I kept forgetting, though it only burned me the one time).


It’s no secret why the ABR chose to release two questions per week that subsequently expire a month later. I finished my required questions in August, less than a year from when I took (and presumably destroyed) the Certifying Exam (but we’ll never know, because they don’t release scores for that exam).

What I can tell you is that I spent approximately one hour satisfying the OLA requirements for the year. Without the forced drip-feeding, I could’ve accomplished the entire process during a single generous lunch break.

Some of you reading may be thinking, hey, that’s not so bad. And you’re right, the process is relatively painless. I didn’t learn anything, but at least it didn’t take a lot of my time.

Ultimately, that’s also what makes MOC a meaningless box-checking endeavor and blatant money grab.

The argument that something isn’t stupid, bad, useless, or wrongheaded as long as it doesn’t suck is spurious. Just because it could be worse doesn’t mean that it shouldn’t be better.

And the fact that many doctors are scared that these unelected unaccountable pseudo-governing organizations will punish any dissent by making tests harder and MOC more arduous is toxic and should not be accepted. We shouldn’t treat the relative ease of a profit-seeking exercise as a thrown bone from the shadow lords that can be taken away at any time or a secret to keep quiet so the “public” doesn’t find out.

The Anti-MOC Wave

I am actually not really part of the large and growing cohort of physicians adamantly opposed to any third-party validation of demonstrable skill or the mere idea of a certification-granting organization that can reliably establish minimal competence. In fact, if board certification weren’t a de facto requirement in many contexts (and thus akin to licensure itself), I wouldn’t even mind if the supposed threshold were greater than minimal competence.

The ABMS was founded in 1933. The ABR was founded in 1934. We’re still waiting on evidence that anything these people do means anything. If the intellectual underpinnings of initial certification are up for debate, then the “maintenance” of said certification is doubly so (hence the lawsuit).

The new system may be no worse than the 10-year exam it replaced; it would seem to me that it’s likely significantly less hassle. Less studying, less travel, less time, less effort, and more relevant (in that you can exclude broad categories of radiology irrelevant to your practice). However, cumbersomeness (or lack thereof) is not a component of psychometric validity.

A lack of rigor may serve as a salve for diplomates injured by losing out on years of rightfully-earned respite after a recently passed 10-year exam, but it doesn’t change the fact that gradually adding strata of multiple-choice questions on a foundation of more multiple-choice questions creates a weak structure that teeters in the winds of change.

A Deep Dive into the Tax Returns of the American Board of Radiology

With the class-action antitrust suit filed against the ABR earlier this year, a post looking deeper at the finances that make an appearance in the lawsuit is overdue. You can find the recent filings that I used for this post collected here.

I promise this is a more interesting read than one might think.


The ABR is a 501(c)(6) organization.

Readers may be familiar with the more common 501(c)(3) designation, which is the non-profit status used by religious, charitable, scientific, and educational organizations (and is the type generally required to qualify for loan forgiveness within the Public Service Loan Forgiveness (PSLF) program).

A 501(c)(6) organization is a business league or association organized and operated primarily to “promote the common business interests of its members.” I’m not really sure how the ABR qualifies as that, but it’s a self-reported designation and that’s their purpose as far as the IRS is concerned. (burn!)

Regardless, as a tax-exempt non-profit, the ABR must make public their Form 990 annual returns for the past three years. The most recent returns (2017 tax year, filed in 2018) are also available on several sites including ProPublica and GuideStar, both of which maintain a searchable database of all non-profit tax returns.

But before we go through the returns and try to make sense of the ABR’s finances, a disclaimer: I am a radiologist with a hobbyist understanding of the tax code, not a CPA, tax preparer, or financial anything (let alone a forensic accountant). This is all for entertainment purposes only.

Disclaimer #2: Form 990 is light on details. I emailed the ABR for clarifications about several issues. Unsurprisingly, they ignored me.

Revenue Breakdown

Tax-exempt non-profits can, in fact, have taxable income if the income derives from activities separate from their mission. In 2016, the ABR claimed $45,605 in taxable income on line 7. In Part VIII, this was described as real estate rental income. I don’t know what they’re renting or to whom. In 2017, it was down to around $30k.

Total tax-exempt revenue is mostly from “certification fees.” Over the past five years, total revenue (which includes investment income):

2017: $17,430,259 (up $1,138,815, 6.9%)
2016: $16,291,444 (up $530,424, 3.4%)
2015: $15,761,020 (up $585,430, 3.9%)
2014: $15,175,589 (up $1,635,419, 12.1%)
2013: $13,540,170

For reference, the inflation rates over this period according to the US Labor Department were 0.8% in 2014, 0.7% in 2015, 2.1% in 2016, and 2.1% in 2017. So the ABR’s reported revenue growth has consistently outpaced inflation.
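For anyone who wants to check the math, here’s a quick sketch (using the revenue figures and US Labor Department inflation rates quoted above) of the nominal and inflation-adjusted growth:

```python
# Revenue figures from the ABR's Form 990s, as listed above.
revenue = {2013: 13_540_170, 2014: 15_175_589, 2015: 15_761_020,
           2016: 16_291_444, 2017: 17_430_259}
# US Labor Department inflation rates cited in the text.
inflation = {2014: 0.008, 2015: 0.007, 2016: 0.021, 2017: 0.021}

for year in sorted(inflation):
    nominal = revenue[year] / revenue[year - 1] - 1
    real = (1 + nominal) / (1 + inflation[year]) - 1
    print(f"{year}: nominal {nominal:+.1%}, inflation-adjusted {real:+.1%}")
```

Even in the slowest year, real growth stayed positive; in 2014 it was north of 11%.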

Though not specified as such in the 990, the substantial year over year increase is primarily related to increasing MOC enrollment.

In 2017’s Part III, the ABR says that it administered 4,790 total exams and that approximately 27,000 diplomates were enrolled in MOC. As a reminder, MOC costs $340/year, so the revenue from MOC was approximately $9.2 million in 2017. Note that because older radiologists were grandfathered in with “lifetime” certifications whereas all new diplomates immediately enter MOC, this number will grow annually until a steady state is reached, presumably sometime in the next 10-20 years. I’m sure the ABR has a better idea of when the gravy train will hit its coasting speed.

To wit, the number of MOC-enrolled physicians was 26,140 in 2016, 25,000 in 2015, and 24,000 in 2014. With over a thousand new diplomates automatically enrolled in MOC every year, the ABR can anticipate rising revenue for the foreseeable future.

Also note that MOC revenue scales fantastically, as the incremental cost to service an individual enrollee approaches $0 but each one brings in $340 every year throughout their career.

Subtracting MOC income from the total fee-related revenue of $16,271,311 would leave around $7 million in revenue from exam services ($7.4MM in 2016). As I’ve discussed before, around $3 million of that comes just from residents, who spend around 1% of their pre-tax income directly to the ABR annually without exception.
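The back-of-the-envelope arithmetic here is simple enough to write down. Enrollment counts and the $340/year fee come from the 990s and fee schedule discussed above; the exact MOC/exam split isn’t broken out in the filings, so this is an estimate:

```python
MOC_FEE = 340  # dollars per diplomate per year

# MOC enrollment by year, per Part III of the 990s.
enrolled = {2014: 24_000, 2015: 25_000, 2016: 26_140, 2017: 27_000}

moc_revenue = enrolled[2017] * MOC_FEE    # ≈ $9.2 million
fee_revenue = 16_271_311                  # total fee-related revenue, 2017
exam_revenue = fee_revenue - moc_revenue  # what's left: exam services

print(f"Estimated 2017 MOC revenue:  ${moc_revenue:,}")
print(f"Estimated 2017 exam revenue: ${exam_revenue:,}")
```

Every thousand new MOC enrollees adds another $340k per year to the first line, essentially forever.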

In the 2018 annual report, ABR president Brent Wagner made this comment on the first page of content:

As a not-for-profit, the ABR collects fees to cover the expenses of administering the programs. Reserves are maintained to cover unexpected capital expenses, but fees are set as closely as possible to approximate administrative expenses.

Based on the numbers you just read, I think you can see where this is going. But let’s see how that holds up.


Expenses also continued to rise from $13,758,299 in 2015 to $15,590,929 (a 13% increase) in 2016 to $16,468,080 in 2017 (a 5.6% increase). Payroll-type expenses increased from $6,932,139 to $7,342,360 (a 6% increase) to $8,256,080 (a 12% increase).

Revenue minus expenses yields a “non” profit of $962,179 in 2017, down from $1,329,124 in 2016.

So despite $17.4MM in revenue, the ABR claims its expenses take out all but a single mil. Let’s look at some of those.
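A sketch of the same arithmetic, using the revenue and expense figures above, for anyone who wants to verify the quoted growth rates and the size of the surplus:

```python
revenue_2017 = 17_430_259
expenses = {2015: 13_758_299, 2016: 15_590_929, 2017: 16_468_080}
payroll  = {2015: 6_932_139, 2016: 7_342_360, 2017: 8_256_080}

# Year-over-year growth in total expenses and payroll-type expenses.
for year in (2016, 2017):
    print(f"{year}: expenses {expenses[year] / expenses[year - 1] - 1:+.1%}, "
          f"payroll {payroll[year] / payroll[year - 1] - 1:+.1%}")

surplus_2017 = revenue_2017 - expenses[2017]  # the "non" profit
print(f"2017 surplus: ${surplus_2017:,}")
```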


We don’t need to name names of every ABR officer and their compensation, but we can at least track the highest paid, who is Dr. Valerie Jackson, the ABR’s executive director during the studied period.

[Table of Dr. Jackson’s compensation by year: salary, non-salary (retirement + benefits), and total compensation]

2015 was a good year to hold the reins.

These numbers are interesting in how they may or may not correlate with the changes to the ABR board certification process that occurred in 2013 as well as the rising profits from MOC endeavors (common among ABMS members), perhaps coincidentally followed by small token decreases in light of increasing physician frustration with MOC across the country, increased scrutiny of member-board financials (like these wonderful reads from Newsweek about the ABIM), and the recent series of class-action lawsuits.

Or that could totally be a coincidence. What any reader can agree on is that being the head of the ABR is certainly a livable income.

Another reason payroll increased? In 2017, the ABR hired a “Director of External Relations” whose base salary is $135,033. Trying to make the ABR look good is apparently a challenging full-time task. To round out the A-Team, they also hired a similarly paid “IT Director” (possibly as a result of Mammogeddon) and a “Managing Director,” to, um, manage and direct things?

“Other” Expenses

Other expenses make up the majority of the ABR’s expenses and totaled $8,248,569. What are “other” expenses, you ask? Look no further than Part IX (“Statement of Functional Expenses”).

These include things like credit card processing fees, office expenses, insurance, etc. I wouldn’t pretend to have any idea about how much the ABR should spend on staplers and toner.

Expenses are hard to parse because they’re grouped into large nebulous categories.

  • In 2016, the ABR spent $2,204,166 for Exam Services. Given that the ABR owns its own testing center, the exams are administered by employees, and the questions are largely written by volunteers, I would personally be interested in having these big numbers broken down some more. In 2017, this dropped down to $466,472.
  • $1,292,317 was for conferences, conventions, and meetings. The ABR does reportedly convalesce in Hawaii twice annually. This is down from $1,534,608 in 2016.
  • Legal services accounted for $45,439 in 2017. Watch for that number to rise substantially in 2019.
  • $1,051,695 was for “Other fees” — who knows? These are often payments made to various independent contractors that don’t fall into the other categories, like investment management fees, IT, etc.
  • Another mil for office expenses. Another mil for occupancy (rent, utilities, real estate taxes, etc).
  • And a big $5.4MM for salaries/wages of the nonexecutive rank and file.

What we do know from Schedule J, which details compensation information, is that the ABR does “reimburse board members for companion’s travel.” That’s probably in the $640,464 figure for travel, which is separate from the $1.3MM above for conferences/meetings. Twice annual all-expense-paid trips for the family to Hawaii do sound nice.

The ABR doesn’t run lean.

War Chest

Rising revenues have nicely padded the ABR’s current assets, which totaled $51,737,127 in 2017, up from $49.5 million in 2016 and $45.7 million in 2015.

The ABR does claim $10.8MM in liabilities, so according to its 990, the net assets total $40.8MM. However, these liabilities include $8,914,139 in “deferred revenue,” a mostly-BS accounting technique referring to payments made in advance for services not yet rendered. That is to say, the lower figure is meaningless in a common-sense interpretation; it’s a convenient way to make it look like you’re making less money than you really are. Based on the figure, it would seem the ABR jams all the MOC fees in there, though it’s not as though they offer refunds.

In everyday terms, most would argue that all of the ABR’s revenue is “unearned.” Regardless, outside of clever spreadsheets, that cash isn’t really a liability. It’s all sitting in the bank.

So, the ABR was really holding on to a war chest of almost $52 million in 2017.
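To make the deferred-revenue sleight of hand concrete, here’s a quick sketch using the 990 figures above (the $10.8MM liability total is rounded, as in the text, so the reported net lands slightly off the $40.8MM figure):

```python
total_assets     = 51_737_127   # 2017 assets per the 990
liabilities      = 10_800_000   # total liabilities (rounded)
deferred_revenue = 8_914_139    # prepaid fees booked as a liability

reported_net = total_assets - liabilities
# If you treat prepaid, non-refundable fees as plain old cash instead:
adjusted_net = reported_net + deferred_revenue

print(f"Reported net assets:       ${reported_net:,}")
print(f"Ignoring deferred revenue: ${adjusted_net:,}")
```

Strip out the deferred-revenue entry and the “net” figure climbs most of the way back toward the ~$52 million war chest.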

Even with questionable payroll, staffing, and vacation meeting practices, the ABR still has an annual operating revenue surplus (aka a profit) of a million bucks. What size of “reserves” will finally be sufficient to “cover unexpected capital expenses,” so to speak? Maybe the slush fund was to cover the inevitable lawsuit. Outside of its testing business, the ABR investment portfolio itself gained almost a million in 2017 and almost two million in 2016.

Even if the ABR stopped making a profit on fees (hard to do even with an impressive meeting budget), they would still likely make money every year. The portfolio proceeds would certainly be enough, for example, to drop the resident and fellow fees down to attending levels from their current $300 premium ($640/yr for trainees vs $340/yr for MOC).

The ABR Foundation

The ABR does maintain a separate “Foundation” that is a 501(c)(3) organization. The ABR Foundation, unlike the ABR, is able to receive tax-exempt charitable donations. The nebulous purpose of the ABR Foundation is “to demonstrate, enhance, and continuously improve accountability to the public in the use of medical imaging and radiation therapy.” Like you, dear reader, I have no idea what that means.

Later, the mission of the foundation is described: “The Foundation carries out the scientific, educational and charitable purpose of the mission of the American Board of Radiology.” I have a hard time picturing that too. The final description of the mission: “to demonstrate, enhance, and continuously improve accountability to the public in the use of medical imaging and radiation therapy.” Darn, that still doesn’t help.

In 2017, it only made money from investments on its net assets (now $1.6MM). No one gave them any money, and they awarded no new grants. Why?

Because, since 2015:
“The Foundation is re-evaluating program services offered to determine how to most effectively achieve the mission statement. During this period of re-evaluation, no new contributions are currently being accepted. Current program commitments for sponsorships continue to be serviced.”

In 2014, the ABR awarded two grants:
1. $95,000 to create a national brachytherapy registry and QA program
2. $25,550 to create ethics and professionalism instructional modules

But 2013 was a much more interesting year:
The ABR Foundation somehow managed to receive $202,348. Total expenses were $305,982:
1. $95k again went to the brachytherapy project
2. $77,599 went to “summit meetings/conduct symposiums to optimize a national strategy for safe and appropriate medical imaging”
3. An additional $115,908 was also “meetings expenses”

So, it’s meetings all the way down.

Either way, the foundation seems mostly defunct now.

American Board of Radiology International

This is a “disregarded entity” that made $178,750 with total assets of $539,649 in 2016. Its stated purpose is to “provide guidance in a radiology certification exam program.” Yes, a program. I have no idea.




So, to summarize: I am not an accountant. If you or someone you love has more information about the ABR’s operations or financial workings, please feel free to contact me. I would love to update this post (or all my posts, for that matter). I feel strongly that there should be more information available to candidates and diplomates, and it would be much better if it came unwhitewashed from the ABR itself rather than from someone like me throwing snarky potshots from the sidelines.

The ABR makes a lot of money from trainees and radiologists who have zero say in its operations and to whom the ABR does not feel accountable.

The ABR’s expenses are hard to parse, but the organization is clearly not super-duper efficient in its use of very generous certification fees.

The war chest was around $52 million in 2017, is almost certainly higher now, and will continue increasing every year for the foreseeable future due to essentially compulsory MOC.

Assuming any of the current lawsuits progress to discovery and aren’t confidentially settled, we can eventually expect some fascinating news in the years to come. In the meantime, those legal fees certainly aren’t going to help their bottom line.


The 2019 ABR Core Exam Results, the Board Prep Arms Race, and Where It All Went Wrong

On August 15, the ABR released the 2019 Core Exam results, which included the highest failure rate since the exam’s inception in 2013: 15.9%.

(Side note: due to a “computer error,” the ABR decided to release the aggregate results before sharing individual results with trainees, resulting in entirely unnecessary extra anxiety. This itchy trigger finger release is in stark contrast to the Certifying Exam pass rates, which have never been released.)


[Table of Core Exam results by year: percent passed, percent failed, percent conditioned, and total examinees]

So what happened?


Option 1

One potential explanation is that current residents are less intelligent, less hard-working, or less prepared for the exam despite similar baseline board scores in medical school, similar training at their residency programs, and now very mature and continually improving board preparation materials. This would seem unlikely.

If it really does simply come down to resident “caliber” as reflected in minor variations in Step scores, then I would volunteer that we should be concerned that a minimally related test could be so predictive (i.e., what are we testing here: radiology knowledge gained over years of training, or just MCQ ability?).

Option 2

Another explanation is that—despite the magical Angoff method used to determine the difficulty/fairness of questions—the ABR simply isn’t very good at figuring out how hard their test is, and we should expect to see large swings in success rates year to year because different exams are simply easier or harder than others. This is feasible but does not speak well to the ABR’s ability to fairly and accurately test residents (i.e., their primary stated purpose). In terms of psychometrics, this would make the Core exam “unreliable.”

The ABR would certainly argue that the exam is criterion-based and that a swing of 10% is within the norms of expected performance. The simple way to address this would be to have the ABR’s psychometric data evaluated by an independent third-party such as the ACR. Transparency is the best disinfectant.

Option 3

The third and most entertaining explanation is that current residents are essentially being sacrificed in petty opposition to Prometheus Lionhart. The test got too easy a couple of years back, and there needed to be a course correction.


The Core Prep Arms Race

With the widespread availability of continually evolving high-yield board prep material, the ABR may feel the need to update the exam in unpredictable ways year to year in order to stay ahead of “the man.”

(I’ve even heard secondhand stories about persons affiliated with the ABR in some capacity making intimations to that effect, including admitting to feeling threatened by Lionhart’s materials/snarky approach and expressing a desire to “get him.” I wouldn’t reprint such things because they seem like really stupid things for someone to admit within public earshot, and I certainly cannot vouch for their veracity.)

If you’re happy with how your exam works, and then third parties create study materials that you feel devalue the exam, then your only option is to change (at least parts of) the exam. This may necessitate more unusual questions that do not make appearances in any of the several popular books or question banks. This is also not a good long-term plan.

This scenario was not just predictable but was the inevitable outcome of creating the Core Exam to replace the oral boards. If the ABR thought people “cheating” on the oral boards by using recalls was bad, replacing that live performance with an MCQ test (the single most recallable and reproducible exam format ever created) was a true fool’s errand.

A useless high-stakes MCQ test based on a large and unspecified fraction of bullshit results in residents optimizing their learning for exam preparation. I see first-year residents using Crack the Core as a primary text, annotating it like a medical student annotates First Aid for the USMLE Step 1. Look no further than undergraduate medical education to see what happens when you make a challenging test that is critically important and cannot be safely passed without a large amount of dedicated studying: you devalue the actual thing you ostensibly want to promote.

In medical school, that means swathes of students ignoring their actual curricula in favor of self-directed board prep throughout the basic sciences and third-year students who would rather study for shelf exams than see patients. The ABR has said in the past that the Core Exam should require no dedicated studying outside of daily service learning. That is blatantly untrue, and an increasing failure rate only confirms how nonsensical that statement was and continues to be. Instead, the ABR is going to drive more residents into a board prep attitude that will detract from their actual learning. Time is finite; something always has to give.

If I were running a program that had recurrent Core Exam failures, I wouldn’t focus on improving teaching and service-learning, because on a system level, those things are not only hard to do well but probably wouldn’t even help. The smart move would be to give struggling residents more time to study. And that is bad for radiology and bad for patients.

The underlying impression is that the ABR’s efforts to make the test feel fresh every year have forced them to abandon some of the classic Aunt Minnies and reasonable questions in favor of an increasing number of bullshit questions in either content or form in order to drive the increasing failure rates. Even if this is not actually true, those are the optics, and that’s what folks in the community are saying. It’s the ABR’s job to convince people otherwise, but they’ve shown little interest in doing so in the past.

There is no evidence that the examination has gotten more relevant to clinical practice or better at predicting clinical performance, because there has never been any data nor will there ever be any data regarding the validity of the exam to do that.


The Impossibility of True Exam Validity

The ABR may employ a person with the official title of “Psychometric Director” with an annual base salary of $132,151, but it’s crucial to realize the difference between psychometrics in terms of making a test reliable and reproducible (such that the same person will achieve a similar score on different days) and that score being meaningful or valid in demonstrating what it is you designed the test to do. The latter would be if passing the Core Exam meant that you were actually safe to practice diagnostic radiology and failing it meant you were unsafe. That isn’t going to happen. It is unlikely to happen with any multiple-choice test, because real life is not a closed-book multiple-choice exam, but it’s compounded by the fact that the content choices just aren’t that great (no offense to the unpaid volunteers that do the actual work here). Case in point: there is a completely separate dedicated cardiac imaging section, giving it the same weight as all of MSK or neuroradiology. Give me a break.

The irony here is that one common way to demonstrate supposed validity is to norm results with a comparison group. In this case, to determine question fairness and passing thresholds, you wouldn’t just convene a panel of subject matter experts (self-selected mostly-academic rads) and then ask them to estimate the fraction of minimally competent radiologists you’d expect to get the question right (the Angoff method). You’d norm the test against a cohort of practicing general radiologists.

Unfortunately, this wouldn’t work, because the test includes too much material that a general radiologist would never use. Radiologists in practice would probably be more likely to fail than residents. That’s why MOC is so much easier than initial certification. Unlike the Core exam, the statement that no studying is required for MOC is actually true. Now, why isn’t the Core Exam more like MOC? That’s a question only the ABR can answer.

I occasionally hear the counter-argument that the failure rate should go up because some radiologists are terrible at their jobs. I wouldn’t necessarily dispute that last part, with the caveat that we are all human and there are weak practitioners of all ages. But this sort of callous offhand criticism only makes sense if an increasing failure rate means that the people who pass the exam are better radiologists, the people who fail the exam are worse radiologists, and those who initially fail and then pass demonstrate a measurable increase in their ability to independently practice radiology. It is likely that none of the three statements are true.

Without getting too far into the weeds discussing types of validity (e.g., content, construct, and criterion), a valid Core Exam should have content that aligns closely with the content of practicing radiology, should actually measure radiology practice ability and not just radiology “knowledge,” and should be predictive of job performance. 0 for 3, it would seem.

So, this exam is lame and apparently getting lamer with no hope in sight. And let’s not get started on the shameless exercise in redundant futility that is the Certifying Exam. So where did everything go wrong? Right from the start.

That’s the end of the rant. But let’s end with some thoughts for the future.

What the Core Exam SHOULD Be

To the ABR, feel free to use this obvious solution. It will be relatively expensive to produce, but luckily, you have the funds.

Diagnostic radiology is a specialty of image interpretation. While some content would be reasonable to continue in a single-best-answer multiple-choice format, the bulk of the test should be composed of simulated day-to-day practice. Unlike most medical fields, where it would be impossible to objectively see a resident perform in a standardized assortment of medical situations, the same portability of radiology that makes AIs so easy to train and cases so easy to share would be equally easy to use for resident testing.

Oral boards aren’t coming back. The testing software should be a PACS.

Questions would be cases, and the answers would be impressions. Instead of having a selection of radio buttons to click on, there would be free-text boxes that narrow down to a list of diagnoses as you type (like when you try to order a lab or enter a diagnosis in the EMR; this component would be important to make grading automated).

The exam could be anchored in everyday practice. One should present cases centered on the common and/or high-stakes pathology that we expect every radiologist to safely and consistently diagnose. We could even have differential questions by having the examinee enter two or three diagnoses for the cases where such things are important considerations (e.g., some cases of diverticulitis vs colon cancer). These real-life PACS-based cases could be tied into second-order questions about management, communication, image quality, and even radiation dose. But it should all center around how radiologists actually view real studies. It could all be a true real-world simulation that is a direct assessment of relevant practice ability and not a proxy for other potentially related measurables. Let’s just have the examinees practice radiology and see how they do.

The ABR has argued in the past that the Core exam cannot be ported to a commercial center, which is largely the fault of the ABR for producing a terrible test. But at least that argument would finally hold water if the ABR actually deployed a truly unique evaluative experience that could actually demonstrate a trainee’s ability. The current paradigm is silly and outdated, and radiology is uniquely positioned within all of medicine to do better. The exam of the future should not be rooted in the largely failed techniques of the past.