ACVR DI Certifying Examination – 2019 results feedback

As you are aware, this was the first year we moved the certifying examination from an oral exam to a computer-based exam.  Hopefully you have been preparing your residents for this process with the materials the exam committee has provided throughout the past year.  We have tried to be as transparent as possible and definitely did not want the process to get in the way of candidates' performance as they stepped toward Diplomate status.

For the examination, Drs. John Mattoon, Stephanie Nykamp, Nate Nelson, and I were on hand for both days of administration.  The examination went well from a technical standpoint, with only a few small issues that were handled immediately.  We were pleased with the results, and all examination committee members felt this process was much better than the past oral examinations in which they had participated.  In addition, in speaking with several candidates who had failed the oral certifying exam last year, they really liked the new format and felt less pressure than they had during the original oral examination.  The setup at the Oquendo Center was great, and the candidates did not have any issues with the fact that we had control of the computers, monitors, and all computer hardware (which was standardized for each candidate).

Each examination committee member graded a number of questions and was blinded to candidate identity.  We had some time to process the results and review questions that had a greater than 20% disparity between the two graders.  In these cases, a third reviewer was assigned the question, and the question was regraded.  The examination performed well, and there were no cases that needed to be excluded from any of the four sections.  The cut score (passing grade) for the exam was set with the help of a psychometrician based on standard score-setting practices.  The cut score may vary from year to year based on the analysis of the exam cases.  For this examination, the cut score was 72%.

For the results, a total of 48 candidates took the examination and 38 candidates passed (pass rate of 79%).  The majority of the candidates who failed did poorly in multiple modalities.  The one section in which the candidates, as a group, did worse than the other sections was CT.  We did not identify a specific reason for this, but this section seemed to be more difficult for the candidates.

Some of the common observations from the exam committee members regarding the failing candidates included:

  1. Incorrect anatomic localization.
  2. Missed major findings and thereby missed correct synthesis.
  3. Could not clearly identify an appropriate primary differential diagnostic consideration, or the top three possibilities, based on the roentgen abnormalities in the context of the signalment and history. In fact, a number of candidates used phrases such as “cannot be excluded”, “is not ruled out”, “may be possible”, or “is felt to be less likely but still possible.”
  4. Missed only a few major findings or got all major findings but did not synthesize the case correctly.
  5. Provided descriptions that were often incomplete and lacked a clear thought process regarding synthesis of the abnormalities described.
  6. Sometimes failed to synthesize the case at all, with the synthesis section just being a repetition of abnormal findings.
  7. Identified normal structures as abnormal, or identified abnormalities that were not present at all.
  8. Failed to be explicit in differential diagnoses (e.g., “mass” does not equate to neoplasia and is not a synthesized differential).
  9. Occasionally failed to relate the abnormalities seen to the clinical signs of the patient.
  10. Incorrectly prioritized differentials.
  11. Listed many differentials when a prioritized top one or two would have been sufficient.

All in all, the candidates seemed to have enough time for 18 cases in three hours.  A few candidates (three) stated that they were not able to finish all of the cases in one of the four sections.

When we walked around the room, the candidates did appear to have a good “clock” sense, in that they were tracking appropriately with the case material.  In some situations this was hard to assess, as those candidates went through the MR and CT cases first, prior to the radiograph and US cases.  ExamSoft does provide candidates the opportunity to set “timers” within the exam to keep themselves on track (for example, at 1 hour, 2 hours, and 2 hours 55 minutes).

There were a few issues with the video player (VLC) that we are addressing at this time, but they did not appear to be a major problem for any one candidate.

One other comment: nearly all of the candidates stuck with a paragraph-style format for the description and synthesis.  We believe this slowed the candidates down because of the excessive typing involved.  Those candidates who used a bulleted format (as suggested) had no issues with time management or getting through all of the cases in the four sections.  Many candidates also rewrote much of the description in their synthesis; this is not necessary and would have taken a lot of time.

There is one suggestion that we might make regarding the training of your third-year residents.  During their final year, these residents should be progressed in the number of cases they evaluate and the time they spend on a given KCC, so that their final KCC is a true three-hour exam with a mixture of 18 cases, comparable to the certifying examination.  Finding out whether other residency programs are doing this and sharing these three-hour exams would allow one to put together a two-day, two-session-per-day examination.  This would provide the best representation of the exam process itself.  The examination is a marathon, and residents do need help in training for it.

Since the Oquendo Center was a great venue for ensuring stable, comfortable, and consistent examination computer hardware and environment, we are requesting that Council approve holding both the certifying and preliminary examinations at this site at the end of August next year.  One of the reasons for this change for the preliminary examination is that PSI is no longer administering the preliminary examination, and therefore we do not have access to their testing centers across the country.  We have heard complaints of inconsistencies in these testing centers and, at least for the time being, we will bring everyone together in a controlled testing environment at the Oquendo Center.

Thank you for your time, and if there are any questions, please do not hesitate to contact either me (berryk@ufl.edu; 352-672-2420; current Chair, finishing December 31, 2019) or Dr. Nate Nelson (ncnelso2@ncsu.edu; current Assistant Chair, starting January 1, 2020 as Chair).

Sincerely,

Clifford R. Berry, DVM, DACVR

Courtesy Professor, University of Florida

Academic Coordinator, Vet-CT