The new score report format reflects both advances in the underlying science of assessment and changes to the USMLE examinations over time. USMLE score reports were virtually unchanged for nearly two decades. During that time, the USMLE exams experienced incremental changes, including changes to content sampling, item types, and review and modification of minimum passing scores. Simultaneously, measurement researchers, including those who support the USMLE program, continued to advance the science of subscore computational methods, data visualization, and score reporting.
The process used to design a new score report format included:
- The development of prototype score reports that better supported the intended inferences
- Approval of the prototypes by USMLE governance committees (composed mostly of US medical school faculty)
- Focus group studies with USMLE test-takers to identify preferences among prototypes and to evaluate their ability to make correct inferences (and not make incorrect inferences) from the prototype score reports
The examinee focus group results clearly demonstrated that the new score report format was viewed as a substantial improvement by USMLE test-takers. Results also showed that USMLE examinees were both able to interpret the information provided in the new score report format appropriately and less likely to make incorrect inferences.
Please view the sample score report here.
Your score reflects your relative mastery of the concepts and principles that constitute the basis of safe and effective patient care specific to each Step examination.
The average score and standard deviation for recent administrations are included on your score report. Detailed information about interpreting USMLE scores is available in the Score Interpretation Guidelines.
Please see additional information about score reports here.
The USMLE Management Committee establishes the minimum passing score. The USMLE Management Committee reviews data for each component in the USMLE sequence approximately once every four years and decides whether to change the recommended minimum passing score.
All scores are made comparable through equating, a psychometric process that adjusts scores based on the difficulty of the questions. This can be thought of as small score increases applied to examinees who see somewhat more difficult sets of test questions, and small score decreases applied to test-takers who see somewhat less difficult sets of test questions.
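The USMLE does not publish its exact equating procedure, but the idea can be illustrated with linear equating, one common psychometric approach: a score's position relative to the mean and spread on the new form is mapped to the same relative position on a reference form's scale. The function name and the score data below are hypothetical, for illustration only.

```python
# Illustrative sketch only: the USMLE's actual equating method is not
# described in this document. Linear equating maps a score on a new
# (e.g., somewhat harder) test form onto the scale of a reference form,
# so that equal reported scores reflect equal performance.

from statistics import mean, stdev

def linear_equate(score, new_form_scores, reference_form_scores):
    """Map a score on the new form onto the reference form's scale."""
    m_new, s_new = mean(new_form_scores), stdev(new_form_scores)
    m_ref, s_ref = mean(reference_form_scores), stdev(reference_form_scores)
    # A score k standard deviations above the new form's mean is mapped
    # to the score k standard deviations above the reference form's mean.
    return m_ref + (s_ref / s_new) * (score - m_new)

# Hypothetical data: the new form was harder, so scores ran 5 points lower.
new_form = [210, 220, 230, 240, 250]
reference = [215, 225, 235, 245, 255]
print(round(linear_equate(230, new_form, reference)))  # -> 235
```

Here an examinee scoring 230 on the harder form receives a small upward adjustment, matching the "small score increases" described above.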
Information on minimum passing scores for USMLE examinations is posted here.
Your performance is compared to other examinees by comparing your three-digit score to that of a recent group of examinees (students from US and Canadian medical schools taking the examination for the first time). The blue bars represent the distribution of this group’s scores, with higher bars indicating more individuals with scores in that range. Your overall performance (three-digit score) is marked with an orange rectangle and solid line. The black rectangle and dashed line indicate the minimum passing score.
If your overall performance (indicated by the orange rectangle and solid line on page 1) is toward the right side of the chart and to the right of the tallest blue bar, your performance was higher than that of most examinees in the comparison group. If your overall performance is toward the left side of the chart, or to the left of the tallest blue bar, your performance was lower than that of most examinees in the comparison group.
Additional information about how your performance compares to others (including norm tables and summary data from recent administrations) is available in the Score Interpretation Guidelines. Performance data by group for each Step examination are published annually, based on the performance of examinees in that year.
The standard error of the estimate (SEE) indicates how you might perform if you were to retest repeatedly under the same conditions (without learning or forgetting). Approximately two-thirds (68%) of the time, your score would fall within one SEE of your reported score (your score ± 1 SEE); about 95% of the time, it would fall within two SEEs of your reported score. This information may be useful if you are planning to retake the examination. The standard error of measurement (SEM) can also be used as an indication of the precision of the three-digit score, or the amount of measurement error; it indicates how far an examinee's "true" score might be from their reported score. However, because most test-takers are interested in the score they would achieve if they tested again, we report the SEE, which provides that information.
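The 68% and 95% bands described above are simple arithmetic on the reported score and the SEE. The sketch below uses a hypothetical score and SEE value; your score report shows the actual figures.

```python
# A minimal sketch of the SEE bands described above.
# The reported score (232) and SEE (8) here are hypothetical examples;
# the actual values appear on an examinee's score report.

def score_band(reported_score, see, n_see=1):
    """Range the score would fall in on repeated testing:
    roughly 68% of the time for n_see=1, roughly 95% for n_see=2."""
    return (reported_score - n_see * see, reported_score + n_see * see)

print(score_band(232, 8))      # -> (224, 240): ~68% band (score ± 1 SEE)
print(score_band(232, 8, 2))   # -> (216, 248): ~95% band (score ± 2 SEE)
```

For an examinee considering a retake, the lower end of the 68% band gives a rough sense of how much scores can vary on retesting without any change in preparation.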
We no longer provide a graphical comparison of your performance in each content area to borderline performance. Instead, you can use the orange box next to each content area label to make a similar comparison. For example, if your overall performance was borderline (close to the minimum passing score) and your performance in a content area is categorized as the same as your overall performance, then your performance in that area was also borderline.
Your score report shows both your overall performance (your three-digit score and pass/fail outcome) and an indication of whether your performance in each content area within the examination was higher, lower, or the same as your overall performance. For example, if your overall performance was strong and most of the content areas indicate that your performance was "similar," your performance across the examination was strong.
This indicates that your performance was consistent across the examination. Your performance in each content area was neither stronger nor weaker than your overall performance.
To interpret the boxes, you can use the chart and your three-digit score from the first page to determine your performance. You can then use the boxes next to each content area to determine whether your performance in each content area is the same as or different from your overall performance. If each box indicates that your performance was the same, you had no areas of performance that were meaningfully stronger or weaker than your overall performance on the examination.
The Step examinations are highly integrative, and you should plan to review all content areas. In prioritizing which areas to review, you should take into account both the representation of that content area on the examination (based on the % items per exam next to the content area label) and your relative performance in that content area. Remediation strategies focused solely on relatively weak areas of performance are unlikely to be the most effective for failing examinees. In most cases, failing scores are best remediated through attention to all content areas.
The "% Items Per Test" describes the breakdown of test questions on a typical examination by content area. For example, 30%–40% of each Step 3 test consists of items within the Patient Care: Diagnosis content area. Though the amount of content in each area varies from test to test, these percentages provide a guideline for how much content from each area appears on the examination.
Your score report is provided only for your personal use. When you want a third party (e.g., residency programs) to receive an official record of your USMLE scores, request that your registration entity send the transcript (see Requesting a Transcript of USMLE Scores). Under some circumstances, medical schools may receive scores and pass/fail outcomes for their students.