Summer 2026 Psychometric Internship Announcement
Internship Opportunity
The American Board of Internal Medicine (ABIM) is pleased to announce its summer psychometric
internship program for 2026. The internship is an eight-week program running from Monday, June 8
through Friday, July 31, in Philadelphia. During the program, each intern will take primary
ownership of an applied psychometric research project under the guidance of one of ABIM’s
psychometricians. In addition, the internship curriculum includes instructional lectures and
opportunities for interns to gain applied operational experience. Because ABIM is the nation’s
largest physician certification organization, interns will gain experience with issues unique to
professional testing organizations.
Qualifications
• Doctoral student in educational measurement or a related field, with at least two years of
coursework completed by the start of the internship
• Experience with item response theory (preferred)
• Experience programming in SAS or R (preferred)
• Excellent communication skills
• Interest in certification testing
• Eligibility for legal employment in the United States
Stipend
The ABIM provides up to $15,000 for the eight-week internship program. This total includes a
$10,000 stipend as well as up to $5,000 in housing reimbursement.
Research Project
For their primary research project, each intern should expect to perform all stages of the research
process, from literature review to discussion and dissemination of results. At the conclusion of the
program, interns will be expected to share their results by giving a presentation to an audience of
psychometric staff. Further, interns will be encouraged to submit their summer project for
presentation at a professional conference and/or for publication. Each intern will work with their
mentor to select an appropriate project for their experience level and interests. Examples of
previously completed internship projects can be found on the next page.
Application
Please submit your curriculum vitae and a letter of interest to Michele Johnson, Research Program
Manager (researchintern@abim.org) by Monday, February 2, 2026.
Examples of Previously Completed Internship Projects
• Rapid Guessing in Medical Certification Exams (LKA): Impact and Implications. Many
medical certification boards have adopted continuous assessment models that consist of multiple
shorter test administrations over a fixed time period. Such assessments may have relatively
high rates of rapid guessing. The intern designed and conducted a simulation study to examine
the impact of rapid guessing on item parameter estimation, drift and equating, examinee ability
estimation, and pass/fail decisions, and how these impacts compound over time.
Cao, Y., & Sauder, D. Paper presented at NCME 2025.
• The Impact of Compromised Anchor and Non-Anchor Items on Equating Results. The
project examined the extent to which compromised anchor and non-anchor items affect the
equating process and undermine the validity of score interpretations. The intern designed and
conducted a simulation study evaluating the effects of compromised items on scaling
coefficients and equated scores across different proportions of cheating examinees and
compromised anchor and non-anchor items.
Wan, S., & Myers, A. Paper presented at NCME 2023.
• The Effect of Patient-Physician Gender Concordance on Item Performance. The purpose of
this project was to determine how patient-physician gender concordance affects physicians’
clinical practice, using exam item performance as a proxy. The intern designed an applied
study that leveraged a hierarchical logistic regression model to investigate whether female
and male physicians performed differently on female-patient and male-patient items,
controlling for physician ability.
Tian, C., & Rewley, K. Paper presented at NCME 2022.
• Anchor Item Replacement in the Presence of Consequential Item Parameter Drift. The
purpose of this project was to develop evidence-based recommendations for replacing anchor
items when a substantial number are flagged for item parameter drift. The intern conducted
a simulation study that investigated different item replacement strategies and examined the
effects of each strategy on outcomes such as pass/fail classification accuracy and the root
mean squared deviation (RMSD) of ability (theta) estimates.
Chang, K., & Rewley, K. Paper presented at NCME 2021.
• Evaluating Use of an Online Open-Book Resource in a High Stakes Credentialing Exam.
This project examined item and examinee characteristics associated with the use of an open-book
resource throughout a high-stakes medical certification exam. Using exam process data and a
generalized estimating equations (GEE) modeling framework, the intern examined use of the
open-book resource and how it might affect examinees’ test-taking experience and performance.
Myers, A. & Bashkov B. Paper presented at NCME 2020.
• Investigating the Impact of Parameter Instability on IRT Proficiency Estimation. This
project examined how poorly estimated item parameters impact different proficiency
estimators. The intern conducted a simulation study to examine how different levels of
parameter instability impact Bayesian vs. non-Bayesian estimators as well as pattern vs.
summed-score estimators.
McGrath, K. & Smiley, W. Paper presented at NCME 2019