
Using Odds Ratios to Detect Differential Item Functioning

Date: April 17, 2018 (Tuesday)
Time: 11:00 – 12:00
Dr. Kuan-Yu Jin
Room 108, Runme Shaw Building, HKU


Test fairness is an important concern for examinees. Differential item functioning (DIF), which occurs when two examinees with identical ability levels but from different groups do not have the same probability of answering an item correctly, makes test scores incomparable and substantially threatens test validity. DIF assessment has therefore become a routine analysis in test development. This study proposes a simple but effective method for detecting DIF using the odds ratio (OR) of two groups’ responses to a studied item. The performance of the OR method is evaluated and compared with two conventional approaches, namely, the logistic regression (LR) and Mantel–Haenszel (MH) methods, through a series of simulation studies. The results show that the OR method, even without a purification procedure, outperforms the LR and MH methods in controlling false positive rates and yielding high true positive rates, especially when a test contains a high percentage of DIF items favoring the same group. In addition, only the OR method remains feasible when tests adopt an item matrix sampling design. The effectiveness of the OR method is illustrated using PISA 2015 data.
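The abstract does not spell out the estimator used in the proposed OR method, but for readers unfamiliar with odds-ratio DIF statistics, the classic Mantel–Haenszel common odds ratio gives the flavor of the idea: stratify examinees by an ability proxy (e.g., total or rest score), form a 2×2 table of group × correctness for the studied item in each stratum, and pool the tables into one odds ratio. The sketch below is illustrative only; the function name `mh_odds_ratio` and the stratification choice are assumptions of this example, not the speaker's implementation.

```python
from collections import defaultdict

def mh_odds_ratio(responses, groups, matching_scores):
    """Mantel-Haenszel common odds ratio for one studied item.

    responses:       0/1 answers to the studied item
    groups:          'ref' (reference) or 'foc' (focal) group labels
    matching_scores: ability proxy used for stratification (e.g., rest score)
    """
    # Collect (response, group) pairs within each ability stratum.
    strata = defaultdict(list)
    for r, g, s in zip(responses, groups, matching_scores):
        strata[s].append((r, g))

    num = den = 0.0
    for cell in strata.values():
        # 2x2 table for this stratum: group x correct/incorrect.
        a = sum(1 for r, g in cell if g == 'ref' and r == 1)  # ref correct
        b = sum(1 for r, g in cell if g == 'ref' and r == 0)  # ref incorrect
        c = sum(1 for r, g in cell if g == 'foc' and r == 1)  # foc correct
        d = sum(1 for r, g in cell if g == 'foc' and r == 0)  # foc incorrect
        n = a + b + c + d
        if n:
            num += a * d / n
            den += b * c / n
    return num / den if den else float('nan')

# Toy data, one stratum: reference group answers 3/4 correct,
# focal group 1/4 correct, so the odds ratio is (3*3)/(1*1) = 9.
responses = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ['ref'] * 4 + ['foc'] * 4
scores = [5] * 8
print(mh_odds_ratio(responses, groups, scores))  # → 9.0
```

An odds ratio near 1 indicates no DIF; values well above (below) 1 suggest the item favors the reference (focal) group after matching on ability.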

About the Speaker
Dr. Kuan-Yu Jin is a Post-Doctoral Fellow at the Faculty of Education, The University of Hong Kong. His primary research interests include Rasch measurement, item response models, and psychometrics.

Everyone is welcome to attend!

If interested, please confirm your attendance by sending an email to kpsantos@hku.hk.
