Objective. To study predictors of discordance between self-reported physician diagnosis and administrative database diagnosis of arthritis.

Methods. All veterans who utilized Veterans Integrated Service Network (VISN)-13 medical facilities were mailed a questionnaire that included patient self-report of a physician diagnosis of arthritis, questions regarding demographics and functional limitation, and the SF-36V (a validated version of the Medical Outcomes Study Short-Form 36). The kappa coefficient was used to assess the extent of agreement between self-reported physician diagnosis and administrative database definitions that incorporated International Classification of Diseases (ICD) codes and use of medications for arthritis. Predictors of overall discordance between self-report and administrative database diagnosis were identified using multivariable logistic regression analyses.

Results. Among 70,334 eligible veterans surveyed, 19,749 had an ICD diagnosis of arthritis in the administrative database in the year prior to the survey; 34,440 answered the arthritis question, and 18,464 self-reported a physician diagnosis of arthritis. Kappa coefficients indicated slight to fair agreement (0.19-0.32) between self-report and administrative database definitions of arthritis. Overall discordance was significantly higher among veterans with more comorbidities, greater age, worse functional status, lower use of outpatient and inpatient services, lower education level, and single medical-site use.

Conclusion. The low level of agreement between self-report and database diagnosis of arthritis, and its significant association with patient demographic, clinical, and functional characteristics, highlight the limitations of these strategies for identifying patients with arthritis in epidemiological studies.

The Journal of Rheumatology Copyright © 2009. All rights reserved.
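The kappa statistic used in the Methods can be computed directly from paired binary indicators (self-reported diagnosis vs. administrative database diagnosis). The following is a minimal illustrative sketch; the 0/1 flags below are invented for demonstration and are not the study's data.

```python
# Cohen's kappa for agreement between two binary classifiers,
# e.g. self-reported vs. database-derived arthritis diagnosis.
# Illustrative sketch only; data below are made up.

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length 0/1 sequences."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed proportion of agreement
    observed = sum(1 for x, y in zip(a, b) if x == y) / n
    # Chance-expected agreement from each rater's marginal positive rate
    p_a1, p_b1 = sum(a) / n, sum(b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

# Hypothetical flags: 1 = diagnosis present, 0 = absent
self_report = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
database    = [1, 0, 0, 0, 1, 1, 1, 0, 0, 0]
print(round(cohens_kappa(self_report, database), 2))
```

Kappa corrects the raw agreement proportion for agreement expected by chance, which is why a pair of definitions can agree on most subjects yet still yield only slight to fair kappa, as reported above.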