Objectives: We sought to determine whether early adult levels of cardiovascular risk factors predict subsequent coronary artery calcium (CAC) better than concurrent or average 15-year levels, and whether they do so independently of 15-year change in levels. Background: Few studies have used multiple measurements over time to predict subclinical atherosclerosis. Methods: African American and white adults, ages 18 to 30 years, in 4 U.S. cities were enrolled in the prospective CARDIA (Coronary Artery Risk Development in Young Adults) study from 1985 to 1986. Risk factors were measured at years 0, 2, 5, 7, 10, and 15, and CAC was assessed at year 15 (n = 3,043). Results: Overall, 9.6% of adults had any CAC, with a greater prevalence among men than women (15.0% vs. 5.1%), among white than African American men (17.6% vs. 11.3%), and among those ages 40 to 45 years than those ages 33 to 39 years (13.3% vs. 5.5%). Baseline levels predicted CAC presence (C = 0.79) as well as average 15-year levels (C = 0.79; p = 0.8262) and better than concurrent levels (C = 0.77; p = 0.019), despite 15-year change in risk factor levels. Multivariate-adjusted odds ratios of having CAC by ages 33 to 45 years were 1.5 (95% confidence interval [CI] 1.3 to 1.7) per 10 cigarettes, 1.5 (95% CI 1.3 to 1.8) per 30 mg/dl low-density lipoprotein cholesterol, 1.3 (95% CI 1.1 to 1.5) per 10 mm Hg systolic blood pressure, and 1.2 (95% CI 1.1 to 1.4) per 15 mg/dl glucose at baseline. Young adults with above-optimal risk factor levels at baseline were 2 to 3 times as likely to have CAC. Conclusions: Early adult levels of modifiable risk factors, albeit low, were as informative as, or more informative than, subsequent levels about the odds of CAC in middle age. Earlier risk assessment and efforts to achieve and maintain optimal risk factor levels may be needed. © 2007 American College of Cardiology Foundation.