The dependence of radiographic mottle on beam quality has been studied for a variety of film/screen combinations. The measurement technique consisted of scanning the radiographs with a microdensitometer; the analogue signal from the microdensitometer was gated and recorded with a multichannel analyzer. The sampling aperture was 500 μm in diameter, chosen because its spatial-frequency response approximates that of the eye. The standard deviation of the density fluctuations was calculated directly from the number-versus-density spectrum accumulated in the analyzer. The standard deviations for the various combinations and x-ray tube kilovoltages studied ranged from approximately 0.009 to approximately 0.021 density units. For commonly used combinations, very little kilovoltage dependence was observed. The relative amount of mottle varied for different screen phosphors and is attributed to differences in their x-ray absorption characteristics.
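The standard deviation described above is obtained directly from the histogram (number-versus-density spectrum) rather than from the raw scan samples. A minimal sketch of that calculation is shown below; the bin centers, counts, and the Gaussian shape of the mock spectrum are illustrative assumptions, not data from the study.

```python
import numpy as np

# Hypothetical number-versus-density spectrum: counts of microdensitometer
# samples falling in each optical-density bin. The values below are a mock
# Gaussian spectrum (sigma = 0.012 density units), not measured data.
density_bins = np.linspace(0.95, 1.05, 21)                   # bin centers, density units
counts = np.round(1000 * np.exp(-0.5 * ((density_bins - 1.0) / 0.012) ** 2))

# Weighted mean and standard deviation computed directly from the
# accumulated spectrum, as the text describes.
mean_d = np.average(density_bins, weights=counts)
var_d = np.average((density_bins - mean_d) ** 2, weights=counts)
sigma_d = np.sqrt(var_d)
```

Working from the accumulated spectrum is convenient with a multichannel analyzer because the analyzer already delivers the data in histogram form; no separate storage of the individual gated samples is needed.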