Computational modeling of statistical learning: Effects of transitional probability versus frequency and links to word learning

Academic Article


  • Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network (SRN) performed much like human learners: it was sensitive to both transitional probability and frequency, with frequency dominating early in learning and probability emerging as the dominant cue later in learning. In Simulation 2, an SRN captured links between statistical segmentation and word learning in infants and adults, and suggested that these links arise because phonological representations are more distinctive for syllables with higher transitional probability. Beyond simply simulating general phenomena, these models provide new insights into underlying mechanisms and generate novel behavioral predictions. © International Society on Infant Studies (ISIS).
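The contrast between frequency and transitional probability that the abstract describes can be made concrete with a small sketch. In a syllable stream, a syllable's frequency is simply how often it occurs, while the forward transitional probability of a pair is the pair's count normalized by the count of its first syllable. The stream and names below are illustrative assumptions, not the study's actual stimuli or code:

```python
from collections import Counter

def transitional_probabilities(stream):
    """P(next | current) for each adjacent syllable pair in the stream."""
    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Hypothetical stream: two "words" (ba-di, ba-gu) concatenated without pauses.
stream = ["ba", "di", "ba", "gu", "ba", "di", "ba", "di", "ba", "gu", "ba", "di"]

freq = Counter(stream)                   # raw syllable frequencies
tp = transitional_probabilities(stream)  # forward transitional probabilities

print(freq["ba"])        # → 6 ("ba" is the most frequent syllable)
print(tp[("ba", "di")])  # within-"word" transition, higher than ("ba", "gu")
```

The sketch shows why the two cues can dissociate: "ba" has the highest raw frequency, yet its transitions split between "di" and "gu", so pair frequency and transitional probability need not agree.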
  • Authors: Mirman D; Estes KG; Magnuson JS
  • Published In: Infancy (Journal)
  • Digital Object Identifier (DOI):
  • Start Page: 471
  • End Page: 486
  • Volume: 15
  • Issue: 5