Natural language allows humans to package information as words, which
can be combined in novel ways to create completely new ideas. This
recycling of words is possible because of the compositional rules that
govern language: the meaning of a sentence can be predicted from the
meanings of its parts and the structures used to combine
them. This talk will examine how linguistic structures encode meaning
and how children use structure to guide inference and learning in
development.To explore this, I will discuss the case study of how
children learn to describe quantity in language. First, I will
describe how linguistic structures encode objects grammatically.
Second, I will discuss how children use non-linguistic object
representations to begin learning quantity expressions like "every"
and "more". Finally, I will argue that linguistic structure allows
children to differentiate the meanings of number words (one, two,
three) from other quantifiers (a, some, all), without requiring a
fundamental conceptual change. Together, these case studies suggest
that, by reflecting and recycling existing mental representations,
linguistic structures provide important constraints to word learning
and the acquisition of abstract human knowledge.