Abstract
Contextual Bootstrapping for Grammar Learning
by
Eva H. Mok
Doctor of Philosophy in Computer Science
University of California, Berkeley
Professor Jerome A. Feldman, Chair
The problem of grammar learning is a challenging one for both children and machines
due to impoverished input: hidden grammatical structures, a lack of explicit correction, and,
in pro-drop languages, argument omission. This dissertation describes a computational model of
child grammar learning using a probabilistic version of Embodied Construction Grammar (ECG)
that demonstrates how the problem of impoverished input is alleviated through bootstrapping
from the situational context. This model represents the convergence of three components: (1) a unified
representation that integrates semantic knowledge, linguistic knowledge, and contextual
knowledge, (2) a context-aware language understanding process, and (3) a structured grammar
learning and generalization process.
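As a purely illustrative sketch (not the dissertation's actual formalism; all names and fields
here are hypothetical), such a unified probabilistic representation could be expressed as:

    from dataclasses import dataclass, field

    @dataclass
    class Construction:
        # Hypothetical ECG-style construction: a form-meaning pairing whose
        # meaning schema can also be bound to entities in the situational context.
        name: str
        form_constraints: list[str]       # e.g., lexical items and ordering constraints
        meaning_schema: str               # embodied semantic schema the construction evokes
        context_bindings: dict[str, str]  # links from schema roles to contextual entities
        weight: float = 0.0               # statistical parameter estimated during learning

    @dataclass
    class Grammar:
        # A grammar is a collection of constructions together with their weights;
        # structural learning adds constructions, statistical learning updates weights.
        constructions: dict[str, Construction] = field(default_factory=dict)

        def add(self, cxn: Construction) -> None:
            self.constructions[cxn.name] = cxn

Under this reading, each kind of knowledge listed above corresponds to a field of the
construction, while the learned weight carries the statistical side of learning.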
Using situated child-directed utterances as learning input, the model performs two
concurrent learning tasks: structural learning of the grammatical units and statistical learning of
the associated parameters. The structural learning task is a guided search over the space of
possible constructions. The search is informed by embodied semantic knowledge that it has