Event Date: Wednesday, 27 June, 2018, 3 p.m.
Location: Via Santa Maria, 36, Pisa, PI, Italia [2nd floor seminar room]
Speaker: Prof. Luca Onnis (Nanyang Technological University, Singapore)
Title: Incremental statistical learning of language
Abstract: A tacit assumption in distributional approaches to learning is that learners collect global statistics across the entire set of stimuli they are exposed to. For example, for implicit sensitivity to transitional probabilities to emerge, one must assume relatively extended exposure to the frequencies of several items across many instances. In naturalistic settings where the full scale of the input is experienced (e.g. a child learning a natural language), this assumption of global access to training data is problematic because it implies that the cognitive system must keep track of an exponentially growing number of relations while determining which of those relations is significant. We investigated a more plausible assumption, namely that learning proceeds incrementally, using small windows of opportunity in which the relevant relations are assumed to hold over temporally contiguous objects or events. We tested this local statistical learning hypothesis in three independent experiments targeting the learning of three separate language-like tasks: the segmentation of artificial speech into wordlike units (Experiment 1); the learning of novel arbitrary word-to-world mappings under conditions of uncertainty and fast mapping (Experiment 2); and the learning of predictive relations among non-adjacent pseudowords (Experiment 3). In each experiment, adult participants were exposed to a miniature language
adapted to each task, in one of two conditions: 1) a Structured Variation condition, in which the majority of
trials were arranged in sequence such that one element was in common between two consecutive trials
(elements were specific to each experiment), and 2) a Scrambled condition, in which a smaller proportion of consecutive trials contained auditory or visual overlap. At test, all participants regardless of condition received the same test trials, half of which were structurally congruous according to training, and half were not. Importantly, because the two order conditions during training differed only in the order of trials, the global statistics of the miniature language were identical. This allowed us to make differential predictions. On the global statistical learning (GSL) account, learners solve the learning problem by keeping track of multiple statistics across many individually ambiguous stimuli across trials, possibly over the entire experiment. Thus, learning should not differ across our two conditions. Conversely, on the local statistical learning (LSL) account, learners benefit from the contiguous
arrangement of partially overlapping trials and should therefore learn better in the Structured Variation condition.
We found that in all three experiments, temporal contiguity in the Structured Variation condition produced
superior learning. A recently proposed theoretical framework, ACCESS (Goldstein et al., 2010), aims to
explain the learning of structure in space and time in terms of general principles of cognitive computation. In agreement with those principles, our results point to the effectiveness of temporal contiguity and contrast across three independent language learning tasks, namely a) speech segmentation, b) multimodal learning under conditions of uncertainty, and c) non-adjacency learning. The findings thus support the view that local statistical learning may be operating at different levels of language acquisition. In addition, the importance of the order of presentation of learning materials has intriguing implications for various practical learning and teaching situations.
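The transitional probabilities mentioned in the abstract can be made concrete with a short sketch. Assuming a toy syllable stream (the syllables and the helper name `transitional_probabilities` are our illustration, not materials from the experiments), a global statistical learner estimates P(next | current) over the whole exposure, whereas a local statistical learner would restrict the same counts to a small temporal window:

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Estimate P(next syllable | current syllable) from a stream.

    A global statistical learning (GSL) learner accumulates these
    counts over the entire exposure; a local statistical learning
    (LSL) learner would restrict them to a short temporal window.
    """
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Toy stream built from repeated "words" (tu-pi, go-la) plus a filler (bi-du).
stream = ["tu", "pi", "go", "la", "tu", "pi", "bi", "du",
          "tu", "pi", "go", "la"]
tps = transitional_probabilities(stream)
# Within-word transitions are reliable: P(pi | tu) == 1.0,
# while across-word transitions are weaker: P(go | pi) == 2/3.
```

On the local account, the same counting would run over short windows of contiguous trials rather than the full stream; this is a didactic reconstruction, not code from the study.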
Luca Onnis is an assistant professor at the Nanyang Technological University in Singapore. He received his PhD in Psychology in 2004 from the University of Warwick, under the supervision of Nick Chater. He was a postdoctoral research associate at Cornell University from 2004 to 2008, working with Morten Christiansen, Michael Spivey, and Shimon Edelman. He was an assistant and then associate professor at the University of Hawaii from 2008 to 2013, where he also directed the Centre for Second Language Research. He joined NTU in late 2013, and founded the LEAP lab.