
Lexical Predictability During Natural Reading: Effects of Surprisal and Entropy Reduction

Abstract
What are the effects of word-by-word predictability on sentence processing times during the natural reading of a text? Although information complexity metrics such as surprisal and entropy reduction have been useful in addressing this question, these metrics tend to be estimated using computational language models, which require some degree of commitment to a particular theory of language processing. Taking a different approach, this study implemented a large-scale cumulative cloze task to collect word-by-word predictability data for 40 passages and compute surprisal and entropy reduction values in a theory-neutral manner. A separate group of participants read the same texts while their eye movements were recorded. Results showed that increases in surprisal and entropy reduction were both associated with increases in reading times. Furthermore, these effects did not depend on the global difficulty of the text. The findings suggest that surprisal and entropy reduction independently contribute to variation in reading times, as these metrics seem to capture different aspects of lexical predictability.
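The abstract refers to two standard information-theoretic metrics computed from cloze-task response distributions. As a minimal sketch of how such metrics are conventionally defined (surprisal as the negative log probability of a word given its context, and entropy reduction as the drop in uncertainty over continuations from one word position to the next, truncated at zero following Hale), the following Python functions estimate them from raw cloze responses. The smoothing constant and function names are illustrative assumptions, not the paper's actual analysis pipeline.

```python
import math
from collections import Counter

def cloze_distribution(responses):
    """Estimate a probability distribution over words from a list of
    cloze-task responses (one word per participant)."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def surprisal(word, dist, smoothing=1e-3):
    """Surprisal in bits: -log2 P(word | context).
    The smoothing floor (an assumption here) guards against words
    that no participant produced, which would have zero cloze probability."""
    p = dist.get(word, smoothing)
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy (bits) of a distribution over continuations."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def entropy_reduction(prev_dist, curr_dist):
    """Hale-style entropy reduction: the decrease in uncertainty about
    upcoming material after processing the current word, truncated at 0."""
    return max(0.0, entropy(prev_dist) - entropy(curr_dist))
```

For example, if 8 of 10 participants complete a context with "dog" and 2 with "cat", the surprisal of "dog" is -log2(0.8) ≈ 0.32 bits, and moving from a uniform four-way distribution (2 bits of entropy) to this skewed one yields an entropy reduction of about 1.28 bits.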
Author(s)
Lowder, Matthew; Choi, Wonil; Ferreira, Fernanda; Henderson, John
Issued Date
2018-06
Type
Article
DOI
10.1111/cogs.12597
URI
https://scholar.gist.ac.kr/handle/local/13251
Publisher
Lawrence Erlbaum Associates Inc.
Citation
Cognitive Science, v.42, no. Special SI, pp.1166-1183
ISSN
0364-0213
Appears in Collections:
School of Humanities and Social Sciences > 1. Journal Articles
Access and License
  • Access type: Open
File List
  • No associated files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.