Article

Information-theoretical Complexity Metrics

Journal

LANGUAGE AND LINGUISTICS COMPASS
Volume 10, Issue 9, Pages 397-412

Publisher

WILEY
DOI: 10.1111/lnc3.12196

Funding

  1. NSF [0741666]
  2. Directorate for Social, Behavioral & Economic Sciences
  3. Division of Behavioral and Cognitive Sciences [0741666] (Funding Source: National Science Foundation)

Abstract

Information-theoretical complexity metrics are auxiliary hypotheses that link theories of parsing and grammar to potentially observable measurements such as reading times and neural signals. This review article considers two such metrics, Surprisal and Entropy Reduction, which are respectively built upon the two most natural notions of 'information value' for an observed event (Blachman 1968). This review sketches their conceptual background and touches on their relationship to other theories in cognitive science. It characterizes them as 'lenses' through which theorists 'see' the information-processing consequences of linguistic grammars. While these metrics are not themselves parsing algorithms, the review identifies candidate mechanisms that have been proposed for both of them.
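
For readers unfamiliar with the two metrics, their standard definitions are: the Surprisal of the i-th word is -log2 P(w_i | w_1 ... w_{i-1}), and Entropy Reduction is the decrease, floored at zero, in the comprehender's uncertainty about the analysis as each word comes in. The sketch below is illustrative only and is not drawn from the article: it computes both quantities over an invented three-sentence distribution, and it simplifies by taking entropy over complete sentences rather than over grammatical derivations, as in Hale's formulation.

```python
"""Illustrative sketch (not from the reviewed article): Surprisal and
Entropy Reduction over a toy distribution of complete sentences."""
import math

# Invented toy distribution over complete sentences; probabilities sum to 1.
SENTENCES = {
    ("the", "dog", "barked"): 0.4,
    ("the", "dog", "slept"): 0.3,
    ("the", "cat", "slept"): 0.3,
}

def continuations(prefix):
    """Conditional distribution over full sentences consistent with prefix."""
    mass = {s: p for s, p in SENTENCES.items() if s[:len(prefix)] == prefix}
    total = sum(mass.values())
    return {s: p / total for s, p in mass.items()}

def entropy(dist):
    """Shannon entropy, in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def surprisal(prefix, word):
    """Surprisal = -log2 P(word | prefix)."""
    p_word = sum(p for s, p in continuations(prefix).items()
                 if len(s) > len(prefix) and s[len(prefix)] == word)
    return -math.log2(p_word)

def entropy_reduction(prefix, word):
    """Entropy Reduction = max(0, H(before word) - H(after word))."""
    h_before = entropy(continuations(prefix))
    h_after = entropy(continuations(prefix + (word,)))
    return max(0.0, h_before - h_after)

if __name__ == "__main__":
    prefix = ("the",)
    for word in ("dog", "cat"):
        print(f"{word}: surprisal = {surprisal(prefix, word):.2f} bits, "
              f"entropy reduction = {entropy_reduction(prefix, word):.2f} bits")
```

Note how the two metrics can dissociate even in this toy case: 'cat' is the less expected (higher-surprisal) continuation, and it also resolves all remaining uncertainty, while 'dog' is more predictable but leaves the verb choice open. Divergences of this kind are what allow the two metrics to make different empirical predictions.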
