Quantitative-Qualitative Correspondence in Grammaticalization
Abstract
The gradual nature of historical language change is widely acknowledged. We explore a model of syntactic change that offers a new view of the theoretical difference between classical and neural-network claims about language encoding. Most prior treatments of grammaticalization fail to account for how, exactly, new forms arise, focusing instead on the change that follows innovation. A phenomenon relevant to this innovation puzzle is Quantitative Anticipation of Qualitative Change in Grammaticalization (QAQCG): gradual statistical changes anticipate structural changes. Although prior researchers have offered phenomenological descriptions, we know of no rigorous method for testing whether QAQCG exists. Here we quantitatively examine the case of English "a lot", which has grammaticalized from a Noun Phrase function to an Adverb function. A simple feedforward neural network implements QAQCG, predicting a curved trajectory in probability space. A Bayes Factor analysis supports the network over a classically motivated linear model, highlighting continuity and nonlinearity as the network's distinctive theoretical claims.
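The contrast the abstract draws, a curved network trajectory versus a straight-line classical baseline, can be illustrated with a minimal sketch. This is not the paper's fitted model: the time axis, the weight-drift rate, and the linear slope below are hypothetical values chosen only to show how a gradual change in an underlying weight, passed through a nonlinearity, yields a curved path in probability space.

```python
# Minimal illustrative sketch (hypothetical parameters, not the authors' model).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical "historical time" axis, e.g., successive corpus periods.
t = np.linspace(-3.0, 3.0, 13)

# Feedforward view: a slow, steady (linear) drift in a connection weight
# passes through a nonlinearity, so P(Adverb reading of "a lot") traces
# a *curved*, sigmoid-shaped trajectory in probability space.
w_drift = 1.5 * t                 # gradual linear change in the weight
p_network = sigmoid(w_drift)      # nonlinear (logistic) output trajectory

# Classical baseline sketched as a linear probability model: usage
# probability tracks time directly and linearly (clipped to [0, 1]).
p_linear = np.clip(0.5 + 0.15 * t, 0.0, 1.0)

for ti, pn, pl in zip(t, p_network, p_linear):
    print(f"t={ti:+.2f}  network P(Adv)={pn:.3f}  linear P(Adv)={pl:.3f}")
```

Under these assumed values, the two models agree near the midpoint of the change but diverge at its early and late stages, which is where QAQCG predicts gradual statistical drift should be detectable before any categorical reanalysis.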