Publisher's Synopsis
How generative AI systems capture a core function of language
Looking at the emergence of generative AI, Language Machines presents a new theory of meaning in language and computation, arguing that humanistic scholarship misconstrues how large language models (LLMs) function. Seeing LLMs as a convergence of computation and language, Leif Weatherby contends that AI does not simulate cognition, as widely believed, but rather creates culture. We are ill-prepared to evaluate this evolution in language, he finds, because what he terms "remainder humanism" counterproductively divides the human from the machine without drawing on established theories of representation that encompass both.
To determine the consequences of using GPT systems for language generation, Weatherby reads linguistic theory in conjunction with the algorithmic architecture of LLMs. He finds that generative AI captures the ways in which language is at first complex, cultural, and poetic, and only later referential, functional, and cognitive. This process is the semiotic hinge on which an emergent AI culture depends. Weatherby calls for a "general poetics" of computational cultural forms under the formal conditions of the algorithmic reproducibility of language.
Locating the output of LLMs on a spectrum from poetry to ideology, Language Machines concludes that literary theory must be the backbone of a new rhetorical training for our linguistic-computational culture.