    On analogy as the motivation for grammaticalization

    The number of phenomena gathered together under the term 'grammaticalization' is quite large and in some ways quite diverse, yet similar motivating factors, principles, clines, and hierarchies have been suggested for the different types of grammaticalization. Some of Lehmann's (1982[1995], 1985) parameters, long considered to characterize processes of grammaticalization, are now under attack from various quarters, and indeed the phenomenon of grammaticalization itself has been questioned as an independent mechanism in language change. This paper addresses a number of problems connected with the 'apparatus' used in grammaticalization research and with the various types of grammaticalization currently distinguished. It will be argued that we get a better grip on what happens in processes of grammaticalization and lexicalization if these processes are viewed in terms of an analogical, usage-based grammar, in which a distinction is made between processes taking place on the token level and those taking place on the type level. The model involves taking more notice of the form of linguistic signs and of the synchronic grammar system at each stage of the grammaticalization process.

    Predictive coding and representationalism

    According to the predictive coding theory of cognition (PCT), brains are predictive machines that use perception and action to minimize prediction error, i.e. the discrepancy between bottom-up, externally generated sensory signals and top-down, internally generated sensory predictions. Many consider PCT to have an explanatory scope that is unparalleled in contemporary cognitive science and see in it a framework that could potentially provide us with a unified account of cognition. It is also commonly assumed that PCT is a representational theory of sorts, in the sense that it postulates that our cognitive contact with the world is mediated by internal representations. However, the exact sense in which PCT is representational remains unclear; nor is it clear that it deserves such status, that is, whether it really invokes structures that are truly and nontrivially representational in nature. In the present article, I argue that the representational pretensions of PCT are completely justified. This is because the theory postulates cognitive structures, namely action-guiding, detachable, structural models that afford representational error detection, that play genuinely representational functions within the cognitive system.
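
    To make the error-minimization dynamic concrete, the sketch below implements a toy perceptual inference loop of the kind PCT ascribes to the brain: an internal estimate of hidden causes is repeatedly revised so that the top-down prediction it generates matches the bottom-up signal. The linear generative model, learning rate, and synthetic signal are illustrative assumptions, not details from the article.

```python
# Toy sketch of PCT-style perceptual inference: revise an internal estimate
# of hidden causes until top-down predictions match the bottom-up signal.
# The linear generative model and all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

W = rng.normal(scale=0.1, size=(8, 4))        # generative model: causes -> signal
true_cause = rng.normal(size=4)               # hidden state of the world
sensory = W @ true_cause + rng.normal(scale=0.01, size=8)  # bottom-up input

mu = np.zeros(4)                              # internal estimate of the causes
for _ in range(500):
    prediction = W @ mu                       # top-down, internally generated
    error = sensory - prediction              # prediction error
    mu += 0.5 * (W.T @ error)                 # gradient step that shrinks the error

print("residual prediction error:", np.linalg.norm(sensory - W @ mu))
```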

    Using Process Mining to Support Theorizing About Change in Organizations

    Process mining refers to a family of algorithms used to computationally reconstruct, analyze, and visualize business processes from event log data. While process mining is commonly associated with the improvement of business processes, we argue that it can also serve as a method for theorizing about change in organizations. Central to our argument is that process mining algorithms can support inductive as well as deductive theorizing, and can extend established theorizing in a number of ways and in relation to different research agendas and phenomena. We illustrate our argument in relation to two types of change: endogenous change that evolves over time and exogenous change that follows a purposeful intervention. Drawing on the discourse of routine dynamics, we propose how different process mining features can reveal new insights about the dynamics of organizational routines.
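
    To make the event-log starting point concrete, the sketch below builds a directly-follows graph, a structure that many process-discovery algorithms compute first, from a toy log in Python. The log, field names, and activities are invented for illustration; the paper itself does not prescribe a particular discovery algorithm.

```python
# Minimal sketch: build a directly-follows graph (DFG) from event log data,
# the kind of structure many process-mining discovery algorithms start from.
# The toy log and its fields (case id, activity, timestamp) are illustrative.
from collections import Counter, defaultdict

# Each event: (case_id, activity, timestamp). A "case" is one process instance.
event_log = [
    ("c1", "submit", 1), ("c1", "review", 2), ("c1", "approve", 3),
    ("c2", "submit", 1), ("c2", "review", 2), ("c2", "reject", 3),
    ("c3", "submit", 1), ("c3", "approve", 2),
]

# Group events into per-case traces, ordered by timestamp.
traces = defaultdict(list)
for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
    traces[case_id].append(activity)

# Count how often activity a is directly followed by activity b across cases.
dfg = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```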

    Computational approaches to semantic change (Volume 6)

    Semantic change (how the meanings of words change over time) has preoccupied scholars since well before modern linguistics emerged in the late 19th and early 20th century, an emergence that ushered in a new methodological turn in the study of language change. Compared to changes in sound and grammar, semantic change remains the least understood. The study of semantic change has since progressed steadily, accumulating over more than a century a vast store of knowledge encompassing many languages and language families. Historical linguists also realized early on the potential of computers as research tools, with papers on the topic appearing at the very first international conferences on computational linguistics in the 1960s. Such computational studies nevertheless tended to be small-scale, method-oriented, and qualitative. Recent years, however, have witnessed a sea change in this regard: big-data, empirical, quantitative investigations are now coming to the forefront, enabled by enormous advances in storage capacity and processing power. Diachronic corpora have grown far beyond what traditional manual qualitative methods can explore, and language technology has become increasingly data-driven and semantics-oriented. These developments present a golden opportunity for the empirical study of semantic change over both long and short time spans.
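
    One widely used big-data approach of the kind this volume surveys compares word embeddings trained on corpora from different time periods: the two vector spaces are aligned with an orthogonal Procrustes rotation, and words whose aligned vectors lie far apart are flagged as candidates for semantic change. The sketch below illustrates this on synthetic vectors; the vocabulary, dimensions, and simulated drift are invented, and a real study would train the embeddings on large diachronic corpora.

```python
# Sketch: detect candidate semantic change by aligning embeddings from two
# time periods with orthogonal Procrustes, then ranking words by cosine
# distance between their aligned vectors. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_words, dim = 200, 20
vocab = [f"word{i}" for i in range(n_words)]

X_old = rng.normal(size=(n_words, dim))                   # embeddings, period 1
X_new = X_old + rng.normal(scale=0.05, size=X_old.shape)  # period 2, mostly stable
X_new[:3] += rng.normal(scale=1.5, size=(3, dim))         # simulate drift in 3 words

# Orthogonal Procrustes: the rotation R minimizing ||X_old @ R - X_new||_F
# is U @ Vt, where U, S, Vt is the SVD of X_old.T @ X_new.
U, _, Vt = np.linalg.svd(X_old.T @ X_new)
R = U @ Vt
X_aligned = X_old @ R

def cosine_distance(u, v):
    return 1.0 - (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Words with the largest post-alignment distance are change candidates.
scores = {w: cosine_distance(X_aligned[i], X_new[i]) for i, w in enumerate(vocab)}
for word, score in sorted(scores.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{word}: {score:.3f}")
```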