
    Birth, survival and death of languages by Monte Carlo simulation

    Simulations by physicists of the competition between adult languages, carried out since 2003, are reviewed. How many languages are spoken by how many people? How many languages are contained in various language families? How do language similarities decay with geographical distance, and what effects do natural boundaries have? New simulations of bilinguality are given in an appendix. Comment: 24-page review, draft for Comm. Comput. Phys., plus an appendix on bilinguality.
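    The competition models the review surveys are agent-based Monte Carlo simulations. As a minimal sketch in the spirit of such models (parameters and update rule are illustrative, in the style of the Abrams–Strogatz two-language model, not the paper's exact scheme):

```python
import random

def simulate(n_speakers=1000, steps=200, status_a=0.6, volatility=1.0, seed=42):
    """Monte Carlo competition between two languages, A and B.

    Each step, every speaker may switch to the other language with a
    probability that grows with the other language's current share,
    weighted by its social status. All parameters are illustrative.
    """
    rng = random.Random(seed)
    speaks_a = [rng.random() < 0.5 for _ in range(n_speakers)]
    history = []
    for _ in range(steps):
        frac_a = sum(speaks_a) / n_speakers
        p_to_a = status_a * frac_a ** volatility            # B -> A switch prob.
        p_to_b = (1 - status_a) * (1 - frac_a) ** volatility  # A -> B switch prob.
        for i in range(n_speakers):
            if speaks_a[i]:
                if rng.random() < p_to_b:
                    speaks_a[i] = False
            elif rng.random() < p_to_a:
                speaks_a[i] = True
        history.append(frac_a)
    return history
```

    With `status_a` above 0.5, the higher-status language gradually absorbs nearly all speakers, reproducing the "death of languages" dynamics the review describes.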

    Complex systems and the history of the English language

    Complexity theory (Mitchell 2009, Kretzschmar 2009) is something that historical linguists not only can use but should use in order to improve the relationship between the speech we observe in historical settings and the generalizations we make from it. Complex systems, as described in physics, ecology, and many other sciences, are made up of massive numbers of components interacting with one another, and this results in self-organization and emergent order. For speech, the “components” of a complex system are all of the possible variant realizations of linguistic features as they are deployed by human agents, speakers and writers. The order that emerges in speech is simply the fact that our use of words and other linguistic features is significantly clustered in the spatial, social, and textual groups in which we actually communicate. Order emerges from such systems by means of self-organization, but the order that arises from speech is not the same as what linguists study under the rubric of linguistic structure. In both texts and regional/social groups, the frequency distribution of features occurs as the same pattern: an asymptotic hyperbolic curve (or “A-curve”). Formal linguistic systems, grammars, are thus not the direct result of the complex system, and historical linguists must use complexity to mediate between the language production observed in the community and the grammars we describe. The history of the English language does not proceed as regularly as clockwork, and an understanding of complex systems helps us to see why and how, and suggests what we can do about it. First, the scaling property of complex systems tells us that there are no representative speakers, and so our observation of any small group of speakers is unlikely to represent any group at a larger scale—and limited evidence is the necessary condition of many of our historical studies. The fact that underlying complex distributions follow the 80/20 rule, i.e. 80% of the word tokens in a data set will be instances of only 20% of the word types, while the other 80% of the word types will amount to only 20% of the tokens, gives us an effective tool for estimating the status of historical states of the language. Such a frequency-based technique is opposed to the typological “fit” technique that relies on a few texts that can be reliably located in space, and which may not account for the crosscutting effects of text type, another dimension in which the 80/20 rule applies. Besides issues of sampling, the frequency-based approach also affects how we can think about change. The A-curve immediately translates to the S-curve now used to describe linguistic change, and explains that “change” cannot reasonably be considered to be a qualitative shift. Instead, we can use the model of “punctuated equilibrium” from evolutionary biology (e.g., see Gould and Eldredge 1993), which suggests that multiple changes occur simultaneously and compete, rather than the older idea of “phyletic gradualism” in evolution that corresponds to the traditional method of historical linguistics. The Great Vowel Shift, for example, is a useful overall generalization, but complex systems and punctuated equilibrium explain why we should not expect it ever to be “complete” or to appear in the same form in different places. These applications of complexity can help us to understand and interpret our existing studies better, and suggest how new studies in the history of the English language can be made more valid and reliable.
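    The 80/20 token/type claim is easy to check on any corpus. A hedged sketch (the toy corpus below is an invented Zipf-like word list, not the author's data):

```python
from collections import Counter

def token_share_of_top_types(words, type_fraction=0.2):
    """Fraction of all tokens accounted for by the most frequent
    `type_fraction` of word types (the 80/20 question)."""
    counts = Counter(words)
    ranked = sorted(counts.values(), reverse=True)
    n_top = max(1, int(len(ranked) * type_fraction))
    return sum(ranked[:n_top]) / sum(ranked)

# Toy Zipf-like corpus: type i occurs roughly 100/i times (illustrative only).
toy_corpus = [f"w{i}" for i in range(1, 101) for _ in range(100 // i)]
share = token_share_of_top_types(toy_corpus)
```

    On a hyperbolic (A-curve) distribution like this one, the top 20% of types cover well over two-thirds of the tokens; real word-frequency data typically comes closer still to the 80% figure.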

    Artificial Intelligence in the Context of Human Consciousness

    Artificial intelligence (AI) can be defined as the ability of a machine to learn and make decisions based on acquired information. AI’s development has incited rampant public speculation regarding the singularity theory: a futuristic phase in which intelligent machines are capable of creating increasingly intelligent systems. Its implications, combined with the close relationship between humanity and its machines, make understanding both natural and artificial intelligence imperative. Researchers continue to discover the natural processes responsible for essential human skills like decision-making, understanding language, and performing multiple processes simultaneously. Artificial intelligence attempts to simulate these functions through techniques like artificial neural networks, Markov Decision Processes, Human Language Technology, and Multi-Agent Systems, which rely upon a combination of mathematical models and hardware.
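    Of the techniques listed, a Markov Decision Process is the easiest to show concretely. A minimal value-iteration sketch on a hypothetical two-state problem (the states, rewards, and transitions are invented for illustration; this is the textbook algorithm, not anything specific to the abstract above):

```python
def value_iteration(states, actions, transition, reward, gamma=0.9, tol=1e-6):
    """Compute optimal state values V(s) for a finite MDP.

    transition[s][a] is a list of (next_state, probability) pairs;
    reward[s][a] is the immediate reward for taking action a in state s.
    """
    values = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                reward[s][a] + gamma * sum(p * values[s2] for s2, p in transition[s][a])
                for a in actions
            )
            delta = max(delta, abs(best - values[s]))
            values[s] = best
        if delta < tol:
            return values

# Hypothetical toy MDP: "stay" pays 1 in the good state, 0 elsewhere;
# "move" deterministically flips the state.
states = ["good", "bad"]
actions = ["stay", "move"]
transition = {
    "good": {"stay": [("good", 1.0)], "move": [("bad", 1.0)]},
    "bad":  {"stay": [("bad", 1.0)],  "move": [("good", 1.0)]},
}
reward = {
    "good": {"stay": 1.0, "move": 0.0},
    "bad":  {"stay": 0.0, "move": 0.0},
}
V = value_iteration(states, actions, transition, reward)
```

    The agent learns to stay in the good state and to move out of the bad one, which is the kind of learned decision-making the abstract attributes to MDP-based systems.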

    Interpreters and cultural mediators – different but complementary roles

    This article considers the roles of medical interpreters and cultural mediators and proposes that the two should be seen as separate. In the last six years, cultural mediators have been trained in Ireland not as interpreters but to help immigrants from other countries to access and use healthcare services, as well as to mediate in situations of conflict between health service providers and patients. Meanwhile, interpreters have been hired to bridge the language gap. Codes of ethics for medical interpreters and competencies of cultural mediators are considered in order to clarify role boundaries and to explore similarities and differences between the two roles.

    Orthography and Identity in Cameroon

    The tone languages of sub-Saharan Africa raise challenging questions for the design of new writing systems. Marking too much or too little tone can have grave consequences for the usability of an orthography. Orthography development, past and present, rests on a raft of sociolinguistic issues having little to do with the technical phonological concerns that usually preoccupy orthographers. Some of these issues are familiar from the spelling reforms which have taken place in European languages. However, many of the issues faced in sub-Saharan Africa are different, being concerned with the creation of new writing systems in a multi-ethnic context: residual colonial influences, the construction of new nation-states, detribalization versus culture preservation and language reclamation, and so on. Language development projects which crucially rely on creating or revising orthographies may founder if they do not attend to the various layers of identity that are indexed by orthography: whether colonial, national, ethnic, local, or individual identity. In this study, I review the history and politics of orthography in Cameroon, with a focus on tone marking. The paper concludes by calling present-day orthographers to a deeper and broader understanding of orthographic issues.

    Factive and nonfactive mental state attribution

    Factive mental states, such as knowing or being aware, can only link an agent to the truth; by contrast, nonfactive states, such as believing or thinking, can link an agent to either truths or falsehoods. Researchers of mental state attribution often draw a sharp line between the capacity to attribute accurate states of mind and the capacity to attribute inaccurate or “reality-incongruent” states of mind, such as false belief. This article argues that the contrast that really matters for mental state attribution does not divide accurate from inaccurate states, but factive from nonfactive ones.

    Does It Matter How People Speak?

    Language serves two key functions. It enables communication between agents, which allows for the establishment and operation of formal and informal institutions. It also serves a less obvious function: a reassuring quality more closely related to issues of trust, social capital, and cultural identification. While research on the role of language as a learning process is widespread, there is no evidence on the role of language as a signal of cultural affinity. I pursue this latter avenue of research and show that subtle language affinity is positively linked with changes in earnings, using data on English speakers in cities in the Golden Horseshoe area of Southern Ontario during the period 1991 to 2001. The results are robust to changes in specification, a broad range of empirical tests, and a diverse set of outcome variables.

    Away with linguists! Normativity, inequality and metascientific reflexivity in sociolinguistic fieldwork

    This paper addresses the fact that, in spite of the descriptive and well-intentioned ambitions of much sociolinguistic-ethnographic research, members of studied groups often continue to interpret such research as a largely vertically organized socio-political activity that communicates a prescriptive social and linguistic normativity which the researcher is inevitably taken to embody. We argue that while many researchers agree that sociolinguistic fieldwork is inherently political, actual descriptions of informants’ awareness of this are still rather scarce. We demonstrate how members’ metascientific reflexivity can be particularly active during fieldwork encounters and across the entire research event, complicating the idea of a pure and disinterested description and understanding.