
    Uncertainty in deliberate lexical interventions

    Language managers in their different forms (language planners, terminologists, professional neologists) have long tried to intervene in the lexical usage of speakers, with varying degrees of success: some of their lexical items (at least partly) penetrate language use, others do not. Based on electronic networks of practice of the Esperanto speech community, Mélanie Maradan establishes the foundation for a new method to extract speakers’ opinions on lexical items from text corpora. The method is intended as a tool for language managers to detect and explore in context the reasons why speakers might accept or reject lexical items. Mélanie Maradan holds a master’s degree in translation and terminology from the University of Geneva, Switzerland, as well as a joint doctoral degree in multilingual information processing and philosophy (Dr. phil.) from the universities of Geneva and Hildesheim, Germany. Her research interests include planned languages (Esperanto studies) as well as neology and corpus linguistics. She works as a professional translator and terminologist in Switzerland.
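    One way to picture the kind of method described above is a naive cue-word scan over community posts: flag each mention of a lexical item whose surrounding context contains evaluative vocabulary. This is only an illustrative sketch (the cue lists and forum lines below are invented), not Maradan's actual corpus method.

```python
import re

# Invented cue lists and forum lines, purely for illustration.
ACCEPT_CUES = {"useful", "clear", "elegant", "handy"}
REJECT_CUES = {"ugly", "confusing", "unnecessary", "awkward"}

def classify_mentions(lines, item, window=5):
    """Label each mention of `item` by the evaluative words near it."""
    results = []
    for line in lines:
        tokens = re.findall(r"\w+", line.lower())
        if item not in tokens:
            continue
        i = tokens.index(item)
        context = set(tokens[max(0, i - window): i + window + 1])
        if context & ACCEPT_CUES:
            results.append((line, "accept"))
        elif context & REJECT_CUES:
            results.append((line, "reject"))
        else:
            results.append((line, "neutral"))
    return results

forum = [
    "I find komputilo a clear and handy word.",
    "The neologism komputero seems unnecessary to me.",
]
print(classify_mentions(forum, "komputilo"))  # one "accept" mention
```

    A real method would of course need to handle negation, irony, and context well beyond a fixed word window; the point here is only the overall shape of extracting opinions on lexical items from a corpus.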

    Evaluation of an Esperanto-Based Interlingua Multilingual Survey Form Machine Translation Mechanism Incorporating a Sublanguage Translation Methodology

    Translation costs restrict the preparation of medical surveys and other questionnaires for migrant communities in Western Australia. This restriction is compounded by a lack of affordable and accurate machine translation mechanisms. This research investigated and evaluated combined strategies intended to provide an efficacious and affordable machine translator by:
    • using an interlingua or pivot language that requires fewer resources for its construction than contemporary systems and has the additional benefit of significant error reduction; and
    • defining smaller lexical environments to restrict data, thereby reducing the complexity of translation rules and enhancing correct semantic transfer between natural languages.
    This research focussed on producing a prototype machine translation mechanism that would accept questionnaire texts as discrete questions and suggested answers from which a respondent may select. The prototype was designed to accept non-ambiguous English as the source language, translate it to a pivot language or interlingua, Esperanto, and thence to a selected target language, French. Subsequently, a reverse path of translation from the target language back to the source language enabled validation of minimal or zero change in both syntax and semantics of the original input. Jade, an object-oriented (OO) database application hosting the relationship between the natural languages and the interlingua, was used to facilitate the accurate transfer of meaning between the natural languages. Translation, interpretation and validation of sample texts were undertaken by linguists qualified in English, French and Esperanto. Translation output from the prototype model was compared, again with assistance from linguists, with a 'control' model, the SYSTRAN On-Line Translator, a more traditional transfer translation product.
    Successful completion of this research constitutes a step towards an increased availability of low-cost machine translation to assist in the development of reliable and efficient survey translation systems for use in specific user environments. These environments include, but are not exclusive to, medical, hospital and Australian Indigenous-contact environments.
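    The pivot architecture and its round-trip validation step can be sketched in a few lines. The word lists below are invented stand-ins for the Jade-hosted lexicon, and real machine translation is far more than word-for-word substitution; the sketch only shows the overall data flow: source to interlingua to target, then the reverse path to check for drift.

```python
# Toy one-token-per-word lexicons; invented for this sketch.
EN_TO_EO = {"do": "ĉu", "you": "vi", "smoke": "fumas"}
EO_TO_FR = {"ĉu": "est-ce-que", "vi": "vous", "fumas": "fumez"}

def translate(sentence, lexicon):
    """Word-for-word substitution through a lexicon."""
    return " ".join(lexicon[word] for word in sentence.split())

def invert(lexicon):
    """Build the reverse-path lexicon."""
    return {v: k for k, v in lexicon.items()}

question = "do you smoke"
pivot = translate(question, EN_TO_EO)   # English -> Esperanto
target = translate(pivot, EO_TO_FR)     # Esperanto -> French

# Reverse path: if the round trip reproduces the source exactly,
# syntax and semantics survived the pivot unchanged.
back = translate(translate(target, invert(EO_TO_FR)), invert(EN_TO_EO))
assert back == question
print(target)
```

    Restricting the input to a sublanguage of discrete, non-ambiguous questions is what makes such tiny, unambiguous lexicons even conceivable; open-domain text would break the one-to-one mappings immediately.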

    FinBook: literary content as digital commodity

    This short essay explains the significance of the FinBook intervention, and invites the reader to participate. We have associated each chapter within this book with a financial robot (FinBot), and created a market whereby book content will be traded with financial securities. As human labour increasingly consists of unstable and uncertain work practices, and as algorithms replace people on the virtual trading floors of the world's markets, we see members of society taking advantage of FinBots to invest and make extra funds. Bots of all kinds are making financial decisions for us, searching online on our behalf to help us invest, to consume products and services. Our contribution to this compilation is to turn the collection of chapters in this book into a dynamic investment portfolio, and thereby play out what might happen to the process of buying and consuming literature in the not-so-distant future. By attaching identities (through QR codes) to each chapter, we create a market in which the chapter can ‘perform’. Our FinBots will trade based on features extracted from the authors’ words in this book: the political, ethical and cultural values embedded in the work, and the extent to which the FinBots share authors’ concerns; and the performance of chapters amongst those human and non-human actors that make up the market, and readership. In short, the FinBook model turns our work and the work of our co-authors into an investment portfolio, mediated by the market and the attention of readers. By creating a digital economy specifically around the content of online texts, our chapter and the FinBook platform aim to challenge the reader to consider how their personal values align them with individual articles, and how these become contested as they perform different value judgements about the financial performance of each chapter and the book as a whole.
    At the same time, by introducing ‘autonomous’ trading bots, we also explore the different ‘network’ affordances of paper-based books, whose scarcity derives from their analogue form, and digital forms of books, whose uniqueness is achieved through encryption. We thereby speak to wider questions about the conditions of an aggressive market in which algorithms subject cultural and intellectual items – books – to economic parameters, and the increasing ubiquity of data bots as actors in our social, political, economic and cultural lives. We understand that our marketization of literature may be an uncomfortable juxtaposition against the conventionally imagined way a book is created, enjoyed and shared: it is intended to be.
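    As a thought experiment, the core mechanic described above (a bot re-pricing a chapter as reader attention shifts) can be caricatured in a few lines. The class, the pricing rule, and the numbers are all invented for this sketch; the actual FinBook platform is not described at this level of detail.

```python
from dataclasses import dataclass

@dataclass
class Chapter:
    title: str
    price: float
    scans: int  # QR-code scans as a crude proxy for reader attention

def reprice(chapter, baseline=100, sensitivity=0.01):
    """Nudge the price up or down with attention relative to a baseline."""
    chapter.price *= 1 + sensitivity * (chapter.scans - baseline) / baseline
    return chapter.price

ch = Chapter("FinBook: literary content as digital commodity", 10.0, 150)
print(reprice(ch))  # attention above baseline, so the price rises
```

    Even this toy makes the essay's point concrete: once a chapter has an identity and a price, reader attention becomes a market signal rather than a private act of reading.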

    Design and semantics of form and movement (DeSForM 2006)

    Design and Semantics of Form and Movement (DeSForM) grew from applied research exploring emerging design methods and practices to support new-generation product and interface design. The products and interfaces are concerned with the context of ubiquitous computing and ambient technologies, and with the need for greater empathy in the pre-programmed behaviour of the ‘machines’ that populate our lives. Such explorative research in the CfDR has been led by Young, supported by Kyffin, Visiting Professor from Philips Design, and sponsored by Philips Design over a period of four years (research funding £87k). DeSForM1 was the first of a series of three conferences that enabled the presentation and debate of international work within this field:
    • 1st European conference on Design and Semantics of Form and Movement (DeSForM1), Baltic, Gateshead, 2005, Feijs L., Kyffin S. & Young R.A. eds.
    • 2nd European conference on Design and Semantics of Form and Movement (DeSForM2), Evoluon, Eindhoven, 2006, Feijs L., Kyffin S. & Young R.A. eds.
    • 3rd European conference on Design and Semantics of Form and Movement (DeSForM3), New Design School Building, Newcastle, 2007, Feijs L., Kyffin S. & Young R.A. eds.
    Philips sponsorship of practice-based enquiry led to research by three teams of research students over three years and to ongoing sponsorship of research through the Northumbria University Design and Innovation Laboratory (nuDIL). Young has been invited onto the steering panel of the UK Thinking Digital Conference concerning the latest developments in digital and media technologies. Informed by this research is the work of PhD student Yukie Nakano, who examines new technologies in relation to eco-design textiles.

    Language Processing and the Artificial Mind: Teaching Code Literacy in the Humanities

    Humanities majors often find themselves in jobs where they either manage programmers or work with them in close collaboration. These interactions often pose difficulties because specialists in literature, history, philosophy, and so on are not usually code literate. They do not understand what tasks computers are best suited to, or how programmers solve problems. Learning code literacy would be a great benefit to humanities majors, but the traditional computer science curriculum is heavily math-oriented, and students outside of science and technology majors are often math-averse. Yet they are often interested in language, linguistics, and science fiction. This thesis is a case study to explore whether computational linguistics and artificial intelligence provide a suitable setting for teaching basic code literacy. I researched, designed, and taught a course called “Language Processing and the Artificial Mind.” Instead of math, it focuses on language processing, artificial intelligence, and the formidable challenges that programmers face when trying to create machines that understand natural language. This thesis is a detailed description of the material, how the material was chosen, and the outcome for student learning. Student performance on exams indicates that students learned code literacy basics and important linguistics issues in natural language processing. An exit survey indicates that students found the course to be valuable, though a minority reacted negatively to the material on programming. Future studies should explore teaching code literacy with less programming and new ways to make coding more interesting to the target audience.

    Cultural Science

    This book is available as open access through the Bloomsbury Open Access programme and is available on www.bloomsburycollections.com. Cultural Science introduces a new way of thinking about culture. Adopting an evolutionary and systems approach, the authors argue that culture is the population-wide source of newness and innovation; it faces the future, not the past. Its chief characteristic is the formation of groups or 'demes' (organised and productive subpopulations; 'demos'). Demes are the means for creating, distributing and growing knowledge. However, such groups are competitive and knowledge-systems are adversarial. Starting from a rereading of Darwinian evolutionary theory, the book utilises multidisciplinary resources: Raymond Williams's 'culture is ordinary' approach; evolutionary science (e.g. Mark Pagel and Herbert Gintis); semiotics (Yuri Lotman); and economic theory (from Schumpeter to McCloskey). Successive chapters argue that:
    - Culture and knowledge need to be understood from an externalist ('linked brains') perspective, rather than through the lens of individual behaviour;
    - Demes are created by culture, especially storytelling, which in turn constitutes both politics and economics;
    - The clash of systems, including demes, is productive of newness, meaningfulness and successful reproduction of culture;
    - Contemporary urban culture and citizenship can best be explained by investigating how culture is used, and how newness and innovation emerge from unstable and contested boundaries between different meaning systems;
    - The evolution of culture is a process of technologically enabled 'demic concentration' of knowledge, across overlapping meaning-systems or semiospheres; a process where the number of demes accessible to any individual has increased at an accelerating rate, resulting in new problems of scale and coordination for cultural science to address.
    The book argues for interdisciplinary 'consilience', linking evolutionary and complexity theory in the natural sciences, economics and anthropology in the social sciences, and cultural, communication and media studies in the humanities and creative arts. It describes what is needed for a new 'modern synthesis' for the cultural sciences. It combines analytical and historical methods to provide a framework for a general reconceptualisation of the theory of culture – one that is focused not on its political or customary aspects but rather on its evolutionary significance as a generator of newness and innovation.

    The Democratization of Artificial Intelligence: Net Politics in the Era of Learning Algorithms

    After a long time of neglect, Artificial Intelligence is once again at the center of most of our political, economic, and socio-cultural debates. Recent advances in the field of Artificial Neural Networks have led to a renaissance of dystopian and utopian speculations on an AI-rendered future. Algorithmic technologies are deployed for identifying potential terrorists through vast surveillance networks, for producing sentencing guidelines and recidivism risk profiles in criminal justice systems, for demographic and psychographic targeting of bodies for advertising or propaganda, and more generally for automating the analysis of language, text, and images. Against this background, the aim of this book is to discuss the heterogeneous conditions, implications, and effects of modern AI and Internet technologies in terms of their political dimension: What does it mean to critically investigate efforts of net politics in the age of machine learning algorithms?