
    How did the discussion go: Discourse act classification in social media conversations

    We propose a novel attention-based hierarchical LSTM model to classify discourse act sequences in social media conversations, aimed at mining online discussions for textual meaning beyond the sentence level. The uniqueness of the task lies in the complete categorization of possible pragmatic roles in informal textual discussions, in contrast to question-answer extraction, stance detection, or sarcasm identification, which are role-specific tasks. An early attempt was made on a Reddit discussion dataset. We train our model on the same data and present test results on two different datasets, one from Reddit and one from Facebook. Our proposed model outperforms the previous one in terms of domain independence; without using platform-dependent structural features, our hierarchical LSTM with a word-relevance attention mechanism achieves F1-scores of 71% and 66% in predicting the discourse roles of comments in Reddit and Facebook discussions, respectively. The efficiency of recurrent and convolutional architectures in learning discursive representations on the same task is presented and analyzed, with different word and comment embedding schemes. Our attention mechanism enables us to inquire into the relevance ordering of text segments according to their roles in discourse. We present a human-annotator experiment to unveil important observations about modeling and data annotation. Equipped with our text-based discourse identification model, we inquire into how heterogeneous non-textual features such as location, time, and leaning of information play their roles in characterizing online discussions on Facebook.
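    A minimal sketch of the kind of hierarchical word-to-comment architecture described above, written in PyTorch; the class names, dimensions, and number of discourse-act classes are illustrative assumptions, not the authors' exact configuration.

    import torch
    import torch.nn as nn

    class WordAttention(nn.Module):
        """Scores each word state and returns an attention-weighted comment vector."""
        def __init__(self, hidden_dim):
            super().__init__()
            self.score = nn.Linear(hidden_dim, 1)

        def forward(self, word_states):                              # (comments, words, hidden)
            weights = torch.softmax(self.score(word_states), dim=1)  # word-relevance weights
            return (weights * word_states).sum(dim=1)                # (comments, hidden)

    class HierarchicalDiscourseLSTM(nn.Module):
        def __init__(self, vocab_size, emb_dim=100, word_hidden=128,
                     comment_hidden=128, n_acts=7):                  # n_acts is a placeholder
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.word_lstm = nn.LSTM(emb_dim, word_hidden, batch_first=True, bidirectional=True)
            self.attention = WordAttention(2 * word_hidden)
            self.comment_lstm = nn.LSTM(2 * word_hidden, comment_hidden, batch_first=True)
            self.classifier = nn.Linear(comment_hidden, n_acts)

        def forward(self, thread):                    # thread: (comments, words) word-id tensor
            word_states, _ = self.word_lstm(self.embed(thread))
            comment_vecs = self.attention(word_states)               # one vector per comment
            thread_states, _ = self.comment_lstm(comment_vecs.unsqueeze(0))
            return self.classifier(thread_states.squeeze(0))         # discourse-act logits per comment

    The attention weights computed inside WordAttention can also be read off directly to rank text segments by their contribution to the predicted discourse role, which is the relevance-ordering analysis the abstract mentions.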

    Incomplete Innovation and the Premature Disruption of Legal Services

    Article published in the Michigan State Law Review

    The Automation of Legal Reasoning: Customized AI Techniques for the Patent Field

    As Artificial Intelligence and Machine Learning continue to transform numerous aspects of our everyday lives, their role in the legal profession is growing in prominence. A subfield of AI with particular applicability to legal analysis is Natural Language Processing (NLP). NLP deals with computational techniques for processing human languages such as English, making it a natural tool for processing the text of statutes, regulations, judicial decisions, contracts, and other legal instruments. Paradoxically, although state-of-the-art Machine Learning and NLP algorithms are able to learn and act upon patterns too complex for humans to perceive, they nevertheless perform poorly on many cognitive tasks that humans routinely perform effortlessly. This profoundly limits the ability of AI to assist in many forms of legal analysis and legal decision making. This article offers two theses. First, notwithstanding impressive progress on NLP tasks in recent years, the state of the art in NLP will remain unable to perform legal analysis for some time. Second, lawyers, legal scholars, and other domain experts can play an integral role in designing AI software that can partially automate legal analysis, overcoming some of the limitations in NLP capabilities.

    Patents information for humanities research: Could there be something?

    Latour and co-authors proposed, in Science and Technology Translation theory, to address the many SHS (Social and Human Sciences) questions raised by social studies of science by considering, in addition to traditional academic matters, the complete social environment (political, economic, or societal). Patents are obviously a potential primary information source for doing so. We propose to extend this, considering that recent changes have improved our capacity to do so. We propose three preliminary steps: (a) patent documents as a structured information source, (b) the patent database as a technical encyclopedia, and (c) the recent expansion of the variety of uses and users in the patent domain. We underline, furthermore, that only a minority of academic research effectively uses patent information, especially in SHS compared to other disciplines. We present an experiment to estimate the amount of data that goes unconsidered when the huge database of the European Patent Office is not queried. By comparatively considering the terminology of two branches of the UNESCO thesaurus, namely the "Social and Human Sciences" and "Information and Communication Science" micro-thesauri, we evaluate the database's response to the whole vocabulary. An in-depth analysis of one selected concept completes the study. Results show that patent information can supply documents for a wide range of academic research questions, from strategic analyses to state-of-the-art reviews, and position these advances apart from the Social Studies of Science. The free, open-source tool is also a way to practice the skills expected of digital humanities on real-world corpora.
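    A minimal sketch of the comparative vocabulary experiment described above: query a patent database once per thesaurus term and compare how many terms return documents across the two UNESCO micro-thesauri. The count_patent_hits helper and the example terms are hypothetical placeholders; an actual run would wire the helper to the European Patent Office search service, which requires registered credentials.

    def count_patent_hits(term: str) -> int:
        """Placeholder for a real patent-database search; returns 0 until wired up."""
        return 0  # replace with a call to the patent database's search API

    def evaluate_branch(terms: list[str]) -> dict[str, int]:
        """Map each thesaurus term to its number of matching patent documents."""
        return {term: count_patent_hits(term) for term in terms}

    micro_thesauri = {  # illustrative terms only, not the full UNESCO branches
        "Social and Human Sciences": ["sociology", "anthropology", "ethics"],
        "Information and Communication Science": ["information retrieval", "semiotics"],
    }

    for branch, terms in micro_thesauri.items():
        hits = evaluate_branch(terms)
        answered = sum(1 for n in hits.values() if n > 0)
        print(f"{branch}: {answered}/{len(terms)} terms return at least one patent document")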

    Economic Growth, Innovation, Cultural Diversity. What Are We All Talking About? A Critical Survey of the State-of-the-art

    This report constitutes the first deliverable of the project ENGIME – Economic Growth and Innovation in Multicultural Environments, financed by the European Commission – FP5 – Key Action: Improving socio-economic knowledge base. Contract HPSE-CT2001-50007. Keywords: Multiculturalism, Diversity, Economic Growth.

    Lexical information from a minimalist point of view

    Simplicity as a methodological orientation applies to linguistic theory just as to any other field of research: ‘Occam’s razor’ is the label for the basic heuristic maxim according to which an adequate analysis must ultimately be reduced to indispensable specifications. In this sense, conceptual economy has been a strict and stimulating guideline in the development of Generative Grammar from the very beginning. Halle’s (1959) argument discarding the level of taxonomic phonemics in order to unify two otherwise separate phonological processes is an early characteristic example; a more general notion is that of an evaluation metric introduced in Chomsky (1957, 1975), which relates the relative simplicity of alternative linguistic descriptions systematically to the quest for explanatory adequacy of the theory underlying the descriptions to be evaluated. Further proposals along these lines include the theory of markedness developed in Chomsky and Halle (1968), Kean (1975, 1981), and others, the notion of underspecification proposed, e.g., in Archangeli (1984) and Farkas (1990), and the concept of default values and related notions. An important step promoting this general orientation was the idea of Principles and Parameters developed in Chomsky (1981, 1986), which reduced the notion of language-particular rule systems to universal principles, subject merely to parametrization with restricted options, largely related to properties of particular lexical items. On this account, the notion of a simplicity metric can be dispensed with, as competing analyses of relevant data are now supposed to be essentially excluded by the restrictive system of principles.

    Course Description


    Semiotic foundations of information science

    Issued as Progress report no. 1, Final fiscal report, and Final report; Project no. G-36-611. Final report has number GIT-ICS-77-0

    Paradigmatic Tendencies in Cartography: A Synthesis of the Scientific-Empirical, Critical and Post-Representational Perspectives

    Maps have been important elements of visual representation in the development of different societies, and for this reason they have mainly been considered from a practical and utilitarian point of view. This means that cartographers and mapmakers have largely focused on the technical aspects of cartographic products, and cartography has given little attention both to its theoretical component and to its philosophical and epistemological aspects. The current study is devoted to considering these views. It reviews the main trends, thoughts, and directions in cartography during positivism/empiricism, neo-positivism, and post-structuralism, and analyses cartography in the modernist and post-modernist periods. Some of the arguments proposed by philosophers such as Ludwig Wittgenstein and Karl Popper are examined as important contributions to our understanding of the development of cartography and mapping. This study also incorporates the concept of the paradigm, taken from the epistemology of the sciences, which opens a space to analyse cartography in terms of paradigm shifts. In analysing each trend within contemporary cartography – from the second half of the twentieth century until today – it is necessary to keep in mind the theoretical scheme of a scientific discipline (object of study, research aims, methods and approaches, and results); this helps to determine the body of knowledge in cartography. It is also important to consider the epistemological context in which these tendencies developed: positivism/empiricism, realism/structuralism, and idealism/hermeneutics. In this way, by considering three epistemological levels – essentialist/ontical (scientific), deconstructive (sociological), and ontological (emergent) – some paradigmatic tendencies are postulated. The first level yields tendencies such as cartographic communication, cartographic semiotics, analytical cartography, and cartographic visualisation, all of which belong to the scientific-empirical perspective. At the second level stands critical cartography, which belongs to the critical perspective and confronts the scientific stances. Finally, at the third level, the so-called post-representational cartography arises in open opposition to traditional representational cartography.
