    When Do Discourse Markers Affect Computational Sentence Understanding?

    The capabilities and use cases of automatic natural language processing (NLP) have grown significantly over the last few years. While much work has been devoted to understanding how humans deal with discourse connectives, this phenomenon is understudied in computational systems. Therefore, it is important to put NLP models under the microscope and examine whether they can adequately comprehend, process, and reason within the complexity of natural language. In this chapter, we introduce the main mechanisms behind automatic sentence processing systems step by step and then focus on evaluating discourse connective processing. We assess nine popular systems in their ability to understand English discourse connectives and analyze how context and language understanding tasks affect their connective comprehension. The results show that NLP systems do not process all discourse connectives equally well and that the computational processing complexity of different connective kinds is not always in line with the presumed complexity order found in human processing. In addition, whereas humans tend to be influenced by connectives during reading but not necessarily in their final comprehension performance, discourse connectives have a significant impact on the final accuracy of NLP systems. The richer a system's learned knowledge of connectives, the more negatively inappropriate connectives affect it. This suggests that the correct explicitation of discourse connectives is important for computational natural language processing. Comment: Chapter 7 of Discourse Markers in Interaction, published in Trends in Linguistics. Studies and Monographs
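
    The chapter's own evaluation covers nine systems across several language understanding tasks; as a much smaller illustration of the general probing idea, one can mask the connective slot in a sentence and inspect which connectives a pretrained masked language model prefers in different contexts. The sketch below does this in Python; the model name and example sentences are assumptions for illustration, not the chapter's protocol or data.

    # Minimal connective-probing sketch (illustrative setup, not the chapter's protocol).
    from transformers import pipeline

    fill = pipeline("fill-mask", model="bert-base-uncased")

    # One contrastive and one causal context; the connective slot is masked.
    examples = [
        "The experiment failed, [MASK] the team remained optimistic.",
        "The roads were icy, [MASK] the schools were closed.",
    ]

    for text in examples:
        print(text)
        for cand in fill(text, top_k=5):
            print(f"  {cand['token_str']:>10}  p={cand['score']:.3f}")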

    Innovation, generative relationships and scaffolding structures: implications of a complexity perspective to innovation for public and private interventions

    The linear model of innovation has been superseded by a variety of theoretical models that view the innovation process as systemic, complex, multi-level, multi-temporal, involving a plurality of heterogeneous economic agents. Accordingly, the emphasis of the policy discourse has changed over time. The focus has shifted from the direct public funding of basic research as an engine of innovation, to the creation of markets for knowledge goods, to, eventually, the acknowledgement that knowledge transfer very often requires direct interactions among innovating actors. In most cases, policy interventions attempt to facilitate the match between “demand” and “supply” of the knowledge needed to innovate. A complexity perspective calls for a different framing, one focused on the fostering of processes characterized by multiple agency levels, multiple temporal scales, ontological uncertainty and emergent outcomes. This contribution explores what it means to design interventions in support of innovation processes inspired by a complex systems perspective. It does so by analyzing two examples of coordinated interventions: a public policy funding innovating networks (involving SMEs, research centers and universities), and a private initiative, promoted by a network of medium-sized mechanical engineering firms, that supports innovation by means of technology brokerage. Relying on two unique datasets recording the interactions of the organizations involved in these interventions, social network analysis and qualitative research are combined in order to investigate network dynamics and the roles of specific actors in fostering innovation processes. Finally, some general implications for the design of coordinated interventions supporting innovation in a complexity perspective are drawn.
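
    The two interaction datasets analyzed in the contribution are not public, but the style of network analysis it describes can be illustrated with a small invented edge list. The sketch below computes betweenness centrality, a standard proxy for brokerage positions, with networkx; all organization names and ties are hypothetical.

    # Illustrative brokerage analysis on an invented interaction network.
    import networkx as nx

    interactions = [
        ("SME_1", "ResearchCenter_A"), ("SME_2", "ResearchCenter_A"),
        ("SME_2", "University_X"), ("Broker", "SME_1"),
        ("Broker", "SME_3"), ("Broker", "University_X"),
    ]

    G = nx.Graph()
    G.add_edges_from(interactions)

    # Actors that sit on many shortest paths between others are candidate brokers.
    for node, score in sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1]):
        print(f"{node:>16}  betweenness={score:.2f}")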

    Innovative interventions in support of innovation networks. A complex system perspective to public innovation policy and private technology brokering

    The linear model of innovation has been superseded by a variety of theoretical models that view the innovation process as systemic, complex, multi-level, multi-temporal, involving a plurality of heterogeneous economic agents. Accordingly, the emphasis of the policy discourse has shifted over time. It has gone from a focus on direct public funding of basic research as an engine of innovation, to the creation of markets for knowledge goods, to, eventually, the acknowledgement that knowledge transfer very often requires direct interactions among innovating actors. In most cases, these interventions attempt to facilitate the match between “demand” and “supply” of the knowledge needed to innovate. A complexity perspective calls for a different framing, one focused on the fostering of processes characterized by multiple agency levels, multiple temporal scales, ontological uncertainty and emergent outcomes. The article explores what it means to design interventions in support of innovation processes inspired by a complex systems perspective. It does so by analyzing two different examples of coordinated interventions: an innovative public policy funding networks of innovating firms, and a private initiative supporting innovation in the mechanical engineering industry through the setup of a technology broker. Relying on two unique datasets recording the interactions of the various organizations involved in these interventions, the article combines social network analysis and qualitative research in order to investigate the dynamics of the networks and the roles and actions of specific actors in fostering innovation processes. Building upon this comparative analysis, some general implications for the design of coordinated interventions supporting innovation in a complexity perspective are derived. Keywords: innovation policy; local development policies; regional development policies; evaluation management.

    Linear Latent World Models in Simple Transformers: A Case Study on Othello-GPT

    Foundation models exhibit significant capabilities in decision-making and logical deductions. Nonetheless, a continuing discourse persists regarding their genuine understanding of the world as opposed to mere stochastic mimicry. This paper meticulously examines a simple transformer trained for Othello, extending prior research to enhance comprehension of the emergent world model of Othello-GPT. The investigation reveals that Othello-GPT encapsulates a linear representation of opposing pieces, a factor that causally steers its decision-making process. This paper further elucidates the interplay between the linear world representation and causal decision-making, and their dependence on layer depth and model complexity. We have made the code public.
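
    A claim of this kind is usually tested with linear probes trained on the model's hidden states. The sketch below shows that style of probe in outline; since Othello-GPT activations are not reproduced here, synthetic hidden states with a planted linear board signal stand in, and all names and shapes are assumptions rather than the authors' released code.

    # Linear-probe sketch: can a purely linear classifier read a square's state
    # (empty / mine / opponent's) from hidden-state vectors?
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_positions, d_model = 2000, 128

    # Stand-in for residual-stream activations; in the real setting these would be
    # hidden states collected from the transformer as it processes move sequences.
    square_state = rng.integers(0, 3, size=n_positions)      # 0=empty, 1=mine, 2=opponent's
    directions = rng.normal(size=(3, d_model))                # planted class directions
    hidden_states = directions[square_state] + rng.normal(scale=2.0, size=(n_positions, d_model))

    X_tr, X_te, y_tr, y_te = train_test_split(hidden_states, square_state,
                                              test_size=0.2, random_state=0)

    # High held-out accuracy indicates the board state is linearly decodable.
    probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("probe accuracy:", round(probe.score(X_te, y_te), 3))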

    The symbiosis between information system project complexity and information system project success

    Project success is widely covered, and the discourse on project complexity is proliferating. The purpose of this book is to merge and investigate the two concepts within the context of information system (IS) projects and understand the symbiosis between success and complexity in these projects. In this original and innovative research, exploratory modelling is employed to identify the aspects that constitute the success and complexity of projects based on the perceptions of IS project participants. This scholarly book aims to deepen the academic discourse on the relationship between the success and complexity of projects and to guide IS project managers towards improved project performance through the complexity lens. The research methodology stems from the realisation that the complexity of IS projects and its relationship to project success are under-documented. A post-positivist approach is applied in order to accommodate the subjective interpretation of IS-project participants through a quantitative design. The researchers developed an online survey strategy based on the literature concerning the success and complexity of projects. The views of 617 participants are documented. In the book, descriptive statistics and exploratory factor analysis pave the way for identifying the key success and complexity constructs of IS projects. These constructs are used in structural-equation modelling to build various validated and predictive models. Knowledge concerning the success and complexity of projects is mostly generic with little exposure to the field of IS project management. The contribution to current knowledge includes how the success of IS projects should be considered as well as what the complexity constructs of IS projects are. The success of IS projects encompasses strategic success, deliverable success, process success and the ‘unknowns’ of project success. The complexity of IS projects embodies organisational complexity, environmental complexity, technical complexity, dynamics and uncertainty. These constructs of success and complexity are mapped according to their underlying latent relationships to each other. The intended audience of this book is fellow researchers and project and IS specialists, including information technology managers, executives, project managers, project team members, the project management office (PMO), and general managers who initiate and conduct project-related work. The work presented in this first edition of the book is original and has not been plagiarised or presented before. It is not a revised version of a thesis or research previously published. Comments resulting from the blind peer review process were carefully considered and incorporated accordingly.
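
    The 617 survey responses themselves are not reproduced here, but the exploratory-factor-analysis step described above can be sketched on invented Likert-scale data. The item names, factor count and responses below are assumptions for illustration only, not the book's dataset or results.

    # Illustrative exploratory factor analysis on invented survey responses.
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    items = [f"item_{i}" for i in range(1, 13)]
    responses = pd.DataFrame(rng.integers(1, 6, size=(200, len(items))), columns=items)

    # Extract a handful of latent constructs with a varimax rotation.
    fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
    fa.fit(responses)

    # Loadings link each survey item to the extracted constructs; in the book these
    # constructs feed the subsequent structural-equation models.
    loadings = pd.DataFrame(fa.components_.T, index=items,
                            columns=[f"factor_{k}" for k in range(1, 5)])
    print(loadings.round(2))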

    Comprehension, Use Cases and Requirements

    Within requirements engineering it is generally accepted that in writing specifications (or indeed any requirements-phase document), one attempts to produce an artefact which will be simple to comprehend for the user. That is, whether the document is intended for customers to validate requirements, or engineers to understand what the design must deliver, comprehension is an important goal for the author. Indeed, advice on producing ‘readable’ or ‘understandable’ documents is often included in courses on requirements engineering. However, few researchers, particularly within the software engineering domain, have attempted either to define or to understand the nature of comprehension and its implications for guidance on the production of quality requirements. In contrast, this paper examines thoroughly the nature of textual comprehension, drawing heavily on research in discourse processing, and suggests some implications for requirements (and other) software documentation. In essence, we find that the guidance on writing requirements often prevalent within software engineering may be based upon assumptions which are an oversimplification of the nature of comprehension, and that these assumptions may lead to rules which detract from the quality of the requirements document and, thus, the understanding gained by the reader. Finally, the paper suggests lessons learned which may be useful in formulating future guidance for the production of requirements documentation.

    Communication in organizations: the heart of information systems

    We propose a theory characterizing information systems (IS) as language communities which use and develop domain-specific languages for communication. Our theory is anchored in Language Critique, a branch of philosophy of language. In developing our theory, we draw on Systems Theory and Cybernetics as a theoretical framework. "Organization" of a system is directly related to the communication of its sub-systems. "Big systems" are self-organizing, and control of this ability is disseminated throughout the system itself. Therefore, outside influence on changes to the system is limited, and operations intended to change an organization are restricted to indirect approaches. The creation of domain-specific languages by the system itself leads to advantageous communication costs compared to colloquial communication, at the price of set-up costs for language communities. Furthermore, we demonstrate how our theoretical constructs help to describe and predict the behavior of IS. Finally, we discuss implications of our theory for further research and for IS in general. Keywords: Language Critique, language communities, communication, self-organization, IS research

    Leaving the mainstream behind? Uncovering subjective understandings of economics instructors' roles

    In the wake of the economic crisis, a number of student organizations and researchers highlighted the lack of pluralism and heterodox approaches in economics curricula. The relevance of pluralism becomes clear once set within the implications of a given scientific discourse for reality (e.g. economics and policy making). This study explores the role of instructors in co-constructing the pluralism discourse and debates, while recognizing the role of institutional obstacles to change within the discipline. An empirical field study is conducted with lecturers in introductory economics courses at the WU Vienna University of Economics and Business, where they place themselves within the pluralism discourse via a Q-study, a mixed method employed for studying subjectivity in socially contested topics. In Q, a set of statements undergoes a sorting procedure on a relative ranking scale, followed by factor-rendering. Four voices are identified: Moderate Pluralists, Mainstreamers, Responsible Pluralists, and Applied Pluralists. The implications of their ideas are discussed from the viewpoint of discursive institutionalism, stressing the role of ideas and discourse in institutional change. Although a discursive readiness for changes towards more pluralism is claimed, strategies for overcoming the difficulties at the institutional level still need to be developed.
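
    The statements and rankings used in the study are not reproduced here, but the by-person factor extraction at the core of a Q-study can be sketched on invented Q-sorts: respondents, not statements, are the variables being factored. Everything below (respondent names, scale, factor count) is an assumption for illustration, and a rotation step would normally follow before interpreting the factors as "voices".

    # Illustrative by-person factor extraction on invented Q-sorts.
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_statements, n_respondents = 30, 12

    # Each column is one lecturer's ranking of the statements on a relative scale (-4 ... +4).
    sorts = pd.DataFrame(rng.integers(-4, 5, size=(n_statements, n_respondents)),
                         columns=[f"lecturer_{j}" for j in range(1, n_respondents + 1)])

    # Statements are the observations, respondents the variables: shared viewpoints
    # show up as factors on which several respondents load together.
    pca = PCA(n_components=4).fit(sorts)
    loadings = pd.DataFrame(pca.components_.T, index=sorts.columns,
                            columns=[f"viewpoint_{k}" for k in range(1, 5)])
    print(loadings.round(2))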