    The future of computing beyond Moore's Law.

    Moore's Law is a techno-economic model that has enabled the information technology industry to double the performance and functionality of digital electronics roughly every 2 years within a fixed cost, power and area. Advances in silicon lithography have enabled this exponential miniaturization of electronics, but, as transistors reach atomic scale and fabrication costs continue to rise, the classical technological driver that has underpinned Moore's Law for 50 years is failing and is anticipated to flatten by 2025. This article provides an updated view of what a post-exascale system will look like and the challenges ahead, based on our most recent understanding of technology roadmaps. It also discusses the tapering of historical improvements, and how it affects options available to continue scaling of successors to the first exascale machine. Lastly, this article covers the many different opportunities and strategies available to continue computing performance improvements in the absence of historical technology drivers. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.
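
    The doubling claim above can be read as a simple exponential model. The Python sketch below illustrates that reading; only the 2-year doubling period and the anticipated 2025 flattening come from the abstract, while the base year and the hard cut-off after 2025 are illustrative assumptions.

    # Illustrative model of "performance doubles roughly every 2 years",
    # frozen after the anticipated 2025 flattening. The base year is an
    # assumption for illustration, not a figure from the article.
    def relative_performance(year, base_year=2000, doubling_period=2.0, flatten_year=2025):
        effective_years = min(year, flatten_year) - base_year
        return 2.0 ** (effective_years / doubling_period)

    if __name__ == "__main__":
        for y in (2000, 2010, 2020, 2025, 2030):
            print(y, f"{relative_performance(y):,.0f}x")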

    Software reliability and dependability: a roadmap

    The roadmap identifies several directions: shifting the focus from software reliability to user-centred measures of dependability in complete software-based systems; influencing design practice to facilitate dependability assessment; propagating awareness of dependability issues and the use of existing, useful methods; injecting some rigour in the use of process-related evidence for dependability assessment; and better understanding issues of diversity and variation as drivers of dependability. Bev Littlewood is founder-Director of the Centre for Software Reliability, and Professor of Software Engineering at City University, London. Prof Littlewood has worked for many years on problems associated with the modelling and evaluation of the dependability of software-based systems; he has published many papers in international journals and conference proceedings and has edited several books. Much of this work has been carried out in collaborative projects, including the successful EC-funded projects SHIP, PDCS, PDCS2, DeVa. He has been employed as a consultant t

    Formalization of Transform Methods using HOL Light

    Transform methods, like Laplace and Fourier, are frequently used for analyzing the dynamical behaviour of engineering and physical systems, based on their transfer function and frequency response, or on the solutions of their corresponding differential equations. In this paper, we present an ongoing project, which focuses on the higher-order logic formalization of transform methods using the HOL Light theorem prover. In particular, we present the motivation of the formalization, which is followed by the related work. Next, we present the tasks completed so far, while highlighting some of the challenges faced during the formalization. Finally, we present a roadmap to achieve our objectives, the current status and the future goals for this project. Comment: 15 Pages, CICM 201
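
    For orientation, the Laplace and Fourier transforms named above have the standard integral definitions shown below; the abstract does not specify which variants or side conditions the HOL Light formalization covers.

    \mathcal{L}\{f\}(s) = \int_{0}^{\infty} f(t)\, e^{-st}\, dt,
    \qquad
    \mathcal{F}\{f\}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i \omega t}\, dt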

    US/UK Mental Models of Planning: The Relationship Between Plan Detail and Plan Quality

    This paper presents the results of a research study applying a new cultural analysis method to capture commonalities and differences between US and UK mental models of operational planning. The results demonstrate the existence of fundamental differences between the way US and UK planners think about what it means to have a high quality plan. Specifically, the present study captures differences in how US and UK planners conceptualize plan quality. Explicit models of cultural differences in conceptions of plan quality are useful for establishing performance metrics for multinational planning teams. This paper discusses the prospects of enabling automatic evaluation of multinational team performance by combining recent advances in cultural modelling with enhanced ontology languages.

    A Conceptual Framework for Adaptation

    This paper presents a white-box conceptual framework for adaptation that promotes a neat separation of the adaptation logic from the application logic through a clear identification of control data and their role in the adaptation logic. The framework provides an original perspective from which we survey archetypal approaches to (self-)adaptation ranging from programming languages and paradigms, to computational models, to engineering solutions.

    A Conceptual Framework for Adaptation

    We present a white-box conceptual framework for adaptation. We call it CODA, for COntrol Data Adaptation, since it is based on the notion of control data. CODA promotes a neat separation between application and adaptation logic through a clear identification of the set of data that is relevant for the latter. The framework provides an original perspective from which we survey a representative set of approaches to adaptation ranging from programming languages and paradigms, to computational models and architectural solutions.
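
    To make the separation concrete, here is a hypothetical Python sketch of the control-data idea: the adaptation logic rewrites only the control data, while the application logic merely reads it. All names and numbers are invented for illustration and are not taken from the paper.

    class Component:
        def __init__(self):
            self.batch_size = 10  # control data: the only state the adaptation logic may change

        # Application logic: reads the control data, never decides how it changes.
        def process(self, items):
            for i in range(0, len(items), self.batch_size):
                print("processing", items[i:i + self.batch_size])

        # Adaptation logic: observes the context and rewrites the control data only.
        def adapt(self, observed_load):
            self.batch_size = 1 if observed_load > 0.8 else 10

    if __name__ == "__main__":
        c = Component()
        c.adapt(observed_load=0.9)   # adaptation changes behaviour only via the control data
        c.process(list(range(5)))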