
    The future of computing beyond Moore's Law.

    Moore's Law is a techno-economic model that has enabled the information technology industry to double the performance and functionality of digital electronics roughly every 2 years within fixed cost, power, and area. Advances in silicon lithography have enabled this exponential miniaturization of electronics but, as transistors reach atomic scale and fabrication costs continue to rise, the classical technological driver that has underpinned Moore's Law for 50 years is failing and is anticipated to flatten by 2025. This article provides an updated view of what a post-exascale system will look like and of the challenges ahead, based on our most recent understanding of technology roadmaps. It also discusses the tapering of historical improvements and how it affects the options available for continued scaling of successors to the first exascale machine. Lastly, this article covers the many different opportunities and strategies available to continue computing performance improvements in the absence of historical technology drivers. This article is part of the discussion meeting issue 'Numerical algorithms for high-performance computational science'.
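
    As a rough worked example of the doubling model (our illustration, not a figure from the article): if capability doubles every 2 years at fixed cost, power, and area, then after t years it has grown by a factor of

        P(t)/P(0) = 2^{t/2},

    so a single decade of such scaling compounds to 2^{10/2} = 32x the performance per unit cost. The flattening anticipated by 2025 means forgoing gains of that magnitude in each subsequent decade.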

    Micro-Futures

    One of humankind's oldest quests has been to find the ‘elixir of life’, a mythical potion that would grant the drinker immortality (and, preferably, eternal youth!). One of the most famous tales of a search for this fabled tonic is that of the first emperor of a unified China, Qin Shi Huang (246 BC to 210 BC), who, in the latter part of his life, is said to have become preoccupied with finding this elusive concoction. This article is presented at a workshop taking place in the heartland of what was Qin Shi Huang's empire (China), and touches on a modern-day search for an elixir of life, this time a high-tech approach based on computers and artificial intelligence technology that goes by the name of ‘The Technological Singularity’. However, as fascinating as a search for an elixir of life may be, the real motivation of this paper is to introduce micro-fiction as a methodology for capturing and communicating visions for scientific, business, and societal innovations. To that end, The Technological Singularity is described and used as a means to illustrate the workings of micro SciFi-Prototyping (micro-SFPs).

    Understanding Digital Technology’s Evolution and the Path of Measured Productivity Growth: Present and Future in the Mirror of the Past

    Three styles of explanation have been advanced by economists seeking to account for the so-called 'productivity paradox': the coincidence of a persisting slowdown in the growth of measured total factor productivity (TFP) in the US since the mid-1970s with the wave of information technology (IT) innovations. This coincidence is said by some to be an illusion due to the mismeasurement of real output growth; by others to expose mistaken expectations about the benefits of computerization; and by still others to reflect the volume of intangible investments in 'learning' and the time required for ancillary innovations that allow the new digital technologies to be applied in ways that are reflected in measured productivity growth. This paper shows that, rather than viewing these as competing hypotheses, the dynamics of the transition to a new technological and economic regime based upon a general purpose technology (GPT) should be understood as likely to give rise to all three 'effects'. It more fully articulates and supports this thesis, which was first advanced in the 'computer and dynamo' papers by David (1990, 1991). The relevance of that historical experience is re-asserted and supported by further evidence rebutting skeptics who have argued that the diffusion of electrification and computerization have little in common. New evidence is produced about the links between IT use, mass customization, and the upward bias of output price deflators arising from the method used to 'chain in' new product prices. The measurement bias due to the exclusion of intangible investments from the scope of the official national product accounts is also examined. Further, it is argued that the development of the general-purpose PC delayed the re-organization of businesses along lines that would have more directly raised task productivity, even though the technologies yielded positive 'revenue productivity' gains for large companies. The paper concludes by indicating the emerging technical and organizational developments that are likely to deliver a sustained surge of measured TFP growth during the decades that lie immediately ahead.

    Labor Law 2.0: The Impact of New Information Technology on the Employment Relationship and the Relevance of the NLRA

    The NLRA system of collective bargaining was born during the industrial age of the early twentieth century. As a result, key terms in the statute such as employee, employer, and appropriate bargaining unit were first interpreted in the context of the long-term employment and large, vertically integrated firms that dominated this era. Beginning in the late 1970s, new information technology wrought a revolution in the organization of production, increasing short-term contingent employment and organizing firms horizontally in trading and subcontracting relationships across the globe. To maintain the relevance of collective bargaining to the modern workplace, the interpretation of the key terms of the NLRA must be updated to recognize the changed circumstances of production and to interpret union access and employee mutual support in light of the new technology. However, new information technology promises further changes in the workplace, with the accelerating mechanization of many jobs and perhaps a fundamental change in the relationship between labor and capital with the development of artificial intelligence. In this Essay, I explore the implications of new information technology for the workplace, the interpretation of the NLRA, and the continuing evolution of American labor policy.

    Data Science: Post Quantum Safe Cryptography

    New cryptographic techniques have emerged in recent decades that provide protection against quantum threats. These techniques are termed “post-quantum cryptography” and consist of techniques based on quantum properties of light that prevent interception of messages, as well as classical computational techniques, all of which were designed to resist quantum attacks emerging from the rapidly accelerating research field of quantum computation. This paper provides background information on post-quantum security. It explores the security threats against communication security, and particularly against key exchange, that are enabled by the development of quantum computers. The applied and theoretical aspects of quantum-cryptographic technologies are considered; the discussion is designed to be a reference for those operating in the ICT space in fields other than information security and post-quantum cryptography. The interrelated elements that make up the concept and content of applied quantum cryptography are analyzed, and a systematic analysis of quantum algorithms, quantum cryptography, and quantum hashing is presented. The relevant conceptual apparatus is introduced: in particular, the concepts of singularity and supersingularity are defined for elliptic curves, and the theoretical foundations underlying them are examined. Criteria that must be taken into account when selecting elliptic curves for cryptographic applications are identified.
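
    As a concrete illustration of the supersingularity criterion mentioned above (our sketch, not code from the paper): for a prime p >= 5, a curve y^2 = x^3 + ax + b over F_p is supersingular exactly when #E(F_p) = p + 1, i.e. when the trace of Frobenius is zero. The toy CUDA program below counts points by brute force, one x-coordinate per thread; the prime and curve coefficients are illustrative placeholders, and real parameter selection would use polynomial-time point counting (e.g. Schoof's algorithm) rather than enumeration.

```cuda
// Sketch only: decide supersingularity of y^2 = x^3 + a*x + b over F_p
// (p >= 5 prime) by brute-force point counting: supersingular iff #E = p + 1.
#include <cstdio>
#include <cuda_runtime.h>

// Euler's criterion: v^((p-1)/2) mod p is 1 for nonzero squares, p-1 otherwise.
__device__ long long euler_criterion(long long v, long long p) {
    long long r = 1, base = v % p, e = (p - 1) / 2;
    while (e > 0) {
        if (e & 1) r = r * base % p;
        base = base * base % p;
        e >>= 1;
    }
    return r;
}

__global__ void count_affine_points(long long a, long long b, long long p,
                                    int* total) {
    long long x = blockIdx.x * blockDim.x + threadIdx.x;
    if (x >= p) return;
    long long rhs = ((x * x % p) * x % p + a * x % p + b) % p;
    // y = 0 gives one point; a nonzero square rhs gives (x, y) and (x, -y).
    int pts = (rhs == 0) ? 1 : (euler_criterion(rhs, p) == 1 ? 2 : 0);
    atomicAdd(total, pts);
}

int main() {
    const long long p = 11;  // toy prime, for illustration only
    const long long curves[2][2] = {{0, 1},   // y^2 = x^3 + 1
                                    {1, 1}};  // y^2 = x^3 + x + 1
    int* d_total;
    cudaMalloc(&d_total, sizeof(int));
    for (const auto& c : curves) {
        int affine = 0;
        cudaMemset(d_total, 0, sizeof(int));
        count_affine_points<<<1, 256>>>(c[0], c[1], p, d_total);
        cudaMemcpy(&affine, d_total, sizeof(int), cudaMemcpyDeviceToHost);
        // +1 for the point at infinity.
        printf("a=%lld b=%lld: #E = %d -> %s\n", c[0], c[1], affine + 1,
               affine + 1 == p + 1 ? "supersingular" : "ordinary");
    }
    cudaFree(d_total);
    return 0;
}
```

    Over F_11, the first curve has 12 points (supersingular, since 11 ≡ 2 mod 3) and the second has 14 (ordinary), showing how the point count alone separates the two classes.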

    CUDA implementation of the solution of a system of linear equations arising in an hp-Finite Element code

    The finite element method (FEM) has proven to be one of the most efficient methods for solving differential equations. Designed to exploit the computational capabilities of computers, the improvements made over the years have made it possible to solve ever larger problems. One of the most recent advances has been the development of graphics cards (GPUs). Scientific programming with GPUs was extremely complex until the company NVIDIA developed CUDA in 2006, a general-purpose programming language that does not require knowledge of traditional GPU programming. These devices are capable of performing large numbers of operations simultaneously, a capability that makes them very attractive for FEM computation. One of the most computationally demanding parts of FEM is the solution of systems of linear equations. In this master's thesis, an algorithm for solving systems of linear equations is implemented in CUDA. The system arises from applying an hp-FEM method to the Laplace equation. The goal is to compare the execution of the CUDA solver against a C implementation and to determine whether CUDA offers advantages over traditional programming.
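
    To give a flavor of the kind of GPU solver the thesis compares (a minimal sketch under simplifying assumptions, not the thesis implementation): the kernel below performs a fixed number of Jacobi iterations for A x = b, with A stored dense in row-major order. A real hp-FEM stiffness matrix for the Laplace equation would be sparse, and a production solver would add a convergence test and preconditioning; the tiny diagonally dominant system here is a stand-in.

```cuda
// Sketch only: Jacobi iteration for A x = b on the GPU, dense row-major A.
#include <cstdio>
#include <utility>
#include <vector>
#include <cuda_runtime.h>

__global__ void jacobi_step(const double* A, const double* b,
                            const double* x_old, double* x_new, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    double sigma = 0.0;  // sum of off-diagonal contributions in row i
    for (int j = 0; j < n; ++j)
        if (j != i) sigma += A[i * n + j] * x_old[j];
    x_new[i] = (b[i] - sigma) / A[i * n + i];
}

int main() {
    // Tiny diagonally dominant test system (placeholder for the FEM matrix).
    const int n = 3;
    std::vector<double> A = {4, -1, 0,  -1, 4, -1,  0, -1, 4};
    std::vector<double> b = {1, 2, 3}, x(n, 0.0);

    double *dA, *db, *dx0, *dx1;
    cudaMalloc(&dA, n * n * sizeof(double));
    cudaMalloc(&db, n * sizeof(double));
    cudaMalloc(&dx0, n * sizeof(double));
    cudaMalloc(&dx1, n * sizeof(double));
    cudaMemcpy(dA, A.data(), n * n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(db, b.data(), n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(dx0, x.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    for (int it = 0; it < 50; ++it) {            // fixed iteration count
        jacobi_step<<<(n + 255) / 256, 256>>>(dA, db, dx0, dx1, n);
        std::swap(dx0, dx1);                     // ping-pong buffers
    }
    cudaMemcpy(x.data(), dx0, n * sizeof(double), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i) printf("x[%d] = %f\n", i, x[i]);
    cudaFree(dA); cudaFree(db); cudaFree(dx0); cudaFree(dx1);
    return 0;
}
```

    The ping-pong buffers avoid a race between reading x_old and writing x_new, and because each iteration is a separate kernel launch on the same stream, the required per-iteration synchronization comes for free from stream ordering.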

    Technology Trends and Opportunities for Construction Industry and Lifecycle Management

    Master's thesis in Offshore Technology: Industrial Asset Management. The purpose of the report is to highlight methods that can make it easier for the construction industry, and industry in general, to benefit from new technology. The report is intended as a reference to technological solutions that, along with some techniques, can streamline the workflow for multiple tasks in planning, design, and operation and maintenance management. The problems in focus are how to:
    • Simplify the procurement and tracing of documentation
    • Optimize building stages, design, and Life Cycle Management (LCM)
    • Provide interaction between disciplines and employees using different software
    The scientific platform is based on the literature on technology trends. Some history and trends in digital technology are presented. Definitions of roles and general terms related to documentation are derived from Norsk Standard and interpreted on that basis. The report charts the use of individual software and the technical setup of digital tools within CAD engineering (Computer-Aided Design), HDS technology (High-Definition Surveying), and gaming technology. This technology is combined with cloud services to support the planning, design, and management of building stages, and later to support the LCM of facilities and businesses' ERP systems (Enterprise Resource Planning). The use of Robotic Process Automation (RPA) and Artificial Intelligence (AI) for document-control tasks is also covered. The report finds that several suppliers provide services and products accessible through the web. Setup and implementation will require some work and knowledge from businesses and organizations, but the gain largely seems to justify the use of resources for this purpose, particularly through IoT interactions (Internet of Things), cloud services, and freely downloadable applications, which may be considered a paradigm shift in relation to the issues in the report. New platforms for the engineering phases support Building Information Modeling (BIM) processes, with algorithmic editors for encoding between computer programs without the need for programming expertise. These streamline workflows, reduce the recreation of data, enable interaction between software at various user levels, and support AI-assisted design optimization through add-ons for CAD engineering. Mobile devices such as phones and tablets, which support several of the solutions and products presented, are very accessible, and it seems natural to assume that the vast majority of people are familiar with smartphone applications for daily use. The resources required to implement the presented solutions have not been considered in this report; some of the equipment presented can be regarded as relatively expensive, so an investment analysis would be sensible. The trend, however, shows continued price drops and increased availability, at the same time as user interfaces are being improved for both software and digital equipment. The conclusion is that the construction industry, as well as Facility Management (FM), in both the public and private sectors, can have much to gain from using the technology and techniques presented in the report.

    M-health review: joining up healthcare in a wireless world

    In recent years, there has been a huge increase in the use of information and communication technologies (ICT) to deliver health and social care. This trend is bound to continue as providers (whether public or private) strive to deliver better care to more people under conditions of severe budgetary constraint.