6,347 research outputs found

    Data management of nanometre-scale CMOS device simulations

    In this paper we discuss the problems arising in managing and curating the data generated by simulations of nanometre-scale CMOS (Complementary Metal–Oxide–Semiconductor) transistors, circuits and systems, and describe the software and operational techniques we have adopted to address them. Such simulations pose a number of challenges including, inter alia, multi-TByte data volumes, complex datasets with complex inter-relations, multi-institutional collaborations spanning multiple specialisms and a mixture of academic and industrial partners, and demanding security requirements driven by commercial imperatives. This work was undertaken as part of the NanoCMOS project; however, the problems, solutions and experience seem likely to be of wider relevance, both within the CMOS design community and more generally in other disciplines.

    Is there a Moore's law for quantum computing?

    There is a common wisdom according to which many technologies progress according to some exponential law, like the empirical Moore's law that was validated for over half a century by the growth in transistor counts in chipsets. As a technology still in the making, with many potential promises, quantum computing is supposed to follow the pack and grow inexorably to maturity. The Holy Grail in that domain is a large quantum computer with thousands of error-corrected logical qubits, themselves made of thousands, if not more, physical qubits. These would enable molecular simulations as well as the factoring of 2048-bit RSA keys, among other use cases taken from the book of intractable classical computing problems. How far are we from this? Less than 15 years according to many predictions. We will see in this paper that Moore's empirical law cannot easily be translated to an equivalent in quantum computing. Qubits have various figures of merit that won't progress magically thanks to some new manufacturing technique. However, some equivalents of Moore's law may be at play inside and outside the quantum realm, such as with quantum computing's enabling technologies, cryogenics and control electronics. Algorithms, software tools and engineering also play a key role as enablers of quantum computing progress. While much of quantum computing's future outcome depends on qubit fidelity, it is progressing rather slowly, particularly at scale. We will finally see that other figures of merit may come into play and potentially change the landscape, like the quality of computed results and the energetics of quantum computing. Although scientific and technological in nature, this inventory has broad business implications for investment, education and cybersecurity-related decision-making processes. Comment: 32 pages, 24 figures

    A Model for Internet Traffic Growth

    A simple model that lets us predict the doubling time of Internet traffic is presented. The growth of this traffic depends on three factors: the doubling time of the number of users that are online, the doubling time of the time that they spend online, and the doubling time of the bandwidth provided to the end user by telecommunication networks. The first and the second depend primarily on marketing strategies, while the last depends on Moore's Law. In 2006 in Europe these three doubling times led to an expected doubling time for traffic on the Internet of roughly 1.2 years. The real value of the doubling time of the traffic of a group of European Internet Exchanges agrees well with the expected value.
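    The combination rule implied by the abstract can be sketched as follows, assuming the three factors grow exponentially and multiply (so their growth rates, and hence the reciprocals of their doubling times, add). The function name and input values are illustrative, not taken from the paper.

```python
def combined_doubling_time(t_users, t_time_online, t_bandwidth):
    """Doubling time of total traffic, given the doubling times of the
    number of users, time spent online, and end-user bandwidth.

    If traffic = users * time * bandwidth and each grows as 2**(t/T_i),
    the growth rates add, so 1/T_traffic = 1/T_1 + 1/T_2 + 1/T_3.
    """
    rate = 1.0 / t_users + 1.0 / t_time_online + 1.0 / t_bandwidth
    return 1.0 / rate

# Illustrative: three factors each doubling in 3 years give a combined
# traffic doubling time of 1 year.
print(combined_doubling_time(3.0, 3.0, 3.0))
```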

    Noise-based information processing: Noise-based logic and computing: what do we have so far?

    We briefly introduce noise-based logic. After describing the main motivations, we outline classical, instantaneous (squeezed and non-squeezed), continuum, spike and random-telegraph-signal-based schemes, with applications such as circuits that emulate brain functioning and string verification via a slow communication channel. Comment: Invited talk at the 21st International Conference on Noise and Fluctuations, Toronto, Canada, June 12-16, 2011

    The Quest for Enabling Metaphors for Law and Lawyering in the Information Age

    A review of James Boyle, Shamans, Software, and Spleens: Law and the Construction of the Information Society, and M. Ethan Katsh, Law in a Digital World.

    Marital Contracting in a Post-Windsor World


    Optical lithography

    Optical lithography is a photon-based technique in which an image is projected into a photosensitive emulsion (photoresist) coated onto a substrate such as a silicon wafer. It is the most widely used lithography process in the high-volume manufacturing of nano-electronics by the semiconductor industry. Optical lithography's ubiquitous use is a direct result of its highly parallel nature, which allows vast amounts of information to be transferred very rapidly. For example, a modern leading-edge lithography tool produces 150 300-mm patterned wafers per hour with 40-nm two-dimensional pattern resolution, yielding a pixel throughput of approximately 1.8 Tpixels/s. Continual advances in optical lithography capabilities have enabled the computing revolution over the past 50 years.
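    The quoted throughput follows directly from the figures in the abstract; a minimal back-of-the-envelope check, assuming the full area of a 300-mm wafer is patterned at a 40-nm pixel pitch:

```python
import math

# Figures taken from the abstract above; full-wafer coverage is assumed.
wafers_per_hour = 150
wafer_radius_mm = 150.0   # 300-mm diameter wafer
pixel_mm = 40e-6          # 40-nm pixel pitch expressed in mm

pixels_per_wafer = math.pi * wafer_radius_mm**2 / pixel_mm**2
pixels_per_second = pixels_per_wafer * wafers_per_hour / 3600.0

print(f"{pixels_per_second:.2e} pixels/s")  # on the order of 1.8e12
```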