
    The Structure of Graphene on Graphene/C60/Cu Interfaces: A Molecular Dynamics Study

    Two experimental studies reported the spontaneous formation of amorphous and crystalline structures of C60 intercalated between graphene and a substrate. They observed phenomena ranging from reactions between C60 molecules under graphene to graphene sagging between the molecules and control of strain in graphene. Motivated by these works, we performed fully atomistic reactive molecular dynamics simulations to study the formation and thermal stability of graphene wrinkles, as well as graphene attachment to and detachment from the substrate, when graphene is laid over an array of C60 molecules previously distributed on a copper substrate, at different values of temperature. Because graphene compresses the C60 molecules against the substrate, and its attachment to the substrate between C60s depends on the height of the graphene wrinkles, configurations with both frozen and non-frozen C60s were investigated in order to test the experimental finding of stable sagged graphene when the distance between C60s is about 4 nm and the height of the graphene wrinkles is about 0.8 nm. Below a C60 spacing of 4 nm, graphene becomes locally suspended and less strained. We show that this happens when the C60s are allowed to deform under the compressive action of graphene. If the C60s are kept frozen, spontaneous "blanketing" of graphene happens only when the distance between them is 7 nm or more. Both results for the existence of stable sagged graphene, at C60 spacings of 4 or 7 nm, are shown to agree with a mechanical model relating the rigidity of graphene to the energy of graphene-substrate adhesion. In particular, this study may help the development of 2D confined nanoreactors, considered in the literature to be the next advance in chemical reactions. Comment: 7 pages, 4 figures
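    The competition the abstract describes, between graphene's bending rigidity and graphene-substrate adhesion, can be illustrated with a toy estimate. This is a sketch with assumed literature-scale values (bending stiffness, adhesion energy), not the authors' mechanical model: graphene sags into the gap between C60s only when the adhesion energy gained over the span outweighs the bending energy cost of the curved profile.

    ```python
    import math

    # Toy estimate (NOT the paper's model). Assumed values:
    # graphene bending stiffness kappa ~ 1.6 eV, adhesion gamma ~ 0.1 J/m^2,
    # wrinkle height h ~ 0.8 nm (the height reported in the abstract).
    kappa = 1.6 * 1.602e-19   # bending stiffness, J
    gamma = 0.1               # graphene-substrate adhesion energy, J/m^2 (assumed)
    h = 0.8e-9                # wrinkle height, m

    # For a cosine profile y(x) = (h/2)(1 + cos(2*pi*x/L)) over span L,
    # the bending energy per unit width is E_bend = kappa * pi^4 * h^2 / L^3,
    # while the adhesion energy gained scales as gamma * L.
    # Equating them gives a critical span below which graphene stays suspended:
    L_crit = (kappa * math.pi**4 * h**2 / gamma) ** 0.25
    print(f"critical C60 spacing ~ {L_crit * 1e9:.1f} nm")  # few-nm scale
    ```

    With these assumed inputs the crossover lands at a few nanometres, the same scale as the 4-7 nm spacings discussed in the abstract; the actual values depend strongly on the assumed adhesion energy.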

    Cluster Expansion by Transfer Learning from Empirical Potentials

    Cluster expansions provide effective representations of the potential energy landscape of multicomponent crystalline solids. Notwithstanding major advances in cluster expansion implementations, it remains computationally demanding to construct these expansions for systems of low dimensionality or with a large number of components, such as clusters, interfaces, and multimetallic alloys. We address these challenges by employing transfer learning to accelerate the computationally demanding step of generating configurational data from first principles. The proposed approach exploits Bayesian inference to incorporate prior knowledge from physics-based or machine-learning empirical potentials, enabling one to identify the most informative configurations within a dataset. The efficacy of the method is tested on face-centered cubic Pt:Ni binaries, yielding a two- to three-fold reduction in the number of first-principles calculations while ensuring robust convergence of the energies with low statistical fluctuations.
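    The selection step the abstract describes, using cheap prior models to pick the most informative configurations before running expensive first-principles calculations, can be sketched in a generic active-learning form. Everything below is hypothetical (random features, linear surrogates standing in for empirical potentials), not the paper's Bayesian implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: 200 candidate configurations, each described by a
    # cluster-correlation feature vector; an ensemble of cheap surrogate models
    # (standing in for empirical-potential priors) predicts each one's energy.
    n_configs, n_features, n_models = 200, 8, 10
    X = rng.normal(size=(n_configs, n_features))
    weights = rng.normal(size=(n_models, n_features))  # ensemble of linear surrogates
    predictions = X @ weights.T                        # shape (n_configs, n_models)

    # Rank candidates by ensemble disagreement (predictive variance): the
    # configurations the priors disagree on most carry the most new information,
    # so they are the ones worth sending to a first-principles calculation.
    variance = predictions.var(axis=1)
    most_informative = np.argsort(variance)[::-1][:20]
    print(most_informative[:5])
    ```

    Selecting only the high-disagreement configurations is what yields the kind of reduction in first-principles calls the abstract reports, at the cost of trusting the priors to flag where they are unreliable.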

    On the Interpretation of Supernova Light Echo Profiles and Spectra

    The light echo systems of historical supernovae in the Milky Way and Local Group galaxies provide an unprecedented opportunity to reveal the effects of asymmetry on observables, particularly optical spectra. Scattering dust at different locations on the light echo ellipsoid witnesses the supernova from different perspectives, and the light consequently scattered towards Earth preserves the shape of line profile variations introduced by asymmetries in the supernova photosphere. However, the interpretation of supernova light echo spectra to date has not involved a detailed consideration of the effects of outburst duration and of geometrical scattering modifications due to finite scattering dust filament dimension, inclination, image point-spread function, and spectrograph slit width. In this paper, we explore the implications of these factors, present a framework for the interpretation of future resolved supernova light echo spectra, and test it against Cas A and SN 1987A light echo spectra. We conclude that full modeling of the dimensions and orientation of the scattering dust, using the observed light echoes at two or more epochs, is critical for the correct interpretation of light echo spectra. Indeed, without doing so one might falsely conclude that differences exist when none are actually present. Comment: 18 pages, 22 figures, accepted for publication in Ap
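    The smearing effects the abstract lists (outburst duration, dust filament thickness and inclination, PSF and slit width) all act, to first order, as an effective time window convolved with the supernova light curve. The sketch below is a minimal toy illustration of that idea with invented numbers, not the paper's modeling framework:

    ```python
    import numpy as np

    # Toy sketch (not the paper's code): the observed light-echo flux is the
    # supernova light curve convolved with an effective time window set by the
    # scattering-dust thickness, inclination, and PSF/slit width.
    t = np.linspace(0, 400, 801)                        # days since outburst
    light_curve = np.exp(-0.5 * ((t - 100) / 30) ** 2)  # toy SN light curve

    window_days = 80                    # assumed effective window width, days
    dt = t[1] - t[0]
    n_win = int(window_days / dt)
    window = np.ones(n_win) / n_win     # boxcar window, unit area

    echo = np.convolve(light_curve, window, mode="same")

    # The convolution broadens and suppresses the peak: a wide window washes
    # out exactly the epoch-dependent spectral differences one might otherwise
    # attribute to photospheric asymmetry.
    print(echo.max() < light_curve.max())  # True: peak is suppressed
    ```

    This is why the abstract argues that the window geometry must be constrained (from echoes at two or more epochs) before spectral differences between echoes can be read as evidence of asymmetry.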

    Secure, performance-oriented data management for nanoCMOS electronics

    The EPSRC pilot project Meeting the Design Challenges of nanoCMOS Electronics (nanoCMOS) is focused upon delivering a production-level e-Infrastructure to meet the challenges facing the semiconductor industry in dealing with the next generation of ‘atomic-scale’ transistor devices. At this scale, previous assumptions on the uniformity of transistor devices in electronics circuit and systems design are no longer valid, and the industry as a whole must deal with variability throughout the design process. Infrastructures to tackle this problem must provide seamless access to very large HPC resources for computationally expensive simulation of statistical ensembles of microscopically varying physical devices, and manage the many hundreds of thousands of files and the metadata associated with these simulations. A key challenge in undertaking this is in protecting the intellectual property associated with the data, simulations, and design process as a whole. In this paper we present the nanoCMOS infrastructure and outline an evaluation undertaken on the Storage Resource Broker (SRB) and the Andrew File System (AFS), considering in particular the extent to which they meet the performance and security requirements of the nanoCMOS domain. We also describe how metadata management is supported and linked to simulations and results in a scalable and secure manner.

    Enabling quantitative data analysis through e-infrastructures

    This paper discusses how quantitative data analysis in the social sciences can engage with and exploit an e-Infrastructure. We highlight how a number of activities which are central to quantitative data analysis, referred to as ‘data management’, can benefit from e-Infrastructure support. We conclude by discussing how these issues are relevant to the DAMES (Data Management through e-Social Science) research Node, an ongoing project that aims to develop e-Infrastructural resources for quantitative data analysis in the social sciences.

    Integrating security solutions to support nanoCMOS electronics research

    The UK Engineering and Physical Sciences Research Council (EPSRC) funded project Meeting the Design Challenges of nanoCMOS Electronics (nanoCMOS) is developing a research infrastructure for collaborative electronics research across multiple institutions in the UK, with especially strong industrial and commercial involvement. Unlike other domains, the electronics industry is driven by the necessity of protecting the intellectual property of the data, designs, and software associated with next-generation electronics devices, and therefore requires fine-grained security. Similarly, the project also demands seamless access to large-scale high-performance compute resources for atomic-scale device simulations, and the capability to manage the hundreds of thousands of files and the metadata associated with these simulations. Within this context, the project has explored a wide range of authentication and authorization infrastructures facilitating compute resource access and providing fine-grained security over numerous distributed file stores and files. We conclude that no single security solution meets the needs of the project. This paper describes the experiences of applying X.509-based certificates and public key infrastructures, VOMS, PERMIS, Kerberos, and the Internet2 Shibboleth technologies for nanoCMOS security. We outline how we are integrating these solutions to provide a complete end-to-end security framework meeting the demands of the nanoCMOS electronics domain.

    Research Cloud Data Communities

    Big Data, big science, the data deluge: these are topics we are hearing about more and more in our research pursuits. Then, through media hype, comes cloud computing, the saviour that is going to resolve our Big Data issues. However, it is difficult to pinpoint exactly what researchers can actually do with data and with clouds, how they can solve their Big Data problems, and how they get help in using these relatively new tools and infrastructure. Since the beginning of 2012, the NeCTAR Research Cloud has been running at the University of Melbourne, attracting over 1,650 users from around the country. This has not only provided an unprecedented opportunity for researchers to employ clouds in their research, but it has also given us an opportunity to clearly understand how researchers can more easily solve their Big Data problems. The cloud is now used daily, from running web servers and blog sites, through to hosting virtual laboratories that can automatically create hundreds of servers depending on research demand. Of course, it has also helped us understand that infrastructure isn’t everything. There are many other skillsets needed to help researchers from the multitude of disciplines use the cloud effectively. How can we solve Big Data problems on cloud infrastructure? One of the key aspects is communities based on research platforms: research is built on collaboration, connection, and community, and researchers employ platforms daily, whether bio-imaging platforms, computational platforms, or cloud platforms (like Dropbox). There are some important features which enabled this to work. Firstly, the barriers to collaboration are lowered, allowing communities to access infrastructure that can be instantly built anywhere from completely open to completely closed, all managed securely through (nationally) standardised interfaces. Secondly, it is free and easy to build servers and infrastructure, but it is also cheap to fail, allowing for experimentation not only at the code level, but at the server or infrastructure level as well. Thirdly, this (virtual) infrastructure can be shared with collaborators, moving the practice of collaboration from sharing papers and code to sharing servers, pre-configured and ready to go. And finally, the underlying infrastructure is built with Big Data in mind, co-located with major data storage infrastructure and high-performance computers, and interconnected nationally by high-speed networks to research instruments. The research cloud is fundamentally new in that it easily allows communities of researchers, often connected by common geography (research precincts), discipline, or long-term established collaborations, to build open, collaborative platforms. These open, sharable, and repeatable platforms encourage coordinated use and development, evolving towards common community-oriented methods for Big Data access and data manipulation. In this paper we discuss in detail the critical ingredients in successfully establishing these communities, as well as some outcomes resulting from these communities and their collaboration-enabling platforms. We consider astronomy as an exemplar of a research field that has already looked to the cloud as a solution to the ensuing data tsunami.

    Metal-semiconductor (semimetal) superlattices on a graphite sheet with vacancies

    It has been found that periodically, closely spaced vacancies on a graphite sheet cause a significant rearrangement of its electronic spectrum: metallic waveguides with a high density of states near the Fermi level are formed along the vacancy lines. In the direction perpendicular to these lines, the spectrum exhibits a semimetallic or semiconducting character, with a gap in which the vacancy miniband degenerates into impurity levels. Comment: 4 pages, 3 figures