    Formation of nanoscale structures by inductively coupled plasma etching

    This paper will review the top-down technique of inductively coupled plasma (ICP) etching for the formation of nanometer-scale structures. The increased difficulties of nanoscale etching will be described. It will be shown, however, that ICP technology is well able to cope with the upper end of the nanoscale: features from 100 nm down to about 40 nm are relatively easy to achieve with current ICP technology. The ability of ICP to operate at low pressure yet with high plasma density and a low, controllable DC bias is a great advantage over simple reactive ion etching (RIE), and, although continued feature-size reduction is increasingly challenging, improvements to ICP technology and to masking are enabling sub-10 nm features to be reached. Nanoscale ICP etching results will be illustrated in a range of materials and technologies. Techniques to facilitate etching (such as the use of cryogenic temperatures) and techniques to improve mask performance will be described and illustrated.

    Bridging big data: procedures for combining non-equivalent cognitive measures from the ENIGMA Consortium

    Investigators in the cognitive neurosciences have turned to Big Data to address persistent replication and reliability issues by increasing sample sizes, statistical power, and the representativeness of data. While there is tremendous potential to advance science through open data sharing, these efforts raise a host of new questions about how to integrate data arising from distinct sources and instruments. We focus on the most frequently assessed area of cognition, memory testing, and demonstrate a process for reliable data harmonization across three common measures. We aggregated raw data from 53 studies from around the world that measured at least one of three distinct verbal learning tasks, totaling N = 10,505 healthy and brain-injured individuals. A mega-analysis was conducted using empirical Bayes harmonization to isolate and remove site effects, followed by linear models that adjusted for common covariates. After these corrections, a continuous item response theory (IRT) model estimated each individual subject's latent verbal learning ability while accounting for item difficulties. Harmonization significantly reduced inter-site variance, by 37%, while preserving covariate effects, and the effects of age, sex, and education on scores were found to be highly consistent across memory tests. IRT methods for equating scores across auditory verbal learning tests (AVLTs) agreed with held-out data from dually administered tests, and these tools are made freely available online. This work demonstrates that large-scale data-sharing and harmonization initiatives can offer opportunities to address reproducibility and integration challenges across the behavioral sciences.
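
    The pipeline the abstract describes (empirical Bayes removal of site effects, then covariate-adjusted modeling and IRT scoring) is not shown as code; below is a minimal Python sketch of the ComBat-style location adjustment it alludes to, under stated assumptions. The function name harmonize, the single-score input, and the location-only shrinkage model are illustrative, not the authors' implementation; the actual procedure also models site scale effects and feeds the harmonized scores into an IRT model.

    # Minimal sketch (not the authors' code): ComBat-style empirical Bayes
    # harmonization of one cognitive score across sites. Assumes numeric or
    # string site labels and a numeric covariate matrix; location-only model.
    import numpy as np

    def harmonize(scores, sites, covars):
        """scores: (n,); sites: (n,) site labels; covars: (n, p) matrix."""
        X = np.column_stack([np.ones(len(scores)), covars])
        beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
        resid = scores - X @ beta                  # covariate-adjusted residuals

        site_ids = np.unique(sites)                # sorted unique site labels
        gamma_hat = np.array([resid[sites == s].mean() for s in site_ids])
        n_per = np.array([(sites == s).sum() for s in site_ids])

        # Empirical Bayes shrinkage: pull noisy per-site means toward the
        # grand mean, weighting between-site vs. within-site variance.
        tau2 = max(gamma_hat.var(ddof=1), 1e-8)    # between-site variance
        sigma2 = resid.var(ddof=1)                 # within-site variance (crude)
        shrink = tau2 / (tau2 + sigma2 / n_per)
        gamma_star = shrink * gamma_hat + (1 - shrink) * gamma_hat.mean()

        # Subtract each subject's (shrunken) site effect.
        site_effect = gamma_star[np.searchsorted(site_ids, sites)]
        return scores - site_effect

    # Toy usage with simulated data: five sites, two covariates (e.g. age,
    # education), and an additive site effect that harmonization should remove.
    rng = np.random.default_rng(0)
    sites = rng.integers(0, 5, size=200)
    covars = rng.normal(size=(200, 2))
    scores = covars @ np.array([0.5, 0.3]) + 0.4 * sites + rng.normal(size=200)
    adjusted = harmonize(scores, sites, covars)

    In practice one would reach for an established location-and-scale implementation (e.g. the neuroCombat package) rather than this simplified version.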