
    Nonphotolithographic nanoscale memory density prospects

    Technologies are now emerging to construct molecular-scale electronic wires and switches using bottom-up self-assembly. This opens the possibility of constructing nanoscale circuits and memories where active devices are just a few nanometers square and wire pitches may be on the order of ten nanometers. The features can be defined at this scale without using photolithography. The available assembly techniques have relatively high defect rates compared to conventional lithographic integrated circuits and can only produce very regular structures. Nonetheless, with proper memory organization, it is reasonable to expect these technologies to provide memory densities in excess of 10^11 b/cm^2 with modest active power requirements, under 0.6 W/Tb/s for random read operations.
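    A back-of-the-envelope sketch of where figures of this order come from, assuming a crossbar with a 10 nm wire pitch and one bit per crosspoint; the overhead fraction is an illustrative placeholder, not a number from the abstract:

    ```python
    # Rough check of the quoted density and power figures
    # (illustrative assumptions, not taken from the paper).

    wire_pitch_nm = 10                              # assumed crossbar wire pitch
    cell_area_cm2 = (wire_pitch_nm * 1e-7) ** 2     # one crosspoint per pitch^2, in cm^2
    raw_density_b_per_cm2 = 1 / cell_area_cm2       # ~1e12 b/cm^2 before overhead

    overhead_fraction = 0.9                         # assume ~90% lost to addressing, spares, ECC
    usable_density = raw_density_b_per_cm2 * (1 - overhead_fraction)

    print(f"raw density:    {raw_density_b_per_cm2:.1e} b/cm^2")
    print(f"usable density: {usable_density:.1e} b/cm^2")        # still > 1e11 b/cm^2

    # Power: 0.6 W per Tb/s of random reads implies the energy per bit read.
    energy_per_bit_J = 0.6 / 1e12
    print(f"energy per read bit: {energy_per_bit_J * 1e12:.2f} pJ")   # 0.6 pJ/bit
    ```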

    A Linear Logic Programming Language for Concurrent Programming over Graph Structures

    We have designed a new logic programming language called LM (Linear Meld) for programming graph-based algorithms in a declarative fashion. Our language is based on linear logic, an expressive logical system where logical facts can be consumed. Because LM integrates both classical and linear logic, it tends to be more expressive than other logic programming languages. LM programs are naturally concurrent because facts are partitioned by the nodes of a graph data structure. Computation is performed at the node level while communication happens between connected nodes. In this paper, we present the syntax and operational semantics of our language and illustrate its use through a number of examples.
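    A minimal illustrative sketch, in Python rather than LM syntax, of the core idea: facts are partitioned across graph nodes, linear facts are consumed when a rule fires, and derivations at one node produce new facts at connected nodes. The graph and the "visit" fact are hypothetical examples, not taken from the paper:

    ```python
    # Per-node fact multisets with linear (consumable) facts, in the spirit of Linear Meld.
    from collections import Counter

    graph = {1: [2, 3], 2: [4], 3: [4], 4: []}           # hypothetical directed graph
    facts = {n: Counter() for n in graph}                # facts partitioned by node
    facts[1]["visit"] += 1                               # axiom: start a traversal at node 1

    visited = set()
    while any(f["visit"] for f in facts.values()):
        for node, f in facts.items():
            if f["visit"]:
                f["visit"] -= 1                          # linear fact is consumed by the rule
                if node not in visited:
                    visited.add(node)                    # persistent ("classical") fact
                    for neighbor in graph[node]:
                        facts[neighbor]["visit"] += 1    # derive new facts at connected nodes

    print(sorted(visited))                               # [1, 2, 3, 4]
    ```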

    SOMA: A Tool for Synthesizing and Optimizing Memory Accesses in ASICs

    Arbitrary memory dependencies and variable-latency memory systems are major obstacles to the high-level synthesis of large-scale ASIC systems. This paper presents SOMA, a synthesis framework for constructing Memory Access Network (MAN) architectures that inherently enforce memory consistency in the presence of dynamic memory access dependencies. A fundamental bottleneck in any such network is arbitrating between concurrent accesses to a shared memory resource. To alleviate this bottleneck, SOMA uses an application-specific concurrency analysis technique to predict the dynamic memory parallelism profile of the application, which is then used to customize the MAN architecture. Depending on the parallelism profile, the MAN may be optimized for latency, for throughput, or for both. The optimized MAN is automatically synthesized into gate-level structural Verilog using a flexible library of network building blocks. SOMA has been successfully integrated into an automated C-to-hardware synthesis flow, which generates standard-cell circuits from unrestricted ANSI-C programs. Post-layout experiments demonstrate that application-specific MAN construction significantly improves power and performance.
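    The arbitration bottleneck mentioned above can be pictured with a simple software model. The sketch below is an assumed round-robin arbiter serializing concurrent requests to a single shared memory port; SOMA itself synthesizes hardware structures, and this Python model is only illustrative:

    ```python
    # Round-robin arbitration of concurrent requests to one shared memory port.
    def round_robin_arbiter(requests_per_cycle, num_masters):
        """requests_per_cycle: list of sets of requesting master IDs, one set per cycle."""
        grant_ptr = 0
        grants = []
        pending = set()
        for cycle_requests in requests_per_cycle:
            pending |= cycle_requests
            granted = None
            for i in range(num_masters):                 # search from the rotating pointer
                candidate = (grant_ptr + i) % num_masters
                if candidate in pending:
                    granted = candidate
                    pending.discard(candidate)
                    grant_ptr = (candidate + 1) % num_masters
                    break
            grants.append(granted)                       # at most one grant per cycle
        return grants

    # Three masters contending for the same port: accesses are serialized fairly.
    print(round_robin_arbiter([{0, 1, 2}, set(), {1}], num_masters=3))   # [0, 1, 2]
    ```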

    Fault-tolerance techniques for hybrid CMOS/nanoarchitecture

    The authors propose two fault-tolerance techniques for a hybrid CMOS/nanoarchitecture implementing logic functions as look-up tables. The authors compare the efficiency of the proposed techniques with recently reported methods that use single coding schemes to tolerate high fault rates in nanoscale fabrics. Both proposed techniques are based on error-correcting codes and tackle different fault rates. In the first technique, the authors implement a combined two-dimensional coding scheme using Hamming and Bose-Chaudhuri-Hocquenghem (BCH) codes to address fault rates greater than 5%. In the second technique, Hamming coding is complemented with a bad-line-exclusion technique to tolerate fault rates higher than the first proposed technique (up to 20%). The authors have also estimated the improvement that can be achieved in circuit reliability in the presence of don't-care conditions. The area, latency and energy costs of the proposed techniques were also estimated in the CMOS domain.
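    As a concrete illustration of the error-correcting building block involved, here is a minimal Hamming(7,4) encoder and single-error corrector. It is a generic textbook code, not the paper's two-dimensional Hamming/BCH construction or its bad-line-exclusion scheme:

    ```python
    # Hamming(7,4): 4 data bits protected by 3 parity bits; corrects any single-bit error.
    def hamming74_encode(d):
        """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(c):
        """Correct up to one flipped bit in a 7-bit codeword and return the 4 data bits."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]        # parity check over positions 1, 3, 5, 7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]        # parity check over positions 2, 3, 6, 7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]        # parity check over positions 4, 5, 6, 7
        syndrome = s1 + 2 * s2 + 4 * s3       # 1-based position of the error, 0 if none
        if syndrome:
            c = list(c)
            c[syndrome - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]

    codeword = hamming74_encode([1, 0, 1, 1])
    codeword[5] ^= 1                          # inject a single-bit fault
    assert hamming74_correct(codeword) == [1, 0, 1, 1]
    ```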

    Students Have 'Voice'

    This was an article written about The Student Voice. It provides background on how it was created and the vision of the newspaper staff.

    Oral History Interview: Berridge Copen

    This interview is one of a series conducted concerning the history of Marshall University. Berridge Copen discusses sororities and fraternities at Marshall, student government, campus sports, classes and grading standards, and student life.

    Hydrologic investigation of three constructed mitigation wetlands and one natural wetland in West Virginia

    The goal of this study was to increase the probability of success for future mitigation wetland projects by performing a hydrologic investigation at three constructed mitigation wetland sites and one natural reference wetland site. In addition, a numerical groundwater model was developed for a portion of one of the constructed mitigation wetlands. Hydrologic, meteorologic, topographic, and geotechnical data were collected at three constructed mitigation wetlands and one reference natural wetland. Hydrologic data were collected using monitoring wells, piezometers, automated water level recorders, and a water level meter. Meteorologic data were obtained from the National Oceanic and Atmospheric Administration's weather station in Elkins, West Virginia. Topographic data were collected using traditional surveying techniques. Geotechnical data were inferred from bag samples obtained during hydrologic monitoring device installation, from Shelby tubes, and from in-situ slug tests. These data were used to compare hydrologic conditions at the constructed mitigation wetlands to the natural reference wetland. In addition, a detailed numerical groundwater model was developed for one of the mitigation sites. Persistent sources of groundwater were determined to be the most important factor favoring the successful development of the mitigation wetlands. Areas at the mitigation sites that were not developing satisfactorily were found to have the deepest and most variable groundwater levels. It is recommended that future wetland mitigation sites be selected where groundwater can be used as the primary source of water.

    Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review

    OBJECTIVE: To systematically review the literature on the implementation of e-health to identify: (i) barriers and facilitators to e-health implementation, and (ii) outstanding gaps in research on the subject. METHODS: MEDLINE, EMBASE, CINAHL, PsycINFO and the Cochrane Library were searched for reviews published between 1 January 1995 and 17 March 2009. Studies had to be systematic reviews, narrative reviews, qualitative metasyntheses or meta-ethnographies of e-health implementation. Abstracts and papers were double screened and data were extracted on country of origin; e-health domain; publication date; aims and methods; databases searched; inclusion and exclusion criteria; and number of papers included. Data were analysed qualitatively using normalization process theory as an explanatory coding framework. FINDINGS: Inclusion criteria were met by 37 papers; 20 had been published between 1995 and 2007 and 17 between 2008 and 2009. Methodological quality was poor: 19 papers did not specify the inclusion and exclusion criteria and 13 did not indicate the precise number of articles screened. The use of normalization process theory as a conceptual framework revealed that relatively little attention was paid to: (i) work directed at making sense of e-health systems, specifying their purposes and benefits, establishing their value to users and planning their implementation; (ii) factors promoting or inhibiting engagement and participation; (iii) effects on roles and responsibilities; (iv) risk management; and (v) ways in which implementation processes might be reconfigured by user-produced knowledge. CONCLUSION: The published literature focused on organizational issues, neglecting the wider social framework that must be considered when introducing new technologies.

    Optimal Brain MRI Protocol for New Neurological Complaint

    Background/Purpose: Patients with neurologic complaints are imaged with MRI protocols that may include many pulse sequences. It has not been documented which sequences are essential. We assessed the diagnostic accuracy of a limited number of sequences in patients with new neurologic complaints. Methods: 996 consecutive brain MRI studies from patients with new neurological complaints were divided into 2 groups. In group 1, reviewers used a 3-sequence set that included sagittal T1-weighted, axial T2-weighted fluid-attenuated inversion recovery, and axial diffusion-weighted images. Subsequently, the studies in group 2 were reviewed using axial susceptibility-weighted images in addition to the 3 sequences. The reference standard was each study's official report. Discrepancies between the limited-sequence review and the reference standard, including Level I findings (those that may require an immediate change in patient management), were identified. Results: There were 84 major findings in the 497 studies in group 1, with 21 not identified in the limited-sequence evaluations, including 12 enhancing lesions and 3 vascular abnormalities identified on MR angiography; the 3-sequence set also did not reveal microhemorrhagic foci in 15 of 19 studies. There were 117 major findings in the 499 studies in group 2, with 19 not identified on the 4-sequence set: 17 enhancing lesions and 2 vascular lesions identified on angiography. All 87 Level I findings were identified using the limited sequences (56 acute infarcts, 16 hemorrhages, and 15 mass lesions). Conclusion: A 4-pulse-sequence brain MRI study is sufficient to evaluate patients with a new neurological complaint except when contrast or angiography is indicated.
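    The detection rates implied by the counts quoted above can be checked with a little arithmetic; the percentages below are derived from the abstract's numbers, not values reported by the study:

    ```python
    # Detection rate = (total major findings - findings missed by the limited sequences) / total.
    def detection_rate(total_findings, missed):
        return (total_findings - missed) / total_findings

    print(f"group 1 (3 sequences): {detection_rate(84, 21):.0%}")    # 75%
    print(f"group 2 (4 sequences): {detection_rate(117, 19):.0%}")   # ~84%
    print(f"Level I findings:      {detection_rate(87, 0):.0%}")     # 100%
    ```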

    The Massachusetts General Hospital Acute Stroke Imaging Algorithm: An Experience and Evidence Based Approach

    The Massachusetts General Hospital Neuroradiology Division employed an experience- and evidence-based approach to develop a neuroimaging algorithm to best select patients with severe ischemic strokes caused by anterior circulation occlusions (ACOs) for intravenous tissue plasminogen activator and endovascular treatment. Methods found to be of value included the National Institutes of Health Stroke Scale (NIHSS), non-contrast CT, CT angiography (CTA) and diffusion MRI. Perfusion imaging by CT and MRI was found to be unnecessary for safe and effective triage of patients with severe ACOs. An algorithm was adopted that includes: non-contrast CT to identify hemorrhage and large hypodensity, followed by CTA to identify the ACO; diffusion MRI to estimate the core infarct; and the NIHSS in conjunction with the diffusion data to estimate the clinical penumbra.
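    A rough sketch of the triage flow as described, written as Python control flow purely for illustration; the threshold values and return strings are placeholders, not part of the published algorithm and not clinical guidance:

    ```python
    # Illustrative control-flow model of the described triage sequence:
    # non-contrast CT -> CTA -> diffusion MRI core -> NIHSS + diffusion penumbra estimate.
    def stroke_triage_sketch(ncct, cta, dwi_core_ml, nihss):
        if ncct["hemorrhage"] or ncct["large_hypodensity"]:
            return "exclude from reperfusion therapy"
        if not cta["anterior_circulation_occlusion"]:
            return "manage without endovascular treatment"
        # Clinical penumbra estimated from NIHSS severity versus diffusion core size
        # (placeholder thresholds, assumed for illustration only).
        clinical_penumbra = nihss >= 10 and dwi_core_ml < 70
        if clinical_penumbra:
            return "candidate for IV tPA and endovascular treatment"
        return "IV tPA per standard criteria; no endovascular treatment"

    print(stroke_triage_sketch(
        ncct={"hemorrhage": False, "large_hypodensity": False},
        cta={"anterior_circulation_occlusion": True},
        dwi_core_ml=30,
        nihss=18,
    ))
    ```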