
    Testing Library Specifications by Verifying Conformance Tests

    Abstract. Formal specifications of standard libraries are necessary when statically verifying software that uses those libraries. Library specifications must be both correct, accurately reflecting library behavior, and useful, describing library behavior in sufficient detail to allow static verification of client programs. Specification and verification researchers regularly face the question of whether the library specifications we use are correct and useful, and we have collectively provided no good answers. Over the past few years we have created and refined a software engineering process, which we call the Formal CTD Process (FCTD), to address this problem. Although FCTD is primarily targeted toward those who write Java libraries (or specifications for existing Java libraries) using the Java Modeling Language (JML), its techniques are broadly applicable. The key to FCTD is its novel usage of library conformance test suites. Rather than executing the conformance tests, FCTD uses them to measure the correctness and utility of specifications through static verification. FCTD is beginning to see significant use within the JML community and is the cornerstone process of the JML Spec-a-thons, meetings that bring JML researchers and practitioners together for intensive specification writing sessions. This article describes the Formal CTD Process, its use in small case studies, and its broad application to the standard Java class library.
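    To make the idea concrete, here is a minimal sketch (not taken from the article; the class, method, and test names are hypothetical) of the kind of artifacts FCTD relates: a JML-specified library method and a conformance test. Under FCTD the test is not executed; a static verifier instead checks the test body against the library specification, so a test that cannot be verified signals a specification that is incorrect or too weak.

    // IntArrays.java -- library code with its JML specification
    public final class IntArrays {
        /*@ public normal_behavior
          @   requires a != null;
          @   ensures \result <==> (\exists int i; 0 <= i && i < a.length; a[i] == key);
          @*/
        public static /*@ pure @*/ boolean contains(int[] a, int key) {
            //@ loop_invariant 0 <= i && i <= a.length;
            //@ loop_invariant (\forall int j; 0 <= j && j < i; a[j] != key);
            //@ decreases a.length - i;
            for (int i = 0; i < a.length; i++) {
                if (a[i] == key) return true;
            }
            return false;
        }
    }

    // IntArraysConformanceTest.java -- conformance test, checked statically rather than run
    public class IntArraysConformanceTest {
        public void testContainsFindsPresentElement() {
            int[] data = { 3, 1, 4 };
            boolean found = IntArrays.contains(data, 1);
            //@ assert found;   // must follow from the specification of contains alone
        }
    }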

    Single Spin Measurement using Single Electron Transistors to Probe Two Electron Systems

    We present a method for measuring single spins embedded in a solid by probing two-electron systems with a single electron transistor (SET). Restrictions imposed by the Pauli principle on allowed two-electron states mean that the spin state of such systems has a profound impact on the orbital states (positions) of the electrons, a parameter which SETs are extremely well suited to measure. We focus on a particular system capable of being fabricated with current technology: a Te double donor in Si adjacent to a Si/SiO2 interface and lying directly beneath the SET island electrode, and we outline a measurement strategy capable of resolving single electron and nuclear spins in this system. We discuss the limitations of the measurement imposed by spin scattering arising from fluctuations emanating from the SET and from lattice phonons. We conclude that measurement of single spins, a necessary requirement for several proposed quantum computer architectures, is feasible in Si using this strategy. Comment: 22 pages, 8 figures; revised version contains updated references and small textual changes. Submitted to Phys. Rev.
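    As background for the Pauli-principle argument above (a textbook reminder, not material from the paper): the total two-electron wavefunction must be antisymmetric under particle exchange,

    \Psi(1,2) = \phi_{\mathrm{orb}}(\mathbf{r}_1,\mathbf{r}_2)\,\chi_{\mathrm{spin}}, \qquad \Psi(2,1) = -\Psi(1,2),

    so a spin singlet (antisymmetric \chi_{\mathrm{spin}}) must pair with a symmetric orbital wavefunction, and a spin triplet with an antisymmetric one. The two spin configurations therefore correspond to different charge distributions, which is why an electrometer such as an SET can read out the spin state through the electrons' positions.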

    Black Hole Spin via Continuum Fitting and the Role of Spin in Powering Transient Jets

    The spins of ten stellar black holes have been measured using the continuum-fitting method. These black holes are located in two distinct classes of X-ray binary systems, one that is persistently X-ray bright and another that is transient. Both the persistent and transient black holes remain for long periods in a state where their spectra are dominated by a thermal accretion disk component. The spin of a black hole of known mass and distance can be measured by fitting this thermal continuum spectrum to the thin-disk model of Novikov and Thorne; the key fit parameter is the radius of the inner edge of the black hole's accretion disk. Strong observational and theoretical evidence links the inner-disk radius to the radius of the innermost stable circular orbit, which is trivially related to the dimensionless spin parameter a_* of the black hole (|a_*| < 1). The ten spins that have so far been measured by this continuum-fitting method range widely from a_* ≈ 0 to a_* > 0.95. The robustness of the method is demonstrated by the dozens or hundreds of independent and consistent measurements of spin that have been obtained for several black holes, and through careful consideration of many sources of systematic error. Among the results discussed is a dichotomy between the transient and persistent black holes; the latter have higher spins and larger masses. Also discussed is recently discovered evidence in the transient sources for a correlation between the power of ballistic jets and black hole spin. Comment: 30 pages. Accepted for publication in Space Science Reviews. Also to appear in hard cover in the Space Sciences Series of ISSI "The Physics of Accretion onto Black Holes" (Springer Publisher). Changes to Sections 5.2, 6.1 and 7.4. Section 7.4 responds to Russell et al. 2013 (MNRAS, 431, 405), who find no evidence for a correlation between the power of ballistic jets and black hole spin.
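    For reference, the relation alluded to above between the innermost stable circular orbit and the spin parameter is the standard result for the Kerr metric (Bardeen, Press & Teukolsky 1972), quoted here as background rather than taken from the article:

    Z_1 = 1 + (1 - a_*^2)^{1/3}\left[(1 + a_*)^{1/3} + (1 - a_*)^{1/3}\right], \qquad
    Z_2 = \sqrt{3 a_*^2 + Z_1^2},
    \qquad
    r_{\mathrm{ISCO}} = \frac{GM}{c^2}\left[3 + Z_2 \mp \sqrt{(3 - Z_1)(3 + Z_1 + 2 Z_2)}\right],

    with the upper sign for prograde and the lower sign for retrograde orbits. For a prograde disk, r_ISCO decreases monotonically from 6GM/c^2 at a_* = 0 to GM/c^2 as a_* approaches 1, which is what makes the fitted inner-disk radius a proxy for spin.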

    Effects of watershed land use on nitrogen concentrations and δ15 Nitrogen in groundwater

    Author Posting. © The Authors, 2005. This is the author's version of the work. It is posted here by permission of Springer for personal use, not for redistribution. The definitive version was published in Biogeochemistry 77 (2006): 199-215, doi:10.1007/s10533-005-1036-2.
    Eutrophication is a major agent of change affecting freshwater, estuarine, and marine systems. It is largely driven by transport of nitrogen from natural and anthropogenic sources. Research is needed to quantify this nitrogen delivery and to link the delivery to specific land-derived sources. In this study we measured nitrogen concentrations and δ15N values in seepage water entering three freshwater ponds and six estuaries on Cape Cod, Massachusetts, and assessed how they varied with different types of land use. Nitrate concentrations and δ15N values in groundwater reflected land use in developed and pristine watersheds. In particular, watersheds with larger populations delivered larger nitrate loads with higher δ15N values to receiving waters. The enriched δ15N values confirmed nitrogen loading model results identifying wastewater contributions from septic tanks as the major N source. Furthermore, it was apparent that coastal N sources had a relatively larger impact on the N loads and isotopic signatures than did inland N sources further upstream in the watersheds. This finding suggests that management priorities could focus on coastal sources as a first course of action. This would require management constraints on a much smaller population.
    This work was supported by funds from the Woods Hole Oceanographic Institution Sea Grant Program, from the Cooperative Institute for Coastal and Estuarine Environmental Technology, from the Massachusetts Department of Environmental Protection to Applied Science Associates, Narragansett, RI, as well as from Palmer/McLeod and NOAA National Estuarine Research Reserve Fellowships to Kevin Kroeger. This work is the result of research sponsored by the NOAA National Sea Grant College Program Office, Department of Commerce, under Grant No. NA86RG0075, Woods Hole Oceanographic Institution Sea Grant Project No. R/M-40

    CASA: An Efficient Automated Assignment of Protein Mainchain NMR Data Using an Ordered Tree Search Algorithm

    Rapid analysis of protein structure, interaction, and dynamics requires fast and automated assignment of 3D protein backbone triple-resonance NMR spectra. We introduce a new depth-first ordered tree search method of automated assignment, CASA, which uses hand-edited peak-pick lists from a flexible number of triple-resonance experiments. The program was tested on 13 artificially simulated peak lists for proteins of up to 723 residues, as well as on experimental data for four proteins. Under reasonable tolerances, it generated assignments that correspond to those reported in the literature within a few minutes of CPU time. The program was also tested on proteins analyzed by other methods, with both simulated and experimental peak lists, and it generated good assignments in all relevant cases. Its robustness was further tested under a variety of conditions.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/43050/1/10858_2005_Article_4079.pd
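    As a rough illustration of the search strategy named in the abstract (a schematic sketch only, not the CASA algorithm; every name and the scoring function are invented placeholders), a depth-first ordered tree search assigns spin systems to residue positions one depth level at a time, tries candidates at each depth in order of match quality, and prunes branches that can no longer beat the best complete assignment found so far:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.List;

    // Schematic depth-first ordered tree search with hypothetical names throughout.
    public class OrderedTreeSearchSketch {

        // Placeholder score for how well spin system s fits residue position r;
        // a real assignment tool would compare triple-resonance chemical shifts.
        static double matchScore(int r, int s) {
            return -Math.abs(r - s);   // toy score, always <= 0
        }

        static double bestScore = Double.NEGATIVE_INFINITY;
        static int[] bestAssignment;

        // Assign residue positions depth-first, trying the best-matching unused
        // spin systems first and pruning branches that cannot improve on the
        // best complete assignment seen so far (valid because scores are <= 0).
        static void search(int residue, int nResidues, boolean[] used,
                           int[] assignment, double score) {
            if (residue == nResidues) {
                if (score > bestScore) {
                    bestScore = score;
                    bestAssignment = assignment.clone();
                }
                return;
            }
            List<Integer> candidates = new ArrayList<>();
            for (int s = 0; s < used.length; s++) {
                if (!used[s]) candidates.add(s);
            }
            // "Ordered": visit the most promising spin systems first.
            candidates.sort(Comparator.comparingDouble(s -> -matchScore(residue, s)));
            for (int s : candidates) {
                double next = score + matchScore(residue, s);
                if (next <= bestScore) continue;   // prune this branch
                used[s] = true;
                assignment[residue] = s;
                search(residue + 1, nResidues, used, assignment, next);
                used[s] = false;
            }
        }

        public static void main(String[] args) {
            int n = 5;   // toy problem: five residue positions, five spin systems
            search(0, n, new boolean[n], new int[n], 0.0);
            System.out.println("best assignment: " + Arrays.toString(bestAssignment));
        }
    }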

    The prevalence of mild cognitive impairment in diverse geographical and ethnocultural regions: The COSMIC Collaboration

    Background: Changes in criteria and differences in the populations studied and in methodology have produced a wide range of prevalence estimates for mild cognitive impairment (MCI).
    Methods: Uniform criteria were applied to harmonized data from 11 studies from the USA, Europe, Asia, and Australia, and MCI prevalence estimates were determined using three separate definitions of cognitive impairment.
    Results: The published range of MCI prevalence estimates was 5.0%-36.7%. This range narrowed under all three cognitive impairment definitions: performance in the bottom 6.681% (3.2%-10.8%); Clinical Dementia Rating of 0.5 (1.8%-14.9%); Mini-Mental State Examination score of 24-27 (2.1%-20.7%). Prevalence using the first definition was 5.9% overall; it increased with age (P < .001) but was unaffected by sex or by the main races/ethnicities investigated (Whites and Chinese). Not completing high school increased the likelihood of MCI (P = .01).
    Conclusion: Applying uniform criteria to harmonized data greatly reduced the variation in MCI prevalence internationally.
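    One note on the otherwise odd-looking 6.681% threshold (an inference about its origin, not a statement from the abstract): it is the lower-tail probability of a normal distribution at 1.5 standard deviations below the mean,

    P(Z \le -1.5) = \Phi(-1.5) \approx 0.06681,

    i.e. this definition classifies as cognitively impaired anyone scoring at least 1.5 SD below the normative mean.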

    History of clinical transplantation

    How transplantation came to be a clinical discipline can be pieced together by perusing two volumes of reminiscences collected by Paul I. Terasaki in 1991-1992 from many of the persons who were directly involved. One volume was devoted to the discovery of the major histocompatibility complex (MHC), with particular reference to the human leukocyte antigens (HLAs) that are widely used today for tissue matching [1]. The other focused on milestones in the development of clinical transplantation [2]. All the contributions described in both volumes can be traced back in one way or another to the demonstration in the mid-1940s by Peter Brian Medawar that the rejection of allografts is an immunological phenomenon [3,4]. © 2008 Springer New York

    Gendering the careers of young professionals: some early findings from a longitudinal study, in Organizing/theorizing: developments in organization theory and practice

    Wonders whether companies actually have employees' best interests at heart across physical, mental and spiritual spheres. Posits that most organizations ignore their workforce – not even, in many cases, describing workers as assets! Describes many studies to back up this claim in this work, which is based on the 2002 Employment Research Unit Annual Conference in Cardiff, Wales.