
    Prototype resupply scheduler

    Resupply scheduling for the Space Station presents some formidable logistics problems. One of the most basic is assigning supplies to a series of shuttle resupply missions. A prototype logistics expert system that constructs resupply schedules was developed. This prototype is able to reconstruct feasible resupply plans. In addition, analysts can use the system to evaluate the impact of adding, deleting, or modifying launches, cargo space, experiments, etc.
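    The core assignment problem this abstract describes can be sketched as a simple packing heuristic. This is a hypothetical illustration only: the item names, masses, capacities, and the first-fit-descending rule are invented here, not taken from the prototype expert system.

    ```python
    # Hypothetical sketch of the basic problem: assign supply items to a
    # series of resupply flights, each with a fixed cargo capacity.
    # A first-fit-descending heuristic stands in for the expert system's
    # (unpublished) scheduling rules.

    def schedule_resupply(items, flight_capacity, n_flights):
        """Assign (name, mass) items to flights, first-fit by descending mass."""
        flights = [[] for _ in range(n_flights)]
        loads = [0.0] * n_flights
        unassigned = []
        for name, mass in sorted(items, key=lambda x: -x[1]):
            for i in range(n_flights):
                if loads[i] + mass <= flight_capacity:
                    flights[i].append(name)
                    loads[i] += mass
                    break
            else:
                unassigned.append(name)  # infeasible within the given flights
        return flights, unassigned

    # Illustrative manifest: analysts could rerun this after adding or
    # deleting launches or changing cargo space to see the impact.
    flights, leftover = schedule_resupply(
        [("water", 4.0), ("food", 3.0), ("experiment-A", 2.5), ("spares", 1.0)],
        flight_capacity=7.0, n_flights=2)
    ```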

    The Ukraine crisis: a problem of trust

    Many observers now believe a war between Russia and Ukraine is inevitable. Jim Hughes explains how the erosion of trust between Russia and the West has brought us to the brink of a conflict that could have far-reaching consequences for Europe.

    Estimating the accuracy of polymerase chain reaction-based tests using endpoint dilution

    PCR-based tests for various microorganisms or target DNA sequences are generally acknowledged to be highly sensitive, yet the concept of sensitivity is ill-defined in the literature on these tests. We propose that sensitivity should be expressed as a function of the number of target DNA molecules in the sample (or as specificity when the target number is 0). However, estimating this sensitivity curve is problematic, since it is difficult to construct samples with a fixed number of targets. Nonetheless, using serially diluted replicate aliquots of a known concentration of the target DNA sequence, we show that it is possible to disentangle random variation in the number of target DNA molecules from the underlying test sensitivity. We develop parametric, nonparametric, and semiparametric (spline-based) models for the sensitivity curve. The methods are compared on a new test for M. genitalium.
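    As an illustration of the parametric case (this is a standard single-hit model, not necessarily the paper's estimator), sensitivity can be written as a function of the molecule count, and a Poisson model for the dilution series links that curve to the observed positive rate:

    ```python
    import math

    # Single-hit parametric sketch: theta is the per-molecule detection
    # probability, so a sample with exactly n target molecules tests
    # positive with probability 1 - (1 - theta)^n.

    def sensitivity(n, theta):
        """P(test positive | exactly n target molecules)."""
        return 1.0 - (1.0 - theta) ** n

    # In a serial dilution, the molecule count in an aliquot is random.
    # Modelling it as Poisson(lam) averages the curve over that randomness:
    # E[1 - (1 - theta)^N] = 1 - exp(-lam * theta) for N ~ Poisson(lam).
    def expected_positive_rate(lam, theta):
        """P(test positive) when the target count is Poisson(lam)."""
        return 1.0 - math.exp(-lam * theta)
    ```

    This separation is the point of the endpoint-dilution design: the observed positive rate mixes random variation in molecule counts (the Poisson term) with the underlying per-molecule sensitivity (theta), and fitting across dilution levels lets the two be disentangled.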

    Understanding IS Evaluation as a Complex Social Process

    There is increasing concern that information systems (IS) are not delivering anticipated value and benefits. There is a push for the development and adoption of improved evaluation metrics in an attempt to better quantify IS benefits, and this has led to a growing number of well-developed methods for assessing returns. In this paper we take stock of the current situation and ask whether improvement lies not in the development of better quantitative methods, but rather in better understanding the experiences of multiple IS stakeholders. Using case material and current literature in IS/IT evaluation, we draw predominantly upon the work of Heidegger and Suchman to explore the concept of IS evaluation as a highly complex social process. The analysis leads to an understanding of situated (context-dependent) IS evaluation, which suggests that interpretive evaluation methods may play a key role in helping practitioners and academics understand the complexity surrounding this area.

    Reflections on the Use of Grounded Theory in Interpretive Information Systems Research

    In Information Systems research there is a growing number of studies that must necessarily draw upon the contexts, experiences, and narratives of practitioners. This calls for research approaches that are qualitative and may also be interpretive, including case studies or action research projects. For some researchers, particularly those with limited experience of interpretive qualitative research, there may be a lack of confidence when faced with the prospect of collecting and analysing the data from studies of this kind. In this paper we reflect on the lessons learned from using Grounded Theory in an interpretive, case-study-based piece of research. The paper discusses these lessons and provides guidance for the use of the method in interpretive studies.

    Efficiently Identifying Failures using Quantitative Tests, Matrix-Pooling and the EM-Algorithm

    Pooled-testing methods can greatly reduce the number of tests needed to identify failures in a collection of samples. Existing methodology has focused primarily on binary tests, but there is a clear need for improved efficiency when using expensive quantitative tests, such as tests for HIV viral load in resource-limited settings. We propose a matrix-pooling method which, based on pooled-test results, uses the EM algorithm to identify the individual samples most likely to be failures. Two hundred datasets for each of a wide range of failure prevalence levels were simulated to test the method. When the measurement of interest was normally distributed, at a failure prevalence level of 15.6% the EM method yielded a 47.3% reduction in the number of tests needed to identify failures (as compared to testing each specimen individually). These results are somewhat better than the reduction gained by using the Simple Search method (44.9%) previously published by May et al. (2010). However, the EM procedure was able to identify failures in just 2.6 testing rounds, on average, as compared to an average of 19.2 testing rounds required by Simple Search. In settings where the turn-around time for testing services is significant, the reduction in testing rounds provided by the EM method is substantial. Unfortunately, the EM method does not perform as well when the measurements of interest are highly skewed, as is often the case with viral load concentrations.
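    The matrix-pooling idea can be illustrated with a simplified sketch (the paper's EM refinement is omitted; the grid values and threshold below are invented). Samples are laid out in a grid, each row pool and column pool is measured once, and a sample is flagged when both its row total and its column total are elevated:

    ```python
    # Simplified matrix-pooling illustration (no EM step): with r rows and
    # c columns, r + c pooled measurements cover r * c samples. A failure
    # inflates both the row pool and the column pool it belongs to, so
    # candidate failures sit at the intersections of elevated rows/columns.

    def flag_candidates(grid, threshold):
        """Return (i, j) positions whose row and column pool sums both exceed threshold."""
        row_sums = [sum(row) for row in grid]
        col_sums = [sum(col) for col in zip(*grid)]
        return [(i, j)
                for i, r in enumerate(row_sums) if r > threshold
                for j, c in enumerate(col_sums) if c > threshold]

    # 3x3 grid of quantitative measurements; one high-value failure at (1, 2).
    grid = [[0.1, 0.2, 0.1],
            [0.1, 0.1, 9.0],
            [0.2, 0.1, 0.1]]
    print(flag_candidates(grid, threshold=1.0))  # → [(1, 2)]
    ```

    With multiple failures this intersection rule over-flags (every elevated row crossed with every elevated column), which is where an EM step over the quantitative pool values earns its keep: it ranks the candidate cells by likelihood so that only the most probable failures are retested.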

    Data handling methods and target detection results for multibeam and sidescan data collected as part of the search for SwissAir Flight 111

    The crash of SwissAir Flight 111, off Nova Scotia in September 1998, triggered one of the largest seabed search surveys in Canadian history. The primary search tools used were sidescan sonars (both conventional and focussed types) and multibeam sonars. The processed search data needed to be distributed on a daily basis to other elements of the fleet for precise location of divers and other optical seabed search instruments (including laser linescan and ROV video). As a result of the glacial history of the region, many natural targets, similar in gross nature to aircraft debris, were present. These included widespread linear bedrock outcrop patterns together with near-ubiquitous glacial erratic boulders. Because of the severely broken-up nature of the remaining aircraft debris, sidescan imaging alone was often insufficient to unambiguously identify targets. The complementary attributes of higher-resolution but poorly located sidescan imagery and slightly lower-resolution but excellently navigated multibeam sonar proved to be one of the critical factors in the success of the search. It proved necessary to rely heavily on the regional context of the seabed (provided by the multibeam sonar bathymetry and backscatter imagery) to separate natural geomorphic targets from anomalous anthropogenic debris. In order to confidently prove or disprove a potential target, the interpreter required simultaneous access to the full-resolution sidescan data in the geographic context of the multibeam framework. Specific software tools had to be adapted or developed shipboard to provide this capability. Whilst developed specifically for this application, these survey tools can provide improved processing speed and confidence as part of more general mine hunting, hydrographic, engineering, or scientific surveys.

    BSC Football: The Swenson Era

    A recollection of Bridgewater State College Coach Edward Swenson’s efforts to bring varsity-level football back to a Massachusetts state college, and the stories of the trials and tribulations of his first eight years as head coach. The book is a Bridgewater State University Football Alumni project inspired by many of Coach Swenson’s former players, commemorating the coach and several of their former teammates.