    Final Report: Improved Site Characterization And Storage Prediction Through Stochastic Inversion Of Time-Lapse Geophysical And Geochemical Data

    During the last months of this project, our activities concentrated on four areas: (1) performing a stochastic inversion of pattern 16 seismic data to deduce reservoir bulk/shear moduli and density, an inversion not anticipated in the original scope of work; (2) performing a stochastic inversion of pattern 16 seismic data to deduce reservoir porosity and permeability; (3) completing the software needed to perform geochemical inversions; and (4) using that software to perform a stochastic inversion of aqueous chemistry data to deduce mineral volume fractions. This report builds on work described in previously submitted progress reports (Ramirez et al., 2009, 2010, 2011, which fulfilled deliverables D1-D4) and fulfills deliverable D5: Field-based single-pattern simulations work product. The main challenge with our stochastic inversion approach is its large computational expense, even for single reservoir patterns. We dedicated a significant level of effort to improving computational efficiency, but inversions involving multiple patterns were still intractable by the project's end. As a result, we were unable to fulfill deliverable D6: Field-based multi-pattern simulations work product.
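
    As a sketch of why such stochastic inversions are computationally expensive, the hypothetical Metropolis-Hastings loop below inverts two reservoir parameters (porosity and permeability): every proposed model requires a forward simulation, and a realistic forward model is far costlier than the toy stand-in used here. All names, priors, and data in this example are invented for illustration.

```python
# Minimal sketch of a stochastic (Bayesian/MCMC) inversion loop.
# The forward model, priors, and "observed" data are hypothetical
# stand-ins, not the project's reservoir or seismic simulator.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(porosity, permeability):
    # Hypothetical placeholder for an expensive flow/seismic simulation.
    return np.array([2.0 * porosity + np.log(permeability),
                     porosity - 0.1 * permeability])

observed = np.array([1.5, 0.1])   # synthetic "field" data
sigma = 0.05                      # assumed observation noise

def log_likelihood(theta):
    porosity, permeability = theta
    if not (0.0 < porosity < 0.4 and permeability > 0.0):
        return -np.inf            # flat prior with physical bounds
    resid = forward_model(porosity, permeability) - observed
    return -0.5 * np.sum((resid / sigma) ** 2)

theta = np.array([0.2, 1.0])
current = log_likelihood(theta)
samples = []
for _ in range(5000):
    proposal = theta + rng.normal(scale=[0.01, 0.05])
    cand = log_likelihood(proposal)   # one forward run per proposal
    if np.log(rng.uniform()) < cand - current:
        theta, current = proposal, cand
    samples.append(theta)

posterior = np.array(samples)[1000:]  # discard burn-in
print(posterior.mean(axis=0))         # posterior mean of the two parameters
```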

    Syndromic surveillance using veterinary laboratory data: algorithm combination and customization of alerts

    Background: Syndromic surveillance research has focused on two main themes: the search for data sources that can provide early disease detection, and the development of efficient algorithms that can detect potential outbreak signals. Methods: This work combines three algorithms that have demonstrated solid performance in detecting simulated outbreak signals of varying shapes in time series of laboratory submission counts: Shewhart control charts, designed to detect sudden spikes in counts; EWMA control charts, developed to detect slowly increasing outbreaks; and Holt-Winters exponential smoothing, which can explicitly account for temporal effects in the monitored data stream. A scoring system to detect and report alarms using these algorithms in a complementary way is proposed. Results: The use of multiple algorithms in parallel resulted in increased system sensitivity. Specificity was decreased in simulated data, but the number of false alarms per year when the approach was applied to real data was considered manageable (between 1 and 3 per year for each of ten syndromic groups monitored). The automated implementation of this approach, including a method for on-line filtering of potential outbreak signals, is described. Conclusion: The developed system provides high sensitivity for detection of potential outbreak signals while also providing robustness and flexibility in establishing which signals constitute an alarm. This flexibility allows an analyst to customize the system for different syndromes.
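
    The combination scheme lends itself to a simple implementation: run the detectors in parallel and pool their binary alarms into a score. The sketch below is a minimal illustration of that idea; the thresholds, smoothing constants, and simulated series are assumptions, not the published settings.

```python
# Run three detectors in parallel and pool their alarms into a 0-3 score.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def shewhart(series, k=3.0):
    # Flags sudden spikes: observations beyond k standard deviations.
    return series > series.mean() + k * series.std()

def ewma(series, lam=0.2, k=3.0):
    # Flags slow increases: EWMA statistic above its asymptotic limit.
    stat = series.ewm(alpha=lam).mean()
    limit = series.mean() + k * series.std() * np.sqrt(lam / (2 - lam))
    return stat > limit

def holt_winters(series, period=7, k=3.0):
    # Flags departures from a seasonal fit via large positive residuals.
    fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                               seasonal_periods=period).fit()
    resid = series - fit.fittedvalues
    return resid > k * resid.std()

counts = pd.Series(np.random.default_rng(1).poisson(20, 365).astype(float))
alarms = pd.DataFrame({"shewhart": shewhart(counts),
                       "ewma": ewma(counts),
                       "hw": holt_winters(counts)})
score = alarms.sum(axis=1)     # number of detectors agreeing
print(counts[score >= 2])      # report days flagged by at least two
```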

    Factors associated with whole carcass condemnation rates in provincially-inspected abattoirs in Ontario, 2001-2007: implications for food animal syndromic surveillance

    Background: Ontario provincial abattoirs have the potential to be important sources of syndromic surveillance data for emerging diseases of concern to animal health, public health and food safety. The objectives of this study were to: (1) describe provincially inspected abattoirs processing cattle in Ontario in terms of the number of abattoirs, the number of weeks abattoirs process cattle, geographical distribution, types of whole carcass condemnations reported, and the distance animals are shipped for slaughter; and (2) identify various seasonal, secular, disease and non-disease factors that might bias the results of quantitative methods, such as cluster detection methods, used for food animal syndromic surveillance. Results: Data were collected from the Ontario Ministry of Agriculture, Food and Rural Affairs and the Ontario Cattlemen's Association regarding whole carcass condemnation rates for cattle animal classes, abattoir compliance ratings, and the monthly sales-yard price for various cattle classes from 2001-2007. To analyze the association between condemnation rates and potential explanatory variables, including abattoir characteristics, season, year, commodity price and animal class, negative binomial regression models were fit using generalized estimating equations (GEE) to account for autocorrelation among observations from the same abattoir. The fitted model found that animal class, year, season, price, and audit rating are associated with condemnation rates in Ontario abattoirs. In addition, a subset of data was used to estimate the average distance cattle are shipped to Ontario provincial abattoirs: the median distance from farm to abattoir was approximately 82 km, and 75% of cattle were shipped less than 100 km. Conclusions: The results suggest that secular and seasonal trends, as well as some non-disease factors, will need to be corrected for when applying quantitative methods for syndromic surveillance involving these data. This study also demonstrated that animals shipped to Ontario provincial abattoirs come from relatively local farms, which is important when considering the use of spatial surveillance methods for these data.
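
    For readers wanting to prototype the modelling approach, a negative binomial regression fitted with GEE can be set up in statsmodels as in the minimal sketch below. The data frame, column names, exposure variable, and correlation structure are hypothetical stand-ins for the condemnation data.

```python
# Negative binomial GEE with clustering by abattoir (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 480  # e.g. 20 abattoirs x 24 months
df = pd.DataFrame({
    "abattoir": np.repeat(np.arange(20), 24),
    "condemned": rng.poisson(3, n),
    "slaughtered": rng.integers(50, 500, n),        # exposure
    "season": np.tile(["winter", "spring", "summer", "fall"], n // 4),
    "year": rng.integers(2001, 2008, n),
    "price": rng.normal(100, 10, n),
    "animal_class": rng.choice(["cow", "bull", "heifer", "steer"], n),
})

model = smf.gee(
    "condemned ~ C(animal_class) + C(season) + year + price",
    groups="abattoir",                     # repeated measures per abattoir
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["slaughtered"]),      # models a condemnation *rate*
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```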

    Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine

    Background: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes (syndromic surveillance) using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. Methods: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. Results: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), because the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed interpretability equal to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to domain experts. Conclusion: The use of a manually customised rule set allowed the development of a system for classifying laboratory tests into syndromic groups with very high performance and high interpretability for domain experts. Further research is required to develop internal validation rules in order to establish automated methods to update model rules without user input.
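
    The comparison described above can be reproduced in miniature with scikit-learn: Naïve Bayes and a Decision Tree on bag-of-words features, scored with both F1-micro and F1-macro. The toy test orders and syndrome labels below are invented; real submissions would replace them.

```python
# Contrast F1-micro vs F1-macro for two classifiers on toy test orders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

orders = ["fecal culture salmonella", "serum chemistry panel",
          "fecal flotation parasites", "CBC haematology",
          "salmonella PCR feces", "liver enzyme panel"] * 20
labels = ["GI", "systemic", "GI", "systemic", "GI", "systemic"] * 20

X_train, X_test, y_train, y_test = train_test_split(
    orders, labels, test_size=0.25, random_state=0)

vec = CountVectorizer()
Xtr, Xte = vec.fit_transform(X_train), vec.transform(X_test)

for clf in (MultinomialNB(), DecisionTreeClassifier(random_state=0)):
    pred = clf.fit(Xtr, y_train).predict(Xte)
    # F1-micro weights every record equally; F1-macro weights every class
    # equally, exposing classifiers that fail to learn rare classes.
    print(type(clf).__name__,
          "micro:", f1_score(y_test, pred, average="micro"),
          "macro:", f1_score(y_test, pred, average="macro"))
```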

    Managing Dynamic User Communities in a Grid of Autonomous Resources

    One of the fundamental concepts in Grid computing is the creation of Virtual Organizations (VOs): sets of resource consumers and providers that join forces to solve a common problem. Typical examples of Virtual Organizations include the collaborations formed around the Large Hadron Collider (LHC) experiments. To date, Grid computing has been applied on a relatively small scale, linking dozens of users to a dozen resources, and management of these VOs has been a largely manual operation. With the advent of large collaborations, linking more than 10,000 users with 1,000 sites in 150 countries, a comprehensive, automated management system is required. It should be simple enough not to deter users, while at the same time ensuring local site autonomy. The VO Membership Service (VOMS), developed by the EU DataGrid and DataTAG projects [1, 2], is a secure system for managing authorization for users and resources in virtual organizations. It extends the existing Grid Security Infrastructure [3] architecture with embedded VO affiliation assertions that can be independently verified by all VO members and resource providers. Within the EU DataGrid project, Grid services for job submission and for file and database access are being equipped with fine-grained authorization systems that take VO membership into account. These also give resource owners the ability to ensure site security and enforce local access policies. This paper describes the EU DataGrid security architecture, the VO membership service and the local site enforcement mechanisms: the Local Centre Authorization Service (LCAS), the Local Credential Mapping Service (LCMAPS) and the Java Trust and Authorization Manager. Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003; 7 pages, LaTeX, 5 EPS figures. PSN TUBT00
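
    The enforcement pattern, a coarse site-level allow/deny decision followed by mapping a verified VO attribute to a local credential, can be sketched as below. The policy table, group names, and DNs are invented for illustration; real VOMS assertions are signed attribute certificates carried in the user's proxy, not plain strings.

```python
# Toy sketch of LCAS/LCMAPS-style site enforcement (invented policy data).
from dataclasses import dataclass

@dataclass
class VOAssertion:
    vo: str
    group: str     # e.g. "/atlas/production"
    role: str      # e.g. "Role=NULL"

# Local site policy: which VO attributes map to which local account pools.
SITE_POLICY = {
    ("atlas", "/atlas/production"): "atlasprd",
    ("atlas", "/atlas"): "atlas001",
}

# LCAS-style ban list: the site retains autonomy to refuse anyone.
BANNED_USERS = {"/DC=org/DC=example/CN=revoked user"}

def authorize(subject_dn: str, assertion: VOAssertion) -> str | None:
    # LCAS-like step: coarse-grained yes/no decision enforcing site policy.
    if subject_dn in BANNED_USERS:
        return None
    # LCMAPS-like step: map the verified VO attribute to a local credential.
    return SITE_POLICY.get((assertion.vo, assertion.group))

local = authorize("/DC=org/DC=example/CN=jane doe",
                  VOAssertion("atlas", "/atlas/production", "Role=NULL"))
print(local)   # -> "atlasprd", the local account the job would run under
```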

    Enterprise architecture driven design of an artefact to support strategic Information Technology decision-making of Small Enterprises in Nigeria and South Africa

    Information Technology (IT) is inevitably influencing the way enterprises operate, compete, and grow, and this contemporary disruption has not excluded small companies. Small enterprises play a significant role in the growth of every economy but are hindered by limited skills, time, and money, and these attributes influence both their strategic and day-to-day operations. Small enterprise owners are often the managers who make the strategic decisions to solve specific problems, and the decision style of owner-managers limits the leveraging of IT. To ensure the sustainability of small enterprises in a contemporary business ecosystem, IT investment decisions must be made strategically. Enterprise architecture is a well-known approach to business-IT alignment. This study aimed to discover and develop how complex enterprise architecture principles can support the strategic IT decision-making process in small enterprises with limited resources and informal structures. A pragmatic philosophical stance was the premise for understanding the decision challenges and developing a roadmap to address the problems the researcher identified. The Vaishnavi and Kuechler design science research methodology guided the study. A qualitative research approach was employed to collect verbal data from eleven small enterprise owner-managers to understand the processes and challenges of making IT decisions in small enterprises. A thematic analysis of the findings revealed that lack of formalisation, limited information, and lack of IT skills created a critical bottleneck in IT investment decisions. An enterprise architecture-driven framework was developed to overcome the bounded rationality approach to IT choices in small enterprises; the framework holistically assesses organisational business-IT capabilities, constraints, and criteria to guide the decision-maker's choice. Because the characteristics of small enterprises limit the successful implementation of the framework as a purely theoretical guideline, the study further developed an online IT decision-assistive tool informed by the framework. The instantiation artefact was demonstrated with six small enterprise owners from Nigeria and South Africa. The findings affirmed the prospect, potential, and relevance of an enterprise architecture-driven artefact as a tool to optimise strategic IT decisions in small manufacturing, service, and retail enterprises, and the artefact provided a practical intervention to the challenges of IT investment decisions in small enterprises.
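
    As a purely illustrative sketch of what a decision-assistive scoring step might look like, the snippet below ranks hypothetical IT options by a weighted sum over capability, constraint, and strategic criteria. The options, criteria, and weights are invented; the study's actual framework is considerably richer.

```python
# Toy weighted multi-criteria scoring for IT options (invented data).
OPTIONS = {
    "cloud accounting SaaS": {"cost_fit": 4, "skill_fit": 5, "strategic_fit": 3},
    "on-premise ERP":        {"cost_fit": 2, "skill_fit": 2, "strategic_fit": 5},
}
WEIGHTS = {"cost_fit": 0.5, "skill_fit": 0.3, "strategic_fit": 0.2}  # sum to 1

def score(ratings: dict[str, int]) -> float:
    # Weighted sum of 1-5 ratings; hard constraints could veto before scoring.
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

for name, ratings in sorted(OPTIONS.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(ratings):.2f}")
```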

    Type I IFN induces IL-10 production in an IL-27-independent manner and blocks responsiveness to IFN-gamma for production of IL-12 and bacterial killing in Mycobacterium tuberculosis-infected macrophages

    Tuberculosis, caused by the intracellular bacterium Mycobacterium tuberculosis, currently causes ~1.4 million deaths per year, and it therefore remains a leading global health problem. The immune response during tuberculosis remains incompletely understood, particularly regarding immune factors that are harmful rather than protective to the host. Overproduction of the type I IFN family of cytokines is associated with exacerbated tuberculosis in both mouse models and in humans, although the mechanisms by which type I IFN promotes disease are not well understood. We have investigated the effect of type I IFN on M. tuberculosis-infected macrophages and found that production of host-protective cytokines such as TNF-α, IL-12, and IL-1β is inhibited by exogenous type I IFN, whereas production of immunosuppressive IL-10 is promoted in an IL-27-independent manner. Furthermore, much of the ability of type I IFN to inhibit cytokine production was mediated by IL-10. Additionally, type I IFN compromised macrophage activation by the lymphoid immune response through severely disrupting responsiveness to IFN-γ, including M. tuberculosis killing. These findings describe important mechanisms by which type I IFN inhibits the immune response during tuberculosis. This work was funded by Medical Research Council, U.K. Grant U117565642 and European Research Council Grant 294682-TB-PATH. M.S. and L.M.-T. were funded by the Fundação para a Ciência e Tecnologia, Portugal. M.S. is a Fundação para a Ciência e Tecnologia, Portugal, investigator. L.M.-T. was supported by Fundação para a Ciência e Tecnologia, Portugal, Grant SFRH/BPD/77399/2011.

    Syndromic surveillance using veterinary laboratory data: data pre-processing and algorithm performance evaluation

    Diagnostic test orders to an animal laboratory were explored as a data source for monitoring trends in the incidence of clinical syndromes in cattle. Four years of real data and over 200 simulated outbreak signals were used to compare pre-processing methods that could remove temporal effects in the data, as well as temporal aberration detection algorithms that provided high sensitivity and specificity. Weekly differencing demonstrated solid performance in removing day-of-week effects, even in series with low daily counts. For aberration detection, the results indicated that no single algorithm showed performance superior to all others across the range of outbreak scenarios simulated. Exponentially weighted moving average charts and Holt-Winters exponential smoothing demonstrated complementary performance, with the latter offering an automated method to adjust to changes in the time series that are likely to occur in the future. Shewhart charts provided lower sensitivity but earlier detection in some scenarios. Cumulative sum charts did not appear to add value to the system; however, the poor performance of this algorithm was attributed to characteristics of the data monitored. These findings indicate that automated monitoring aimed at early detection of temporal aberrations will likely be most effective when a range of algorithms is implemented in parallel.
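
    The best-performing pre-processing and detection pairing reported above is straightforward to prototype: lag-7 differencing to remove day-of-week effects, followed by an EWMA chart on the differenced series. The sketch below uses simulated counts and an illustrative control limit, not the evaluated settings.

```python
# Weekly differencing followed by an EWMA control chart (simulated data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
dow = np.tile([5, 20, 22, 21, 19, 18, 4], 52)   # strong day-of-week pattern
counts = pd.Series(rng.poisson(dow).astype(float))

diff7 = counts.diff(7).dropna()                 # lag-7 (weekly) differencing
lam = 0.3
ewma = diff7.ewm(alpha=lam).mean()
# Asymptotic EWMA control limit: k * sigma * sqrt(lam / (2 - lam)).
limit = 3 * diff7.std() * np.sqrt(lam / (2 - lam))

alarms = ewma[ewma > limit]
print(f"{len(alarms)} alarm days out of {len(diff7)}")
```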