167 research outputs found

    Infant transport incubator cold performance

    The Mansell Infant Retrieval System is designed to allow clinical personnel to transport premature or critically ill infants to a medical centre under intensive-care conditions, using either a road ambulance, fixed-wing, or rotary-wing aircraft. The key component of the Mansell Infant Retrieval System is the Neocot, a controlled-environment capsule that accommodates the infant and must maintain a relatively constant internal temperature of 36 degrees Celsius. This project tests a newly designed heating element to determine whether, under certain environmental conditions, the Neocot operates in a manner complying with the current Australian standard IEC 60601-2-20. To meet the standard's requirements, the new heater in the Neocot needs to be set to 36 degrees Celsius and hold the internal temperature within ±3 degrees Celsius while the ambient external temperature is held at -5 degrees Celsius for a total period of 15 minutes. Once the 15 minutes have elapsed, the Neocot then needs to be placed in an external ambient temperature of between 20 and 25 degrees Celsius for a total period of 30 minutes. This means that, in order for the Neocot to pass these requirements, the internal temperature must not rise above 39 degrees Celsius or fall below 33 degrees Celsius. The test, undertaken using the temperature-logging equipment described in this project, showed that the new heater design successfully passed the requirements of IEC 60601-2-20: the new heater was capable of maintaining the internal temperature of the Neocot within the required range in a cold environment that exceeded the standard's test conditions. The dissertation details the steps undertaken in designing a calibrated device to properly measure the temperature deviations of the Neocot, as well as the results obtained from the cold test.
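    The pass criterion above reduces to a simple band check on the logged temperatures. A minimal sketch (illustrative only, not the logging software built in the project), assuming a 36 °C set point with the standard's ±3 °C tolerance:

    ```python
    # Illustrative sketch, not the dissertation's software: check a logged
    # temperature series against the IEC 60601-2-20 window described above,
    # i.e. a 36 degC set point that must stay within +/-3 degC (33-39 degC).

    SET_POINT_C = 36.0
    TOLERANCE_C = 3.0

    def within_standard(readings_c):
        """Return True if every logged reading stays within the allowed band."""
        low = SET_POINT_C - TOLERANCE_C   # 33 degC
        high = SET_POINT_C + TOLERANCE_C  # 39 degC
        return all(low <= t <= high for t in readings_c)

    # Hypothetical log sampled during the 15-minute cold phase:
    cold_phase = [36.0, 35.2, 34.6, 34.1, 33.8, 34.0]
    print(within_standard(cold_phase))  # True: all readings in 33-39 degC
    ```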

    KNODWAT: A scientific framework application for testing knowledge discovery methods for the biomedical domain

    BACKGROUND: Professionals in the biomedical domain are confronted with an increasing mass of data. Developing methods that assist these end users in identifying, extracting, visualizing, and understanding useful information from such large volumes of data is a major challenge. Moreover, with so many diverse methods and methodologies available, biomedical researchers who are inexperienced in even relatively popular knowledge discovery methods can find it very difficult to select the most appropriate method for their particular research problem. RESULTS: A web application called KNODWAT (KNOwledge Discovery With Advanced Techniques) has been developed using Java on the Spring Framework 3.1, following a user-centered approach. The software runs on Java 1.6 and above and requires a web server such as Apache Tomcat and a database server such as MySQL. Twitter Bootstrap was used for frontend functionality and styling, and jQuery for interactive user-interface operations. CONCLUSIONS: The framework presented is user-centric, highly extensible, and flexible. Because it enables methods to be tested on existing data to assess their suitability and performance, it is especially suitable for biomedical researchers who are new to the field of knowledge discovery and data mining. For testing purposes, two algorithms, CART and C4.5, were implemented using the WEKA data mining framework.
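    Of the two bundled algorithms, C4.5 grows decision trees by choosing the split with the highest information gain. A minimal sketch of that criterion (an illustration of the method only, not code from KNODWAT or WEKA):

    ```python
    # Minimal sketch of the information-gain criterion used by decision-tree
    # learners such as C4.5 (one of the two algorithms KNODWAT ships for
    # testing). Illustration only, not taken from KNODWAT or WEKA.
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels, in bits."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(labels, groups):
        """Entropy reduction achieved by splitting `labels` into `groups`."""
        n = len(labels)
        remainder = sum(len(g) / n * entropy(g) for g in groups)
        return entropy(labels) - remainder

    # Splitting a perfectly mixed sample into two pure groups gains 1 bit:
    labels = ["a", "a", "b", "b"]
    print(information_gain(labels, [["a", "a"], ["b", "b"]]))  # 1.0
    ```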

    Computer theorem proving in math

    We give an overview of issues surrounding computer-verified theorem proving in the standard pure-mathematical context. This is based on my talk at the PQR conference (Brussels, June 2003)

    Types with potential: polynomial resource bounds via automatic amortized analysis

    A primary feature of a computer program is its quantitative performance characteristics: the amount of resources such as time, memory, and power the program needs to perform its task. Concrete resource bounds for specific hardware have many important applications in software development, but their manual determination is tedious and error-prone. This dissertation studies the problem of automatically determining concrete worst-case bounds on the quantitative resource consumption of functional programs. Traditionally, automatic resource analyses are based on recurrence relations. The difficulty of both extracting and solving recurrence relations has led to the development of type-based resource analyses that are compositional, modular, and formally verifiable. However, existing automatic analyses based on amortization or sized types can only compute bounds that are linear in the sizes of the arguments of a function. This work presents a novel type system that derives polynomial bounds from first-order functional programs. As pioneered by Hofmann and Jost for linear bounds, it relies on the potential method of amortized analysis. Types are annotated with multivariate resource polynomials, a rich class of functions that generalize non-negative linear combinations of binomial coefficients. The main theorem states that type derivations establish resource bounds that are sound with respect to the resource consumption of programs, which is formalized by a big-step operational semantics. Simple local type rules allow for an efficient inference algorithm for the type annotations that relies on linear constraint solving only. This gives rise to an analysis system that is fully automatic if a maximal degree of the bounding polynomials is given. The analysis is generic in the resource of interest and can derive bounds on time and space usage. The bounds are naturally closed under composition and are eventually summarized in closed, easily understood formulas.
The practicability of this automatic amortized analysis is verified with a publicly available implementation and a reproducible experimental evaluation. The experiments, with a wide range of examples from functional programming, show that the inference of the bounds takes only a couple of seconds in most cases. The derived heap-space and evaluation-step bounds are compared with the measured worst-case behavior of the programs. Most bounds are asymptotically tight, and the constant factors are close or even identical to the optimal ones. For the first time, we are able to automatically and precisely analyze the resource consumption of involved programs such as quicksort for lists of lists, longest common subsequence via dynamic programming, and multiplication of a list of matrices with different, fitting dimensions.
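    As a concrete illustration of the kind of bound such an analysis certifies (a sketch only, not the dissertation's analyser, which targets first-order functional programs), the worst-case comparison count of insertion sort is the polynomial n(n-1)/2, and that bound is reached on reverse-sorted input:

    ```python
    # Illustration of a concrete, tight polynomial resource bound: the
    # comparison count of insertion sort is at most n*(n-1)/2, with the
    # worst case reached on reverse-sorted input. Not the thesis's tool.

    def insertion_sort(xs):
        """Sort a list; return (sorted list, number of comparisons made)."""
        xs = list(xs)
        comparisons = 0
        for i in range(1, len(xs)):
            j = i
            while j > 0:
                comparisons += 1          # count each element comparison
                if xs[j - 1] > xs[j]:
                    xs[j - 1], xs[j] = xs[j], xs[j - 1]
                    j -= 1
                else:
                    break                 # already in place, stop early
        return xs, comparisons

    n = 8
    _, worst = insertion_sort(range(n, 0, -1))  # reverse-sorted: worst case
    print(worst == n * (n - 1) // 2)  # True: the quadratic bound is tight
    ```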

    Assessing European wheat sensitivities to Parastagonospora nodorum necrotrophic effectors and fine-mapping the Snn3-B1 locus conferring sensitivity to the effector SnTox3

    © 2018 Downie, Bouvet, Furuki, Gosman, Gardner, Mackay, Campos Mantello, Mellers, Phan, Rose, Tan, Oliver and Cockram. Parastagonospora nodorum is a necrotrophic fungal pathogen of wheat (Triticum aestivum L.), one of the world's most important crops. P. nodorum mediates host cell death using proteinaceous necrotrophic effectors, presumably liberating nutrients that allow the infection process to continue. The identification of pathogen effectors has allowed host genetic resistance mechanisms to be separated into their constituent parts. In P. nodorum, three proteinaceous effectors have been cloned: SnToxA, SnTox1, and SnTox3. Here, we survey sensitivity to all three effectors in a panel of 480 European wheat varieties and fine-map the wheat SnTox3 sensitivity locus Snn3-B1 using genome-wide association scans (GWAS) and an eight-founder wheat multi-parent advanced generation inter-cross (MAGIC) population. Using a Bonferroni-corrected P = 0.05 significance threshold, GWAS identified 10 significant markers defining a single locus, Snn3-B1, located on the short arm of chromosome 5B and explaining 32% of the phenotypic variation [peak single nucleotide polymorphisms (SNPs) Excalibur_c47452_183 and GENE-3324_338, −log10 P = 20.44]. Single-marker analysis of SnTox3 sensitivity in the MAGIC population located Snn3-B1 via five significant SNPs, defining a 6.2-kb region that included the two peak SNPs identified in the association mapping panel. Accordingly, SNP Excalibur_c47452_183 was converted to the KASP genotyping system and validated by screening a subset of 95 wheat varieties, providing a valuable resource for marker-assisted breeding and for further genetic investigation.
In addition, composite interval mapping in the MAGIC population identified six minor SnTox3 sensitivity quantitative trait loci, on chromosomes 2A (QTox3.niab-2A.1, P = 9.17 × 10⁻⁷), 2B (QTox3.niab-2B.1, P = 0.018), 3B (QTox3.niab-3B.1, P = 48.51 × 10⁻⁴), 4D (QTox3.niab-4D.1, P = 0.028), 6A (QTox3.niab-6A.1, P = 8.51 × 10⁻⁴), and 7B (QTox3.niab-7B.1, P = 0.020), each accounting for between 3.1 and 6.0% of the phenotypic variance. Collectively, the outcomes of this study provide breeders with knowledge and resources regarding the sensitivity of European wheat germplasm to P. nodorum effectors, as well as simple diagnostic markers for determining allelic state at Snn3-B1.
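    The Bonferroni-corrected threshold used above is simply the experiment-wide α divided by the number of markers tested, usually reported on the −log10(P) scale. A sketch of the computation; the marker count below is a placeholder, since the abstract does not state how many SNPs were genotyped:

    ```python
    # Sketch of the Bonferroni-corrected significance threshold used in the
    # GWAS above, on the -log10(P) scale. The marker count is hypothetical;
    # the abstract does not report how many SNPs were tested.
    import math

    def bonferroni_log10_threshold(alpha, n_markers):
        """-log10 of the per-marker threshold alpha / n_markers."""
        return -math.log10(alpha / n_markers)

    n_markers = 20_000  # placeholder array size, not from the study
    threshold = bonferroni_log10_threshold(0.05, n_markers)
    # A peak SNP at -log10(P) = 20.44 far exceeds any plausible threshold here.
    print(round(threshold, 2))
    ```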

    Evaluation of Personnel Parameters in Software Cost Estimating Models

    Software capabilities have steadily increased over the last half century. The Department of Defense has seized on this increased capability and used it to advance the warfighter's weapon systems. However, this dependence on software capabilities has come at enormous cost. The risks of software development must be understood in order to develop an accurate cost estimate.

    Bootstrapping Inductive and Coinductive Types in HasCASL

    We discuss the treatment of initial datatypes and final process types in the wide-spectrum language HasCASL. In particular, we present specifications that illustrate how datatypes and process types arise as bootstrapped concepts using HasCASL's type class mechanism, and we describe constructions of types of finite and infinite trees that establish the conservativity of datatype and process type declarations adhering to certain reasonable formats. The latter amounts to modifying known constructions from HOL to avoid unique choice; in categorical terminology, this means that we establish that quasitoposes with an internal natural numbers object support initial algebras and final coalgebras for a range of polynomial functors, thereby partially generalising corresponding results from topos theory. Moreover, we present similar constructions in categories of internal complete partial orders in quasitoposes
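    In standard categorical notation (assumed here, not quoted from the paper), the existence results can be summarised as follows: for a suitable polynomial functor F on a quasitopos with a natural numbers object, both the initial F-algebra (the datatype of finite trees) and the final F-coalgebra (the process type of possibly infinite trees) exist, and by Lambek's lemma their structure maps are isomorphisms:

    ```latex
    % Initial algebra \mu F and final coalgebra \nu F of a polynomial functor F;
    % Lambek's lemma makes both structure maps isomorphisms.
    F(\mu F) \;\cong\; \mu F, \qquad \nu F \;\cong\; F(\nu F)
    ```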

    Activities of the Remote Sensing Information Sciences Research Group

    Topics in the analysis and processing of remotely sensed data are investigated in the areas of vegetation analysis and modelling, georeferenced information systems, machine-assisted information extraction from image data, and artificial intelligence. Discussions of supporting field data and specific applications of the proposed technologies are also included.