8,476 research outputs found

    Inertial Upper Stage (IUS) software analysis

    The Inertial Upper Stage (IUS) system, which extends the Space Transportation System (STS) operating regime to higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories, is presented. The IUS software design, its interfaces with other systems, and the cost effectiveness of software verification are described. IUS tasks discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.

    Certification of Real Inequalities -- Templates and Sums of Squares

    We consider the problem of certifying lower bounds for real-valued multivariate transcendental functions. The functions we deal with are nonlinear and involve semialgebraic operations as well as transcendental functions such as cos, arctan, and exp. Our general framework is to use different approximation methods to relax the original problem into polynomial optimization problems, which we solve by sparse sums of squares relaxations. In particular, we combine the ideas of maxplus estimators (originally introduced in optimal control) and linear templates (originally introduced in static analysis by abstract interpretation). The nonlinear templates control the complexity of the semialgebraic relaxations at the price of coarsening the maxplus approximations. In this way, we arrive at a new, template-based certified global optimization method, which exploits both the precision of sums of squares relaxations and the scalability of abstraction methods. We analyze the performance of the method on problems from the global optimization literature, as well as on medium-size inequalities arising from the Flyspeck project.

    Comment: 27 pages, 3 figures, 4 tables.
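    To give a flavor of the maxplus estimators mentioned above: for a convex transcendental function such as exp, tangent lines at sample points stay below the graph, so their pointwise maximum is a certified piecewise-affine lower bound. The sketch below states this for exp with illustrative sample points a_1, ..., a_k; the paper's actual estimators and parameter choices are more refined.

```latex
% Maxplus lower estimator for exp (a minimal illustrative instance):
% each tangent at a_i lies below the graph by convexity, hence so does
% the pointwise maximum of the tangents.
\[
  \exp(x) \;\ge\; \max_{1 \le i \le k} \, e^{a_i}\bigl(1 + x - a_i\bigr)
  \qquad \text{for all } x \in \mathbb{R}.
\]
% Substituting this piecewise-affine estimator for exp leaves a purely
% semialgebraic problem, which sums-of-squares relaxations can certify.
```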

    De-commoditizing Ethiopian coffees after the establishment of the Ethiopian Commodity Exchange : an empirical investigation of smallholder coffee producers in Ethiopia

    The repercussions of reforming an agricultural market are mainly observed at the most vulnerable segment of the value chain, namely the producers. In the current commodity market created by trade through the Ethiopian Commodity Exchange (ECX), coffee is less traceable to its producers. Only cooperatives that sell certified coffee through the unions they belong to are allowed to bypass the more commodified ECX market. This study investigates whether small-scale coffee producers in southwestern Ethiopia who sell coffee through a certified cooperative are better off. The assumption is that coffee sales through, and membership of, a cooperative allow farmers to improve their coffee production as well as other aspects of their livelihood. A sustainable livelihood approach inspired the choice of welfare indicators; data were collected among members and non-members of certified cooperatives, and a propensity score model was used to investigate the impact of cooperative membership on the livelihood indicators. Results suggest that members of certified cooperatives do receive, on average, better prices. Yet no evidence was found that the higher price translates into better household income. Furthermore, the coffee plantation productivity of the interviewed members was lower than that of the non-members, which could explain the failure to find an overall effect. Since the majority of the producers' income emanates from coffee, a sustainable way of enhancing coffee productivity could revitalize the welfare of the coffee producers.
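    As a rough illustration of the propensity-score approach described above, the sketch below estimates the average effect of membership on an outcome by matching each member to the non-member with the closest estimated membership probability. The data frame, covariate names, and outcome column are hypothetical stand-ins, not the study's actual specification.

```python
# Minimal propensity-score-matching sketch (illustrative only).
# Assumes a pandas DataFrame with a binary `member` column and an
# outcome column such as `coffee_price`; all names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

COVARIATES = ["age", "education", "farm_size", "distance_to_market"]

def att_via_matching(df: pd.DataFrame, outcome: str) -> float:
    """Average effect on the treated via 1-nearest-neighbour matching
    on the estimated propensity score."""
    X = df[COVARIATES].to_numpy()
    t = df["member"].to_numpy()
    # Propensity score: estimated probability of membership given covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    treated = np.where(t == 1)[0]
    control = np.where(t == 0)[0]
    effects = []
    for i in treated:
        # Match each member to the non-member with the closest score.
        j = control[np.argmin(np.abs(ps[control] - ps[i]))]
        effects.append(df[outcome].iloc[i] - df[outcome].iloc[j])
    return float(np.mean(effects))
```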

    FORM version 4.0

    We present version 4.0 of the symbolic manipulation system FORM. The most important new features are the manipulation of rational polynomials and the factorization of expressions. Many other new functions and commands have also been added; some of them are very general, while others are designed for building specific high-level packages, such as one for Groebner bases. Also new is a checkpoint facility that allows periodic backups during long calculations. Lastly, FORM 4.0 has become available as open source under the GNU General Public License version 3.

    Comment: 26 pages. Uses axodraw.

    A Verified Certificate Checker for Finite-Precision Error Bounds in Coq and HOL4

    Being able to soundly estimate roundoff errors of finite-precision computations is important for many applications in embedded systems and scientific computing. Due to the discrepancy between continuous reals and discrete finite-precision values, automated static analysis tools are highly valuable for estimating roundoff errors. The results, however, are only as correct as the implementations of the static analysis tools. This paper presents a formally verified and modular tool which fully automatically checks the correctness of finite-precision roundoff error bounds encoded in a certificate. We present implementations of certificate generation and checking for both Coq and HOL4 and evaluate them on a number of examples from the literature. The experiments use both in-logic evaluation of Coq and HOL4, and execution of extracted code outside of the logics: we benchmark unverified OCaml code extracted from Coq and a verified binary generated by CakeML.
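    For context, roundoff analyses of this kind are typically grounded in the standard model of floating-point arithmetic, shown below; whether the paper's certificates use exactly this model or a variant is not stated in the abstract.

```latex
% Standard model of floating-point arithmetic: every basic operation
% op in {+, -, *, /} is exact up to a small relative error, where u is
% the unit roundoff (u = 2^{-53} for IEEE 754 double precision).
\[
  \mathrm{fl}(x \;\mathrm{op}\; y) \;=\; (x \;\mathrm{op}\; y)\,(1 + \delta),
  \qquad |\delta| \le u .
\]
% A certificate then records a claimed bound B with |fl(e) - e| <= B
% over the stated input ranges, which the verified checker
% re-validates inside the logic.
```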

    Machine learning for acquiring knowledge in astro-particle physics

    This thesis explores the fundamental aspects of machine learning that are involved in acquiring knowledge in the research field of astro-particle physics. This research field substantially relies on machine learning methods, which reconstruct the properties of astro-particles from the raw data that specialized telescopes record. These methods are typically trained from resource-intensive simulations, which reflect the existing knowledge about the particles, knowledge that physicists strive to expand. We study three fundamental machine learning tasks that emerge from this goal.

    First, we address ordinal quantification, the task of estimating the prevalences of ordered classes in sets of unlabeled data. This task emerges from the need to test the agreement of astro-physical theories with the class prevalences that a telescope observes. To this end, we unify existing methods on quantification, propose an alternative optimization process, and develop regularization techniques to address ordinality in quantification problems, both in and outside of astro-particle physics. These advancements provide more accurate reconstructions of the energy spectra of cosmic gamma ray sources and, hence, support physicists in drawing conclusions from their telescope data.

    Second, we address learning under class-conditional label noise. In particular, we focus on a novel setting, in which one of the class-wise noise rates is known and one is not. This setting emerges from a data acquisition protocol through which astro-particle telescopes simultaneously observe a region of interest and several background regions. We enable learning under this type of label noise with algorithms for consistent, noise-aware decision thresholding. These algorithms yield binary classifiers which outperform the existing state of the art in gamma hadron classification with the FACT telescope. Moreover, unlike the state of the art, our classifiers are entirely trained from the real telescope data and thus do not require any resource-intensive simulation.

    Third, we address active class selection, the task of actively finding those proportions of classes which optimize the classification performance. In astro-particle physics, this task emerges from the simulation, which produces training data in any desired class proportions. We clarify the implications of this setting from two theoretical perspectives, one of which provides us with bounds on the resulting classification performance. We employ these bounds in a certificate of model robustness, which declares a set of class proportions for which the model is accurate with high probability. We also employ these bounds in an active strategy for class-conditional data acquisition. Our strategy uniquely considers existing uncertainties about those class proportions that have to be handled during the deployment of the classifier, while being theoretically well-justified.
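    To make the quantification task from the first part concrete, the sketch below implements Adjusted Classify & Count, a classic binary baseline that corrects the raw classify-and-count prevalence estimate for the classifier's misclassification rates. It only illustrates the basic task; the thesis's ordinal and regularized methods go well beyond this.

```python
# Adjusted Classify & Count (ACC): a minimal binary quantification
# baseline, shown for illustration; not the thesis's ordinal methods.
import numpy as np

def adjusted_classify_and_count(y_pred, tpr, fpr):
    """Estimate the positive-class prevalence in an unlabeled sample.

    y_pred:   binary (0/1) classifier predictions on the unlabeled data.
    tpr, fpr: true/false positive rates of the classifier, estimated
              beforehand on held-out labeled data.
    """
    p_cc = float(np.mean(y_pred))            # raw classify-and-count estimate
    p_acc = (p_cc - fpr) / (tpr - fpr)       # invert E[p_cc] = tpr*p + fpr*(1-p)
    return float(np.clip(p_acc, 0.0, 1.0))  # prevalences lie in [0, 1]
```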

    Real Time NIR Imaging Image Enhancement by using 2D Frangi Filter via Segmentation

    This paper presents the enhancement of NIR images using 2D Frangi filter segmentation, applied specifically to biomedical NIR vein localization imaging. Unseen subcutaneous veins cause clinical practitioners difficulty in performing intravenous catheterization and can thus lead to needle-stick injuries. A few imaging techniques can be used for vein localization, but the most widely used is near-infrared (NIR) imaging due to its non-invasive and non-ionizing properties. The input images from the NIR imaging setup are processed to enhance vein visibility and the contrast between vein and skin tissue, which requires filtering noise from the displayed image with image processing techniques. This work applies an image segmentation method to the NIR venous image in order to extract veins and eliminate noise. First, the grayscale image is segmented into 10 fragment planes with a constant step size to produce 3 sets of 2D planes. Second, these 3 sets of 2D planes are passed through the Frangi filter to obtain the eigenvalue image structure. Lastly, a low-noise image is produced from the integrated planes through the 2D Frangi filter.
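    The core step of such a pipeline, Frangi vesselness filtering, is available in scikit-image; a minimal sketch is shown below. The filename is a hypothetical placeholder, and the paper's fragment-plane segmentation is not reproduced here, only the basic filtering step.

```python
# Minimal vein-enhancement sketch using scikit-image's Frangi filter.
# Shows only the core vesselness step, not the paper's fragment-plane
# pipeline; "nir_forearm.png" is a hypothetical input image.
import numpy as np
from skimage import io, img_as_float
from skimage.filters import frangi

nir = img_as_float(io.imread("nir_forearm.png", as_gray=True))

# Frangi vesselness analyses Hessian eigenvalues at several scales and
# responds strongly to tubular structures; veins appear dark in NIR,
# hence black_ridges=True.
vesselness = frangi(nir, sigmas=np.arange(1, 6), black_ridges=True)

# Normalize to [0, 1] for display: veins show up as bright ridges.
enhanced = (vesselness - vesselness.min()) / (np.ptp(vesselness) + 1e-12)
```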

    Accurate molecular imaging of small animals taking into account animal models, handling, anaesthesia, quality control and imaging system performance

    Small-animal imaging has become an important technique for the development of new radiotracers, drugs and therapies. Many laboratories now have a combination of different small-animal imaging systems, which are used by biologists, pharmacists, medical doctors and physicists. The aim of this paper is to give an overview of the important factors in the design of a small-animal nuclear medicine imaging experiment. Different experts each summarize one specific aspect important for the good design of a small-animal experiment.

    Polynomials in the Bernstein Basis and Their Use in Stability Analysis
