123 research outputs found

    Complete blood count

    Get PDF

    Verificación de Programas no Determinísticos

    Get PDF
    We continue our series of introductory articles on the axiomatic verification of programs. In this second work we focus on the nondeterministic sequential paradigm, always within the framework of imperative input/output programs. Since nondeterminism also manifests itself in concurrency, the article serves as an introduction to the verification of concurrent programs as well, where the greater complexity makes a formal treatment of correctness proofs all the more justified. We work with a classic programming language featuring nondeterministic conditional selection and repetition, to which random assignments are later added. To verify the programs we propose an adaptation of the axiomatic verification method described in the previous publication, which was limited to deterministic sequential programming. We present examples of the application of the method and include a systematic program development, once again emphasizing the approach of using the axioms and rules both to program and to verify, so as to obtain programs that are correct by construction. Finally, we introduce the concept of fairness, whose effect is to reduce the degree of nondeterminism of a program according to certain equity criteria in the execution environment, and we describe a couple of adaptations of the verification rules to account for this aspect.
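    The nondeterministic constructs mentioned above are in the tradition of Dijkstra's guarded commands. As an illustration, and only as a sketch of the standard partial-correctness proof rules rather than the exact formulation used in the article, nondeterministic selection and repetition can be handled as follows (total correctness of the repetition additionally requires a variant function):

```latex
% Sketch of standard partial-correctness rules for guarded commands;
% the article's own formulation may differ in details.
% Selection:  IF == if B_1 -> S_1 [] ... [] B_n -> S_n fi
\[
\frac{P \Rightarrow B_1 \lor \dots \lor B_n \qquad
      \{P \land B_i\}\ S_i\ \{Q\} \quad (1 \le i \le n)}
     {\{P\}\ \mathit{IF}\ \{Q\}}
\]
% Repetition:  DO == do B_1 -> S_1 [] ... [] B_n -> S_n od, with invariant P
\[
\frac{\{P \land B_i\}\ S_i\ \{P\} \quad (1 \le i \le n)}
     {\{P\}\ \mathit{DO}\ \{P \land \lnot(B_1 \lor \dots \lor B_n)\}}
\]
```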

    Hemograma

    Full text link

    Computabilidad, complejidad computacional y verificación de programas

    Get PDF
    Computabilidad, Complejidad Computacional y Verificación de Programas contains the fifteen lectures that make up the course Teoría de la Computación y Verificación de Programas, an introduction to the theory of computability and the computational complexity of problems and to the theory of program correctness, which I have taught for several years in the Licenciatura en Informática at the Facultad de Informática of the Universidad Nacional de La Plata. The book is a kind of abridged second edition of Teoría de la Computación y Verificación de Programas, by the same authors, published in 2010 by EDULP jointly with McGraw-Hill, which includes, besides the lectures of the introductory course, those of Teoría de la Computación y Verificación de Programas Avanzada, a course I have also taught in the same degree program for a long time. The new work mainly excludes space complexity, the verification of nondeterministic and concurrent programs, the use of temporal logic to verify reactive programs, and the denotational semantics of programming languages, topics covered in the earlier work. Even so, the present publication contains brief sections devoted to the space hierarchy, the termination of nondeterministic programs under fairness assumptions, and the verification of shared-memory concurrent programs, developed in the way those topics are referenced in the introductory course. Facultad de Informática

    Lógica para Informática

    Get PDF
    The content of Lógica para Informática is based on the course Lógica e Inteligencia Artificial (chapters 1, 2, and 3, on propositional (or sentential) logic, first-order predicate logic, and modal logic, respectively) and on part of the courses Teoría de la Computación y Verificación de Programas and Teoría de la Computación y Verificación de Programas Avanzada (chapter 4, on the axiomatic verification of programs), courses that the authors have taught for some time in the Licenciatura en Informática at the Universidad Nacional de La Plata. These four topics constitute the subject of this book: (mathematical) logic, through three of the most widely used logics, and one of its most interesting applications in the context of computing. The title of the book may seem restrictive, but our intention is the opposite. The book is indeed for computer science because our primary aim is to provide bibliographic material for teaching the aforementioned or similar courses within computer science curricula, and also because we are convinced of the importance of the study of logic in the education of computing professionals. Nevertheless, the book is aimed at both computer scientists and non-computer scientists; our purpose is inclusive, because logic is part of many curricula beyond computer science and mathematics, and because the problem of developing correct computer programs is a topic of ever broader interest. Facultad de Informática

    A novel satellite mission concept for upper air water vapour, aerosol and cloud observations using integrated path differential absorption LiDAR limb sounding

    Get PDF
    We propose a new satellite mission to deliver high-quality measurements of upper-air water vapour. The concept centres on a LiDAR operated in a limb-sounding occultation geometry, designed to work as a very long path system for differential absorption measurements. We present a preliminary performance analysis for a system sized to send 75 mJ pulses at 25 Hz at four wavelengths close to 935 nm to up to 5 microsatellites in a counter-rotating orbit, each carrying retroreflectors characterized by a reflected beam divergence of roughly twice the emitted laser beam divergence of 15 µrad. This provides water vapour profiles with a vertical sampling of 110 m; preliminary calculations suggest that the system could detect concentrations of less than 5 ppm. A secondary payload, a fairly conventional medium-resolution multispectral radiometer, allows wide-swath cloud and aerosol imaging. The total weight and power of the system are estimated at 3 tons and 2,700 W, respectively. This novel concept presents significant challenges, including the performance of the lasers in space, the tracking between the main spacecraft and the retroreflectors, the refractive effects of turbulence, and the design of the telescopes to achieve a high signal-to-noise ratio for the high-precision measurements. The mission concept was conceived at the Alpbach Summer School 2010.
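    As a rough consistency check of the quoted figures, the sketch below computes the average optical power per wavelength and the small-angle beam footprints. The 3,000 km transmitter-to-retroreflector range is an assumed value used only for illustration, not a number taken from the abstract.

```python
# Back-of-the-envelope check of the figures quoted in the abstract.
# The 3,000 km one-way range is an ASSUMPTION for a LEO-to-LEO
# occultation path, not a value stated in the text.
pulse_energy_J = 75e-3     # 75 mJ per pulse (abstract)
rep_rate_Hz = 25           # 25 Hz (abstract)
divergence_rad = 15e-6     # emitted beam divergence, 15 urad (abstract)
assumed_range_m = 3.0e6    # ASSUMED transmitter-to-retroreflector range

avg_power_W = pulse_energy_J * rep_rate_Hz           # per wavelength
footprint_m = divergence_rad * assumed_range_m       # small-angle approximation
return_divergence_rad = 2 * divergence_rad           # "roughly twice" (abstract)
return_footprint_m = return_divergence_rad * assumed_range_m

print(f"average optical power per wavelength: {avg_power_W:.2f} W")
print(f"outgoing footprint at {assumed_range_m / 1e3:.0f} km: {footprint_m:.0f} m")
print(f"returned footprint back at the transmitter: {return_footprint_m:.0f} m")
```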

    A review of nitrogen isotopic alteration in marine sediments

    Get PDF
    Key Points: Use of sedimentary nitrogen isotopes is examined; on average, sediment 15N/14N increases by approx. 2 per mil during early burial; isotopic alteration scales with water depth. Abstract: Nitrogen isotopes are an important tool for evaluating past biogeochemical cycling from the paleoceanographic record. However, bulk sedimentary nitrogen isotope ratios, which can be determined routinely and at minimal cost, may be altered during burial and early sedimentary diagenesis, particularly outside of continental margin settings. The causes and detailed mechanisms of isotopic alteration are still under investigation. Case studies of the Mediterranean and South China Seas underscore the complexities of investigating isotopic alteration. In an effort to evaluate the evidence for alteration of the sedimentary N isotopic signal and to quantify the net effect, we have compiled and compared data demonstrating alteration from the published literature. A comparison of more than 100 paired sediment-trap and surface-sediment nitrogen isotope values demonstrates that, at sites located off the continental margins, an increase in sediment 15N/14N occurs during early burial, likely at the seafloor. The extent of isotopic alteration appears to be a function of water depth. Depth-related differences in oxygen exposure time at the seafloor are likely the dominant control on the extent of N isotopic alteration. Moreover, the compiled data suggest that the degree of alteration is likely to be uniform through time at most sites, so that bulk sedimentary isotope records likely provide a good means of evaluating relative changes in the global N cycle.
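    For reference, the 15N/14N ratios discussed above are conventionally reported in delta notation relative to atmospheric N2 (AIR); this standard definition is not spelled out in the abstract:

```latex
% Standard delta notation for nitrogen isotopes, reported in per mil
% relative to atmospheric N2 (AIR).
\[
\delta^{15}\mathrm{N} =
\left(
  \frac{\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\mathrm{sample}}}
       {\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\mathrm{AIR}}} - 1
\right) \times 1000
\]
% The compiled "approx. 2 per mil" offset is then
% delta15N(surface sediment) - delta15N(sinking particles) ~ +2.
```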

    A systematic review to identify areas of enhancements of pandemic simulation models for operational use at provincial and local levels

    Get PDF
    Background: In recent years, computer simulation models have supported the development of pandemic influenza preparedness policies. However, U.S. policymakers have raised several concerns about the practical use of these models. In this review paper, we examine the extent to which the current literature already addresses these concerns and identify means of enhancing the current models for higher operational use. Methods: We surveyed PubMed and other sources for published research literature on simulation models for influenza pandemic preparedness. We identified 23 models published between 1990 and 2010 that consider single-region (e.g., country, province, city) outbreaks and multi-pronged mitigation strategies. We developed a plan for examination of the literature based on the concerns raised by the policymakers. Results: While examining the concerns about the adequacy and validity of data, we found that although the epidemiological data supporting the models appear adequate, they should be validated through as many updates as possible during an outbreak. Demographic data need better interfaces for access, retrieval, and translation into model parameters. Regarding the concern about the credibility and validity of modeling assumptions, we found that the models often simplify reality to reduce computational burden. Such simplifications may be permissible if they do not interfere with the performance assessment of the mitigation strategies. We also agreed with the concern that social behavior is inadequately represented in pandemic influenza models. Our review showed that the models consider only a few social-behavioral aspects, including contact rates, withdrawal from work or school due to symptom appearance or to care for sick relatives, and compliance with social distancing, vaccination, and antiviral prophylaxis. The concern about the degree of accessibility of the models is palpable: we found three models that are currently accessible to the public, while other models are seeking public accessibility. Policymakers would prefer models scalable to any population size that can be downloaded and operated on personal computers. But scaling models to larger populations would often require computational resources that cannot be handled by personal computers and laptops. As a limitation, we note that some existing models could not be included in our review due to their limited available documentation discussing the choice of relevant parameter values. Conclusions: To adequately address the concerns of the policymakers, we need continuing model enhancements in critical areas, including: updating of epidemiological data during a pandemic, smooth handling of large demographic databases, incorporation of a broader spectrum of social-behavioral aspects, updating of information on contact patterns, adaptation of recent methodologies for collecting human mobility data, and improvement of computational efficiency and accessibility.
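    The models reviewed are typically much richer than any toy example (individual-based, with explicit contact networks, interventions, and stochasticity), but a minimal deterministic SEIR sketch illustrates the kind of epidemic dynamics such simulations compute. All parameter values below are illustrative assumptions, not values taken from any model in the review.

```python
# Minimal deterministic SEIR sketch of the class of dynamics that pandemic
# simulation models compute. All parameters are ILLUSTRATIVE ASSUMPTIONS.

def seir(population=1_000_000, beta=0.5, incubation_days=2.0,
         infectious_days=3.0, days=180, dt=0.1):
    """Integrate a basic SEIR model with forward Euler steps."""
    sigma = 1.0 / incubation_days     # rate of leaving the exposed class
    gamma = 1.0 / infectious_days     # recovery rate
    s, e, i, r = population - 1.0, 0.0, 1.0, 0.0
    peak_infectious, peak_day = i, 0.0
    for step in range(int(days / dt)):
        new_exposed = beta * s * i / population * dt
        new_infectious = sigma * e * dt
        new_recovered = gamma * i * dt
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        if i > peak_infectious:
            peak_infectious, peak_day = i, step * dt
    return peak_infectious, peak_day, r

peak, when, ever_infected = seir()
print(f"peak infectious: {peak:,.0f} people around day {when:.0f}")
print(f"total ever infected after 180 days: {ever_infected:,.0f}")
```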