
    Formal Executable Models for Automatic Detection of Timing Anomalies

    A timing anomaly is a counterintuitive timing behavior in the sense that a locally fast execution slows down the overall global execution. The presence of such behaviors is inconvenient for WCET analysis, which requires, via abstractions, a certain monotonicity property to compute safe bounds. In this paper we explore how to systematically execute a previously proposed formal definition of timing anomalies. We ground our work on formal designs of architecture models, upon which we employ guided model checking techniques. Our goal is the automatic detection of timing anomalies in given computer architecture designs.
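
    The monotonicity property mentioned above can be made concrete with a toy scheduling model: making one step locally faster (e.g. turning a cache miss into a hit) should never lengthen the overall execution, and a timing anomaly is exactly a violation of that rule. The sketch below is our own illustration, not the paper's formalism; the four-instruction program, the two functional units and all latencies are invented:

```python
# Toy timing anomaly: a locally faster step (a cache hit for instruction A)
# yields a LONGER overall schedule, violating the monotonicity that WCET
# analyses rely on. The machine model and latencies are hypothetical.

def simulate(a_latency):
    """Greedy scheduler: repeatedly start, as early as possible, the ready
    instruction with the earliest start time (ties go to program order).
    Returns the overall makespan."""
    # instruction: (name, unit, duration, dependencies, earliest ready time)
    instrs = [
        ("A", "U1", a_latency, [], 0),
        ("B", "U2", 3, ["A"], 0),   # short, off the critical path
        ("C", "U2", 2, [], 2),      # feeds the long instruction D
        ("D", "U1", 5, ["C"], 0),
    ]
    finish = {}                     # name -> completion time
    unit_free = {"U1": 0, "U2": 0}
    pending = list(instrs)
    while pending:
        best = None
        for name, unit, dur, deps, ready in pending:
            if all(d in finish for d in deps):
                start = max([ready, unit_free[unit]] +
                            [finish[d] for d in deps])
                if best is None or start < best[0]:
                    best = (start, name, unit, dur)
        start, name, unit, dur = best
        finish[name] = start + dur
        unit_free[unit] = start + dur
        pending = [i for i in pending if i[0] != name]
    return max(finish.values())

print("A misses (latency 3): makespan", simulate(3))  # 9
print("A hits   (latency 1): makespan", simulate(1))  # 11 -- anomaly
```

    With the miss, C grabs unit U2 first and the critical chain C-D starts early; with the hit, B occupies U2 before C is ready, delaying C and D and lengthening the whole schedule.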

    Making stillbirths count, making numbers talk - issues in data collection for stillbirths.

    BACKGROUND: Stillbirths need to count. They constitute the majority of the world's perinatal deaths and yet, they are largely invisible. Simply counting stillbirths is only the first step in analysis and prevention. From a public health perspective, there is a need for information on timing and circumstances of death, associated conditions and underlying causes, and availability and quality of care. This information will guide efforts to prevent stillbirths and improve quality of care. DISCUSSION: In this report, we assess how different definitions and limits in registration affect data capture, and we discuss the specific challenges of stillbirth registration, with emphasis on implementation. We identify what data need to be captured, we suggest a dataset to cover core needs in registration and analysis of the different categories of stillbirths with causes and quality indicators, and we illustrate the experience in stillbirth registration from different cultural settings. Finally, we point out gaps that need attention in the International Classification of Diseases and review the qualities of alternative systems that have been tested in low- and middle-income settings. SUMMARY: Obtaining high-quality data will require consistent definitions for stillbirths, systematic population-based registration, better tools for surveys and verbal autopsies, capacity building and training in procedures to identify causes of death, locally adapted quality indicators, improved classification systems, and effective registration and reporting systems.

    A Faster-Than Relation for Semi-Markov Decision Processes

    When modeling concurrent or cyber-physical systems, non-functional requirements such as time are important to consider. In order to improve the timing aspects of a model, it is necessary to have some notion of what it means for a process to be faster than another, which can guide the stepwise refinement of the model. To this end we study a faster-than relation for semi-Markov decision processes and compare it to standard notions for relating systems. We consider the compositional aspects of this relation, and show that the faster-than relation is not a precongruence with respect to parallel composition, hence giving rise to so-called parallel timing anomalies. We take the first steps toward understanding this problem by identifying decidable conditions sufficient to avoid parallel timing anomalies in the absence of non-determinism. Comment: In Proceedings QAPL 2019, arXiv:2001.0616
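
    As a rough intuition for a faster-than relation (our own Monte Carlo illustration, not the paper's formal definition over semi-Markov decision processes): process P can be considered faster than Q if, for every deadline t, P completes within t with at least the probability that Q does. The two one-state chains below are hypothetical:

```python
# Compare empirical completion-time CDFs of two toy semi-Markov chains.
# A chain maps each state to (sojourn-time sampler, weighted successors).
import bisect
import random

def sample_completion(chain, start="init", goal="done"):
    """Walk the chain, summing sampled sojourn times until the goal state."""
    state, total = start, 0.0
    while state != goal:
        sojourn, successors = chain[state]
        total += sojourn()
        states, probs = zip(*successors)
        state = random.choices(states, weights=probs)[0]
    return total

# P's sojourn time is exponential with rate 2, Q's with rate 1, so P should
# reach "done" by any given deadline with at least Q's probability
P = {"init": (lambda: random.expovariate(2.0), [("done", 1.0)])}
Q = {"init": (lambda: random.expovariate(1.0), [("done", 1.0)])}

samples_p = sorted(sample_completion(P) for _ in range(10_000))
samples_q = sorted(sample_completion(Q) for _ in range(10_000))

for t in (0.5, 1.0, 2.0):
    cdf_p = bisect.bisect_right(samples_p, t) / len(samples_p)
    cdf_q = bisect.bisect_right(samples_q, t) / len(samples_q)
    print(f"P(done by {t}): P={cdf_p:.2f}  Q={cdf_q:.2f}")
```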

    Different atmospheric moisture divergence responses to extreme and moderate El Niños

    On seasonal and inter-annual time scales, vertically integrated moisture divergence provides a useful measure of the tropical atmospheric hydrological cycle. It reflects the combined dynamical and thermodynamical effects, and is not subject to the limitations that afflict observations of evaporation minus precipitation. An empirical orthogonal function (EOF) analysis of the tropical Pacific moisture divergence fields calculated from the ERA-Interim reanalysis reveals the dominant effects of the El Niño-Southern Oscillation (ENSO) on inter-annual time scales. Two EOFs are necessary to capture the ENSO signature, and regression relationships between their principal components and indices of equatorial Pacific sea surface temperature (SST) demonstrate that the transition from strong La Niña through to extreme El Niño events is not a linear one. The largest deviation from linearity is for the strongest El Niños, and we suggest that this arises at least partly because the EOF analysis cannot easily separate different patterns of responses that are not orthogonal to each other. To overcome the orthogonality constraints, a self-organizing map (SOM) analysis of the same moisture divergence fields was performed. The SOM analysis captures the range of responses to ENSO, including the distinction between the moderate and strong El Niños identified by the EOF analysis. The work demonstrates the potential for the application of SOM to large-scale climatic analysis, by virtue of its easier interpretation, its relaxation of orthogonality constraints and its versatility as an alternative classification method. Both the EOF and SOM analyses suggest a classification of “moderate” and “extreme” El Niños by their differences in the magnitudes of the hydrological cycle responses, spatial patterns and evolutionary paths. Classification from the moisture divergence point of view shows consistency with results based on other physical variables such as SST.
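
    EOF analysis as used here is essentially principal component analysis of a space-time field. A minimal sketch of the computation via SVD, with a random field standing in for the ERA-Interim moisture divergence data (the latitude weighting and the SST regression step of the paper are omitted):

```python
# EOF (empirical orthogonal function) analysis of a space-time anomaly
# field via SVD. The synthetic field is a stand-in for real reanalysis data.
import numpy as np

rng = np.random.default_rng(0)
n_time, n_lat, n_lon = 480, 20, 60          # e.g. 40 years of monthly fields
field = rng.standard_normal((n_time, n_lat * n_lon))

# remove the time mean at each grid point to form anomalies
anom = field - field.mean(axis=0)

# U*S are the principal components; rows of Vt are the EOF spatial patterns
u, s, vt = np.linalg.svd(anom, full_matrices=False)
pcs = u * s                                  # (time, mode)
eofs = vt.reshape(-1, n_lat, n_lon)          # (mode, lat, lon)
explained = s**2 / np.sum(s**2)

print("variance explained by the first two EOFs:", explained[:2])
# the paper's ENSO signature requires regressing the leading PCs against
# equatorial Pacific SST indices, a step not reproduced here
```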

    The development and application of a new tool to assess the adequacy of the content and timing of antenatal care

    Background: Current measures of antenatal care use are limited to initiation of care and number of visits. This study aimed to describe the development and application of a tool to assess the adequacy of the content and timing of antenatal care. Methods: The Content and Timing of care in Pregnancy (CTP) tool was developed based on clinical relevance for ongoing antenatal care and recommendations in national and international guidelines. The tool reflects minimal care recommended in every pregnancy, regardless of parity or risk status. CTP measures timing of initiation of care, content of care (number of blood pressure readings, blood tests and ultrasound scans) and whether the interventions were received at an appropriate time. Antenatal care trajectories for 333 pregnant women were then described using a standard tool (the APNCU index), which measures the quantity of care only, and the new CTP tool. Both tools categorise care into four categories, from ‘Inadequate’ (both tools) to ‘Adequate plus’ (APNCU) or ‘Appropriate’ (CTP). Participants recorded the timing and content of their antenatal care prospectively using diaries. Analysis included an examination of similarities and differences in categorisation of care episodes between the tools. Results: According to the CTP tool, the care trajectories of 10.2% of the women were classified as inadequate, 8.4% as intermediate, 36% as sufficient and 45.3% as appropriate. The assessment of quality of care differed significantly between the two tools. Seventeen care trajectories classified as ‘Adequate’ or ‘Adequate plus’ by the APNCU were deemed ‘Inadequate’ by the CTP. This suggests that, despite a high number of visits, these women did not receive the minimal recommended content and timing of care. Conclusions: The CTP tool provides a more detailed assessment of the adequacy of antenatal care than the current standard index. However, guidelines for the content of antenatal care vary, and the tool does not currently grade over-use of interventions as ‘Inappropriate’. Further work needs to be done to refine the content items prior to larger-scale testing of the impact of the new measure.
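
    To make the categorisation logic concrete, here is a hypothetical sketch of a CTP-style classifier. The four category names follow the abstract, but the thresholds for initiation week and intervention counts are invented for illustration and are not the published criteria:

```python
# Hypothetical CTP-style adequacy classification. The paper derives four
# categories from initiation of care, counts of interventions (blood
# pressure readings, blood tests, ultrasounds) and their timing; the
# thresholds below are invented, not the published ones.
from dataclasses import dataclass

@dataclass
class Pregnancy:
    first_visit_week: int   # gestational week at which care began
    bp_readings: int
    blood_tests: int
    ultrasounds: int
    timing_ok: bool         # interventions received in the right windows

def ctp_category(p: Pregnancy) -> str:
    content_ok = (p.bp_readings >= 3 and p.blood_tests >= 2
                  and p.ultrasounds >= 1)
    started_early = p.first_visit_week <= 12
    if started_early and content_ok and p.timing_ok:
        return "Appropriate"
    if content_ok and p.timing_ok:
        return "Sufficient"
    if content_ok or started_early:
        return "Intermediate"
    return "Inadequate"

print(ctp_category(Pregnancy(10, bp_readings=5, blood_tests=3,
                             ultrasounds=2, timing_ok=True)))  # Appropriate
```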

    On the spatio-temporal analysis of hydrological droughts from global hydrological models

    Recent concerns about worldwide extreme events related to climate change have motivated the development of large-scale models that simulate the global water cycle. In this context, analysis of hydrological extremes is important and requires the adaptation of identification methods used for river basin models. This paper presents two methodologies that extend the tools to analyze spatio-temporal drought development and characteristics using large-scale gridded time series of hydrometeorological data. The methodologies are classified as non-contiguous and contiguous drought area analyses (i.e. NCDA and CDA). The NCDA presents time series of percentages of areas in drought at the global scale and for pre-defined regions of known hydroclimatology. The CDA is introduced as a complementary method that generates information on the spatial coherence of drought events at the global scale. Spatial drought events are found through CDA by clustering patterns (contiguous areas). In this study the global hydrological model WaterGAP was used to illustrate the methodology development. Global gridded time series of subsurface runoff (resolution 0.5°) simulated with the WaterGAP model for land points were used. The NCDA and CDA were developed to identify drought events in runoff. The percentages of area in drought calculated with both methods show complementary information on the spatial and temporal events for the last decades of the 20th century. The NCDA provides relevant information on the average number of droughts, duration and severity (deficit volume) for pre-defined regions (the globe, two selected hydroclimatic regions). Additionally, the CDA provides information on the number of spatially linked areas in drought, the largest spatial event and their geographic location on the globe. Some results capture the overall spatio-temporal drought extremes over the last decades of the 20th century. Events like the El Niño Southern Oscillation (ENSO) in South America and the pan-European drought in 1976 appeared clearly in both analyses. The methodologies introduced provide an important basis for the global characterization of droughts, model inter-comparison of droughts identified from global hydrological models and spatial event analysis.
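
    The two analyses can be sketched on a synthetic runoff field: NCDA reduces each time step to the percentage of area below a drought threshold, while CDA labels spatially contiguous cells in drought (connected components). The gamma-distributed field and the 20th-percentile threshold below are stand-ins for the WaterGAP subsurface runoff and the paper's actual threshold choice:

```python
# NCDA and CDA on a synthetic monthly runoff field of shape (month, lat, lon).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
runoff = rng.gamma(2.0, 1.0, size=(120, 36, 72))

# drought threshold per grid cell: 20th percentile of its own climatology
threshold = np.percentile(runoff, 20, axis=0)
in_drought = runoff < threshold                 # boolean (month, lat, lon)

# NCDA: fraction of cells in drought at each time step
ncda = in_drought.mean(axis=(1, 2))
print("mean area in drought: %.1f%%" % (100 * ncda.mean()))

# CDA: contiguous drought patches in one month, and the largest event
labels, n_events = ndimage.label(in_drought[0])
sizes = ndimage.sum(in_drought[0], labels, index=range(1, n_events + 1))
print(f"month 0: {n_events} contiguous drought clusters, "
      f"largest spans {int(sizes.max())} cells")
```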

    No NAT'd User left Behind: Fingerprinting Users behind NAT from NetFlow Records alone

    It is generally recognized that the traffic generated by an individual connected to a network acts as a biometric signature. Several tools exploit this fact to fingerprint and monitor users. Often, though, these tools assume access to the entire traffic, including IP addresses and payloads. This is not feasible on the grounds that both performance and privacy would be negatively affected. In reality, most ISPs convert user traffic into NetFlow records for a concise representation that does not include, for instance, any payloads. More importantly, large and distributed networks are usually NAT'd, thus a few IP addresses may be associated with thousands of users. We devised a new fingerprinting framework that overcomes these hurdles. Our system is able to analyze a huge amount of network traffic represented as NetFlows, with the intent to track people. It does so by accurately inferring when users are connected to the network and which IP addresses they are using, even though thousands of users are hidden behind NAT. Our prototype implementation was deployed and tested within an existing large metropolitan WiFi network serving about 200,000 users, with an average load of more than 1,000 users simultaneously connected behind only two NAT'd IP addresses. Our solution turned out to be very effective, with an accuracy greater than 90%. We also devised new tools and refined existing ones that may be applied to other contexts related to NetFlow analysis.
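
    The abstract does not spell out the matching algorithm, so the following is only a toy sketch of the kind of inference involved: attribute a time window of NAT'd NetFlow records to a known user by comparing the set of contacted services against per-user behavioural signatures. The record format, the signatures and the Jaccard matching are our own simplification, not the authors' framework:

```python
# Attribute a window of NAT'd flows to the best-matching known user.
from collections import namedtuple

Flow = namedtuple("Flow", "ts src_ip dst_ip dst_port")

# behavioural signatures learned offline: user -> typical (dst_ip, port) set
signatures = {
    "alice": {("198.51.100.7", 443), ("203.0.113.9", 993)},
    "bob":   {("192.0.2.33", 443), ("198.51.100.80", 80)},
}

def guess_user(flows, window_start, window_len=300):
    """Score a 5-minute window of flows against each user's signature."""
    seen = {(f.dst_ip, f.dst_port) for f in flows
            if window_start <= f.ts < window_start + window_len}
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0
    return max(signatures, key=lambda u: jaccard(seen, signatures[u]))

flows = [Flow(10, "10.0.0.1", "198.51.100.7", 443),
         Flow(42, "10.0.0.1", "203.0.113.9", 993)]
print(guess_user(flows, window_start=0))   # alice
```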

    Timing Anomalies Reloaded

    Computing tight WCET bounds in the presence of timing anomalies - found in almost any modern hardware architecture - is a major challenge of timing analysis. In this paper, we renew the discussion about timing anomalies, demonstrating that even simple hardware architectures are prone to timing anomalies. We furthermore complete the list of timing-anomalous cache replacement policies, proving that the most-recently-used replacement policy (MRU) also exhibits a domino effect.
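
    As background for the result above, here is a small simulator of MRU replacement under its textbook reading (on a miss in a full set, evict the most recently touched line). It shows how the hit/miss pattern of one access sequence depends on the initial cache state, the raw ingredient of a domino effect; it is only an illustration, not the paper's proof:

```python
# Simulate one cache set under MRU replacement (assumed reading: on a miss
# in a full set, evict the most recently touched line) and return the
# hit/miss pattern of an access sequence.
def mru_trace(accesses, ways=2):
    lines, last_used, pattern = [], {}, []
    for t, block in enumerate(accesses):
        if block in lines:
            pattern.append("H")
        else:
            pattern.append("M")
            if len(lines) == ways:
                # evict the most recently used resident line
                victim = max(lines, key=lambda b: last_used[b])
                lines.remove(victim)
            lines.append(block)
        last_used[block] = t
    return "".join(pattern)

# the same periodic loop body, entered from two different initial accesses,
# yields hit/miss sequences that stay out of phase for the whole run --
# the histories never converge, the hallmark of a domino effect
body = ["a", "b", "c"] * 4
print(mru_trace(["a"] + body))   # MHMMHMHMHMHMH
print(mru_trace(["c"] + body))   # MMMHMHMHMHMHM
```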

    Primary and successive events in the Madden–Julian Oscillation

    Conventional analyses of the MJO tend to produce a repeating cycle, such that any particular feature cannot be unambiguously attributed to the current or previous event. We take advantage of the sporadic nature of the MJO and classify each observed Madden-Julian (MJ) event as either primary, with no immediately preceding MJ event, or successive, which immediately follows a preceding event. Forty percent of MJ events are primary events. Precursor features of the primary events can be unambiguously attributed to that event. A suppressed convective anomaly grows and decays in situ over the Indian Ocean prior to the start of most primary MJ events. An associated mid-tropospheric temperature anomaly destabilises the atmosphere, leading to the generation of the active MJ event. Hence, primary MJ events appear to be thermodynamically triggered by a previous dry period, although stochastic forcing may also be important. Other theories predict that boundary-layer convergence, humidity, propagation of dynamical structures around the Equator, sea surface temperatures, and lateral forcing by extratropical transients may all be important in triggering an event. Although precursor signals from these mechanisms are diagnosed from reanalysis and satellite observational data in the successive MJ events, they are all absent in the primary MJ events. Hence, it appears that these apparent precursor signals are part of the MJO once it is established, but do not play a role in the spontaneous generation of the MJO. The most frequent starting location of the primary events is the Indian Ocean, but over half of them start elsewhere, from the maritime continent to the western Pacific.
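
    The primary/successive split can be sketched as a simple rule over an event catalogue: an event is successive if it begins within some maximum gap of the previous event's end, and primary otherwise. The event dates and the 10-day gap below are invented for illustration; the paper's criterion is more involved:

```python
# Label each MJ-style event as primary or successive by the gap since the
# previous event. Dates and the gap threshold are hypothetical.
def classify_events(events, max_gap_days=10):
    """events: list of (start_day, end_day) tuples, sorted by start_day."""
    labelled, prev_end = [], None
    for start, end in events:
        if prev_end is not None and start - prev_end <= max_gap_days:
            labelled.append((start, end, "successive"))
        else:
            labelled.append((start, end, "primary"))
        prev_end = end
    return labelled

events = [(0, 40), (45, 90), (200, 250), (255, 300), (400, 430)]
for start, end, kind in classify_events(events):
    print(f"day {start}-{end}: {kind}")
# 3 of the 5 toy events come out primary; the paper finds about 40% primary
```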