
    Platform independent tool for local event correlation

    Event correlation plays a crucial role in network management systems, helping to reduce the volume of event messages and to make their meaning clearer to a human operator. In early network management systems, events were correlated only at network management servers. Most modern network management systems also provide means for local event correlation at agents, in order to increase the scalability of the system and to reduce network load. Unfortunately, the event correlation tools currently available are commercial, quite expensive, and highly platform dependent. The author presents a free, platform-independent tool called sec for correlating network management events locally at an agent's side.
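The kind of local correlation such a tool performs can be sketched as a small rule engine. This is a minimal illustration, not sec's actual rule language: the rule fields, threshold-compression policy, and summary template below are all assumptions.

```python
import re

# Hypothetical sketch of threshold-based event compression (rule fields
# and the summary template are illustrative, not sec's rule language):
# if a rule's pattern matches at least `threshold` events inside a time
# window, forward one summary event instead of every raw message.
class CorrelationRule:
    def __init__(self, pattern, window, threshold, summary):
        self.pattern = re.compile(pattern)
        self.window = window        # correlation window in seconds
        self.threshold = threshold  # matches needed before compressing
        self.summary = summary      # printf-style summary template

def correlate(events, rule):
    """events: (timestamp, message) pairs; returns forwarded messages."""
    out = [msg for t, msg in events if not rule.pattern.search(msg)]
    matched = [(t, msg) for t, msg in events if rule.pattern.search(msg)]
    i = 0
    while i < len(matched):
        t0 = matched[i][0]
        burst = [msg for t, msg in matched[i:] if t - t0 <= rule.window]
        if len(burst) >= rule.threshold:
            out.append(rule.summary % len(burst))  # compress the burst
        else:
            out.extend(burst)
        i += len(burst)
    return out
```

Running such a rule at the agent is what reduces the message volume before anything reaches the management server.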

    VIOLA - A multi-purpose and web-based visualization tool for neuronal-network simulation output

    Neuronal network models and corresponding computer simulations are invaluable tools for interpreting the relationship between neuron properties, connectivity, and measured activity in cortical tissue. Spatiotemporal patterns of activity propagating across the cortical surface, as observed experimentally, can for example be described by neuronal network models with layered geometry and distance-dependent connectivity. The interpretation of the resulting stream of multi-modal and multi-dimensional simulation data calls for integrating interactive visualization steps into existing simulation-analysis workflows. Here, we present a set of interactive visualization concepts called views for the visual analysis of activity data in topological network models, and a corresponding reference implementation, VIOLA (VIsualization Of Layer Activity). The software is a lightweight, open-source, web-based, and platform-independent application combining and adapting modern interactive visualization paradigms, such as coordinated multiple views, for massively parallel neurophysiological data. For a use-case demonstration we consider spiking activity data of a two-population, layered point-neuron network model subject to a spatially confined excitation originating from an external population. With the multiple coordinated views, an explorative and qualitative assessment of the spatiotemporal features of neuronal activity can be performed ahead of a detailed quantitative analysis of specific aspects of the data. Furthermore, ongoing efforts, including the European Human Brain Project, aim at providing online user portals for integrated model development, simulation, analysis, and provenance tracking, wherein interactive visual analysis tools are one component. Browser-compatible, web-technology-based solutions are therefore required. Within this scope, VIOLA provides a first prototype. Comment: 38 pages, 10 figures, 3 tables

    Precise Request Tracing and Performance Debugging for Multi-tier Services of Black Boxes

    As more and more multi-tier services are developed from commercial components or heterogeneous middleware without source code available, both developers and administrators need a precise request tracing tool to help understand and debug performance problems of large concurrent services of black boxes. Previous work falls short in several ways: it either accepts the imprecision of probabilistic correlation methods, or relies on knowledge of protocols to isolate requests in pursuit of tracing accuracy. This paper introduces a tool named PreciseTracer to help debug performance problems of multi-tier services of black boxes. Our contributions are two-fold: first, we propose a precise request tracing algorithm for multi-tier services of black boxes, which uses only application-independent knowledge; second, we present a component activity graph abstraction to represent causal paths of requests and facilitate end-to-end performance debugging. The low overhead and tolerance of noise make PreciseTracer a promising tracing tool for use on production systems.
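The causal-path idea, reconstructing a request's route through black-box tiers from application-independent observations (who sent what to whom, and when), can be sketched as follows. The message format and component names are assumptions for illustration, not PreciseTracer's actual interfaces.

```python
# Hedged sketch (not the PreciseTracer implementation): rebuild one
# request's causal path from intercepted messages, using nothing but
# timestamps and endpoint identities -- no protocol knowledge.
def causal_path(messages):
    """messages: (timestamp, sender, receiver) tuples for one request,
    e.g. captured at the socket layer in any order.
    Returns the ordered component chain and per-hop latencies."""
    ordered = sorted(messages)            # time order gives causality
    path = [ordered[0][1]]                # first sender starts the path
    for t, src, dst in ordered:
        path.append(dst)
    # time between consecutive hops approximates per-tier latency
    latencies = [b[0] - a[0] for a, b in zip(ordered, ordered[1:])]
    return path, latencies
```

Chaining many such paths into a graph, with components as nodes and observed message causality as edges, yields the component activity graph the abstract describes.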

    PS-Sim: A Framework for Scalable Simulation of Participatory Sensing Data

    The emergence of smartphones and the participatory sensing (PS) paradigm has paved the way for a new variant of pervasive computing. In PS, human users perform sensing tasks and generate notifications, typically in exchange for incentives. These notifications are real-time, large-volume, and multi-modal, and are eventually fused by the PS platform to generate a summary. One major limitation of PS is the sparsity of notifications owing to a lack of active participation, which inhibits large-scale real-life experiments for the research community. On the flip side, the research community always needs ground truth to validate the efficacy of proposed models and algorithms. Most PS applications involve human mobility and report generation following the sensing of an event of interest in the surrounding environment. This work is an attempt to study and empirically model human participation behavior and event occurrence distributions through the development of a location-sensitive data simulation framework called PS-Sim. Extensive experiments show that the synthetic data generated by PS-Sim replicates real participation and event-occurrence behaviors in PS applications, and may be used for validation purposes in the absence of ground truth. As a proof of concept, we used a real-life dataset from a vehicular traffic management application to train the models in PS-Sim and cross-validated the simulated data against other parts of the same dataset. Comment: Published in Proceedings of the IEEE International Conference on Smart Computing (SMARTCOMP-2018)
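A minimal sketch of the kind of synthetic notification generation such a framework performs, assuming a simple Poisson process per location; PS-Sim's actual models are trained on real data and are considerably richer, so the rates and location names here are made up.

```python
import random

# Minimal sketch (not PS-Sim's learned models): generate synthetic
# participatory-sensing notifications as a Poisson process per location,
# with a location-specific rate standing in for participation intensity.
def simulate_notifications(rates, horizon, seed=42):
    """rates: {location: expected notifications per hour};
    horizon: simulation length in hours.
    Returns (time, location) tuples sorted by time."""
    rng = random.Random(seed)  # seeded for reproducible synthetic data
    events = []
    for loc, lam in rates.items():
        t = rng.expovariate(lam)           # exponential inter-arrivals
        while t < horizon:
            events.append((t, loc))
            t += rng.expovariate(lam)
    return sorted(events)
```

Fitting the per-location rate to an observed dataset, rather than fixing it by hand, is the step where a framework like PS-Sim would learn participation behavior from real traces.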

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim to establish this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.

    Response and Recovery of the Comanche Carbonate Platform Surrounding Multiple Cretaceous Oceanic Anoxic Events, Northern Gulf of Mexico

    The ubiquity of carbonate platforms throughout the Cretaceous Period is recognized as a product of high eustatic sea level and a distinct climatic optimum induced by rapid sea-floor spreading and elevated levels of atmospheric carbon dioxide. Notably, a series of global oceanic anoxic events (OAEs) punctuate this time interval and mark periods of significantly reduced free oxygen in the world's oceans. The best records of these events are often from one-dimensional shelf or basin sections where only abrupt shifts between oxygenated carbonates and anoxic shales are recorded. The Comanche Platform of central Texas provides a unique opportunity to study these events within a well-constrained stratigraphic framework in which their up-dip and down-dip sedimentologic effects can be observed and the recovery of the platform to equilibrium states can be timed and understood. Stable isotope data from whole cores in middle Hauterivian through lower Campanian mixed carbonate-siliciclastic strata are used to construct a 52-myr carbon isotope reference profile for the northern Gulf of Mexico. Correlation of this composite curve to numerous global reference profiles permits identification of several anoxic events and allows their impact on platform architecture and facies distribution to be documented. Oceanic anoxic events 1a, 1b, 1d, and 2 occurred immediately before, after, or during shale deposition in the Pine Island Member, Bexar Member, Del Rio Formation, and Eagle Ford Group, respectively. Oceanic anoxic event 3 corresponds to deposition of the Austin Chalk Group. Platform drowning on three occasions more closely coincided with globally recognized anoxic sub-events such as the Fallot, Albian-Cenomanian, and Mid-Cenomanian events. This illustrates that the specific anoxic event most affecting a given carbonate platform varied globally as a function of regional oceanographic circumstances.
Using chemo- and sequence-stratigraphic observations, a four-stage model is proposed to describe the changing facies patterns, fauna, sediment accumulation rates, platform architectures, and relative sea-level trends of transgressive-regressive composite sequences that developed in response to global carbon-cycle perturbations. The four phases of platform evolution comprise the equilibrium, crisis, anoxic, and recovery stages. The equilibrium stage is characterized by progradational shelf geometries and coral-rudist phototrophic faunal assemblages. Similar phototrophic fauna typify the crisis stage; however, incipient biocalcification crises of this phase led to retrogradational shelf morphologies, transgressive facies patterns, and increased clay mineral proportions. Anoxic stages of the Comanche Platform were coincident with background deposition of organic-rich shale on drowned shelves and heterotrophic fauna dominated by oysters or coccolithophorids. Eustatic peaks of this stage were of moderate amplitude (~30 m), yet relative sea-level rises were greatly enhanced by reduced sedimentation rates. In the recovery stage, heterotrophic carbonate factories re-established at the shoreline as progradational ramp systems and sediment accumulation rates slowly increased as dysoxia diminished. Full recovery to equilibrium conditions may or may not have followed. Geochemical and stratigraphic trends present in the four stages are consistent with increased volcanism along mid-ocean ridges and in large igneous provinces as primary drivers of Cretaceous OAEs and the resulting transgressive-regressive composite sequences. (C) 2014 Elsevier Ltd. All rights reserved.

    A task and performance analysis of endoscopic submucosal dissection (ESD) surgery

    BACKGROUND: ESD is an endoscopic technique for en bloc resection of gastrointestinal lesions. ESD is widely used in Japan and throughout Asia but is not as prevalent in Europe or the US. The procedure is technically challenging and has a higher rate of adverse events (bleeding, perforation) than endoscopic mucosal resection. Inadequate training platforms and the lack of established training curricula have restricted its wide acceptance in the US. Thus, we aim to develop a Virtual Endoluminal Surgery Simulator (VESS) for objective ESD training and assessment. In this work, we performed task and performance analysis of ESD surgeries. METHODS: We performed a detailed colorectal ESD task analysis and identified the critical ESD steps for lesion identification, marking, injection, circumferential cutting, dissection, intraprocedural complication management, and post-procedure examination. We constructed a hierarchical task tree that elaborates the order of tasks in these steps. Furthermore, we developed quantitative ESD performance metrics. We measured task times and scores of 16 ESD surgeries performed by four different endoscopic surgeons. RESULTS: The average times of the marking, injection, and circumferential cutting phases are 203.4 s (σ: 205.46), 83.5 s (σ: 49.92), and 908.4 s (σ: 584.53), respectively. Cutting the submucosal layer takes the largest share of the overall ESD procedure time, with an average of 1394.7 s (σ: 908.43). We also performed correlation analysis (Pearson's test) among the performance scores of the tasks. There is a moderate positive correlation (R = 0.528, p = 0.0355) between marking scores and total scores, and a strong positive correlation (R = 0.7879, p = 0.0003) between circumferential cutting and submucosal dissection scores and total scores. Similarly, we noted a strong positive correlation (R = 0.7095, p = 0.0021) between circumferential cutting and submucosal dissection scores and marking scores.
CONCLUSIONS: We elaborated the ESD tasks and developed quantitative performance metrics for use in the analysis of actual surgery performance. These ESD metrics will be used in future validation studies of our VESS simulator.
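The reported correlations use Pearson's r, which for two score vectors can be computed as below; the score values in the usage test are made-up illustration data, not the study's measurements.

```python
import math

# Pearson's product-moment correlation coefficient between two equal-
# length score vectors: covariance divided by the product of the
# standard deviations. Returns a value in [-1, 1].
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

With 16 surgeries, each reported R comes with a p-value testing the null hypothesis of zero correlation, which is where figures like p = 0.0355 originate.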

    Event tracking for real-time unaware sensitivity analysis (EventTracker)

    This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works. This paper introduces a platform for online Sensitivity Analysis (SA) that is applicable in large-scale real-time data acquisition (DAQ) systems. Here we use the term real-time in the context of a system that has to respond to externally generated input stimuli within a finite and specified period. Complex industrial systems such as manufacturing, healthcare, transport, and finance require high-quality information on which to base timely responses to events occurring in their volatile environments. The motivation for the proposed EventTracker platform is the assumption that modern industrial systems are able to capture data in real time and have the necessary technological flexibility to adjust to changing system requirements. The flexibility to adapt can only be assured if data is succinctly interpreted and translated into corrective actions in a timely manner. An important factor that facilitates data interpretation and information modelling is an appreciation of the effect system inputs have on each output at the time of occurrence. Many existing sensitivity analysis methods hamper efficient and timely analysis due to a reliance on historical data, or sluggishness in providing a timely solution that would be of use in real-time applications. This inefficiency is further compounded by computational limitations and the complexity of some existing models.
In dealing with real-time event-driven systems, the underpinning logic of the proposed method is based on the assumption that in the vast majority of cases changes in input variables will trigger events. Every single event, or combination of events, could subsequently result in a change to the system state. The proposed event-tracking sensitivity analysis method describes variables and the system state as a collection of events. The higher the numeric occurrence of an input variable at the trigger level during an event monitoring interval, the greater its impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis method with a comparable Entropy-based method. An improvement of 10% in computational efficiency without loss of accuracy was observed. The comparison also showed that the time taken to perform the sensitivity analysis was 0.5% of that required by the comparable Entropy-based method.
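The trigger-counting idea can be sketched as follows. The input names, threshold values, and rising-edge convention are assumptions for illustration, not the paper's exact formulation.

```python
from collections import Counter

# Hedged sketch of event-tracking sensitivity analysis: within one
# monitoring interval, count how often each input variable crossed its
# trigger threshold; inputs that triggered more often are ranked as
# having greater impact on the system state.
def rank_inputs(samples, thresholds):
    """samples: list of {input_name: value} readings in one interval;
    thresholds: {input_name: trigger level}.
    Returns inputs sorted by trigger count, most influential first."""
    triggers = Counter()
    prev = {}
    for reading in samples:
        for name, value in reading.items():
            crossed = value >= thresholds[name]
            if crossed and not prev.get(name, False):
                triggers[name] += 1  # rising edge counts as one event
            prev[name] = crossed
    return [name for name, _ in triggers.most_common()]
```

Because the counting is a single pass over streaming readings with no dependence on historical data, it fits the real-time constraint the paper emphasizes.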