
    Probabilistic seismic hazard function based on spatiotemporal earthquake likelihood simulation and Akaike information criterion: The PSHF study around off the west coast of Sumatra Island before large earthquake events

    This study constructs the probabilistic seismic hazard function (PSHF) before large earthquake events using a hypothesis-testing earthquake forecast algorithm based on the Akaike information criterion (AIC). The motivation for using the AIC is to better assess the reliability of the model used to construct the PSHF. The PSHF, as a function of the b-value, is calculated over a 5-year window with a 1-year moving step (instantaneous PSHF) before each large earthquake event. The AIC is computed from the likelihood of success and failure using shallow-earthquake catalog data from around the west coast of Sumatra Island. An occurrence counts as a success when the probability of an event of magnitude greater than or equal to a given magnitude exceeds its average probability; otherwise, it counts as a failure. Seismic potency is determined from the likelihood of an earthquake occurring within several decades or a century. The seismicity-rate model integrates pre-seismic shallow crustal movement data with the shallow crustal earthquake catalog. The AIC is then calculated from the likelihood of success and failure as a function of b(t), the change in the b-value over time, estimated from shallow earthquake data spanning 1963 to 2016. In addition, the AIC before the M7.9 event of 2000, the M8.5 event of 2007, and the M7.8 event of 2010 is assessed. A δAIC is introduced, defined as (AIC_model − AIC_reference) over the observation time; a positive δAIC implies a higher likelihood of a large earthquake, and a negative value a lower one. By plotting observation time against δAIC together with the PSHF estimated as a function of b(t), we can identify a large positive gradient and an increase in the PSHF at each probability of exceedance (PE) level before a great earthquake. This behavior occurs consistently for all three evaluated events, suggesting that the results of this study may be beneficial for probabilistic seismic hazard analysis (PSHA) and seismic mitigation.
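
    The abstract does not spell out the likelihood formulation, so the following Python sketch is only a rough illustration of the two ingredients named above: Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value, and an AIC comparison of two one-parameter success/failure (Bernoulli) models. All catalog magnitudes, rates, and counts are invented for illustration.

```python
import math

def b_value_aki(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood Gutenberg-Richter b-value.

    Uses magnitudes >= completeness magnitude m_c; the dm/2 term
    corrects for magnitude binning of width dm.
    """
    m = [x for x in mags if x >= m_c]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

def bernoulli_aic(successes, failures, p):
    """AIC = 2k - 2 ln L for a one-parameter Bernoulli model with rate p."""
    log_l = successes * math.log(p) + failures * math.log(1.0 - p)
    return 2 * 1 - 2 * log_l

# Hypothetical catalog slice and forecast rates -- not values from the paper.
mags = [4.6, 5.1, 4.8, 5.9, 4.7, 5.3, 6.2, 4.9, 5.0, 5.5]
print("b-value:", round(b_value_aki(mags, m_c=4.5), 2))

aic_model = bernoulli_aic(successes=7, failures=3, p=0.6)
aic_reference = bernoulli_aic(successes=7, failures=3, p=0.5)
# The paper's delta AIC convention: AIC_model - AIC_reference.
print("deltaAIC:", aic_model - aic_reference)
```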

    A Probabilistic Logic Programming Event Calculus

    We present a system for recognising human activity given a symbolic representation of video content. The input of our system is a set of time-stamped short-term activities (STA) detected on video frames. The output is a set of recognised long-term activities (LTA), which are pre-defined temporal combinations of STA. The constraints on the STA that, if satisfied, lead to the recognition of an LTA have been expressed using a dialect of the Event Calculus. In order to handle the uncertainty that naturally occurs in human activity recognition, we adapted this dialect to a state-of-the-art probabilistic logic programming framework. We present a detailed evaluation and comparison of the crisp and probabilistic approaches through experimentation on a benchmark dataset of human surveillance videos. (Comment: Accepted for publication in the Theory and Practice of Logic Programming (TPLP) journal.)
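
    The abstract names the framework only as a "state-of-the-art probabilistic logic programming framework"; as a loose, hypothetical illustration of the recognition idea, the Python sketch below detects one long-term activity from time-stamped short-term activities under a simple temporal constraint, combining detector confidences as a product (i.e., assuming independence, which a real probabilistic logic program would not need to do).

```python
from dataclasses import dataclass

@dataclass
class STA:
    """A detected short-term activity on a video frame (hypothetical schema)."""
    label: str      # e.g. "active", "walking"
    actor: str
    time: int       # frame timestamp
    prob: float     # detector confidence

def recognise_meeting(stas, max_gap=10):
    """Toy long-term activity: two actors 'active' within max_gap frames.

    Confidences are combined as a product, i.e. assuming independent
    detections -- a simplification of the probabilistic inference the
    paper's logic programming framework performs.
    """
    hits = []
    active = [s for s in stas if s.label == "active"]
    for a in active:
        for b in active:
            if a.actor < b.actor and abs(a.time - b.time) <= max_gap:
                hits.append((a.time, a.actor, b.actor, a.prob * b.prob))
    return hits

stas = [
    STA("active", "id1", time=100, prob=0.8),
    STA("active", "id2", time=104, prob=0.7),
]
print(recognise_meeting(stas))  # -> [(100, 'id1', 'id2', 0.56)] up to float rounding
```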

    Causal reasoning through intervention

    In light of the theory of Special Relativity, is a Passage of Time and the argument of the Presentist untenable?

    In light of the Special Theory of Relativity and the Minkowski creation of 'spacetime', the universe is taken to be a four-dimensional entity in which bodies exist within a temporally extended reality. The implications of the Special Theory of Relativity liken the nature of the universe to a 'block' within which all events coexist equally in spacetime. Such a view strikes against the very essence of presentism, which holds that all that exists is the instantaneous state of objects in the present moment: with respect to the present moment, events divide cleanly into past and future, but those regions do not exist in reality, and the universe is a three-dimensional entity. The consequences of a four-dimensional universe are disturbing, to say the least, for our everyday human experience: once-objective facts about reality become dependent upon an observer's relative motion, and the extent of true free will in a Block Universe becomes debatable. This paper examines arguments that seek to rescue the presentist view in light of Special Relativity, so that such four-dimensionalist implications do not have to be accepted. Two approaches are considered. The first accepts that presentism is incompatible with Special Relativity and seeks to show that the theory is ultimately false. The second holds that it is the Block Universe interpretation of Special Relativity that is wrong, and that a version of presentism can be reconciled with Special Relativity. The paper expounds and critically examines both approaches to review whether the case for three-dimensionalism and a fundamental passage of time can be made.
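
    The block-universe reading of Special Relativity rests on the relativity of simultaneity. As a quick numerical illustration (not from the paper), the Python sketch below applies the Lorentz time transformation t' = gamma * (t - v*x/c^2) to two events that are simultaneous in one frame and shows that a moving observer assigns them different times.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_time(t, x, v):
    """Time coordinate of event (t, x) in a frame moving at velocity v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * (t - v * x / C**2)

# Two events simultaneous (t = 0) in the rest frame, one light-second apart.
x1, x2 = 0.0, C * 1.0
v = 0.5 * C  # observer moving at half the speed of light

t1 = lorentz_time(0.0, x1, v)
t2 = lorentz_time(0.0, x2, v)
print(t1, t2)  # 0.0 vs about -0.577 s: simultaneity is frame-dependent
```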

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
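
    To make the event-stream output described above concrete, here is a minimal Python sketch of the idealized event-camera pixel model: an event (time, x, y, polarity) fires whenever the log-intensity at a pixel drifts by more than a contrast threshold from its level at the last event. The threshold and brightness samples are invented, and no specific sensor interface is implied.

```python
import math

def pixel_events(samples, x, y, threshold=0.2):
    """Emit (t, x, y, polarity) events from per-pixel brightness samples.

    samples: list of (timestamp, intensity). An event fires whenever the
    log-intensity moves more than `threshold` away from the reference
    level set at the last event -- the idealized event-pixel model.
    """
    events = []
    t0, i0 = samples[0]
    ref = math.log(i0)
    for t, intensity in samples[1:]:
        diff = math.log(intensity) - ref
        while abs(diff) >= threshold:
            polarity = 1 if diff > 0 else -1
            events.append((t, x, y, polarity))
            ref += polarity * threshold  # advance the reference level
            diff = math.log(intensity) - ref
    return events

# Brightness ramps up then drops: one ON event, then one OFF event.
samples = [(0, 100), (1, 110), (2, 135), (3, 135), (4, 90)]
print(pixel_events(samples, x=5, y=7))  # [(2, 5, 7, 1), (4, 5, 7, -1)]
```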

    Dependability checking with StoCharts: Is train radio reliable enough for trains?

    Performance, dependability, and quality of service (QoS) are prime aspects of the UML modelling domain. To capture these aspects effectively in the design phase, we have recently proposed StoCharts, a conservative extension of UML statechart diagrams. In this paper, we apply the StoCharts formalism to a safety-critical design problem. We model a part of the European Train Control System specification, focusing on the risks of wireless communication failures in future high-speed cross-European trains. Stochastic model checking with the model checker PROVER enables us to derive constraints under which the central quality requirements are satisfied by the StoCharts model. The paper illustrates the flexibility and maturity of StoCharts for modelling real problems in safety-critical system design.
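
    The abstract does not state the checked property; as a hypothetical back-of-the-envelope illustration of the kind of QoS requirement involved, the Python sketch below computes the probability that a radio message is delivered within a deadline under independent retransmission attempts. The failure rate, attempt time, and deadline are invented, and this is far simpler than stochastic model checking of a StoCharts model.

```python
def prob_delivered_by_deadline(p_fail, attempt_time, deadline):
    """P(message delivered within deadline) under independent retries.

    Each attempt takes attempt_time seconds and fails with probability
    p_fail; attempts repeat until success or the deadline expires, so
    delivery fails only if every attempt that fits in the deadline fails.
    """
    attempts = int(deadline // attempt_time)
    return 1.0 - p_fail ** attempts

# Hypothetical numbers: 5% loss per attempt, 1 s per attempt, 5 s deadline.
p = prob_delivered_by_deadline(p_fail=0.05, attempt_time=1.0, deadline=5.0)
print(f"{p:.10f}")  # ~0.9999996875; compare against a required QoS bound
```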