
    IoT-based Lava Flood Early Warning System with Rainfall Intensity Monitoring and Disaster Communication Technology

    A lava flood is a volcanic hazard that typically occurs when heavy rain falls at the summit of a volcano. The flood carries volcanic material from the upstream to the downstream reaches of a river, affecting populated areas located quite far from the volcano's peak. An advanced early warning system for cold lava floods is therefore vital. This paper presents a reliable, remote Early Warning System (EWS) specifically designed for lava flood detection, together with its disaster communication system. The proposed system consists of two main subsystems: lava flood detection and disaster communication. It utilizes a modified automatic rain gauge; a novel vibration sensor configuration; a fuzzy decision tree algorithm; IoT-capable ESP microcontrollers; and disaster communication tools (WhatsApp, SMS, radio communication). According to the experimental results, the rainfall detection prototype using a tipping-bucket rain gauge sensor can measure heavy and moderate rainfall intensities with 81.5% accuracy. Meanwhile, the earthquake vibration detection prototype using a geophone sensor can remove noise from passing-car vibrations with a Kalman filter and measure high- and medium-intensity vibrations with 89.5% accuracy. Sensor measurements are sent to a web server, and the disaster mitigation team uses the web server's data to evacuate residents via the disaster communication methods. The proposed system was successfully implemented at Mount Merapi, Indonesia, in coordination with the local Disaster Risk Reduction (DRR) forum. Doi: 10.28991/esj-2021-SP1-011
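The abstract mentions denoising geophone readings with a Kalman filter before classifying vibration intensity. The paper's actual filter parameters and state model are not given; the following is a minimal sketch of a scalar Kalman filter under assumed process noise `q` and measurement noise `r` values, applied to a synthetic vibration trace:

```python
import random

def kalman_filter_1d(measurements, q=1e-3, r=0.5):
    """Minimal scalar Kalman filter: tracks a slowly varying
    ground-vibration level from noisy geophone readings.
    q and r are illustrative noise parameters, not the paper's."""
    x, p = 0.0, 1.0          # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q               # predict: variance grows by process noise
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # correct with measurement z
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

# Synthetic trace: constant vibration level 2.0 plus Gaussian noise
random.seed(0)
raw = [2.0 + random.gauss(0, 0.5) for _ in range(200)]
smoothed = kalman_filter_1d(raw)
```

The smoothed signal settles near the true level while suppressing the measurement noise, which is the property the system relies on to separate car-induced noise from genuine seismic vibration.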

    Numerical Integration as and for Probabilistic Inference

    Numerical integration or quadrature is one of the workhorses of modern scientific computing and a key operation to perform inference in intractable probabilistic models. The epistemic uncertainty about the true value of an analytically intractable integral identifies the integration task as an inference problem itself. Indeed, numerical integration can be cast as a probabilistic numerical method known as Bayesian quadrature (BQ). BQ leverages structural assumptions about the function to be integrated via properties encoded in the prior. A posterior belief over the unknown integral value emerges by conditioning the BQ model on an actively selected point set and corresponding function evaluations. Iterative updates to the Bayesian model turn BQ into an adaptive quadrature method that quantifies its uncertainty about the solution of the integral in a principled way. This thesis traces out the scope of probabilistic integration methods and highlights types of integration tasks that BQ excels at. These arise when sample efficiency is required and encodable prior knowledge about the integration problem of low to moderate dimensionality is at hand. The first contribution addresses transfer learning with BQ. It extends the notion of active learning schemes to cost-sensitive settings where cheap approximations to an expensive-to-evaluate integrand are available. The degeneracy of acquisition policies in simple BQ is lifted upon generalization to the multi-source, cost-sensitive setting. This motivates the formulation of a set of desirable properties for BQ acquisition functions. A second application considers integration tasks arising in statistical computations on Riemannian manifolds that have been learned from data. Unsupervised learning algorithms that respect the intrinsic geometry of the data rely on the repeated estimation of expensive and structured integrals. Our custom-made active BQ scheme outperforms conventional integration tools for Riemannian statistics. 
Despite their undeniable benefits, BQ schemes provide limited flexibility to construct suitable priors while keeping the inference step tractable. In a final contribution, we identify the ubiquitous problem of computing multivariate normal probabilities as a type of integration task that is structurally taxing for BQ. The method proposed instead is an elegant algorithm based on Markov chain Monte Carlo that permits both sampling from and estimating the normalization constant of linearly constrained Gaussians that contain an arbitrarily small probability mass. As a whole, this thesis contributes to the wider goal of advancing integration algorithms to satisfy the needs imposed by contemporary probabilistic machine learning applications.
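The core BQ mechanism described above (a GP prior on the integrand, conditioned on evaluations at selected nodes) admits a closed-form posterior mean for the integral when the kernel is RBF and the measure is uniform. The following is a minimal sketch of that computation, not the thesis's implementation; the lengthscale, jitter, and the uniform measure on [0, 1] are illustrative assumptions:

```python
import math
import numpy as np

def bq_posterior_mean(f, nodes, lengthscale=0.3, jitter=1e-8):
    """Bayesian quadrature sketch: a GP prior with an RBF kernel and a
    uniform measure on [0, 1] yields a closed-form posterior mean for
    the integral of f over [0, 1]."""
    x = np.asarray(nodes, dtype=float)
    y = np.array([f(xi) for xi in x])
    l = lengthscale
    # Gram matrix of the RBF kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * l ** 2))
    K += jitter * np.eye(len(x))
    # Kernel mean embedding z_i = integral of k(x, x_i) over [0, 1],
    # available in closed form via the error function
    c = l * math.sqrt(math.pi / 2)
    z = np.array([c * (math.erf((1 - xi) / (l * math.sqrt(2)))
                       + math.erf(xi / (l * math.sqrt(2)))) for xi in x])
    weights = np.linalg.solve(K, z)   # BQ quadrature weights K^{-1} z
    return float(weights @ y)         # posterior mean of the integral

# Example: estimate the integral of x^2 over [0, 1] (true value 1/3)
estimate = bq_posterior_mean(lambda t: t * t, np.linspace(0.0, 1.0, 10))
```

The posterior variance (omitted here for brevity) is what turns this into the adaptive, uncertainty-aware quadrature method the thesis builds on: it quantifies how much the belief over the integral would shrink with further evaluations.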