
    Towards a cache-aware spacetime decomposition for the three-dimensional heat equation

    Random Forest location prediction from social networks during disaster events.

    Rapid location and classification of data posted on social networks during time-critical situations such as natural disasters, crowd movements and terrorism is a very useful way to gain situational awareness and to plan response efforts. Twitter, as a successful real-time micro-blogging social medium, is increasingly used to improve resilience during extreme weather events and emergency management situations, including earthquakes. It is used during crises to communicate potential risks and their impacts and to inform agencies and officials. The geographical location of such events is vital to rescue people in danger or in need of assistance. However, only a few messages contain native geographical coordinates (GPS), so identifying location is a real challenge with Twitter data during critical situations, and the identification of tweets and their precise location is still inaccurate. In this work, we propose a semi-supervised technique that exploits unlabeled data, which is often abundant at the onset of a crisis event, along with a small amount of labeled data. Specifically, we adopt an iterative Random Forest fitting-prediction framework to learn the semi-supervised model
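
    As a concrete illustration of such an iterative fitting-prediction loop, the sketch below implements generic Random Forest self-training in Python with scikit-learn: confident predictions on unlabeled data are promoted to pseudo-labels and fed back into training. The function name, confidence threshold and data layout are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal self-training sketch (assumed pipeline, not the paper's code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def self_training_rf(X_labeled, y_labeled, X_unlabeled,
                     confidence=0.9, max_iters=10):
    """Iteratively fit a Random Forest, then absorb unlabeled samples whose
    highest predicted class probability exceeds `confidence`."""
    X_l, y_l = np.asarray(X_labeled), np.asarray(y_labeled)
    X_u = np.asarray(X_unlabeled)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    for _ in range(max_iters):
        clf.fit(X_l, y_l)
        if len(X_u) == 0:
            break
        proba = clf.predict_proba(X_u)
        confident = proba.max(axis=1) >= confidence
        if not confident.any():
            break  # nothing left that the model is sure about
        pseudo = clf.classes_[proba[confident].argmax(axis=1)]
        X_l = np.vstack([X_l, X_u[confident]])   # grow the training set
        y_l = np.concatenate([y_l, pseudo])      # with pseudo-labels
        X_u = X_u[~confident]
    return clf
```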

    An R package for hybrid uncertainty analysis in natural and environmental risk assessments using probability, imprecise probability and possibility distributions.

    Uncertainty analysis is an unavoidable risk assessment task (for instance for natural hazards, or for environmental issues). In situations where data are scarce, incomplete or imprecise, the systematic and exclusive use of probabilities can be debatable. Over the last years, several alternative mathematical representation methods have been developed to handle in a more flexible manner the lack of knowledge related to the input parameters of risk assessment models. This article presents an R package, HYRISK, dedicated to jointly handling different mathematical representation tools, namely probabilities, possibility distributions and probability functions with imprecise parameters, across the different stages of uncertainty treatment in risk assessments (i.e. uncertainty representation, propagation, sensitivity analysis and decision-making). We support the description with the case study of a dike stability analysis. The package is available at: https://cran.r-project.org/web/packages/HYRISK/index.html
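
    HYRISK itself is an R package; as a language-neutral illustration of the underlying idea, the Python sketch below propagates one probabilistic input (Monte Carlo sampling) jointly with one possibilistic input (alpha-cuts of a triangular possibility distribution), yielding lower and upper bounds on the model output. The model, distributions and parameter values are hypothetical and do not reflect the HYRISK API.

```python
# Hybrid probabilistic/possibilistic propagation sketch (hypothetical model).
import numpy as np

def margin(x, y):
    """Illustrative safety-margin model: resistance minus load."""
    return x - y

rng = np.random.default_rng(0)
alphas = np.linspace(0.0, 1.0, 11)
lo, mode, hi = 1.0, 2.0, 4.0      # triangular possibility distribution for y

lower_out, upper_out = [], []
for x in rng.normal(loc=10.0, scale=1.0, size=1000):  # probabilistic input
    vals = []
    for a in alphas:
        # The alpha-cut of a triangular distribution is the interval
        # [lo + a*(mode - lo), hi - a*(hi - mode)].
        vals.append(margin(x, lo + a * (mode - lo)))
        vals.append(margin(x, hi - a * (hi - mode)))
    lower_out.append(min(vals))   # pessimistic bound for this draw
    upper_out.append(max(vals))   # optimistic bound for this draw

# The sorted bounds yield lower/upper empirical CDFs bracketing the output.
print(np.percentile(lower_out, 5), np.percentile(upper_out, 95))
```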

    IGGI, a computing framework for large-scale parametric simulations: application to uncertainty analysis with TOUGHREACT

    IGGI stands for Infrastructure for Grids, Clusters and Intranets. This research project, partially funded by the French government, aims at developing technologies for accessing and gathering the whole computing resources spread over the intranet of a company, which can include dedicated computing power as well as personal computers. The project is a collaboration between BRGM, INRIA and Mandriva. Scientific computing has evolved over the last few years, and commodity-based components can now run a wide range of applications. In this study, our focus is on the software environment required to carry out large-scale parametric simulations, but we also report on the experience of deploying such an infrastructure in the day-to-day life of an intranet. The IGGI software suite is mainly based on two components. The first, ComputeMode™, smoothly aggregates idle user machines into a virtual computing cluster; this is done through a transparent switch of the PC to a secondary, protected mode from which it boots from the ComputeMode server, taking advantage of the PXE protocol. The second IGGI component is CIGRI, which schedules and executes computing tasks on the idle nodes of clusters. Special attention has been paid to the end-users' ability to perform parametric simulations easily. Transparent access to the batch scheduler, checkpointing and migration of the application are exposed for the test case of uncertainty analysis with TOUGHREACT: "Uncertainty in predictions of transfer model response to a thermal and alkaline perturbation in clay". The calculations reported were carried out using idle computing capacity on personal computers inside BRGM. This new architecture is particularly well suited to exploring the wide range of parameter values such studies require
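
    The scheduling machinery described above (CIGRI dispatching independent parametric tasks to idle nodes) follows the classic bag-of-tasks pattern. The sketch below reproduces that pattern on a single machine with Python's standard library; `run_toughreact`, the parameter names and the values are hypothetical stand-ins, not the IGGI tooling.

```python
# Bag-of-tasks sketch; `run_toughreact` is a hypothetical stand-in.
import itertools
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_toughreact(params):
    """Run one simulation for one parameter set (here just an echo)."""
    perm, porosity = params
    # A real deployment would stage an input deck and submit a batch job.
    out = subprocess.run(["echo", f"perm={perm} porosity={porosity}"],
                         capture_output=True, text=True)
    return out.stdout.strip()

if __name__ == "__main__":
    # The Cartesian product of parameter values defines the sweep.
    sweep = itertools.product([1e-18, 1e-17, 1e-16], [0.05, 0.10, 0.15])
    with ProcessPoolExecutor() as pool:
        for result in pool.map(run_toughreact, sweep):
            print(result)
```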

    SURICATE-Nat: Innovative citizen-centered platform for Twitter-based natural disaster monitoring

    It is often difficult to promptly perceive the extent of the consequences of a natural disaster when roads are cut, the mobile phone network is saturated, and fragmentary information from the stricken area only trickles in. It is however on the basis of this diagnosis that disaster management, relief, assistance to the victims, and monitoring of the dynamics of the phenomenon must be organized. During a natural disaster, social media, especially Twitter, sometimes remain one of the only sources of in-situ information, as was the case just after the disastrous Haiti earthquake of 2010. Some of the messages exchanged make it possible to get a clear idea of the consequences thanks to testimonies, photos or videos. It is with this in mind that we have developed a participatory platform for semi-automatic analysis of Twitter posts related to natural disasters. Called "SURICATE-Nat", this French-speaking online platform (www.suricatenat.fr) aims to exploit informative messages posted on Twitter immediately during and after the occurrence of natural disasters. The platform structures and enriches the flow of raw data from Twitter into a human-based instrumental stream that can then be analyzed like the streams coming from technological sensors (e.g. seismometers, GPS, piezometers, etc.). In addition, responding to the willingness of Internet users to take an active part in the analysis of their own data, SURICATE-Nat offers participatory functionalities that recognize the value of "citizen expertise", allowing each user to bring their own testimony and take part in the manual classification of tweets posted by others. This methodology aims to take advantage of the complementarity of artificial intelligence and participative approaches, in order to provide robust indicators for decision support
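
    A minimal sketch of the kind of keyword pre-filtering such a platform might apply before automatic and citizen classification is given below; the `Tweet` record, keyword list and routing comment are illustrative assumptions, not SURICATE-Nat's actual pipeline.

```python
# Keyword pre-filtering sketch (illustrative, not the SURICATE-Nat pipeline).
from dataclasses import dataclass

HAZARD_KEYWORDS = {"earthquake", "flood", "séisme", "inondation"}

@dataclass
class Tweet:
    text: str
    has_gps: bool

def is_candidate(tweet: Tweet) -> bool:
    """Keep tweets that mention at least one monitored hazard keyword."""
    tokens = {w.strip(".,!?#\"'").lower() for w in tweet.text.split()}
    return bool(tokens & HAZARD_KEYWORDS)

stream = [Tweet("Big earthquake felt downtown!", has_gps=False),
          Tweet("Lunch was great", has_gps=True)]
candidates = [t for t in stream if is_candidate(t)]
# Candidates would then be classified automatically, with participatory
# manual labeling confirming and enriching the machine output.
print(len(candidates))  # -> 1
```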

    Integrating strong-motion recordings and Twitter data for a rapid shakemap of macroseismic intensity

    Rapid estimation of the intensity of seismic ground motions is crucial for an effective rapid response when an earthquake occurs. To this end, updated maps of ground-motion fields (or shakemaps) are produced by using observations or measurements in near real time to better constrain initial estimates. In this work, two types of observations are integrated to generate shakemaps right after an earthquake: the common type of data recorded by physical sensors (seismic stations) and the data extracted from social sensors (Twitter), or the combination of both. We investigate an approach to extract an approximation of the macroseismic intensity from social sensors 10 min after the earthquake; the approach relies on Twitter feeds to define the "felt area" where the earthquake was felt by the population, and the "unfelt locations" where the earthquake was not reported. Two recent earthquakes in France of moderate magnitude are studied and the results are compared to the official macroseismic intensity maps for validation. For the two studied cases, we note that Peak Ground Acceleration recordings far from the epicenter tend to underestimate the entire macroseismic field, and that the tweets from "felt areas" are complementary for a better estimation of the intensity shakemap. We highlight the importance and the limits of each type of observation when generating seismic shakemaps
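
    A minimal sketch of one ingredient of this approach, turning geolocated "felt"/"unfelt" reports into gridded constraints usable alongside station data, is given below; the cell size, input format and majority rule are illustrative assumptions rather than the paper's exact procedure.

```python
# Gridding "felt"/"unfelt" reports (illustrative cell size and data).
import numpy as np

def grid_felt_reports(lons, lats, felt_flags, cell_deg=0.1):
    """Return a per-cell map: 1.0 felt, 0.0 unfelt, NaN unconstrained."""
    lons, lats = np.asarray(lons), np.asarray(lats)
    felt_flags = np.asarray(felt_flags, dtype=float)
    lon_edges = np.arange(lons.min(), lons.max() + cell_deg, cell_deg)
    lat_edges = np.arange(lats.min(), lats.max() + cell_deg, cell_deg)
    felt, _, _ = np.histogram2d(lons, lats, [lon_edges, lat_edges],
                                weights=felt_flags)
    total, _, _ = np.histogram2d(lons, lats, [lon_edges, lat_edges])
    with np.errstate(invalid="ignore"):
        frac_felt = felt / total          # NaN where a cell has no report
        felt_cell = (frac_felt >= 0.5).astype(float)
    return np.where(total > 0, felt_cell, np.nan)

# Three reports: two "felt" near the epicenter, one "unfelt" further away.
cells = grid_felt_reports([2.30, 2.31, 2.90], [48.85, 48.86, 48.60], [1, 1, 0])
print(cells)
```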

    Deriving the 100-Year Total Water Level around the Coast of Corsica by Combining Trivariate Extreme Value Analysis and Coastal Hydrodynamic Models

    As low-lying coastal areas can be impacted by flooding caused by dynamic components that are dependent on each other (wind, waves, water levels: tide, atmospheric surge, currents), the analysis of the return period of a single component is not representative of the return period of the total water level at the coast, and it is important to assess a joint return period of all the components. Based on a semiparametric multivariate extreme value analysis, we determined the joint probabilities that significant wave heights (Hs), wind intensity at 10 m above the ground (U), and still water level (SWL) jointly exceed imposed thresholds all along the coasts of Corsica Island (Mediterranean Sea). We also considered the covariates peak direction (Dp), peak period (Tp), and wind direction (Du). Here, we focus on providing extreme scenarios to populate the coastal hydrodynamic models SWAN and SWASH-2DH, in order to compute the 100-year total water level (100y-TWL) all along the coasts. We show how the proposed multivariate extreme value analysis can help to define more accurately the low-lying zones potentially exposed to coastal flooding, especially in Corsica where a single value of 2 m was used in previous studies. The computed 100y-TWL values are between 1 m along the eastern coasts and a maximum of 1.8 m on the western coast. The calculated values are also below the 2.4 m threshold recommended when considering sea level rise (SLR). This highlights the added value of fully integrating extreme offshore conditions, together with their dependence, into hydrodynamic simulations for screening out the coastal areas potentially exposed to flooding
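
    To make the notion of a joint return period concrete, the sketch below estimates an empirical joint exceedance probability for three correlated surrogate variables and converts it into a return period; the dependence structure, thresholds and event rate are purely illustrative, not the paper's semiparametric model.

```python
# Empirical joint exceedance sketch (synthetic data, illustrative thresholds).
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
# Correlated surrogates for (Hs, U, SWL); Gaussian dependence for the demo.
z = rng.multivariate_normal([0.0, 0.0, 0.0],
                            [[1.0, 0.6, 0.4],
                             [0.6, 1.0, 0.3],
                             [0.4, 0.3, 1.0]], size=n)
hs  = 2.0 + 1.0 * z[:, 0]    # significant wave height (m)
u10 = 10.0 + 3.0 * z[:, 1]   # wind speed at 10 m (m/s)
swl = 0.5 + 0.2 * z[:, 2]    # still water level (m)

# Probability that all three components exceed their thresholds together.
p_joint = np.mean((hs > 4.0) & (u10 > 15.0) & (swl > 0.9))

# With `rate` independent sea states per year, the joint return period is
# 1 / (rate * p_joint); a single-component return period computed the same
# way would be much shorter, which is the whole point of the joint analysis.
rate = 365.25
print(f"P = {p_joint:.4f}, return period = {1.0 / (rate * p_joint):.1f} years")
```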

    Thermal and volumetric properties of complex aqueous electrolyte solutions using the Pitzer formalism – The PhreeSCALE code

    Highlights:
    • We described the apparent properties of saline solutions with the Pitzer formalism.
    • We implemented these calculations in the PHREEQC geochemical calculation code.
    • PhreeSCALE is tested on several systems.
    • We used PhreeSCALE to revise the interaction parameters for the Na2SO4–H2O and MgSO4–H2O systems.

    The thermal and volumetric properties of complex aqueous solutions are described according to the Pitzer equation, explicitly taking into account the speciation in the aqueous solutions. The thermal properties are the apparent relative molar enthalpy (Lφ) and the apparent molar heat capacity (Cp,φ). The volumetric property is the apparent molar volume (Vφ). Equations describing these properties are obtained from the temperature or pressure derivatives of the excess Gibbs energy and make it possible to calculate the dilution enthalpy (ΔHdil), the heat capacity (cp) and the density (ρ) of aqueous solutions up to high concentrations. Their implementation in PHREEQC V.3 (Parkhurst and Appelo, 2013) is described and has led to a new numerical tool, called PhreeSCALE. It was tested first using a set of parameters (specific interaction parameters and standard properties) from the literature for two binary systems (Na2SO4–H2O and MgSO4–H2O), for the quaternary K–Na–Cl–SO4 system (heat capacity only) and for the Na–K–Ca–Mg–Cl–SO4–HCO3 system (density only). The results obtained with PhreeSCALE are in agreement with the literature data when the same standard solution heat capacity (Cp0) and volume (V0) values are used. For further applications of this improved computation tool, these standard solution properties were calculated independently, using the Helgeson–Kirkham–Flowers (HKF) equations. With this kind of approach, most of the Pitzer interaction parameters from the literature become obsolete, since they are not coherent with the standard properties calculated according to the HKF formalism. Consequently a new set of interaction parameters must be determined. This approach was successfully applied to the Na2SO4–H2O and MgSO4–H2O binary systems, providing a new set of optimized interaction parameters, consistent with the standard solution properties derived from the HKF equations
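
    The abstract states that these apparent properties follow from temperature and pressure derivatives of the excess Gibbs energy. For a binary solution with n_s moles of solute, the standard textbook relations (general thermodynamics, not reproduced from the paper itself) read:

```latex
% Standard relations: apparent properties from T- and P-derivatives of the
% excess Gibbs energy G^ex, for a binary with n_s moles of solute.
\begin{align}
  L_\phi &= \frac{H^{\mathrm{ex}}}{n_s}
          = -\frac{T^{2}}{n_s}
            \left(\frac{\partial\,(G^{\mathrm{ex}}/T)}{\partial T}\right)_{P},\\
  C_{p,\phi} &= C_{p}^{0}
          + \left(\frac{\partial L_\phi}{\partial T}\right)_{P},\\
  V_\phi &= V^{0} + \frac{1}{n_s}
            \left(\frac{\partial G^{\mathrm{ex}}}{\partial P}\right)_{T}
\end{align}
```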