
    Rockfall Magnitude-Frequency Relationship Based on Multi-Source Data from Monitoring and Inventory

    Quantitative hazard analysis of rockfalls is a fundamental tool for sustainable risk management, all the more so in places where the preservation of natural heritage and people's safety must be properly balanced. The first step consists of determining the magnitude-frequency relationship, which answers an apparently simple question: how big and how often will a rockfall detach from anywhere in the cliff? However, data on past activity from which to derive a quantitative answer are usually scarce. Methods are proposed to optimize the exploitation of multi-source inventories, introducing sampling extent as a main attribute for the analysis. This work explores the maximum possible synergy between data sources as different as traditional inventories of observed events and current remote sensing techniques. The two information sources may converge to provide complementary results in the magnitude-frequency relationship, with the strengths of each compensating for the weaknesses of the other. The results allow the hazardous conditions of rockfall detachment to be characterized and reveal many of the underlying conditioning factors, which are analyzed in this paper. The hazard shows high variability over time and space, with strong dependence on external factors. The magnitude-frequency scenarios must therefore be read appropriately, depending on the risk management application (e.g., hazard zoning, quantitative risk analysis, or actions that bring forecasting closer). In this sense, some criteria and proxies for hazard assessment are proposed in the paper.
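
    As a concrete illustration of the magnitude-frequency relationship discussed above, the sketch below fits the commonly used cumulative power law N(V ≥ v) = a·v^(−b) to a small inventory of event volumes. The volumes and the observation period are hypothetical placeholders, not data from the paper.

```python
# Minimal sketch: fit a cumulative rockfall magnitude-frequency power law,
# N(V >= v) = a * v^(-b). All numbers below are hypothetical placeholders.
import numpy as np

volumes_m3 = np.array([0.05, 0.1, 0.2, 0.5, 1.2, 3.0, 8.5, 20.0])  # event volumes (m^3)
years_observed = 10.0  # hypothetical monitoring period

# Annual frequency of events with volume >= v (rank / observation years)
v_sorted = np.sort(volumes_m3)[::-1]
cum_freq = np.arange(1, len(v_sorted) + 1) / years_observed

# Least-squares fit in log-log space: log10 N = intercept + slope * log10 v
slope, intercept = np.polyfit(np.log10(v_sorted), np.log10(cum_freq), 1)
print(f"power-law exponent b = {-slope:.2f}, a = {10**intercept:.3f} events/yr")
```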

    Machine Learning-Based Rockfalls Detection with 3D Point Clouds, Example in the Montserrat Massif (Spain)

    Rock slope monitoring using 3D point cloud data allows the creation of rockfall inventories, provided that an efficient methodology is available to quantify the activity. However, monitoring with high temporal and spatial resolution entails processing a large volume of data, which can become a bottleneck for the processing system. The standard monitoring methodology includes the steps of data capture, point cloud alignment, measurement of differences, clustering of differences, and identification of rockfalls. In this article, we propose a new methodology adapted from existing algorithms (multiscale model-to-model cloud comparison, M3C2, and density-based spatial clustering of applications with noise, DBSCAN) and machine learning techniques to facilitate the identification of rockfalls from compared temporal 3D point clouds, possibly the step requiring the most user interpretation. Point clouds are processed to generate 33 new features related to the rock cliff differences, predominant differences, or orientation, for classification with 11 machine learning models combined with 2 undersampling and 13 oversampling methods. The proposed methodology is divided into two software packages: point cloud monitoring and cluster classification. The prediction model, applied to two case studies in the Montserrat conglomerate massif (Barcelona, Spain), reveals that a 98% reduction in the initial number of clusters is sufficient to identify all of the rockfalls in the first case study. The second case study requires a 96% reduction to identify 90% of the rockfalls, suggesting that the homogeneity of the rockfall characteristics is a key factor for correct prediction by the machine learning models.
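
    The clustering step named above, DBSCAN applied to inter-epoch point-cloud differences, can be illustrated with a minimal sketch using scikit-learn. The synthetic coordinates and the eps/min_samples parameters are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch: group points flagged as "changed" between two epochs into
# candidate rockfall clusters with DBSCAN. Data and parameters are synthetic.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Hypothetical XYZ coordinates of points whose inter-epoch (e.g. M3C2)
# distance exceeded a change threshold: two synthetic scars plus noise
scar_a = rng.normal(loc=[2.0, 5.0, 10.0], scale=0.1, size=(60, 3))
scar_b = rng.normal(loc=[8.0, 1.0, 14.0], scale=0.1, size=(40, 3))
noise = rng.uniform(low=0.0, high=15.0, size=(20, 3))
changed_points = np.vstack([scar_a, scar_b, noise])

# eps is the neighborhood radius in metres; min_samples the density threshold
labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(changed_points)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(f"{n_clusters} candidate rockfall clusters; "
      f"{np.sum(labels == -1)} points rejected as noise")
```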

    Reseñas [Reviews]


    Extended use of dual antiplatelet therapy among older adults with acute coronary syndromes and associated variables: a cohort study

    Current guidelines recommend extending the use of dual antiplatelet therapy (DAPT) beyond 1 year in patients with an acute coronary syndrome (ACS) and a high risk of ischaemia and low risk of bleeding. No data exist about the implementation of this strategy in older adults in routine clinical practice. Methods: We conducted a Spanish multicentre, retrospective, observational registry-based study that included patients with ACS but no thrombotic or bleeding events during the first year of DAPT after discharge and no indication for oral anticoagulants. High bleeding risk was defined according to the Academic Research Consortium definition. We assessed the proportion of patients ≥65 years in whom DAPT was extended beyond 1 year after hospitalisation for ACS and the variables associated with this strategy. Results: We found that 48.1% (928/1,928) of patients were aged ≥65 years. DAPT was continued beyond 1 year in 32.1% (298/928) of patients ≥65 years, a proportion similar to that of their younger counterparts. There was no significant correlation between a high bleeding risk and DAPT duration. In contrast, there was a strong correlation between the extent of coronary disease and DAPT duration (p<0.001). Other variables associated with extended DAPT were a higher left ventricular ejection fraction, a history of heart failure, and prior stent thrombosis. Conclusion: There was no correlation between age and extended use of DAPT beyond 1 year in older patients with ACS. DAPT was extended in about one-third of patients ≥65 years. The severity of the coronary disease, prior heart failure, left ventricular ejection fraction, and prior stent thrombosis all correlated with extended DAPT.

    Serum Potassium Dynamics During Acute Heart Failure Hospitalization

    Background. Available information about the prognostic implications of altered potassium levels in the setting of acute heart failure (AHF) is scarce. Objectives. We aim to describe the prevalence of dyskalemia (hypo- or hyperkalemia), its dynamic changes during AHF hospitalization, and its long-term clinical impact after hospitalization. Methods. We analyzed 1779 patients hospitalized with AHF who were included in the REDINSCOR II registry. Patients were classified into three groups according to potassium levels both on admission and at discharge: hypokalemia (potassium < 3.5 mEq/L), normokalemia (potassium 3.5–5 mEq/L), and hyperkalemia (potassium > 5 mEq/L). Results. The prevalence of hypokalemia and hyperkalemia on admission was 8.2% and 4.6%, respectively, and 6.4% and 2.7% at discharge. Hyperkalemia on admission was associated with higher in-hospital mortality (OR = 2.32 [95% CI: 1.04–5.21], p = 0.045). Among patients with hypokalemia on admission, 79% had normalized potassium levels at discharge. Among patients with hyperkalemia on admission, 89% normalized kalemia before discharge. In multivariate Cox regression, dyskalemia was associated with higher 12-month mortality (HR = 1.48 [95% CI: 1.12–1.96], p = 0.005). Among all patterns of dyskalemia, persistent hypokalemia (HR = 3.17 [95% CI: 1.71–5.88], p < 0.001) and transient hyperkalemia (HR = 1.75 [95% CI: 1.07–2.86], p = 0.023) were related to reduced 12-month survival. Conclusions. Potassium level alterations are frequent and show dynamic behavior during AHF admission. Hyperkalemia on admission is an independent predictor of higher in-hospital mortality. Furthermore, persistent hypokalemia and transient hyperkalemia on admission are independent predictors of 12-month mortality. This work is funded by the Instituto de Salud Carlos III (Ministry of Economy, Industry, and Competitiveness) and co-funded by the European Regional Development Fund, through the CIBER in cardiovascular diseases (CB16/11/00502).
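
    For readers unfamiliar with how hazard ratios like those above are estimated, the following minimal sketch fits a multivariate Cox proportional-hazards model with the lifelines library. It uses the small example dataset bundled with lifelines, not the REDINSCOR II registry data or the paper's model.

```python
# Minimal sketch: multivariate Cox proportional-hazards regression with
# lifelines. load_rossi is a small example dataset shipped with the library
# (recidivism data), used here only to demonstrate the modeling step.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                # columns: week (duration), arrest (event), covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
print(cph.hazard_ratios_)        # HR = exp(coef) for each covariate
```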

    Short- and Long-Term Prognosis of Patients With Takotsubo Syndrome Based on Different Triggers: Importance of the Physical Nature

    Background. Takotsubo syndrome (TTS) is an acute, reversible heart condition initially believed to represent a benign pathology because of its self-limiting clinical course; however, little is known about its prognosis under different triggers. This study compared short- and long-term outcomes of TTS according to the triggering factor, focusing on the various physical triggering events. Methods and Results. We analyzed patients with a definitive TTS diagnosis recruited for the Spanish National Registry on TTS (RETAKO [Registry on Takotsubo Syndrome]). Short- and long-term outcomes were compared between groups defined by triggering factor. A total of 939 patients were included. An emotional trigger was detected in 340 patients (36.2%), a physical trigger in 293 patients (31.2%), and no trigger could be identified in 306 patients (32.6%). The main physical triggers observed were infections (30.7%), followed by surgical procedures (22.5%), physical activities (18.4%), episodes of severe hypoxia (18.4%), and neurological events (9.9%). TTS triggered by physical factors showed higher short- and long-term mortality, and within this group, patients whose physical trigger was hypoxia had the worst prognosis. A physical trigger, together with age >70 years, diabetes mellitus, left ventricular ejection fraction <30%, and shock on admission, was associated with an increased long-term mortality risk. Conclusions. TTS triggered by physical factors may present a worse prognosis in terms of mortality. Under the TTS label there may be very different, as yet undifferentiated, clinical profiles whose recognition could lead to better individual management; the general perception of TTS as having a benign prognosis should therefore be ruled out.

    Antithrombotic Therapy in Elderly Patients with Acute Coronary Syndromes

    The treatment of acute coronary syndrome (ACS) in elderly patients continues to be a challenge because of the characteristics of this population and the lack of data and specific recommendations. This review summarizes the current evidence on critical points of oral antithrombotic therapy in elderly patients. To this end, we discuss the peculiarities and differences reported for dual antiplatelet therapy (DAPT) in ACS management in elderly patients and what might be the best option considering the characteristics of this population. Furthermore, we analyze antithrombotic strategies in patients with atrial fibrillation (AF), with a particular focus on those who also present with coronary artery disease (CAD). It is imperative to deepen our knowledge of the management of these challenging patients through real-world data and specifically designed geriatric studies to help resolve the questions that remain open.

    Gaia Data Release 1. Summary of the astrometric, photometric, and survey properties

    Context. At about 1000 days after the launch of Gaia we present the first Gaia data release, Gaia DR1, consisting of astrometry and photometry for over 1 billion sources brighter than magnitude 20.7. Aims. A summary of Gaia DR1 is presented along with illustrations of the scientific quality of the data, followed by a discussion of the limitations due to the preliminary nature of this release. Methods. The raw data collected by Gaia during the first 14 months of the mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC) and turned into an astrometric and photometric catalogue. Results. Gaia DR1 consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the HIPPARCOS and Tycho-2 catalogues – a realisation of the Tycho-Gaia Astrometric Solution (TGAS) – and a secondary astrometric data set containing the positions for an additional 1.1 billion sources. The second component is the photometric data set, consisting of mean G-band magnitudes for all sources. The G-band light curves and the characteristics of ∼3000 Cepheid and RR-Lyrae stars, observed at high cadence around the south ecliptic pole, form the third component. For the primary astrometric data set the typical uncertainty is about 0.3 mas for the positions and parallaxes, and about 1 mas yr⁻¹ for the proper motions. A systematic component of ∼0.3 mas should be added to the parallax uncertainties. For the subset of ∼94 000 HIPPARCOS stars in the primary data set, the proper motions are much more precise, at about 0.06 mas yr⁻¹. For the secondary astrometric data set, the typical uncertainty of the positions is ∼10 mas. The median uncertainties on the mean G-band magnitudes range from the mmag level to ∼0.03 mag over the magnitude range 5 to 20.7. Conclusions. Gaia DR1 is an important milestone ahead of the next Gaia data release, which will feature five-parameter astrometry for all sources. Extensive validation shows that Gaia DR1 represents a major advance in the mapping of the heavens and the availability of basic stellar data that underpin observational astrophysics. Nevertheless, the very preliminary nature of this first Gaia data release does lead to a number of important limitations to the data quality which should be carefully considered before drawing conclusions from the data.
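
    As a practical pointer, the TGAS subset of Gaia DR1 described above can be queried with ADQL through the astroquery package. The sketch below assumes the public ESA Gaia archive schema for DR1 (table gaiadr1.tgas_source); the magnitude cut is an arbitrary illustration.

```python
# Minimal sketch: retrieve a few TGAS sources from the ESA Gaia archive.
# Requires the astroquery package; the query runs against the public archive.
from astroquery.gaia import Gaia

query = """
SELECT TOP 10 source_id, ra, dec, parallax, pmra, pmdec, phot_g_mean_mag
FROM gaiadr1.tgas_source
WHERE phot_g_mean_mag < 8
ORDER BY parallax DESC
"""
job = Gaia.launch_job(query)   # synchronous query job
table = job.get_results()      # returns an astropy Table
print(table)
```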