1,765 research outputs found

    CryoSat-2 satellite radar altimetry for river analysis and modelling

    Get PDF

    Concept and Implementation of a Mobile Interactive Floor Plan

    Get PDF
    The relocation of a company facility always comes with great strategic and organizational challenges. A new site needs to be found, employees need to relocate to the new location, workplaces need to be assigned to the employees, conference rooms need to be added to the reservation system and the staff needs to be introduced to the new building. This process can cause high financial costs as well as reduced employee performance during the period of acclimatization. To minimize the additional stress created by these changes, a lot of effort needs to be put into informing the employees about news concerning the relocation. Guided tours through the building need to be organized, regularly appearing newsletters need to be created and events to integrate new hires into the company culture need to be held. New technologies and the wide coverage of mobile devices provide a great opportunity to support the employees and can offer modern, intuitive assistance both during the adjustment period and in everyday work. This thesis covers the development of an interactive floor plan designed to help employees find a certain person or conference room in the building. It discusses the challenges during design and development and gives an overview of the future development goals of the project.

    Long short-term memory networks enhance rainfall-runoff modelling at the national scale of Denmark

    Get PDF
    This study explores the application of long short-term memory (LSTM) networks to simulate runoff at the national scale of Denmark using data from 301 catchments. This is the first application of LSTMs to Danish data. The results were benchmarked against the Danish national water resources model (DK-model), a physically based hydrological model. The median Kling-Gupta Efficiency (KGE), a common metric for assessing the performance of runoff predictions (optimum of 1), increased from 0.7 (DK-model) to 0.8 (LSTM) when trained against all catchments. Overall, the LSTM outperformed the DK-model in 80% of catchments. Despite the compelling KGE evaluation, the water balance closure was modelled less accurately by the LSTM. The applicability of LSTM networks for modelling ungauged catchments was assessed via a spatial split-sample experiment. A 20% spatial hold-out showed poorer performance of the LSTM relative to the DK-model. However, after pre-training, that is, weight initialisation obtained from training against simulated data from the DK-model, the performance of the LSTM was effectively improved. This forms a convincing argument for the knowledge-guided machine learning (ML) paradigm, which integrates physically based models and ML to train robust models that generalise well.
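    As a point of reference for the performance figures quoted above, the Kling-Gupta Efficiency combines linear correlation, a variability ratio and a bias ratio between simulated and observed runoff. Below is a minimal Python sketch of the standard Gupta et al. (2009) formulation; whether the study used this exact variant is an assumption.

        import numpy as np

        def kling_gupta_efficiency(sim, obs):
            """Kling-Gupta Efficiency (Gupta et al., 2009); the optimum is 1."""
            sim, obs = np.asarray(sim, dtype=float), np.asarray(obs, dtype=float)
            r = np.corrcoef(sim, obs)[0, 1]      # linear correlation coefficient
            alpha = sim.std() / obs.std()        # variability ratio
            beta = sim.mean() / obs.mean()       # bias ratio
            return 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)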

    Effects of baryons on weak lensing peak statistics

    Full text link
    Upcoming weak-lensing surveys have the potential to become leading cosmological probes provided all systematic effects are under control. Recently, the ejection of gas due to feedback energy from active galactic nuclei (AGN) has been identified as a major source of uncertainty, challenging the success of future weak-lensing probes in terms of cosmology. In this paper we investigate the effects of baryons on the number of weak-lensing peaks in the convergence field. Our analysis is based on full-sky convergence maps constructed via light-cones from N-body simulations, and we rely on the baryonic correction model of Schneider et al. (2019) to model the baryonic effects on the density field. As a result we find that the baryonic effects strongly depend on the Gaussian smoothing applied to the convergence map. For a DES-like survey setup, a smoothing of θ_k ≳ 8 arcmin is sufficient to keep the baryon signal below the expected statistical error. Smaller smoothing scales lead to a significant suppression of high peaks (with signal-to-noise above 2), while lower peaks are not affected. The situation is more severe for a Euclid-like setup, where a smoothing of θ_k ≳ 16 arcmin is required to keep the baryonic suppression signal below the statistical error. Smaller smoothing scales require a full modelling of baryonic effects since both low and high peaks are strongly affected by baryonic feedback. Comment: 22 pages, 11 figures, JCAP accepted
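    To illustrate the peak statistic itself: peaks are local maxima of the Gaussian-smoothed convergence map above a signal-to-noise threshold. The sketch below is a generic scipy implementation, not the paper's pipeline; the map, pixel scale and noise level are hypothetical inputs.

        import numpy as np
        from scipy.ndimage import gaussian_filter, maximum_filter

        def count_peaks(kappa, pixel_arcmin, theta_k_arcmin, sigma_noise, snr_min=2.0):
            """Count local maxima above a signal-to-noise threshold in a smoothed
            convergence map kappa (2-D array). theta_k_arcmin is the Gaussian
            smoothing scale; sigma_noise is the noise std of the smoothed map."""
            smoothed = gaussian_filter(kappa, sigma=theta_k_arcmin / pixel_arcmin)
            # A pixel counts as a peak if it equals the maximum of its 3x3 neighbourhood.
            is_peak = smoothed == maximum_filter(smoothed, size=3)
            return int(np.sum(is_peak & (smoothed / sigma_noise > snr_min)))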

    Risk Stratification in Post-MI Patients Based on Left Ventricular Ejection Fraction and Heart-Rate Turbulence

    Get PDF
    Objectives: Development of risk stratification criteria for predicting mortality in post-infarction patients, taking into account LVEF and heart-rate turbulence (HRT). Methods: Based on previous results, the two parameters LVEF (continuous) and turbulence slope (TS), as an indicator of HRT, were combined for risk stratification. The method was applied within two independent data sets (the MPIP trial and the EMIAT study). Results: The criteria were defined to match the outcome of applying LVEF ≤ 30 % in sensitivity. In the MPIP trial the optimal criteria selected are TS normal and LVEF ≤ 21 %, or TS abnormal and LVEF ≤ 40 %. Within the placebo group of the EMIAT study the corresponding criteria are TS normal and LVEF ≤ 23 %, or TS abnormal and LVEF ≤ 40 %. Combining both studies, the following criteria were obtained: TS normal and LVEF ≤ 20 %, or TS abnormal and LVEF ≤ 40 %. In the MPIP study 83 of the 581 patients (14.3 %) fulfil these criteria; within this group 30 patients died during follow-up. In the EMIAT trial 218 of the 591 patients (37.9 %) are classified as high-risk patients, with 53 deaths. Combining both studies, the high-risk group contains 301 patients with 83 deaths (ppv = 27.7 %). Using the MADIT criterion as classification rule (LVEF ≤ 30 %), a sample of 375 patients with 85 deaths (ppv = 24 %) is selected. Conclusions: The stratification rule based on LVEF and TS is able to select high-risk patients suitable for implantation of an ICD. The rule performs better than the classical one based on LVEF alone: the high-risk group selected by the new criteria is smaller, with about the same number of deaths, and therefore has a higher positive predictive value. The classification criteria were validated in a bootstrap study with 100 replications. In all samples the rule based on TS and LVEF (NEW) was superior to LVEF alone; the high-risk group was smaller (mean ± s: 301 ± 14.5 (NEW) vs. 375 ± 14.5 (LVEF)) and the positive predictive value was larger (mean ± s: 27.2 ± 2.6 % (NEW) vs. 23.3 ± 2.2 % (LVEF)). The new criteria are also less expensive due to the reduced number of high-risk patients selected.
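    The pooled criterion described above is a simple boolean rule. A hedged Python sketch of that rule and of the positive predictive value used to evaluate it is given below; the data layout is hypothetical.

        def high_risk(ts_abnormal, lvef_percent):
            """Pooled MPIP + EMIAT criterion: TS normal and LVEF <= 20 %,
            or TS abnormal and LVEF <= 40 %."""
            return lvef_percent <= 40.0 if ts_abnormal else lvef_percent <= 20.0

        def positive_predictive_value(patients):
            """patients: iterable of (ts_abnormal, lvef_percent, died) tuples."""
            selected = [p for p in patients if high_risk(p[0], p[1])]
            deaths = sum(1 for _, _, died in selected if died)
            return deaths / len(selected) if selected else float("nan")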

    Effect of hypoxia and hyperoxia on exercise performance in healthy individuals and in patients with pulmonary hypertension: A systematic review

    Full text link
    Exercise performance is determined by oxygen supply to the working muscles and vital organs. In healthy individuals, exercise performance is limited in the hypoxic environment at altitude, when oxygen delivery is diminished due to the reduced alveolar and arterial oxygen partial pressures. In patients with pulmonary hypertension, exercise performance is already reduced near sea level due to impairments of the pulmonary circulation and gas exchange, and, presumably, these limitations are more pronounced at altitude. In studies performed near sea level in healthy subjects as well as in patients with pulmonary hypertension (PH), maximal performance during progressive ramp exercise and endurance of submaximal constant-load exercise were substantially enhanced by breathing oxygen-enriched air. In both healthy individuals and PH patients these improvements were mediated by better arterial, muscular and cerebral oxygenation, along with reduced sympathetic excitation, as suggested by the reduced heart rate and alveolar ventilation at submaximal isoloads, and an improved pulmonary gas exchange efficiency, especially in patients with PH. In summary, in healthy individuals and in patients with pulmonary hypertension, alterations in the inspiratory PO2 by exposure to hypobaric hypoxia or normobaric hyperoxia reduce or enhance exercise performance, respectively, by modifying oxygen delivery to the muscles and the brain, by effects on cardiovascular and respiratory control, and by alterations in pulmonary gas exchange. Understanding these physiologic mechanisms helps in counselling individuals planning altitude or air travel and in prescribing oxygen therapy to patients with pulmonary hypertension.

    A Statistical Model for Risk Stratification on the Basis of Left Ventricular Ejection Fraction and Heart-Rate Turbulence

    Get PDF
    The MPIP data set was used to obtain a model for mortality risk stratification of acute myocardial infarction patients. The predictors heart-rate turbulence (HRT) and left-ventricular ejection fraction (LVEF) were employed. HRT was a categorical variable with three levels; LVEF was continuous, and its influence on the relative risk was modelled by the natural logarithm function (selected using fractional polynomials). A Cox proportional hazards (PH) model with HRT and ln(LVEF) was constructed and used for risk stratification. The model can be used to divide the patients into two or more groups according to mortality risk. It also describes the relationship between risk and predictors by a (continuous) function, which allows the calculation of individual mortality risk.
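    As an illustration of this model class, the sketch below fits a Cox proportional hazards model with a three-level HRT category and ln(LVEF) using the lifelines library; the column names are hypothetical and the original analysis was not necessarily performed with this software.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        def fit_risk_model(df):
            """Fit a Cox PH model with a three-level HRT category and ln(LVEF).
            Expected (hypothetical) columns: 'time' (follow-up), 'event' (1 = death),
            'hrt' (0, 1 or 2) and 'lvef' (in percent)."""
            data = df.copy()
            data["ln_lvef"] = np.log(data["lvef"])
            # Dummy-code the categorical HRT variable; level 0 is the reference.
            data = pd.get_dummies(data, columns=["hrt"], prefix="hrt", drop_first=True)
            cph = CoxPHFitter()
            cph.fit(data.drop(columns=["lvef"]), duration_col="time", event_col="event")
            return cph  # hazard ratios for the HRT levels and for ln(LVEF)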

    Substances from unregulated drug markets - A retrospective data analysis of customer-provided samples from a decade of drug checking service in Zurich (Switzerland)

    Get PDF
    BACKGROUND: Drug checking services (DCS) are harm reduction interventions for people who consume illicit substances. Unregulated drug markets lead to samples with unexpected and variable contents. A retrospective data analysis of Zurich's DCS was performed to determine the nature of these samples. METHODS: This study investigates the qualitative and quantitative properties of 16,815 customer-provided psychoactive drug samples analyzed chemically through the DCS in Zurich from 1st January 2011 to 31st December 2021. The main analytical methods used to characterize these substances were high-performance liquid chromatography and gas chromatography-mass spectrometry. Data sets are summarized using descriptive statistics. RESULTS: There was a 2.5-fold increase in the number of tested samples over the past decade. An overall proportion of 57.9% (weighted mean) of samples within our database showed unexpected analytical findings and additionally low sample content during the observation period. Substantial differences in quality and quantity between substance groups were detected, and an increase in sample quality and content over time was demonstrated. CONCLUSIONS: Chemical analysis reveals that over half of the substances acquired from unregulated drug markets and analyzed through the DCS in Zurich have low qualitative and quantitative properties, which may expose users to risks. Based on longitudinal analyses over a decade, this study contributes to the body of evidence that DCS may influence unregulated drug markets towards providing better-quality substances and may stabilize these markets over time. The necessity for drug policy changes to make this service accessible in further settings was highlighted, as DCS still often take place in legal grey zones.
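    The aggregation behind figures such as the 57.9% weighted mean can be sketched with pandas as follows; the column names and the per-sample "unexpected finding" flag are hypothetical.

        import pandas as pd

        def yearly_unexpected_share(samples):
            """Share of samples per year with an unexpected analytical finding or
            low content; expects (hypothetical) columns 'year' and 'unexpected'."""
            return samples.groupby("year")["unexpected"].mean()

        def overall_weighted_share(samples):
            """Overall proportion weighted by the number of samples per year,
            i.e. the pooled share across all samples."""
            return samples["unexpected"].mean()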