Blow-up behavior of collocation solutions to Hammerstein-type Volterra integral equations
We analyze the blow-up behavior of one-parameter collocation solutions for Hammerstein-type Volterra integral equations (VIEs) whose solutions may blow up in finite time. To approximate such solutions (and the corresponding blow-up time), we introduce an adaptive stepsize strategy that guarantees the existence of collocation solutions whose blow-up behavior matches that of the exact solution. Based on the local convergence of the collocation methods for VIEs, we present a convergence analysis for the numerical blow-up time. Numerical experiments illustrate the analysis.
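As an illustration only (a minimal sketch, not the method analyzed in the paper), the idea of combining a low-order quadrature/collocation discretization of a Hammerstein VIE u(t) = u0 + ∫_0^t K(t,s) G(u(s)) ds with a stepsize that shrinks as the solution grows can be written as follows; the kernel K ≡ 1, the nonlinearity G(u) = u^p, the relative-increment step rule, and the blow-up threshold are all assumptions chosen so that the exact blow-up time is known:

    def blow_up_vie(u0=1.0, p=2.0, h0=1e-2, tol=1e-3, u_max=1e8, n_max=100_000):
        # Toy problem (an assumption, not the paper's): u(t) = u0 + int_0^t u(s)**p ds,
        # i.e. K == 1 and G(u) = u**p, which blows up at T = u0**(1-p) / (p - 1) for p > 1.
        t, u, integral = 0.0, u0, 0.0
        for _ in range(n_max):
            g = u ** p                    # Hammerstein nonlinearity G(u(t_n))
            h = min(h0, tol * u / g)      # adaptive step: keep each increment below tol * u_n
            integral += h * g             # left-rectangle (one-point collocation-type) quadrature
            t += h
            u = u0 + integral
            if u >= u_max:                # exceeding the threshold is taken as numerical blow-up
                return t                  # estimate of the blow-up time
        return None                       # no blow-up detected within n_max steps

    print(blow_up_vie())

For u0 = 1 and p = 2 the exact blow-up time is T = 1 and this sketch returns roughly 1.001, with the error shrinking as the tolerance decreases; it is meant only to show why the step must shrink with the growing solution, not to reproduce the paper's convergence results.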
An Arctic Disaster and its Policy Implications
The purpose of the research reported here is to help the community in Barrow, Alaska, clarify its vulnerability to extreme weather events, and devise better-informed policies for reducing that vulnerability and adapting to climate variability and change. We examine the worst disaster on record there - a storm that struck on 3 October 1963 - from different disciplinary perspectives and in the context of other severe storms. The major policy responses to date have been a beach nourishment program, a feasibility study of additional means of erosion control, and an emergency management plan. Additional possible responses have been identified in the community's cumulative experience of these storms, but have not yet been fully explored or implemented. Meanwhile, given inherent uncertainties, it is clear that sound policies will allow for corrective action if and when expectations based on the best available knowledge and information turn out to be mistaken. It is also clear that the people of Barrow are in the best position to understand the evolving situation and to decide what to do about it.
Three-body interactions in colloidal systems
We present the first direct measurement of three-body interactions in a colloidal system consisting of three charged colloidal particles. Two of the particles were confined by means of a scanned laser tweezers to a line-shaped optical trap, where they diffused due to thermal fluctuations. Upon the approach of a third particle, attractive three-body interactions were observed. The results are in qualitative agreement with additionally performed nonlinear Poisson-Boltzmann calculations, which also allow us to investigate the microionic density distributions in the neighborhood of the interacting colloidal particles.
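For context (an illustrative sketch, not the authors' calculation), the nonlinear Poisson-Boltzmann equation referred to above reads ψ''(x) = sinh ψ(x) in dimensionless planar form, with x measured in Debye lengths; the sketch below solves it with scipy's solve_bvp for an assumed surface potential ψ(0) = 2 and ψ(L) = 0 far away, and then evaluates the Boltzmann ion densities. The geometry, boundary values, and grid are assumptions for illustration; the paper's calculations concern three interacting spheres, not a planar wall:

    import numpy as np
    from scipy.integrate import solve_bvp

    def pb_rhs(x, y):
        # y[0] = psi (dimensionless potential), y[1] = psi'
        return np.vstack([y[1], np.sinh(y[0])])

    def pb_bc(ya, yb):
        # assumed boundary conditions: psi(0) = 2, psi(L) = 0
        return np.array([ya[0] - 2.0, yb[0]])

    L = 10.0                                                 # domain length in Debye lengths
    x = np.linspace(0.0, L, 200)
    y0 = np.vstack([2.0 * np.exp(-x), -2.0 * np.exp(-x)])    # Debye-Hueckel profile as initial guess
    sol = solve_bvp(pb_rhs, pb_bc, x, y0)

    psi = sol.sol(x)[0]
    n_minus = np.exp(+psi)   # counter-ion density relative to bulk (enhanced near a positive wall)
    n_plus = np.exp(-psi)    # co-ion density relative to bulk (depleted near a positive wall)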
Estimating Regionalized Hydrological Impacts of Climate Change Over Europe by Performance-Based Weighting of CORDEX Projections
Ensemble projections of future changes in discharge over Europe show large variation. Several methods for performance-based weighting exist that have the potential to increase the robustness of the change signal. Here we use future projections of an ensemble of three hydrological models forced with climate datasets from the Coordinated Downscaling Experiment - European Domain (EURO-CORDEX). The experiment is set up for nine river basins spread over Europe with different climate and catchment characteristics. We evaluate the ensemble consistency and apply two weighting approaches: Climate model Weighting by Independence and Performance (ClimWIP), which focuses on meteorological variables, and Reliability Ensemble Averaging (REA), which in our study is applied to discharge statistics per basin. For basins with a strong climate signal, in Southern and Northern Europe, the consistency in the set of projections is large. For rivers in Central Europe the differences between models become more pronounced. Both weighting approaches assign high weights to single General Circulation Models (GCMs). The ClimWIP method results in ensemble mean weighted changes that differ only slightly from the non-weighted mean. The REA method influences the weighted mean more, but the weights vary strongly from basin to basin. We see that models given high weights for good past performance can provide deviating projections for the future. It is not apparent that the GCM signal dominates the overall change signal, i.e., there is no strong intra-GCM consistency. However, both weighting methods favored projections from the same GCM.
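As background (a sketch under stated assumptions, not the exact configuration used in this study), ClimWIP-style weights are commonly written as w_i ∝ exp(-(D_i/σ_D)²) / (1 + Σ_{j≠i} exp(-(S_ij/σ_S)²)), where D_i is model i's performance distance to observations, S_ij the distance between models i and j, and σ_D, σ_S shape parameters. The distance metrics, shape parameters, and toy numbers below are hypothetical:

    import numpy as np

    def climwip_weights(perf_dist, indep_dist, sigma_d, sigma_s):
        # Performance term: downweights models far from observations.
        perf = np.exp(-(np.asarray(perf_dist, dtype=float) / sigma_d) ** 2)
        # Independence term: downweights models that are close to many others.
        s = np.exp(-(np.asarray(indep_dist, dtype=float) / sigma_s) ** 2)
        np.fill_diagonal(s, 0.0)              # exclude the j == i term
        w = perf / (1.0 + s.sum(axis=1))
        return w / w.sum()                    # normalise so the weights sum to 1

    # Hypothetical usage with 4 ensemble members and toy distances.
    D = [0.3, 0.5, 0.9, 0.4]                  # distances to observations
    S = np.array([[0.0, 0.2, 0.8, 0.7],
                  [0.2, 0.0, 0.9, 0.6],
                  [0.8, 0.9, 0.0, 0.5],
                  [0.7, 0.6, 0.5, 0.0]])      # pairwise model-model distances
    w = climwip_weights(D, S, sigma_d=0.5, sigma_s=0.5)
    change = np.array([12.0, 8.0, -3.0, 10.0])             # e.g. projected % change in discharge
    weighted_mean = np.average(change, weights=w)
    unweighted_mean = change.mean()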
Large violation of Bell inequalities with low entanglement
In this paper we obtain violations of general bipartite Bell inequalities of order √n/log n with n inputs, n outputs and n-dimensional Hilbert spaces. Moreover, we construct explicitly, up to a
random choice of signs, all the elements involved in such violations: the coefficients of the Bell inequalities, POVM measurements and quantum states. Analyzing this construction we find that, even though entanglement is necessary to obtain violation of Bell inequalities, the entropy of entanglement of the underlying state is essentially irrelevant in obtaining large violation. We also indicate why the maximally entangled state is a rather poor candidate for producing large violations with arbitrary coefficients. However, we also show that for Bell inequalities with positive coefficients (in particular, games) the maximally entangled state achieves the largest violation up to a logarithmic factor.
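As background (a standard definition in this literature, not text from the abstract), the size of a Bell inequality violation is usually quantified by comparing the largest value of the Bell functional over quantum strategies with its largest value over classical (local) strategies:

    \[
      \omega(M) = \sup_{P \in \mathcal{L}} \Big| \sum_{x,y,a,b} M^{x,y}_{a,b}\, P(a,b \mid x,y) \Big|, \qquad
      \omega^{*}(M) = \sup_{P \in \mathcal{Q}} \Big| \sum_{x,y,a,b} M^{x,y}_{a,b}\, P(a,b \mid x,y) \Big|, \qquad
      \mathrm{LV}(M) = \frac{\omega^{*}(M)}{\omega(M)},
    \]

where M = (M^{x,y}_{a,b}) are the coefficients of the Bell inequality, \mathcal{L} is the set of classical (local) correlations, and \mathcal{Q} the set of correlations obtainable from POVM measurements on a shared quantum state; whether absolute values are taken is a normalization detail and an assumption here. The abstract's point is then that states attaining a large ratio LV(M) need not have large entropy of entanglement.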
Prognosis research strategy (PROGRESS) 1: a framework for researching clinical outcomes.
The PROGRESS series (www.progress-partnership.org) sets out a framework of four interlinked prognosis research themes and provides examples from several disease fields to show why evidence from prognosis research is crucial to inform all points in the translation of biomedical and health related research into better patient outcomes. Recommendations are made in each of the four papers to improve current research standards.
What is prognosis research? Prognosis research seeks to understand and improve future outcomes in people with a given disease or health condition. However, there is increasing evidence that prognosis research standards need to be improved.
Why is prognosis research important? More people now live with disease and conditions that impair health than at any other time in history; prognosis research provides crucial evidence for translating findings from the laboratory to humans, and from clinical research to clinical practice.
This first article introduces the framework of four interlinked prognosis research themes and then focuses on the first of the themes: fundamental prognosis research, studies that aim to describe and explain future outcomes in relation to current diagnostic and treatment practices, often in relation to quality of care. Fundamental prognosis research provides evidence informing healthcare and public health policy, the design and interpretation of randomised trials, and the impact of diagnostic tests on future outcome. It can inform new definitions of disease, may identify unanticipated benefits or harms of interventions, and clarify where new interventions are required to improve prognosis.
Prognosis research strategy (PROGRESS) 4: Stratified medicine research
In patients with a particular disease or health condition, stratified medicine seeks to identify those who will have the most clinical benefit or least harm from a specific treatment. In this article, the fourth in the PROGRESS series, the authors discuss why prognosis research should form a cornerstone of stratified medicine, especially in regard to the identification of factors that predict individual treatment response.
Data challenges of time domain astronomy
Astronomy has been at the forefront of the development of the techniques and methodologies of data intensive science for over a decade with large sky surveys and distributed efforts such as the Virtual Observatory. However, it faces a new data deluge with the next generation of synoptic sky surveys, which are opening up the time domain for discovery and exploration. This brings both new scientific opportunities and fresh challenges, in terms of data rates from robotic telescopes and exponential complexity in linked data, but also for the data mining algorithms used in classification and decision making. In this paper, we describe how an informatics-based approach, part of the so-called "fourth paradigm" of scientific discovery, is emerging to deal with these challenges. We review our experiences with the Palomar-Quest and Catalina Real-Time Transient Sky Surveys, in particular addressing the issue of the heterogeneity of data associated with transient astronomical events (and other sensor networks) and how to manage and analyze it.
Guideline implementation, drug sequencing, and quality of care in heart failure: design and rationale of TITRATE-HF
Aims: Current heart failure (HF) guidelines recommend prescribing four drug classes in patients with HF with reduced ejection fraction (HFrEF). A clear challenge exists to adequately implement guideline-directed medical therapy (GDMT) regarding the sequencing of drugs and timely attainment of target doses. It is largely unknown how the paradigm shift from a serial and sequential approach to drug therapy towards early parallel application of the four drug classes will be executed in daily clinical practice, as well as the reasons clinicians may not adhere to new guidelines. We present the design and rationale for the real-world TITRATE-HF study, which aims to assess sequencing strategies for GDMT initiation, dose titration patterns (order and speed), intolerance of GDMT, barriers to implementation, and long-term outcomes in patients with de novo, chronic, and worsening HF. Methods and results: A total of 4000 patients with HFrEF, HF with mildly reduced ejection fraction, and HF with improved ejection fraction will be enrolled in >40 Dutch centres with a follow-up of at least 3 years. Data collection will include demographics, physical examination and vital parameters, electrocardiogram, laboratory measurements, echocardiogram, medication, and quality of life. Detailed information on titration steps will be collected for the four GDMT drug classes. Information will include date, primary reason for change, and potential intolerances. The primary clinical endpoints are HF-related hospitalizations, HF-related urgent visits with a need for intravenous diuretics, all-cause mortality, and cardiovascular mortality. Conclusions: TITRATE-HF is a real-world multicentre longitudinal registry that will provide unique information on contemporary GDMT implementation, sequencing strategies (order and speed), and prognosis in de novo, worsening, and chronic HF patients.
Pulmonary artery pressure monitoring in chronic heart failure: effects across clinically relevant subgroups in the MONITOR-HF trial
Background and Aims: In patients with chronic heart failure (HF), the MONITOR-HF trial demonstrated the efficacy of pulmonary artery (PA)-guided HF therapy over standard of care in improving quality of life and reducing HF hospitalizations and mean PA pressure. This study aimed to evaluate the consistency of these benefits in relation to clinically relevant subgroups. Methods: The effect of PA-guided HF therapy was evaluated in the MONITOR-HF trial among predefined subgroups based on age, sex, atrial fibrillation, diabetes mellitus, left ventricular ejection fraction, HF aetiology, cardiac resynchronization therapy, and implantable cardioverter defibrillator. Outcome measures were based upon significance in the main trial, included quality-of-life, clinical, and PA pressure endpoints, and were assessed for each subgroup. Differential effects in relation to the subgroups were assessed with interaction terms; both unadjusted and multiple-testing-adjusted interaction terms are presented. Results: The effects of PA monitoring on quality of life, clinical events, and PA pressure were consistent in the predefined subgroups, without any clinically relevant heterogeneity within or across all endpoint categories (all adjusted interaction P-values were non-significant). In the unadjusted analysis of the primary endpoint, quality-of-life change, weak trends towards a less pronounced effect in older patients (P for interaction = .03; adjusted P for interaction = .33) and patients with diabetes (P for interaction = .01; adjusted P for interaction = .06) were observed. However, these interaction effects did not persist after adjusting for multiple testing. Conclusions: This subgroup analysis confirmed the consistent benefits of PA-guided HF therapy observed in the MONITOR-HF trial across clinically relevant subgroups, highlighting its efficacy in improving quality-of-life, clinical, and PA pressure endpoints in chronic HF patients.
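As a generic illustration (not the trial's actual statistical analysis plan; the variable names, model, and adjustment method below are assumptions), testing for differential treatment effects with an interaction term and then adjusting the interaction P-values for multiple testing can be sketched as:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.multitest import multipletests

    # Hypothetical data: one row per patient, a continuous endpoint (e.g. change in a
    # quality-of-life score), randomised arm, and binary subgroup flags.
    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "qol_change": rng.normal(5, 10, n),
        "arm": rng.integers(0, 2, n),        # 1 = PA-guided therapy, 0 = standard of care
        "older": rng.integers(0, 2, n),      # example subgroup flags (assumptions)
        "diabetes": rng.integers(0, 2, n),
        "af": rng.integers(0, 2, n),
    })

    # Fit one model per subgroup variable with a treatment-by-subgroup interaction term
    # and collect the unadjusted interaction P-values.
    subgroups = ["older", "diabetes", "af"]
    p_raw = [smf.ols(f"qol_change ~ arm * {g}", data=df).fit().pvalues[f"arm:{g}"]
             for g in subgroups]

    # Adjust the interaction P-values for multiple testing (Holm used here as an example).
    reject, p_adj, _, _ = multipletests(p_raw, method="holm")
    for g, p0, p1 in zip(subgroups, p_raw, p_adj):
        print(f"{g}: unadjusted P = {p0:.3f}, adjusted P = {p1:.3f}")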