Fast Ensemble Smoothing
Smoothing is essential to many oceanographic, meteorological and hydrological
applications. The interval smoothing problem updates all desired states within
a time interval using all available observations. The fixed-lag smoothing
problem updates only a fixed number of states prior to the observation at
current time. The fixed-lag smoothing problem is, in general, thought to be
computationally faster than a fixed-interval smoother, and can be an
appropriate approximation for long interval-smoothing problems. In this paper,
we use an ensemble-based approach to fixed-interval and fixed-lag smoothing,
and synthesize two algorithms. The first algorithm produces a linear time
solution to the interval smoothing problem with a fixed factor, and the second
one produces a fixed-lag solution that is independent of the lag length.
Identical-twin experiments conducted with the Lorenz-95 model show that, for lag
lengths approximately equal to the error doubling time or for long intervals,
the proposed methods can provide significant computational savings. These
results suggest that ensemble methods yield both fixed-interval and fixed-lag
smoothing solutions that cost little additional effort over filtering and model
propagation, in the sense that in practical ensemble application the additional
increment is a small fraction of either filtering or model propagation costs.
We also show that fixed-interval smoothing can perform as fast as fixed-lag
smoothing and may be advantageous when memory is not an issue.
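The lag-independence claimed for the second algorithm rests on a standard ensemble-smoother property: the analysis weights computed from the current observation apply unchanged to every stored lagged ensemble, so each additional lag costs only one matrix product. The sketch below illustrates that shared-weights update in a generic perturbed-observation form; the function name and details are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def enks_update(ensembles, Hx, y, obs_var):
    """One illustrative ensemble-smoother analysis step.

    ensembles : list of (n_state, n_ens) arrays, states at past times
    Hx        : (n_obs, n_ens) observed ensemble at the current time
    y         : (n_obs,) observation vector
    obs_var   : scalar observation-error variance
    """
    n_ens = Hx.shape[1]
    S = Hx - Hx.mean(axis=1, keepdims=True)          # obs-space anomalies
    C = S @ S.T / (n_ens - 1) + obs_var * np.eye(len(y))
    # perturbed observations, one per ensemble member
    Y = y[:, None] + rng.normal(0.0, np.sqrt(obs_var), Hx.shape)
    W = np.linalg.solve(C, Y - Hx)                   # same weights for all lags
    updated = []
    for X in ensembles:
        A = X - X.mean(axis=1, keepdims=True)
        P_xy = A @ S.T / (n_ens - 1)                 # lag cross-covariance
        updated.append(X + P_xy @ W)
    return updated
```

Because `W` is computed once per observation time, updating L stored lags costs L cheap cross-covariance products on top of the filter step, which is the sense in which smoothing adds only a small increment over filtering.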
Bioavailability in soils
The consumption of locally-produced vegetables by humans may be an important exposure pathway for soil contaminants in many urban settings and for agricultural land use. Hence, prediction of metal and metalloid uptake by vegetables from contaminated soils is an important part of the Human Health Risk Assessment procedure. The behaviour of metals (cadmium, chromium, cobalt, copper, mercury, molybdenum, nickel, lead and zinc) and metalloids (arsenic, boron and selenium) in contaminated soils depends to a large extent on the intrinsic charge, valence and speciation of the contaminant ion, and on soil properties such as pH, redox status and contents of clay and/or organic matter. However, the chemistry and behaviour of the contaminant in soil alone cannot predict soil-to-plant transfer. Root uptake, root selectivity, ion interactions, rhizosphere processes, leaf uptake from the atmosphere, and plant partitioning are important processes that ultimately govern the accumulation of metals and metalloids in edible vegetable tissues. Mechanistic models to accurately describe all these processes have not yet been developed, let alone validated under field conditions. Hence, to estimate risks from vegetable consumption, empirical models have been used to correlate concentrations of metals and metalloids in contaminated soils, soil physico-chemical characteristics, and concentrations of elements in vegetable tissues. These models should only be used within the bounds of their calibration, and often need to be re-calibrated or validated using local soil and environmental conditions on a regional or site-specific basis.
Mike J. McLaughlin, Erik Smolders, Fien Degryse, and Rene Rietra
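As a concrete illustration of the empirical approach described above, soil-to-plant transfer is commonly expressed as a log-log regression of plant concentration on soil concentration and a soil property such as pH. The function below is a generic sketch of such a model, not any of the authors' calibrated equations; as the abstract stresses, fitted coefficients are only valid within their calibration range.

```python
import numpy as np

def fit_transfer_model(log_c_soil, ph, log_c_plant):
    """Fit log(C_plant) = b0 + b1*log(C_soil) + b2*pH by ordinary
    least squares. Returns the coefficient vector (b0, b1, b2).
    Coefficients are only meaningful within the range of the
    calibration data."""
    X = np.column_stack([np.ones_like(log_c_soil), log_c_soil, ph])
    coef, *_ = np.linalg.lstsq(X, log_c_plant, rcond=None)
    return coef
```

Re-calibrating for local conditions, as recommended above, amounts to refitting these coefficients on regional soil and vegetable data.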
Assessment of right ventricular contractile function by two-dimensional grey-scale strain in a population with pulmonary hypertension
Quantification of right ventricular function continues to evolve, even though its assessment is difficult due to the complex geometry of this cardiac chamber. Objective: To characterize right ventricular function by calculating the strain and the longitudinal strain rate of the right ventricular free wall, and the ejection fraction and volumes of the ventricle, through assessment of strain by two-dimensional speckle tracking, and to compare these with tricuspid annular plane systolic excursion (TAPSE) in patients with pulmonary hypertension and in a healthy population. Method: Observational descriptive study. Results: We included 120 patients, of whom 80 suffered from pulmonary hypertension and 40 were healthy. The overall strain of the right ventricular free wall was significantly lower in the group with pulmonary hypertension compared to healthy subjects (-20.5 ± 6 vs. -25 ± 4.5; p < 0.001); regional strain showed similar behavior. The overall longitudinal strain rate showed no significant differences between groups. We found a significant correlation between TAPSE and ejection fraction of the right ventricle (r = 0.49; p < 0.001) and an inverse correlation between TAPSE and global longitudinal strain of the right ventricular free wall (r = -0.41; p < 0.001). Conclusions: The assessment of regional right ventricular function by two-dimensional speckle tracking can be a useful tool for assessing right ventricular systolic function in patients with pulmonary arterial hypertension.
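For reference, the longitudinal strain values reported above are Lagrangian strains: the fractional systolic shortening of a wall segment relative to its end-diastolic length, expressed as a percentage. A one-line sketch with hypothetical segment lengths:

```python
def longitudinal_strain(l_diastole, l_systole):
    """Lagrangian strain (%): (L_systole - L_diastole) / L_diastole * 100.
    Negative values indicate systolic shortening of the segment."""
    return (l_systole - l_diastole) / l_diastole * 100.0
```

For example, a segment shortening from 10.0 to 7.5 length units gives a strain of -25%, of the same order as the healthy-group global strain quoted above.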
Habit training versus habit training with direct visual biofeedback in adults with chronic constipation: A randomized controlled trial
Aim: The aim was to determine whether specialist-led habit training using Habit Training with Biofeedback (HTBF) is more effective than specialist-led habit training alone (HT) for chronic constipation and whether outcomes of interventions are improved by stratification to HTBF or HT based on diagnosis (functional defaecation disorder vs. no functional defaecation disorder) by radio-physiological investigations (INVEST). Method: This was a parallel three-arm randomized single-blinded controlled trial, permitting two randomized comparisons: HTBF versus HT alone; INVEST- versus no-INVEST-guided intervention. The inclusion criteria were age 18–70 years; attending specialist hospitals in England; self-reported constipation for >6 months; refractory to basic treatment. The main exclusions were secondary constipation and previous experience of the trial interventions. The primary outcome was the mean change in Patient Assessment of Constipation Quality of Life score at 6 months on intention to treat. The secondary outcomes were validated disease-specific and psychological questionnaires and cost-effectiveness (based on EQ-5D-5L). Results: In all, 182 patients were randomized 3:3:2 (target 384): HT n = 68; HTBF n = 68; INVEST-guided treatment n = 46. All interventions had similar reductions (improvement) in the primary outcome at 6 months (approximately −0.8 points of a 4-point scale) with no statistically significant difference between HT and HTBF (−0.03 points; 95% CI −0.33 to 0.27; P = 0.85) or INVEST versus no-INVEST (0.22; −0.11 to 0.55; P = 0.19). Secondary outcomes showed a benefit for all interventions with no evidence of greater cost-effectiveness of HTBF or INVEST compared with HT. Conclusion: The results of the study at 6 months were inconclusive. 
However, with the caveat of under-recruitment and further attrition at 6 months, a simple, cheaper approach to intervention may be as clinically effective as, and more cost-effective than, more complex and invasive approaches.
Accuracy of popular automatic QT Interval algorithms assessed by a 'Gold Standard' and comparison with a Novel method: computer simulation study
BACKGROUND: Accurate measurement of the QT interval is very important from a clinical and pharmaceutical drug safety screening perspective. Expert manual measurement is both imprecise and imperfectly reproducible, yet it is used as the reference standard to assess the accuracy of current automatic computer algorithms, which thus produce reproducible but incorrect measurements of the QT interval. There is a scientific imperative to evaluate the most commonly used algorithms with an accurate and objective 'gold standard' and to investigate novel automatic algorithms if the commonly used algorithms are found to be deficient. METHODS: This study uses a validated computer simulation of 8 different noise-contaminated ECG waveforms (with known QT intervals of 461 and 495 ms), generated from a cell array using Luo-Rudy membrane kinetics and the Crank-Nicolson method, as a reference standard to assess the accuracy of commonly used QT measurement algorithms. Each ECG, contaminated with 39 mixtures of noise at 3 levels of intensity, was first filtered and then subjected to three threshold methods (T1, T2, T3), two T-wave slope methods (S1, S2) and a Novel method. The reproducibility and accuracy of each algorithm were compared for each ECG. RESULTS: The coefficients of variation for methods T1, T2, T3, S1, S2 and Novel were 0.36, 0.23, 1.9, 0.93, 0.92 and 0.62 respectively. For ECGs of real QT interval 461 ms, the methods T1, T2, T3, S1, S2 and Novel calculated the mean QT intervals (standard deviations) to be 379.4 (1.29), 368.5 (0.8), 401.3 (8.4), 358.9 (4.8), 381.5 (4.6) and 464 (4.9) ms respectively. For ECGs of real QT interval 495 ms, the methods T1, T2, T3, S1, S2 and Novel calculated the mean QT intervals (standard deviations) to be 396.9 (1.7), 387.2 (0.97), 424.9 (8.7), 386.7 (2.2), 396.8 (2.8) and 493 (0.97) ms respectively. These results showed significant differences between means at the >95% confidence level.
Shifting ECG baselines caused large errors in the QT interval with T1 and T2 but no error with Novel. CONCLUSION: The algorithms T2, T1 and Novel gave low coefficients of variation for QT measurement. The Novel technique gave the most accurate measurement of the QT interval; T3 (a differential threshold method) was the next most accurate, by a large margin. The objective and accurate 'gold standard' presented in this paper may be useful for assessing new QT measurement algorithms. The Novel algorithm may prove to be a more accurate and reliable method for measuring the QT interval.
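To make the threshold family concrete: a method in the spirit of T1/T2 declares the T-wave end at the first sample after the T peak where the signal falls below a fixed fraction of the peak amplitude above baseline. The sketch below is illustrative only; the paper's actual T1-T3, S1-S2 and Novel algorithms are not specified in the abstract.

```python
import numpy as np

def qt_end_threshold(ecg, t_peak_idx, frac=0.1):
    """Illustrative amplitude-threshold T-wave end detector: returns
    the index of the first sample after the T peak that drops below
    `frac` of the peak amplitude, measured relative to the baseline
    (taken here as the signal median)."""
    baseline = np.median(ecg)
    thresh = baseline + frac * (ecg[t_peak_idx] - baseline)
    below = np.nonzero(ecg[t_peak_idx:] < thresh)[0]
    return t_peak_idx + below[0] if below.size else None
```

Because the threshold is referenced to an estimated baseline, any baseline shift moves the detected crossing point, which is consistent with the large T1/T2 errors under shifting baselines reported above.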
Leaving the Past (Self) Behind: Non-Reporting Rape Survivors' Narratives of Self and Action
Using a symbolic interactionist framework, this study considers the narratives of non-reporting rape survivors. We use interviews to examine the complex processes that inform a survivor’s decision not to report. Rape is not interpreted as an isolated event; it is seen as caused by, connected to, and affecting the survivor’s sense of self and agency. Rape forces the survivor to reconstruct a sense of agency in the aftermath of the traumatic attack. Rather than report the rape, the survivors constructed narratives that direct blame and accountability toward the “old self”. This less visible, yet still agentic, strategy allows the survivors to regain a sense of agency and control. As a result, a more positive, optimistic self can be constructed, while pursuing legal justice would force them to reenact an “old” self that cannot be disentangled from the rape.
Quantum Tunneling in the Wigner Representation
Time dependence for barrier penetration is considered in the phase space. An
asymptotic phase-space propagator for nonrelativistic scattering on a
one-dimensional barrier is constructed. The propagator has a form universal for
various initial state preparations and local potential barriers. It is
manifestly causal and includes time-lag effects and quantum spreading. Specific
features of quantum dynamics which disappear in the standard semi-classical
approximation are revealed. The propagator may be applied to the calculation of
the final momentum and coordinate distributions for particles transmitted
through or reflected from the potential barrier, as well as to elucidating the
tunneling time problem.
Comment: 18 pages, LaTeX, no figures
Study protocol for a group randomized controlled trial of a classroom-based intervention aimed at preventing early risk factors for drug abuse: integrating effectiveness and implementation research
Background: While a number of preventive interventions delivered within schools have shown both short-term and long-term impact in epidemiologically based randomized field trials, programs are not often sustained with high-quality implementation over time. This study was designed to support two purposes. The first purpose was to test the effectiveness of a universal classroom-based intervention, the Whole Day First Grade Program (WD), aimed at two early antecedents to drug abuse and other problem behaviors, namely, aggressive, disruptive behavior and poor academic achievement. The second purpose, the focus of this paper, was to examine the utility of a multilevel structure to support high levels of implementation during the effectiveness trial, to sustain WD practices across additional years, and to train additional teachers in WD practices.
Methods: The WD intervention integrated three components, each previously tested separately: classroom behavior management; instruction, specifically reading; and family-classroom partnerships around behavior and learning. Teachers and students in 12 schools were randomly assigned to receive either the WD intervention or the standard first-grade program of the school system (SC). Three consecutive cohorts of first graders were randomized within schools to WD or SC classrooms and followed through the end of third grade to test the effectiveness of the WD intervention. Teacher practices were assessed over three years to examine the utility of the multilevel structure to support sustainability and scaling-up.
Discussion: The design employed in this trial appears to have considerable utility to provide data on WD effectiveness and to inform the field with regard to structures required to move evidence-based programs into practice.
Trial Registration: Clinical Trials Registration Number NCT00257088