527 research outputs found

    Scalar and vector Slepian functions, spherical signal estimation and spectral analysis

    It is a well-known fact that mathematical functions that are timelimited (or spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the finite precision of measurement and computation unavoidably bandlimits our observation and modeling of scientific data, and we often only have access to, or are only interested in, a study area that is temporally or spatially bounded. In the geosciences we may be interested in spectrally modeling a time series defined only on a certain interval, or we may want to characterize a specific geographical area observed using an effectively bandlimited measurement device. It is clear that analyzing and representing scientific data of this kind will be facilitated if a basis of functions can be found that are "spatiospectrally" concentrated, i.e. "localized" in both domains at the same time. Here, we give a theoretical overview of one particular approach to this "concentration" problem, as originally proposed for time series by Slepian and coworkers in the 1960s. We show how this framework leads to practical algorithms and statistically performant methods for the analysis of signals and their power spectra in one and two dimensions, and, particularly for applications in the geosciences, for scalar and vectorial signals defined on the surface of a unit sphere. Comment: Submitted to the 2nd Edition of the Handbook of Geomathematics, edited by Willi Freeden, Zuhair M. Nashed and Thomas Sonar, and to be published by Springer Verlag. This is a slightly modified but expanded version of the paper arXiv:0909.5368 that appeared in the 1st Edition of the Handbook, when it was called: Slepian functions and their use in signal estimation and spectral analysis.
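The 1-D concentration problem described above has a standard computational realization: the discrete prolate spheroidal (Slepian) sequences, available in SciPy. A minimal sketch, with illustrative parameter values not taken from the paper:

```python
import numpy as np
from scipy.signal.windows import dpss

# Discrete prolate spheroidal (Slepian) sequences: the length-N sequences
# whose energy is maximally concentrated in the frequency band |f| <= W,
# where NW = N*W is the time-bandwidth product. Values are illustrative.
N, NW, K = 512, 4, 7
tapers, ratios = dpss(N, NW, Kmax=K, return_ratios=True)

print(tapers.shape)  # (7, 512): K orthonormal tapers of length N
# Concentration ratios (eigenvalues) decrease from ~1; roughly the first
# 2*NW - 1 tapers remain well concentrated.
print(np.round(ratios, 4))
assert np.allclose(tapers @ tapers.T, np.eye(K), atol=1e-8)
```

Multitaper spectral estimates average periodograms computed with each well-concentrated taper, which is the kind of statistically performant power-spectrum machinery the abstract refers to.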

    Slepian functions and their use in signal estimation and spectral analysis

    It is a well-known fact that mathematical functions that are timelimited (or spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the finite precision of measurement and computation unavoidably bandlimits our observation and modeling of scientific data, and we often only have access to, or are only interested in, a study area that is temporally or spatially bounded. In the geosciences we may be interested in spectrally modeling a time series defined only on a certain interval, or we may want to characterize a specific geographical area observed using an effectively bandlimited measurement device. It is clear that analyzing and representing scientific data of this kind will be facilitated if a basis of functions can be found that are "spatiospectrally" concentrated, i.e. "localized" in both domains at the same time. Here, we give a theoretical overview of one particular approach to this "concentration" problem, as originally proposed for time series by Slepian and coworkers in the 1960s. We show how this framework leads to practical algorithms and statistically performant methods for the analysis of signals and their power spectra in one and two dimensions, and on the surface of a sphere. Comment: Submitted to the Handbook of Geomathematics, edited by Willi Freeden, Zuhair M. Nashed and Thomas Sonar, and to be published by Springer Verlag.

    Calibur: a tool for clustering large numbers of protein decoys

    Background: Ab initio protein structure prediction methods generate numerous structural candidates, referred to as decoys. The decoy with the greatest number of neighbors within a threshold distance is typically identified as the most representative. However, the clustering of decoys needed for this criterion involves computations with runtimes that are at best quadratic in the number of decoys. As a result, there is currently no tool designed to exactly cluster very large numbers of decoys, creating a bottleneck in the analysis. Results: Using three strategies aimed at enhancing performance (proximate decoys organization, preliminary screening via lower and upper bounds, and outlier filtering), we designed and implemented a software tool for clustering decoys called Calibur. We show empirical results indicating the effectiveness of each of the strategies employed, and the strategies are further fine-tuned according to their effectiveness. Calibur scales well as the number of decoys increases: for a sample of approximately 30,000 decoys, it completed the analysis in one third of the time required when the strategies are not used. For practical use, Calibur can automatically discover from the input decoys a suitable threshold distance for clustering. Several methods for this discovery are implemented, of which a very fast one is used by default; using the default method, Calibur reported relatively good decoys in our tests. Conclusions: Calibur's ability to handle very large protein decoy sets makes it a useful tool for clustering decoys in ab initio protein structure prediction. As the number of decoys generated by these methods increases, we believe Calibur will become important for progress in the field.
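The representative-decoy criterion from the Background can be stated in a few lines. The sketch below is the naive quadratic baseline that Calibur's bounding strategies accelerate, not Calibur's actual algorithm or API; Euclidean distance stands in for a structural RMSD, and all names and values are illustrative:

```python
import numpy as np

def most_representative(decoys, threshold):
    # Pairwise distances between all decoys (O(n^2) time and memory;
    # Calibur avoids most of this work via lower/upper distance bounds).
    d = np.linalg.norm(decoys[:, None, :] - decoys[None, :, :], axis=-1)
    # Neighbor count per decoy within the threshold, excluding itself.
    counts = (d <= threshold).sum(axis=1) - 1
    return int(counts.argmax())

# Synthetic "decoys": a dense cluster of 30 points and a sparse one of 10.
rng = np.random.default_rng(0)
pts = np.concatenate([rng.normal(0.0, 0.5, (30, 3)),
                      rng.normal(5.0, 0.5, (10, 3))])
idx = most_representative(pts, threshold=2.0)
print(idx)  # an index in the dense cluster (0-29)
```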

    PET-CT for detecting the undetected in the ICU


    Efficacy of haloperidol to decrease the burden of delirium in adult critically ill patients:the EuRIDICE randomized clinical trial

    Background: The role of haloperidol as treatment for ICU delirium and related symptoms remains controversial despite two recent large controlled trials evaluating its efficacy and safety. We sought to determine whether haloperidol, when compared to placebo in critically ill adults with delirium, reduces days with delirium and coma and improves delirium-related sequelae. Methods: This multi-center, double-blind, placebo-controlled randomized trial at eight mixed medical-surgical Dutch ICUs included critically ill adults with delirium (Intensive Care Delirium Screening Checklist ≥ 4 or a positive Confusion Assessment Method for the ICU) admitted between February 2018 and January 2020. Patients were randomized to intravenous haloperidol 2.5 mg or placebo every 8 h, titrated up to 5 mg every 8 h if delirium persisted, until ICU discharge or up to 14 days. The primary outcome was ICU delirium- and coma-free days (DCFDs) within 14 days after randomization. Predefined secondary outcomes included the protocolized use of sedatives for agitation and related behaviors, patient-initiated extubation and invasive device removal, adverse drug-associated events, mechanical ventilation, ICU length of stay, 28-day mortality, and long-term outcomes up to 1 year after randomization. Results: The trial was terminated prematurely for primary endpoint futility on DSMB advice after enrolment of 132 patients (65 haloperidol; 67 placebo) [mean age 64 (15) years, APACHE IV score 73.1 (33.9), male 68%]. Haloperidol did not increase DCFDs (adjusted RR 0.98 [95% CI 0.73–1.31], p = 0.87). Patients treated with haloperidol (vs. placebo) were less likely to receive benzodiazepines (adjusted OR 0.41 [95% CI 0.18–0.89], p = 0.02). Effect measures of other secondary outcomes related to agitation (use of open-label haloperidol [OR 0.43 (95% CI 0.12–1.56)] and other antipsychotics [OR 0.63 (95% CI 0.29–1.32)], self-extubation or invasive device removal [OR 0.70 (95% CI 0.22–2.18)]) appeared consistently more favorable with haloperidol, but the confidence intervals also included harm. Adverse drug events were not different. Long-term secondary outcomes (e.g., ICU recall and quality of life) warrant further study. Conclusions: Haloperidol does not reduce delirium in critically ill delirious adults. However, it may reduce rescue medication requirements and agitation-related events in delirious ICU patients, warranting further evaluation. Trial registration: ClinicalTrials.gov (#NCT03628391), October 9, 2017.

    Efficacy and safety profiles of manidipine compared with amlodipine: A meta-analysis of head-to-head trials

    The aim of this meta-analysis was to compare the efficacy and safety profile of manidipine 20 mg with that of amlodipine 10 mg. A systematic search of quantitative data produced or published between 1995 and 2009 was performed. Head-to-head randomized controlled trials (RCTs) of 12 months minimum duration reporting comparative efficacy (changes in systolic and diastolic blood pressure) and safety (total adverse events and ankle oedema) were included. Four high-quality RCTs, accounting for 838 patients (436 received manidipine and 402 received amlodipine), were included. The efficacy of manidipine and amlodipine was statistically equivalent: effect size for DBP = −0.08 (p = 0.22) and for SBP = −0.01 (p = 0.83). The global safety of manidipine was significantly better than that of amlodipine: the relative risk (RR) for adverse events was 0.69 (0.56–0.85), and for ankle oedema in particular the RR was 0.35 (0.22–0.54). Publication bias was not significant and the robustness of the analyses was good. These data suggest a better efficacy/safety ratio of manidipine over amlodipine.
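Relative-risk figures of the kind reported above (e.g. RR 0.69, 95% CI 0.56–0.85) come from a standard log-scale computation. A minimal sketch with hypothetical counts, not the trial data:

```python
import math

def relative_risk(a, n1, c, n2):
    """Relative risk of an event (a/n1 vs c/n2) with a 95% CI
    computed on the log scale (standard Katz method)."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR)
    half = 1.96 * se
    return rr, math.exp(math.log(rr) - half), math.exp(math.log(rr) + half)

# Hypothetical counts (30/100 events vs 40/100), not the meta-analysis data:
rr, lo, hi = relative_risk(30, 100, 40, 100)
print(f"RR = {rr:.2f} ({lo:.2f}-{hi:.2f})")  # RR = 0.75 (0.51-1.10)
```

A pooled meta-analytic RR additionally weights the per-trial log-RRs by their inverse variance before back-transforming.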

    Psychological determinants of whole-body endurance performance

    Background: No literature reviews have systematically identified and evaluated research on the psychological determinants of endurance performance, and sport psychology performance-enhancement guidelines for endurance sports are not founded on a systematic appraisal of endurance-specific research. Objective: A systematic literature review was conducted to identify practical psychological interventions that improve endurance performance and to identify additional psychological factors that affect endurance performance. Additional objectives were to evaluate the research practices of included studies, to suggest theoretical and applied implications, and to guide future research. Methods: Electronic databases, forward-citation searches, and manual searches of reference lists were used to locate relevant studies. Peer-reviewed studies were included when they chose an experimental or quasi-experimental research design, a psychological manipulation, endurance performance as the dependent variable, and athletes or physically-active, healthy adults as participants. Results: Consistent support was found for using imagery, self-talk, and goal setting to improve endurance performance, but it is unclear whether learning multiple psychological skills is more beneficial than learning one psychological skill. The results also demonstrated that mental fatigue undermines endurance performance, and verbal encouragement and head-to-head competition can have a beneficial effect. Interventions that influenced perception of effort consistently affected endurance performance. Conclusions: Psychological skills training could benefit an endurance athlete. Researchers are encouraged to compare different practical psychological interventions, to examine the effects of these interventions for athletes in competition, and to include a placebo control condition or an alternative control treatment. 
Researchers are also encouraged to explore additional psychological factors that could have a negative effect on endurance performance. Future research should include psychological mediating variables and moderating variables. Implications for theoretical explanations of endurance performance and evidence-based practice are described.

    Long-term monitoring in primary care for chronic kidney disease and chronic heart failure: a multi-method research programme

    Background: Long-term monitoring is important in chronic condition management. Despite the considerable costs of monitoring, there is no or poor evidence on how, what and when to monitor. The aim of this study was to improve understanding, methods, evidence base and practice of clinical monitoring in primary care, focusing on two areas: chronic kidney disease and chronic heart failure. Objectives: The research questions were as follows: does the choice of test support better care while remaining affordable to the NHS? Can the number of tests used to manage individuals with early-stage kidney disease, and hence the costs, be reduced? Is it possible to monitor heart failure using a simple blood test? Can this be done using a rapid test in a general practitioner consultation? Would changes in the management of these conditions be acceptable to patients and carers? Design: Various study designs were employed, including a cohort study, a feasibility study, a Clinical Practice Research Datalink analysis, seven systematic reviews, two qualitative studies, one cost-effectiveness analysis and one cost recommendation. Setting: This study was set in UK primary care. Data sources: Data were collected from study participants and sourced from UK general practice and hospital electronic health records, and worldwide literature. Participants: The participants were NHS patients (Clinical Practice Research Datalink: 4.5 million patients), chronic kidney disease and chronic heart failure patients managed in primary care (including 750 participants in the cohort study) and primary care health professionals. Interventions: The interventions were monitoring with blood and urine tests (for chronic kidney disease) and monitoring with blood tests and weight measurement (for chronic heart failure). Main outcome measures: The main outcomes were the frequency, accuracy, utility, acceptability, costs and cost-effectiveness of monitoring. 
Results: Chronic kidney disease: serum creatinine testing has increased steadily since 1997, with most results being normal (83% in 2013). Increases in tests of creatinine and proteinuria correspond to their introduction as indicators in the Quality and Outcomes Framework. The Chronic Kidney Disease Epidemiology Collaboration equation had 2.7% greater accuracy (95% confidence interval 1.6% to 3.8%) than the Modification of Diet in Renal Disease equation for estimating glomerular filtration rate. Estimated annual transition rates to the next chronic kidney disease stage are ≈ 2% for people with normal urine albumin, 3–5% for people with microalbuminuria (3–30 mg/mmol) and 3–12% for people with macroalbuminuria (> 30 mg/mmol). Variability in estimated glomerular filtration rate-creatinine leads to misclassification of chronic kidney disease stage in 12–15% of tests in primary care. Glycaemic-control and lipid-modifying drugs are associated with a 6% (95% confidence interval 2% to 10%) and 4% (95% confidence interval 0% to 8%) improvement in renal function, respectively. Neither estimated glomerular filtration rate-creatinine nor estimated glomerular filtration rate-Cystatin C have utility in predicting rate of kidney function change. Patients viewed phrases such as ‘kidney damage’ or ‘kidney failure’ as frightening, and the term ‘chronic’ was misinterpreted as serious. Diagnosis of asymptomatic conditions (chronic kidney disease) was difficult to understand, and primary care professionals often did not use ‘chronic kidney disease’ when managing patients at early stages. General practitioners relied on Clinical Commissioning Group or Quality and Outcomes Framework alerts rather than National Institute for Health and Care Excellence guidance for information. 
Cost-effectiveness modelling did not demonstrate a tangible benefit of monitoring kidney function to guide preventative treatments, except for individuals with an estimated glomerular filtration rate of 60–90 ml/minute/1.73 m², aged < 70 years and without cardiovascular disease, where monitoring every 3–4 years to guide cardiovascular prevention may be cost-effective. Chronic heart failure: natriuretic peptide-guided treatment could reduce all-cause mortality by 13% and heart failure admission by 20%. Implementing natriuretic peptide-guided treatment is likely to require predefined protocols, stringent natriuretic peptide targets, relative targets and being located in a specialist heart failure setting. Remote monitoring can reduce all-cause mortality and heart failure hospitalisation, and could improve quality of life. Diagnostic accuracy of point-of-care N-terminal prohormone of B-type natriuretic peptide (sensitivity, 0.99; specificity, 0.60) was better than point-of-care B-type natriuretic peptide (sensitivity, 0.95; specificity, 0.57). Within-person variation estimates were a coefficient of variation of 46% for B-type natriuretic peptide and 1.2% for weight. Point-of-care N-terminal prohormone of B-type natriuretic peptide within-person variability over 12 months was 881 pg/ml (95% confidence interval 380 to 1382 pg/ml), whereas between-person variability was 1972 pg/ml (95% confidence interval 1525 to 2791 pg/ml). For individuals, monitoring provided reassurance; future changes, such as increased testing, would be acceptable. Point-of-care testing in general practice surgeries was perceived positively, reducing waiting time and anxiety. Community heart failure nurses had greater knowledge of National Institute for Health and Care Excellence guidance than general practitioners and practice nurses. 
Health-care professionals believed that the cost of natriuretic peptide tests in routine monitoring would outweigh potential benefits. The review of cost-effectiveness studies suggests that natriuretic peptide-guided treatment is cost-effective in specialist settings, but with no evidence for its value in primary care settings. Limitations: No randomised controlled trial evidence was generated. The pathways to the benefit of monitoring chronic kidney disease were unclear. Conclusions: It is difficult to ascribe quantifiable benefits to monitoring chronic kidney disease, because monitoring is unlikely to change treatment, especially in chronic kidney disease stages G3 and G4. New approaches to monitoring chronic heart failure, such as point-of-care natriuretic peptide tests in general practice, show promise if high within-test variability can be overcome.
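The within-person coefficient of variation reported above (46% for B-type natriuretic peptide versus 1.2% for weight) is simply the standard deviation of a patient's repeated measurements relative to their mean. A minimal sketch with hypothetical serial measurements, not study data:

```python
import numpy as np

def cv_percent(x):
    """Coefficient of variation (%): sample SD of repeated
    measurements relative to their mean."""
    x = np.asarray(x, dtype=float)
    return x.std(ddof=1) / x.mean() * 100.0

# Hypothetical serial measurements for one patient (illustrative only):
bnp = [220.0, 410.0, 150.0, 330.0]   # B-type natriuretic peptide, pg/ml
weight = [82.1, 81.6, 82.4, 81.9]    # body weight, kg
print(f"BNP CV = {cv_percent(bnp):.0f}%, weight CV = {cv_percent(weight):.1f}%")
```

The contrast in CV is why a single natriuretic peptide reading is hard to act on in monitoring, while a weight change well outside its tiny within-person variability is informative.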

    Variational Methods for Biomolecular Modeling

    Structure, function and dynamics of many biomolecular systems can be characterized by the energetic variational principle and the corresponding systems of partial differential equations (PDEs). This principle allows us to focus on the identification of essential energetic components, the optimal parametrization of energies, and the efficient computational implementation of energy variation or minimization. Given the fact that complex biomolecular systems are structurally non-uniform and their interactions occur through contact interfaces, their free energies are associated with various interfaces as well, such as the solute-solvent interface, molecular binding interfaces, lipid domain interfaces, and membrane surfaces. This fact motivates the inclusion of interface geometry, particularly its curvatures, in the parametrization of free energies. Applications of such interface-geometry-based energetic variational principles are illustrated through three concrete topics: the multiscale modeling of biomolecular electrostatics and solvation that includes the curvature energy of the molecular surface, the formation of microdomains on lipid membranes due to the geometric and molecular mechanics at the lipid interface, and the mean curvature driven protein localization on membrane surfaces. By further implicitly representing the interface using a phase field function over the entire domain, one can simulate the dynamics of the interface and the corresponding energy variation by evolving the phase field function, achieving significant reduction of the number of degrees of freedom and computational complexity. Strategies for improving the efficiency of computational implementations and for extending applications to coarse-graining or multiscale molecular simulations are outlined. Comment: 36 pages.
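The phase-field device mentioned at the end can be made concrete with the standard Ginzburg–Landau (Modica–Mortola) surface-energy approximation; this is a generic sketch, not necessarily the exact functional the authors use. The sharp interface is replaced by a smooth field φ taking values near ±1 in the two bulk phases, with diffuse-interface width ε:

```latex
E_\varepsilon[\phi] \;=\; \int_\Omega \left( \frac{\varepsilon}{2}\,\lvert\nabla\phi\rvert^2
  \;+\; \frac{1}{4\varepsilon}\,\bigl(\phi^2 - 1\bigr)^2 \right)\,\mathrm{d}x .
```

As ε → 0 this energy concentrates on the zero level set of φ and recovers its surface area up to a multiplicative constant, so gradient descent on the total free energy evolves the interface through the single field φ, which is the source of the reduction in degrees of freedom noted above.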