Double Averaging Analysis Applied to a Large Eddy Simulation of Coupled Turbulent Overlying and Porewater Flow
Freestream turbulence in rivers is a key contributor to the flux of dissolved nutrients, carbon, and other ecologically important solutes into porewater. To advance understanding of turbulent hyporheic exchange and porewater transport, we investigate flow over and through a rough bed of spheres using large eddy simulation (LES). We apply double averaging (combined space and time averaging) to the LES results to determine the mean velocity distribution, momentum balance, and drag forces. Our simulations show large-scale freestream structures interacting strongly with vortices generated at the surfaces of individual spheres to control turbulent momentum fluxes into the bed. The transition between turbulent flow and Darcy flow occurs over the first row of spheres, where turbulence decays rapidly and turbulent kinetic energy, Reynolds stress, and drag forces peak. Below this region, turbulence is present only in the high-velocity flow in open pore throats. Experimental observations suggest that the minimum mean porewater velocity occurs in the first open pore space below the transition region, but our results show that the minimum occurs between the first and second pore spaces. The simulated mean porewater velocities are approximately half those captured in measurements because the model resolves the entire flow continuum, while measurements can access only the high-velocity fluid in open pores. The high-resolution dual time-space averaging of the LES resolves both turbulent and mean flow features that are important to interfacial solute and particle fluxes, providing a means to include turbulent hyporheic exchange in upscaled river models, which has not been achieved to date.
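The double-averaging operation described above (time average first, then a horizontal spatial average, decomposing the field into a double-averaged value, a form-induced spatial disturbance, and a turbulent fluctuation) can be sketched on a small synthetic field. This is an illustrative sketch only, not the study's LES data or grid:

```python
import numpy as np

# Hypothetical synthetic velocity field u(t, z, x) on a regular grid.
rng = np.random.default_rng(0)
nt, nz, nx = 200, 4, 32
z = np.linspace(0.1, 1.0, nz)[None, :, None]                     # heights
base = 2.0 * z                                                   # mean shear profile
form = 0.3 * np.sin(np.linspace(0, 2 * np.pi, nx))[None, None, :]  # bed-induced spatial variation
turb = 0.1 * rng.standard_normal((nt, nz, nx))                   # turbulent fluctuations
u = base + form + turb

# Step 1: time average at each point -> ubar(z, x)
ubar = u.mean(axis=0)
# Step 2: horizontal spatial average of the time average -> <ubar>(z)
u_da = ubar.mean(axis=1)

# Decomposition: u = <ubar> + utilde (form-induced) + u' (turbulent)
utilde = ubar - u_da[:, None]       # spatial disturbance of the time mean
uprime = u - ubar[None, :, :]       # turbulent fluctuation

# The double-averaged decomposition closes exactly by construction.
recon = u_da[None, :, None] + utilde[None, :, :] + uprime
```

The same two-step averaging applied to products of fluctuations yields the dispersive and Reynolds stress terms of the double-averaged momentum balance.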
Symptom- and Laboratory-Based Ebola Risk Scores to Differentiate Likely Ebola Infections.
Rapidly identifying likely Ebola patients is difficult because of a broad case definition, overlap of symptoms with common illnesses, and lack of rapid diagnostics. However, rapid identification is critical for care and containment of contagion. We analyzed retrospective data from 252 Ebola-positive and 172 Ebola-negative patients at a Sierra Leone Ebola treatment center to develop easy-to-use risk scores, based on symptoms and laboratory tests (if available), to stratify triaged patients by their likelihood of having Ebola infection. Headache, diarrhea, difficulty breathing, nausea/vomiting, loss of appetite, and conjunctivitis comprised the symptom-based score. The laboratory-based score also included creatinine, creatine kinase, alanine aminotransferase, and total bilirubin. This risk score correctly identified 92% of Ebola-positive patients as high risk for infection; both scores correctly classified >70% of Ebola-negative patients as low or medium risk. Clinicians can use these risk scores to gauge the likelihood of triaged patients having Ebola while awaiting laboratory confirmation.
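A symptom-count score of this kind is simple to operationalize. The sketch below uses the six symptoms named in the abstract but hypothetical (unweighted) scoring and hypothetical risk-stratum cutoffs, not the study's actual weights or thresholds:

```python
# Symptoms from the abstract; scoring weights and cutoffs are hypothetical.
SYMPTOMS = ["headache", "diarrhea", "difficulty_breathing",
            "nausea_vomiting", "loss_of_appetite", "conjunctivitis"]

def symptom_score(patient: dict) -> int:
    """One point per symptom present (illustrative unweighted sketch)."""
    return sum(1 for s in SYMPTOMS if patient.get(s, False))

def risk_stratum(score: int) -> str:
    """Map a score to a triage stratum (hypothetical cutoffs)."""
    if score <= 1:
        return "low"
    if score <= 3:
        return "medium"
    return "high"

patient = {"headache": True, "diarrhea": True,
           "nausea_vomiting": True, "loss_of_appetite": True}
print(risk_stratum(symptom_score(patient)))  # prints "high"
```

In practice such scores are derived by regression on the retrospective cohort, with integer weights chosen for bedside use.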
Six-Year Incidence of Blindness and Visual Impairment in Kenya: The Nakuru Eye Disease Cohort Study.
PURPOSE: To describe the cumulative 6-year incidence of visual impairment (VI) and blindness in an adult Kenyan population. The Nakuru Posterior Segment Eye Disease Study is a population-based sample of 4414 participants aged ≥50 years, enrolled in 2007-2008. Of these, 2170 (50%) were reexamined in 2013-2014. METHODS: The World Health Organization (WHO) and US definitions were used to calculate presenting visual acuity classifications based on logMAR visual acuity tests at baseline and follow-up. Detailed ophthalmic and anthropometric examinations, as well as a questionnaire covering past medical and ophthalmic history, were used to assess risk factors for study participation and vision loss. Cumulative incidence of VI and blindness, and factors associated with these outcomes, were estimated. Inverse probability weighting was used to adjust for nonparticipation. RESULTS: Visual acuity measurements were available for 2164 (99.7%) participants. Using WHO definitions, the 6-year cumulative incidence of VI was 11.9% (95% CI [confidence interval]: 10.3-13.8%) and blindness was 1.51% (95% CI: 1.0-2.2%); using the US classification, the cumulative incidence of blindness was 2.70% (95% CI: 1.8-3.2%). Incidence of VI increased strongly with older age, and independently with diabetes. There are an estimated 21 new cases of VI per year in people aged ≥50 years per 1000 people, of whom 3 are blind. Therefore, in Kenya, we estimate that there are 92,000 new cases of VI in people aged ≥50 years per year, of whom 11,600 are blind, out of a total population of approximately 4.3 million people aged 50 and above. CONCLUSIONS: The incidence of VI and blindness in this older Kenyan population was considerably higher than in comparable studies worldwide. A continued effort to strengthen the eye health system is necessary to support the growing unmet need in an aging and growing population.
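The inverse probability weighting used to adjust for nonparticipation can be sketched as follows: estimate each participant's probability of being reexamined within baseline strata, then weight participants by the inverse of that probability so the follow-up sample represents the full baseline cohort. All numbers below are made up for illustration, not the study's data:

```python
import numpy as np

# Synthetic baseline cohort (made-up participation and incidence rates).
rng = np.random.default_rng(1)
n = 4000
age_stratum = rng.integers(0, 3, n)            # 0: 50-59, 1: 60-69, 2: 70+
participated = rng.random(n) < np.array([0.60, 0.50, 0.35])[age_stratum]
incident_vi = rng.random(n) < np.array([0.05, 0.12, 0.25])[age_stratum]

# Estimate participation probability within each stratum, then weight
# each participant by 1 / p_hat for their stratum.
p_hat = np.array([participated[age_stratum == s].mean() for s in range(3)])
weights = 1.0 / p_hat[age_stratum]

mask = participated
crude = incident_vi[mask].mean()                       # ignores dropout pattern
weighted = np.average(incident_vi[mask], weights=weights[mask])  # IPW estimate
```

Because older participants drop out more and also have higher incidence, the unweighted estimate is biased; the weighted estimate up-weights the under-represented strata.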
Real-world effects of medications for stroke prevention in atrial fibrillation: protocol for a UK population-based non-interventional cohort study with validation against randomised trial results.
INTRODUCTION: Patients with atrial fibrillation experience an irregular heart rate and have an increased risk of stroke; prophylactic treatment with anticoagulation medication reduces this risk. Direct-acting oral anticoagulants (DOACs) have been approved, providing an alternative to vitamin K antagonists such as warfarin. There is interest from regulatory bodies in the effectiveness of medications in routine clinical practice; however, uncertainty remains regarding the suitability of non-interventional data for answering questions on drug effectiveness and on the most suitable methods to be used. In this study, we will use data from Apixaban for Reduction in Stroke and Other Thromboembolic Events in Atrial Fibrillation (ARISTOTLE)-the pivotal trial for the DOAC apixaban-to validate non-interventional methods for assessing treatment effectiveness of anticoagulants. These methods could then be applied to analyse treatment effectiveness in people excluded from or under-represented in ARISTOTLE. METHODS AND ANALYSIS: Patient characteristics from ARISTOTLE will be used to select a cohort of patients with similar baseline characteristics from two UK electronic health record (EHR) databases, Clinical Practice Research Datalink Gold and Aurum (between 1 January 2013 and 31 July 2019). Methods such as propensity score matching and coarsened exact matching will be explored in matching between EHR treatment groups to determine the optimal method of obtaining a balanced cohort. Absolute and relative risk of outcomes in the EHR trial-analogous cohort will be calculated and compared with the ARISTOTLE results; if results are deemed compatible, the methods used for matching EHR treatment groups can then be used to examine drug effectiveness over a longer duration of exposure and in special patient groups of interest not studied in the trial.
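One of the matching methods named above, propensity score matching, can be sketched as greedy 1:1 nearest-neighbour matching with a caliper. The sketch assumes propensity scores have already been estimated (e.g. by logistic regression on baseline covariates); the data and caliper are hypothetical:

```python
import numpy as np

# Hypothetical estimated propensity scores for two treatment groups.
rng = np.random.default_rng(2)
ps_treated = rng.uniform(0.3, 0.8, 50)     # e.g. DOAC group
ps_control = rng.uniform(0.2, 0.7, 200)    # e.g. warfarin group
caliper = 0.05                             # max allowed score difference

pairs = []
available = np.ones(len(ps_control), dtype=bool)
for i, p in enumerate(ps_treated):
    dist = np.abs(ps_control - p)
    dist[~available] = np.inf              # matching without replacement
    j = int(np.argmin(dist))
    if dist[j] <= caliper:                 # discard if no match within caliper
        pairs.append((i, j))
        available[j] = False

matched_controls = [j for _, j in pairs]
```

Balance on baseline covariates would then be checked in the matched cohort (e.g. standardized mean differences) before comparing outcomes.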
ETHICS AND DISSEMINATION: The study has been approved by the Independent Scientific Advisory Committee of the UK Medicines and Healthcare Products Regulatory Agency. Results will be disseminated in scientific publications and at relevant conferences.
Use of real-world evidence in postmarketing medicines regulation in the European Union: a systematic assessment of European Medicines Agency referrals 2013-2017.
OBJECTIVES: To assess the use, and evaluate the usefulness, of non-interventional studies and routinely collected healthcare data in postmarketing assessments conducted by the European Medicines Agency (EMA). DESIGN: We reviewed and systematically assessed all referrals to the EMA made due to safety or efficacy concerns that were evaluated between 1 January 2013 and 30 June 2017. We extracted information from the assessment report and the referral notification. Two reviewers independently assessed the contribution of non-interventional evidence to decision-making. RESULTS: The preliminary evidence leading to the assessment in 52 eligible referrals was mostly from spontaneous reports (cited in 26 of 52 referrals) and randomised trials (22/52). In contrast, many evidence types were used for the full assessment. Non-interventional studies were frequently used in the full assessment for the evaluation of product safety (31/52) and product efficacy (18/52). In particular, non-interventional studies were relied on for the evaluation of safety and efficacy in subgroups, the evaluation of safety relating to a rare adverse event, understanding product usage and misuse, and evaluation of the effectiveness of risk minimisation measures. The most common recommendations were changes to product information (43/52) and marketing authorisation withdrawal or suspension (12/52). In the majority of referrals, non-interventional evidence was judged to contribute to the decision made (30/52), and in three referrals it was the primary source of evidence. CONCLUSIONS: European regulatory decision-making relies on multiple evidence types, particularly randomised trials, spontaneous reports and non-interventional studies. Non-interventional studies had an important role particularly for the characterisation and quantification of adverse events, the evaluation of product usage and for evaluating the effectiveness of regulatory action to minimise risk.
Arrangement for interfacing a telephone device with a personal computer
An arrangement provides an interface between a telephone device and a personal computer in such a manner that enhanced capability for both the telephone device and the computer in processing information in an analog telephone environment is provided. The telephone device attaches to an analog telephone line and advantageously operates either as a stand-alone device when the computer is powered off or in tandem with the computer when the computer is powered on. A user is able to access any of the available telephony features from the telephone device at all times and from the computer when it is powered on. Such available telephony features include, by way of example, Caller ID for decoding available information presented on the analog telephone line and an integrated telephone answering system, which provides for reception, transmission, and storage of voice, facsimile, and electronic mail messages.
Moist Static Energy Budget Analysis of Tropical Cyclone Intensification in High-Resolution Climate Models
Tropical cyclone intensification processes are explored in six high-resolution climate models. The analysis framework employs process-oriented diagnostics that focus on how convection, moisture, clouds, and related processes are coupled. These diagnostics include budgets of column moist static energy and the spatial variance of column moist static energy, where the column integral is performed between fixed pressure levels. The latter allows for the quantification of the different feedback processes responsible for the amplification of moist static energy anomalies associated with the organization of convection and cyclone spinup, including surface flux feedbacks and cloud-radiative feedbacks. Tropical cyclones (TCs) are tracked in the climate model simulations and the analysis is applied along the individual tracks and composited over many TCs. Two methods of compositing are employed: a composite over all TC snapshots in a given intensity range, and a composite over all TC snapshots at the same stage in the TC life cycle (same time relative to the time of lifetime maximum intensity for each storm). The radiative feedback contributes to TC development in all models, especially in storms of weaker intensity or earlier stages of development. Notably, the surface flux feedback is stronger in models that simulate more intense TCs. This indicates that the representation of the interaction between spatially varying surface fluxes and the developing TC is responsible for at least part of the intermodel spread in TC simulation.
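The spatial-variance budget diagnostic described above measures each feedback as the spatial covariance between column moist static energy anomalies h' and anomalies of a source term (e.g. surface enthalpy flux or column radiative heating): a positive covariance amplifies the variance and hence convective organization. A minimal sketch on synthetic fields (not model output; the correlation between flux and h is imposed for illustration):

```python
import numpy as np

# Synthetic 2-D fields: column MSE and a surface flux correlated with it.
rng = np.random.default_rng(3)
ny, nx = 64, 64
h = rng.standard_normal((ny, nx))                         # column MSE (arbitrary units)
sfc_flux = 0.5 * h + 0.2 * rng.standard_normal((ny, nx))  # imposed positive correlation

hp = h - h.mean()                  # h': anomaly from the domain mean
fp = sfc_flux - sfc_flux.mean()    # F': flux anomaly

variance = (hp ** 2).mean()        # spatial variance of column MSE
sfc_feedback = (hp * fp).mean()    # <h'F'>: positive -> variance grows

# Normalizing by the variance gives a feedback per unit variance,
# convenient for comparing storms of different amplitude.
norm_feedback = sfc_feedback / variance
```

In the budget, the time tendency of the variance is the sum of such covariance terms (surface fluxes, radiation, advection), so their relative magnitudes rank the feedbacks.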
Individual snag detection using neighborhood attribute filtered airborne lidar data
The ability to estimate and monitor standing dead trees (snags) has been difficult due to their irregular and sparse distribution, often requiring intensive sampling methods to obtain statistically significant estimates. This study presents a new method for estimating and monitoring snags using neighborhood attribute filtered airborne discrete-return lidar data. The method first develops and then applies an automated filtering algorithm that utilizes three-dimensional neighborhood lidar point-based intensity and density statistics to remove lidar points associated with live trees and retain lidar points associated with snags. A traditional airborne lidar individual-tree detection procedure is then applied to the snag-filtered lidar point cloud, resulting in a stem map of identified snags with height estimates. The filtering algorithm was developed using training datasets comprising four different forest types in a wide range of stand conditions, and then applied to independent data to determine successful snag detection rates. Detection rates ranged from 43 to 100%, increasing as the size of snags increased. The overall detection rate for snags with DBH ≥ 25 cm was 56% (± 2.9%) with low commission error rates. The method provides the ability to estimate snag density and stem map a large proportion of snags across the landscape. The resulting information can be used to analyze the spatial distribution of snags, provide a better understanding of wildlife snag use dynamics, assess achievement of stocking standard requirements, and bring more clarity to snag stocking standards.
Keywords: Snag detection, Snag density, Airborne lidar, Forestry, Snags, Neighborhood attribute lidar filtering, Lidar filtering
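The core filtering idea, classifying each lidar return by intensity and density statistics of its 3-D neighborhood, can be sketched with a brute-force neighbor search. The thresholds, radius, and data below are all hypothetical; the study's actual attributes and cutoffs were derived from its training datasets:

```python
import numpy as np

# Hypothetical point cloud: coordinates (m) and return intensities.
rng = np.random.default_rng(4)
n = 500
xyz = rng.uniform(0, 50, (n, 3))
intensity = rng.uniform(0, 255, n)

# Hypothetical neighborhood parameters: defoliated snags tend to have
# sparser crowns (low point density) and different return intensity.
radius = 2.0                 # neighborhood radius (m)
max_mean_intensity = 120.0   # cutoff on neighborhood mean intensity
max_density = 10             # cutoff on neighborhood point count

keep = np.zeros(n, dtype=bool)
for i in range(n):
    d = np.linalg.norm(xyz - xyz[i], axis=1)
    nbr = d <= radius        # 3-D neighborhood (includes the point itself)
    keep[i] = (intensity[nbr].mean() <= max_mean_intensity
               and nbr.sum() <= max_density)

snag_points = xyz[keep]      # filtered cloud passed to tree detection
```

A production version would use a spatial index (k-d tree) instead of the O(n²) search, and the filtered cloud would then feed a standard individual-tree detection routine.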
Azimuthally Averaged Wind and Thermodynamic Structures of Tropical Cyclones in Global Climate Models and Their Sensitivity to Horizontal Resolution
Characteristics of tropical cyclones (TCs) in global climate models (GCMs) are known to be influenced by details of the model configurations, including horizontal resolution and parameterization schemes. Understanding model-to-model differences in TC characteristics is a prerequisite for reducing uncertainty in future TC activity projections by GCMs. This study performs a process-level examination of TC structures in eight GCM simulations that span a range of horizontal resolutions from 1° to 0.25°. A recently developed set of process-oriented diagnostics is used to examine the azimuthally averaged wind and thermodynamic structures of the GCM-simulated TCs. Results indicate that the inner-core wind structures of simulated TCs are more strongly constrained by the horizontal resolutions of the models than are the thermodynamic structures of those TCs. As expected, the structures of TC circulations become more realistic with smaller horizontal grid spacing, such that the radii of maximum wind (RMW) become smaller, and the maximum vertical velocities occur off the center. However, the RMWs are still too large, especially at higher intensities, and there are rising motions occurring at the storm centers, inconsistent with observations. The distributions of precipitation, moisture, and radiative and surface turbulent heat fluxes around TCs are diverse, even across models with similar horizontal resolutions. At the same horizontal resolution, models that produce greater rainfall in the inner-core regions tend to simulate stronger TCs. When TCs are weak, the radial gradient of net column radiative flux convergence is comparable to that of surface turbulent heat fluxes, emphasizing the importance of cloud-radiative feedbacks during the early developmental phases of TCs.
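Azimuthal averaging around a tracked storm center, the operation underlying these diagnostics, amounts to binning grid points by radius and averaging within each annulus; the radius of maximum wind then falls out as the annulus with the largest mean wind. A minimal sketch on a synthetic Rankine-like vortex (an idealized profile, not model output):

```python
import numpy as np

# Idealized vortex on a Cartesian grid, storm center at (0, 0).
ny = nx = 101
y, x = np.mgrid[0:ny, 0:nx] - 50
r = np.hypot(x, y)
rmw_true = 10.0
# Rankine-like tangential wind: linear inside the RMW, ~1/r outside.
v = np.where(r <= rmw_true, r / rmw_true,
             rmw_true / np.maximum(r, 1e-9))

# Azimuthal average: bin by radius and average within each annulus.
edges = np.arange(0, 51, 2)                       # 2-gridpoint-wide annuli
idx = np.digitize(r.ravel(), edges) - 1
vbar = np.array([v.ravel()[idx == k].mean()
                 for k in range(len(edges) - 1)])

# Estimated RMW: midpoint of the annulus with the largest mean wind.
kmax = int(np.argmax(vbar))
rmw_est = 0.5 * (edges[kmax] + edges[kmax + 1])
```

For model TCs the same binning is applied at each vertical level and composited over snapshots, giving the radius-height wind structure examined in the study.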