Calibrating ensemble reliability whilst preserving spatial structure
Ensemble forecasts aim to improve decision-making by predicting a set of possible outcomes. Ideally, these would provide probabilities which are both sharp and reliable. In practice, the models, data assimilation and ensemble perturbation systems are all imperfect, leading to deficiencies in the predicted probabilities. This paper presents an ensemble post-processing scheme which directly targets local reliability, calibrating both climatology and ensemble dispersion in one coherent operation. It makes minimal assumptions about the underlying statistical distributions, aiming to extract as much information as possible from the original dynamic forecasts and support statistically awkward variables such as precipitation. The output is a set of ensemble members preserving the spatial, temporal and inter-variable structure from the raw forecasts, which should be beneficial to downstream applications such as hydrological models. The calibration is tested on three leading 15-d ensemble systems, and their aggregation into a simple multimodel ensemble. Results are presented for 12 h, 1° scale over Europe for a range of surface variables, including precipitation. The scheme is very effective at removing unreliability from the raw forecasts, whilst generally preserving or improving statistical resolution. In most cases, these benefits extend to the rarest events at each location within the 2-yr verification period. The reliability and resolution are generally equivalent or superior to those achieved using a Local Quantile-Quantile Transform, an established calibration method which generalises bias correction. The value of preserving spatial structure is demonstrated by the fact that 3×3 averages derived from grid-scale precipitation calibration perform almost as well as direct calibration at 3×3 scale, and much better than a similar test neglecting the spatial relationships. 
Some remaining issues are discussed regarding the finite size of the output ensemble, variables such as sea-level pressure which are already very reliable in the raw forecasts, and the best way to handle derived variables such as dewpoint depression.
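The Local Quantile-Quantile Transform mentioned above as the benchmark generalises bias correction by mapping each forecast value through climatological quantiles. A minimal sketch of such a quantile mapping, assuming empirical climatologies held as plain arrays (all names are hypothetical, not from the paper's implementation):

```python
import numpy as np

def quantile_map(raw, model_climatology, obs_climatology):
    """Map each raw forecast value to the observed-climatology value
    at the quantile it occupies in the model climatology."""
    model_sorted = np.sort(model_climatology)
    obs_sorted = np.sort(obs_climatology)
    # Empirical quantile of each raw value within the model climatology
    q = np.searchsorted(model_sorted, raw, side="right") / len(model_sorted)
    q = np.clip(q, 0.0, 1.0)
    # Read the same quantile off the observed climatology
    return np.quantile(obs_sorted, q)
```

Applied member by member, such a transform calibrates each marginal distribution while leaving the ensemble's rank structure (and hence much of its spatial structure) intact.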
Trends in HIV testing and recording of HIV status in the UK primary care setting: a retrospective cohort study 1995-2005
Objectives: To provide nationally representative data on trends in HIV testing in primary care and to estimate the proportion of diagnosed HIV positive individuals known to general practitioners (GPs). Methods: We undertook a retrospective cohort study between 1995 and 2005 of all general practices contributing data to the UK General Practice Research Database (GPRD), and data on persons accessing HIV care (Survey of Prevalent HIV Infections Diagnosed). We identified all practice-registered patients for whom an HIV test or HIV positive status was recorded in their general practice records. HIV testing in primary care and the prevalence of recorded HIV positive status in primary care were estimated. Results: Despite 11-fold increases in male testing and 19-fold increases in non-pregnant female testing between 1995 and 2005, HIV testing rates remained low in 2005 at 71.3 and 61.2 tests per 100 000 person-years for males and females, respectively, peaking at 162.5 and 173.8 per 100 000 person-years at 25–34 years of age. Inclusion of antenatal tests yielded a 129-fold increase in women over the 10-year period. In 2005, 50.7% of HIV positive individuals had their diagnosis recorded, with a lower proportion in London (41.8%) than outside the capital (60.1%). Conclusion: HIV testing rates in primary care remain low. The normalisation of HIV testing and recording in antenatal care has not been accompanied by a step change in wider HIV testing practice. Recording of HIV positive status by GPs remains low, and GPs may be unaware of HIV-related morbidity or potential drug interactions.
Primary care consultations and costs among HIV-positive individuals in UK primary care 1995-2005: a cohort study
Objectives: To investigate the role of primary care in the management of HIV and estimate primary care-associated costs at a time of rising prevalence.
Methods: Retrospective cohort study between 1995 and 2005, using data from general practices contributing data to the UK General Practice Research Database. Patterns of consultation and morbidity and associated consultation costs were analysed among all practice-registered patients for whom HIV-positive status was recorded in the general practice record.
Results: 348 practices yielded 5504 person-years (py) of follow-up for known HIV-positive patients, who consult in general practice frequently (4.2 consultations/py by men, 5.2 consultations/py by women, in 2005) for a range of conditions. Consultation rates declined in the late 1990s from 5.0 and 7.3 consultations/py in 1995 in men and women, respectively, converging to rates similar to the wider population. Costs of consultation (general practitioner and nurse, combined) reflect these changes, at £100.27 for male patients and £117.08 for female patients in 2005. Approximately one in six medications prescribed in primary care for HIV-positive individuals has the potential for major interaction with antiretroviral medications.
Conclusion: HIV-positive individuals known in general practice now consult on a similar scale to the wider population. Further research should be undertaken to explore how primary care can best contribute to improving the health outcomes of this group with chronic illness. Their substantial use of primary care suggests there may be potential to develop effective integrated care pathways.
Three-year tracking of fatty acid composition of plasma phospholipids in healthy children
Objectives: The fatty acid composition of plasma phospholipids reflects dietary fatty acid intake as well as endogenous turnover. We aimed to investigate the potential tracking of plasma phospholipid fatty acid composition in children who participated in a prospective cohort study. Methods: 26 healthy children participated in a longitudinal study on health risks and had been enrolled after birth. All children were born at term with birth weights appropriate for gestational age. Follow-up took place at ages 24, 36 and 60 months. At each time point a 24-hour dietary recall was obtained, anthropometric parameters were measured and a blood sample for phospholipid fatty acid analysis was taken. Results: Dietary intakes of saturated (SFA), monounsaturated (MUFA) and polyunsaturated (PUFA) fatty acids at the three time points were not correlated. We found lower values for plasma MUFA and the MUFA/SFA ratio at 60 months compared to 24 months. In contrast, total PUFA, total n-6 and n-6 long-chain polyunsaturated fatty acids (LC-PUFA) were higher at 60 months. Significant averaged correlation coefficients (average of Pearson's r for 24 versus 36 months and 36 versus 60 months) were found for n-6 LC-PUFA (r = 0.67), the n-6/n-3 LC-PUFA ratio (r = 0.59) and the arachidonic acid/linoleic acid ratio (r = 0.64). Partial tracking was found for the docosahexaenoic acid/alpha-linolenic acid ratio (r = 0.33). Body mass index and sum of skinfolds Z-scores were similar in the three evaluations. Conclusions: Significant tracking of n-6 LC-PUFA, the n-6 LC-PUFA/n-3 LC-PUFA ratio, the arachidonic acid/linoleic acid ratio and the docosahexaenoic acid/alpha-linolenic acid ratio may reflect an influence of individual endogenous fatty acid metabolism on the plasma concentrations of some, but not all, fatty acids. Copyright (c) 2007 S. Karger AG, Basel.
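The averaged tracking coefficient quoted above (the mean of Pearson's r for 24-versus-36 months and 36-versus-60 months) can be computed in a few lines; the variable names below are illustrative, not from the study:

```python
import numpy as np

def averaged_tracking_r(x24, x36, x60):
    """Average of Pearson's r between consecutive follow-ups,
    as in the averaged correlation coefficients reported above."""
    r1 = np.corrcoef(x24, x36)[0, 1]  # 24 vs 36 months
    r2 = np.corrcoef(x36, x60)[0, 1]  # 36 vs 60 months
    return (r1 + r2) / 2
```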
Multi-parameter models of innovation diffusion on complex networks
A model, applicable to a range of innovation diffusion applications with a strong peer-to-peer component, is developed and studied, along with methods for its investigation and analysis. A particular application is to individual households deciding whether to install an energy efficiency measure in their home. The model represents these individuals as nodes on a network, each with a variable representing their current state of adoption of the innovation. The motivation to adopt is composed of three terms, representing personal preference, an average of each individual's network neighbours' states and a system average, which is a measure of the current social trend. The adoption state of a node changes if a weighted linear combination of these factors exceeds some threshold. Numerical simulations have been carried out, computing the average uptake after a sufficient number of time-steps over many realisations at a range of model parameter values, on various network topologies, including random (Erdős–Rényi), small-world (Watts–Strogatz) and (Newman's) highly clustered, community-based networks. An analytical and probabilistic approach has been developed to account for the observed behaviour, which explains the results of the numerical calculations.
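The adoption rule described above (a weighted linear combination of personal preference, neighbour average and system average compared against a threshold) can be sketched as follows; the weights, seeding fraction and synchronous update scheme are illustrative assumptions, not the paper's exact specification:

```python
import random

def simulate(adj, preference, w_pref, w_nbr, w_sys, threshold, steps):
    """Threshold model of innovation adoption on a network.
    adj: dict node -> list of neighbours; state is 0 (not adopted) or 1.
    Returns the final fraction of adopters."""
    state = {v: 0 for v in adj}
    # Seed a small fraction of initial adopters (illustrative choice)
    for v in random.sample(list(adj), max(1, len(adj) // 20)):
        state[v] = 1
    for _ in range(steps):
        sys_avg = sum(state.values()) / len(state)  # current social trend
        new_state = dict(state)
        for v, nbrs in adj.items():
            nbr_avg = sum(state[u] for u in nbrs) / len(nbrs) if nbrs else 0.0
            motivation = (w_pref * preference[v]
                          + w_nbr * nbr_avg
                          + w_sys * sys_avg)
            if motivation > threshold:
                new_state[v] = 1  # adoption is irreversible in this sketch
        state = new_state  # synchronous update
    return sum(state.values()) / len(state)
```

Averaging the returned uptake over many realisations and network topologies reproduces the kind of parameter sweep described in the abstract.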
Phi-values in protein folding kinetics have energetic and structural components
Phi-values are experimental measures of how the kinetics of protein folding is changed by single-site mutations. Phi-values measure energetic quantities, but are often interpreted in terms of the structures of the transition state ensemble. Here we describe a simple analytical model of the folding kinetics in terms of the formation of protein substructures. The model shows that Phi-values have both structural and energetic components. In addition, it provides a natural and general interpretation of "nonclassical" Phi-values (i.e., less than zero, or greater than one). The model reproduces the Phi-values for 20 single-residue mutations in the alpha-helix of the protein CI2, including several nonclassical Phi-values, in good agreement with experiments.
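The standard experimental definition behind the Phi-values discussed above is the ratio of the mutational change in the folding activation free energy to the change in folding stability; a small sketch under that standard definition (function and variable names are hypothetical, not from this paper's model):

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)

def phi_value(kf_wt, kf_mut, ddG_eq, T=298.0):
    """Phi = ddG_TS / ddG_eq.
    ddG_TS is inferred from the change in folding rate on mutation;
    ddG_eq is the measured change in folding stability (kJ/mol)."""
    ddG_ts = -R * T * math.log(kf_mut / kf_wt)
    return ddG_ts / ddG_eq
```

Classical values lie between 0 (mutation site unstructured in the transition state) and 1 (fully native-like); the abstract's point is that values outside this range need not be anomalous.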
Dynamic clamp with StdpC software
Dynamic clamp is a powerful method that allows the introduction of artificial electrical components into target cells to simulate ionic conductances and synaptic inputs. This method is based on a fast cycle of measuring the membrane potential of a cell, calculating the current of a desired simulated component using an appropriate model and injecting this current into the cell. Here we present a dynamic clamp protocol using free, fully integrated, open-source software (StdpC, for spike timing-dependent plasticity clamp). Use of this protocol does not require specialist hardware, costly commercial software, experience in real-time operating systems or a strong programming background. The software enables the configuration and operation of a wide range of complex and fully automated dynamic clamp experiments through an intuitive and powerful interface, with a minimal initial lead time of a few hours. After initial configuration, experimental results can be generated within minutes of establishing a cell recording.
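The measure-calculate-inject cycle described above can be sketched in a few lines, with the hardware I/O replaced by hypothetical callbacks; this illustrates the principle only, not StdpC's implementation:

```python
def dynamic_clamp_current(v_membrane, g_sim, e_rev):
    """Current for a simulated ohmic conductance: I = g * (E_rev - V)."""
    return g_sim * (e_rev - v_membrane)

def run_clamp(read_voltage, inject_current, g_sim, e_rev, n_cycles):
    """One dynamic-clamp cycle per iteration: measure V, compute I, inject.
    read_voltage/inject_current stand in for the DAQ interface."""
    for _ in range(n_cycles):
        v = read_voltage()                           # measure V_m (mV)
        i = dynamic_clamp_current(v, g_sim, e_rev)   # nA for g in uS
        inject_current(i)                            # inject into the cell
```

In a real setup this loop must complete at tens of kilohertz, which is why dedicated real-time software such as StdpC handles the timing.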
Asymmetry analysis of the arm segments during forward handspring on floor
Asymmetry in gymnastics underpins successful performance and may also have implications as an injury mechanism; therefore, understanding of this concept could be useful for coaches and clinicians. The aim of this study was to examine kinematic and external kinetic asymmetry of the arm segments during the contact phase of a fundamental skill, the forward handspring on floor. Using a repeated single-subject design, six female national elite gymnasts (age: 19 ± 1.5 years, mass: 58.64 ± 3.72 kg, height: 1.62 ± 0.41 m) each performed 15 forward handsprings while synchronised 3D kinematic and kinetic data were collected. Asymmetry between the lead and non-lead side arms was quantified during each trial. Significant kinetic asymmetry was observed for all gymnasts (p < 0.005), with the direction of the asymmetry being related to the lead leg. All gymnasts displayed kinetic asymmetry for ground reaction force. Kinematic asymmetry was present for more gymnasts at the shoulder than the distal joints. These findings provide useful information for coaching gymnastics skills, which may subjectively appear to be symmetrical. The observed asymmetry has both performance and injury implications.
Ensemble prediction for nowcasting with a convection-permitting model—I: description of the system and the impact of radar-derived surface precipitation rates
A key strategy to improve the skill of quantitative precipitation forecasts, as well as of hazardous weather such as severe thunderstorms and flash floods, is to exploit observations of convective activity (e.g. from radar). In this paper, a convection-permitting ensemble prediction system (EPS), based on a 1.5 km grid-length version of the Met Office Unified Model and aimed at forecasting localized weather events with relatively short predictability time scales, is presented. Particular attention is given to the impact of using predicted observations of radar-derived precipitation intensity in the ensemble transform Kalman filter (ETKF) used within the EPS. Our initial results, based on a 24-member ensemble of forecasts for two summer case studies, show that the convective-scale EPS produces fairly reliable forecasts of temperature, horizontal winds and relative humidity at 1 h lead time, as evident from inspection of rank histograms. On the other hand, the rank histograms also suggest that the EPS generates too much spread for forecasts of (i) surface pressure and (ii) surface precipitation intensity. This may indicate that, for (i), the surface pressure observation error standard deviation used to generate the rank histograms is too large, while (ii) may be the result of non-Gaussian precipitation observation errors. However, further investigation is needed to better understand these findings. Finally, the inclusion of predicted observations of precipitation from radar in the 24-member EPS considered in this paper does not appear to improve the 1 h lead time forecast skill.
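The rank histograms used above to assess reliability count where each observation falls within the sorted ensemble; a flat histogram suggests a reliable ensemble, while a dome or U shape indicates over- or under-dispersion. A minimal sketch (names illustrative):

```python
import numpy as np

def rank_histogram(ensemble, obs):
    """ensemble: (n_cases, n_members); obs: (n_cases,).
    Rank of each observation among the ensemble members, binned
    over the n_members + 1 possible positions."""
    n_cases, n_members = ensemble.shape
    # Rank = number of members strictly below the observation
    ranks = (ensemble < obs[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=n_members + 1)
```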
Content and Feedback Analysis of YouTube Videos: Football Clubs and Fans as Brand Communities
The use of Web 2.0 tools has been transforming the interaction between companies and their clients, especially for those selling emotional products. Consumers generate and share content concerning their favourite products on the web. Even though this process has been widely acknowledged, only a few studies have been specifically devoted to the analysis of both the contents and the feedback that consumers receive from other users.
This article analyses the online presence of sport brands through contents generated by sport clubs (official contents) and their fans (User Generated Content, UGC) on YouTube. After a description and classification of video contents, it examines the factors that influence the performance of the videos in terms of passive (video views) and active behaviour (any kind of interaction with the videos) among the viewers.
To carry out this analysis, 125 YouTube channels were considered, accounting for a total of 375 videos.
Results show that official contents are those preferred by users/consumers and that, if a video displays passive/purely informative content, the chance of eliciting active behaviour from users tends to decrease.
These findings may help companies manage their online presence, creating awareness about the contents and information that should be spread and shared on the web.