
    Qualitative evaluation of media device orchestration for immersive spatial audio reproduction

    The challenge of installing and setting up dedicated spatial audio systems can make it difficult to deliver immersive listening experiences to the general public. However, the proliferation of smart mobile devices and the rise of the Internet of Things mean that there are increasing numbers of connected devices capable of producing audio in the home. "Media device orchestration" (MDO) is the concept of utilizing an ad hoc set of devices to deliver or augment a media experience. In this paper, the concept is evaluated by implementing MDO for augmented spatial audio reproduction using object-based audio with semantic metadata. A thematic analysis of positive and negative listener comments about the system revealed three main categories of response: perceptual, technical, and content-dependent aspects. MDO performed particularly well in terms of immersion/envelopment, but the quality of the listening experience was partly dependent on loudspeaker quality and listener position. Suggestions for further development based on these categories are given.

    Study of Optimal Perimetric Testing In Children (OPTIC): developing consensus and setting research priorities for perimetry in the management of children with glaucoma

    BACKGROUND: Perimetry is important in the management of children with glaucoma, but there is limited evidence-based guidance on its use. We report an expert consensus-based study to update guidance and identify areas requiring further research. METHODS: Experts were invited to participate in a modified Delphi consensus process. Panel selection was based on clinical experience of managing children with glaucoma and UK-based training, to minimise diversity of view due to healthcare setting. Questionnaires were delivered electronically and analysed to establish 'agreement'. Divergence of opinions was investigated and resolved where possible through further iterations. RESULTS: 7/9 experts invited agreed to participate. Consensus (≥5/7 (71%) in agreement) was achieved for 21/26 (80.8%) items in 2 rounds, generating recommendations to start perimetry from approximately 7 years of age (IQR: 6.75-7.25) and to use qualitative methods in conjunction with automated reliability indices to assess test quality. There was a lack of agreement about defining progressive visual field (VF) loss and about methods for implementing perimetry longitudinally. Panel members highlighted the importance of informing decisions based upon individual circumstances, from gauging maturity/capability when selecting tests and interpreting outcomes, to accounting for specific clinical features (e.g. poor IOP control and/or suspected progressive VF loss) when deciding the frequency of testing. CONCLUSIONS: There is commonality of expert views in relation to implementing perimetry and interpreting test quality in the management of children with glaucoma. However, there remains a lack of agreement about defining progressive VF loss and about utilising perimetry over an individual's lifetime, highlighting the need for further research.

    The effects of temperature on nitrous oxide and oxygen mixture homogeneity and stability

    BACKGROUND: For many long-standing practices, the rationale for them is often lost as time passes. This is the situation with respect to the storage and handling of equimolar 50% nitrous oxide and 50% oxygen volume/volume (v/v) mixtures. METHODS: A review of the existing literature was undertaken to examine the developmental history of nitrous oxide and oxygen mixtures for anesthesia and analgesia, to ascertain whether sufficient bibliographic data were available to support the position that the contents of a cylinder of a 50%/50% volume/volume (v/v) mixture of nitrous oxide and oxygen are in a homogeneous single gas phase under normal conditions of handling and storage, and to determine whether justification could be found for the standard instructions given for handling before use. RESULTS: After ranking and removing duplicates, a total of fifteen articles were identified by the various search strategies and formed the basis of this literature review. Several studies confirmed that a 50%/50% v/v mixture of nitrous oxide and oxygen is in a homogeneous single gas phase in a filled cylinder under normal conditions of handling and storage. The effect of temperature on the change of phase of the nitrous oxide in this mixture was further examined by several authors. These studies demonstrated that although it is possible to cause condensation and phase separation by cooling the cylinder, if the cylinder is allowed to rewarm to room temperature for at least 48 hours, preferably in a horizontal orientation, and is inverted three times before use, it consistently delivers the proper proportions of the component gases as a homogeneous mixture. CONCLUSIONS: The contents of a cylinder of a 50%/50% volume/volume (v/v) mixture of nitrous oxide and oxygen are in a homogeneous single gas phase under normal conditions of handling and storage, and the standard instructions given for handling before use are justified by previously conducted studies.

    Study of Optimal Perimetric Testing In Children (OPTIC): Development and feasibility of the kinetic perimetry reliability measure (KPRM)

    INTRODUCTION: Interpretation of perimetric findings, particularly in children, relies on accurate assessment of test reliability, yet no objective measures of reliability exist for kinetic perimetry. We developed the kinetic perimetry reliability measure (KPRM), a quantitative measure of perimetric test reproducibility/reliability, and report here its feasibility and association with subjective assessment of reliability. METHODS: Children aged 5-15 years, without an ophthalmic condition that affects the visual field, were recruited from Moorfields Eye Hospital and underwent Goldmann perimetry as part of a wider research programme on perimetry in children. Subjects were tested with two isopters and the blind spot was plotted, followed by a KPRM. Test reliability was also scored qualitatively using our examiner-based assessment of reliability (EBAR) scoring system, which standardises the conventional clinical approach to assessing test quality. The relationship between KPRM and EBAR was examined to explore the use of KPRM in assessing the reliability of kinetic fields. RESULTS: A total of 103 children (median age 8.9 years; IQR: 7.1 to 11.8 years) underwent Goldmann perimetry with KPRM and EBAR scoring. A KPRM was achieved by all children. KPRM values increased with reducing test quality (Kruskal-Wallis, p=0.005), indicating greater test-retest variability, and reduced with age (linear regression, p=0.015). One of 103 children (0.97%) demonstrated discordance between EBAR and KPRM. CONCLUSION: KPRM and EBAR are distinct but complementary approaches. Though scores show excellent agreement, KPRM is able to quantify within-test variability, providing data not captured by subjective assessment. Thus, we suggest combining KPRM with EBAR to aid interpretation of kinetic perimetry test reliability in children.
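    For readers who want to reproduce the style of analysis summarised above, the sketch below shows how the two reported statistics (a Kruskal-Wallis test of KPRM across EBAR quality grades, and a linear regression of KPRM on age) might be computed in Python. It is not the study's code; the KPRM values, EBAR groupings, and ages are placeholder assumptions.

```python
# Minimal sketch, assuming placeholder data: Kruskal-Wallis test of KPRM across
# EBAR quality grades, and linear regression of KPRM against age.
from scipy.stats import kruskal, linregress

# KPRM values grouped by EBAR quality grade (e.g. good / fair / poor) -- placeholders
kprm_good = [1.2, 1.5, 1.1, 1.8, 1.4]
kprm_fair = [2.0, 2.4, 1.9, 2.6]
kprm_poor = [3.1, 2.8, 3.5, 3.0]
h_stat, p_quality = kruskal(kprm_good, kprm_fair, kprm_poor)

# KPRM against age in years -- placeholders
ages = [5.5, 6.8, 8.9, 10.2, 12.4, 14.7]
kprm = [3.2, 2.9, 2.1, 1.8, 1.5, 1.2]
fit = linregress(ages, kprm)

print(f"Kruskal-Wallis p = {p_quality:.3f}; age slope = {fit.slope:.2f}, p = {fit.pvalue:.3f}")
```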

    A review of reporting of participant recruitment and retention in RCTs in six major journals

    BACKGROUND: Poor recruitment and retention of participants in randomised controlled trials (RCTs) is problematic but common. Clear and detailed reporting of participant flow is essential to assess the generalisability and comparability of RCTs. Despite improved reporting since the implementation of the CONSORT statement, important problems remain. This paper aims: (i) to update and extend previous reviews evaluating reporting of participant recruitment and retention in RCTs; and (ii) to quantify the level of participation throughout RCTs. METHODS: We reviewed all reports of RCTs of health care interventions and/or processes with individual randomisation published July–December 2004 in six major journals. Short, secondary or interim reports and Phase I/II trials were excluded. Data recorded were: general RCT details; inclusion of a flow diagram; participant flow throughout the trial; reasons for non-participation/withdrawal; and target sample sizes. RESULTS: 133 reports were reviewed. Overall, 79% included a flow diagram, but over a third were incomplete. The majority reported the flow of participants at each stage of the trial after randomisation; however, 40% failed to report the numbers assessed for eligibility. Percentages of participants retained at each stage were high: for example, 90% of eligible individuals were randomised, and 93% of those randomised were outcome assessed. On average, trials met their sample size targets, but there were some substantial shortfalls: for example, 21% of trials reporting a sample size calculation failed to achieve adequate numbers at randomisation, and 48% at outcome assessment. Reporting of losses to follow up was variable and difficult to interpret. CONCLUSION: The majority of RCTs reported post-randomisation participant flow well, although only two-thirds included a complete flow chart and there was great variability in the definition of "lost to follow up". Reporting of participant eligibility was poor, making assessments of recruitment practice and external validity difficult. Reporting of participant flow throughout RCTs could be improved by small changes to the CONSORT chart.

    The Mitochondrial Ca(2+) Uniporter: Structure, Function, and Pharmacology.

    Mitochondrial Ca(2+) uptake is crucial for an array of cellular functions, while an imbalance can elicit cell death. In this chapter, we briefly review the various modes of mitochondrial Ca(2+) uptake and our current understanding of mitochondrial Ca(2+) homeostasis with regard to cell physiology and pathophysiology. Further, this chapter focuses on the molecular identities and intracellular regulators of the mitochondrial Ca(2+) uniporter complex, as well as its pharmacology.

    Increased Litterfall in Tropical Forests Boosts the Transfer of Soil CO2 to the Atmosphere

    Aboveground litter production in forests is likely to increase as a consequence of elevated atmospheric carbon dioxide (CO2) concentrations, rising temperatures, and shifting rainfall patterns. As litterfall represents a major flux of carbon from vegetation to soil, changes in litter inputs are likely to have wide-reaching consequences for soil carbon dynamics. Such disturbances to the carbon balance may be particularly important in the tropics because tropical forests store almost 30% of the global soil carbon, making them a critical component of the global carbon cycle; nevertheless, the effects of increasing aboveground litter production on belowground carbon dynamics are poorly understood. We used long-term, large-scale monthly litter removal and addition treatments in a lowland tropical forest to assess the consequences of increased litterfall for belowground CO2 production. Over the second to fifth years of treatment, litter addition increased soil respiration more than litter removal decreased it: soil respiration was on average 20% lower in the litter removal treatment and 43% higher in the litter addition treatment compared to the controls, but litter addition did not change microbial biomass. We predicted a 9% increase in soil respiration in the litter addition plots, based on the 20% decrease in the litter removal plots and an 11% reduction due to lower fine root biomass in the litter addition plots. The measured 43% increase in soil respiration was therefore 34 percentage points higher than predicted, and it is possible that this 'extra' CO2 was a result of priming effects, i.e. stimulation of the decomposition of older soil organic matter by the addition of fresh organic matter. Our results show that increases in aboveground litter production as a result of global change have the potential to cause considerable losses of soil carbon to the atmosphere in tropical forests.
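    The prediction arithmetic summarised above can be restated in one line. The figures are taken directly from the abstract; the symbol ΔR for the change in soil respiration is introduced here only for clarity.

```latex
\Delta R_{\text{predicted}} \approx 20\% - 11\% = 9\%, \qquad
\Delta R_{\text{measured}} - \Delta R_{\text{predicted}} = 43\% - 9\% = 34 \text{ percentage points}
```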

    The Open AUC Project

    Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines ongoing work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will be released under the GNU General Public License to ensure that the intellectual property remains available in open-source form. The Open AUC strategy facilitates collaboration, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple, interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, and collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease the integration of data analysis and interpretation software.

    A biophysical model of dynamic balancing of excitation and inhibition in fast oscillatory large-scale networks

    Over long timescales, neuronal dynamics can be robust to quite large perturbations, such as changes in white matter connectivity and grey matter structure arising from learning, aging, development, and certain disease processes. One possible explanation is that robust dynamics are facilitated by homeostatic mechanisms that can dynamically rebalance brain networks. In this study, we simulate a cortical brain network using the Wilson-Cowan neural mass model with conduction delays and noise, and use inhibitory synaptic plasticity (ISP) to dynamically achieve a spatially local balance between excitation and inhibition (E/I). Using MEG data from 55 subjects, we find that ISP enables us to simultaneously achieve high correlation with multiple measures of functional connectivity, including amplitude envelope correlation and phase locking. Further, we find that ISP successfully achieves local E/I balance and can consistently predict the functional connectivity computed from real MEG data for a much wider range of model parameters than is possible with a model without ISP.
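    As a rough illustration of the balancing mechanism described above, the sketch below integrates a single Wilson-Cowan node and lets an ISP rule adapt the inhibitory-to-excitatory weight so that the excitatory rate settles near a target. It is not the authors' model (which includes conduction delays, noise, and a full cortical network); the parameter values, sigmoid, and the specific plasticity rule are illustrative assumptions.

```python
# Minimal sketch, assuming illustrative parameters: one Wilson-Cowan node with an
# inhibitory synaptic plasticity (ISP) rule that drives the excitatory rate
# towards a target, demonstrating local E/I balancing.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

dt = 1e-3                    # integration step (s)
tau_e, tau_i = 0.01, 0.02    # population time constants (s), assumed
w_ee, w_ei = 3.5, 2.5        # fixed E->E and E->I weights, assumed
w_ie = 2.5                   # I->E weight, adapted by ISP
eta = 0.1                    # ISP learning rate, assumed
rho0 = 0.2                   # target excitatory rate, assumed
P = 1.0                      # constant external drive to E, assumed

E, I = 0.1, 0.1
for step in range(200_000):
    # Wilson-Cowan rate dynamics (noise and conduction delays omitted here)
    dE = (-E + sigmoid(w_ee * E - w_ie * I + P)) / tau_e
    dI = (-I + sigmoid(w_ei * E)) / tau_i
    E += dt * dE
    I += dt * dI
    # ISP: potentiate the I->E weight when E runs above target, depress it below
    w_ie += eta * dt * I * (E - rho0)
    w_ie = max(w_ie, 0.0)    # keep the inhibitory weight non-negative

print(f"steady excitatory rate ~ {E:.3f} (target {rho0}), learned w_ie = {w_ie:.2f}")
```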

    Exhaustive prediction of disease susceptibility to coding base changes in the human genome

    BACKGROUND: Single nucleotide polymorphisms (SNPs) are the most abundant form of genomic variation and can cause phenotypic differences between individuals, including diseases. Bases are subject to varying levels of selection pressure, reflected in their inter-species conservation. RESULTS: We propose a method, not dependent on transcription information, to score each coding base in the human genome according to the disease probability associated with its mutation. Twelve factors likely to be associated with disease alleles were chosen as the input for a support vector machine prediction algorithm. The analysis yielded 83% sensitivity and 84% specificity in segregating disease-like alleles, as found in the Human Gene Mutation Database, from non-disease-like alleles, as found in the Database of Single Nucleotide Polymorphisms. This algorithm was subsequently applied to each base within all known human genes, exhaustively confirming that inter-species conservation is the strongest factor for disease association. For each gene, the length-normalized average disease potential score was calculated. Of the 30 genes with the highest scores, 21 are directly associated with a disease; in contrast, of the 30 genes with the lowest scores, only one is associated with a disease in the published literature. These results strongly suggest that the highest-scoring genes are enriched for those that might contribute to disease if mutated. CONCLUSION: This method provides valuable information for identifying sensitive positions in genes with a high disease probability, enabling researchers to optimize experimental designs and interpret data emerging from genetic and epidemiological studies.
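    As a hedged illustration of the classification step described above, the sketch below trains a support vector machine on a synthetic 12-feature data set and reports sensitivity and specificity. It is not the authors' pipeline; the synthetic data, kernel, and hyperparameters are assumptions made only for demonstration.

```python
# Illustrative sketch, assuming synthetic data: an SVM over a 12-feature
# representation of coding-base variants, evaluated by sensitivity/specificity.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n, n_features = 2000, 12             # 12 factors per base, as in the abstract
X = rng.normal(size=(n, n_features))
# Synthetic labels: 1 = disease-like allele, 0 = non-disease-like allele
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```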