
    MetaboLab - advanced NMR data processing and analysis for metabolomics

    Background: Despite the widespread use of Nuclear Magnetic Resonance (NMR) in metabolomics for the analysis of biological samples, there is a lack of graphically driven, publicly available software to process large one- and two-dimensional NMR data sets for statistical analysis.

    Results: Here we present MetaboLab, a MATLAB-based software package that facilitates NMR data processing by providing automated algorithms for processing series of spectra in a reproducible fashion. A graphical user interface provides easy access to all steps of data processing via a script builder that generates MATLAB scripts, with the option to alter the code manually. The analysis of two-dimensional spectra (1H,13C-HSQC spectra) is facilitated by a spectral library derived from publicly available databases, which can be extended readily. The software displays specific metabolites in small regions of interest where signals can be picked. To facilitate the analysis of series of two-dimensional spectra, different spectra can be overlaid and assignments can be transferred between them. The software includes mechanisms to account for overlapping signals by highlighting neighboring and ambiguous assignments.

    Conclusions: MetaboLab is an integrated software package for NMR data processing and analysis, closely linked to the previously developed NMRLab software. It includes tools for batch processing and gives access to the wealth of algorithms available in the MATLAB framework. Algorithms within MetaboLab help optimize the flow of metabolomics data preparation for statistical analysis. The combination of an intuitive graphical user interface with advanced data processing algorithms facilitates the use of MetaboLab in a broader metabolomics context.
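    The reproducible batch processing of a series of spectra described above can be sketched as a minimal pipeline. This is an illustration in Python, not MetaboLab's actual MATLAB API; the function name and the choice of total-area normalization are assumptions standing in for one typical preprocessing step applied uniformly before statistical analysis.

```python
def normalize_total_area(spectra):
    """Scale each 1D spectrum so its summed intensity equals 1.

    Applying the same deterministic step to every spectrum in a
    series is the kind of reproducible batch operation a script
    builder would generate. Illustrative only, not MetaboLab code.
    """
    normalized = []
    for spectrum in spectra:
        total = sum(spectrum)
        if total == 0:
            raise ValueError("spectrum has zero total intensity")
        normalized.append([point / total for point in spectrum])
    return normalized

# Example: three toy "spectra" of differing overall intensity
series = [[1.0, 3.0, 6.0], [2.0, 2.0, 4.0], [0.5, 0.5, 1.0]]
norm = normalize_total_area(series)
```

    After normalization every spectrum has the same total intensity, so signal areas are directly comparable across the series.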

    Using the Traditional Ex Vivo Whole Blood Model to Discriminate Bacteria by Their Inducible Host Responses

    © 2024 by the authors. Whole blood models are rapid and versatile for determining immune responses to inflammatory and infectious stimuli, but they have not been used for bacterial discrimination. Staphylococcus aureus, S. epidermidis and Escherichia coli are the most common causes of invasive disease, and rapid testing strategies utilising host responses remain elusive. Currently, immune responses can only discriminate between microbial 'domains' (fungi, bacteria and viruses), and very few studies have used immune responses to discriminate bacteria at the species and strain level. Here, whole blood was used to investigate the relationship between host responses and bacterial strains. Results confirmed unique temporal profiles for the 10 parameters studied: IL-6, MIP-1α, MIP-3α, IL-10, resistin, phagocytosis, S100A8, S100A8/A9, C5a and TF3. Pairwise analysis confirmed that IL-6, resistin, phagocytosis, C5a and S100A8/A9 could be used in a discrimination scheme to identify bacteria to the strain level. Linear discriminant analysis (LDA) confirmed that (i) IL-6, MIP-3α and TF3 could predict genus with 95% accuracy; (ii) IL-6, phagocytosis, resistin and TF3 could predict species with 90% accuracy; and (iii) phagocytosis, S100A8 and IL-10 could predict strain with 40% accuracy. These data are important because they confirm the proof of concept that host biomarker panels could be used to identify bacterial pathogens.
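    The classification step can be illustrated with a nearest-centroid classifier, a simpler relative of the LDA used in the study: each species is represented by the mean of its biomarker vectors, and a new sample is assigned to the nearest mean. The biomarker values below are invented for illustration and do not come from the paper.

```python
import math

def centroid(vectors):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Assign the sample to the class with the nearest centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Hypothetical training readouts: [IL-6, phagocytosis, resistin]
# in arbitrary units, invented purely for illustration.
training = {
    "S. aureus": [[5.0, 2.0, 1.0], [5.5, 2.2, 0.9]],
    "E. coli":   [[1.0, 4.0, 3.0], [1.2, 3.8, 3.1]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}
prediction = classify([5.2, 2.1, 1.0], centroids)
```

    LDA goes further than this sketch by also accounting for within-class covariance between biomarkers, which matters when markers such as IL-6 and resistin are correlated.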

    The design-by-adaptation approach to universal access: learning from videogame technology

    This paper proposes an alternative approach to the design of universally accessible interfaces to that provided by formal design frameworks applied ab initio to the development of new software. This approach, design-by-adaptation, involves the transfer of interface technology and/or design principles from one application domain to another, in situations where the recipient domain is similar to the host domain in terms of modelled systems, tasks and users. Using the example of interaction in 3D virtual environments, the paper explores how principles underlying the design of videogame interfaces may be applied to a broad family of visualization and analysis software which handles geographical data (virtual geographic environments, or VGEs). One of the motivations behind the current study is that VGE technology lags some way behind videogame technology in the modelling of 3D environments, and has a less-developed track record in providing the variety of interaction methods needed for users with varied levels of experience to undertake varied tasks in 3D virtual worlds. The current analysis extracted a set of interaction principles from videogames, which were used to devise a set of 3D task interfaces that have been implemented in a prototype VGE for formal evaluation.

    High mobility explains demand sharing and enforced cooperation in egalitarian hunter-gatherers.

    'Simple' hunter-gatherer populations adopt the social norm of 'demand sharing', an example of human hyper-cooperation whereby food brought into camps is claimed and divided by group members. Explaining how demand sharing evolved without punishment of free riders, who rarely hunt but receive resources from active hunters, has been a long-standing problem. Here we show through a simulation model that demand-sharing families that continuously move between camps in response to their energy income are able to survive in the unpredictable environments typical of hunter-gatherers, while non-sharing families and sedentary families perish. Our model also predicts that non-producers (free riders, pre-adults and post-productive adults) can be sustained in relatively high numbers. As most of hominin prehistory unfolded in hunter-gatherer settings, demand sharing may be an ancestral manifestation of hyper-cooperation and inequality aversion, allowing exploration of high-quality, hard-to-acquire resources, the evolution of fluid co-residence patterns and egalitarian resource distribution in the absence of punishment or warfare.
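    The pooling mechanism at the heart of such models can be sketched in a toy simulation. This is not the paper's model: it covers only the demand-sharing component (mobility between camps is omitted), and all parameter values are invented for illustration.

```python
import random

def simulate(sharing, periods=50, camps=4, families_per_camp=5, seed=1):
    """Toy demand-sharing model (illustrative only): each family
    draws a noisy energy income each period; in sharing camps the
    income is pooled and split equally. Returns final energies,
    floored at zero, as a list of camps."""
    rng = random.Random(seed)
    energies = [[10.0] * families_per_camp for _ in range(camps)]
    for _ in range(periods):
        for camp in energies:
            incomes = [rng.gauss(1.0, 2.0) for _ in camp]
            if sharing:
                pooled = sum(incomes) / len(incomes)
                incomes = [pooled] * len(incomes)
            for i, income in enumerate(incomes):
                camp[i] = max(0.0, camp[i] + income - 1.0)  # 1.0 = daily cost
    return energies

shared = simulate(sharing=True)
unshared = simulate(sharing=False)
```

    Pooling makes every family in a camp face the camp's average return rather than its own draw, which is the variance-buffering effect that lets sharing families ride out bad foraging days.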

    Azithromycin plus chloroquine: combination therapy for protection against malaria and sexually transmitted infections in pregnancy

    INTRODUCTION: The first-line therapy for the intermittent preventive treatment of malaria in pregnancy (IPTp) is sulphadoxine-pyrimethamine (SP). There is an urgent need to identify safe, well-tolerated and efficacious alternatives to SP due to widespread Plasmodium falciparum resistance. Combination therapy using azithromycin and chloroquine is one possibility; it has demonstrated an adequate parasitological response above 95% in clinical trials of non-pregnant adults in sub-Saharan Africa, where IPTp is government policy in 33 countries. AREAS COVERED: Key safety, tolerability and efficacy data are presented for azithromycin and chloroquine, alone and/or in combination, when used to prevent and/or treat P. falciparum, P. vivax, and several curable sexually transmitted and reproductive tract infections (STI/RTI). Pharmacokinetic evidence from pregnant women is also summarized for both compounds. EXPERT OPINION: The azithromycin-chloroquine regimen that has demonstrated consistent efficacy in non-pregnant adults is a 3-day course containing daily doses of 1 g of azithromycin and 600 mg base of chloroquine. The pharmacokinetic evidence for these compounds individually suggests that dose adjustments may not be necessary when they are used in combination against P. falciparum, P. vivax and several curable STI/RTI among pregnant women, although clinical confirmation will be necessary. Mass trachoma-treatment campaigns have shown that azithromycin selects for macrolide resistance in the pneumococcus, which reverses following the completion of therapy. Most importantly, no evidence to date suggests that azithromycin induces pneumococcal resistance to penicillin.

    Agricultural climate change mitigation: Carbon calculators as a guide for decision making

    This is an Accepted Manuscript of an article published by Taylor & Francis Group in International Journal of Agricultural Sustainability on 9 November 2017, available online: https://doi.org/10.1080/14735903.2017.1398628. Under embargo; embargo end date: 9 November 2018.
    The dairy industry is receiving considerable attention in relation to both its significant greenhouse gas (GHG) emissions and its potential for reducing those emissions, contributing towards meeting national targets and driving the industry towards sustainable intensification. However, the extent to which improvements can be made depends on the decision-making processes of individual producers, so there has been a proliferation of carbon accounting tools seeking to influence those processes. This paper evaluates the suitability of such tools for driving environmental change by influencing on-farm management decisions. Seven tools suitable for the European dairy industry were identified, their characteristics evaluated, and each was used to process data relating to six scenario farms, emulating the processes undertaken in real farm management situations. As a result of the range of approaches taken by the tools, there was limited agreement between them on the magnitude of GHG emissions, and no consistent pattern as to which tools produced the highest or lowest results. Despite this, it is argued that, as there was agreement on which farm activities were responsible for the greatest emissions, the more complex tools were still capable of performing a 'decision support' role and guiding management decisions, while others could merely focus attention on key issues. Peer reviewed.
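    The arithmetic shared by such calculators is a sum of activity data multiplied by emission factors. The sketch below uses invented factor values purely for illustration; the study's point is precisely that real tools differ in which factors and system boundaries they use, so the totals below should not be read as realistic.

```python
# Hypothetical emission factors in kg CO2e per unit of activity,
# invented for illustration; real calculators use published,
# country- and system-specific factors.
EMISSION_FACTORS = {
    "milk_litres":     1.1,  # kg CO2e per litre of milk produced
    "diesel_litres":   2.7,  # kg CO2e per litre of diesel burned
    "fertiliser_kg_n": 6.0,  # kg CO2e per kg of nitrogen applied
}

def farm_footprint(activity_data):
    """Sum of activity amount x emission factor: the core
    calculation behind farm-level carbon calculators."""
    total = 0.0
    for activity, amount in activity_data.items():
        total += amount * EMISSION_FACTORS[activity]
    return total

farm = {"milk_litres": 500_000, "diesel_litres": 8_000, "fertiliser_kg_n": 10_000}
total_kg = farm_footprint(farm)
```

    Two tools loaded with different factor tables will disagree on the total while still agreeing that, say, milk production dominates the footprint, which is the pattern of partial agreement the study reports.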

    Good practice in social care: the views of people with severe and complex needs and those who support them

    This paper reports findings drawn from a study of good practice in English social care for adults with disability and older people with severe and complex needs. People with severe and complex needs are a relatively small proportion of adult social care service users, but they are growing in numbers and have resource-intensive needs. The study involved qualitative research with adults with disability and older people with severe and complex needs, family carers and members of specialist organisations (n = 67), focusing on the features of social care services they considered to be good practice. Data were collected between August 2010 and June 2011. The approach to data collection was flexible, to accommodate participants' communication needs and preferences, including face-to-face and telephone interviews, Talking Mats© sessions and a focus group. Data were managed using Framework and analysed thematically. Features of good practice were considered at three levels: (i) everyday support, (ii) service organisation, and (iii) commissioning. Findings relating to the first two of these are presented here. Participants emphasised the importance of person-centred ways of working at all levels. Personalisation, as currently implemented in English social care, aims to shift power from professionals to service users through the allocation of personal budgets. This approach focuses very much on the role of the individual in directing his/her own support arrangements. However, participants in this study also stressed the importance of ongoing professional support, for example from a specialist key worker or case manager, to co-ordinate diverse services and ensure good practice at an organisational level. The paper argues that, despite the recent move to shift power from professionals to service users, people with the most complex needs still value support from professionals and appropriate organisational support. Without these, they risk being excluded from the benefits that personalisation, properly supported, could yield.
    Keywords: continuity of care; dementia; people with disability; qualitative research; service delivery and organisation

    Testing the feasibility of the Dignity Therapy interview: adaptation for the Danish culture

    Background: 'Dignity Therapy' (DT) is a brief, flexible intervention that allows patients to complete an interview and create a document about their life, identity and what they want to leave in writing for their loved ones. DT is based on the DT Question Protocol. Developed and tested in English-speaking settings, DT has proven to be a feasible and effective way to enhance patient dignity while diminishing suffering and depression. The aim of this study was to test the acceptability and feasibility of the DT Question Protocol among Danish health professionals and cancer patients, and to obtain preliminary estimates of patient uptake for DT. These results will be used to inform a larger evaluation study.

    Method: Ten professionals were interviewed about their perception of DT and the Question Protocol. It was then tested with 20 patients at two palliative care sites and one gynecologic oncology department. Data were analyzed using content analysis techniques to evaluate the protocol for relevance, acceptability and comprehension. The interest and relevance of the intervention were also determined by examining the preliminary participation rate.

    Results: Overall, DT was perceived to be comprehensible and relevant. Professionals highlighted six concerns that might warrant modification. These issues were examined using patient data. Some of the patients' concerns overlapped with those raised by the professionals (e.g. 'unacceptable self-praise' and 'interference with the lives of others'). Tailoring DT to Danish culture required easily accommodated adjustments to the procedures and the DT Question Protocol. Some concerns expressed by health professionals may have reflected protectiveness toward the patients. While the intervention was relevant and manageable for patients admitted to palliative care, DT was less easily implemented at the gynecologic oncology department.

    Conclusion: Based on patients' and professionals' reactions to the DT Question Protocol, and on the preliminary proportion of participants accepting DT, the protocol - with minor adaptations - appears to be a manageable, acceptable and relevant intervention for Danish patients admitted to palliative care.

    Increasing Neff with particles in thermal equilibrium with neutrinos

    Recent work on increasing the effective number of neutrino species (Neff) in the early universe has focussed on introducing extra relativistic species ('dark radiation'). We draw attention to another possibility: a new particle of mass less than 10 MeV that remains in thermal equilibrium with neutrinos until it becomes non-relativistic increases the neutrino temperature relative to the photons. We demonstrate that this leads to a value of Neff greater than three, and that Neff at CMB formation is larger than at BBN. We investigate the constraints on such particles from the primordial abundances of helium and deuterium created during BBN and from the CMB power spectrum measured by ACT and SPT, and find that they are presently relatively unconstrained. We forecast the sensitivity of the Planck satellite to this scenario: in addition to dramatically improving constraints on the particle mass, in some regions of parameter space it can discriminate between the new particle being a real or complex scalar.
    Comment: 10 pages, 5 figures; v2 matches the version to appear in JCA
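    The neutrino-heating effect can be made explicit with a textbook entropy-conservation sketch (a simplified limit, not the paper's full temperature-evolution treatment): assume a boson with g internal degrees of freedom stays in equilibrium with the three neutrino species and annihilates entirely after neutrino decoupling, so its entropy is transferred to the neutrinos alone.

```latex
% Comoving entropy of the neutrino sector is conserved; the three
% neutrino species contribute g_* = 3 \times 2 \times 7/8 = 21/4,
% and the boson adds g before it becomes non-relativistic:
\frac{T_\nu^{\mathrm{after}}}{T_\nu^{\mathrm{std}}}
  = \left(\frac{21/4 + g}{21/4}\right)^{1/3},
\qquad
N_{\mathrm{eff}}
  = 3\left(\frac{T_\nu^{\mathrm{after}}}{T_\nu^{\mathrm{std}}}\right)^{4}
  = 3\left(1 + \frac{4g}{21}\right)^{4/3}.
```

    In this idealized limit a real scalar (g = 1) gives Neff ≈ 3.79 and a complex scalar (g = 2) a still larger value, which is why, as the abstract notes, Neff exceeds three and the real/complex cases are in principle distinguishable.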

    Rapid neurogenesis through transcriptional activation in human stem cells

    Advances in cellular reprogramming and stem cell differentiation now enable ex vivo studies of human neuronal differentiation. However, it remains challenging to elucidate the underlying regulatory programs because differentiation protocols are laborious and often result in low neuron yields. Here, we overexpressed two Neurogenin transcription factors in human induced pluripotent stem cells and obtained neurons with bipolar morphology in 4 days, at greater than 90% purity. The high purity enabled mRNA and microRNA expression profiling during neurogenesis, thus revealing the genetic programs involved in the rapid transition from stem cell to neuron. The resulting cells exhibited transcriptional, morphological and functional signatures of differentiated neurons, with greatest transcriptional similarity to prenatal human brain samples. Our analysis revealed a network of key transcription factors and microRNAs that promoted loss of pluripotency and rapid neurogenesis via progenitor states. Perturbations of key transcription factors affected the homogeneity and phenotypic properties of the resulting neurons, suggesting that a systems-level view of the molecular biology of differentiation may guide subsequent manipulation of human stem cells to rapidly obtain diverse neuronal types.