1,966 research outputs found

    Architecture and Neuroscience; what can the EEG recording of brain activity reveal about a walk through everyday spaces?

    New digital media and quantitative data are increasingly used to map, understand and analyse spaces. Each medium with which we analyse and map spaces offers a different insight, and can potentially expand our tools and methods for mapping spaces and understanding human experience. The emergence of such technologies has the potential to influence the way in which we map, analyse and perceive spaces. Given this context, the project presented in this paper examines how neurophysiological data, recorded with portable electroencephalography (EEG) devices, can help us understand how the brain responds to physical environments in different individuals. In this study we examine how a number of participants navigate an urban environment, moving between specific identified buildings in the city. The brain activity of each participant is recorded with a portable EEG device while the route is simultaneously video-recorded. Through this experiment we aim to observe and analyse the relationship between the physical environment and the participants' brain activity. We attempt to correlate key moments of their journey, such as moments of decision making, with recordings of specific brain waves. We map and analyse common patterns observed. We examine how variation in the physical attributes of the built environment around the participants relates to fluctuations in specific brain waves. This paper presents a specific project of an ongoing cross-disciplinary study between architecture and neuroscience, and the key findings of a specific experiment in an urban environment.
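The kind of brain-wave analysis the study describes can be sketched as a band-power computation over a short EEG epoch. This is a minimal illustration only: the 256 Hz sampling rate, the band edges, and the synthetic signal are assumptions, not the project's actual recording parameters or pipeline.

```python
import numpy as np

# Conventional band edges in Hz (assumed; labs differ on exact cut-offs).
FS = 256  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Return mean spectral power of the signal in each frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic 4-second epoch dominated by a 10 Hz (alpha-band) rhythm.
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

powers = band_powers(epoch)
print(max(powers, key=powers.get))  # the alpha band dominates this epoch
```

In a walking study, an epoch like this would be cut around a key moment (e.g. a decision point on the route) and its band powers compared against baseline segments.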

    Health inequalities in Germany: do regional-level variables explain differentials in cardiovascular risk?

    Breckenkamp J, Mielck A, Razum O. Health inequalities in Germany: do regional-level variables explain differentials in cardiovascular risk? BMC Public Health. 2007;7(1):132. Background: Socioeconomic status is a predictor not only of mortality, but also of cardiovascular risk and morbidity. An ongoing debate in the field of social inequalities and health focuses on two questions: 1) Is individual health status associated with individual income as well as with income inequality at the aggregate (e.g. regional) level? 2) If there is such an association, does it operate via a psychosocial pathway (e.g. stress) or via a "neo-materialistic" pathway (e.g. systematic under-investment in societal infrastructures)? For the first time in Germany, we investigate the association between cardiovascular health status and income inequality at the area level, controlling for individual socio-economic status. Methods: Individual-level explanatory variables (age, socio-economic status) and outcome data (body mass index, blood pressure, cholesterol level) as well as the regional-level variable (proportion of relative poverty) were taken from the baseline survey of the German Cardiovascular Prevention Study, a cross-sectional, community-based, multi-center intervention study comprising six socio-economically diverse intervention regions, each with about 1800 participants aged 25–69 years. Multilevel modeling was used to examine the effects of individual- and regional-level variables. Results: Regional effects are small compared to individual effects for all risk factors analyzed. Most of the total variance is explained at the individual level. Only for diastolic blood pressure in men and for cholesterol in both men and women is a statistically significant effect visible at the regional level. Conclusion: Our analysis does not support the assumption that, in Germany, cardiovascular risk factors are to a large extent associated with income inequality at the regional level.
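The multilevel finding (most variance explained at the individual level) rests on partitioning outcome variance between regions and individuals, summarized by the intraclass correlation (ICC). The sketch below illustrates that partitioning on simulated data; the region count mirrors the study design, but the effect sizes and noise levels are illustrative assumptions, not the study's data.

```python
import numpy as np

# Simulate six regions of ~1800 participants, with a small regional
# shift and large individual-level variation (all values assumed).
rng = np.random.default_rng(7)
n_regions, n_per = 6, 1800
region_effect = rng.normal(0, 0.5, n_regions)   # small between-region shifts
outcome = np.concatenate([
    mu + rng.normal(0, 5.0, n_per)              # large within-region noise
    for mu in region_effect
])
group = np.repeat(np.arange(n_regions), n_per)

# Variance between region means vs. average variance within regions.
between = np.var([outcome[group == g].mean() for g in range(n_regions)])
within = np.mean([outcome[group == g].var() for g in range(n_regions)])
icc = between / (between + within)
print(f"ICC = {icc:.3f}")  # small: most variance sits at the individual level
```

A real multilevel model would estimate these components jointly with covariates, but the ICC conveys the same "regional effects are small" conclusion.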

    Design principles for riboswitch function

    Scientific and technological advances that enable the tuning of integrated regulatory components to match network and system requirements are critical to reliably control the function of biological systems. RNA provides a promising building block for the construction of tunable regulatory components based on its rich regulatory capacity and our current understanding of the sequence–function relationship. One prominent example of RNA-based regulatory components is riboswitches, genetic elements that mediate ligand control of gene expression through diverse regulatory mechanisms. While characterization of natural and synthetic riboswitches has revealed that riboswitch function can be modulated through sequence alteration, no quantitative frameworks exist to investigate or guide riboswitch tuning. Here, we combined mathematical modeling and experimental approaches to investigate the relationship between riboswitch function and performance. Model results demonstrated that the competition between reversible and irreversible rate constants dictates performance for different regulatory mechanisms. We also found that practical system restrictions, such as an upper limit on ligand concentration, can significantly alter the requirements for riboswitch performance, necessitating alternative tuning strategies. Previous experimental data for natural and synthetic riboswitches as well as experiments conducted in this work support model predictions. From our results, we developed a set of general design principles for synthetic riboswitches. Our results also provide a foundation from which to investigate how natural riboswitches are tuned to meet systems-level regulatory demands.
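The central modeling result, that competition between reversible and irreversible rate constants dictates performance, can be illustrated with a toy two-state kinetic sketch: reversible ligand binding (kon, koff) races an irreversible commitment step (kirr, e.g. transcription past the decision point). All rate constants below are illustrative assumptions, not fitted riboswitch parameters.

```python
def fraction_committed_bound(ligand, kon=1.0, koff=1.0, kirr=0.1,
                             dt=1e-3, t_end=50.0):
    """Euler-integrate a two-state master equation and return the
    fraction of RNAs that commit irreversibly while ligand-bound."""
    u, b = 1.0, 0.0          # P(unbound), P(bound), not yet committed
    committed_bound = 0.0
    for _ in range(int(t_end / dt)):
        du = (-kon * ligand * u + koff * b - kirr * u) * dt
        db = (kon * ligand * u - koff * b - kirr * b) * dt
        committed_bound += kirr * b * dt
        u += du
        b += db
    return committed_bound

# When commitment is slow relative to binding (kirr << koff), binding
# equilibrates first and more RNAs decide in the ligand-bound state;
# when commitment is fast, most RNAs commit before ever binding ligand.
slow = fraction_committed_bound(1.0, kirr=0.01)
fast = fraction_committed_bound(1.0, kirr=10.0)
print(slow > fast)
```

This is the qualitative sense in which the ratio of reversible to irreversible rates, rather than binding affinity alone, sets how strongly ligand can control the regulatory outcome.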

    Lynch syndrome: barriers to and facilitators of screening and disease management

    Background Lynch syndrome is a hereditary cancer syndrome whose confirmed carriers are at high risk for colorectal cancer (CRC) and extracolonic cancers. The purpose of the current study was to develop a greater understanding of the factors influencing decisions about disease management after genetic testing. Methods The study used a grounded theory approach to data collection and analysis as part of a multiphase project examining the psychosocial and behavioral impact of predictive DNA testing for Lynch syndrome. Individual and small group interviews were conducted with individuals from 10 families with the MSH2 intron 5 splice site mutation or exon 8 deletion. The data from confirmed carriers (n = 23) were subjected to re-analysis to identify key barriers to and/or facilitators of screening and disease management. Results Thematic analysis identified personal, health care provider and health care system factors as dominant barriers to and/or facilitators of managing Lynch syndrome. Person-centered factors reflect risk perceptions, decision-making, and sustained engagement with screening/disease management. The perceived knowledge and clinical management skills of health care providers also influenced participation in recommended protocols. The health care system barriers/facilitators are defined in terms of continuity of care and coordination of services among providers. Conclusions Individuals with Lynch syndrome often encounter multiple barriers to and facilitators of disease management that go beyond the individual to the provider and health care system levels. The current organization and implementation of health care services are inadequate. A coordinated system of local services capable of providing integrated, efficient health care and follow-up, populated by providers with knowledge of hereditary cancer, is necessary to maintain optimal health.

    Insight into glucocorticoid receptor signalling through interactome model analysis

    Glucocorticoid hormones (GCs) are used to treat a variety of diseases because of their potent anti-inflammatory effect and their ability to induce apoptosis in lymphoid malignancies through the glucocorticoid receptor (GR). Despite ongoing research, high glucocorticoid efficacy and widespread usage in medicine, resistance, disease relapse and toxicity remain factors that need addressing. Understanding the mechanisms of glucocorticoid signalling, and how resistance may arise, is highly important for improving therapy. To gain insight into this we undertook a systems biology approach, aiming to generate a Boolean model of the glucocorticoid receptor protein interaction network that encapsulates functional relationships between the GR, its target genes or genes that target GR, and the interactions among the genes that interact with the GR. This model, named GEB052, consists of 52 nodes representing genes or proteins, the model input (GC) and the model outputs (cell death and inflammation), connected by 241 logical interactions of activation or inhibition. In silico knockouts uncovered 323 changes in the relationships between model constituents, and steady-state analysis followed by cell-based microarray genome-wide model validation yielded an average of 57% correct predictions, which were taken further by assessing model predictions against patient microarray data. Lastly, semi-quantitative model analysis, with microarray data superimposed onto the model via a score flow algorithm, demonstrated significantly higher correct prediction ratios (an average of 80%), and the model has been assessed as a predictive clinical tool using published patient microarray data. In summary, we present an in silico simulation of the glucocorticoid receptor interaction network, linked to downstream biological processes, that can be analysed to uncover relationships between GR and its interactants. Ultimately the model provides a platform for future development, both by directing laboratory research and by allowing the incorporation of further components, encapsulating more interactions/genes involved in glucocorticoid receptor signalling.
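The Boolean formalism behind a model like GEB052 can be sketched on a hypothetical five-node subnetwork: each node is on/off, each rule encodes activation or inhibition, and the network is updated synchronously until it reaches a fixed point. The nodes and logic rules below are illustrative placeholders, not the published 52-node model.

```python
# Hypothetical logic rules (activation/inhibition), not the GEB052 model.
RULES = {
    "GC":           lambda s: s["GC"],          # model input, held fixed
    "GR":           lambda s: s["GC"],          # ligand activates the receptor
    "NFKB":         lambda s: not s["GR"],      # GR represses NF-kB signalling
    "inflammation": lambda s: s["NFKB"],
    "cell_death":   lambda s: s["GR"],          # GR-driven apoptosis
}

def steady_state(state, rules=RULES, max_steps=50):
    """Synchronously apply all rules until the state stops changing."""
    for _ in range(max_steps):
        new = {node: bool(rule(state)) for node, rule in rules.items()}
        if new == state:
            return new
        state = new
    raise RuntimeError("no fixed point reached (limit cycle?)")

start = {"GC": True, "GR": False, "NFKB": True,
         "inflammation": True, "cell_death": False}
print(steady_state(start))
# With GC on, this toy network settles with inflammation off, cell_death on.
```

An in silico knockout is then just deleting a node (or clamping it off) and comparing the new fixed point against the original, which is how the 323 relationship changes in the full model were enumerated.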

    Choosing Organic Pesticides over Synthetic Pesticides May Not Effectively Mitigate Environmental Risk in Soybeans

    Background: Selection of pesticides with small ecological footprints is a key factor in developing sustainable agricultural systems. Policy guiding the selection of pesticides often emphasizes natural products and organic-certified pesticides to increase sustainability, because of the prevailing public opinion that natural products are uniformly safer, and thus more environmentally friendly, than synthetic chemicals. Methodology/Principal Findings: We report the results of a study examining the environmental impact of several new synthetic and certified organic insecticides under consideration as reduced-risk insecticides for soybean aphid (Aphis glycines) control, using established and novel methodologies to directly quantify pesticide impact in terms of biocontrol services. We found that, in addition to reduced efficacy against aphids compared to novel synthetic insecticides, organic-approved insecticides had a similar or even greater negative impact on several natural enemy species in lab studies, were more detrimental to biological control organisms in field experiments, and had higher Environmental Impact Quotients at field use rates. Conclusions/Significance: These data call into question the widely held assumption that organic pesticides are more environmentally benign than synthetic ones. All pesticides must be evaluated using an empirically based risk assessment.

    Concurrent use of prescription drugs and herbal medicinal products in older adults: A systematic review

    The use of herbal medicinal products (HMPs) is common among older adults. However, little is known about concurrent use with prescription drugs or about the potential interactions associated with such combinations. Objective Identify and evaluate the literature on concurrent prescription and HMP use among older adults to assess prevalence, patterns, potential interactions and factors associated with this use. Methods Systematic searches in MEDLINE, PsycINFO, EMBASE, CINAHL, AMED, Web of Science and Cochrane from inception to May 2017 for studies reporting concurrent use of prescription medicines with HMPs in adults (≥65 years). Quality was assessed using the Joanna Briggs Institute checklists. The Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre) three-stage approach to mixed-method research was used to synthesise data. Results Twenty-two studies were included. A definition of HMPs, or of what was considered an HMP, was frequently missing. Prevalence of concurrent use by older adults varied widely, between 5.3% and 88.3%. The prescription medicines most often combined with HMPs were antihypertensive drugs, beta blockers, diuretics, antihyperlipidemic agents, anticoagulants, analgesics, antihistamines, antidiabetics, antidepressants and statins. The HMPs most frequently used were ginkgo, garlic, ginseng, St John's wort, Echinacea, saw palmetto, evening primrose oil and ginger. Potential risk of bleeding due to use of ginkgo, garlic or ginseng with aspirin or warfarin was the most reported herb-drug interaction. Some data suggest that being female, having a lower household income and having less than a high school education were associated with concurrent use. Conclusion Prevalence of concurrent prescription drug and HMP use among older adults is substantial, and potential interactions have been reported. Knowledge of the extent and manner in which older adults combine prescription drugs with HMPs will help healthcare professionals appropriately identify and manage patients at risk.

    Mutation Accumulation in a Selfing Population: Consequences of Different Mutation Rates between Selfers and Outcrossers

    Existing theories predict that, because deleterious mutations accumulate at a higher rate, selfing populations suffer more intense genetic degradation than outcrossing populations. This prediction may not always hold when we consider a potential difference in deleterious mutation rate between selfers and outcrossers. By analyzing the evolutionary stability of selfing and outcrossing in an infinite population, we found that the genome-wide deleterious mutation rate would be lower in selfing than in outcrossing organisms. When this difference in mutation rate was included in simulations, we found that in a small population, mutations accumulated more slowly under selfing than under outcrossing. This result suggests that under frequent and intense bottlenecks, a selfing population may have a lower risk of genetic extinction than an outcrossing population.
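The simulation logic can be sketched with a deliberately small mutation-accumulation model. It ignores mating-system genetics (dominance, homozygosity, recombination) and isolates the one effect the paper highlights: a lower genome-wide deleterious mutation rate U under selfing. All parameter values (N, U, s) are illustrative assumptions, not the paper's.

```python
import random

def mean_load_after(generations, N=50, U=0.5, s=0.02, seed=1):
    """Mean number of deleterious mutations per individual after
    repeated rounds of selection and mutation in a population of N."""
    rng = random.Random(seed)
    pop = [0] * N  # deleterious-mutation counts per individual
    for _ in range(generations):
        # Selection: fitness (1 - s)^k, resample back to constant size N.
        weights = [(1 - s) ** k for k in pop]
        pop = rng.choices(pop, weights=weights, k=N)
        # Mutation: at most one new mutation per generation, with
        # probability U (a crude stand-in for a Poisson(U) draw).
        pop = [k + (1 if rng.random() < U else 0) for k in pop]
    return sum(pop) / N

selfer = mean_load_after(500, U=0.1)      # lower genome-wide mutation rate
outcrosser = mean_load_after(500, U=0.5)  # higher genome-wide mutation rate
print(selfer < outcrosser)
```

With selection this weak and the population this small, drift lets mutations fix (a Muller's-ratchet-like effect), so the lineage with the lower U carries the lighter load, which is the qualitative result reported for bottlenecked selfing populations.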

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
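The point about probabilistic multiplication of sub-factors can be illustrated with a small Monte Carlo sketch: starting from the same 10 x 10 = 100 convention (the real default), different distributional assumptions for the two sub-factors yield different "conservative" combined factors. The distribution shapes and spreads below are assumptions chosen only to make the dependence visible.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Assumption A: each sub-factor lognormal with median 10.
lognorm = rng.lognormal(mean=np.log(10), sigma=0.4, size=(N, 2)).prod(axis=1)

# Assumption B: each sub-factor uniform on [1, 19] (mean 10).
uniform = rng.uniform(1, 19, size=(N, 2)).prod(axis=1)

# The 95th percentile of the combined factor depends on the choice.
p95_lognorm = np.percentile(lognorm, 95)
p95_uniform = np.percentile(uniform, 95)
print(f"lognormal: {p95_lognorm:.0f}, uniform: {p95_uniform:.0f}")
```

Both assumptions centre the combined factor near 100, yet the tail that a regulator would call "conservative" differs, which is why the conservatism of a probabilistic uncertainty factor cannot be stated independently of the assumed distributions.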