
    Where ecosystems, people and health meet: academic traditions and emerging fields for research and practice

    Human-driven environmental change has brought attention to the importance of ecosystems in sustaining human health and well-being. There are various schools of thought and fields of inquiry and action that seek to understand health in relation to linked social and ecological phenomena. We describe 18 such fields and outline common elements and incongruities among them. They converge around the application of systems thinking and crossing disciplinary boundaries, while differences are found in methodologies, research foci and problem framing. Although fields encourage sustainable and equitable pathways for health promotion, depoliticized and ahistorical approaches continue to be standard practice. Future research calls for a deeper commitment to examining ourselves as political actors, making space for conversations around power dynamics, and (re)centering participants in research methodologies.

    The effects of acute CRAM supplementation on reaction time and subjective measures of focus and alertness in healthy college students

    Background: The purpose of this study was to examine the effect of acute and prolonged (4-week) ingestion of a supplement designed to improve reaction time and subjective measures of alertness, energy, fatigue, and focus compared to placebo.
    Methods: Nineteen physically active subjects (17 men and 2 women) were randomly assigned to a group that consumed either the supplement (21.1 ± 0.6 years; body mass: 80.6 ± 9.4 kg) or a placebo (21.3 ± 0.8 years; body mass: 83.4 ± 18.5 kg). During the initial testing session (T1), subjects were provided 1.5 g of the supplement (CRAM; α-glycerophosphocholine, choline bitartrate, phosphatidylserine, vitamins B3, B6, and B12, folic acid, L-tyrosine, anhydrous caffeine, acetyl-L-carnitine, and naringin) or a placebo (PL), and rested quietly for 10 minutes before completing a questionnaire on subjective feelings of energy, fatigue, alertness, and focus (PRE). Subjects then performed a 4-minute quickness and reaction test followed by a 10-minute bout of exhaustive exercise. The questionnaire and reaction testing sequence was then repeated (POST). Subjects reported back to the lab (T2) following 4 weeks of supplementation and repeated the testing sequence.
    Results: Reaction time significantly declined (p = 0.050) between PRE and POST at T1 in subjects consuming PL, while subjects under CRAM supplementation were able to maintain (p = 0.114) their performance. Significant performance declines were seen in both groups from PRE to POST at T2. Elevations in fatigue were seen for CRAM at both T1 and T2 (p = 0.001 and p = 0.000, respectively), but only at T2 for PL (p = 0.029). Subjects in CRAM maintained focus between PRE and POST during both T1 and T2 trials (p = 0.152 and p = 0.082, respectively), whereas significant declines in focus were observed between PRE and POST in PL at both trials (p = 0.037 and p = 0.014, respectively). No difference in alertness was seen at T1 between PRE and POST for CRAM (p = 0.083), but a significant decline was recorded at T2 (p = 0.005). Alertness was significantly lower at POST at both T1 and T2 for PL (p = 0.040 and p = 0.33, respectively). No differences in any of these subjective measures were seen between the groups at any time point.
    Conclusion: Results indicate that acute ingestion of CRAM can maintain reaction time, and subjective feelings of focus and alertness to both visual and auditory stimuli, in healthy college students following exhaustive exercise. However, some habituation may occur following 4 weeks of supplementation.

    The genomic basis of adaptive evolution in threespine sticklebacks

    Marine stickleback fish have colonized and adapted to thousands of streams and lakes formed since the last ice age, providing an exceptional opportunity to characterize genomic mechanisms underlying repeated ecological adaptation in nature. Here we develop a high-quality reference genome assembly for threespine sticklebacks. By sequencing the genomes of twenty additional individuals from a global set of marine and freshwater populations, we identify a genome-wide set of loci that are consistently associated with marine–freshwater divergence. Our results indicate that reuse of globally shared standing genetic variation, including chromosomal inversions, has an important role in repeated evolution of distinct marine and freshwater sticklebacks, and in the maintenance of divergent ecotypes during early stages of reproductive isolation. Both coding and regulatory changes occur in the set of loci underlying marine–freshwater evolution, but regulatory changes appear to predominate in this well-known example of repeated adaptive evolution in nature. National Human Genome Research Institute (U.S.); National Human Genome Research Institute (U.S.) (NHGRI CEGS Grant P50-HG002568).

    Molecular and cellular mechanisms underlying the evolution of form and function in the amniote jaw.

    The amniote jaw complex is a remarkable amalgamation of derivatives from distinct embryonic cell lineages. During development, the cells in these lineages experience concerted movements, migrations, and signaling interactions that take them from their initial origins to their final destinations and imbue their derivatives with aspects of form including their axial orientation, anatomical identity, size, and shape. Perturbations along the way can produce defects and disease, but also generate the variation necessary for jaw evolution and adaptation. We focus on molecular and cellular mechanisms that regulate form in the amniote jaw complex, and that enable structural and functional integration. Special emphasis is placed on the role of cranial neural crest mesenchyme (NCM) during the species-specific patterning of bone, cartilage, tendon, muscle, and other jaw tissues. We also address the effects of biomechanical forces during jaw development and discuss ways in which certain molecular and cellular responses add adaptive and evolutionary plasticity to jaw morphology. Overall, we highlight how variation in molecular and cellular programs can promote the phenomenal diversity and functional morphology achieved during amniote jaw evolution or lead to the range of jaw defects and disease that affect the human condition.

    Alert but less alarmed: a pooled analysis of terrorism threat perception in Australia

    Background: Previous Australian research has highlighted disparities in community perceptions of the threat posed by terrorism. A study with a large sample size is needed to examine reported concerns and anticipated responses of community sub-groups and to determine their consistency with existing Australian and international findings.
    Methods: Representative samples of New South Wales (NSW) adults completed terrorism perception questions as part of computer-assisted telephone interviews (CATI) in 2007 (N = 2081) and 2010 (N = 2038). Responses were weighted against the NSW population. Data sets from the two surveys were pooled and multivariate multilevel analyses conducted to identify health and socio-demographic factors associated with higher perceived risk of terrorism and evacuation response intentions, and to examine changes over time.
    Results: In comparison with 2007, Australians in 2010 were significantly more likely to believe that a terrorist attack would occur in Australia (adjusted odds ratio (AOR) = 1.24, 95% CI: 1.06-1.45) but felt less concerned that they would be directly affected by such an incident (AOR = 0.65, 95% CI: 0.55-0.75). Higher perceived risk of terrorism and related changes in living were associated with middle age, female gender, lower education, and higher reported psychological distress. Australians of migrant background reported a significantly lower perceived likelihood of terrorism (AOR = 0.52, 95% CI: 0.39-0.70) but significantly higher concern that they would be personally affected by such an incident (AOR = 1.57, 95% CI: 1.21-2.04) and were more likely to have made changes in the way they live due to this threat (AOR = 2.47, 95% CI: 1.88-3.25). Willingness to evacuate homes and public places in response to potential incidents increased significantly between 2007 and 2010 (AOR = 1.53, 95% CI: 1.33-1.76).
    Conclusion: While an increased proportion of Australians believe that the national threat of terrorism remains high, concern about being personally affected has moderated and may reflect habituation to this threat. Key sub-groups remain disproportionately concerned, notably those with lower education and migrant groups. The dissonance observed in findings relating to Australians of migrant background appears to reflect wider socio-cultural concerns associated with this issue. Disparities in community concerns regarding terrorism-related threat require active policy consideration and specific initiatives to reduce the vulnerabilities of known risk groups, particularly in the aftermath of future incidents.

    Application of functional genomics to primate endometrium: insights into biological processes

    Endometrium is a dynamic tissue that responds on a cyclic basis to circulating levels of the ovarian-derived steroid hormones, estradiol and progesterone. Functional genomics has enabled a global approach to understanding gene regulation in whole endometrial tissue in the setting of a changing hormonal milieu. The proliferative phase of the cycle, under the influence of estradiol, has a preponderance of genes involved in DNA synthesis and cell cycle regulation. Interestingly, genes encoding ion channels and cell adhesion, as well as angiogenic factors, are also highly regulated in this phase of the cycle. After the LH surge, different gene expression profiles are uniquely observed in the early secretory, mid-secretory (window of implantation), and late secretory phases. The early secretory phase is notable for up-regulation of multiple genes and gene families involved in cellular metabolism, steroid hormone metabolism, as well as some secreted glycoproteins. The mid-secretory phase is characterized by multiple biological processes, including up-regulation of genes encoding secreted glycoproteins, immune response genes with a focus on innate immunity, and genes involved in detoxification mechanisms. In the late secretory phase, as the tissue prepares for desquamation, there is a marked up-regulation of an inflammatory response, along with matrix degrading enzymes, and genes involved in hemostasis, among others. This monograph reviews hormonal regulation of gene expression in this tissue and the molecular events occurring therein throughout the cycle derived from functional genomics analysis. It also highlights challenges encountered in using human endometrial tissue in translational research in this context.

    Critical Review of Norovirus Surrogates in Food Safety Research: Rationale for Considering Volunteer Studies

    The inability to propagate human norovirus (NoV) or to clearly differentiate infectious from noninfectious virus particles has led to the use of surrogate viruses, like feline calicivirus (FCV) and murine norovirus-1 (MNV), which are propagatable in cell culture. The use of surrogates is predicated on the assumption that they generally mimic the viruses they represent; however, studies are proving this concept invalid. In direct comparisons between FCV and MNV, their susceptibilities to temperatures, environmental and food processing conditions, and disinfectants are dramatically different. Differences have also been noted between the inactivation of NoV and its surrogates, thus questioning the validity of surrogates. Considerable research funding is provided globally each year to conduct surrogate studies on NoVs; however, there is little demonstrated benefit derived from these studies in regard to the development of virus inactivation techniques or food processing strategies. Human challenge studies are needed to determine which processing techniques are effective in reducing NoVs in foods. A major obstacle to clinical trials on NoVs is the perception that such trials are too costly and risky, but in reality, there is far more cost and risk in allowing millions of unsuspecting consumers to contract NoV illness each year, when practical interventions are only a few volunteer studies away. A number of clinical trials have been conducted, providing important insights into NoV inactivation. A shift in research priorities from surrogate research to volunteer studies is essential if we are to identify realistic, practical, and scientifically valid processing approaches to improve food safety.

    High Prevalence of Tuberculosis and Serious Bloodstream Infections in Ambulatory Individuals Presenting for Antiretroviral Therapy in Malawi

    Background: Tuberculosis (TB) and serious bloodstream infections (BSI) may contribute to the high early mortality observed among patients qualifying for antiretroviral therapy (ART) with unexplained weight loss, chronic fever or chronic diarrhea.
    Methods and Findings: A prospective cohort study determined the prevalence of undiagnosed TB or BSI among ambulatory HIV-infected adults with unexplained weight loss and/or chronic fever, or diarrhea in two routine program settings in Malawi. Subjects with positive expectorated sputum smears for AFB were excluded. Investigations included bacterial and mycobacterial blood cultures, cryptococcal antigen test (CrAg), induced sputum (IS) for TB microscopy and solid culture, full blood count, and CD4 lymphocyte count. Among 469 subjects, 52 (11%) had microbiological evidence of TB; 50 (11%) had a positive (non-TB) blood culture and/or positive CrAg. Sixty-five additional TB cases were diagnosed on clinical and radiological grounds. Nontyphoidal Salmonellae (NTS) were the most common blood culture pathogens (29 cases; 6% of participants and 52% of bloodstream isolates). Multivariate analysis of baseline clinical and hematological characteristics found significant independent associations between oral candidiasis or lymphadenopathy and TB, marked CD4 lymphopenia and NTS infection, and severe anemia and either infection, but low positive likelihood ratios (<2 for all combinations).
    Conclusions: We observed a high prevalence of TB and serious BSI, particularly NTS, in a program cohort of chronically ill HIV-infected outpatients. Baseline clinical and hematological characteristics were inadequate predictors of infection. HIV clinics need better rapid screening tools for TB and BSI. Clinical trials to evaluate empiric TB or NTS treatment are required in similar populations.