Adaptive capacity beyond the household: a systematic review of empirical social-ecological research
The concept of adaptive capacity has received significant attention within social-ecological and environmental change research. Within both the resilience and vulnerability literatures, adaptive capacity has emerged as a fundamental concept for assessing the ability of social-ecological systems to adapt to environmental change. Although the methods and indicators used to evaluate adaptive capacity are broad, existing scholarship has focused predominantly on the individual and household levels. However, the capacities necessary for humans to adapt to global environmental change are often a function of individual and societal characteristics, as well as cumulative and emergent capacities across communities and jurisdictions. In this paper, we apply a systematic literature review and co-citation analysis to investigate empirical research on adaptive capacity that focuses on societal levels beyond the household. Our review demonstrates that assessments of adaptive capacity at higher societal levels are increasing in frequency, yet vary widely in approach, framing, and results; analyses focus on adaptive capacity at many different levels (e.g. community, municipality, global region) and geographic locations, and cover multiple types of disturbances and their impacts across sectors. We also found considerable challenges with regard to the ‘fit’ between the data collected and the analytical methods used in adequately capturing the cross-scale and cross-level determinants of adaptive capacity. Current approaches to assessing adaptive capacity at societal levels beyond the household tend to simply aggregate individual- or household-level data, which we argue oversimplifies and ignores the inherent interactions within and across societal levels of decision-making that shape the capacity of humans to adapt to environmental change across multiple scales.
For future adaptive capacity research to be more practice-oriented and to effectively guide policy, there is a need to develop indicators and assessments that are matched with the levels of potential policy applications.
HIV prevalence in severely malnourished children admitted to nutrition rehabilitation units in Malawi: geographical and seasonal variations, a cross-sectional study
Background: Severe malnutrition in childhood associated with HIV infection presents a serious humanitarian and public health challenge in Southern Africa. The aim of this study was to collect countrywide data on HIV infection patterns in severely malnourished children to guide the development of integrated care in a resource-limited setting.
Methods: A cross-sectional survey was conducted in 12 representative rural and urban Nutrition Rehabilitation Units (NRUs) from each of Malawi's 3 regions. All children and their caretakers admitted to each NRU over a two-week period were offered HIV counselling and testing. Testing was carried out using two different rapid antibody tests, with PCR testing for discordant results. Children under 15 months were excluded to avoid difficulties with the interpretation of false-positive rapid test results. The survey was conducted once in the dry/post-harvest season and repeated in the rainy/hungry season.
Results: 570 children were eligible for study inclusion. Acceptability and uptake of HIV testing were high: 523 (91.7%) of carers consented for their children to take part, and 368 (70.6%) accepted testing themselves. Overall HIV prevalence amongst children tested was 21.6% (95% confidence interval 18.2-25.5%), with wide variation between individual NRUs (2.0-50.0%). Geographical prevalence variations were significant between the three regions (p < 0.01), with the highest prevalence in the south: Northern Region 23.1% (95% CI 14.3-34.0%), Central Region 10.9% (95% CI 7.5-15.3%), and Southern Region 36.9%. HIV prevalence was significantly higher in urban areas, 32.9% (95% CI 26.8-39.4%), than in rural areas, 13.2% (95% CI 9.5-17.6%) (p < 0.01). NRU HIV prevalence was lower in the rainy/hungry season, 18.4% (95% CI 14.7-22.7%), than in the dry/post-harvest season, 30.9% (95% CI 23.2-39.4%) (p < 0.001).
Conclusion: There is a high prevalence of HIV infection in severely malnourished Malawian children attending NRUs, with children in urban areas most likely to be infected. Testing for HIV is accepted by carers in both urban and rural areas. NRUs could act as entry points to HIV treatment and support programmes for affected children and families. Recognition of wide geographical variations in childhood HIV prevalence will ensure that limited resources are initially targeted to areas of highest need. These findings may have implications for other countries with similar patterns of childhood illness and food insecurity.
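The prevalence figures in the abstract above are reported with 95% confidence intervals. As an illustration only (not the authors' code, and using hypothetical counts rather than the study's raw data), a Wilson score interval for a binomial proportion can be computed like this:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion.

    z = 1.96 gives an approximate 95% interval.
    """
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical example: 113 positives out of 523 children tested (~21.6%)
lo, hi = wilson_ci(113, 523)
print(f"{113/523:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # → 21.6% (95% CI 18.3%-25.3%)
```

Exact (Clopper-Pearson) intervals, as often used in clinical reports, give slightly different bounds for the same counts.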
Finishing the euchromatic sequence of the human genome
The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the results of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth, and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
Designing an emergency information system: the Pittsburgh experience
Role of Social Network on Technology Adoption: Application to Nebraska Producers in the Face of Undesirable Vegetation Transitions
Conclusion
Producers need access to information about new conservation practices and technologies to ensure sound land management in the face of ecological threats in general and, in the context of our study, vegetation transitions (VTs). This study investigates the role of an individual producer's social network in the willingness to seek information about technologies and management practices and in the likelihood of new technology adoption, with special attention to risk attitudes and producer spillover effects. Our results provide evidence that network composition and information obtained through a producer's social network do not influence an individual's willingness to seek information about new technologies or management practices. However, when it comes to adopting a specific technology, like screening and imaging technology, social network measures have a significant impact. Additionally, we found that risk attitude and the spillover effect positively influence the likelihood of technology adoption. The significant positive impact of the spillover effect confirms that producers are reactive to the effects of VTs that they observe on their neighbors' land and are willing to seek information regarding new practices and technologies and to adopt them.
Considering the above results, it is evident that if public and private agencies are interested in addressing the negative effects of VTs through changes in producer behavior, they would be well served to invoke the mechanism of producers' personal social networks to ensure effective receipt and dissemination of information regarding new technology. Such effective information transmission can be combined with existing (and new) environmental policies to address VT issues in Nebraska (and possibly in other areas where producers have similar profiles).
Is Size Discordancy an Indication for Delivery of Preterm Twins?
Objective: Our goal was to determine the clinical significance of size discordancy in preterm twins. Study Design: A retrospective study was performed to review outcomes of twins delivered between Jan. 1, 1988, and June 30, 1995. Maternal and neonatal records were assessed for demographic data, maternal medical history, and neonatal mortality and morbidity outcomes. Discordancy was defined as a ≥20% difference in birth weight. χ² analysis was performed. Results: There were 42 sets of discordant twins and 77 sets of concordant twins in the final analysis. The distribution of gestational ages in both groups was similar. We found no difference in maternal morbidity between the groups. Discordant sets had a significantly longer hospital stay (p = 0.003) and more cases of hyperbilirubinemia (p = 0.01), but there were no other differences in morbid outcomes. There were no differences in outcome variables between the two twins within discordant sets with respect to gender, size, birth order, growth restriction, or route of delivery. There were no stillbirths among any of the 238 infants. Of the 15 neonatal deaths, none occurred in infants delivered after 32 weeks' gestation or in infants weighing >2000 g at birth. Infants who were small for gestational age had a higher incidence of sepsis (p = 0.043) and longer hospital stays (p = 0.005) compared with infants who were appropriate for gestational age.
Conclusions: Size discordancy alone does not appear to be an indication for preterm delivery of twins. When results of antenatal testing are normal and growth restriction is absent, attempts should be made to achieve a gestational age >32 weeks and a weight >2000 g before delivery is considered.
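The group comparisons above rely on the χ² test for categorical outcomes. As a rough sketch using hypothetical counts (not the study's data), a Pearson χ² statistic and p-value for a 2×2 table with 1 degree of freedom can be computed with the standard library alone:

```python
import math

def chi2_2x2(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    """Pearson chi-square statistic (no continuity correction) and p-value
    for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 degree of freedom, chi2 is the square of a standard normal,
    # so the upper-tail p-value is erfc(sqrt(chi2 / 2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical table: outcome present/absent in discordant vs concordant sets
chi2, p = chi2_2x2(20, 22, 20, 57)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

For small expected cell counts, Fisher's exact test would normally be preferred over the χ² approximation.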
Lessons learned from the eMERGE Network: balancing genomics in discovery and practice
The Electronic Medical Records and Genomics (eMERGE) Network, established in 2007, is a consortium of academic and integrated health systems conducting discovery and implementation research in translational genomics. Here, we outline the history of the network, highlight major impacts and lessons learned, and present the tools and resources developed for large-scale genomic analyses and translation into a clinical setting. The network developed methods to extract phenotypes from the electronic medical record to perform genome-wide and phenome-wide association studies. Recruited cohorts were clinically sequenced with a custom panel for targeted sequencing of variants and monogenic disease risks, and results were returned to participants to investigate the impact of the return of genomic results. After generating a 105,000-participant imputed genome-wide association study (GWAS) dataset for discovery, the network enrolled and sequenced 24,998 participants. Integration of these results into the medical record, and their effects on participants, provided key lessons to the field. These lessons inform genetic research in diverse populations and provide insights into the clinical impact of the return and implementation of genomic medicine using the electronic medical record. The lessons produced by the eMERGE Network can be utilized by other consortia as translational genomic medicine research evolves.
Risk of COVID-19 after natural infection or vaccination
Background: While vaccines have established utility against COVID-19, phase 3 efficacy studies have generally not comprehensively evaluated the protection provided by previous infection or hybrid immunity (previous infection plus vaccination). Individual patient data from US government-supported harmonized vaccine trials provide an unprecedented sample population to address this issue. We characterized the protective efficacy of previous SARS-CoV-2 infection and hybrid immunity against COVID-19 early in the pandemic over three- to six-month follow-up and compared it with vaccine-associated protection. Methods: In this post-hoc cross-protocol analysis of the Moderna, AstraZeneca, Janssen, and Novavax COVID-19 vaccine clinical trials, we allocated participants into four groups based on previous-infection status at enrolment and treatment: no previous infection/placebo; previous infection/placebo; no previous infection/vaccine; and previous infection/vaccine. The main outcome was RT-PCR-confirmed COVID-19 >7–15 days (per original protocols) after the final study injection. We calculated crude and adjusted efficacy measures. Findings: Previous infection/placebo participants had a 92% decreased risk of future COVID-19 compared to no previous infection/placebo participants (overall hazard ratio [HR]: 0.08; 95% CI: 0.05–0.13). Among single-dose Janssen participants, hybrid immunity conferred greater protection than vaccine alone (HR: 0.03; 95% CI: 0.01–0.10). Too few infections were observed to draw statistical inferences comparing hybrid immunity to vaccine alone for the other trials. Vaccination, previous infection, and hybrid immunity all provided near-complete protection against severe disease. Interpretation: Previous infection, any hybrid immunity, and two-dose vaccination all provided substantial protection against symptomatic and severe COVID-19 through the early Delta period. Thus, as a surrogate for natural infection, vaccination remains the safest approach to protection.
Funding: National Institutes of Health