The influence of the Cyclin D1 870 G>A polymorphism as an endometrial cancer risk factor
Background: Cyclin D1 is integral to the G1-to-S phase transition of the cell cycle, as it regulates cellular proliferation. A polymorphism in cyclin D1, 870 G>A, causes overexpression and supports uncontrolled cellular growth. This polymorphism has been associated with an increased risk of developing many cancers, including endometrial cancer.
Methods: The 870 G>A polymorphism (rs605965) in the cyclin D1 gene was genotyped in an Australian endometrial cancer case-control population of 191 cases and 291 controls using real-time PCR analysis. Genotype analysis was performed using chi-squared (χ²) statistics, and odds ratios were calculated using unconditional logistic regression, adjusting for potential endometrial cancer risk factors.
Results: Women homozygous for the variant cyclin D1 870 AA genotype showed a trend towards an increased risk of developing endometrial cancer compared to those with the wild-type GG genotype; however, this result was not statistically significant (OR 1.692, 95% CI 0.939–3.049, p = 0.080). Moreover, the 870 G>A polymorphism was significantly associated with family history of colorectal cancer. Endometrial cancer patients with the homozygous variant AA genotype had a higher frequency of family members with colorectal cancer than endometrial cancer patients with the GG genotype or the combined GG and GA genotypes (GG versus AA: OR 2.951, 95% CI 1.026–8.491, p = 0.045; GG+GA versus AA: OR 2.265, 95% CI 1.048–4.894, p = 0.038).
Conclusion: These results suggest that the cyclin D1 870 G>A polymorphism is possibly involved in the development of endometrial cancer. A more complex relationship was observed between this polymorphism and familial colorectal cancer.
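The odds ratios above come from unconditional logistic regression with covariate adjustment; the crude calculation behind such figures can be sketched as follows, using hypothetical genotype counts (the paper's per-genotype tables are not reproduced here):

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = cases/controls, columns = AA/GG genotype counts.
table = np.array([[30, 85],    # cases:    AA, GG (illustrative numbers only)
                  [28, 134]])  # controls: AA, GG

# Chi-squared test of the genotype distribution between cases and controls.
chi2, p, dof, expected = chi2_contingency(table)

# Crude odds ratio with a Woolf (log-based) 95% confidence interval; the study
# additionally adjusted for endometrial cancer risk factors, which this omits.
a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"chi2={chi2:.2f}, p={p:.3f}, OR={odds_ratio:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")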
Consumer perceptions of co-branding alliances: Organizational dissimilarity signals and brand fit
This study explores how consumers evaluate co-branding alliances between dissimilar partner firms. Customers are well aware that different firms are behind a co-branded product and observe the partner firms’ characteristics. Drawing on signaling theory, we assert that consumers use organizational characteristics as signals in their assessment of brand fit and in their purchasing decisions. Some organizational signals are beyond the control of the co-branding partners, or at least cannot be altered on short notice. We use a quasi-experimental design to test how co-branding partner dissimilarity affects brand fit perception. The results show that co-branding partner dissimilarity in terms of firm size, industry scope, and country-of-origin image negatively affects brand fit perception, whereas firm age dissimilarity exerts no significant influence. Because brand fit generally fosters a benevolent consumer attitude towards a co-branding alliance, the findings suggest that high partner dissimilarity may reduce overall co-branding alliance performance.
Long-term carbon sink in Borneo's forests halted by drought and vulnerable to edge effects
Less than half of anthropogenic carbon dioxide emissions remain in the atmosphere. While carbon balance models imply large carbon uptake in tropical forests, direct on-the-ground observations are still lacking in Southeast Asia. Here, using long-term plot monitoring records of up to half a century, we find that intact forests in Borneo gained 0.43 Mg C ha⁻¹ per year (95% CI 0.14–0.72; mean period 1988–2010) in above-ground live biomass. These results closely match those from African and Amazonian plot networks, suggesting that the world's remaining intact tropical forests are now en masse out of equilibrium. Although both pan-tropical and long-term, the sink in remaining intact forests appears vulnerable to climate and land-use change. Across Borneo, the 1997–1998 El Niño drought temporarily halted the carbon sink by increasing tree mortality, while fragmentation persistently offset the sink and turned many edge-affected forests into a carbon source to the atmosphere.
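The headline figure is a mean carbon gain across long-term plots with a 95% confidence interval. A minimal sketch of that calculation, using simulated per-plot values rather than the study's actual plot records:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated net above-ground carbon change per plot, in Mg C per ha per year
# (an illustrative stand-in for the real long-term plot data).
plot_gain = rng.normal(loc=0.43, scale=0.9, size=50)

mean_gain = plot_gain.mean()
sem = stats.sem(plot_gain)
ci_low, ci_high = stats.t.interval(0.95, df=plot_gain.size - 1, loc=mean_gain, scale=sem)
print(f"net sink: {mean_gain:.2f} Mg C ha^-1 yr^-1 (95% CI {ci_low:.2f} to {ci_high:.2f})")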
Attaining the canopy in dry and moist tropical forests: strong differences in tree growth trajectories reflect variation in growing conditions
Availability of light and water differs between tropical moist and dry forests, with typically higher understorey light levels and lower water availability in the latter. Therefore, growth trajectories of juvenile trees—those that have not yet attained the canopy—are likely governed by temporal fluctuations in light availability in moist forests (suppressions and releases), and by spatial heterogeneity in water availability in dry forests. In this study, we compared juvenile growth trajectories of Cedrela odorata in a dry forest (Mexico) and a moist forest (Bolivia) using tree rings. We tested the following specific hypotheses: (1) moist forest juveniles show more and longer suppressions, and more and stronger releases; (2) moist forest juveniles exhibit wider variation in canopy accession pattern, i.e. the typical growth trajectory to the canopy; (3) growth variation among dry forest juveniles persists over longer periods due to spatial heterogeneity in water availability. As expected, the proportion of suppressed juveniles was higher in moist than in dry forest (72 vs. 17%). Moist forest suppressions also lasted longer (9 vs. 5 years). The proportion of juveniles that experienced releases was higher in moist forest (76%) than in dry forest (41%), and releases in moist forest were much stronger. Trees in the moist forest also showed wider variation in canopy accession patterns than those in the dry forest. Growth variation among juvenile trees persisted over substantially longer periods in dry forest (>64 years) than in moist forest (12 years), most probably because of larger persistent spatial variation in water availability. Our results suggest that periodic increases in light availability are more important for attaining the canopy in moist forests, and that spatial heterogeneity in water availability governs long-term tree growth in dry forests.
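Suppressions and releases in such ring series are commonly flagged with a percent-growth-change criterion comparing adjacent multi-year windows (in the spirit of Nowacki and Abrams' boundary-line method); the window length, threshold, and example series below are illustrative assumptions, not the study's settings:

import numpy as np

def percent_growth_change(rings, window=10):
    """Percent change between the mean ring width of the next `window` years
    and that of the preceding `window` years, at each year in the series."""
    rings = np.asarray(rings, dtype=float)
    gc = np.full(rings.size, np.nan)
    for i in range(window, rings.size - window):
        prior = rings[i - window:i].mean()
        after = rings[i:i + window].mean()
        gc[i] = 100.0 * (after - prior) / prior
    return gc

# Hypothetical ring widths (mm/yr): a suppression followed by a sharp release.
rings = np.concatenate([np.full(15, 0.6), np.full(20, 2.4), np.full(15, 2.2)])
gc = percent_growth_change(rings)
release_years = np.flatnonzero(gc >= 50.0)  # e.g. a >=50% sustained increase
print("candidate release years (indices):", release_years)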
Increasing Clinical Virulence in Two Decades of the Italian HIV Epidemic
The recent origin and great evolutionary potential of HIV imply that the virulence of the virus might still be changing, which could greatly affect the future of the pandemic. However, previous studies of time trends in HIV virulence have yielded conflicting results. Here we used an established methodology to assess time trends in the severity (virulence) of untreated HIV infections in a large Italian cohort. We characterized clinical virulence by the decline slope of the CD4 count (n = 1423 patients) and the viral setpoint (n = 785 patients) in untreated patients with sufficient data points. We used linear regression models to detect correlations between the date of diagnosis (ranging 1984–2006) and the virulence markers, controlling for gender, exposure category, age, and CD4 count at entry. The decline slope of the CD4 count and the viral setpoint both displayed highly significant correlations with the date of diagnosis, pointing in the direction of increasing virulence. A detailed analysis of risk groups revealed that the epidemic among intravenous drug users started with an apparently less virulent virus, but experienced the strongest trend towards steeper CD4 decline among the major exposure categories. While our study did not allow us to exclude the effect of potential time trends in host factors, our findings are consistent with the hypothesis of increasing HIV virulence. Importantly, the use of an established methodology allowed for a comparison with earlier results, which confirmed that genuine differences exist in the time trends of HIV virulence between different epidemics. We thus conclude that there is no single global trend of HIV virulence, and results obtained in one epidemic cannot be extrapolated to others. Comparison of discordant patterns between risk groups and epidemics hints at a converging trend, which might indicate that an optimal level of virulence exists for the virus.
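The core analysis is a linear model of a virulence marker against date of diagnosis with covariate adjustment. A minimal sketch with simulated data (variable names are placeholders, not the cohort's actual fields):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "diagnosis_year": rng.uniform(1984, 2006, n),
    "age": rng.normal(35, 9, n),
    "male": rng.integers(0, 2, n),
    "cd4_at_entry": rng.normal(500, 120, n),
})
# Simulate a CD4 slope that steepens (grows more negative) for later diagnoses.
df["cd4_slope"] = (-40.0 - 1.5 * (df["diagnosis_year"] - 1984)
                   + 0.05 * df["cd4_at_entry"] + rng.normal(0, 30, n))

# Regress the virulence marker on diagnosis date, controlling for covariates.
fit = smf.ols("cd4_slope ~ diagnosis_year + age + male + cd4_at_entry", data=df).fit()
print(fit.params["diagnosis_year"], fit.pvalues["diagnosis_year"])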
Field trial of three different Plasmodium vivax-detecting rapid diagnostic tests with and without evaporative cool box storage in Afghanistan
Background: Accurate parasitological diagnosis of malaria is essential for targeting treatment where multiple species coexist. In this study, three rapid diagnostic tests (RDTs), AccessBio CareStart (CSPfPan), CareStart PfPv (CSPfPv) and Standard Diagnostics Bioline (SDBPfPv), were evaluated for their ability to detect natural Plasmodium vivax infections in a basic clinic setting. The potential for locally made evaporative cooling boxes (ECB) to protect the tests from heat damage in high summer temperatures was also investigated.
Methods: Venous blood was drawn from P. vivax-positive patients in Jalalabad, Afghanistan, and tested against a panel of six RDTs. The panel comprised two of each test type; one group was stored at room temperature and the other in an ECB. RDT results were evaluated against a consensus gold standard based on two double-read reference slides and PCR. The sensitivity, specificity and a measure of global performance for each test were determined and stratified by parasitaemia level and storage condition.
Results: In total, 306 patients were recruited, of whom 284 were positive for P. vivax, one for Plasmodium malariae and none for Plasmodium falciparum; 21 were negative. All three RDTs were specific for malaria. The sensitivity and global performance index for each test were as follows: CSPfPan 98.6% and 95.1%, CSPfPv 91.9% and 90.5%, and SDBPfPv 96.5% and 82.9%, respectively. CSPfPv was 16% less sensitive at a parasitaemia below 5,000/μL. Room temperature storage of SDBPfPv led to a high proportion of invalid results (17%), which was reduced to 10% in the ECB. Throughout the testing period, the ECB maintained a reduction of ~8°C below ambient temperatures and never exceeded 30°C.
Conclusions: Of the three RDTs, the CSPfPan test was the most consistent and reliable, rendering it appropriate for this P. vivax-predominant region. The CSPfPv test proved unsuitable owing to its reduced sensitivity at a parasitaemia below 5,000/μL (affecting 43% of study samples). Although the SDBPfPv device was more sensitive than the CSPfPv test, its invalid rate was unacceptably high. ECB storage reduced the proportion of invalid results for the SDBPfPv test, but surprisingly had no impact on RDT sensitivity at low parasitaemia.
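The evaluation reduces to sensitivity and specificity for each RDT against the consensus gold standard; a minimal sketch, with invented counts standing in for the per-test tables the abstract does not reproduce:

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table against a gold standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical reads for one RDT: true/false positives and negatives.
tp, fn, tn, fp = 280, 4, 21, 0
sensitivity, specificity = sens_spec(tp, fn, tn, fp)
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}")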
Quality assurance in psychiatry: quality indicators and guideline implementation
On many occasions, routine mental health care does not correspond to the standards that the medical profession itself puts forward. There is hope of improving the outcome of severe mental illness by improving the quality of mental health care and by implementing evidence-based consensus guidelines. Adherence to guideline recommendations should reduce costly complications and unnecessary procedures. To measure the quality of mental health care and disease outcome reliably and validly, quality indicators have to be available. These indicators of process and outcome quality should be easily measurable with routine data, should have a strong evidence base, and should be able to describe quality aspects across all sectors over the whole disease course. Measurement-based quality improvement will not succeed if it results in overwhelming documentation that reduces the time clinicians have for active treatment interventions. To overcome difficulties in the implementation of guidelines and to reduce guideline non-adherence, guideline implementation and quality assurance should be embedded in a comprehensive programme of multifaceted interventions: specific psychological methods for implementation, consultation by experts, and reimbursement of documentation efforts. There are a number of challenges in selecting appropriate quality indicators that allow a fair comparison across different approaches to care. Used carefully, quality indicators and improved guideline adherence can address suboptimal clinical outcomes, reduce practice variations, and narrow the gap between optimal and routine care.
Strongyloides stercoralis age-1: A Potential Regulator of Infective Larval Development in a Parasitic Nematode
Infective third-stage larvae (L3i) of the human parasite Strongyloides stercoralis share many morphological, developmental, and behavioral attributes with Caenorhabditis elegans dauer larvae. The ‘dauer hypothesis’ predicts that the same molecular genetic mechanisms control both dauer larval development in C. elegans and L3i morphogenesis in S. stercoralis. In C. elegans, the phosphatidylinositol-3 (PI3) kinase catalytic subunit AGE-1 functions in the insulin/IGF-1 signaling (IIS) pathway to regulate formation of dauer larvae. Here we identify and characterize Ss-age-1, the S. stercoralis homolog of the gene encoding C. elegans AGE-1. Our analysis of the Ss-age-1 genomic region revealed three exons encoding a predicted protein of 1,209 amino acids, which clustered with C. elegans AGE-1 in phylogenetic analysis. We examined temporal patterns of expression in the S. stercoralis life cycle by reverse transcription quantitative PCR and observed low levels of Ss-age-1 transcripts in all stages. To compare anatomical patterns of expression between the two species, we used Ss-age-1 or Ce-age-1 promoter::enhanced green fluorescent protein reporter constructs expressed in transgenic animals of each species. We observed conservation of expression in amphidial neurons, which play a critical role in the developmental regulation of both dauer larvae and L3i. Application of the PI3 kinase inhibitor LY294002 suppressed in vitro activation of L3i in a dose-dependent fashion, with 100 µM resulting in a 90% decrease (odds ratio: 0.10, 95% confidence interval: 0.08–0.13) in the odds of resumption of feeding for treated L3i in comparison to controls. Together, these data support the hypothesis that Ss-age-1 regulates the development of S. stercoralis L3i via an IIS pathway in a manner similar to that observed in C. elegans dauer larvae. Understanding the mechanisms by which infective larvae are formed and activated may lead to novel control measures and treatments for strongyloidiasis and other soil-transmitted helminthiases.
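The expression profiling above uses reverse transcription quantitative PCR; one standard way to summarize such measurements (not necessarily the normalization used in this paper) is Livak's 2^-ΔΔCt relative-expression method. A sketch with hypothetical Ct values:

def relative_expression(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    """Livak 2^-ddCt: target gene expression normalized to a reference gene,
    relative to a calibrator sample (e.g. another life-cycle stage)."""
    d_ct_sample = ct_target - ct_reference
    d_ct_calibrator = ct_target_cal - ct_reference_cal
    return 2.0 ** -(d_ct_sample - d_ct_calibrator)

# Hypothetical Cts: Ss-age-1 in L3i vs a free-living stage, normalized to a
# housekeeping gene (all values invented for illustration).
fold = relative_expression(ct_target=27.1, ct_reference=18.0,
                           ct_target_cal=26.5, ct_reference_cal=18.2)
print(f"relative Ss-age-1 expression: {fold:.2f}-fold")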
The impact of mass drug administration and long-lasting insecticidal net distribution on Wuchereria bancrofti infection in humans and mosquitoes: an observational study in northern Uganda
BACKGROUND: Lymphatic filariasis (LF) in Uganda is caused by Wuchereria bancrofti and transmitted by anopheline mosquitoes. The mainstay of elimination has been annual mass drug administration (MDA) with ivermectin and albendazole, targeted to endemic districts, but coverage has been sporadic and incomplete. Vector control could potentially contribute to reducing W. bancrofti transmission, speeding up progress towards elimination. To establish whether the use of long-lasting insecticidal nets (LLINs) can contribute to reducing transmission of W. bancrofti in a setting with ongoing MDA, a study was conducted in an area of Uganda highly endemic for both LF and malaria. Baseline parasitological and entomological assessments were conducted in 2007, followed by high-coverage LLIN distribution. Net use and entomological surveys were carried out after one year, and final parasitological and entomological evaluations were conducted in 2010. Three rounds of MDA had taken place before the study commenced, with a further three rounds completed during the course of the study. RESULTS: In 2007, rapid mapping indicated that 22.3% of schoolchildren were W. bancrofti antigen positive, and a baseline survey during the same year found an age-adjusted microfilaraemia prevalence of 3.7% (95% confidence interval (CI): 2.6–5.3%). In 2010, age-adjusted microfilaraemia prevalence had fallen to 0.4%, while antigenaemia rates were 0.2% in children aged < 5 years and 6.0% in those aged ≥ 5 years. In 2010, universal coverage of mosquito nets in a household was found to be protective against W. bancrofti antigen (odds ratio = 0.44, 95% CI: 0.22–0.89). The prevalence of W. bancrofti larvae in anopheline mosquitoes decreased significantly between the 2007 and 2010 surveys, but there was an apparent increase in vector densities. CONCLUSION: A marked reduction in W. bancrofti infection and infectivity in humans was observed in the study area, where both MDA and LLINs were used to reduce transmission. The extent to which LLINs contributed to this decline is equivocal, however. Further work investigating the impact of vector control on anopheline-transmitted LF in an endemic area not benefitting from MDA would be valuable to determine the effect of such interventions on their own.
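The "age-adjusted" prevalences above imply standardization to a reference age structure; direct standardization can be sketched as follows, with invented strata, counts, and weights:

import numpy as np

# Hypothetical age strata: positives and examined per stratum, plus the
# reference population weights used for direct standardization (sum to 1).
positives = np.array([2, 5, 9, 7])
examined = np.array([180, 260, 240, 150])
ref_weights = np.array([0.30, 0.30, 0.25, 0.15])

crude = positives.sum() / examined.sum()
age_adjusted = np.sum((positives / examined) * ref_weights)
print(f"crude={crude:.1%}, age-adjusted={age_adjusted:.1%}")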