
    Evolution of the Spider Homeobox Gene Repertoire by Tandem and Whole Genome Duplication

    Gene duplication generates new genetic material that can contribute to the evolution of gene regulatory networks and phenotypes. Duplicated genes can undergo subfunctionalization to partition ancestral functions and/or neofunctionalization to assume a new function. We previously found there had been a whole genome duplication (WGD) in an ancestor of arachnopulmonates, the lineage including spiders and scorpions but excluding other arachnids like mites, ticks, and harvestmen. This WGD was evidenced by many duplicated homeobox genes, including two Hox clusters, in spiders. However, it was unclear which homeobox paralogues originated by WGD versus smaller-scale events such as tandem duplications. Understanding this is key to determining the contribution of the WGD to arachnopulmonate genome evolution. Here we characterized the distribution of duplicated homeobox genes across eight chromosome-level spider genomes. We found that most duplicated homeobox genes in spiders are consistent with an origin by WGD. We also found two copies of conserved homeobox gene clusters, including the Hox, NK, HRO, Irx, and SINE clusters, in all eight species. In each species, one copy of each cluster was degenerate in gene content and organization while the other remained more intact. Focussing on the NK cluster, we found evidence for regulatory subfunctionalization between the duplicated NK genes in the spider Parasteatoda tepidariorum compared to their single-copy orthologues in the harvestman Phalangium opilio. Our study provides new insights into the relative contributions of multiple modes of duplication to the homeobox gene repertoire during the evolution of spiders and the function of NK genes.

    Initiation of Dialysis Is Associated With Impaired Cardiovascular Functional Capacity

    Background The transition to dialysis carries a substantially increased cardiovascular risk in patients with chronic kidney disease. Despite this, alterations in cardiovascular functional capacity during this transition are largely unknown. The present study therefore sought to assess ventilatory exercise response measures in patients within 1 year of initiating dialysis. Methods and Results We conducted a cross‐sectional study of 241 patients with chronic kidney disease stage 5 from the CAPER (Cardiopulmonary Exercise Testing in Renal Failure) study and from the intradialytic low‐frequency electrical muscle stimulation pilot randomized controlled trial cohorts. Patients underwent cardiopulmonary exercise testing and echocardiography. Of the 241 patients (age, 48.9 [15.0] years; 154 [63.9%] men), 42 were predialytic (mean estimated glomerular filtration rate, 14 mL·min⁻¹·1.73 m⁻²), 54 had a dialysis vintage ≤12 months, and 145 had a dialysis vintage >12 months. Patients with a dialysis vintage ≤12 months exhibited significantly impaired cardiovascular functional capacity, as assessed by oxygen uptake at peak exercise (18.7 [5.8] mL·min⁻¹·kg⁻¹), compared with predialysis patients (22.7 [5.2] mL·min⁻¹·kg⁻¹; P<0.001). They also exhibited reduced peak workload, impaired peak heart rate, reduced circulatory power, and increased left ventricular mass index (P<0.05 for all) compared with predialysis patients. After excluding those with a prior kidney transplant, patients with a dialysis vintage >12 months exhibited lower oxygen uptake at peak exercise (17.0 [4.9] mL·min⁻¹·kg⁻¹) than those with a dialysis vintage ≤12 months (18.9 [5.9] mL·min⁻¹·kg⁻¹; P=0.033). Conclusions Initiating dialysis is associated with a significant impairment in oxygen uptake at peak exercise and overall decrements in ventilatory and hemodynamic exercise responses that predispose patients to functional dependence. The magnitude of these changes is comparable to the differences between low‐risk New York Heart Association class I and higher‐risk New York Heart Association class II to IV heart failure.

    Reliability of Rapid Diagnostic Tests in Diagnosing Pregnancy-Associated Malaria in North-Eastern Tanzania.

    Accurate diagnosis and prompt treatment of pregnancy-associated malaria (PAM) are key to averting adverse pregnancy outcomes. Microscopy is the gold standard in malaria diagnosis, but it has limited detection and availability. When used appropriately, rapid diagnostic tests (RDTs) could be an ideal diagnostic complement to microscopy, due to their ease of use and adequate sensitivity in detecting even sub-microscopic infections. Polymerase chain reaction (PCR) is even more sensitive, but it is mainly used for research purposes. The accuracy and reliability of RDTs in diagnosing PAM were evaluated against microscopy and PCR. A cohort of pregnant women in north-eastern Tanzania was followed throughout pregnancy for detection of plasmodial infection using venous and placental blood samples evaluated by histidine-rich protein 2 (HRP-2) and parasite lactate dehydrogenase (pLDH) based RDTs (Parascreen™) or HRP-2 only (Paracheck Pf® and ParaHIT®f), microscopy, and nested Plasmodium species diagnostic PCR. From the cohort of 924 pregnant women who completed the follow-up, complete RDT and microscopy data were available for 5,555 blood samples, of which 442 were also analysed by PCR. Of the 5,555 blood samples, 49 (0.9% [95% confidence interval 0.7–1.1]) were positive by microscopy and 91 (1.6% [1.3–2.0]) by RDT. Forty-six (50.5% [40.5–60.6]) and 45 (49.5% [39.4–59.5]) of the RDT-positive samples were positive and negative by microscopy, respectively, whereas nineteen (42.2% [29.0–56.7]) of the microscopy-negative but RDT-positive samples were positive by PCR. Three (0.05% [0.02–0.2]) samples were positive by microscopy but negative by RDT. Of the 5,461 samples negative by both RDT and microscopy, 351 were tested by PCR and all were found negative. There was no statistically significant difference between the performances of the different RDTs. Microscopy underestimated the real burden of malaria during pregnancy, and RDTs performed better than microscopy in diagnosing PAM. In areas where intermittent preventive treatment during pregnancy may be abandoned due to low and decreasing malaria risk and instead replaced with active case management, screening with RDT is likely to identify most infections in pregnant women and outperforms microscopy as a diagnostic tool.
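The agreement figures in this abstract follow directly from the reported counts; as a quick sanity check, they can be re-derived with a minimal sketch that uses only numbers stated above:

```python
# Re-derivation of the RDT-vs-microscopy agreement figures reported in the
# abstract (all counts are taken verbatim from the text).
total = 5555            # blood samples with complete RDT and microscopy data
micro_pos = 49          # microscopy-positive
rdt_pos = 91            # RDT-positive
both_pos = 46           # RDT-positive AND microscopy-positive
rdt_only = 45           # RDT-positive but microscopy-negative
rdt_only_pcr_pos = 19   # of those 45 discordant samples, PCR-positive

micro_prevalence = 100 * micro_pos / total          # ~0.9%
rdt_prevalence = 100 * rdt_pos / total              # ~1.6%
share_confirmed = 100 * both_pos / rdt_pos          # ~50.5% of RDT+ confirmed by microscopy
pcr_confirmed = 100 * rdt_only_pcr_pos / rdt_only   # ~42.2% of discordant RDT+ confirmed by PCR

print(f"{micro_prevalence:.1f}% {rdt_prevalence:.1f}% "
      f"{share_confirmed:.1f}% {pcr_confirmed:.1f}%")
```

Note how the PCR results drive the conclusion: nearly half of the RDT-positive, microscopy-negative samples were genuine infections, which is why microscopy is said to underestimate the true burden.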

    Gene content evolution in the arthropods

    Arthropods comprise the largest and most diverse phylum on Earth and play vital roles in nearly every ecosystem. Their diversity stems in part from variations on a conserved body plan, resulting from and recorded in adaptive changes in the genome. Dissection of the genomic record of sequence change enables broad questions regarding genome evolution to be addressed, even across hyper-diverse taxa within arthropods. Using 76 whole genome sequences representing 21 orders spanning more than 500 million years of arthropod evolution, we document changes in gene and protein domain content and provide temporal and phylogenetic context for interpreting these innovations. We identify many novel gene families that arose early in the evolution of arthropods and during the diversification of insects into modern orders. We reveal unexpected variation in patterns of DNA methylation across arthropods and examples of gene family and protein domain evolution coincident with the appearance of notable phenotypic and physiological adaptations such as flight, metamorphosis, sociality, and chemoperception. These analyses demonstrate how large-scale comparative genomics can provide broad new insights into the genotype to phenotype map and generate testable hypotheses about the evolution of animal diversity

    Genome-wide association study for renal traits in the Framingham Heart and Atherosclerosis Risk in Communities Studies

    Background: The Framingham Heart Study (FHS) recently obtained initial results from the first genome-wide association scan for renal traits. The study of 70,987 single nucleotide polymorphisms (SNPs) in 1,010 FHS participants provides a list of SNPs showing the strongest associations with renal traits, which need to be verified in independent study samples. Methods: Sixteen SNPs were selected for replication based on the most promising associations with chronic kidney disease (CKD), estimated glomerular filtration rate (eGFR), and serum cystatin C in FHS. These SNPs were genotyped in 15,747 participants of the Atherosclerosis Risk in Communities (ARIC) Study and evaluated for association using multivariable adjusted regression analyses. Primary outcomes in ARIC were CKD and eGFR. Secondary prospective analyses were conducted for association with kidney disease progression using multivariable adjusted Cox proportional hazards regression. The definition of the outcomes, all covariates, and the use of an additive genetic model were consistent with the original analyses in FHS. Results: The intronic SNP rs6495446 in the gene MTHFS was significantly associated with CKD among white ARIC participants at visit 4: the odds ratio per each C allele was 1.24 (95% CI 1.09–1.41, p = 0.001). Borderline significant associations of rs6495446 were observed with CKD at study visit 1 (p = 0.024), eGFR at study visits 1 (p = 0.073) and 4 (lower mean eGFR per C allele by 0.6 ml/min/1.73 m², p = 0.043), and kidney disease progression (hazard ratio 1.13 per each C allele, 95% CI 1.00–1.26, p = 0.041). Another SNP, rs3779748 in EYA1, was significantly associated with CKD at ARIC visit 1 (odds ratio per each T allele 1.22, p = 0.01), but only with eGFR and cystatin C in FHS. Conclusion: This genome-wide association study provides unbiased information implicating MTHFS as a candidate gene for kidney disease. Our findings highlight the importance of replication to identify common SNPs associated with renal traits.
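Under the additive (log-additive) genetic model used in these analyses, each extra copy of the risk allele multiplies the odds by a fixed per-allele odds ratio, so a homozygote's odds ratio versus a non-carrier is the square of the per-allele value. A minimal sketch, where the per-allele odds ratio is taken from the abstract and the genotype comparison is purely illustrative:

```python
# Additive genetic model: genotype is coded as the count of risk alleles
# (0, 1, or 2), and the log-odds increase linearly with that count.
or_per_allele = 1.24  # rs6495446 C allele for CKD at ARIC visit 4 (from the abstract)

def odds_ratio(n_alleles, per_allele_or=or_per_allele):
    """Odds ratio vs. carrying zero risk alleles under an additive model."""
    return per_allele_or ** n_alleles

# heterozygote (one C allele) vs. non-carrier
print(round(odds_ratio(1), 2))  # 1.24
# homozygote (two C alleles) vs. non-carrier
print(round(odds_ratio(2), 2))  # 1.54
```

This coding is what makes "odds ratio per each C allele" a single interpretable number in the regression output.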

    The incidence of varicella and herpes zoster in Massachusetts as measured by the Behavioral Risk Factor Surveillance System (BRFSS) during a period of increasing varicella vaccine coverage, 1998–2003

    BACKGROUND: The authors sought to monitor the impact of widespread varicella vaccination on the epidemiology of varicella and herpes zoster. While varicella incidence would be expected to decrease, mathematical models predict an initial increase in herpes zoster incidence if re-exposure to varicella protects against reactivation of the varicella zoster virus. METHODS: In 1998–2003, as varicella vaccine uptake increased, incidence of varicella and herpes zoster in Massachusetts was monitored using the random-digit-dial Behavioral Risk Factor Surveillance System. RESULTS: Between 1998 and 2003, varicella incidence declined from 16.5/1,000 to 3.5/1,000 (79%) overall with ≥66% decreases for all age groups except adults (27% decrease). Age-standardized estimates of overall herpes zoster occurrence increased from 2.77/1,000 to 5.25/1,000 (90%) in the period 1999–2003, and the trend in both crude and adjusted rates was highly significant (p < 0.001). Annual age-specific rates were somewhat unstable, but all increased, and the trend was significant for the 25–44 year and 65+ year age groups. CONCLUSION: As varicella vaccine coverage in children increased, the incidence of varicella decreased and the occurrence of herpes zoster increased. If the observed increase in herpes zoster incidence is real, widespread vaccination of children is only one of several possible explanations. Further studies are needed to understand secular trends in herpes zoster before and after use of varicella vaccine in the United States and other countries
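The percentage changes quoted above can be recovered from the reported rates per 1,000; a quick check using only numbers from the abstract:

```python
# Re-deriving the relative changes in varicella and herpes zoster incidence
# from the rates per 1,000 reported in the abstract.
def relative_change(before, after):
    """Percent change from `before` to `after` (negative = decline)."""
    return 100 * (after - before) / before

varicella = relative_change(16.5, 3.5)   # ~ -79% (decline, 1998 to 2003)
zoster = relative_change(2.77, 5.25)     # ~ +90% (increase, 1999 to 2003)
print(round(varicella), round(zoster))
```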

    Probing Real Sensory Worlds of Receivers with Unsupervised Clustering

    The task of an organism to extract information about the external environment from sensory signals is based entirely on the analysis of ongoing afferent spike activity provided by the sense organs. We investigate the processing of auditory stimuli by an acoustic interneuron of insects. In contrast to most previous work, we do this by using stimuli and neurophysiological recordings directly in the nocturnal tropical rainforest, where the insect communicates. Unlike in typical recordings in soundproof laboratories, strong environmental noise from multiple sound sources interferes with the perception of acoustic signals in these realistic scenarios. We apply a recently developed unsupervised machine learning algorithm based on probabilistic inference to find frequently occurring firing patterns in the response of the acoustic interneuron. We can thus ask how much information the central nervous system of the receiver can extract from bursts without ever being told which type and which variants of bursts are characteristic for particular stimuli. Our results show that the reliability of burst coding in the time domain is so high that identical stimuli lead to extremely similar spike pattern responses, even for different preparations on different dates, and even if one of the preparations is recorded outdoors and the other one in the soundproof lab. Simultaneous recordings in two preparations exposed to the same acoustic environment reveal that characteristics of burst patterns are largely preserved among individuals of the same species. Our study shows that burst coding can provide a reliable mechanism for acoustic insects to classify and discriminate signals under very noisy real-world conditions. This gives new insights into the neural mechanisms potentially used by bushcrickets to discriminate conspecific songs from sounds of predators in similar carrier frequency bands.

    Antibody-Mediated Growth Inhibition of Plasmodium falciparum: Relationship to Age and Protection from Parasitemia in Kenyan Children and Adults

    BACKGROUND: Antibodies that impair Plasmodium falciparum merozoite invasion and intraerythrocytic development are one of several mechanisms that mediate naturally acquired immunity to malaria. Attempts to correlate anti-malaria antibodies with risk of infection and morbidity have yielded inconsistent results. Growth inhibition assays (GIA) offer a convenient method to quantify functional antibody activity against blood-stage malaria. METHODS: A treatment-time-to-infection study was conducted over 12 weeks in a malaria-holoendemic area of Kenya. Plasma collected from healthy individuals (98 children and 99 adults) before artemether-lumefantrine treatment was tested by GIA in three separate laboratories. RESULTS: Median GIA levels varied with P. falciparum line (D10, 8.8%; 3D7, 34.9%; FVO, 51.4% inhibition). The magnitude of growth inhibition decreased with age in all P. falciparum lines tested, with the highest median levels among children <4 years compared to adults (e.g. 3D7, 45.4% vs. 30.0%, respectively; p = 0.0003). Time-to-infection, measured by weekly blood smears, was significantly associated with level of GIA controlling for age. Upper-quartile inhibition activity was associated with less risk of infection compared with lower levels (e.g. 3D7, hazard ratio = 1.535, 95% CI = 1.012–2.329; p = 0.0438). Varying GIA methodologies had little effect on measured parasite growth inhibition. CONCLUSION: Plasma antibody-mediated growth inhibition of blood-stage P. falciparum decreases with age in residents of a malaria-holoendemic area. Growth inhibition assay may be a useful surrogate of protection against infection when outcome is controlled for age.
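The inhibition percentages above come from comparing parasite growth in the presence of test plasma against growth with control plasma. A generic formulation of this readout follows; the exact protocols of the three laboratories are not specified in the abstract, and the example values are hypothetical, chosen only to reproduce the reported 3D7 median:

```python
# Generic growth-inhibition readout: the fraction by which parasite growth
# with test plasma falls below growth with (non-immune) control plasma.

def percent_inhibition(test_growth, control_growth):
    """Percent reduction in parasite growth caused by the test plasma."""
    return 100 * (1 - test_growth / control_growth)

# Hypothetical readouts (e.g. final parasitemia or pLDH absorbance, in
# arbitrary units); 0.651 vs. 1.0 reproduces the reported 3D7 median.
print(round(percent_inhibition(0.651, 1.0), 1))  # 34.9
```

A higher percentage means stronger antibody-mediated inhibition, which is why upper-quartile values correspond to lower infection risk in the survival analysis.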