Single cell analysis shows decreasing FoxP3 and TGFβ1 coexpressing CD4+CD25+ regulatory T cells during autoimmune diabetes
Natural CD4+CD25+ regulatory T (CD4+CD25+ T reg) cells play a key role in the immunoregulation of autoimmunity. However, little is known about the interactions between CD4+CD25+ T reg cells and autoreactive T cells, due in part to the difficulty of using cell surface markers to identify CD4+CD25+ T reg cells accurately. Using a novel real-time PCR assay, mRNA copy numbers of FoxP3, TGFβ1, and interleukin (IL)-10 were measured in single cells to characterize and quantify CD4+CD25+ T reg cells in the nonobese diabetic (NOD) mouse, a murine model for type 1 diabetes (T1D). The suppressor function of CD4+CD25+CD62Lhi T cells, mediated by TGFβ, declined in an age-dependent manner. This loss of function coincided with a temporal decrease in the percentage of FoxP3 and TGFβ1 coexpressing T cells within pancreatic lymph node and islet-infiltrating CD4+CD25+CD62Lhi T cells, and was detected in female NOD mice but not in male NOD mice or in female NOR or C57BL/6 mice. These results demonstrate that the majority of FoxP3-positive CD4+CD25+ T reg cells in NOD mice express TGFβ1 but not IL-10, and that a defect in the maintenance and/or expansion of this pool of immunoregulatory effectors is associated with the progression of T1D.
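Absolute transcript copy number in real-time PCR is commonly derived from a standard curve of Ct against log10 copy number. The sketch below shows that generic calculation only; it is not the study's specific single-cell assay, and the slope and intercept values are assumed inputs.

```python
# Generic standard-curve quantification for real-time PCR (illustrative only;
# not the study's single-cell assay). slope/intercept come from a dilution
# series of known copy number: Ct = slope * log10(copies) + intercept.
def copies_from_ct(ct: float, slope: float, intercept: float) -> float:
    """Invert the standard curve to estimate transcript copies in a sample."""
    return 10 ** ((ct - intercept) / slope)

# Example with assumed curve parameters (a slope near -3.32 implies ~100% PCR efficiency).
print(copies_from_ct(ct=28.0, slope=-3.32, intercept=38.0))  # roughly 1e3 copies
```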
Half brain irradiation in a murine model of breast cancer brain metastasis: Magnetic resonance imaging and histological assessments of dose-response
Background: Brain metastasis is becoming increasingly prevalent in breast cancer due to improved extra-cranial disease control. With the emerging availability of modern image-guided radiation platforms, mouse models of brain metastases and small animal magnetic resonance imaging (MRI), we examined the responses of brain metastases to radiotherapy in the pre-clinical setting. In this study, we employed half brain irradiation to reduce inter-subject variability in metastases dose-response evaluations. Methods: Half brain irradiation was performed on a micro-CT/RT system in a human breast cancer (MDA-MB-231-BR) brain metastasis mouse model. Radiation-induced DNA double-strand breaks in tumors and normal mouse brain tissue were quantified using γ-H2AX immunohistochemistry at 30 min (acute) and 11 days (longitudinal) after half-brain treatment for doses of 8, 16 and 24 Gy. In addition, tumor responses were assessed volumetrically with in-vivo longitudinal MRI and histologically for tumor cell density and nuclear size. Results: In the acute setting, γ-H2AX staining in tumors saturated at higher doses, while H2AX phosphorylation in normal mouse brain tissue continued to increase linearly with dose. While γ-H2AX fluorescence intensities returned to the background level in the brain 11 days after treatment, the residual γ-H2AX phosphorylation in the irradiated tumors remained elevated compared to un-irradiated contralateral tumors. With radiation, MRI-derived relative tumor growth was significantly reduced compared to the un-irradiated side. While there was no difference in MRI tumor volume growth between 16 and 24 Gy, there was a significant reduction in histological tumor cell density with increasing dose. In the longitudinal study, nuclear size in the residual tumor cells increased significantly as the radiation dose was increased. Conclusions: Radiation damage to DNA in the normal brain parenchyma is resolved over time but remains unrepaired in the treated tumors. Furthermore, there is a radiation dose-response in the nuclear size of surviving tumor cells. The increase in nuclear size, together with the unrepaired DNA damage, indicates that the surviving tumor cells continued to progress through the cell cycle with DNA replication but failed cytokinesis. Half brain irradiation provides efficient evaluation of dose-response for cancer cell lines, a prerequisite for experiments to understand radio-resistance in brain metastases.
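One plausible way to compute MRI-derived relative tumor growth and compare the irradiated and un-irradiated hemispheres is sketched below; the function names, the per-hemisphere grouping, and the choice of a paired t-test are assumptions for illustration, not the study's actual analysis.

```python
# Illustrative sketch: relative tumor growth from longitudinal MRI volumes and a
# paired comparison of irradiated vs contralateral un-irradiated hemispheres.
# Not the study's code; the paired t-test is an assumed choice of test.
import numpy as np
from scipy.stats import ttest_rel

def relative_growth(v_baseline_mm3: np.ndarray, v_followup_mm3: np.ndarray) -> np.ndarray:
    """Per-tumor relative growth between baseline and follow-up scans."""
    return v_followup_mm3 / v_baseline_mm3

def compare_hemispheres(irr_base, irr_follow, ctl_base, ctl_follow):
    """Paired comparison of relative growth: irradiated vs un-irradiated side."""
    irr_rel = relative_growth(irr_base, irr_follow)
    ctl_rel = relative_growth(ctl_base, ctl_follow)
    return ttest_rel(irr_rel, ctl_rel)  # statistic and two-sided p-value
```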
Thoracic Aortic Calcium Versus Coronary Artery Calcium for the Prediction of Coronary Heart Disease and Cardiovascular Disease Events
Objectives: This study compared the ability of coronary artery calcium (CAC) and thoracic aortic calcium (TAC) to predict coronary heart disease (CHD) and cardiovascular disease (CVD) events.
Background: Coronary artery calcium has been shown to strongly predict CHD and CVD events, but it is unknown whether TAC, also measured within a single cardiac computed tomography (CT) scan, is of further value in predicting events.
Methods: A total of 2,303 asymptomatic adults (mean age 55.7 years, 38% female) with CT scans were followed up for 4.4 years for CHD (myocardial infarction, cardiac death, or late revascularizations) and CVD (CHD plus stroke). Cox regression, adjusted for Framingham risk score (FRS), examined the relation of Agatston CAC and TAC categories, and of log-transformed CAC and TAC, with the incidence of CHD and CVD events; receiver-operating characteristic (ROC) curves tested whether TAC improved prediction of events over CAC and FRS.
Results: A total of 53% of subjects had Agatston CAC scores of 0; 8% 1 to 9; 19% 10 to 99; 12% 100 to 399; and 8% ≥400. For TAC, the proportions were 69%, 5%, 12%, 8%, and 7%, respectively; 41 subjects (1.8%) experienced CHD and 47 (2.0%) CVD events. The FRS-adjusted hazard ratios (HR) across increasing CAC groups (relative to <10) ranged from 3.7 (p = 0.04) to 19.6 (p < 0.001) for CHD and from 2.8 (p = 0.07) to 13.1 (p < 0.001) for CVD events; only TAC scores of 100 to 399 predicted CHD and CVD (HR: 3.0, p = 0.008, and HR: 2.3, p = 0.04, respectively), and these risks were attenuated after accounting for CAC. Findings were consistent when using log-transformed CAC and TAC Agatston and volume scores. The ROC curve analyses showed that CAC predicted CHD and CVD events over FRS alone (p < 0.01); however, TAC did not further add to predicting events over FRS or CAC.
Conclusions: This study found that CAC, but not TAC, is strongly related to CHD and CVD events. Moreover, TAC does not further improve event prediction over CAC.
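The analysis described above (FRS-adjusted Cox models across calcium-score categories, plus an ROC comparison of added predictive value) can be approximated with standard survival tooling. The sketch below is illustrative only: the dataframe column names (time_yrs, chd_event, frs, cac_cat, cac_agatston), the use of lifelines, and the simple in-sample AUC comparison are assumptions, not the authors' actual code.

```python
# Illustrative sketch (not the study's code): an FRS-adjusted Cox model across
# Agatston CAC categories, and a crude check of whether CAC adds discrimination over FRS.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

CAC_CATS = ["<10", "10-99", "100-399", ">=400"]  # "<10" serves as the reference group

def fit_frs_adjusted_cox(df: pd.DataFrame) -> CoxPHFitter:
    """Cox model of time to CHD adjusted for FRS, with CAC category dummies."""
    dummies = pd.get_dummies(pd.Categorical(df["cac_cat"], categories=CAC_CATS),
                             prefix="cac", drop_first=True).astype(float)
    dummies.index = df.index
    design = pd.concat([df[["time_yrs", "chd_event", "frs"]], dummies], axis=1)
    cph = CoxPHFitter()
    cph.fit(design, duration_col="time_yrs", event_col="chd_event")
    return cph  # exp(coef) on each cac_* column is the FRS-adjusted hazard ratio

def auc_frs_vs_frs_plus_cac(df: pd.DataFrame) -> tuple[float, float]:
    """In-sample ROC AUCs: FRS alone vs FRS plus log-transformed Agatston CAC."""
    y = df["chd_event"]
    X_base = df[["frs"]]
    X_full = X_base.assign(log_cac=np.log1p(df["cac_agatston"]))
    auc_base = roc_auc_score(y, LogisticRegression().fit(X_base, y).predict_proba(X_base)[:, 1])
    auc_full = roc_auc_score(y, LogisticRegression().fit(X_full, y).predict_proba(X_full)[:, 1])
    return auc_base, auc_full
```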
Infrastructural Speculations: Tactics for Designing and Interrogating Lifeworlds
This paper introduces “infrastructural speculations,” an orientation toward speculative design that considers the complex and long-lived relationships of technologies with broader systems, beyond moments of immediate invention and design. As modes of speculation are increasingly used to interrogate questions of broad societal concern, it is pertinent to develop an orientation that foregrounds the “lifeworld” of artifacts—the social, perceptual, and political environment in which they exist. While speculative designs often imply a lifeworld, infrastructural speculations place lifeworlds at the center of design concern, calling attention to the cultural, regulatory, environmental, and repair conditions that enable and surround particular future visions. By articulating connections and affinities between speculative design and infrastructure studies research, we contribute a set of design tactics for producing infrastructural speculations. These tactics help design researchers interrogate the complex and ongoing entanglements among technologies, institutions, practices, and systems of power when gauging the stakes of alternate lifeworlds
Methylation Analyses Reveal Promoter Hypermethylation as a Rare Cause of “Second Hit” in Germline BRCA1-Associated Pancreatic Ductal Adenocarcinoma
Background and objective: Pancreatic ductal adenocarcinoma (PDAC) is characterized by the occurrence of pathogenic variants in BRCA1/2 in 5-6% of patients. Biallelic loss of BRCA1/2 enriches for response to platinum agents and poly (ADP-ribose) polymerase 1 inhibitors. There is a dearth of evidence on the mechanism of inactivation of the wild-type BRCA1 allele in PDAC tumors with a germline BRCA1 (gBRCA1) pathogenic or likely pathogenic variant (P/LPV). Herein, we examine promoter hypermethylation as a "second hit" mechanism in patients with gBRCA1-PDAC.
Methods: We evaluated patients with PDAC who underwent Memorial Sloan Kettering-Integrated Mutation Profiling of Actionable Cancer Targets (MSK-IMPACT) somatic and germline testing from an institutional database. DNA isolated from tumor tissue and matched normal peripheral blood was sequenced by MSK-IMPACT. In patients with gBRCA1-PDAC, we examined the somatic BRCA1 mutation status and the promoter methylation status of the tumor BRCA1 allele via methylation array analysis. In patients with sufficient remaining DNA, a second methylation analysis by pyrosequencing was performed.
Results: Of 1012 patients with PDAC, 19 (1.9%) were found to harbor a gBRCA1 P/LPV. Fifteen patients underwent methylation array analysis, and the mean percentage of BRCA1 promoter methylation was 3.62%. In seven patients in whom sufficient DNA was available, subsequent pyrosequencing confirmed an unmethylated BRCA1 promoter. Loss of heterozygosity was detected in 12 of 19 (63%, 95% confidence interval 38-84) patients, demonstrating that loss of heterozygosity is the major molecular mechanism of BRCA1 inactivation in PDAC. Two (10.5%) cases had a somatic BRCA1 mutation.
Conclusions: In patients with gBRCA1-P/LPV-PDAC, loss of heterozygosity is the main inactivating mechanism of the wild-type BRCA1 allele in the tumor, and methylation of the BRCA1 promoter is a distinctly uncommon occurrence.
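The reported 63% (95% CI 38-84) for loss of heterozygosity in 12 of 19 patients is consistent with an exact (Clopper-Pearson) binomial interval; a minimal sketch of that calculation, assuming scipy is available (the exact method used by the authors is not stated in the abstract):

```python
# Sketch: exact (Clopper-Pearson) 95% CI for a binomial proportion,
# e.g. LOH detected in 12 of 19 tumors as reported above.
from scipy.stats import beta

def clopper_pearson(successes: int, n: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact two-sided (1 - alpha) confidence interval for a proportion."""
    lower = 0.0 if successes == 0 else beta.ppf(alpha / 2, successes, n - successes + 1)
    upper = 1.0 if successes == n else beta.ppf(1 - alpha / 2, successes + 1, n - successes)
    return lower, upper

print(12 / 19, clopper_pearson(12, 19))  # ~0.63 with a CI of roughly (0.38, 0.84)
```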
Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors
Background:
Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries.
Methods:
In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.
Findings:
45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups.
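The approximate millilitre figures above are consistent with a standard whole blood unit of about 470 mL; a trivial arithmetic check follows, in which the 470 mL unit volume is an assumption based on typical UK practice rather than a figure stated in the abstract.

```python
# Sketch: convert the reported mean extra donations (units over 2 years) to volume,
# assuming ~470 mL per UK whole blood unit (assumption, not from the trial report).
UNIT_ML = 470

extra_units = {"men, 8-week": 1.69, "men, 10-week": 0.79,
               "women, 12-week": 0.84, "women, 14-week": 0.46}
for group, units in extra_units.items():
    print(f"{group}: ~{units * UNIT_ML:.0f} mL extra over 2 years")
# Rounded to the nearest 5 mL this reproduces the ~795, ~370, ~395 and ~215 mL above.
```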
Interpretation:
Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.
Funding:
NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
Genome sequence of an Australian kangaroo, Macropus eugenii, provides insight into the evolution of mammalian reproduction and development.
BACKGROUND: We present the genome sequence of the tammar wallaby, Macropus eugenii, which is a member of the kangaroo family and the first representative of the iconic hopping mammals that symbolize Australia to be sequenced. The tammar has many unusual biological characteristics, including the longest period of embryonic diapause of any mammal, extremely synchronized seasonal breeding and prolonged and sophisticated lactation within a well-defined pouch. Like other marsupials, it gives birth to highly altricial young, and has a small number of very large chromosomes, making it a valuable model for genomics, reproduction and development. RESULTS: The genome has been sequenced to 2× coverage using Sanger sequencing, enhanced with additional next-generation sequencing and the integration of extensive physical and linkage maps to build the genome assembly. We also sequenced the tammar transcriptome across many tissues and developmental time points. Our analyses of these data shed light on mammalian reproduction, development and genome evolution: there is innovation in reproductive and lactational genes, rapid evolution of germ cell genes, and incomplete, locus-specific X inactivation. We also observe novel retrotransposons and a highly rearranged major histocompatibility complex, with many class I genes located outside the complex. Novel microRNAs in the tammar HOX clusters uncover new potential mammalian HOX regulatory elements. CONCLUSIONS: Analyses of these resources enhance our understanding of marsupial gene evolution, identify marsupial-specific conserved non-coding elements and critical genes across a range of biological systems, including reproduction, development and immunity, and provide new insight into marsupial and mammalian biology and genome evolution.
Genome-Wide Analysis of Neuroblastomas using High-Density Single Nucleotide Polymorphism Arrays
BACKGROUND: Neuroblastomas are characterized by chromosomal alterations with biological and clinical significance. We analyzed paired blood and primary tumor samples from 22 children with high-risk neuroblastoma for loss of heterozygosity (LOH) and DNA copy number change using the Affymetrix 10K single nucleotide polymorphism (SNP) array. FINDINGS: Multiple areas of LOH and copy number gain were seen. The most commonly observed area of LOH was on chromosome arm 11q (15/22 samples; 68%). Chromosome 11q LOH was highly associated with occurrence of chromosome 3p LOH: 9 of the 15 samples with 11q LOH had concomitant 3p LOH (P = 0.016). Chromosome 1p LOH was seen in one-third of cases. LOH events on chromosomes 11q and 1p were generally accompanied by copy number loss, indicating hemizygous deletion within these regions. The one exception was on chromosome 11p, where LOH in all four cases was accompanied by normal copy number or diploidy, implying uniparental disomy. Gain of copy number was most frequently observed on chromosome arm 17q (21/22 samples; 95%) and was associated with allelic imbalance in six samples. Amplification of MYCN was also noted, as was amplification of a second gene, ALK, in a single case. CONCLUSIONS: This analysis demonstrates the power of SNP arrays for high-resolution determination of LOH and DNA copy number change in neuroblastoma, a tumor in which specific allelic changes drive clinical outcome and selection of therapy.
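As a generic illustration (not the authors' analysis), a co-occurrence such as the 11q/3p LOH association above would typically be assessed with Fisher's exact test on a 2×2 table. The abstract does not report how many of the seven tumors without 11q LOH had 3p LOH, so that cell is a hypothetical placeholder below and the printed p-value is not the study's reported P = 0.016.

```python
# Generic sketch of testing LOH co-occurrence with Fisher's exact test.
# From the abstract: 15 of 22 tumors had 11q LOH, and 9 of those 15 also had 3p LOH.
# The number of 3p-LOH tumors among the 7 without 11q LOH is NOT reported;
# N_3P_WITHOUT_11Q is a hypothetical placeholder, so the p-value is illustrative only.
from scipy.stats import fisher_exact

N_TOTAL = 22
N_11Q = 15
N_BOTH = 9
N_3P_WITHOUT_11Q = 0  # placeholder: unknown from the abstract

table = [
    [N_BOTH, N_11Q - N_BOTH],                                  # 11q LOH: 3p LOH yes / no
    [N_3P_WITHOUT_11Q, (N_TOTAL - N_11Q) - N_3P_WITHOUT_11Q],  # no 11q LOH: 3p LOH yes / no
]
odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)
```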
From KIDSCREEN-10 to CHU9D: creating a unique mapping algorithm for application in economic evaluation
Background: The KIDSCREEN-10 index and the Child Health Utility 9D (CHU9D) are two recently developed generic instruments for the measurement of health-related quality of life in children and adolescents. Whilst the CHU9D is a preference-based instrument developed specifically for application in cost-utility analyses, the KIDSCREEN-10 is not currently suitable for application in this context. This paper provides an algorithm for mapping the KIDSCREEN-10 index onto the CHU9D utility scores.
Methods: A sample of 590 Australian adolescents (aged 11–17) completed both the KIDSCREEN-10 and the CHU9D. Several econometric models were estimated, including an ordinary least squares estimator, a censored least absolute deviations estimator, a robust MM-estimator and a generalised linear model, using a range of explanatory variables with KIDSCREEN-10 item scores as the key predictors. The predictive performance of each model was judged using the mean absolute error (MAE) and the root mean squared error (RMSE).
Results: The MM-estimator with stepwise-selected KIDSCREEN-10 item scores as explanatory variables had the best predictive accuracy by MAE, whilst the equivalent ordinary least squares model had the best predictive accuracy by RMSE.
Conclusions: The preferred mapping algorithm (i.e. the MM-estimator with stepwise-selected KIDSCREEN-10 item scores as predictors) can be used to predict CHU9D utilities from the KIDSCREEN-10 index with a high degree of accuracy. The algorithm may be usefully applied within cost-utility analyses to generate cost per quality-adjusted life-year estimates where only KIDSCREEN-10 data are available.
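As a rough illustration of the mapping-and-validation workflow, the sketch below fits a plain OLS variant rather than the preferred robust MM-estimator and scores it with the same MAE/RMSE criteria; the column names (ks_1..ks_10 for the KIDSCREEN-10 items, chu9d for the observed utility) are hypothetical.

```python
# Illustrative sketch of a KIDSCREEN-10 -> CHU9D mapping (OLS variant only; the
# paper's preferred model is a robust MM-estimator). Column names are assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error

def fit_mapping(df: pd.DataFrame):
    """df: one row per respondent, item scores ks_1..ks_10 and observed chu9d utility."""
    X = df[[f"ks_{i}" for i in range(1, 11)]]
    y = df["chu9d"]
    model = LinearRegression().fit(X, y)
    pred = model.predict(X)
    mae = mean_absolute_error(y, pred)
    rmse = np.sqrt(mean_squared_error(y, pred))  # the two accuracy criteria used in the paper
    return model, mae, rmse
```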