Cultural Rights and Civic Virtue
This paper addresses the potential tension between two broadly stated policy objectives: the preservation of distinctive cultural traditions, often through the mechanism of formal legal rights, and the fostering of civic virtue, a sense of local community, and the advancement of common civic enterprises. Many political liberals have argued that liberal societies are obligated to accommodate the cultural traditions of various subgroups through legal rights and a redistribution of social resources. The "right to cultural difference" is now widely (if not universally) understood to be a basic human right, on par with rights to religious liberty and racial equality. Other theorists writing in the liberal, civic republican, and urban sociology traditions have expounded on the necessity of civic virtue, community, and common enterprises initiated and executed at the local or municipal level of government or private association. These theorists argue that common projects, shared norms, and social trust are indispensable elements of effective democratic government and are necessary to the altruism and public-spiritedness that in turn secure social justice. These two policy goals may therefore at times be in conflict. The conflict is especially severe in large, culturally diverse cities, where social trust and civic virtue are most needed and often in shortest supply. Policies designed to counter cosmopolitan alienation and anomie by fostering civic virtue, social trust, and common social norms will inevitably conflict with the cultural traditions and subgroup identification of some minority groups. The paper argues that such conflicts are often best confronted on the field of political debate and policy analysis, not in the language of civil rights. Rights discourse, with its inherent absolutism, is ill-suited to the subtle tradeoffs that these conflicts often entail.
Keywords: Law, Rights, Multiculturalism
Over-expression of Thioredoxin-1 mediates growth, survival, and chemoresistance and is a druggable target in diffuse large B-cell lymphoma
Diffuse large B-cell lymphomas (DLBCL) are the most prevalent of the non-Hodgkin lymphomas. Initial treatment is often successful, but the disease frequently relapses as refractory disease, leaving poor salvage therapy options and short survival. The greatest challenge in improving the survival of DLBCL patients is overcoming chemoresistance, whose basis is poorly understood. Among the potential mediators of DLBCL chemoresistance is the thioredoxin (Trx) family, primarily because Trx family members play critical roles in the regulation of cellular redox homeostasis, and recent studies indicate that dysregulated redox homeostasis also plays a key role in chemoresistance. In this study, we showed that most DLBCL-derived cell lines and primary DLBCL cells express higher basal levels of Trx-1 than normal B cells, and that Trx-1 expression level is associated with decreased patient survival. Our functional studies showed that inhibition of Trx-1 by small interfering RNA or a Trx-1 inhibitor (PX-12) inhibited DLBCL cell growth and clonogenicity, and also sensitized DLBCL cells to doxorubicin-induced growth inhibition in vitro. These results indicate that Trx-1 plays a key role in cell growth, survival, and chemoresistance, and is a potential target for overcoming drug resistance in relapsed/refractory DLBCL.
Distinct Effects on Diversifying Selection by Two Mechanisms of Immunity Against Streptococcus pneumoniae
Antigenic variation to evade host immunity has long been assumed to be a driving force of diversifying selection in pathogens. Colonization by Streptococcus pneumoniae, which is central to the organism's transmission and therefore its evolution, is limited by two arms of the immune system: antibody-mediated and T cell-mediated immunity. In particular, the effector activity of CD4+ TH17 cell-mediated immunity has been shown to act in trans, clearing co-colonizing pneumococci that do not bear the relevant antigen. It is thus unclear whether TH17 cell immunity allows any benefit from antigenic variation and so contributes to diversifying selection. Here we show that antigen-specific CD4+ TH17 cell immunity reduces colonization almost equally for an antigen-positive strain and a co-colonized, antigen-negative strain in a mouse model of pneumococcal carriage, potentially minimizing the advantage of escape from this type of immunity. Using a proteomic screening approach, we identified a list of candidate human CD4+ TH17 cell antigens. Using this list and a previously published list of pneumococcal antibody antigens, we bioinformatically assessed the signals of diversifying selection among the identified antigens compared with non-antigens. We found that antibody antigen genes were significantly more likely to be under diversifying selection than the TH17 cell antigen genes, which were indistinguishable from non-antigens. Within the antibody antigens, epitopes recognized by human antibodies showed stronger evidence of diversifying selection. Taken together, the data suggest that TH17 cell-mediated immunity, one form of T cell immunity important for limiting carriage of antigen-positive pneumococci, favors little diversifying selection in the targeted antigen. The results could provide new insight into pneumococcal vaccine design.
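The comparison of selection signals between antigen classes can be illustrated with a simple one-sided permutation test on per-gene selection scores. This is a minimal sketch with made-up scores, not the paper's actual pipeline, statistics, or data; the gene lists and dN/dS-like values below are hypothetical.

```python
import random

def perm_test_mean_diff(a, b, n_iter=10000, seed=0):
    """One-sided permutation p-value for the hypothesis mean(a) > mean(b)."""
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = sum(pooled[:len(a)]) / len(a) - sum(pooled[len(a):]) / len(b)
        if diff >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical per-gene diversifying-selection scores (dN/dS-like statistics):
antibody_genes = [1.4, 1.2, 1.6, 1.1, 1.3]   # antibody antigen genes
th17_genes     = [0.8, 0.9, 0.7, 1.0, 0.85]  # TH17 cell antigen genes
p = perm_test_mean_diff(antibody_genes, th17_genes)
# a small p here would indicate stronger selection signal in antibody antigens
```

A rank-based or gene-length-aware test would be more appropriate for real sequence data; the permutation logic above only conveys the shape of the group comparison.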
Spitzer Imaging of i'-drop Galaxies: Old Stars at z~6
We present new evidence for mature stellar populations with ages >100 Myr in massive galaxies (M_stellar > 10^10 M_sun) seen at a time when the Universe was less than 1 Gyr old. We analyse the prominent detections of two z~6 star-forming galaxies (SBM03#1 & #3) made at wavelengths corresponding to the rest-frame optical using the IRAC camera onboard the Spitzer Space Telescope. We had previously identified these galaxies in HST/ACS GOODS images of the Chandra Deep Field South through the "i'-drop" Lyman break technique, and subsequently confirmed them spectroscopically with the Keck telescope. The new Spitzer photometry reveals significant Balmer/4000 Ang discontinuities, indicative of dominant stellar populations with ages >100 Myr. Fitting a range of population synthesis models (for normal initial mass functions) to the HST/Spitzer photometry yields ages of 250-650 Myr and implied formation redshifts z~7.5-13.5 in presently-accepted world models. Remarkably, our sources have best-fit stellar masses of 1.3-3.8x10^10 M_sun (95% confidence) assuming a Salpeter initial mass function. This indicates that at least some galaxies with stellar masses >20% of those of a present-day L* galaxy had already assembled within the first Gyr after the Big Bang. We also deduce that the past average star formation rate must be comparable to the current observed rate (SFR_UV ~ 5-30 M_sun/yr), suggesting that there may have been more vigorous episodes of star formation in such systems at higher redshifts. Although a small sample, limited primarily by Spitzer's detection efficiency, our result lends support to the hypothesis advocated in our earlier analyses of the Ultra Deep Field and GOODS HST/ACS data. The presence of established systems at z~6 suggests long-lived sources at earlier epochs (z>7) played a key role in reionizing the Universe.
Comment: Accepted for publication in MNRAS (minor corrections made)
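The implied formation redshifts follow from subtracting the fitted stellar age from the cosmic age at z~6. A minimal sketch under an assumed flat LambdaCDM cosmology with H0 = 70 km/s/Mpc and Omega_m = 0.3 (the paper's adopted "world model" parameters may differ, so the numbers will not match the quoted range exactly):

```python
import math

# Assumed cosmology (illustrative, not necessarily the paper's):
OMEGA_M, OMEGA_L = 0.3, 0.7
HUBBLE_TIME_GYR = 9.778 / 0.70  # 1/H0 in Gyr for H0 = 70 km/s/Mpc

def age_at(z):
    """Cosmic age in Gyr at redshift z (analytic flat-LCDM result)."""
    x = math.sqrt(OMEGA_L / OMEGA_M) * (1.0 + z) ** -1.5
    return (2.0 / 3.0) * HUBBLE_TIME_GYR / math.sqrt(OMEGA_L) * math.asinh(x)

def formation_redshift(z_obs, stellar_age_gyr):
    """Redshift at which a population of the given age formed, seen at z_obs."""
    t_form = age_at(z_obs) - stellar_age_gyr  # cosmic age at formation
    x = math.sinh(1.5 * math.sqrt(OMEGA_L) * t_form / HUBBLE_TIME_GYR)
    return (math.sqrt(OMEGA_L / OMEGA_M) / x) ** (2.0 / 3.0) - 1.0

# The Universe is indeed under 1 Gyr old at z~6, and a 250 Myr old
# population seen then must have formed at z > 7:
t6 = age_at(6)
z_form = formation_redshift(6, 0.25)
```

Older fitted ages (toward 650 Myr) push the formation redshift correspondingly higher, which is the origin of the z~7.5-13.5 range quoted above.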
Compensatory mutations reducing the fitness cost of plasmid carriage occur in plant rhizosphere communities
Plasmids drive bacterial evolutionary innovation by transferring ecologically important functions between lineages, but acquiring a plasmid often comes at a fitness cost to the host cell. Compensatory mutations, which ameliorate the cost of plasmid carriage, promote plasmid maintenance in simplified laboratory media across diverse plasmid-host associations. Whether such compensatory evolution can occur in more complex communities inhabiting natural environmental niches, where evolutionary paths may be more constrained, is, however, unclear. Here, we show a substantial fitness cost of carrying the large conjugative plasmid pQBR103 in Pseudomonas fluorescens SBW25 in the plant rhizosphere. This plasmid fitness cost could be ameliorated by compensatory mutations affecting the chromosomal global regulatory system gacA/gacS, which arose rapidly in plant rhizosphere communities and were exclusive to plasmid carriers. These findings expand our understanding of the importance of compensatory evolution in plasmid dynamics beyond simplified laboratory media. Compensatory mutations contribute to plasmid survival in bacterial populations living within complex microbial communities in their environmental niche.
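Fitness costs of plasmid carriage are conventionally quantified by competing a plasmid carrier against a plasmid-free strain and comparing realized growth rates. A minimal sketch of that standard relative-fitness calculation with hypothetical colony counts (this is the generic Malthusian-parameter approach, not necessarily the exact protocol used in the study):

```python
import math

def relative_fitness(carrier_t0, carrier_t1, competitor_t0, competitor_t1):
    """Relative fitness of a plasmid carrier vs a plasmid-free competitor:
    the ratio of their realized Malthusian growth rates over one
    competition cycle, estimated from start/end population counts."""
    return (math.log(carrier_t1 / carrier_t0)
            / math.log(competitor_t1 / competitor_t0))

# Hypothetical CFU/mL counts: the carrier grows less over the cycle.
w = relative_fitness(1e5, 5e7, 1e5, 1e8)
# w < 1 indicates a fitness cost of plasmid carriage;
# compensatory mutations would move w back toward 1.
```

Equal growth of the two competitors gives w = 1 by construction, which is the no-cost baseline against which compensated clones are judged.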
A toxicokinetic model for thiamethoxam in rats: implications for higher-tier risk assessment
Risk assessment for mammals is currently based on external exposure measurements, but the effects of toxicants are better correlated with the systemically available dose than with the externally administered dose. For pesticide risk assessment, toxicokinetics should therefore be interpreted in the context of potential exposure in the field, taking account of the timescale of exposure and individual patterns of feeding. The internal concentration is the net result of absorption, distribution, metabolism and excretion (ADME). We present a case study for thiamethoxam showing how data from an ADME study in rats can be used to parameterize a body burden model that predicts body residue levels after exposure to an LD50 dose, either as a bolus or eaten at different feeding rates. Kinetic parameters were determined in male and female rats after intravenous and oral administration of 14C-labelled thiamethoxam by fitting one-compartment models to measured pesticide concentrations in blood for each individual separately. The concentration of thiamethoxam in blood over time correlated closely with concentrations in other tissues and so was considered representative of the pesticide concentration in the whole body. Body burden model simulations showed that the maximum body weight-normalized doses of thiamethoxam were lower if the same external dose was ingested normally than if it was force-fed in a single bolus dose. This indicates a lower risk to rats through dietary exposure than would be estimated from the bolus LD50. Key questions that should be answered before using the body burden approach in risk assessment, data requirements, and the assumptions made in this study are discussed in detail.
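The bolus-versus-dietary contrast can be reproduced with a generic one-compartment model: first-order absorption from the gut and first-order elimination from the body. The rate constants and dose below are illustrative placeholders, not the fitted rat parameters from the study.

```python
def peak_body_burden(dose, feeding_hours, ka=1.0, ke=0.3, dt=0.01, t_end=48.0):
    """Peak body burden for a total dose given either as a bolus
    (feeding_hours = 0) or ingested at a constant rate over feeding_hours.
    One-compartment model: gut -> body (rate ka/h), body -> out (rate ke/h),
    integrated with a simple forward-Euler scheme."""
    gut = dose if feeding_hours == 0 else 0.0
    intake_rate = dose / feeding_hours if feeding_hours > 0 else 0.0
    body, peak, t = 0.0, 0.0, 0.0
    while t < t_end:
        if 0 < feeding_hours and t < feeding_hours:
            gut += intake_rate * dt          # gradual dietary intake
        absorbed = ka * gut * dt             # gut -> body
        gut -= absorbed
        body += absorbed - ke * body * dt    # body -> eliminated
        peak = max(peak, body)
        t += dt
    return peak

peak_bolus = peak_body_burden(100.0, 0)     # whole dose force-fed at once
peak_fed = peak_body_burden(100.0, 12.0)    # same dose grazed over 12 h
# peak_fed < peak_bolus: dietary exposure yields a lower maximum body burden
```

Because elimination proceeds while the dose is still being ingested, spreading the same external dose over a feeding period flattens the internal concentration peak, which is exactly why bolus LD50 tests overstate dietary risk.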
Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors
Background:
Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries.
Methods:
In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.
Findings:
45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups.
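The approximate millilitre figures follow from the reported unit differences at roughly 470 mL per whole blood donation (an assumed standard UK unit volume; the paper's exact conversion factor may differ slightly):

```python
ML_PER_UNIT = 470  # assumed whole blood donation volume in mL

def extra_volume_ml(extra_units):
    """Approximate extra blood volume from extra units collected per donor."""
    return extra_units * ML_PER_UNIT

# Increases over 2 years versus the standard inter-donation interval:
men_8wk = extra_volume_ml(1.69)     # men, 8-week arm: ~795 mL
men_10wk = extra_volume_ml(0.79)    # men, 10-week arm: ~370 mL
women_12wk = extra_volume_ml(0.84)  # women, 12-week arm: ~395 mL
women_14wk = extra_volume_ml(0.46)  # women, 14-week arm: ~215 mL
```

Each product recovers the trial's rounded millilitre estimate to within a few mL, confirming the unit-to-volume conversion used in the abstract.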
Interpretation:
Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.
Funding:
NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation