
    Political mobilisation by minorities in Britain: negative feedback of ‘race relations’?

    This article uses a political opportunity approach to study the relationship of minority groups to the political community in Britain. The main argument is that the British race relations approach established in the 1960s had an important effect that still shapes the patterns of political contention by different minority groups today. Original data on political claims-making by minorities demonstrate that British 'racialised' cultural pluralism has structured an inequality of opportunities for the two main groups, African-Caribbeans and Indian subcontinent minorities. African-Caribbeans mobilise along racial lines, use a strongly assimilative 'black' identity, conventional action forms, and target state institutions with demands for justice that are framed within the recognised framework of race relations. Conversely, a high proportion of Indian subcontinent minority mobilisation is by Muslim groups asserting a non-assimilative religious identity. These groups are autonomously organised, but largely make public demands for extending the principle of racial equality to their non-racial group. Within the Indian subcontinent minorities, the relative absence of mobilisation by Indian, Sikh and Hindu minorities, who have achieved much better levels of socio-economic success than Pakistani and Bangladeshi Muslims, suggests that there is also a strong socioeconomic basis for shared experiences and grievances as Muslims in Britain. This relativises the notion that Muslim mobilisation in Britain is purely an expression of the right to cultural difference per se, and sees it instead as a product of the paradoxes of British race relations.

    Strong gravitational lensing probes of the particle nature of dark matter

    There is a vast menagerie of plausible candidates for the constituents of dark matter, both within and beyond extensions of the Standard Model of particle physics. Each of these candidates may have scattering (and other) cross section properties that are consistent with the dark matter abundance, BBN, and most scales in the matter power spectrum; but which may have vastly different behavior at sub-galactic "cutoff" scales, below which dark matter density fluctuations are smoothed out. The only way to quantitatively measure the power spectrum behavior at sub-galactic scales at distances beyond the local universe, and indeed over cosmic time, is through probes available in multiply imaged strong gravitational lenses. Gravitational potential perturbations by dark matter substructure encode information in the observed relative magnifications, positions, and time delays in a strong lens. Each of these is sensitive to a different moment of the substructure mass function and to different effective mass ranges of the substructure. The time delay perturbations, in particular, are proving to be largely immune to the degeneracies and systematic uncertainties that have impacted exploitation of strong lenses for such studies. There is great potential for a coordinated theoretical and observational effort to enable a sophisticated exploitation of strong gravitational lenses as direct probes of dark matter properties. This opportunity motivates this white paper, and drives the need for: a) strong support of the theoretical work necessary to understand all astrophysical consequences for different dark matter candidates; and b) tailored observational campaigns, and even a fully dedicated mission, to obtain the requisite data. Comment: Science white paper submitted to the Astro2010 Decadal Cosmology & Fundamental Physics Science Frontier Panel.

    Insulin resistance, lipotoxicity, type 2 diabetes and atherosclerosis: the missing links. The Claude Bernard Lecture 2009

    Insulin resistance is a hallmark of type 2 diabetes mellitus and is associated with a metabolic and cardiovascular cluster of disorders (dyslipidaemia, hypertension, obesity [especially visceral], glucose intolerance, endothelial dysfunction), each of which is an independent risk factor for cardiovascular disease (CVD). Multiple prospective studies have documented an association between insulin resistance and accelerated CVD in patients with type 2 diabetes, as well as in non-diabetic individuals. The molecular causes of insulin resistance, i.e. impaired insulin signalling through the phosphoinositol-3 kinase pathway with intact signalling through the mitogen-activated protein kinase pathway, are responsible for the impairment in insulin-stimulated glucose metabolism and contribute to the accelerated rate of CVD in type 2 diabetes patients. The current epidemic of diabetes is being driven by the obesity epidemic, which represents a state of tissue fat overload. Accumulation of toxic lipid metabolites (fatty acyl CoA, diacylglycerol, ceramide) in muscle, liver, adipocytes, beta cells and arterial tissues contributes to insulin resistance, beta cell dysfunction and accelerated atherosclerosis, respectively, in type 2 diabetes. Treatment with thiazolidinediones mobilises fat out of tissues, leading to enhanced insulin sensitivity, improved beta cell function and decreased atherogenesis. Insulin resistance and lipotoxicity represent the missing links (beyond the classical cardiovascular risk factors) that help explain the accelerated rate of CVD in type 2 diabetic patients.

    Impact of COVID-19 on cardiovascular testing in the United States versus the rest of the world

    Objectives: This study sought to quantify and compare the decline in volumes of cardiovascular procedures between the United States and non-US institutions during the early phase of the coronavirus disease-2019 (COVID-19) pandemic. Background: The COVID-19 pandemic has disrupted the care of many non-COVID-19 illnesses. Reductions in diagnostic cardiovascular testing around the world have led to concerns over the implications of reduced testing for cardiovascular disease (CVD) morbidity and mortality. Methods: Data were submitted to the INCAPS-COVID (International Atomic Energy Agency Non-Invasive Cardiology Protocols Study of COVID-19), a multinational registry comprising 909 institutions in 108 countries (including 155 facilities in 40 U.S. states), assessing the impact of the COVID-19 pandemic on volumes of diagnostic cardiovascular procedures. Data were obtained for April 2020 and compared with volumes of baseline procedures from March 2019. We compared laboratory characteristics, practices, and procedure volumes between U.S. and non-U.S. facilities and between U.S. geographic regions and identified factors associated with volume reduction in the United States. Results: Reductions in the volumes of procedures in the United States were similar to those in non-U.S. facilities (68% vs. 63%, respectively; p = 0.237), although U.S. facilities reported greater reductions in invasive coronary angiography (69% vs. 53%, respectively; p < 0.001). Significantly more U.S. facilities reported increased use of telehealth and patient screening measures than non-U.S. facilities, such as temperature checks, symptom screenings, and COVID-19 testing. Reductions in volumes of procedures differed between U.S. regions, with larger declines observed in the Northeast (76%) and Midwest (74%) than in the South (62%) and West (44%). Prevalence of COVID-19, staff redeployments, outpatient centers, and urban centers were associated with greater reductions in volume in U.S. 
facilities in a multivariable analysis. Conclusions: We observed marked reductions in U.S. cardiovascular testing in the early phase of the pandemic and significant variability between U.S. regions. The association between reductions of volumes and COVID-19 prevalence in the United States highlighted the need for proactive efforts to maintain access to cardiovascular testing in areas most affected by outbreaks of COVID-19 infection.
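    The volume-reduction metric reported above is the percent decline from the March 2019 baseline to April 2020. A minimal sketch of that calculation, using illustrative facility counts rather than registry data:

```python
# Hypothetical sketch of the INCAPS-COVID volume-reduction metric:
# percent decline in procedure volume from a baseline month to a
# comparison month. The example counts below are illustrative only.

def percent_reduction(baseline: int, current: int) -> float:
    """Percent decline relative to the baseline volume, to one decimal."""
    return round(100 * (baseline - current) / baseline, 1)

# e.g. a facility performing 500 procedures in March 2019 and 160 in April 2020
print(percent_reduction(500, 160))  # 68.0
```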

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: D:A:D Study Group; Royal Free Hospital Clinic Cohort; INSIGHT Study Group; SMART Study Group; ESPRIT Study Group. Background Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as a confirmed (>3 mo apart) reduced eGFR. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, and the risk was substantially higher in the high risk group (risk score >= 5, 505 events).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 individuals developed CKD (3.7%) during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 individuals developed CKD (1.6%) during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD. Peer reviewed.
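    The abstract describes an additive score: scaled points derived from each categorical risk factor's adjusted incidence rate ratio are summed per patient. A minimal sketch of that construction, with placeholder point values that are NOT the published D:A:D weights:

```python
# Sketch of an additive risk-score construction: scaled points per
# categorical risk factor are summed. Point values below are
# hypothetical placeholders, not the published D:A:D weights.

HYPOTHETICAL_POINTS = {
    "older_age": 4,
    "intravenous_drug_use": 2,
    "hepatitis_c_coinfection": 1,
    "low_baseline_egfr": 4,
    "female_gender": 1,
    "low_cd4_nadir": 1,
    "hypertension": 1,
    "diabetes": 1,
    "cvd": 1,
}

def risk_score(factors: set) -> int:
    """Sum the points for each risk factor present in this patient."""
    return sum(HYPOTHETICAL_POINTS[f] for f in factors)

print(risk_score({"older_age", "hypertension"}))  # 5
```

    In the published model the summed score then maps to low/medium/high risk strata, which in turn drive the NNTH comparisons quoted above.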

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. Objective To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. 
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
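    The hospital-survival percentages above are simple proportions (survivors / group size). A quick check of the reported figures:

```python
# Recomputing the reported hospital-survival proportions as
# percentages to one decimal place (survivors / n * 100).

def survival_pct(survivors: int, n: int) -> float:
    return round(100 * survivors / n, 1)

print(survival_pct(166, 231))  # ACE inhibitor group -> 71.9
print(survival_pct(152, 217))  # ARB group           -> 70.0
print(survival_pct(182, 231))  # control group       -> 78.8
```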

    Ileal transposition in rats reduces energy intake, body weight, and body fat most efficaciously when ingesting a high-protein diet

    PURPOSE: Ileal transposition (IT) allows exploration of hindgut effects of bariatric procedures in inducing weight loss and reducing adiposity. Here we investigated the role of dietary macronutrient content on IT effects in rats. METHODS: Male Lewis rats consuming one of three isocaloric liquid diets enriched with fat (HF), carbohydrates (HC), or protein (HP) underwent IT or sham surgery. Body weight, energy intake, energy efficiency, body composition, and (meal-induced) changes in plasma GIP, GLP-1, PYY, neurotensin, and insulin levels were measured. RESULTS: Following IT, HC intake remained highest, leading to the smallest weight loss among dietary groups. IT in HF rats caused high initial weight loss and profound hypophagia, but the rats caught up later, and finally had the highest body fat content among IT rats. HP diet most efficaciously supported IT-induced reduction in body weight and adiposity, but (as opposed to other diet groups) lean mass was also reduced. Energy efficiency decreased immediately after IT irrespective of diet, but normalized later. Energy intake alone explained 80% of the variation in post-operative weight change. GLP-1, neurotensin, and PYY were upregulated by IT, particularly during (0-60 min) and following 17-h post-ingestive intake, with marginal diet effects. Thirty-day post-operative cumulative energy intake was negatively correlated with 17-h post-ingestive PYY levels, explaining 47% of its variation. CONCLUSION: Reduction in energy intake underlies IT-induced weight loss, with highest efficacy of the HP diet. PYY, GLP-1, and neurotensin levels are upregulated by IT, of which PYY may be most specifically related to reduced intake and weight loss after IT.
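    "Explained 80% of the variation" is, in an ordinary least-squares sense, the squared Pearson correlation (R²) between energy intake and weight change. A minimal sketch of that quantity on synthetic data (the study's raw measurements are not reproduced here):

```python
# Sketch of variance explained (R^2) as the squared Pearson
# correlation between two variables. Input data below are synthetic,
# not the study's measurements.

import math

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return (cov / (sx * sy)) ** 2

# perfectly linear synthetic data -> R^2 of 1.0
print(round(r_squared([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
```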