
    Oscillatory and propagating modes of temperature variability at the 3–3.5- and 4–4.5-yr time scales in the Upper Southwest Pacific Ocean

    This paper investigates oscillatory and propagating patterns of normalized surface and subsurface temperature anomalies (from the seasonal cycle) in the southwest Pacific Ocean using an extended empirical orthogonal function (EEOF) analysis. The temperature data (and errors) are from the Digital Atlas of Southwest Pacific upper Ocean Temperatures (DASPOT). These data are 3-monthly in time (January, April, July, and October), 2° × 2° in space, and 5 m in the vertical to 450-m depths. The temperature anomalies in the EEOF analysis are normalized by the objective mapping temperature errors at each grid point. They are also Butterworth filtered in the 3–7-yr band to examine interannual variations in the temperature field. The oscillating and propagating patterns of the modes are examined across four vertical levels: the surface, and 100-, 250-, and 450-m depths. The dominant mode EEOF (70% of the total variance of the filtered data) oscillates in a 4–4.5-yr quasi-periodic manner that is consistent with El Niño–Southern Oscillation (ENSO). Anomalies peak first at the surface in the subtropics between New Caledonia and Fiji (centered around 17°S, 177°E), then 6 months later in the tropical far west centered around the Solomon Islands (5°S, 153°–157°E), with a maximum at the base of the mixed layer (100 m) and upper thermocline (250 m), and then eastward in the northeast of the southwest Pacific region (0°–10°S, 160°E–180°). Mode 2 (25% variance of the filtered data) has a periodicity of 3–3.5 yr, with centers of action in all four vertical levels. The mode-2 patterns are consistent with variations in the subtropical gyre circulation, including the East Australian Current and its separation, and are continuous with the Tasman Front.
Two spatial dipoles are apparent: (i) one in sea surface temperature (SST) at about 5°S, straddling west–east either side of the Solomon Islands, consistent with the classic Pacific-wide ENSO SST anomaly mode, and (ii) a subsurface dipole pattern, with centers in the Solomon Islands region at 100- and 250-m depths, and the western Tasman Sea (27°–33°S, 157°–161°E) at 250- and 450-m depths, consistent with dynamic changes in the gyre intensity.
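
The band-pass step described above can be sketched in a few lines; this is an illustrative reconstruction, not the authors' code — the filter order and the synthetic series are assumptions:

```python
# Sketch: Butterworth band-pass filtering of quarterly temperature anomalies
# in the 3-7-yr band, as described in the abstract. The filter order (4) and
# the synthetic anomaly series are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 4.0                      # quarterly sampling: 4 samples per year
low, high = 1 / 7.0, 1 / 3.0  # pass band in cycles per year (7-yr to 3-yr periods)
b, a = butter(4, [low, high], btype="band", fs=fs)

t = np.arange(0, 40, 1 / fs)  # 40 years of quarterly samples
rng = np.random.default_rng(0)
anom = np.sin(2 * np.pi * t / 4.2) + 0.5 * rng.standard_normal(t.size)  # 4.2-yr signal + noise
filtered = filtfilt(b, a, anom)  # zero-phase filtering preserves the timing of anomalies
```

Zero-phase filtering (`filtfilt`) matters here because the analysis interprets the timing of anomaly peaks; a causal filter would shift them.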

    Maternal Particulate Matter Exposure Impairs Lung Health and Is Associated with Mitochondrial Damage.

    Relatively little is known about the transgenerational effects of chronic maternal exposure to low-level traffic-related air pollution (TRAP) on offspring lung health, nor about the effects of removing such exposure before pregnancy. Female BALB/c mice were exposed to PM2.5 (5 µg/day) for 6 weeks before mating and during gestation and lactation; in a subgroup, PM was removed when mating started, modelling mothers who move to cleaner areas during pregnancy to protect their unborn child (Pre-exposure). Lung pathology was characterised in both dams and offspring. A subcohort of female offspring was also exposed to ovalbumin to model allergic airways disease. PM2.5 and Pre-exposure dams exhibited airways hyper-responsiveness (AHR) with mucus hypersecretion, increased mitochondrial reactive oxygen species (ROS) and mitochondrial dysfunction in the lungs. Female offspring from PM2.5 and Pre-exposure dams displayed AHR with increased lung inflammation and mitochondrial ROS production, while males displayed increased lung inflammation only. After the ovalbumin challenge, AHR was increased in female offspring from PM2.5 dams compared with those from control dams. In an in vitro model, the mitochondria-targeted antioxidant MitoQ reversed the mitochondrial dysfunction induced by PM stimulation, suggesting that the lung pathology in offspring is driven by dysfunctional mitochondria. In conclusion, chronic exposure to low doses of PM2.5 exerted transgenerational impairment of lung health.

    Blame, Symbolic Stigma and HIV Misconceptions are Associated with Support for Coercive Measures in Urban India

    This study was designed to examine the prevalence of stigma and its underlying factors in two large Indian cities. Cross-sectional interview data were collected from 1,076 non-HIV patients in multiple healthcare settings in Mumbai and Bengaluru, India. The vast majority of participants supported mandatory testing for marginalized groups and coercive family policies for PLHA, stating that they “deserved” their infections and “didn’t care” about infecting others. Most participants did not want to be treated at the same clinic or use the same utensils as PLHA, and transmission misconceptions were common. Multiple linear regression showed that blame, transmission misconceptions, symbolic stigma and negative feelings toward PLHA were significantly associated with both stigma and discrimination. The results indicate an urgent need for continued stigma reduction efforts to reduce the suffering of PLHA and barriers to prevention and treatment. Given the high levels of blame and endorsement of coercive policies, it is crucial that such programs are shaped within a human rights framework.

    Emergency and critical care services in Tanzania: a survey of ten hospitals.

    While there is a need for good quality care for patients with serious reversible disease in all countries in the world, Emergency and Critical Care tends to be one of the weakest parts of health systems in low-income countries. We assessed the structure and availability of resources for Emergency and Critical Care in Tanzania in order to identify the priorities for improving care in this neglected specialty. Ten hospitals in four regions of Tanzania were assessed using a structured data collection tool. Quality was evaluated with standards developed from the literature and expert opinion. Important deficits were identified in infrastructure, routines and training. Only 30% of the hospitals had an emergency room for adult and paediatric patients. None of the seven district and regional hospitals had a triage area or intensive care unit for adults. Only 40% of the hospitals had formal systems for adult triage, and in less than one third were critically ill patients seen by clinicians more than once daily. In 80% of the hospitals there were no staff trained in adult triage or critical care. In contrast, a majority of equipment and drugs necessary for emergency and critical care were available in the hospitals (median 90% and 100%, respectively). The referral/private hospitals tended to have a greater overall availability of resources (median 89.7%) than the district/regional hospitals (median 70.6%). Many of the structures necessary for Emergency and Critical Care are lacking in hospitals in Tanzania. Particular weaknesses are infrastructure, routines and training, whereas the availability of drugs and equipment is generally good. Policies to improve hospital systems for the care of emergency and critically ill patients should be prioritised.

    Time-Based Partitioning Model for Predicting Neurologically Favorable Outcome among Adults with Witnessed Bystander Out-of-Hospital CPA

    Background: Optimal acceptable time intervals from collapse to bystander cardiopulmonary resuscitation (CPR) for neurologically favorable outcome among adults with witnessed out-of-hospital cardiopulmonary arrest (CPA) have been unclear. Our aim was to assess the acceptable thresholds of these time intervals for neurologically favorable outcome and survival using a recursive partitioning model. Methods and Findings: From January 1, 2005 through December 31, 2009, we conducted a prospective population-based observational study across Japan involving consecutive out-of-hospital CPA patients (N = 69,648) who received witnessed bystander CPR. Of the 69,648 patients, 34,605 were assigned to the derivation data set and 35,043 to the validation data set. The outcomes of interest were survival and neurologically favorable outcome at one month, defined as category one (good cerebral performance) or two (moderate cerebral disability) of the cerebral performance categories. Based on the recursive partitioning model applied to the derivation dataset (n = 34,605) to predict neurologically favorable outcome at one month, the acceptable time intervals were 5 min from collapse to CPR initiation; 11 min from collapse to ambulance arrival; 18 min from collapse to return of spontaneous circulation (ROSC); and 19 min from collapse to hospital arrival. In the validation dataset (n = 35,043), 209/2,292 (9.1%) of all patients meeting the acceptable time intervals, and 1,388/2,706 (52.1%) of the subgroup meeting the acceptable time intervals with pre-hospital ROSC, showed neurologically favorable outcome. Conclusions: To obtain a neurologically favorable outcome among adults with witnessed out-of-hospital CPA, CPR should be initiated within 5 min of collapse. Patients with the acceptable time intervals of bystander CPR and pre-hospital ROSC within 18 min could have a 50% chance of neurologically favorable outcome.
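
The recursive partitioning approach above can be illustrated on synthetic data. This is a hypothetical sketch, not the study's registry or code — the sample size and event rates are invented for illustration:

```python
# Sketch: deriving an acceptable time threshold with a recursive partitioning
# (decision tree) model, analogous to the collapse-to-CPR analysis above.
# All data here are synthetic; the 30% vs 5% outcome rates are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
collapse_to_cpr = rng.uniform(0, 20, 5000)  # minutes from collapse to CPR, synthetic
# assumed relationship: favorable outcome is more likely when CPR starts < 5 min
favorable = (rng.random(5000) < np.where(collapse_to_cpr < 5, 0.30, 0.05)).astype(int)

# a depth-1 tree recovers the single best split point from the data
tree = DecisionTreeClassifier(max_depth=1)
tree.fit(collapse_to_cpr.reshape(-1, 1), favorable)
threshold = tree.tree_.threshold[0]  # learned cut point, close to the true 5 min
```

Deeper trees partition recursively on further variables (ambulance arrival, ROSC, hospital arrival), which is how a sequence of thresholds like 5/11/18/19 min can emerge.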

    B Cells Regulate Neutrophilia during Mycobacterium tuberculosis Infection and BCG Vaccination by Modulating the Interleukin-17 Response

    We have previously demonstrated that B cells can shape the immune response to Mycobacterium tuberculosis, including the level of neutrophil infiltration and granulomatous inflammation at the site of infection. The present study examined the mechanisms by which B cells regulate the host neutrophilic response upon exposure to mycobacteria and how neutrophilia may influence vaccine efficacy. To address these questions, a murine aerosol infection tuberculosis (TB) model and an intradermal (ID) ear BCG immunization mouse model, involving both the μMT strain and B cell-depleted C57BL/6 mice, were used. IL (interleukin)-17 neutralization and neutrophil depletion experiments using these systems provide evidence that B cells can regulate neutrophilia by modulating the IL-17 response during M. tuberculosis infection and BCG immunization. Exuberant neutrophilia at the site of immunization in B cell-deficient mice adversely affects dendritic cell (DC) migration to the draining lymph nodes and attenuates the development of the vaccine-induced Th1 response. The results suggest that B cells are required for the development of optimal protective anti-TB immunity upon BCG vaccination by regulating the IL-17/neutrophilic response. Administration of sera derived from M. tuberculosis-infected C57BL/6 wild-type mice reverses the lung neutrophilia phenotype in tuberculous μMT mice. Together, these observations provide insight into the mechanisms by which B cells and humoral immunity modulate the vaccine-induced Th1 response and regulate neutrophilia during M. tuberculosis infection and BCG immunization. © 2013 Kozakiewicz et al.

    Hepatitis B and Renal Disease

    Glomerulonephritis is an important extrahepatic manifestation of chronic hepatitis B virus (HBV) infection. The uncommon occurrence, variability in renal histopathology, and heterogeneity in clinical course present challenges in clinical studies and have resulted in a relative paucity of data and uncertainty with regard to the optimal management of HBV-related glomerular diseases. The advent of nucleos(t)ide analogue medications that effectively suppress HBV replication has markedly altered the clinical outcomes of kidney transplant recipients with HBV infection, but the emergence of drug resistance is an escalating problem. This article reviews the recent knowledge of the pathogenesis and treatment of HBV-related membranous nephropathy, and discusses the management of hepatitis B in kidney transplant recipients, which is continuously evolving

    Changes in Inflammatory Response after Endovascular Treatment for Type B Aortic Dissection

    The present study aims to investigate changes in inflammatory markers after elective endovascular treatment of type B aortic dissection with aneurysm, in relation to different anatomical features of the dissection flap affecting paravisceral perfusion. Consecutive patients with type B aortic dissection undergoing elective endovascular stent graft repair were recruited and categorized into different groups. Serial plasma levels of cytokines (interleukin (IL)-1β, -6, -8, -10, TNF-α), chemokines (MCP-1), and serum creatinine were monitored at the pre-, peri-, and post-operative stages. The length of stent graft employed in each operation was recorded and correlated with the change in each of the studied biochemical parameters. A control group of aortic dissection patients under conventional medical management was recruited for comparison of baseline biochemical parameters. In total, 22 endovascular-treated patients and 16 aortic dissection patients under surveillance were recruited. The endovascular-treated patients had baseline levels comparable to those of the non-surgical patients. There was no immediate or thirty-day mortality, and none of the surgical patients developed post-operative mesenteric ischaemia or clinically significant renal impairment. All surgical patients had detectable pro-inflammatory mediators, but none showed a statistically significant surge in the peri-operative period except for IL-1β and IL-6. Similar results were obtained when patients were categorized into different groups. IL-1β and IL-6 reached maximal levels within hours of the endovascular procedure (range, 3.93 to 27.3 times higher than baseline; p = 0.001) but returned to baseline 1 day post-operatively. The change in IL-1β and IL-6 at stent graft deployment tended to be greater with longer stent grafts (p > 0.05). No significant changes were observed in serum creatinine levels.
In conclusion, elective endovascular repair of type B aortic dissection was associated with insignificant changes in inflammatory mediators and creatinine. All levels fell toward basal values post-operatively, suggesting that thoracic endovascular aortic repair is a relatively non-aggressive procedure with insignificant inflammatory modulation.

    How do we create, and improve, the evidence base? 

    Providing best clinical care involves using the best available evidence of effectiveness to inform treatment decisions. Producing this evidence begins with trials and continues through synthesis of their findings towards evidence incorporation within comprehensible, usable guidelines for clinicians and patients at the point of care. However, there is enormous wastage in this evidence production process, with less than 50% of the published biomedical literature considered sufficient in conduct and reporting to be fit for purpose. Over the last 30 years, independent collaborative initiatives have evolved to optimise the evidence to improve patient care. These collaborations each recommend how to improve research quality in a small way at many different stages of the evidence production and distillation process. When we consider these incremental improvements from an 'aggregation of marginal gains' perspective, the small enhancements at each stage accumulate, greatly improving the final product of 'best available evidence'. The myriad tools to reduce research quality leakage and evidence loss should be routinely used by all those with responsibility for ensuring that research benefits patients, that is, those who pay for research (funders), produce it (researchers), take part in it (patients/participants) and use it (clinicians, policy makers and service commissioners).
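
The 'aggregation of marginal gains' arithmetic can be made concrete with a toy calculation; the stage count and per-stage gain below are invented figures for illustration only:

```python
# Toy illustration: if ten stages of the evidence pipeline (trial design,
# conduct, reporting, synthesis, guideline production, ...) are each improved
# by a modest 3%, the gains compound multiplicatively rather than adding.
stages = 10            # assumed number of pipeline stages
gain_per_stage = 1.03  # assumed 3% improvement at each stage
overall = gain_per_stage ** stages
print(f"overall improvement: {(overall - 1) * 100:.1f}%")  # prints "overall improvement: 34.4%"
```

Ten 3% gains compound to roughly 34%, not 30% — the point of viewing quality improvements across the whole pipeline rather than stage by stage.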