30 research outputs found
Chemically bonded liquid crystals as stationary phases for high performance liquid chromatography
Phosphorus dynamics in the marshland upwelling system
The Marshland Upwelling System (MUS) is an alternative onsite wastewater treatment technology designed to utilize the natural ecology of saltwater marshes to remove human-borne contaminants. Previous research has assessed the ability of the MUS to remove both total phosphorus and orthophosphate, and studies have also indicated a clear zone of phosphorus (P) saturation in MUS soils. A laboratory column study was performed to: 1) determine the fate and fractionation of phosphorus in the soil matrix; 2) understand sorption kinetics and determine the phosphorus sorption potential of wetland soils in retaining phosphorus in the MUS; and 3) determine the service life of the MUS for phosphorus retention. The column study was performed under saltwater and freshwater conditions, wherein artificial wastewater was injected into the columns at a flow rate of 0.7 mL/min every other day. At the end of the study, soil from all columns receiving the different salinity treatments was analysed for phosphorus fractions. Inorganic P was found to be dominant in sub-surface layers, while organic P fractions were found in considerable amounts in surface layers, potentially released by soil microbial activity. Inorganic P fractions were likely precipitated by the high concentrations of Fe, Al, Ca, and Mg cations present in the soil under low redox and near-neutral to alkaline pH conditions. Among the P-sorption models tested, the Langmuir one-site isotherm best predicted the phosphorus sorption behaviour of the MUS soils, with maximum adsorption capacities ranging from 361 to 646 mg P/kg soil at different ionic strengths under anaerobic conditions. Significant differences (p < 0.0001) were found between soil and salinity (ionic strength) interactions for the sorption phenomena under both aerobic and anaerobic conditions.
A longevity parameter (LT) was developed to predict the service life of the MUS based on the observed phosphorus saturation. Under anaerobic conditions, the LT predicted service lives of 15 to 26.9 years for a representative filter volume of 125 m2 at a constant depth of 4 m, a hydraulic loading rate of 2016 L/d, and an influent phosphorus concentration of 15 mg/L.
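The reported isotherm and longevity figures can be sketched numerically. In this illustration the Langmuir affinity constant K and the reactive soil mass are assumed values chosen only to show the arithmetic; they are not taken from the study.

```python
def langmuir(c, s_max, k):
    """One-site Langmuir isotherm: sorbed P (mg/kg soil) at
    equilibrium solution concentration c (mg/L)."""
    return s_max * k * c / (1 + k * c)

S_MAX = 646.0   # mg P/kg soil, upper adsorption capacity reported for MUS soils
K = 0.5         # L/mg, assumed affinity constant (illustrative only)

# Sorbed P at the 15 mg/L influent concentration used in the study
s_15 = langmuir(15.0, S_MAX, K)   # approaches S_MAX as c grows

# Longevity-style estimate: years until the soil reaches its sorption
# capacity, for a hypothetical reactive soil mass in the filter.
HLR_L_PER_DAY = 2016.0    # hydraulic loading rate from the study
P_IN_MG_PER_L = 15.0      # influent P concentration from the study
SOIL_MASS_KG = 4.6e5      # assumed reactive soil mass (illustrative)

daily_load_mg = HLR_L_PER_DAY * P_IN_MG_PER_L   # 30,240 mg P per day
years = S_MAX * SOIL_MASS_KG / daily_load_mg / 365.0
```

With these assumed inputs the estimate lands near the upper end of the 15 to 26.9 year range reported above, illustrating how the service life scales linearly with adsorption capacity and soil mass and inversely with the phosphorus load.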
A Systematic Review Investigating the Use of Earth Observation for the Assistance of Water, Sanitation and Hygiene in Disaster Response and Recovery
The use of Earth observation technology such as satellites, unmanned aircraft, or drones as part of early-warning systems and disaster risk reduction plans is a widely researched and established area of study. However, the role this technology can play in the provision of water, sanitation and hygiene services in the response and recovery phases of a disaster is not widely researched. A systematic literature review was undertaken to identify Earth observation technologies and methods that can be applied to water, sanitation and hygiene in disaster response and recovery. Whilst there were many water-related studies, there was a lack of studies examining the potential uses of Earth observation for sanitation; this is an area that requires further research. Three common uses of Earth observation technology were identified as relevant: (1) monitoring of surface water quality; (2) groundwater sensing; and (3) mapping and monitoring of hazards and infrastructure. Whilst studies in these areas highlight that Earth observation could usefully assist with water, sanitation and hygiene during disaster response and recovery, more research is needed and there are limitations to consider, predominantly that funding, communication and integration between many agencies and technologies are required. Additionally, some technologies are subject to local regulations which can restrict their use over contested or private areas or across trans-national boundaries, common situations in disasters. This review was largely influenced by the search strings input during the identification of relevant literature; changing the search strings would likely result in a different combination of literature available for review and subsequent variations in the findings.
Obesity and diabetes are major risk factors for epicardial adipose tissue inflammation
BACKGROUND. Epicardial adipose tissue (EAT) directly overlies the myocardium, with changes in its morphology and volume associated with myriad cardiovascular and metabolic diseases. However, EAT's immune structure and cellular characterization remain incompletely described. We aimed to define the immune phenotype of EAT in humans and compare such profiles across lean, obese, and diabetic patients. METHODS. We recruited 152 patients undergoing open-chest coronary artery bypass grafting (CABG), valve repair/replacement (VR) surgery, or combined CABG/VR. Patients' clinical and biochemical data and EAT, subcutaneous adipose tissue (SAT), and preoperative blood samples were collected. Immune cell profiling was evaluated by flow cytometry and complemented by gene expression studies of immune mediators. Bulk RNA-Seq was performed in EAT across metabolic profiles to assess whole-transcriptome changes observed in lean, obese, and diabetic groups. RESULTS. Flow cytometry analysis demonstrated EAT was highly enriched in adaptive immune (T and B) cells. Although overweight/obese and diabetic patients had similar EAT cellular profiles to lean control patients, the EAT exhibited significantly (P ≤ 0.01) raised expression of immune mediators, including IL-1, IL-6, TNF-α, and IFN-γ. These changes were not observed in SAT or blood. Neither underlying coronary artery disease nor the presence of hypertension significantly altered the immune profiles observed. Bulk RNA-Seq demonstrated significant alterations in metabolic and inflammatory pathways in the EAT of overweight/obese patients compared with lean controls. CONCLUSION. Adaptive immune cells are the predominant immune cell constituent in human EAT and SAT. The presence of underlying cardiometabolic conditions, specifically obesity and diabetes, rather than cardiac disease phenotype appears to alter the inflammatory profile of EAT. 
Obese states markedly alter EAT metabolic and inflammatory signaling genes, underlining the impact of obesity on the EAT transcriptome profile.
Achieving Surgical, Obstetric, Trauma, and Anesthesia (SOTA) care for all in South Asia
South Asia is a demographically crucial, economically aspiring, and socio-culturally diverse region of the world. The region bears a large burden of surgically treatable disease conditions, and a large number of people in South Asia cannot access safe and affordable surgical, obstetric, trauma, and anesthesia (SOTA) care when in need. Yet attention to the region in Global Surgery and Global Health is limited. Here, we assess the status of SOTA care in South Asia. We summarize the evidence on SOTA care indicators and planning. Region-wide as well as country-specific challenges are highlighted. We also discuss potential directions, initiatives and innovations, toward addressing these challenges. Local partnerships, sustained research and advocacy efforts, and politics can be aligned with evidence-based policymaking and health planning to achieve equitable SOTA care access in the South Asian region under the South Asian Association for Regional Cooperation (SAARC).
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
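The reported posterior probabilities of harm can be roughly reproduced from the published credible intervals using a normal approximation on the log-odds scale. This is only a back-of-envelope check, not the trial's actual Bayesian cumulative logistic model; the helper name `p_worse` is introduced here for illustration.

```python
from math import erf, log, sqrt

def p_worse(or_median, ci_low, ci_high):
    """Approximate posterior P(OR < 1), assuming the log-odds posterior
    is normal and the 95% credible interval spans +/- 1.96 SDs."""
    mu = log(or_median)
    sd = (log(ci_high) - log(ci_low)) / (2 * 1.96)
    z = (0.0 - mu) / sd
    return 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF at z

# ACE inhibitor vs control: OR 0.77 (95% CrI 0.58-1.06)
p_ace = p_worse(0.77, 0.58, 1.06)   # close to the reported 94.9%

# ARB vs control: OR 0.76 (95% CrI 0.56-1.05)
p_arb = p_worse(0.76, 0.56, 1.05)   # close to the reported 95.4%
```

Because an OR below 1 here denotes fewer organ support–free days, a posterior probability near 95% that OR < 1 is what underlies the conclusion that initiation likely worsened outcomes.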
The Development of Resistance of Human Immunodeficiency Virus to RNA Interference Therapies: Understanding Mechanism and Developing Strategies to Overcome
Human immunodeficiency virus (HIV) possesses a prolific ability to mutate and adapt to an ever-changing environment. This intrinsic capacity for mutation not only allows HIV to evade the immune response, but also allows the virus to develop resistance to antiretroviral therapies. As an approach that targets RNA sequence rather than protein structure, RNA interference (RNAi) offers the potential for faster drug development and fewer side effects in treating HIV infection. However, the very sequence specificity that gives RNAi-based therapies these advantages also makes them susceptible to HIV escape, and the development of resistance to RNAi has been extensively documented. The work presented in this dissertation systematically analyzes how therapy delivery limitations, properties of RNAi targets, combinatorial approaches, and existing HIV diversity can affect therapy efficacy and the development of resistance to RNAi. We have demonstrated that HIV can escape RNAi by indirect mechanisms of resistance. When HIV was exposed to an RNAi therapy targeting the highly conserved trans-activation response (TAR) hairpin of the long terminal repeat (LTR), we failed to isolate any viral clones with mutations in the target site. Instead, we identified many mutations in the U3 region of the LTR that served to tune viral gene expression and overwhelm the RNAi pathway. One method to combat resistance is to use combinations of siRNAs in a manner similar to the existing highly active antiretroviral therapy (HAART). Combinations inhibit the virus at multiple loci, making it highly unlikely that a variant resistant to all components of the combination will emerge. While combinatorial RNAi therapy may delay the onset of resistance, our results indicate that the distribution or compartmentalization of the combination within cellular subpopulations may not be a critical factor in determining therapy efficacy.
While we isolated several viral clones with mutations within the RNAi-targeted regions, extensive sequence analysis indicated that these mutations were not fixed. We again identified mutations in the U3 region of the LTR, many of which were fixed and unique to virus that was exposed to an RNAi selective pressure. When compared with HIV propagated in the absence of an RNAi selective pressure, a significantly higher number of mutations in the U3 region correlated with the degree of sequence conservation of the RNAi target site. Taken together, these data suggest that high degrees of sequence conservation at the RNAi target site can divert selective pressure to the U3 region of the LTR. Finally, we have explored how the existing global sequence diversity of HIV can affect a sequence-specific therapy such as RNAi. We identified two regions of the HIV genome that could potentially serve as targets for a global RNAi therapy, and we developed a cell culture system that could serve as the foundation for long-term studies of the evolution of different HIV subtypes in response to an RNAi therapy. In summary, RNAi is a promising therapy for HIV; however, a number of challenges concerning viral escape remain. We have developed a number of systems to study HIV evolution in response to RNAi therapies, and our findings emphasize that one must look "outside the target" when searching for resistance, as this may represent a more general mechanism by which viruses adapt to selective pressure and escape antiviral therapy.
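The search for conserved regions that could serve as global RNAi targets can be sketched as a window scan over aligned sequences. This is a minimal sketch of the exact-match case only; the toy sequences below are illustrative, and the default window length of 21 nt mirrors a typical siRNA target, not a value from this dissertation.

```python
def conserved_windows(aligned_seqs, k=21):
    """Return (position, sequence) for every length-k window that is
    identical across all aligned sequences, i.e. a fully conserved
    candidate RNAi target site."""
    n = min(len(s) for s in aligned_seqs)
    hits = []
    for i in range(n - k + 1):
        windows = {s[i:i + k] for s in aligned_seqs}
        if len(windows) == 1:   # every sequence agrees on this window
            hits.append((i, aligned_seqs[0][i:i + k]))
    return hits
```

In practice a screen across HIV subtypes would need a conservation threshold below 100% and tolerance for alignment gaps; this sketch only illustrates the core idea of scoring candidate target windows by cross-sequence conservation.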
The clinical and demographic characteristics of elderly patients of Polish origin newly referred to a geriatric psychiatry service