29 research outputs found

    Effect of Nanoparticle Weight on the Cellular Uptake and Drug Delivery Potential of PLGA Nanoparticles

    Biodegradable and biocompatible polymeric nanoparticles (NPs) stand out as a key tool for improving drug bioavailability, reducing inherent toxicity, and targeting the intended site. Most importantly, the ease of polymer synthesis and derivatization to add functional properties makes them potentially ideal for the intended therapeutic applications. Among many polymers, the US FDA-approved co-polymer poly(lactic-co-glycolic acid) (PLGA) is widely used in drug delivery and in implantable biomaterials owing to its biocompatibility and biodegradability. While many studies have used PLGA NPs as a drug delivery system, less attention has been given to the effect of NP weight on cellular behaviors such as uptake. Here we discuss the synthesis of PLGA NPs of varying weight and their colloidal and biological properties. Using nanoprecipitation, we synthesized PLGA NPs ranging from 60 to 100 nm in size by varying the initial PLGA feed in the system. These NPs were stable in colloidal conditions for a prolonged period. We further studied cellular uptake and found that the NPs are cytocompatible; however, they are taken up differentially by cancer and immune cells, with uptake strongly influenced by NP weight. The drug delivery potential of these NPs was assessed using doxorubicin (DOX) as a model drug, loaded into the NP core at 7.0 ± 0.5 wt % to study its therapeutic effects. The results showed that both concentration and treatment time are crucial for therapeutic effect, with DOX-NPs exhibiting higher potency at lower concentrations. DOX-NPs also delivered more DOX into cells than the free-DOX treatment group, which should allow the desired effect to be achieved at a lower dose than free DOX requires. Considering the significance of PLGA-based nanoparticle drug delivery systems, we anticipate that this study will contribute to design considerations and guidelines for the therapeutic applications of nanoparticles.
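The loading figure above can be made concrete with the standard definitions of drug loading and encapsulation efficiency; this is a minimal sketch, and the masses used are hypothetical illustrations, not values from the study:

```python
def loading_wt_percent(drug_mass_mg, polymer_mass_mg):
    """Drug loading as wt % of total nanoparticle mass (drug + polymer)."""
    return 100.0 * drug_mass_mg / (drug_mass_mg + polymer_mass_mg)

def encapsulation_efficiency(loaded_mg, fed_mg):
    """Percentage of the drug feed actually entrapped in the NPs."""
    return 100.0 * loaded_mg / fed_mg

# Hypothetical masses for illustration (not from the study):
loading = loading_wt_percent(7.0, 93.0)        # 7.0 wt %
ee = encapsulation_efficiency(7.0, 10.0)       # 70.0 %
```

A 7.0 wt % loading thus corresponds to 7 mg of DOX per 100 mg of drug-loaded NP mass under this (common, but not universal) definition.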

    Real Time Spectroscopic Ellipsometry Analysis of First Stage CuIn₁₋ₓGaₓSe₂ Growth: Indium-Gallium Selenide Co-Evaporation

    Real time spectroscopic ellipsometry (RTSE) has been applied for in situ monitoring of the first stage of copper indium-gallium diselenide (CIGS) thin film deposition by the three-stage co-evaporation process used to fabricate high-efficiency thin film photovoltaic (PV) devices. The first stage entails the growth of indium-gallium selenide (In₁₋ₓGaₓ)₂Se₃ (IGS) on a substrate of Mo-coated soda lime glass maintained at 400 °C. This is a critical stage of CIGS deposition because a large fraction of the final film thickness is deposited, so precise compositional control is desired in order to achieve the optimum performance of the resulting CIGS solar cell. RTSE is sensitive to monolayer-level film growth processes and can provide accurate measurements of bulk and surface roughness layer thicknesses. These in turn enable accurate measurement of the bulk layer optical response in the form of complex dielectric function spectra, ε = ε₁ + iε₂. Here, RTSE has been used to obtain (ε₁, ε₂) spectra at the measurement temperature of 400 °C for IGS thin films whose Ga contents (x) were deduced from different ranges of accumulated bulk layer thickness during the deposition process. Applying a common analytical expression to the (ε₁, ε₂) spectra of these IGS films, best-fit oscillator parameters were obtained, and these parameters in turn were fitted with polynomials in x. From the resulting database of polynomial coefficients, the (ε₁, ε₂) spectra can be generated for any IGS composition from the single parameter x. The results have served as an RTSE fingerprint for IGS composition and have provided structural information beyond thicknesses alone, for example information related to film density and grain size.
The deduced IGS structural evolution and (ε₁, ε₂) spectra have also been interpreted in relation to observations from scanning electron microscopy, X-ray diffractometry, and energy-dispersive X-ray spectroscopy profiling analyses. Overall, the structural, optical, and compositional analysis possible by RTSE has assisted in understanding the growth and properties of three-stage CIGS absorbers for solar cells and shows promise for enhancing cell performance through monitoring and control.
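The idea of a polynomial-coefficient database that generates dielectric function spectra from the single parameter x can be sketched as follows. The single Lorentz oscillator form and all coefficient values here are illustrative placeholders, not the parameterization actually fitted in the study:

```python
import numpy as np

# Placeholder linear polynomials in Ga content x for each oscillator
# parameter (amplitude A, resonance energy E0, broadening G), with
# coefficients listed highest power first for np.polyval. A real RTSE
# database would fit several oscillators with higher-order polynomials.
PARAM_POLYS = {
    "A":  [0.5, 2.0],    # A(x)  = 0.5*x + 2.0  (dimensionless)
    "E0": [0.4, 2.8],    # E0(x) = 0.4*x + 2.8  (eV)
    "G":  [0.1, 0.3],    # G(x)  = 0.1*x + 0.3  (eV)
}

def dielectric_function(E, x):
    """Return (eps1, eps2) at photon energy E (eV) for Ga content x,
    using one Lorentz oscillator with polynomial parameters in x and
    the eps = eps1 + i*eps2 sign convention."""
    A  = np.polyval(PARAM_POLYS["A"], x)
    E0 = np.polyval(PARAM_POLYS["E0"], x)
    G  = np.polyval(PARAM_POLYS["G"], x)
    eps = 1.0 + A * E0**2 / (E0**2 - E**2 - 1j * G * E)
    return eps.real, eps.imag

# Generate the spectra point for a given composition and photon energy:
e1, e2 = dielectric_function(2.0, 0.3)
```

Once the polynomials are tabulated, any intermediate composition's (ε₁, ε₂) spectra follow by evaluating them at the desired x, which is what makes the database usable as a composition fingerprint during growth.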

    Global burden of 369 diseases and injuries in 204 countries and territories, 1990–2019: a systematic analysis for the Global Burden of Disease Study 2019

    Background: In an era of shifting global agendas and expanded emphasis on non-communicable diseases and injuries along with communicable diseases, sound evidence on trends by cause at the national level is essential. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) provides a systematic scientific assessment of published, publicly available, and contributed data on incidence, prevalence, and mortality for a mutually exclusive and collectively exhaustive list of diseases and injuries. Methods: GBD estimates incidence, prevalence, mortality, years of life lost (YLLs), years lived with disability (YLDs), and disability-adjusted life-years (DALYs) due to 369 diseases and injuries, for two sexes, and for 204 countries and territories. Input data were extracted from censuses, household surveys, civil registration and vital statistics, disease registries, health service use, air pollution monitors, satellite imaging, disease notifications, and other sources. Cause-specific death rates and cause fractions were calculated using the Cause of Death Ensemble model and spatiotemporal Gaussian process regression. Cause-specific deaths were adjusted to match the total all-cause deaths calculated as part of the GBD population, fertility, and mortality estimates. Deaths were multiplied by standard life expectancy at each age to calculate YLLs. A Bayesian meta-regression modelling tool, DisMod-MR 2.1, was used to ensure consistency between incidence, prevalence, remission, excess mortality, and cause-specific mortality for most causes. Prevalence estimates were multiplied by disability weights for mutually exclusive sequelae of diseases and injuries to calculate YLDs. We considered results in the context of the Socio-demographic Index (SDI), a composite indicator of income per capita, years of schooling, and fertility rate in females younger than 25 years. 
Uncertainty intervals (UIs) were generated for every metric as the 25th and 975th ordered values of 1000 draws from the posterior distribution. Findings: Global health has steadily improved over the past 30 years as measured by age-standardised DALY rates. After taking into account population growth and ageing, the absolute number of DALYs has remained stable. Since 2010, the pace of decline in global age-standardised DALY rates has accelerated in age groups younger than 50 years compared with the 1990–2010 time period, with the greatest annualised rate of decline occurring in the 0–9-year age group. Six infectious diseases were among the top ten causes of DALYs in children younger than 10 years in 2019: lower respiratory infections (ranked second), diarrhoeal diseases (third), malaria (fifth), meningitis (sixth), whooping cough (ninth), and sexually transmitted infections (which, in this age group, is fully accounted for by congenital syphilis; ranked tenth). In adolescents aged 10–24 years, three injury causes were among the top causes of DALYs: road injuries (ranked first), self-harm (third), and interpersonal violence (fifth). Five of the causes that were in the top ten for ages 10–24 years were also in the top ten in the 25–49-year age group: road injuries (ranked first), HIV/AIDS (second), low back pain (fourth), headache disorders (fifth), and depressive disorders (sixth). In 2019, ischaemic heart disease and stroke were the top-ranked causes of DALYs in both the 50–74-year and 75-years-and-older age groups. Since 1990, there has been a marked shift towards a greater proportion of burden due to YLDs from non-communicable diseases and injuries. In 2019, there were 11 countries where non-communicable disease and injury YLDs constituted more than half of all disease burden.
Decreases in age-standardised DALY rates have accelerated over the past decade in countries at the lower end of the SDI range, while improvements have started to stagnate or even reverse in countries with higher SDI. Interpretation: As disability becomes an increasingly large component of disease burden and a larger component of health expenditure, greater research and development investment is needed to identify new, more effective intervention strategies. With a rapidly ageing global population, the demands on health services to deal with disabling outcomes, which increase with age, will require policy makers to anticipate these changes. The mix of universal and more geographically specific influences on health reinforces the need for regular reporting on population health in detail and by underlying cause to help decision makers to identify success stories of disease control to emulate, as well as opportunities to improve. Funding: Bill & Melinda Gates Foundation. © 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
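The uncertainty intervals described in the Methods are order statistics of the posterior sample: with 1000 draws, the 25th and 975th ordered values bound the central 95%. A minimal sketch:

```python
def uncertainty_interval(draws):
    """95% UI from a 1000-draw posterior sample: the 25th and 975th
    ordered values, i.e. the 2.5th and 97.5th percentiles."""
    ordered = sorted(draws)
    assert len(ordered) == 1000, "GBD UIs are defined on 1000 draws"
    return ordered[24], ordered[974]   # 1-indexed 25th and 975th values

# Illustrative check: for draws 1..1000 the UI is the 25th and 975th values.
lo, hi = uncertainty_interval(range(1, 1001))   # → (25, 975)
```

In practice the draws would be 1000 posterior samples of the metric (e.g. a cause-specific DALY rate) for one location-year, and every reported point estimate carries such an interval.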

    Global age-sex-specific fertility, mortality, healthy life expectancy (HALE), and population estimates in 204 countries and territories, 1950–2019: a comprehensive demographic analysis for the Global Burden of Disease Study 2019

    Background: Accurate and up-to-date assessment of demographic metrics is crucial for understanding a wide range of social, economic, and public health issues that affect populations worldwide. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 produced updated and comprehensive demographic assessments of the key indicators of fertility, mortality, migration, and population for 204 countries and territories and selected subnational locations from 1950 to 2019. Methods: 8078 country-years of vital registration and sample registration data, 938 surveys, 349 censuses, and 238 other sources were identified and used to estimate age-specific fertility. Spatiotemporal Gaussian process regression (ST-GPR) was used to generate age-specific fertility rates for 5-year age groups between ages 15 and 49 years. With extensions to age groups 10–14 and 50–54 years, the total fertility rate (TFR) was then aggregated using the estimated age-specific fertility between ages 10 and 54 years. 7417 sources were used for under-5 mortality estimation and 7355 for adult mortality. ST-GPR was used to synthesise data sources after correction for known biases. Adult mortality was measured as the probability of death between ages 15 and 60 years based on vital registration, sample registration, and sibling histories, and was also estimated using ST-GPR. HIV-free life tables were then estimated from under-5 and adult mortality rates using a relational model life table system created for GBD, which closely tracks observed age-specific mortality rates from complete vital registration when available. Independent estimates of HIV-specific mortality generated by an epidemiological analysis of HIV prevalence surveys and antenatal clinic serosurveillance and other sources were incorporated into the estimates in countries with large epidemics.
Annual and single-year age estimates of net migration and population for each country and territory were generated using a Bayesian hierarchical cohort component model that analysed estimated age-specific fertility and mortality rates along with 1250 censuses and 747 population registry years. We classified location-years into seven categories on the basis of the natural rate of increase in population (calculated by subtracting the crude death rate from the crude birth rate) and the net migration rate. We computed healthy life expectancy (HALE) using years lived with disability (YLDs) per capita, life tables, and standard demographic methods. Uncertainty was propagated throughout the demographic estimation process, including fertility, mortality, and population, with 1000 draw-level estimates produced for each metric. Findings: The global TFR decreased from 2·72 (95% uncertainty interval [UI] 2·66–2·79) in 2000 to 2·31 (2·17–2·46) in 2019. Global annual livebirths increased from 134·5 million (131·5–137·8) in 2000 to a peak of 139·6 million (133·0–146·9) in 2016. Global livebirths then declined to 135·3 million (127·2–144·1) in 2019. Of the 204 countries and territories included in this study, in 2019, 102 had a TFR lower than 2·1, which is considered a good approximation of replacement-level fertility. All countries in sub-Saharan Africa had TFRs above replacement level in 2019 and accounted for 27·1% (95% UI 26·4–27·8) of global livebirths. Global life expectancy at birth increased from 67·2 years (95% UI 66·8–67·6) in 2000 to 73·5 years (72·8–74·3) in 2019. The total number of deaths increased from 50·7 million (49·5–51·9) in 2000 to 56·5 million (53·7–59·2) in 2019. Under-5 deaths declined from 9·6 million (9·1–10·3) in 2000 to 5·0 million (4·3–6·0) in 2019. Global population increased by 25·7%, from 6·2 billion (6·0–6·3) in 2000 to 7·7 billion (7·5–8·0) in 2019. 
In 2019, 34 countries had negative natural rates of increase; in 17 of these, the population declined because immigration was not sufficient to counteract the negative natural rate of increase. Globally, HALE increased from 58·6 years (56·1–60·8) in 2000 to 63·5 years (60·8–66·1) in 2019. HALE increased in 202 of 204 countries and territories between 2000 and 2019.
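The TFR aggregation described in the Methods sums age-specific fertility rates (ASFRs) over the 5-year age groups spanning ages 10-54, weighting each by its 5-year width. A sketch with illustrative ASFR values, not GBD estimates:

```python
# Age-specific fertility rates (births per woman per year) for 5-year
# age groups from 10-14 through 50-54. Values are illustrative only.
ASFR = {
    "10-14": 0.001, "15-19": 0.040, "20-24": 0.110, "25-29": 0.120,
    "30-34": 0.090, "35-39": 0.050, "40-44": 0.015, "45-49": 0.003,
    "50-54": 0.0005,
}

def total_fertility_rate(asfr_by_group, years_per_group=5):
    """TFR: expected lifetime births per woman if she survived the
    reproductive span and current age-specific rates held throughout."""
    return years_per_group * sum(asfr_by_group.values())

tfr = total_fertility_rate(ASFR)   # ~2.15 births per woman here
```

The natural-rate-of-increase classification mentioned above is analogous arithmetic: crude birth rate minus crude death rate, with net migration considered separately.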

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570.
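The reported posterior probabilities of harm (e.g. 94.9% for ACE inhibitors) are the share of posterior odds-ratio draws falling below 1. A sketch with simulated draws standing in for the trial's actual bayesian model output; the lognormal shape and its spread are assumptions chosen only to roughly match the reported median and credible interval:

```python
import math
import random

def posterior_prob_harm(or_draws):
    """Fraction of posterior odds-ratio draws below 1. Since OR > 1
    encodes improved organ support-free days, OR < 1 indicates harm."""
    return sum(d < 1.0 for d in or_draws) / len(or_draws)

# Simulated posterior for the ACE-inhibitor OR: lognormal centred near
# the reported median of 0.77 (assumed shape, not the trial's model).
random.seed(1)
draws = [math.exp(random.gauss(math.log(0.77), 0.16)) for _ in range(10_000)]
p_harm = posterior_prob_harm(draws)   # roughly 0.95 under these assumptions
```

This illustrates how a credible interval that crosses 1 (0.58-1.06) can still coexist with a high posterior probability of harm: most, but not all, of the posterior mass sits below 1.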

    Chronic Metabolic Acidosis Elicits Hypertension via Upregulation of Intrarenal Angiotensin II and Induction of Oxidative Stress

    Chronic metabolic acidosis (CMA) can be a consequence of persistent hypertension but could also play a role in invoking hypertension. Currently, there is a scarcity of studies examining the effect of induced chronic acidosis on blood pressure regulation. This study investigates CMA as a cause of hypertension. Chronic acidosis was induced in Sprague Dawley rats (100–150 g) by providing a weak acid solution of 0.28 M ammonium chloride (NH4Cl) in tap water for 8 weeks. Blood pH was measured to confirm that the rats were acidotic, while blood pressure (BP) was monitored weekly by tail-cuff plethysmography. Rats were divided into five groups: control, CMA, and CMA treated with spironolactone, captopril, or tempol. Serum sodium and potassium, renal interstitial fluid (for angiotensin II concentration), and kidney proximal tubules (for Na+/K+ ATPase-α1 concentration) were analyzed. Reactive oxygen species (ROS) were detected in renal cortical homogenates using electron paramagnetic resonance (EPR). In the CMA rats, a sustained elevation in mean arterial pressure (MAP) associated with a significant decrease in blood pH was observed relative to control over the 8 weeks. A significant decrease in MAP was observed in acidotic rats treated with captopril or tempol, whereas spironolactone treatment caused no decrease in MAP compared with the CMA group. Interstitial angiotensin II was increased in the CMA group but decreased in the CMA groups treated with captopril or tempol. In addition, urinary sodium was decreased and serum sodium was significantly increased in the CMA groups compared with control. However, the acidotic groups treated with captopril or tempol showed reduced serum sodium and elevated urinary sodium compared with the CMA group.
In addition, there was a significant increase in plasma renin and no change in plasma aldosterone in the CMA group, with no significant differences in plasma renin or aldosterone observed during spironolactone, captopril, or tempol treatment. The increased expression of Na+/K+ ATPase-α1 in the CMA group suggests that active transport of Na+ into the blood could underlie the observed hypertension. Furthermore, EPR analysis confirmed an elevation in superoxide (O2−) radical levels in the CMA group, whereas the tempol- and captopril-treated acidotic groups showed lower O2− levels than either the CMA group or control. Taken together, our data suggest that induction of CMA could be causative of hypertension, with the increased BP potentially mediated by activation of intrarenal Ang II and induction of oxidative stress.

    Effects of Forest Management Approach on Carbon Stock and Plant Diversity: A Case Study from Karnali Province, Nepal

    The mitigation of global warming and the conservation of biodiversity are two significant environmental challenges today. Estimating and comparing forest carbon stock and plant diversity under different management approaches provides insight into the choice of approach for carbon and plant diversity management. We investigated the variation in carbon stock and plant species diversity in two forests under different management approaches: the Kakrebihar protection forest (PF) and the Sano Surkhet community forest (CF) in Karnali Province, Surkhet, Nepal. In total, 63 sample plots (30 in PF and 33 in CF) were laid out systematically across the forests. Dendrometric measurements were carried out for trees, poles, and saplings, and representative leaf litter and herb samples were collected. Soil samples were taken at 10 cm, 20 cm, and 30 cm depths using a soil auger. Existing volume equations for the tree species of interest were used to estimate tree volume, and species-specific wood density and conversion factors were used to obtain total biomass and carbon content. Soil samples were analyzed using the Walkley-Black method to determine soil organic carbon. PF had higher carbon stock, plant species richness, and abundance at the landscape level than CF; however, the picture differed at the plot level. At the plot level, PF had significantly higher total carbon stock and biomass carbon stock than CF, but the two were statistically indistinguishable in terms of soil carbon stock. PF and CF were also statistically indistinguishable in plot-level richness, Simpson diversity, and Shannon diversity, although PF had significantly higher plant abundance than CF. In conclusion, the value of PF for carbon stock and plant diversity surpassed that of CF. This study suggests that protection forest might be a better strategy to enhance forest carbon stock and maintain habitat for various plant species.
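The plot-level comparisons above rest on standard diversity indices; a minimal sketch of richness, Shannon, and Simpson diversity, using made-up stem counts rather than the study's field data:

```python
import math

def richness(counts):
    """Number of species recorded in the plot."""
    return sum(1 for n in counts if n > 0)

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((n / total) * math.log(n / total) for n in counts if n > 0)

def simpson(counts):
    """Simpson diversity 1 - sum(p_i^2); higher values mean more diversity."""
    total = sum(counts)
    return 1.0 - sum((n / total) ** 2 for n in counts if n > 0)

# Hypothetical stem counts per species in one sample plot:
plot = [12, 8, 5, 3, 1]
r, h, d = richness(plot), shannon(plot), simpson(plot)
```

Computing these per plot and comparing the per-plot distributions between PF and CF (e.g. with a two-sample test) is what "statistically indistinguishable at the plot level" refers to.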