110 research outputs found
A Severe Lack of Evidence Limits Effective Conservation of the World's Primates
Threats to biodiversity are well documented. However, to effectively conserve species and their habitats, we need to know which conservation interventions do (or do not) work. Evidence-based conservation evaluates interventions within a scientific framework. The Conservation Evidence project has summarized thousands of studies testing conservation interventions and compiled these as synopses for various habitats and taxa. In the present article, we analyzed the interventions assessed in the primate synopsis and compared these with other taxa. We found that despite intensive efforts to study primates and the extensive threats they face, less than 1% of primate studies evaluated conservation effectiveness. The studies often lacked quantitative data, failed to undertake postimplementation monitoring of populations or individuals, or implemented several interventions at once. Furthermore, the studies were biased toward specific taxa, geographic regions, and interventions. We describe barriers for testing primate conservation interventions and propose actions to improve the conservation evidence base to protect this endangered and globally important taxon.
Computational approaches to explainable artificial intelligence: Advances in theory, applications and trends
Deep Learning (DL), a groundbreaking branch of Machine Learning (ML), has emerged as a driving force in both theoretical and applied Artificial Intelligence (AI). DL algorithms, rooted in complex and non-linear artificial neural systems, excel at extracting high-level features from data. DL has demonstrated human-level performance in real-world tasks, including clinical diagnostics, and has unlocked solutions to previously intractable problems in virtual agent design, robotics, genomics, neuroimaging, computer vision, and industrial automation. In this paper, the most relevant advances from the last few years in Artificial Intelligence (AI) and several applications to neuroscience, neuroimaging, computer vision, and robotics are presented, reviewed and discussed. In this way, we summarize the state-of-the-art in AI methods, models and applications within a collection of works presented at the 9th International Conference on the Interplay between Natural and Artificial Computation (IWINAC). The works presented in this paper are excellent examples of new scientific discoveries made in laboratories that have successfully transitioned to real-life applications.
Class-modeling analysis reveals T-cell homeostasis disturbances involved in loss of immune control in elite controllers
Background
Despite long-lasting HIV replication control, a significant proportion of elite controller (EC) patients may experience CD4 T-cell loss. Identifying perturbations in immunological parameters could improve our understanding of the mechanisms that may be operating in those patients experiencing loss of immunological control.
Methods
A case–control study was performed to evaluate whether alterations in different T-cell homeostatic parameters can predict CD4 T-cell loss in ECs by comparing data from EC patients showing significant CD4 decline (cases) and EC patients showing stable CD4 counts (controls). The partial least-squares–class modeling (PLS-CM) statistical methodology was employed to discriminate between the two groups of patients, and as a predictive model.
Results
Herein, we show that among T-cell homeostatic alterations, lower levels of naïve and recent thymic emigrant subsets of CD8 cells and higher levels of effector and senescent subsets of CD8 cells as well as higher levels of exhaustion of CD4 cells, measured prior to CD4 T-cell loss, predict the loss of immunological control.
Conclusions
These data indicate that the parameters of T-cell homeostasis may identify those EC patients with a higher proclivity to CD4 T-cell loss. Our results may open new avenues for understanding the mechanisms underlying immunological progression despite HIV replication control and, eventually, for finding a functional cure through immune-based clinical trials.
Impact of common cardio-metabolic risk factors on fatal and non-fatal cardiovascular disease in Latin America and the Caribbean: an individual-level pooled analysis of 31 cohort studies
Background: Estimates of the burden of cardio-metabolic risk factors in Latin America and the Caribbean (LAC) rely on relative risks (RRs) from non-LAC countries. Whether these RRs apply to LAC remains unknown.
Methods: We pooled LAC cohorts. We estimated RRs per unit of exposure to body mass index (BMI), systolic blood pressure (SBP), fasting plasma glucose (FPG), total cholesterol (TC) and non-HDL cholesterol on fatal (31 cohorts, n = 168,287) and non-fatal (13 cohorts, n = 27,554) cardiovascular diseases, adjusting for regression dilution bias. We used these RRs and national data on mean risk factor levels to estimate the number of cardiovascular deaths attributable to non-optimal levels of each risk factor.
Results: Our RRs for SBP, FPG and TC were similar to those observed in cohorts conducted in high-income countries; however, for BMI, our RRs were consistently smaller in people below 75 years of age. Across risk factors, we observed smaller RRs at older ages. Non-optimal SBP was responsible for the largest number of attributable cardiovascular deaths, ranging from 38 per 100,000 women and 54 per 100,000 men in Peru to 261 per 100,000 women in Dominica and 282 per 100,000 men in Guyana. For non-HDL cholesterol, the lowest attributable rates were for women in Peru (21) and men in Guatemala (25), and the largest were in men (158) and women (142) from Guyana.
Interpretation: RRs for BMI from studies conducted in high-income countries may overestimate disease burden metrics in LAC; conversely, RRs for SBP, FPG and TC from LAC cohorts are similar to those estimated from cohorts in high-income countries.
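The attributable-deaths arithmetic described in this abstract follows the standard comparative risk assessment approach: convert a per-unit RR and the gap between mean and optimal exposure into a population attributable fraction (PAF), then scale by total deaths. A minimal sketch, assuming a log-linear dose-response; the function name and all numbers below are hypothetical illustrations, not estimates from the study:

```python
def attributable_deaths(rr_per_unit: float, mean_level: float,
                        optimal_level: float, total_cvd_deaths: float) -> float:
    """Estimate deaths attributable to non-optimal exposure to one risk factor.

    Assumes a log-linear dose-response: the population relative risk scales as
    rr_per_unit raised to the gap between the population mean exposure and the
    theoretical-minimum-risk (optimal) level.
    """
    # Population-level relative risk at the mean exposure vs the optimal level.
    rr_pop = rr_per_unit ** (mean_level - optimal_level)
    # Population attributable fraction: the share of deaths that would be
    # avoided if the whole population sat at the optimal exposure level.
    paf = (rr_pop - 1.0) / rr_pop
    return paf * total_cvd_deaths

# Hypothetical inputs: RR of 1.02 per mmHg of SBP, mean SBP 135 mmHg,
# optimal SBP 115 mmHg, 10,000 cardiovascular deaths.
deaths = attributable_deaths(1.02, 135.0, 115.0, 10_000)
```

With these made-up inputs the PAF works out to roughly a third, so about a third of the deaths would be attributed to non-optimal SBP; in the study this calculation is repeated per risk factor, country, sex and age group using the pooled RRs.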
Incident type 2 diabetes attributable to suboptimal diet in 184 countries
The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, suboptimal intake of these dietary factors was estimated to account for 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)), and the lowest were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and among higher versus lower educated individuals, except in high-income countries and in central and eastern Europe and central Asia, where burdens were larger in rural residents and lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.
Children's and adolescents' rising animal-source food intakes in 1990–2018 were impacted by age, region, parental education and urbanicity
Animal-source foods (ASF) provide nutrition for children's and adolescents' physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world's child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 servings per day at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.
Global overview of the management of acute cholecystitis during the COVID-19 pandemic (CHOLECOVID study)
Background: This study provides a global overview of the management of patients with acute cholecystitis during the initial phase of the COVID-19 pandemic. Methods: CHOLECOVID is an international, multicentre, observational comparative study of patients admitted to hospital with acute cholecystitis during the COVID-19 pandemic. Data on management were collected for a 2-month study interval coincident with the WHO declaration of the SARS-CoV-2 pandemic and compared with an equivalent pre-pandemic time interval. Mediation analysis examined the influence of SARS-CoV-2 infection on 30-day mortality. Results: This study collected data on 9783 patients with acute cholecystitis admitted to 247 hospitals across the world. The pandemic was associated with reduced availability of surgical workforce and operating facilities globally, a significant shift to worse severity of disease, and increased use of conservative management. There was a reduction (both absolute and proportionate) in the number of patients undergoing cholecystectomy, from 3095 patients (56.2 per cent) pre-pandemic to 1998 patients (46.2 per cent) during the pandemic, but there was no difference in 30-day all-cause mortality after cholecystectomy comparing the pre-pandemic interval with the pandemic (13 patients (0.4 per cent) pre-pandemic to 13 patients (0.6 per cent) pandemic; P = 0.355). In mediation analysis, an admission with acute cholecystitis during the pandemic was associated with a non-significant increased risk of death (OR 1.29, 95 per cent c.i. 0.93 to 1.79, P = 0.121). Conclusion: CHOLECOVID provides a unique overview of the treatment of patients with cholecystitis across the globe during the first months of the SARS-CoV-2 pandemic. The study highlights the need for system resilience in retention of elective surgical activity. Cholecystectomy was associated with a low risk of mortality, and deferral of treatment results in an increase in avoidable morbidity that represents the non-COVID cost of this pandemic.
Elective Cancer Surgery in COVID-19-Free Surgical Pathways During the SARS-CoV-2 Pandemic: An International, Multicenter, Comparative Cohort Study.
PURPOSE: As cancer surgery restarts after the first COVID-19 wave, health care providers urgently require data to determine where elective surgery is best performed. This study aimed to determine whether COVID-19-free surgical pathways were associated with lower postoperative pulmonary complication rates compared with hospitals with no defined pathway. PATIENTS AND METHODS: This international, multicenter cohort study included patients who underwent elective surgery for 10 solid cancer types without preoperative suspicion of SARS-CoV-2. Participating hospitals included patients from local emergence of SARS-CoV-2 until April 19, 2020. At the time of surgery, hospitals were defined as having a COVID-19-free surgical pathway (complete segregation of the operating theater, critical care, and inpatient ward areas) or no defined pathway (incomplete or no segregation, areas shared with patients with COVID-19). The primary outcome was 30-day postoperative pulmonary complications (pneumonia, acute respiratory distress syndrome, unexpected ventilation). RESULTS: Of 9,171 patients from 447 hospitals in 55 countries, 2,481 were operated on in COVID-19-free surgical pathways. Patients who underwent surgery within COVID-19-free surgical pathways were younger with fewer comorbidities than those in hospitals with no defined pathway but with similar proportions of major surgery. After adjustment, pulmonary complication rates were lower with COVID-19-free surgical pathways (2.2% v 4.9%; adjusted odds ratio [aOR], 0.62; 95% CI, 0.44 to 0.86). This was consistent in sensitivity analyses for low-risk patients (American Society of Anesthesiologists grade 1/2), propensity score-matched models, and patients with negative SARS-CoV-2 preoperative tests. The postoperative SARS-CoV-2 infection rate was also lower in COVID-19-free surgical pathways (2.1% v 3.6%; aOR, 0.53; 95% CI, 0.36 to 0.76). 
CONCLUSION: Within available resources, dedicated COVID-19-free surgical pathways should be established to provide safe elective cancer surgery during the current and any future SARS-CoV-2 outbreaks.
- …