Extracellular Vesicles from Mesenchymal Stromal Cells for the Treatment of Inflammation-Related Conditions
Over the past two decades, mesenchymal stromal cells (MSCs) have demonstrated great potential in the treatment of inflammation-related conditions. Numerous early-stage clinical trials have suggested that this treatment strategy has the potential to lead to significant improvements in clinical outcomes. While promising, there remain substantial regulatory hurdles, safety concerns, and logistical issues that need to be addressed before cell-based treatments can have widespread clinical impact. These drawbacks, along with research aimed at elucidating the mechanisms by which MSCs exert their therapeutic effects, have inspired the development of extracellular vesicles (EVs) as anti-inflammatory therapeutic agents. The use of MSC-derived EVs for treating inflammation-related conditions has shown therapeutic potential in both in vitro and small animal studies. This review will explore the current research landscape pertaining to the use of MSC-derived EVs as anti-inflammatory and pro-regenerative agents in a range of inflammation-related conditions: osteoarthritis, rheumatoid arthritis, Alzheimer's disease, cardiovascular disease, and preeclampsia. Along with this, the mechanisms by which MSC-derived EVs exert their beneficial effects on damaged or degenerative tissues will be reviewed, giving insight into their therapeutic potential. Challenges and future perspectives on the use of MSC-derived EVs for the treatment of inflammation-related conditions will be discussed.
Neutron Thermal Cross Sections, Westcott Factors, Resonance Integrals, Maxwellian Averaged Cross Sections and Astrophysical Reaction Rates Calculated from Major Evaluated Data Libraries
We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections, and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and the European Activation File. Extensive analysis of newly evaluated neutron reaction cross sections, neutron covariances, and improvements in data-processing techniques motivated us to calculate nuclear-industry and neutron-physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented.
Comment: 145 pages, 15 figures, 19 tables
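For reference, the Maxwellian-averaged cross section (MACS) named in the abstract has a standard textbook definition; the formula below is that conventional definition, not an equation reproduced from the paper itself:

```latex
\sigma_{\mathrm{MACS}}(kT)
  = \frac{2}{\sqrt{\pi}}\,\frac{1}{(kT)^{2}}
    \int_{0}^{\infty} \sigma(E)\, E\, e^{-E/kT}\, dE
```

Here $\sigma(E)$ is the energy-dependent cross section and $kT$ the thermal energy of the Maxwell-Boltzmann distribution; the corresponding astrophysical reaction rate is proportional to $v_{T}\,\sigma_{\mathrm{MACS}}$, with $v_{T}=\sqrt{2kT/m}$ the most probable relative velocity.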
Increased axonal bouton dynamics in the aging mouse cortex
Aging is a major risk factor for many neurological diseases and is associated with mild cognitive decline. Previous studies suggest that aging is accompanied by reduced synapse number and synaptic plasticity in specific brain regions. However, most studies to date used either postmortem or ex vivo preparations and lacked key in vivo evidence. Thus, whether neuronal arbors and synaptic structures remain dynamic in the intact aged brain and whether specific synaptic deficits arise during aging remains unknown. Here we used in vivo two-photon imaging and a unique analysis method to rigorously measure and track the size and location of axonal boutons in aged mice. Unexpectedly, the aged cortex shows circuit-specific increased rates of axonal bouton formation, elimination, and destabilization. Compared with the young adult brain, large (i.e., strong) boutons show 10-fold higher rates of destabilization and 20-fold higher turnover in the aged cortex. Size fluctuations of persistent boutons, believed to encode long-term memories, are also larger in the aged brain, whereas bouton size and density are not affected. Our data uncover a striking and unexpected increase in axonal bouton dynamics in the aged cortex. The increased turnover and destabilization rates of large boutons indicate that learning and memory deficits in the aged brain arise not through an inability to form new synapses but rather through decreased synaptic tenacity. Overall, our study suggests that increased synaptic structural dynamics in specific cortical circuits may be a mechanism for age-related cognitive decline.
Beyond Volume: The Impact of Complex Healthcare Data on the Machine Learning Pipeline
From medical charts to the national census, healthcare has traditionally operated under a paper-based paradigm. However, the past decade has marked a long and arduous transformation bringing healthcare into the digital age. Ranging from electronic health records, to digitized imaging and laboratory reports, to public health datasets, healthcare now generates an incredible amount of digital information. Such a wealth of data presents an exciting opportunity for integrated machine learning solutions to address problems across multiple facets of healthcare practice and administration. Unfortunately, the ability to derive accurate and informative insights requires more than the ability to execute machine learning models. Rather, a deeper understanding of the data on which the models are run is imperative for their success. While a significant effort has been undertaken to develop models able to process the volume of data obtained during the analysis of millions of digitized patient records, it is important to remember that volume represents only one aspect of the data. In fact, drawing on data from an increasingly diverse set of sources, healthcare data presents an incredibly complex set of attributes that must be accounted for throughout the machine learning pipeline. This chapter focuses on highlighting such challenges and is broken down into three distinct components, each representing a phase of the pipeline. We begin with attributes of the data accounted for during preprocessing, then move to considerations during model building, and end with challenges to the interpretation of model output. For each component, we present a discussion around data as it relates to the healthcare domain and offer insight into the challenges each may impose on the efficiency of machine learning techniques.
Comment: Healthcare Informatics, Machine Learning, Knowledge Discovery; 20 pages, 1 figure
Activated Ion Electron Capture Dissociation (AI ECD) of proteins: synchronization of infrared and electron irradiation with ion magnetron motion.
Here, we show that to perform activated ion electron capture dissociation (AI-ECD) in a Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometer equipped with a CO2 laser, it is necessary to synchronize both infrared irradiation and electron capture dissociation with ion magnetron motion. This requirement is essential for instruments in which the infrared laser is angled off-axis, such as the Thermo Finnigan LTQ FT. Generally, the electron irradiation time required for proteins is much shorter (ms) than that required for peptides (tens of ms), and the modulation of ECD, AI ECD, and infrared multiphoton dissociation (IRMPD) with ion magnetron motion is more pronounced. We have optimized AI ECD for ubiquitin, cytochrome c, and myoglobin; however, the results can be extended to other proteins. We demonstrate that pre-ECD and post-ECD activation are physically different and display different kinetics. We also demonstrate how, by use of appropriate AI ECD time sequences and normalization, the kinetics of protein gas-phase refolding can be deconvoluted from the diffusion of the ion cloud and measured on a time scale longer than the period of ion magnetron motion.
Multiple Imputation Ensembles (MIE) for dealing with missing data
Missing data is a significant issue in many real-world datasets, yet there are no robust methods for dealing with it appropriately. In this paper, we propose a robust approach to dealing with missing data in classification problems: Multiple Imputation Ensembles (MIE). Our method integrates two approaches, multiple imputation and ensemble methods, and compares two types of ensembles: bagging and stacking. We also propose a robust experimental set-up using 20 benchmark datasets from the UCI machine learning repository. For each dataset, we introduce increasing amounts of data Missing Completely at Random. Firstly, we use a number of single/multiple imputation methods to recover the missing values, and then we ensemble a number of different classifiers built on the imputed data. We assess the quality of the imputation by using dissimilarity measures. We also evaluate the MIE performance by comparing classification accuracy on the complete and imputed data. Furthermore, we use the accuracy of simple imputation as a benchmark for comparison. We find that our proposed approach combining multiple imputation with ensemble techniques outperforms the others, particularly as the amount of missing data increases.
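The core idea described above, combining several stochastic imputations of the same data with an ensemble of classifiers, can be sketched as follows. This is a minimal bagging-style illustration using scikit-learn; the specific imputer, classifier, and parameter choices are assumptions for the sketch, not the paper's actual experimental set-up:

```python
# Minimal sketch of a Multiple Imputation Ensemble (MIE): several stochastic
# imputations of MCAR data, one classifier per imputation, majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# Introduce 20% values Missing Completely at Random (MCAR).
mask = rng.random(X.shape) < 0.2
X_miss = X.copy()
X_miss[mask] = np.nan

X_tr, X_te, y_tr, y_te = train_test_split(X_miss, y, random_state=0)

# Multiple imputation: each ensemble member gets its own stochastic
# imputation of the training data and its own classifier.
preds = []
for seed in range(5):
    imp = IterativeImputer(random_state=seed, sample_posterior=True)
    clf = DecisionTreeClassifier(random_state=seed)
    clf.fit(imp.fit_transform(X_tr), y_tr)
    preds.append(clf.predict(imp.transform(X_te)))

# Majority vote across the ensemble members (binary labels).
vote = (np.mean(preds, axis=0) >= 0.5).astype(int)
accuracy = (vote == y_te).mean()
print(f"ensemble accuracy: {accuracy:.2f}")
```

The stacking variant mentioned in the abstract would replace the majority vote with a meta-classifier trained on the members' predictions.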
AWPP: A New Scheme for Wireless Access Control Proportional to Traffic Priority and Rate
Cutting-edge wireless networking approaches are required to efficiently differentiate traffic and handle it according to its special characteristics. The current Medium Access Control (MAC) scheme, which is expected to be sufficiently supported by well-known networking vendors, comes from the IEEE 802.11e working group. The standardized solution is the Hybrid Coordination Function (HCF), which includes the mandatory Enhanced Distributed Channel Access (EDCA) protocol and the optional HCF Controlled Channel Access (HCCA) protocol. These two protocols differ greatly in nature, and both have significant limitations. The objective of this work is the development of a high-performance MAC scheme for wireless networks, capable of providing predictable Quality of Service (QoS) via an efficient traffic differentiation algorithm in proportion to the traffic priority and generation rate. The proposed Adaptive Weighted and Prioritized Polling (AWPP) protocol is analyzed, and its superior deterministic operation is revealed.
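The notion of polling in proportion to both traffic priority and generation rate can be illustrated with a toy weight calculation. The product-of-priority-and-rate weighting below is an assumption chosen for illustration, not the actual AWPP weighting formula, and the station names are hypothetical:

```python
# Toy illustration of weighted polling: each station's share of polling
# opportunities is proportional to (priority x generation rate).
# NOTE: this weighting rule is an assumption, not the AWPP specification.
from dataclasses import dataclass


@dataclass
class Station:
    name: str
    priority: int      # higher value = more important traffic class
    rate_kbps: float   # traffic generation rate


def polling_weights(stations):
    """Return each station's fraction of polling opportunities."""
    raw = {s.name: s.priority * s.rate_kbps for s in stations}
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}


stations = [
    Station("voice", priority=4, rate_kbps=64),
    Station("video", priority=3, rate_kbps=512),
    Station("data", priority=1, rate_kbps=256),
]
weights = polling_weights(stations)
# A coordinator would then schedule polls in proportion to these weights,
# so high-priority, high-rate stations are visited more often per cycle.
```

A scheme like this is deterministic in the sense that, given the declared priorities and rates, each station's share of the channel is fixed in advance rather than won through contention.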
Rhythmic dynamics and synchronization via dimensionality reduction : application to human gait
Reliable characterization of the locomotor dynamics of human walking is vital to understanding the neuromuscular control of human locomotion and to disease diagnosis. However, the inherent oscillation and ubiquity of noise in such non-strictly periodic signals pose great challenges to current methodologies. To this end, we exploit state-of-the-art pattern recognition technology, specifically dimensionality reduction techniques, and propose to reconstruct and characterize the dynamics accurately on the cycle scale of the signal. This is achieved by deriving a low-dimensional representation of the cycles through global optimization, which effectively preserves the topology of the cycles that are embedded in a high-dimensional Euclidean space. Our approach demonstrates a clear advantage over traditional methods in capturing the intrinsic dynamics and probing the subtle synchronization patterns of univariate and bivariate oscillatory signals. Application to human gait data from healthy subjects and diabetics reveals a significant difference in the dynamics of ankle movements and ankle-knee coordination, but not in knee movements. These results indicate that the impaired sensory feedback from the feet due to diabetes does not influence the knee movement in general, and that normal human walking is not critically dependent on feedback from the peripheral nervous system.
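The idea of treating each cycle as a point in a high-dimensional space and deriving a low-dimensional, topology-preserving representation can be sketched with a manifold-learning routine. Isomap here merely stands in for the paper's global-optimization embedding, and the synthetic sine-wave "cycles" are an assumption for illustration:

```python
# Sketch: embed individual cycles of a noisy oscillatory signal into 2-D.
# Isomap is a stand-in for the paper's embedding method, not the method itself.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100)

# 40 noisy gait-like cycles, each a point in a 100-dimensional Euclidean space.
cycles = np.array([
    np.sin(t) * (1 + 0.1 * rng.standard_normal())
    + 0.05 * rng.standard_normal(t.size)
    for _ in range(40)
])

# Topology-preserving low-dimensional representation of the cycles.
embedding = Isomap(n_neighbors=5, n_components=2).fit_transform(cycles)
# Each row of `embedding` is one cycle; distances between rows reflect
# cycle-to-cycle variability of the rhythm.
```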
Chapter 11: Challenges in and Principles for Conducting Systematic Reviews of Genetic Tests used as Predictive Indicators
In this paper, we discuss common challenges in and principles for conducting systematic reviews of genetic tests. The types of genetic tests discussed are those used to (1) determine risk or susceptibility in asymptomatic individuals; (2) reveal prognostic information to guide clinical management in those with a condition; or (3) predict response to treatments or environmental factors. This paper is not intended to provide comprehensive guidance on evaluating all genetic tests. Rather, it focuses on issues that have been of particular concern to analysts and stakeholders and on areas that are of particular relevance for the evaluation of studies of genetic tests. The key points include:
- The general principles that apply in evaluating genetic tests are similar to those for other prognostic or predictive tests, but there are differences in how the principles need to be applied or the degree to which certain issues are relevant.
- A clear definition of the clinical scenario and an analytic framework is important when evaluating any test, including genetic tests.
- Organizing frameworks and analytic frameworks are useful constructs for approaching the evaluation of genetic tests.
- In constructing an analytic framework for evaluating a genetic test, analysts should consider preanalytic, analytic, and postanalytic factors; such factors are useful when assessing analytic validity.
- Predictive genetic tests are generally characterized by a delayed time between testing and clinically important events.
- Finding published information on the analytic validity of some genetic tests may be difficult. Web sites (FDA or diagnostic companies) and gray literature may be important sources.
- In situations where clinical factors associated with risk are well characterized, comparative effectiveness reviews should assess the added value of using genetic testing along with known factors compared with using the known factors alone.
- For genome-wide association studies, reviewers should determine whether the association has been validated in multiple studies to minimize both potential confounding and publication bias. In addition, reviewers should note whether appropriate adjustments for multiple comparisons were used.