Explainable Brain Age Prediction using coVariance Neural Networks
In computational neuroscience, there has been an increased interest in
developing machine learning algorithms that leverage brain imaging data to
provide estimates of "brain age" for an individual. Importantly, the
discordance between brain age and chronological age (referred to as "brain age
gap") can capture accelerated aging due to adverse health conditions and
therefore, can reflect increased vulnerability towards neurological disease or
cognitive impairments. However, widespread adoption of brain age for clinical
decision support has been hindered by the lack of transparency and
methodological justifications in most existing brain age prediction algorithms.
In this paper, we leverage coVariance neural networks (VNN) to propose an
anatomically interpretable framework for brain age prediction using cortical
thickness features. Specifically, our brain age prediction framework extends
beyond the coarse metric of brain age gap in Alzheimer's disease (AD) and we
make two important observations: (i) VNNs can assign anatomical
interpretability to elevated brain age gap in AD by identifying contributing
brain regions, (ii) the interpretability offered by VNNs is contingent on their
ability to exploit specific eigenvectors of the anatomical covariance matrix.
Together, these observations facilitate an explainable perspective on the task
of brain age prediction.
Comment: arXiv admin note: substantial text overlap with arXiv:2305.0180
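As a rough illustration of the covariance-filter idea underlying VNNs, the following minimal numpy sketch (an assumed general form, not the authors' code) applies a polynomial filter of the anatomical covariance matrix to cortical-thickness features and computes a brain age gap; all data, dimensions, and filter taps are hypothetical.

```python
import numpy as np

def covariance_filter(X, h):
    """Apply z = sum_k h[k] * C^k x to each subject's feature vector (rows of X)."""
    C = np.cov(X, rowvar=False)              # sample anatomical covariance matrix (m x m)
    Z = np.zeros_like(X)
    Ck = np.eye(C.shape[0])
    for hk in h:
        Z += hk * (X @ Ck)                   # C is symmetric, so right-multiplication applies C^k
        Ck = Ck @ C
    return Z

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 68))               # 100 subjects x 68 cortical regions (hypothetical)
age = rng.uniform(55, 85, size=100)          # chronological ages (hypothetical)
Z = np.tanh(covariance_filter(X, h=[0.5, 0.3, 0.1]))   # one VNN-style layer
w, *_ = np.linalg.lstsq(Z, age, rcond=None)  # simple linear readout standing in for the VNN head
brain_age = Z @ w
brain_age_gap = brain_age - age              # the "brain age gap" discussed in the abstract
print(brain_age_gap[:5])
```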
Noise Reduction Using Singular Value Decomposition with Jensen–Shannon Divergence for Coronary Computed Tomography Angiography
Coronary computed tomography angiography (CCTA) is widely used due to improvements in computed tomography (CT) diagnostic performance. Unlike other CT examinations, CCTA requires shorter rotation times of the X-ray tube, improving the temporal resolution and facilitating imaging of the beating heart in a stationary state. However, the reconstructed CT images, including those of the coronary arteries, contain insufficient X-ray photons and considerable noise. In this study, we introduce an image-processing technique for noise reduction in CCTA images using singular value decomposition (SVD). The SVD threshold was determined by minimizing the Jensen–Shannon (JS) divergence. Experiments were performed on numerical phantoms with varying levels of noise, and the determined threshold value was then used to reduce noise in clinical CCTA images. On the numerical phantoms, the proposed method produced images of about 10% higher quality than the conventional noise reduction method when compared quantitatively using SSIM. The threshold value determined by minimizing the JS divergence was found to be useful for efficient noise reduction in actual clinical images, depending on the level of noise.
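To make the SVD-thresholding idea concrete, here is a minimal Python sketch. The abstract does not state which distributions the JS divergence is computed between, so the rank-selection criterion below (comparing the discarded residual to a Gaussian of matching spread) is purely an illustrative assumption, as are the synthetic image and parameters.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def _hist(x, bins, lim):
    p, _ = np.histogram(x, bins=bins, range=lim, density=True)
    return p + 1e-12

def svd_denoise_js(img, bins=128):
    """Truncated-SVD denoising; rank chosen by a JS-divergence score (assumed criterion)."""
    U, s, Vt = np.linalg.svd(img, full_matrices=False)
    best_k, best_score = 1, np.inf
    for k in range(1, len(s)):
        recon = (U[:, :k] * s[:k]) @ Vt[:k]
        resid = img - recon                          # what the truncation throws away
        lim = (resid.min(), resid.max())
        gauss = np.random.default_rng(0).normal(0.0, resid.std() + 1e-12, resid.size)
        score = jensenshannon(_hist(resid, bins, lim), _hist(gauss, bins, lim))
        if score < best_score:                       # residual most noise-like
            best_k, best_score = k, score
    return (U[:, :best_k] * s[:best_k]) @ Vt[:best_k], best_k

noisy = np.random.default_rng(1).normal(0, 20, (128, 128)) + 100.0   # stand-in for a CCTA slice
denoised, rank = svd_denoise_js(noisy)
```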
Learning and Control of Dynamical Systems
Despite the remarkable success of machine learning in various domains in recent years, our understanding of its fundamental limitations remains incomplete. This knowledge gap poses a grand challenge when deploying machine learning methods in critical decision-making tasks, where incorrect decisions can have catastrophic consequences. To effectively utilize these learning-based methods in such contexts, it is crucial to explicitly characterize their performance. Over the years, significant research efforts have been dedicated to learning and control of dynamical systems where the underlying dynamics are unknown or only partially known a priori and must be inferred from collected data. However, many of these classical results have focused on asymptotic guarantees, providing limited insight into the amount of data required to achieve desired control performance while satisfying operational constraints such as safety and stability, especially in the presence of statistical noise.
In this thesis, we study the statistical complexity of learning and control of unknown dynamical systems. By utilizing recent advances in statistical learning theory, high-dimensional statistics, and control theoretic tools, we aim to establish a fundamental understanding of the number of samples required to achieve desired (i) accuracy in learning the unknown dynamics, (ii) performance in the control of the underlying system, and (iii) satisfaction of the operational constraints such as safety and stability. We provide finite-sample guarantees for these objectives and propose efficient learning and control algorithms that achieve the desired performance at these statistical limits in various dynamical systems. Our investigation covers a broad range of dynamical systems, starting from fully observable linear dynamical systems to partially observable linear dynamical systems, and ultimately, nonlinear systems.
We deploy our learning and control algorithms in various adaptive control tasks in real-world control systems and demonstrate their strong empirical performance along with their learning, robustness, and stability guarantees. In particular, we implement one of our proposed methods, Fourier Adaptive Learning and Control (FALCON), on an experimental aerodynamic testbed under extreme turbulent flow dynamics in a wind tunnel. The results show that FALCON achieves state-of-the-art stabilization performance and consistently outperforms conventional and other learning-based methods by at least 37%, despite using 8 times less data. The superior performance of FALCON arises from its physically and theoretically accurate modeling of the underlying nonlinear turbulent dynamics, which yields rigorous finite-sample learning and performance guarantees. These findings underscore the importance of characterizing the statistical complexity of learning and control of unknown dynamical systems.
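As a pocket-sized illustration of the kind of learning problem studied in this line of work (not the thesis's FALCON method), the sketch below identifies an unknown linear system x_{t+1} = A x_t + B u_t + w_t by least squares from a single trajectory; the system matrices, noise level, and horizon are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, T = 3, 1, 500
A = np.array([[0.9, 0.2, 0.0], [0.0, 0.8, 0.3], [0.0, 0.0, 0.7]])   # hypothetical true dynamics
B = np.array([[0.0], [0.5], [1.0]])

X = np.zeros((T + 1, n))
U = rng.normal(size=(T, m))                        # exciting (random) inputs
for t in range(T):
    X[t + 1] = A @ X[t] + B @ U[t] + 0.05 * rng.normal(size=n)

# Regress x_{t+1} on [x_t, u_t] to recover [A B] jointly from T samples.
Z = np.hstack([X[:-1], U])                         # (T, n+m)
Theta, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)  # (n+m, n)
A_hat, B_hat = Theta[:n].T, Theta[n:].T
print("estimation errors:", np.linalg.norm(A_hat - A), np.linalg.norm(B_hat - B))
```

Finite-sample analyses of this estimator (how the errors above shrink with T under noise) are the basic building block behind the guarantees discussed in the abstract.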
Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5
This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of applications and in mathematics, and is available in open access. The collected contributions of this volume have either been published or presented after disseminating the fourth volume in 2015 in international conferences, seminars, workshops and journals, or they are new. The contributions of each part of this volume are chronologically ordered.
The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignment in the fusion of sources of evidence, with their Matlab codes.
Because more applications of DSmT have emerged in recent years since the appearance of the fourth DSmT book in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification.
Finally, the third part presents interesting contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, as well as hybrid techniques mixing deep learning with belief functions.
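For readers unfamiliar with PCR rules, the following short Python sketch implements the standard PCR5 combination of two basic belief assignments (a generic illustration, not the Matlab codes or the Silx-Furtif RUST library mentioned above); the frame and mass values are invented.

```python
from itertools import product

def pcr5(m1, m2):
    """PCR5 fusion of two bbas given as dicts {frozenset focal element: mass}."""
    fused = {}
    for (X, a), (Y, b) in product(m1.items(), m2.items()):
        inter = X & Y
        if inter:
            fused[inter] = fused.get(inter, 0.0) + a * b        # conjunctive part
        elif a + b > 0:
            # Redistribute the partial conflict a*b back to X and Y
            # proportionally to the masses that generated it.
            fused[X] = fused.get(X, 0.0) + a * a * b / (a + b)
            fused[Y] = fused.get(Y, 0.0) + b * b * a / (a + b)
    return fused

A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.6, B: 0.4}
m2 = {A: 0.3, B: 0.7}
print(pcr5(m1, m2))   # fused masses still sum to 1.0
```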
Sustainable Reservoir Management Approaches under Impacts of Climate Change - A Case Study of Mangla Reservoir, Pakistan
Reservoir sedimentation is a major issue for water resource management around the world. It has serious economic, environmental, and social consequences, such as reduced water storage capacity, increased flooding risk, decreased hydropower generation, and deteriorated water quality. Increased rainfall intensity, higher temperatures, and more extreme weather events due to climate change are expected to exacerbate the problem of reservoir sedimentation. As a result, sedimentation must be managed to ensure the long-term viability of reservoirs and their associated infrastructure. Effective reservoir sedimentation management in the face of climate change necessitates an understanding of the sedimentation process and the factors that influence it, such as land use practices, erosion, and climate. Monitoring and modelling sedimentation rates are also useful tools for forecasting future impacts and making management decisions.
The goal of this research is to create long-term reservoir management strategies in the face of climate change by simulating the effects of various reservoir operating strategies on reservoir sedimentation and sediment delta movement at Mangla Reservoir in Pakistan (the second-largest dam in the country). A framework was developed to assess sedimentation and reservoir life at the Mangla Reservoir. This framework combines hydrological and morphodynamic models with various soft computing models. In addition to taking climate change uncertainty into account, the proposed framework also incorporates sediment source, sediment delivery, and changes in reservoir morphology. Furthermore, this study aims to provide a practical methodology based on the limited data available.
The first phase of this study investigated how to accurately quantify missing suspended sediment load (SSL) data in rivers using various techniques, such as sediment rating curves (SRC) and soft computing models (SCMs), including local linear regression (LLR), artificial neural networks (ANN), and wavelet-cum-ANN (WANN). Further, the Gamma test and M-test were performed to select the best input variables and an appropriate data length for SCM development. Based on an evaluation of the outcomes of all leading models for SSL estimation, it can be concluded that SCMs are more effective than SRC approaches. The results also indicated that the WANN model was the most accurate for reconstructing the SSL time series because it is capable of identifying the salient characteristics of a data series.
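As a minimal illustration of the simplest technique in that list, the sketch below fits a sediment rating curve SSL = a * Q^b by log-log regression; it is a textbook example with synthetic discharge and sediment values, not the thesis's calibrated models.

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.uniform(50, 2000, size=200)                     # discharge (m^3/s), hypothetical
SSL = 0.02 * Q**1.6 * np.exp(rng.normal(0, 0.3, 200))   # synthetic sediment load (t/day)

# Fit log(SSL) = log(a) + b * log(Q) by ordinary least squares.
b, log_a = np.polyfit(np.log(Q), np.log(SSL), 1)
a = np.exp(log_a)

# Reconstruct SSL for flows with missing sediment observations.
SSL_missing = a * np.array([120.0, 850.0])**b
print(a, b, SSL_missing)
```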
The second phase of this study examined the feasibility of using four satellite precipitation datasets (SPDs), namely GPM, PERSIANN_CDR, CHIRPS, and CMORPH, to predict streamflow and sediment loads (SL) in a poorly gauged mountainous catchment, employing the SWAT hydrological model as well as SWAT-coupled soft computing models (SCMs) such as artificial neural networks (SWAT-ANN), random forests (SWAT-RF), and support vector regression (SWAT-SVR). The SCMs were developed using the outputs of uncalibrated SWAT hydrological models to improve the predictions. The results indicate that, over the entire simulation, GPM shows the best performance in both schemes, while PERSIANN_CDR and CHIRPS also perform well, whereas CMORPH predicts streamflow for the Upper Jhelum River Basin (UJRB) with relatively poor performance. Among the best GPM-based models, SWAT-RF offered the best performance for simulating the entire streamflow, while SWAT-ANN excelled at simulating the SL. Hence, hydrologically coupled SCMs based on SPDs can be an effective technique for simulating streamflow and SL, particularly in complex terrain where gauge network density is low or uneven.
The third and last phase of this study investigated the impact of different reservoir operating strategies on Mangla Reservoir sedimentation using a 1D sediment transport model. To improve the accuracy of the model, more accurate boundary conditions for flow and sediment load (derived from the first and second phases of this study) were incorporated into the numerical model so that the subsequent morphodynamic model could precisely predict bed level changes under given climate conditions. Further, to assess the long-term effect of a changing climate, a Global Climate Model (GCM) under Representative Concentration Pathway (RCP) scenarios 4.5 and 8.5 for the 21st century was used. The long-term modelling results showed that a gradual increase in the reservoir minimum operating level (MOL) slows down the rate of delta movement and of bed-level change close to the dam. However, it may compromise the downstream irrigation supply during periods of high water demand. The findings may help reservoir managers improve the reservoir operation rules and ultimately support the objective of sustainable reservoir use for societal benefit.
In summary, this study provides comprehensive insights into reservoir sedimentation phenomena and recommends an operational strategy that is both feasible and sustainable over the long term under the impact of climate change, especially where data are scarce. Improving the accuracy of sediment load estimates is essential for designing reservoir structures and for adapting operating plans to incoming sediment loads, ensuring accurate reservoir lifespan predictions. Furthermore, highly accurate streamflow forecasts, particularly when on-site data are limited, are important and can be achieved by using satellite-based precipitation data in conjunction with hydrological and soft computing models. Ultimately, the use of soft computing methods yields significantly improved sediment load and discharge inputs, enabling one-dimensional hydro-morphodynamic numerical models to evaluate sediment dynamics and reservoir useful life under the influence of climate change at various operating conditions.
Chapter 1: Introduction
Chapter 2: Reconstruction of Sediment Load Data in Rivers
Chapter 3: Assessment of the Hydrological and Coupled Soft Computing Models, Based on Different Satellite Precipitation Datasets, to Simulate Streamflow and Sediment Load in a Mountainous Catchment
Chapter 4: Simulating the Impact of Climate Change with Different Reservoir Operating Strategies on Sedimentation of the Mangla Reservoir, Northern Pakistan
Chapter 5: Conclusions and Recommendation
Complex systems methods characterizing nonlinear processes in the near-Earth electromagnetic environment: recent advances and open challenges
Learning from successful applications of methods originating in statistical mechanics, complex systems science, or information theory in one scientific field (e.g., atmospheric physics or climatology) can provide important insights or conceptual ideas for other areas (e.g., space sciences) or even stimulate new research questions and approaches. For instance, quantification and attribution of dynamical complexity in output time series of nonlinear dynamical systems is a key challenge across scientific disciplines. Especially in the field of space physics, an early and accurate detection of characteristic dissimilarity between normal and abnormal states (e.g., pre-storm activity vs. magnetic storms) has the potential to vastly improve space weather diagnosis and, consequently, the mitigation of space weather hazards.
This review provides a systematic overview of existing nonlinear dynamical systems-based methodologies along with key results of their previous applications in a space physics context, which particularly illustrates how complementary modern complex systems approaches have recently shaped our understanding of nonlinear magnetospheric variability. The rising number of corresponding studies demonstrates that the multiplicity of nonlinear time series analysis methods developed over the last decades offers great potential for uncovering relevant yet complex processes interlinking different geospace subsystems, variables, and spatiotemporal scales.
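As one concrete example of the dynamical-complexity measures of the kind surveyed in such reviews (chosen here purely for illustration, not singled out by the authors), the sketch below computes the normalized permutation entropy of a time series; the input signals are synthetic.

```python
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy in [0, 1]; higher values mean more complex dynamics."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1       # ordinal pattern of the window
    p = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    return -np.sum(p * np.log2(p)) / np.log2(factorial(order))

t = np.linspace(0, 20 * np.pi, 2000)
print(permutation_entropy(np.sin(t)))                                    # regular signal: low value
print(permutation_entropy(np.random.default_rng(0).normal(size=2000)))   # white noise: close to 1
```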