
    Using outlier elimination to assess learning-based correspondence matching methods

    Deep learning (DL) has recently been widely used in correspondence matching. Learning-based models are usually trained on benign image pairs with partial overlap. Since DL models are data-dependent, non-overlapping images can be used as poison samples to fool a model into producing false registrations. In this study, we propose an outlier elimination based assessment method (OEAM) to assess the registrations produced by learning-based correspondence matching methods on partially overlapping and non-overlapping images. OEAM first eliminates outliers based on spatial paradox. It then assesses the registration in two streams using the resulting core correspondence set: if the cardinality of the core set is sufficiently small, the input registration is assessed as low quality; otherwise, it is assessed as high quality, and OEAM improves its registration performance using the core set. OEAM is a post-processing technique imposed on learning-based methods. Comparison experiments are carried out on outdoor (YFCC100M) and indoor (SUN3D) datasets using four deep learning-based methods. The experimental results on registrations of partially overlapping images show that OEAM can reliably identify low-quality registrations and improve performance on high-quality registrations. The experiments on registrations of non-overlapping images demonstrate that learning-based methods are vulnerable to poisoning attacks launched with non-overlapping images, and that OEAM is robust against such attacks.
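    The two-stream "assess, then refine" logic described above can be sketched in a few lines. The snippet below is a minimal illustration only: it assumes matched keypoints and camera intrinsics are available, uses plain RANSAC on the essential matrix as a generic stand-in for OEAM's spatial-paradox outlier elimination, and the threshold min_core_size is an illustrative parameter rather than a value from the paper.

        # Hypothetical sketch of an assess-then-refine post-processing step.
        import cv2
        import numpy as np

        def assess_registration(pts0, pts1, K, min_core_size=30):
            """pts0, pts1: (N, 2) matched pixel coordinates; K: 3x3 camera intrinsics."""
            # Stand-in for OEAM's outlier elimination: RANSAC on the essential matrix.
            E, mask = cv2.findEssentialMat(pts0, pts1, K, cv2.RANSAC, 0.999, 1.0)
            core = mask.ravel().astype(bool) if mask is not None else np.zeros(len(pts0), bool)
            # Stream 1: a very small core set suggests a low-quality registration,
            # e.g. a non-overlapping (potentially poisoned) image pair.
            if core.sum() < min_core_size:
                return {"quality": "low", "pose": None}
            # Stream 2: enough support, so re-estimate the relative pose from the
            # core correspondences only, which typically improves the registration.
            _, R, t, _ = cv2.recoverPose(E, pts0[core], pts1[core], K)
            return {"quality": "high", "pose": (R, t)}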

    Flood dynamics derived from video remote sensing

    Flooding is by far the most pervasive natural hazard, with the human impacts of floods expected to worsen in the coming decades due to climate change. Hydraulic models are a key tool for understanding flood dynamics and play a pivotal role in unravelling the processes that occur during a flood event, including inundation flow patterns and velocities. In the realm of river basin dynamics, video remote sensing is emerging as a transformative tool that can offer insights into flow dynamics and thus, together with other remotely sensed data, has the potential to be deployed to estimate discharge. Moreover, the integration of video remote sensing data with hydraulic models offers a pivotal opportunity to enhance the predictive capacity of these models. Hydraulic models are traditionally built with accurate terrain, flow and bathymetric data and are often calibrated and validated using observed data to obtain meaningful and actionable model predictions. Data for accurately calibrating and validating hydraulic models are not always available, leaving the assessment of the predictive capabilities of some models deployed in flood risk management in question. Recent advances in remote sensing have heralded the availability of vast, high-resolution video datasets. The parallel evolution of computing capabilities, coupled with advancements in artificial intelligence, is enabling the processing of data at unprecedented scales and complexities, allowing us to glean meaningful insights from datasets that can be integrated with hydraulic models. The aims of the research presented in this thesis were twofold. The first aim was to evaluate and explore the potential applications of video from air- and space-borne platforms to comprehensively calibrate and validate two-dimensional hydraulic models. The second aim was to estimate river discharge using satellite video combined with high-resolution topographic data. In the first of three empirical chapters, non-intrusive image velocimetry techniques were employed to estimate river surface velocities in a rural catchment. For the first time, a 2D hydraulic model was fully calibrated and validated using velocities derived from Unpiloted Aerial Vehicle (UAV) image velocimetry approaches. This highlighted the value of these data in mitigating the limitations associated with traditional data sources used in parameterizing two-dimensional hydraulic models. This finding inspired the subsequent chapter, where river surface velocities, derived using Large Scale Particle Image Velocimetry (LSPIV), and flood extents, derived using deep neural network-based segmentation, were extracted from satellite video and used to rigorously assess the skill of a two-dimensional hydraulic model. Harnessing the ability of deep neural networks to learn complex features and deliver accurate and contextually informed flood segmentation, the potential value of satellite video for validating two-dimensional hydraulic model simulations is demonstrated. In the final empirical chapter, the convergence of satellite video imagery and high-resolution topographical data bridges the gap between visual observations and quantitative measurements by enabling the direct extraction of velocities from video imagery, which is used to estimate river discharge. Overall, this thesis demonstrates the significant potential of emerging video-based remote sensing datasets and offers approaches for integrating these data into hydraulic modelling and discharge estimation practice.
    The incorporation of LSPIV techniques into flood modelling workflows signifies a methodological progression, especially in areas lacking robust data collection infrastructure. Satellite video remote sensing heralds a major step forward in our ability to observe river dynamics in real time, with potentially significant implications for flood modelling science.
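    The discharge estimation step described above can be illustrated with a short velocity-area calculation. The sketch below assumes image-velocimetry surface velocities and a surveyed cross-section; the 0.85 factor converting surface to depth-averaged velocity is a commonly assumed velocity index, not a value taken from this thesis, and all variable names are illustrative.

        import numpy as np

        def estimate_discharge(station_y, depth, surface_velocity, velocity_index=0.85):
            """station_y: cross-channel position (m); depth: water depth (m);
            surface_velocity: image-velocimetry surface speed (m/s) at each station."""
            v_mean = velocity_index * np.asarray(surface_velocity)   # depth-averaged velocity
            unit_q = np.asarray(depth) * v_mean                      # discharge per unit width (m^2/s)
            widths = np.gradient(np.asarray(station_y))              # local station spacing (m)
            return float(np.sum(unit_q * widths))                    # total discharge (m^3/s)

        # Example with made-up numbers: a 20 m wide section sampled every metre.
        y = np.linspace(0.0, 20.0, 21)
        d = 1.5 * np.sin(np.pi * y / 20.0)        # idealised bathymetry
        v_surf = 0.9 * np.ones_like(y)            # uniform 0.9 m/s surface velocity
        print(f"Q ~ {estimate_discharge(y, d, v_surf):.1f} m^3/s")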

    On the Generation of Realistic and Robust Counterfactual Explanations for Algorithmic Recourse

    The recent widespread deployment of machine learning algorithms presents many new challenges. Machine learning algorithms are usually opaque and can be particularly difficult to interpret. When humans are involved, algorithmic and automated decisions can negatively impact people's lives. Therefore, end users would like to be protected against potential harm. One popular way to achieve this is to provide end users with access to algorithmic recourse, which gives those negatively affected by algorithmic decisions the opportunity to reverse unfavorable decisions, e.g., from a loan denial to a loan acceptance. In this thesis, we design recourse algorithms to meet various end user needs. First, we propose methods for the generation of realistic recourses. We use generative models to suggest recourses that are likely to occur under the data distribution. To this end, we shift the recourse action from the input space to the generative model's latent space, which allows us to generate counterfactuals that lie in regions with data support. Second, we observe that small changes applied to the recourses prescribed to end users are likely to invalidate the suggested recourse once it is noisily implemented in practice. Motivated by this observation, we design methods for the generation of robust recourses and for assessing the robustness of recourse algorithms to data deletion requests. Third, the lack of a commonly used code base for counterfactual explanation and algorithmic recourse algorithms, together with the vast array of evaluation measures in the literature, makes it difficult to compare the performance of different algorithms. To solve this problem, we provide an open-source benchmarking library that streamlines the evaluation process and can be used for benchmarking, rapidly developing new methods, and setting up new experiments. In summary, our work contributes to a more reliable interaction between end users and machine learning models by covering fundamental aspects of the recourse process, and it suggests new solutions towards generating realistic and robust counterfactual explanations for algorithmic recourse.
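    The latent-space idea in the first contribution can be sketched briefly. The snippet below assumes a pre-trained autoencoder (encoder/decoder) and a differentiable classifier returning the probability of the favourable outcome; the loss weights and optimisation settings are illustrative, and the thesis' own algorithms are more involved than this generic gradient search.

        import torch

        def latent_recourse(x, encoder, decoder, clf, target=0.5,
                            dist_weight=0.1, steps=200, lr=0.05):
            """Search for a counterfactual by optimising in the generative model's latent space."""
            z = encoder(x).detach().clone().requires_grad_(True)
            z_orig = z.detach().clone()
            opt = torch.optim.Adam([z], lr=lr)
            for _ in range(steps):
                opt.zero_grad()
                x_cf = decoder(z)
                # Push the favourable-class probability above the decision threshold...
                validity = torch.relu(target - clf(x_cf)).mean()
                # ...while staying close to the factual in latent space, so the
                # counterfactual remains in a region with data support.
                proximity = torch.norm(z - z_orig)
                (validity + dist_weight * proximity).backward()
                opt.step()
            return decoder(z).detach()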

    Multi-epoch machine learning for galaxy formation

    In this thesis I utilise a range of machine learning techniques in conjunction with hydrodynamical cosmological simulations. In Chapter 2 I present a novel machine learning method for predicting the baryonic properties of dark-matter-only subhalos taken from N-body simulations. The model is built using a tree-based algorithm and incorporates subhalo properties over a wide range of redshifts as its input features. I train the model using a hydrodynamical simulation, which enables it to predict black hole mass, gas mass, magnitudes, star formation rate, stellar mass, and metallicity. This new model surpasses the performance of previous models. Furthermore, I explore the predictive power of each input property by looking at feature importance scores from the tree-based model. By applying the method to the LEGACY N-body simulation I generate a large-volume mock catalog of the quasar population at z=3. By comparing this mock catalog with observations, I demonstrate that the IllustrisTNG subgrid model for black holes is not accurately capturing the growth of the most massive objects. In Chapter 3 I apply my method to investigate the evolution of galaxy properties in different simulations, and in various environments within a single simulation. By comparing the Illustris, EAGLE, and TNG simulations I show that subgrid model physics plays a more significant role than the choice of hydrodynamics method. Using the CAMELS simulation suite I consider the impact of cosmological and astrophysical parameters on the buildup of stellar mass within the TNG and SIMBA models. In the final chapter I apply a combination of neural networks and symbolic regression methods to construct a semi-analytic model which reproduces the galaxy population from a cosmological simulation. The neural-network-based approach is capable of producing a more accurate population than a previous method of binning based on halo mass. The equations resulting from symbolic regression are found to be a good approximation of the neural network.
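    The tree-based, multi-epoch mapping described in Chapter 2 can be illustrated with a toy example. The snippet below uses scikit-learn's gradient-boosted trees on synthetic dark-matter-only features taken at several redshifts to predict a single baryonic property, and ranks the epochs with permutation importance; the feature names and data are invented for illustration and do not reproduce the thesis setup.

        import numpy as np
        from sklearn.ensemble import HistGradientBoostingRegressor
        from sklearn.inspection import permutation_importance
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        # Toy stand-in for a matched subhalo catalogue: halo mass and Vmax at z = 2, 1, 0.
        names = ["Mhalo_z2", "Mhalo_z1", "Mhalo_z0", "Vmax_z2", "Vmax_z1", "Vmax_z0"]
        X = rng.lognormal(mean=0.0, sigma=0.5, size=(n, 6))
        y = 1.5 * np.log10(X[:, 2]) + 0.3 * np.log10(X[:, 5]) + rng.normal(0, 0.1, n)  # fake log M*

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = HistGradientBoostingRegressor(max_iter=300).fit(X_tr, y_tr)
        print("held-out R^2:", model.score(X_te, y_te))

        # Rank which input epoch/property matters most for the prediction.
        imp = permutation_importance(model, X_te, y_te, n_repeats=5, random_state=0)
        for name, score in sorted(zip(names, imp.importances_mean), key=lambda t: -t[1]):
            print(f"{name}: {score:.3f}")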

    Measuring and Correcting the Effects of Scintillation in Astronomy

    High-precision ground-based time-resolved photometry is significantly limited by the effects of the Earth's atmosphere. Optical atmospheric turbulence, produced by the mixing of layers of air of different temperatures, results in layers of spatially and temporally varying refractive index. These produce phase aberrations of the starlight, which have two effects: first, the point spread function is broadened, limiting the resolution; second, the propagation of these aberrations results in spatio-temporal intensity fluctuations in the pupil plane of the telescope, known as scintillation. The first effect can be corrected with adaptive optics; however, the scintillation noise remains. In this thesis, the results from testing a scintillation correction technique that uses tomographic wavefront sensing are presented. The technique was explored extensively in simulation before being tested on-sky on the Isaac Newton Telescope on La Palma, Spain. Scintillation noise also limits the signal-to-noise ratio that can be achieved in standard differential photometry, as the random noise fluctuations in the comparison-star and target-star light curves add in quadrature. A differential photometry technique that uses optimised temporal binning of the comparison star to minimise the addition of random noise fluctuations is presented and tested both in simulation and with on-sky data. Finally, an investigation into the use of sparse arrays of small telescopes to reduce scintillation noise in photometry is presented. The impact of several parameters on the correlation of scintillation noise measured between sub-apertures in the array is explored.
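    The quadrature argument behind the differential photometry technique can be shown with a few lines of NumPy. The simulation below is illustrative only (white noise, a simple boxcar bin) and is not the optimised binning scheme developed in the thesis; it just shows that smoothing the comparison-star light curve removes most of its contribution to the differential noise.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000
        sigma_target, sigma_comp = 0.004, 0.004      # fractional scintillation noise
        target = 1.0 + sigma_target * rng.standard_normal(n)
        comp = 1.0 + sigma_comp * rng.standard_normal(n)

        naive = target / comp                        # the two noises add in quadrature
        print("naive differential rms:", naive.std())     # ~sqrt(2) * 0.004 ~ 0.0057

        kernel = np.ones(50) / 50                    # boxcar-bin the comparison star
        comp_binned = np.convolve(comp, kernel, mode="same")
        binned = target / comp_binned
        print("binned differential rms:", binned.std())   # approaches the target-only 0.004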

    Effects of municipal smoke-free ordinances on secondhand smoke exposure in the Republic of Korea

    Objective: To reduce premature deaths due to secondhand smoke (SHS) exposure among non-smokers, the Republic of Korea (ROK) adopted changes to the National Health Promotion Act, which allowed local governments to enact municipal ordinances to strengthen their authority to designate smoke-free areas and levy penalty fines. In this study, we examined national trends in SHS exposure after the introduction of these municipal ordinances at the city level in 2010. Methods: We used interrupted time series analysis to assess whether the trends of SHS exposure in the workplace and at home, and the primary cigarette smoking rate, changed following the policy adjustment in the national legislation in the ROK. Population-standardized data for selected variables were retrieved from a nationally representative survey dataset and used to study the policy action's effectiveness. Results: Following the change in the legislation, SHS exposure in the workplace reversed course from an increasing (18% per year) trend prior to the introduction of these smoke-free ordinances to a decreasing (−10% per year) trend after adoption and enforcement of these laws (β2 = 0.18, p-value = 0.07; β3 = −0.10, p-value = 0.02). SHS exposure at home (β2 = 0.10, p-value = 0.09; β3 = −0.03, p-value = 0.14) and the primary cigarette smoking rate (β2 = 0.03, p-value = 0.10; β3 = 0.008, p-value = 0.15) showed no significant changes over the sampled period. Although analyses stratified by sex showed that the allowance of municipal ordinances resulted in reduced SHS exposure in the workplace for both males and females, they did not affect the primary cigarette smoking rate as much, especially among females. Conclusion: Strengthening the role of local governments by giving them the authority to enact and enforce penalties on SHS exposure violations helped the ROK to reduce SHS exposure in the workplace. However, smoking behaviors and related activities seemed to shift to less restrictive areas such as streets and apartment hallways, negating some of the effects of these ordinances. Future studies should investigate how smoke-free policies beyond public places can further reduce SHS exposure in the ROK.
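    The β2/β3 terminology in the results corresponds to the level-change and slope-change terms of a standard segmented-regression interrupted time series model. The sketch below fits such a model with statsmodels on simulated yearly data; the variable names and numbers are illustrative and do not reproduce the study's survey estimates.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        years = np.arange(2005, 2018)
        policy_year = 2010
        df = pd.DataFrame({
            "year": years,
            "time": years - years[0],                      # beta1: underlying trend
            "post": (years >= policy_year).astype(int),    # beta2: level change at adoption
        })
        df["time_since"] = np.where(df["post"] == 1, df["year"] - policy_year, 0)  # beta3: slope change

        rng = np.random.default_rng(2)
        df["shs_workplace"] = (40 + 1.8 * df["time"] + 2.0 * df["post"]
                               - 3.0 * df["time_since"] + rng.normal(0, 1.0, len(df)))

        fit = smf.ols("shs_workplace ~ time + post + time_since", data=df).fit()
        print(fit.params)   # 'post' plays the role of beta2, 'time_since' of beta3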

    A First Course in Causal Inference

    I developed these lecture notes based on my "Causal Inference" course at the University of California, Berkeley over the past seven years. Since half of the students were undergraduates, the notes require only basic knowledge of probability theory, statistical inference, and linear and logistic regression.

    Synthesis of multifunctional glyco-pseudodendrimers and glyco-dendrimers and their investigation as anti-Alzheimer agents

    As the world population ages, cases of Alzheimer's disease (AD) are increasing. AD is a disorder of the brain characterized by the aggregation of amyloid beta (Aβ) peptides into plaques. This leads to the death of numerous brain cells, affecting the cognitive and motor functions of the individual. To date, no cure for the disease is available. Aβ peptides have 40 or 42 amino acid residues, but their exact mechanism(s) of action in AD is still under debate. Their varied amino acid residues make them prone to forming hydrogen bonds. Dendrimers with sugar units are often referred to as glycopolymers and have been shown to have potential anti-amyloidogenic activity. However, they also have drawbacks: the synthesis involves multiple tedious steps, and dendrimers themselves offer only a limited number of functional units. Pseudodendrimers are another class of branched polymers based on hyperbranched polymers. Unlike dendrimers, they are easy to synthesize and carry a dense shell of functional units on the surface. One of the main goals of this dissertation is the synthesis and characterization of pseudodendrimers and dendrimers based on 2,2-bis(hydroxymethyl)propionic acid (bis-MPA), an aliphatic polyester scaffold, as it offers biocompatibility and easy degradability. These are then decorated with mannose units on the surface using a 'click' reaction, forming glyco-pseudodendrimers and glyco-dendrimers. A detailed characterization of their structures and physical properties was undertaken using techniques such as size exclusion chromatography, asymmetric flow field-flow fractionation (AF4), and dynamic light scattering. The second main focus of this work was to investigate the interaction of the synthesized glyco-pseudodendrimers and glyco-dendrimers with Aβ 40 peptides. For this task, five different concentrations of the synthesized glycopolymers were tested against Aβ 40 using the Thioflavin T (ThT) assay. The polymers showing the strongest anti-aggregation behavior against Aβ 40 were further examined with circular dichroism spectroscopy. AF4 was also used to investigate Aβ 40-glycopolymer aggregates, which has never been done before and constitutes the highlight of this dissertation. Atomic force microscopy was used to image Aβ 40-glyco-pseudodendrimer aggregates. A basic but important step in the development of drug delivery platforms is to evaluate the toxicity of the synthesized compounds; in this work, preliminary studies of the cytotoxicity of the glyco-pseudodendrimers were performed in two different cell lines. Thus, this study comprises a preliminary investigation of the anti-amyloidogenic activity of glyco-pseudodendrimers synthesized on an aliphatic polyester backbone.
    Table of contents (chapter level): Introduction; Fundamentals and Literature; Analytical Techniques; Experimental Details and Methodology; Results and Discussion; Conclusions and Outlook; Bibliography; Appendix.
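    The kinetic analysis of the Thioflavin T curves mentioned above is commonly done by fitting a sigmoidal model to extract a half-time and an apparent lag time. The sketch below uses a generic Boltzmann-type form with scipy on synthetic data; it illustrates the kind of analysis involved rather than the exact pipeline of this work.

        import numpy as np
        from scipy.optimize import curve_fit

        def tht_sigmoid(t, f0, amplitude, t_half, tau):
            """f0: baseline; amplitude: plateau rise; t_half: midpoint; tau: growth timescale."""
            return f0 + amplitude / (1.0 + np.exp(-(t - t_half) / tau))

        t = np.linspace(0, 48, 97)                    # hours
        rng = np.random.default_rng(3)
        signal = tht_sigmoid(t, 1.0, 10.0, 18.0, 2.5) + rng.normal(0, 0.2, t.size)

        popt, _ = curve_fit(tht_sigmoid, t, signal, p0=[1.0, 10.0, 20.0, 3.0])
        f0, amp, t_half, tau = popt
        lag_time = t_half - 2.0 * tau    # common operational definition of the lag phase
        print(f"t_half ~ {t_half:.1f} h, lag ~ {lag_time:.1f} h")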