
    Integrating expert-based objectivist and nonexpert-based subjectivist paradigms in landscape assessment

    This thesis explores the integration of objective and subjective measures of landscape aesthetics, focusing in particular on crowdsourced geo-information. It addresses the growing importance of public perceptions in national landscape governance, in line with the European Landscape Convention's emphasis on public involvement. Despite this, national landscape assessments often remain expert-centric and top-down, constrained by limited resources and limited public engagement. The thesis leverages Web 2.0 technologies and crowdsourced geographic information, examining correlations between expert-based metrics of landscape quality and public perceptions. The Scenic-Or-Not initiative for Great Britain, GIS-based Wildness spatial layers, and the LANDMAP dataset for Wales serve as the key datasets for analysis. The research investigates the relationships between objective measures of landscape wildness quality and subjective measures of aesthetics. Multiscale geographically weighted regression (MGWR) reveals significant correlations, with different wildness components exhibiting varying degrees of association. The study suggests that wildness and scenicness measures can feasibly be incorporated into formal landscape aesthetic assessments. Comparing expert and public perceptions, the research identifies preferences for water-related landforms and variations in upland and lowland typologies. Experts and non-experts agree on landscapes at the scenic extremes but diverge on mid-spectrum landscapes. To overcome limitations in systematic landscape evaluations, an integrative approach is proposed. Using XGBoost models, the research predicts spatial patterns of landscape aesthetics across Great Britain based on the Scenic-Or-Not initiative, Wildness spatial layers, and LANDMAP data. The models achieve accuracy comparable to traditional statistical models, offering insights for Landscape Character Assessment practice and policy decisions. While acknowledging data limitations and biases in crowdsourcing, the thesis discusses the aggregation strategy needed to manage computational challenges, including the modifiable areal unit problem (MAUP) that arises when aggregating point-based observations. The thesis comprises three studies published or submitted for publication, each contributing to the understanding of the relationship between objective and subjective measures of landscape aesthetics. The concluding chapter discusses the limitations of the data and methods and provides a comprehensive overview of the research.
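
    As a rough, hypothetical illustration of the predictive step described above, the sketch below fits an XGBoost regressor from wildness components to mean Scenic-Or-Not ratings; the file name, column names, and grid aggregation are assumptions, not the thesis data or code.

        import pandas as pd
        import xgboost as xgb
        from sklearn.metrics import r2_score
        from sklearn.model_selection import train_test_split

        # Hypothetical grid table: one row per cell, wildness components + mean rating.
        df = pd.read_csv("scenic_wildness_grid.csv")
        X = df[["remoteness", "ruggedness", "naturalness", "absence_of_artefacts"]]
        y = df["mean_scenic_rating"]  # mean Scenic-Or-Not vote per cell

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        model = xgb.XGBRegressor(n_estimators=500, max_depth=6, learning_rate=0.05)
        model.fit(X_tr, y_tr)
        print("held-out R^2:", r2_score(y_te, model.predict(X_te)))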

    Deep generative models for network data synthesis and monitoring

    Measurement and monitoring are fundamental tasks in all networks, enabling the downstream management and optimization of the network. Although networks inherently produce abundant monitoring data, accessing and measuring those data effectively is another matter. The challenges exist in many aspects. First, network monitoring data are often inaccessible to external users, and it is hard to provide a high-fidelity dataset without leaking commercially sensitive information. Second, effective data collection covering a large-scale network system can be very expensive as networks grow, e.g., in the number of cells in a radio network or the number of flows in an Internet Service Provider (ISP) network. Third, it is difficult to ensure fidelity and efficiency simultaneously in network monitoring, as the resources available in network elements to support measurement functions are too limited to implement sophisticated mechanisms. Finally, understanding and explaining the behavior of the network becomes challenging due to its size and complex structure. Various optimization-based solutions (e.g., compressive sensing) and data-driven solutions (e.g., deep learning) have been proposed for these challenges, but the fidelity and efficiency of existing methods cannot yet meet current network requirements. The contributions made in this thesis significantly advance the state of the art in network measurement and monitoring. Throughout the thesis, we leverage cutting-edge machine learning technology, namely deep generative modeling. First, we design and realize APPSHOT, an efficient city-scale network traffic sharing system built on a conditional generative model, which only requires open-source contextual data during inference (e.g., land use information and population distribution). Second, we develop an efficient drive testing system, GENDT, based on a generative model that combines graph neural networks, conditional generation, and quantified model uncertainty to enhance the efficiency of mobile drive testing. Third, we design and implement DISTILGAN, a high-fidelity, efficient, versatile, and real-time network telemetry system with latent GANs and spectral-temporal networks. Finally, we propose SPOTLIGHT, an accurate, explainable, and efficient anomaly detection system for the Open RAN (Radio Access Network). The lessons learned through this research are summarized, and interesting topics are discussed for future work in this domain. All proposed solutions have been evaluated with real-world datasets and applied to support different applications in real systems.
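
    To make the conditional-generation idea concrete, here is a minimal PyTorch sketch of a generator mapping noise plus open contextual features (e.g., land use, population) to a synthetic traffic snapshot; the architecture, dimensions, and variable names are assumptions, not the APPSHOT implementation.

        import torch
        import torch.nn as nn

        class ConditionalGenerator(nn.Module):
            """Maps (noise, context) to a synthetic traffic snapshot (assumed shapes)."""
            def __init__(self, noise_dim=32, context_dim=8, out_dim=64):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Linear(noise_dim + context_dim, 128), nn.ReLU(),
                    nn.Linear(128, 128), nn.ReLU(),
                    nn.Linear(128, out_dim), nn.Softplus(),  # traffic is non-negative
                )

            def forward(self, z, context):
                return self.net(torch.cat([z, context], dim=-1))

        gen = ConditionalGenerator()
        z = torch.randn(4, 32)    # noise for a batch of 4 city cells
        ctx = torch.rand(4, 8)    # contextual features (land use, population, ...)
        synthetic = gen(z, ctx)   # (4, 64) synthetic measurements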

    Flood dynamics derived from video remote sensing

    Flooding is by far the most pervasive natural hazard, with the human impacts of floods expected to worsen in the coming decades due to climate change. Hydraulic models are a key tool for understanding flood dynamics and play a pivotal role in unravelling the processes that occur during a flood event, including inundation flow patterns and velocities. In the realm of river basin dynamics, video remote sensing is emerging as a transformative tool that can offer insights into flow dynamics and thus, together with other remotely sensed data, has the potential to be deployed to estimate discharge. Moreover, the integration of video remote sensing data with hydraulic models offers a pivotal opportunity to enhance the predictive capacity of these models. Hydraulic models are traditionally built with accurate terrain, flow, and bathymetric data and are often calibrated and validated using observed data to obtain meaningful and actionable model predictions. Data for accurately calibrating and validating hydraulic models are not always available, leaving the assessment of the predictive capabilities of some models deployed in flood risk management in question. Recent advances in remote sensing have heralded the availability of vast, high-resolution video datasets. The parallel evolution of computing capabilities, coupled with advancements in artificial intelligence, is enabling the processing of data at unprecedented scales and complexities, allowing us to glean meaningful insights from datasets that can be integrated with hydraulic models. The aims of the research presented in this thesis were twofold. The first aim was to evaluate and explore the potential applications of video from air- and space-borne platforms to comprehensively calibrate and validate two-dimensional hydraulic models. The second aim was to estimate river discharge using satellite video combined with high-resolution topographic data. In the first of three empirical chapters, non-intrusive image velocimetry techniques were employed to estimate river surface velocities in a rural catchment. For the first time, a 2D hydraulic model was fully calibrated and validated using velocities derived from Unpiloted Aerial Vehicle (UAV) image velocimetry approaches. This highlighted the value of these data in mitigating the limitations associated with traditional data sources used in parameterizing two-dimensional hydraulic models. This finding inspired the subsequent chapter, in which river surface velocities, derived using Large Scale Particle Image Velocimetry (LSPIV), and flood extents, derived using deep neural network-based segmentation, were extracted from satellite video and used to rigorously assess the skill of a two-dimensional hydraulic model. Harnessing the ability of deep neural networks to learn complex features and deliver accurate and contextually informed flood segmentation, this work demonstrates the potential value of satellite video for validating two-dimensional hydraulic model simulations. In the final empirical chapter, the convergence of satellite video imagery and high-resolution topographic data bridges the gap between visual observations and quantitative measurements by enabling the direct extraction of velocities from video imagery, which are used to estimate river discharge. Overall, this thesis demonstrates the significant potential of emerging video-based remote sensing datasets and offers approaches for integrating these data into hydraulic modelling and discharge estimation practice. The incorporation of LSPIV techniques into flood modelling workflows signifies a methodological progression, especially in areas lacking robust data collection infrastructure. Satellite video remote sensing heralds a major step forward in our ability to observe river dynamics in real time, with potentially significant implications for flood modelling science.
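
    At the core of LSPIV is a cross-correlation search for the displacement of water-surface texture between consecutive frames. The sketch below illustrates that step on synthetic data; the window size, pixel scale, and frame interval are illustrative assumptions rather than values from the thesis.

        import cv2
        import numpy as np

        def window_velocity(frame_a, frame_b, y, x, win=32, search=16,
                            m_per_px=0.05, dt=0.04):
            """Surface speed (m/s) of the window at (y, x) between two grayscale frames."""
            tmpl = np.ascontiguousarray(frame_a[y:y + win, x:x + win])
            region = np.ascontiguousarray(
                frame_b[y - search:y + win + search, x - search:x + win + search])
            score = cv2.matchTemplate(region, tmpl, cv2.TM_CCOEFF_NORMED)
            dy, dx = np.unravel_index(np.argmax(score), score.shape)
            dy, dx = dy - search, dx - search  # displacement in pixels
            return np.hypot(dy, dx) * m_per_px / dt

        # Synthetic check: a texture shifted 3 px right should give 3 * 0.05 / 0.04 m/s.
        rng = np.random.default_rng(0)
        a = (rng.random((128, 128)) * 255).astype(np.uint8)
        b = np.roll(a, 3, axis=1)
        print(window_velocity(a, b, 48, 48))  # ~3.75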

    3D Innovations in Personalized Surgery

    Current practice involves the use of 3D surgical planning and patient-specific solutions in multiple surgical areas of expertise. Patient-specific solutions have been endorsed for several years in numerous publications due to their associated benefits in accuracy, safety, and predictability of surgical outcome. The basis of 3D surgical planning is the use of high-quality medical images (e.g., CT, MRI, or PET scans). The translation from 3D digital planning toward surgical applications developed hand in hand with a rise in 3D printing applications using multiple biocompatible materials. These technical aspects of medical care require the expertise of engineers or technical physicians for optimal, safe, and effective implementation in daily clinical routines. The aim and scope of this Special Issue is high-tech solutions in personalized surgery based on 3D technology and, more specifically, bone-related surgery. Full papers, highly innovative technical notes, or (systematic) reviews that relate to innovative personalized surgery are invited. This can include optimization of imaging for 3D virtual surgical planning (VSP), optimization of the 3D VSP workflow and its translation toward the surgical procedure, or optimization of personalized implants or devices in relation to bone surgery.

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Revisiting the capitalization of public transport accessibility into residential land value: an empirical analysis drawing on Open Science

    Background: The delivery and effective operation of public transport is fundamental for a transition to low-carbon transport systems. However, many cities face budgetary challenges in providing and operating this type of infrastructure. Land value capture (LVC) instruments, aimed at recovering all or part of the land value uplifts triggered by actions other than the landowner's, can alleviate some of this pressure. A key element of LVC lies in the increment in land value associated with a particular public action. Urban economic theory supports this idea and considers accessibility a core determinant of residential land value. Although the empirical literature assessing the relationship between land value increments and public transport infrastructure is vast, it often assumes homogeneous benefits and therefore overlooks relevant elements of accessibility. Advances in the accessibility concept in the context of Open Science can ease the relaxation of such assumptions. Methods: This thesis draws on the case of Greater Mexico City between 2009 and 2019. It focuses on the effects of the main public transport network (MPTN), which is organised into seven temporal stages according to its expansion phases. The analysis incorporates location-based accessibility measures to employment opportunities in order to assess the benefits of public transport infrastructure. It does so by making extensive use of the open-source software OpenTripPlanner for public transport route modelling (≈ 2.1 billion origin-destination routes). Potential capitalizations are assessed within the hedonic framework. The property value data comprise individual administrative mortgage records collected by the Federal Mortgage Society (≈ 800,000). The hedonic function is estimated using a variety of approaches, i.e., linear models, nonlinear models, multilevel models, and spatial multilevel models, estimated by maximum likelihood and Bayesian methods. The study also examines possible spatial aggregation bias using alternative spatial aggregation schemes, following the modifiable areal unit problem (MAUP) literature. Results: The accessibility models across the various temporal stages evidence the spatial heterogeneity shaped by the MPTN in combination with land use and the individual perception of residents. This highlights the need to transition from measures that focus on the characteristics of transport infrastructure to comprehensive accessibility measures that reflect such heterogeneity. The estimated hedonic function suggests a robust, positive, and significant relationship between MPTN accessibility and residential land value in all the modelling frameworks in the presence of a variety of controls. Residential land value increases by between 3.6% and 5.7% for one additional standard deviation in MPTN accessibility to employment in the final set of models. The total willingness to pay (TWTP) is considerable, ranging from 0.7 to 1.5 times the capital costs of Line 7 of the Metrobús bus rapid transit system. A sensitivity analysis shows that the hedonic model estimation is sensitive to the MAUP. In addition, a postcode zoning scheme produces the results closest to those of the smallest spatial aggregation scheme (a 0.5 km hexagonal grid). Conclusion: This thesis advances the discussion on the capitalization of public transport into residential land value by adopting recent contributions from the Open Science framework. Empirically, it fills a knowledge gap given the lack of literature on this topic in this area of study. In terms of policy, the findings support LVC as a mechanism of considerable potential. Regarding fee-based LVC instruments, there are fairness issues in the distribution of charges or exactions to households that could be addressed using location-based measures. Furthermore, the approach developed for this analysis serves as valuable guidance for identifying sites with large potential for the implementation of development-based instruments, for instance land readjustments or the sale/lease of additional development rights.
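
    A stylized version of the hedonic step might look like the sketch below, in which log land value is regressed on standardized accessibility plus controls so that the accessibility coefficient reads approximately as a percentage effect per standard deviation; the file and column names are hypothetical stand-ins, not the thesis models.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("mortgage_records.csv")  # hypothetical extract of the records
        df["log_value"] = np.log(df["land_value"])
        df["acc_std"] = ((df["mptn_accessibility"] - df["mptn_accessibility"].mean())
                         / df["mptn_accessibility"].std())

        # Coefficient on acc_std reads roughly as a % effect per standard deviation.
        fit = smf.ols("log_value ~ acc_std + floor_area + dwelling_age", data=df).fit()
        print(fit.params["acc_std"])  # values near 0.036-0.057 would match the text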

    Essays on noncausal and noninvertible time series

    Over the last two decades, there has been growing interest among economists in nonfundamental univariate processes, generally represented by noncausal and noninvertible time series. These processes have become increasingly popular due to their ability to capture nonlinear dynamics such as volatility clustering, asymmetric cycles, and local explosiveness, all of which are commonly observed in macroeconomics and finance. In particular, the incorporation of both past and future components into noncausal and noninvertible processes makes them attractive options for modeling forward-looking behavior in economic activities. However, the classical techniques used for analyzing time series models are largely limited to their causal and invertible counterparts. This dissertation contributes to the field by providing theoretical tools for testing and estimation that are robust to noncausal and noninvertible time series. In the first chapter, "Quantile Autoregression-Based Non-causality Testing", we investigate the statistical properties of empirical conditional quantiles of noncausal processes. Specifically, we show that the quantile autoregression (QAR) estimates for noncausal processes do not remain constant across different quantiles, in contrast to their causal counterparts. Furthermore, we demonstrate that noncausal autoregressive processes admit nonlinear representations for conditional quantiles given past observations. Exploiting these properties, we propose three novel strategies for testing noncausality of non-Gaussian processes within the QAR framework. The tests are constructed either by verifying the constancy of the slope coefficients or by applying a misspecification test of the linear QAR model over different quantiles of the process. Numerical experiments examine the finite sample performance of the testing strategies, comparing different specification tests for dynamic quantiles with the Kolmogorov-Smirnov constancy test. The new methodology is applied to time series from financial markets to investigate the presence of speculative bubbles. The extension of the specification-test approach to AR processes driven by heteroskedastic innovations is studied through simulations, and the performance of QAR estimates of noncausal processes at extreme quantiles is also explored. In the second chapter, "Estimation of Time Series Models Using the Empirical Distribution of Residuals", we introduce a novel estimation technique for general linear time series models, potentially noninvertible and noncausal, that utilizes the empirical cumulative distribution function of the residuals. The proposed method relies on the generalized spectral cumulative function to characterize the pairwise dependence of residuals at all lags. Model identification can be achieved by exploiting the information in the joint distribution of residuals under the iid assumption. This method yields consistent estimates of the model parameters without imposing stringent conditions on higher-order moments or any distributional assumptions on the innovations beyond non-Gaussianity. We investigate the asymptotic distribution of the estimates by employing a smoothed cumulative distribution function to approximate the indicator function, given the non-differentiability of the original loss function. Efficiency improvements can be achieved by properly choosing the scaling parameter for the residuals. Finite sample properties are explored through Monte Carlo simulations. An empirical application illustrates the methodology by fitting the daily trading volume of Microsoft stock with autoregressive models admitting a noncausal representation. The flexibility of the cumulative distribution function permits the proposed method to be extended to more general dependence structures where innovations are only conditional mean or quantile independent. In the third chapter, "Directional Predictability Tests", joint with Carlos Velasco, we propose new tests of predictability for non-Gaussian sequences that may display general nonlinear dependence in higher-order properties. We test the null of martingale difference against parametric alternatives that introduce linear or nonlinear dependence, as generated by ARMA and all-pass restricted ARMA models, respectively. We also develop tests to check for linear predictability under the white noise null hypothesis, parameterized by an all-pass model driven by martingale difference innovations, and tests of nonlinear predictability on ARMA residuals. Our Lagrange Multiplier tests are developed from a loss function based on pairwise dependence measures that identify the predictability of levels. We provide asymptotic and finite sample analysis of the properties of the new tests and investigate the predictability of different series of financial returns. This thesis has been possible thanks to financial support from grant BES-2017-082695 of the Ministerio de Economía, Industria y Competitividad. Programa de Doctorado en Economía por la Universidad Carlos III de Madrid. Presidente: Miguel Ángel Delgado González.- Secretario: Manuel Domínguez Toribio.- Vocal: Majid M. Al Sadoon.
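
    The first chapter's testing idea can be illustrated in a few lines: fit quantile autoregressions at several quantiles and check whether the lag coefficient stays constant across them, as it should for a causal process. The data below are simulated for illustration; this is not the dissertation's test implementation.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        e = rng.standard_t(df=3, size=1001)  # non-Gaussian innovations
        y = np.zeros(1001)
        for t in range(1, 1001):
            y[t] = 0.5 * y[t - 1] + e[t]     # causal AR(1): slope should not vary

        d = pd.DataFrame({"y": y[1:], "ylag": y[:-1]})
        for q in (0.1, 0.25, 0.5, 0.75, 0.9):
            fit = smf.quantreg("y ~ ylag", d).fit(q=q)
            print(q, round(fit.params["ylag"], 3))  # roughly constant near 0.5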

    ENGINEERING HIGH-RESOLUTION EXPERIMENTAL AND COMPUTATIONAL PIPELINES TO CHARACTERIZE HUMAN GASTROINTESTINAL TISSUES IN HEALTH AND DISEASE

    In recent decades, new high-resolution technologies have transformed how scientists study complex cellular processes, the mechanisms responsible for maintaining homeostasis, and the emergence and progression of gastrointestinal (GI) disease. These advances have paved the way for the use of primary human cells in experimental models that together can mimic specific aspects of the GI tract, such as compartmentalized stem-cell zones, gradients of growth factors, and shear stress from fluid flow. The work presented in this dissertation integrates high-resolution bioinformatics with novel experimental models of the GI epithelium to describe the pathophysiology of the human small intestine, colon, and stomach in homeostasis and disease. Here, I used three novel microphysiological systems and developed four computational pipelines to describe comprehensive gene expression patterns of the GI epithelium in various states of health and disease. First, I used single cell RNAseq (scRNAseq) to establish the transcriptomic landscape of the entire epithelium of the small intestine and colon from three human donors, describing cell-type-specific gene expression patterns in high resolution. Second, I used single cell and bulk RNAseq to model intestinal absorption of fatty acids and show that fatty acid oxidation is a critical regulator of the flux of long- and medium-chain fatty acids across the epithelium. Third, I used bulk RNAseq and a machine learning model to describe how inflammatory cytokines can regulate proliferation of intestinal stem cells in an experimental model of inflammatory hypoxia. Finally, I developed a high-throughput platform that can associate phenotype with gene expression in clonal organoids, providing unprecedented resolution into the relationship between comprehensive gene expression patterns and their accompanying phenotypic effects. Through these studies, I have demonstrated how the integration of computational and experimental approaches can measurably advance our understanding of human GI physiology. Doctor of Philosophy
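
    For orientation, the sketch below shows a generic scanpy pass over the standard filter/normalize/embed/cluster steps of an scRNAseq analysis; the file name and parameters are assumptions, not the dissertation's pipelines.

        import scanpy as sc

        adata = sc.read_h5ad("gi_epithelium.h5ad")  # hypothetical count matrix
        sc.pp.filter_cells(adata, min_genes=200)    # drop low-complexity cells
        sc.pp.filter_genes(adata, min_cells=3)
        sc.pp.normalize_total(adata, target_sum=1e4)
        sc.pp.log1p(adata)
        sc.pp.highly_variable_genes(adata, n_top_genes=2000)
        sc.tl.pca(adata, n_comps=50)
        sc.pp.neighbors(adata)
        sc.tl.leiden(adata)                         # candidate cell-type clusters
        sc.tl.rank_genes_groups(adata, "leiden")    # marker genes per cluster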

    Deep learning for computer vision constrained by limited supervision

    This thesis presents research on developing algorithms capable of training neural networks for image classification and regression in low-supervision settings. The research was conducted on publicly available benchmark image datasets as well as real-world data, with applications to herbage quality estimation in an agri-tech scope at the VistaMilk SFI centre. Topics include label noise and web-crawled datasets, where some images have an incorrect classification label; semi-supervised learning, where only a small part of the available images have been annotated by humans; and unsupervised learning, where the images are not annotated. The principal contributions are summarized as follows. Label noise: a study highlighting the dual in- and out-of-distribution nature of web noise; a noise detection metric that can independently retrieve each noise type; an observation of the linear separability of in- and out-of-distribution images in unsupervised contrastive feature spaces; and two noise-robust algorithms, DSOS and SNCF, that iteratively improve the state-of-the-art accuracy on the mini-Webvision dataset. Semi-supervised learning: we use unsupervised features to propagate labels from a few labeled examples to the entire dataset; ReLaB, an algorithm that decreases the classification error by up to 8% with one labeled representative image on CIFAR-10. Biomass composition estimation from images: two semi-supervised approaches that utilize unlabeled images, either through an approximate annotator or by adapting semi-supervised algorithms from the image classification literature. To scale biomass estimation to drone images, we use super-resolution paired with semi-supervised learning. Early results on grass biomass estimation show the feasibility of automating the process with accuracies on par with or better than human experts. The conclusion of the thesis summarizes the research contributions and discusses future research directions that I believe should be tackled in the field of low-supervision computer vision.
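
    As a concrete instance of propagating labels from a few annotated examples through an unsupervised feature space (the mechanism behind the semi-supervised contribution, though not the ReLaB implementation), here is a scikit-learn sketch with stand-in features:

        import numpy as np
        from sklearn.semi_supervised import LabelSpreading

        rng = np.random.default_rng(0)
        feats = rng.normal(size=(500, 128))    # stand-in for contrastive embeddings
        labels = np.full(500, -1)              # -1 marks unlabeled images
        labels[:10] = rng.integers(0, 10, 10)  # a handful of labeled examples

        model = LabelSpreading(kernel="knn", n_neighbors=10)
        model.fit(feats, labels)
        pseudo = model.transduction_           # propagated labels for all 500 images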

    Pan-cancer analysis of post-translational modifications reveals shared patterns of protein regulation

    Post-translational modifications (PTMs) play key roles in regulating cell signaling and physiology in both normal and cancer cells. Advances in mass spectrometry enable high-throughput, accurate, and sensitive measurement of PTM levels to better understand their role, prevalence, and crosstalk. Here, we analyze the largest collection of proteogenomics data from 1,110 patients with PTM profiles across 11 cancer types (10 from the National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium [CPTAC]). Our study reveals pan-cancer patterns of changes in protein acetylation and phosphorylation involved in hallmark cancer processes. These patterns reveal subsets of tumors, from different cancer types, including those with dysregulated DNA repair driven by phosphorylation, altered metabolic regulation associated with immune response driven by acetylation, kinase specificity affected by crosstalk between acetylation and phosphorylation, and modified histone regulation. Overall, this resource highlights the rich biology governed by PTMs and exposes potential new therapeutic avenues.
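
    One way such pan-cancer tumor subsets can be surfaced is by hierarchically clustering tumors on their PTM abundance profiles, as in the hedged sketch below; the matrix is a random stand-in for the CPTAC-derived data, and the cluster count is arbitrary.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(0)
        ptm = rng.normal(size=(1110, 300))  # tumors x phosphosites (z-scored stand-in)
        Z = linkage(ptm, method="ward")     # hierarchical clustering of tumor profiles
        clusters = fcluster(Z, t=6, criterion="maxclust")
        print(np.bincount(clusters)[1:])    # sizes of the six tumor subsets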