
    READMem: Robust Embedding Association for a Diverse Memory in Unconstrained Video Object Segmentation

    We present READMem (Robust Embedding Association for a Diverse Memory), a modular framework for semi-automatic video object segmentation (sVOS) methods designed to handle unconstrained videos. Contemporary sVOS works typically aggregate video frames in an ever-expanding memory, demanding high hardware resources for long-term applications. To mitigate memory requirements and prevent near-duplicates of objects (caused by the redundant information of adjacent frames), previous methods introduce a hyper-parameter that controls the frequency of frames eligible to be stored. This parameter has to be adjusted to the specific properties of a video (such as the rapidity of appearance changes and the video length) and does not generalize well. Instead, we integrate the embedding of a new frame into the memory only if it increases the diversity of the memory content. Furthermore, we propose a robust association of the embeddings stored in the memory with query embeddings during the update process. Our approach avoids the accumulation of redundant data, allowing us, in return, to restrict the memory size and prevent extreme memory demands in long videos. We extend popular sVOS baselines with READMem, which previously showed limited performance on long videos. Our approach achieves competitive results on the Long-time Video dataset (LV1) while not hindering performance on short sequences. Our code is publicly available.
    Comment: Accepted to BMVC 2023. Code @ https://github.com/Vujas-Eteph/READMe
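
    A diversity-gated memory update of the kind the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the choice of the Gram-matrix determinant as the diversity score, and all names below, are assumptions for the sake of the example.

```python
# Minimal sketch of a diversity-gated memory update (illustrative, not the
# READMem implementation). Diversity is scored as the determinant of the
# Gram matrix of L2-normalized frame embeddings; a candidate embedding
# replaces a stored slot only if the replacement increases that score.
import numpy as np

def gram_det(embeddings: np.ndarray) -> float:
    """Diversity score: determinant of the Gram matrix of row embeddings."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return float(np.linalg.det(normed @ normed.T))

def update_memory(memory: np.ndarray, candidate: np.ndarray) -> np.ndarray:
    """Insert `candidate` only if it increases memory diversity."""
    best_div = gram_det(memory)
    best_mem = memory
    for slot in range(len(memory)):  # try replacing each stored embedding
        trial = memory.copy()
        trial[slot] = candidate
        div = gram_det(trial)
        if div > best_div:
            best_div, best_mem = div, trial
    return best_mem  # unchanged if no replacement increases diversity

rng = np.random.default_rng(0)
memory = rng.normal(size=(5, 64))   # 5 stored frame embeddings
new_frame = rng.normal(size=64)     # embedding of the incoming frame
memory = update_memory(memory, new_frame)
```

    Because the gate only admits embeddings that increase diversity, the memory size stays fixed regardless of video length, which is the property the abstract highlights for long videos.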

    Integration of process design and control: A review

    There is a large variety of methods in the literature for process design and control, which can be classified into two main categories. The methods in the first category follow a sequential approach, in which the control system is designed only after the details of the process design are decided. However, once the process design is fixed, there is little room left for improving the control performance. Recognizing the interactions between process design and control, the methods in the second category integrate some control aspects into process design. With the aim of providing an exploration map and identifying potential areas for further contributions, this paper presents a thematic review of methods for the integration of process design and control. The evolution paths of these methods are described, and the advantages and disadvantages of each method are explained. The paper concludes with suggestions for future research activities.
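
    The distinction between the two categories can be made concrete with a generic simultaneous formulation, in which design variables and control inputs are optimized together rather than in sequence. The notation below is a standard placeholder formulation, not notation taken from the paper.

```latex
% Generic simultaneous design-and-control formulation (illustrative notation):
% design variables d and control trajectory u(t) are optimized jointly,
% instead of fixing d first and tuning u(t) afterwards.
\begin{aligned}
\min_{d,\; u(t)} \quad & \Phi\bigl(x(t_f), d\bigr)
    + \int_{t_0}^{t_f} L\bigl(x(t), u(t), d\bigr)\, dt \\
\text{s.t.} \quad & \dot{x}(t) = f\bigl(x(t), u(t), d\bigr),
    \qquad x(t_0) = x_0, \\
& g\bigl(x(t), u(t), d\bigr) \le 0
    \quad \text{(process and control constraints).}
\end{aligned}
```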

    An Uncertainty Visual Analytics Framework for Functional Magnetic Resonance Imaging

    Improving understanding of the human brain is one of the leading pursuits of modern scientific research. Functional magnetic resonance imaging (fMRI) is a foundational technique for advanced analysis and exploration of the human brain. The modality scans the brain in a series of temporal frames which indicate brain activity either at rest or during a task. The images can be used to study the workings of the brain, leading to an understanding of healthy brain function as well as the characterisation of diseases such as schizophrenia and bipolar disorder.

    Extracting meaning from fMRI relies on an analysis pipeline which can be broadly categorised into three phases: (i) data acquisition and image processing; (ii) image analysis; and (iii) visualisation and human interpretation. The modality and analysis pipeline, however, are hampered by a range of uncertainties which can greatly impact the study of brain function. Each phase contains a set of required and optional steps with inherent limitations and complex parameter selection, and these aspects produce the uncertainty that affects the outcome of studies. Moreover, the uncertainties that arise early in the pipeline are compounded by decisions and limitations further along in the process. While a large amount of research has been undertaken to examine the limitations and variable parameter selection, statistical approaches designed to address the uncertainty have not managed to mitigate the issues.

    Visual analytics, meanwhile, is a research domain which seeks to combine advanced visual interfaces with specialised interaction and automated statistical processing designed to exploit human expertise and understanding. Uncertainty visual analytics (UVA) tools, which aim to minimise and mitigate uncertainties, have been proposed for a variety of data, including astronomical, financial, weather and crime data. Importantly, UVA approaches have also seen success in medical imaging and analysis. However, there are many challenges surrounding the application of UVA to each research domain; principally, these involve understanding what the uncertainties are and what their possible effects may be, so that they can be connected to visualisation and interaction approaches. With fMRI, the breadth of uncertainty arising at multiple stages along the pipeline, together with the compound effects, makes it challenging to propose UVAs which integrate meaningfully into the pipeline.

    In this thesis, we seek to address this challenge by proposing a unified UVA framework for fMRI. To do so, we first examine the state-of-the-art landscape of fMRI uncertainties, including the compound effects, and explore how they are currently addressed. This forms the basis of a field we term fMRI-UVA. We then present our overall framework, which is designed to meet the requirements of fMRI visual analysis while also providing an indication and understanding of the effects of uncertainties on the data. Our framework consists of components designed for the spatial, temporal and processed imaging data. Alongside the framework, we propose two visual extensions which can be used as standalone UVA applications or be integrated into the framework. Finally, we describe a conceptual algorithmic approach which incorporates more data into an existing measure used in the fMRI analysis pipeline.
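
    The compounding of early uncertainty through later pipeline stages can be illustrated with a toy model. The sketch below assumes a simplified pipeline in which each stage contributes an independent variance that accumulates downstream; the stage names, numbers, and additive-variance model are illustrative assumptions, not the thesis framework.

```python
# Toy illustration (not the thesis framework) of tracking per-stage
# uncertainty along an fMRI-style pipeline. Each stage carries a variance
# estimate; cumulative variance grows downstream, so uncertainty introduced
# early is still visible at interpretation time.
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    variance: float  # assumed per-stage uncertainty contribution

def propagate(stages: list[Stage]) -> list[tuple[str, float]]:
    """Cumulative variance after each stage, assuming independent errors."""
    total, trace = 0.0, []
    for stage in stages:
        total += stage.variance          # independent errors: variances add
        trace.append((stage.name, total))
    return trace

pipeline = [
    Stage("acquisition & image processing", 0.04),
    Stage("image analysis", 0.02),
    Stage("visualisation & interpretation", 0.01),
]
for name, cumulative in propagate(pipeline):
    print(f"{name}: cumulative variance = {cumulative:.2f}")
```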

    Performance assessment and enhancement of precision controlled structures during conceptual design

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 1999. Includes bibliographical references (p. 301-306).

    Future optical space systems such as interferometers and filled-aperture telescopes will extend the resolution and sensitivity offered by their on-orbit and ground-based predecessors. These systems face the challenge of achieving nanometer and milli-arcsecond precision control of stellar light passing through the optical train of a lightweight, flexible structure subjected to various disturbances. It is advantageous to assess the performance of initial concepts of these precision systems early in the design stage to aid in the requirements flowdown and resource allocation process. A complete end-to-end performance assessment methodology is developed which incorporates disturbance, sensitivity, and uncertainty analysis tools within a common state-space framework.

    The disturbance analysis is conducted using either a time-domain, frequency-domain, or Lyapunov approach to obtain nominal predictions of performance metric root-mean-square (RMS) values. Calculating power spectral density and cumulative RMS functions of the performance metrics allows critical system modes and frequencies to be identified, and in some instances, contributions from each of the disturbances can be determined. A Lagrange multiplier method is used to derive a governing equation for the sensitivities of the performance metrics with respect to model parameters. For a system whose structural dynamic equations are represented in modal form, the sensitivities can be calculated exactly and efficiently with respect to modal frequencies, masses, and damping ratios. The most critical modal parameters are carried into a parametric uncertainty analysis that seeks to identify the worst-case performance RMS values. A constrained optimization technique is described which searches for the worst-case performance over all allowable parameter values.

    When required, a performance enhancement approach is used to apply controlled structures technologies such as input/output isolation to achieve large performance changes. Structural modifications based on insight provided by a physical parameter sensitivity analysis are then employed to "fine tune" the performance to keep the worst-case values within the requirements. Analytical physical parameter sensitivities are experimentally validated on a truss structure and used to implement stiffness and mass perturbations that reduce the tip displacement of a flexible appendage. The overall framework is applied to an integrated model of the Space Interferometry Mission to demonstrate its practical use on a large-order system.

    by Homero L. Gutierrez, Jr., Ph.D.
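
    The Lyapunov approach to disturbance analysis mentioned above has a compact state-space form. The following is a minimal sketch with a generic lightly damped oscillator standing in for a structural mode (not the Space Interferometry Mission model): for x' = Ax + Bw driven by unit-intensity white noise w, the steady-state covariance Sigma solves A*Sigma + Sigma*A^T + B*B^T = 0, and the RMS of a performance metric z = Cx is sqrt(C*Sigma*C^T).

```python
# Minimal sketch of the Lyapunov approach to RMS disturbance analysis
# (generic matrices, not the SIM integrated model). For xdot = A x + B w with
# unit-intensity white noise w, the steady-state covariance Sigma solves
# A Sigma + Sigma A^T + B B^T = 0; the RMS of z = C x follows directly.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Lightly damped 1-DOF oscillator as a stand-in structural mode.
omega, zeta = 2.0 * np.pi * 5.0, 0.005             # 5 Hz mode, 0.5% damping
A = np.array([[0.0, 1.0],
              [-omega**2, -2.0 * zeta * omega]])
B = np.array([[0.0],
              [1.0]])                               # disturbance input
C = np.array([[1.0, 0.0]])                          # performance: displacement

# Solve A Sigma + Sigma A^T = -B B^T for the steady-state covariance.
Sigma = solve_continuous_lyapunov(A, -B @ B.T)
rms = float(np.sqrt(C @ Sigma @ C.T))
print(f"performance RMS: {rms:.3e}")
```

    A single Lyapunov solve replaces a long time-domain simulation, which is why it is attractive for nominal RMS predictions on large-order models.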

    Parameter Estimation of Complex Systems from Sparse and Noisy Data

    Mathematical modeling is a key component of various disciplines in science and engineering. A mathematical model which represents important behavior of a real system can be used as a substitute for the real process for many analysis and synthesis tasks. The performance of model-based techniques, e.g. system analysis, computer simulation, controller design, sensor development, state filtering, product monitoring, and process optimization, is highly dependent on the quality of the model used. Therefore, it is very important to be able to develop an accurate model from available experimental data. Parameter estimation is usually formulated as an optimization problem where the parameter estimate is computed by minimizing the discrepancy between the model prediction and the experimental data. If a simple model and a large amount of data are available, then the estimation problem is frequently well-posed, and a small error in data fitting automatically results in an accurate model. However, this is not always the case. If the model is complex and only sparse and noisy data are available, then the estimation problem is often ill-conditioned and good data fitting does not ensure accurate model predictions. Many challenges that can often be neglected for estimation involving simple models need to be carefully considered for estimation problems involving complex models. To obtain a reliable and accurate estimate from sparse and noisy data, a set of techniques is developed by addressing the challenges encountered in the estimation of complex models, including (1) model analysis and simplification, which identifies the important sources of uncertainty and reduces the model complexity; (2) experimental design for collecting information-rich data by setting optimal experimental conditions; (3) regularization of the estimation problem, which solves the ill-conditioned large-scale optimization problem by reducing the number of parameters; (4) nonlinear estimation and filtering, which fits the data by various estimation and filtering algorithms; (5) model verification by applying a statistical hypothesis test to the prediction error. The developed methods are applied to different types of models ranging from models found in the process industries to biochemical networks, some of which are described by ordinary differential equations with dozens of state variables and more than a hundred parameters.
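
    Points (3) and (4) above can be made concrete with a small sketch: fitting an ODE model to sparse, noisy data by least squares, with a Tikhonov penalty appended to the residual vector to stabilize the ill-conditioned fit. The one-state model, data, prior guess, and regularization weight below are all illustrative assumptions, not a case study from the thesis.

```python
# Minimal sketch of regularized parameter estimation for an ODE model
# (toy example, not one of the thesis case studies). The residual vector
# stacks the data-fit errors with a Tikhonov penalty that pulls the
# estimate toward a prior guess, stabilizing the ill-conditioned problem.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(t, x, k1, k2):
    """Toy one-state kinetic model: dx/dt = -k1*x + k2."""
    return -k1 * x + k2

def simulate(theta, t_eval, x0=1.0):
    sol = solve_ivp(model, (t_eval[0], t_eval[-1]), [x0],
                    t_eval=t_eval, args=tuple(theta))
    return sol.y[0]

rng = np.random.default_rng(1)
t_data = np.linspace(0.0, 5.0, 8)                  # sparse sampling
theta_true = np.array([0.8, 0.2])
y_data = simulate(theta_true, t_data) + 0.05 * rng.normal(size=t_data.size)

theta_prior = np.array([1.0, 0.1])                 # rough initial guess
lam = 0.1                                          # regularization weight

def residuals(theta):
    fit = simulate(theta, t_data) - y_data         # data misfit
    reg = np.sqrt(lam) * (theta - theta_prior)     # Tikhonov penalty
    return np.concatenate([fit, reg])

result = least_squares(residuals, theta_prior, bounds=(0.0, np.inf))
print("estimated parameters:", result.x)
```

    Raising the regularization weight trades data fit for stability, which is the essential compromise when the data are too sparse and noisy to identify every parameter.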