12 research outputs found

    A review: Simultaneous localization and mapping algorithms

    Simultaneous Localization and Mapping (SLAM) involves creating an environmental map from sensor data while concurrently keeping track of the robot's current position. Efficient and accurate SLAM is crucial for any mobile robot to perform robust navigation, and it is the keystone for higher-level tasks such as path planning and autonomous navigation. The past two decades have seen rapid and exciting progress in solving the SLAM problem, together with many compelling implementations of SLAM methods. In this paper, we review the two common families of SLAM algorithms: the Kalman filter with its variations, and particle filters. This article complements other surveys in this field by reviewing the representative algorithms and the state of the art in each family. It clearly identifies the inherent relationship between state estimation via Kalman filter (KF) versus particle filter (PF) techniques, all of which are derivations of Bayes' rule.
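    The predict/update cycle that both filter families build on can be sketched in one dimension. This is a minimal illustration of the Kalman filter as an instance of Bayes' rule, not code from the reviewed paper; all numeric values are illustrative.

```python
# Minimal 1D Kalman filter: one predict+update cycle for a robot's
# position estimate. Values are illustrative, not from the paper.

def kalman_step(mean, var, motion, motion_var, measurement, meas_var):
    """One predict+update cycle of a scalar Kalman filter."""
    # Predict: apply the motion model; uncertainty grows.
    mean = mean + motion
    var = var + motion_var
    # Update: fuse the measurement via Bayes' rule (product of Gaussians).
    k = var / (var + meas_var)              # Kalman gain
    mean = mean + k * (measurement - mean)
    var = (1 - k) * var
    return mean, var

mean, var = 0.0, 1.0
for motion, z in [(1.0, 1.2), (1.0, 2.1), (1.0, 2.9)]:
    mean, var = kalman_step(mean, var, motion, 0.1, z, 0.5)
```

    After a few measurements the posterior variance shrinks below the prior variance, which is exactly the state-estimation behaviour the survey contrasts with the sample-based approximation used by particle filters.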

    Robotic Item Retrieval System

    The Robotic Item Retrieval System is a household robot that can navigate a room on its own and retrieve items for the user. The robot runs on a wheeled base and is equipped with an elevator platform and robotic arms for item retrieval, together with a surveillance camera and ultrasonic sensors for navigation and item identification. A computer graphical interface is used to control the robot, which operates wirelessly over an 802.11g/n network to perform bi-directional communication with the host computer GUI. Aside from manual remote control, the robot will support two automated navigation methods: line following and map generation. The line-following method requires pre-placed markers on the floor to guide the robot through the room. The map-generation method is more sophisticated, as it enables the robot to generate its own room map and use it for navigation later. Due to the time constraints of this project, our functional prototype may only support line-following navigation; however, map-generation navigation will be implemented for the final product.
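    The line-following idea described above can be sketched as a simple proportional controller: a floor-marker sensor reports the robot's lateral offset from the line, and the steering command is proportional to that offset. The gain, sensor model, and kinematics below are hypothetical, not taken from the project.

```python
# Sketch of proportional line following: steer to reduce the lateral
# offset reported by a floor-marker sensor. The gain and the simplified
# kinematics are hypothetical, not from the described system.

def steering_command(offset, gain=0.8):
    """Map lateral offset (m, + = line is to the right) to a turn rate."""
    return gain * offset

# Simulate the robot converging onto the line.
offset = 0.5                  # start half a metre off the line
for _ in range(20):
    turn = steering_command(offset)
    offset -= 0.3 * turn      # simplified kinematics: turning reduces offset
```

    A real implementation would add derivative damping and marker-loss handling, but the loop above captures why pre-placed markers suffice for the prototype's navigation.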

    Military airborne and maritime application for cooperative behaviors.


    New History Matching Methodology for Two Phase Reservoir Using Expectation-Maximization (EM) Algorithm

    The Expectation-Maximization (EM) algorithm is a well-known method for maximum-likelihood estimation and can be used to fill in missing values in a dataset. The EM algorithm has been used extensively in electrical and electronics engineering, as well as in the biometrics industry for image processing, but it has seen very little use in the oil and gas industry, especially for history matching. History matching is a non-unique matching of the oil rate, water rate, gas rate, and bottom-hole pressure data of a producing well (the producer), as well as the bottom-hole pressure and liquid injection of an injecting well (the injector), achieved by adjusting reservoir parameters such as permeability, porosity, Corey exponents, compressibility factor, and other pertinent reservoir parameters. The EM algorithm is a statistical method that guarantees convergence and is particularly useful when the likelihood function is a member of the exponential family. On the other hand, the EM algorithm can be slow to converge and may converge to a local optimum of the observed-data log-likelihood function, depending on the starting values. In this research, our objective is to develop an algorithm that can successfully match historical production data given sparse field data. Our approach is to update the permeability multiplier, thereby updating the permeability of each unobserved grid cell that contributes to production at one or more producing wells. The EM algorithm will be utilized to optimize the permeability multiplier of each contributing unobserved grid cell.
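    The E-step/M-step structure the abstract relies on can be shown on a toy problem. The sketch below fits a two-component 1D Gaussian mixture (an exponential-family case, as noted above); it illustrates generic EM, not the reservoir workflow itself, and the data are synthetic. The component variance is held fixed for brevity.

```python
import math
import random

# Toy EM for a two-component 1D Gaussian mixture, illustrating the
# E-step (posterior responsibilities) and M-step (weighted updates).
# Synthetic data; variance held fixed at 1 for brevity.

random.seed(0)
data = [random.gauss(0, 1) for _ in range(200)] + \
       [random.gauss(5, 1) for _ in range(200)]

def pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

mu1, mu2, w = -1.0, 6.0, 0.5     # deliberately poor starting values
for _ in range(50):
    # E-step: responsibility of component 1 for each point.
    r = [w * pdf(x, mu1) / (w * pdf(x, mu1) + (1 - w) * pdf(x, mu2))
         for x in data]
    # M-step: re-estimate means and mixing weight from responsibilities.
    n1 = sum(r)
    mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
    mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
    w = n1 / len(data)
```

    Each iteration is guaranteed not to decrease the observed-data log-likelihood, which is the convergence property the abstract cites; the dependence on starting values is also visible here, since swapping the initial means swaps which component each mean converges to.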

    Three-dimensional graphical interfaces for the teleoperation of mobile robotic platforms

    Growing needs in healthcare make technologies such as in-home telepresence increasingly attractive. However, in the field of human-machine interfaces, it is often noted that neglecting how the information coming from the robot is presented can hinder the operator's understanding of the situation, which leads to reduced efficiency. It is by considering how the operator processes information that we can develop an interface that devotes the maximum of the operator's cognitive capacity to the task. Moreover, recent developments in high-performance, low-cost hardware allow us to apply modern real-time image processing techniques. We therefore propose to develop a flexible system for studying different ways of presenting the information relevant to the efficient navigation of a mobile robotic platform. This system is based on a three-dimensional reconstruction of the traversed environment, built from the readings of sensors commonly found on such platforms. In addition, the use of a stereoscopic video camera reproduces the perspective effect as a person on site would perceive it. A video feed is often appreciated by operators, and we believe that adding depth to our reproduction of it is an advantage. Finally, the interface's virtual camera can be continuously reoriented to provide either an egocentric or an exocentric perspective, according to the operator's preference. We validate the use of this system by evaluating, according to various metrics, the performance of operators, both novices and experts in mobile robotics, in order to properly identify the functional requirements of this kind of interface and their evaluation with target populations.
    We believe that the flexibility of the interface's virtual camera positioning remains the most important aspect of the system. Indeed, we expect it to allow each operator to adapt the interface to their preferences and to the task at hand, so as to work as efficiently as possible. Although our experiments do not include tasks specific to the field of telehealth, we believe that this work's observations on teleoperation in general will eventually apply to that particular field.

    Autoencoder for clinical data analysis and classification : data imputation, dimensional reduction, and pattern recognition

    Over the last decade, research has focused on machine learning and data mining to develop frameworks that can improve data analysis and output performance, and to build accurate decision support systems that benefit from real-life datasets. This leads to the field of clinical data analysis, which has attracted a significant amount of interest in the computing, information systems, and medical fields. To create and develop models with machine learning algorithms, the existing algorithms need a particular type of data in order to build an efficient model. Clinical datasets pose several issues that can affect the classification of the dataset: missing values, high dimensionality, and class imbalance. In order to build a framework for mining the data, it is necessary first to preprocess the data by eliminating patients' records that have too many missing values, imputing the remaining missing values, addressing high dimensionality, and classifying the data for decision support. This thesis investigates a real clinical dataset to address these challenges. An autoencoder is employed as a tool that can compress the data mining methodology by extracting features and classifying data in one model. The first step in the methodology is to impute missing values, so several imputation methods are analysed and employed. High dimensionality is then addressed by discarding irrelevant and redundant features, in order to improve prediction accuracy and reduce computational complexity. Class imbalance is manipulated to investigate its effect on feature selection and classification algorithms. The first stage of analysis investigates the role of the missing values; the results show that techniques based on class separation outperform other techniques in predictive ability. The next stage investigates high dimensionality and class imbalance: a small set of features was found to improve classification performance, while balancing the classes does not affect performance as much as the class imbalance itself.
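    The first preprocessing step the thesis describes, imputing missing values, can be sketched with the simplest baseline: column-mean imputation. The table below is made up for illustration, and the autoencoder stage itself is omitted; this only shows the shape of the imputation step that more sophisticated methods compete against.

```python
# Minimal column-mean imputation for a clinical-style table with missing
# values (None). Made-up data; sketches only the first preprocessing step.

records = [
    [70.0, 120.0, None],   # e.g. weight, blood pressure, lab value
    [None, 130.0, 5.5],
    [80.0, None,  6.1],
]

n_cols = len(records[0])

# Per-column mean over the observed entries only.
col_means = []
for j in range(n_cols):
    vals = [row[j] for row in records if row[j] is not None]
    col_means.append(sum(vals) / len(vals))

# Replace each missing entry with its column mean.
imputed = [[row[j] if row[j] is not None else col_means[j]
            for j in range(n_cols)]
           for row in records]
```

    Class-separation-based imputers, which the thesis found to have better predictive ability, would condition these estimates on the class label rather than pooling all patients into one mean.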