
    Multicriteria ranking using weights which minimize the score range

    Various schemes have been proposed for generating a set of non-subjective weights when aggregating multiple criteria for the purpose of ranking or selecting alternatives. The maximin approach chooses the weights which maximize the lowest score (assuming there is an upper bound to scores). This is equivalent to finding the weights which minimize the maximum deviation, or range, between the worst and best scores (minimax). At first glance this seems an equitable way of apportioning weight, and the Rawlsian theory of justice has been cited in its support. We draw a distinction between using the maximin rule for the purpose of assessing performance and using it for allocating resources amongst the alternatives. We demonstrate that it has a number of drawbacks which make it inappropriate for the assessment of performance. Specifically, it is tantamount to allowing the worst performers to decide the worth of the criteria so as to maximize their overall score. Furthermore, when making a selection from a list of alternatives, the final choice is highly sensitive to the removal or inclusion of alternatives whose performance is so poor that they are clearly irrelevant to the choice at hand.
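
    As a minimal sketch in our own notation (not the paper's), the maximin weight-selection rule can be written as a linear program, where x_ij denotes the score of alternative i on criterion j and w_j the weight given to criterion j:

        \max_{w,\,t} \; t
        \quad \text{s.t.} \quad \sum_{j=1}^{m} w_j x_{ij} \ge t \;\; \text{for all alternatives } i,
        \qquad \sum_{j=1}^{m} w_j = 1, \qquad w_j \ge 0

    Maximizing the lowest aggregate score t is the maximin reading; with scores bounded above, the equivalent minimax reading minimizes the gap between t and that bound, as described above.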

    Coverage & cooperation: Completing complex tasks as quickly as possible using teams of robots

    As the robotics industry grows and robots enter our homes and public spaces, they are increasingly expected to work in cooperation with each other. My thesis focuses on multirobot planning, specifically in the context of coverage robots, such as robotic lawnmowers and vacuum cleaners. Two problems unique to multirobot teams are task allocation and search. I present a task allocation algorithm which balances the workload amongst all robots in the team with the objective of minimizing the overall mission time. I also present a search algorithm which robots can use to find lost teammates; it uses a probabilistic belief of a target robot's position to create a planning tree and then searches by following the best path in the tree. For robust multirobot coverage, I use both the task allocation and search algorithms. First, the coverage region is divided into a set of small coverage tasks which minimize the number of turns the robots will need to take. These tasks are then allocated to individual robots. During the mission, robots replan with nearby robots to rebalance the workload and, once a robot has finished its tasks, it searches for teammates to help them finish their tasks faster.
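
    The thesis's exact allocation algorithm is not reproduced here; the sketch below shows one standard way to balance workload across a team, greedily assigning each coverage task (with an assumed scalar cost) to the currently least-loaded robot. All names and the cost model are our assumptions:

        import heapq

        def allocate_tasks(task_costs, num_robots):
            """Greedily assign tasks to robots, always giving the next task
            to the robot with the smallest accumulated workload -- the
            classic longest-processing-time heuristic for minimizing the
            time at which the last robot finishes (the mission time)."""
            # Place large tasks first so the greedy choice balances better.
            tasks = sorted(enumerate(task_costs), key=lambda t: -t[1])
            # Min-heap of (current_load, robot_id, assigned_task_ids).
            heap = [(0.0, r, []) for r in range(num_robots)]
            heapq.heapify(heap)
            for task_id, cost in tasks:
                load, robot, assigned = heapq.heappop(heap)
                assigned.append(task_id)
                heapq.heappush(heap, (load + cost, robot, assigned))
            return {robot: assigned for _, robot, assigned in heap}

        # Example: five tasks of varying cost split between two robots.
        print(allocate_tasks([4.0, 2.5, 3.0, 1.0, 2.0], num_robots=2))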

    Radar-based Feature Design and Multiclass Classification for Road User Recognition

    The classification of individual traffic participants is a complex task, especially in challenging scenarios with multiple road users or under bad weather conditions. Compared with well-established camera systems, radar sensors provide an orthogonal way of measuring such scenes. To obtain accurate classification results, 50 different features are extracted from the measurement data and evaluated for their performance. From these features a suitable subset is chosen and passed to random forest and long short-term memory (LSTM) classifiers to obtain class predictions for the radar input. Moreover, it is shown why data imbalance is an inherent problem in automotive radar classification when the dataset is not sufficiently large. To overcome this issue, classifier binarization is used among other techniques in order to better account for underrepresented classes. A new method for coupling the resulting probabilities is proposed and compares favourably with existing approaches. Final results show substantial improvements over ordinary multiclass classification.
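
    The paper's probability-coupling method is its own contribution and is not reproduced here; the following sketch only illustrates the general one-vs-one binarization idea, with a naive averaging of pairwise probabilities standing in for the proposed coupling. It assumes NumPy arrays and integer class labels 0..n_classes-1:

        import numpy as np
        from itertools import combinations
        from sklearn.ensemble import RandomForestClassifier

        def one_vs_one_coupled_proba(X_train, y_train, X_test, n_classes):
            """Train one binary random forest per class pair and couple the
            pairwise probabilities by simple summing and renormalizing
            (a naive stand-in for the paper's coupling method)."""
            votes = np.zeros((len(X_test), n_classes))
            for a, b in combinations(range(n_classes), 2):
                mask = np.isin(y_train, [a, b])
                clf = RandomForestClassifier(n_estimators=100)
                clf.fit(X_train[mask], y_train[mask])
                proba = clf.predict_proba(X_test)  # columns ordered [a, b]
                votes[:, a] += proba[:, 0]
                votes[:, b] += proba[:, 1]
            return votes / votes.sum(axis=1, keepdims=True)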

    Improving Photometric Redshifts using GALEX Observations for the SDSS Stripe 82 and the Next Generation of SZ Cluster Surveys

    Four large-area Sunyaev-Zeldovich (SZ) experiments -- APEX-SZ, SPT, ACT, and Planck -- promise to detect clusters of galaxies through the distortion of Cosmic Microwave Background photons by hot (> 10^6 K) cluster gas (the SZ effect) over thousands of square degrees. A large observational follow-up effort to obtain redshifts for these SZ-detected clusters is under way. Given the large area covered by these surveys, most of the redshifts will be obtained via the photometric redshift (photo-z) technique. Here we demonstrate, in an application using ~3000 SDSS stripe 82 galaxies with r<20, how the addition of GALEX photometry (FUV, NUV) greatly improves the photometric redshifts of galaxies obtained with optical griz or ugriz photometry. In the case where large spectroscopic training sets are available, empirical neural-network-based techniques (e.g., ANNz) can yield a photo-z scatter of σ_z = 0.018(1+z). If large spectroscopic training sets are not available, the addition of GALEX data makes possible the use of simple maximum likelihood techniques, without resorting to Bayesian priors, and obtains σ_z = 0.04(1+z), an accuracy that approaches that obtained using spectroscopic training of neural networks on ugriz observations. This improvement is especially notable for blue galaxies. To achieve these results, we have developed a new set of high resolution spectral templates based on physical information about the star formation history of galaxies. We envision these templates to be useful for the next generation of photo-z applications. We make our spectral templates and new photo-z catalogs available to the community at http://www.ice.csic.es/personal/jimenez/PHOTOZ .
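
    As a generic illustration of template-based maximum-likelihood photo-z estimation (not the paper's pipeline; the templates and filter set here are placeholder arrays), one scans a redshift grid, fits the best flux normalization in closed form for each template, and keeps the chi-square minimum:

        import numpy as np

        def photoz_ml(fluxes, flux_errs, template_fluxes, z_grid):
            """Maximum-likelihood photo-z: for each template and trial
            redshift, fit the optimal scale factor analytically and keep
            the redshift with the smallest chi-square.
            template_fluxes has shape (n_templates, len(z_grid), n_bands):
            the model flux of each template at each redshift per band."""
            best_chi2, best_z = np.inf, None
            for tmpl in template_fluxes:          # loop over templates
                for k, z in enumerate(z_grid):
                    model = tmpl[k]               # model fluxes per band
                    # Closed-form amplitude minimizing chi-square.
                    a = (np.sum(fluxes * model / flux_errs**2)
                         / np.sum(model**2 / flux_errs**2))
                    chi2 = np.sum(((fluxes - a * model) / flux_errs)**2)
                    if chi2 < best_chi2:
                        best_chi2, best_z = chi2, z
            return best_z, best_chi2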

    Improving Influenced Outlierness(INFLO) Outlier Detection Method

    Anomaly detection refers to the process of finding outlying records in a given dataset, and it is a subject of increasing interest among analysts across many knowledge domains. As the size of data doubles roughly every three years, there is a need to detect anomalies in large datasets as quickly as possible, and to do so with unsupervised methods. This thesis aims at implementing and comparing several state-of-the-art unsupervised outlier detection methods and proposes a way to improve on them. It covers the implementation and analysis of outlier detection algorithms such as Local Outlier Factor (LOF), Connectivity-Based Outlier Factor (COF), Local Distance-Based Outlier Factor (LDOF), and Influenced Outlierness (INFLO). The concepts of these methods are then combined to propose a new method which improves on the aforementioned ones in terms of speed and accuracy.
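
    Of the methods listed, LOF is readily available in scikit-learn; a minimal usage sketch on toy data follows (COF, LDOF, and INFLO are not part of scikit-learn and would need separate implementations):

        import numpy as np
        from sklearn.neighbors import LocalOutlierFactor

        # Toy data: a dense cluster plus a few scattered points.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 0.3, (100, 2)),
                       rng.uniform(-4, 4, (5, 2))])

        # LOF scores each point by comparing its local density with that
        # of its k nearest neighbours; markedly lower density than the
        # neighbourhood flags an outlier.
        lof = LocalOutlierFactor(n_neighbors=20)
        labels = lof.fit_predict(X)            # -1 = outlier, 1 = inlier
        scores = lof.negative_outlier_factor_  # more negative = more outlying
        print(np.where(labels == -1)[0])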

    Predicting Post-Fire Change in West Virginia, USA from Remotely-Sensed Data

    Prescribed burning is used in West Virginia, USA to return the important disturbance process of fire to oak and oak-pine forests. Species composition and structure are often the main goals for re-establishing fire, with less emphasis on fuel reduction or mitigating catastrophic wildfire. In planning prescribed fires, land managers could benefit from the ability to predict overstory tree mortality. In this study, wildfires and prescribed fires in West Virginia were examined to determine if specific landscape and terrain characteristics were associated with patches of high/moderate post-fire change. Using the ensemble machine learning approach of Random Forest, we determined that linear aspect was the most important variable associated with high/moderate post-fire change patches, followed by hillshade, aspect as class, heat load index, slope/aspect ratio (sine transformed), average roughness, and slope in degrees. These findings were then applied to a statewide spatial model for predicting post-fire change. Our results will help land managers contemplating the use of prescribed fire to spatially target landscape planning and restoration sites and to better estimate potential post-fire effects.
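
    A sketch of the kind of Random Forest importance ranking described above, with synthetic placeholder data standing in for the real terrain predictors (the variable names mirror the abstract; the data and labels here are fabricated purely for illustration):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        features = ["linear_aspect", "hillshade", "aspect_class",
                    "heat_load_index", "slope_aspect_sine",
                    "avg_roughness", "slope_deg"]

        rng = np.random.default_rng(1)
        X = rng.random((500, len(features)))           # placeholder predictors
        # Placeholder binary labels: high/moderate change vs. low change.
        y = (X[:, 0] + 0.3 * rng.random(500) > 0.8).astype(int)

        rf = RandomForestClassifier(n_estimators=500, oob_score=True)
        rf.fit(X, y)
        for name, imp in sorted(zip(features, rf.feature_importances_),
                                key=lambda p: -p[1]):
            print(f"{name:>18s}: {imp:.3f}")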

    Large-scale PIV surface flow measurements in shallow basins with different geometries

    Shallow flow depths and low velocity magnitudes often pose challenges for traditional velocity-measuring instruments. As such, new techniques have been developed that provide more reliable velocity measurements under these circumstances. In the present study, the two-dimensional (2D) surface velocity field of shallow basins is assessed by means of Large-Scale Particle Image Velocimetry (LSPIV). The measurements are carried out at the water surface, which means that a laser light sheet is not needed. Depending on the time scales of the flow and the camera characteristics, it is even possible to work with a constant light source. An experimental application of this method is presented to analyze the effects of shallow basin geometry on flow characteristics in reservoirs where large coherent two-dimensional flow structures in the mixing layer dominate the flow. The flow and boundary conditions that give rise to asymmetric flow are presented. Asymmetric flow structures were observed for basin shape ratios less than or equal to 0.96. By decreasing the basin length and increasing the shape ratio above 0.96, the flow structure generally tends towards a symmetric pattern.
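
    A minimal sketch of the core PIV computation under simplifying assumptions (a single central interrogation window and a brute-force search; real LSPIV processes a grid of windows and converts pixel displacements to velocities via the image scale and frame interval):

        import numpy as np

        def lspiv_displacement(frame_a, frame_b, win=32, search=8):
            """Estimate the displacement of one interrogation window
            between two consecutive frames by normalized cross-correlation:
            shift the window over a small search area in the second frame
            and keep the best match.  Frames are 2D grayscale arrays, each
            side at least win + 2*search pixels."""
            h, w = frame_a.shape
            y0, x0 = (h - win) // 2, (w - win) // 2   # central window only
            patch = frame_a[y0:y0 + win, x0:x0 + win].astype(float)
            patch = (patch - patch.mean()) / (patch.std() + 1e-9)
            best, best_dxy = -np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = frame_b[y0 + dy:y0 + dy + win,
                                   x0 + dx:x0 + dx + win].astype(float)
                    cand = (cand - cand.mean()) / (cand.std() + 1e-9)
                    score = np.mean(patch * cand)     # correlation coefficient
                    if score > best:
                        best, best_dxy = score, (dy, dx)
            # Displacement in pixels; velocity = displacement * scale / dt.
            return best_dxy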