    Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of irregularly shaped food products based on 3D reconstruction. However, 3D reconstruction carries a high computational cost, and some reconstruction-based volume measurement methods have low accuracy. Another approach measures object volume with the Monte Carlo method, which uses random points: it only requires knowing whether each random point falls inside or outside the object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products, without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of each food product were captured with five cameras and processed into binary images. Monte Carlo integration with heuristic adjustment was then performed to measure the volume from the information extracted from the binary images. The experimental results show that the proposed method provides high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
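
    As a rough illustration of the Monte Carlo idea described in this abstract, the sketch below samples random points in a bounding box and counts a point as inside the object only if its projection lands on the object silhouette in every binary image. The camera projection callable project_to_view, the bounding box, and the sample count are illustrative assumptions, and the paper's heuristic adjustment is omitted.

    import numpy as np

    def monte_carlo_volume(binary_views, project_to_view, bbox, n_samples=100_000):
        """Estimate object volume from multi-view binary silhouettes.

        binary_views    : list of 2D boolean arrays (True = object pixel)
        project_to_view : assumed callable mapping (3D point, view index) -> (row, col)
        bbox            : ((xmin, ymin, zmin), (xmax, ymax, zmax)) enclosing the object
        """
        lo, hi = np.asarray(bbox[0], float), np.asarray(bbox[1], float)
        points = lo + np.random.rand(n_samples, 3) * (hi - lo)

        inside = np.zeros(n_samples, dtype=bool)
        for i, p in enumerate(points):
            # A random point counts as inside only if it falls on the object
            # silhouette in every captured view (no 3D reconstruction needed).
            inside[i] = all(view[project_to_view(p, k)] for k, view in enumerate(binary_views))

        # Fraction of hits times bounding-box volume approximates the object volume.
        return float(np.prod(hi - lo) * inside.mean())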

    Unified Robust Path Planning and Optimal Trajectory Generation for Efficient 3D Area Coverage of Quadrotor UAVs

    Area coverage is an important problem in robotics, with wide use in search and rescue, offshore industrial inspection, and smart agriculture. This paper presents a novel unified robust path planning, optimal trajectory generation, and control architecture for a quadrotor coverage mission. To achieve safe navigation in uncertain working environments containing obstacles, the proposed algorithm applies a modified probabilistic roadmap to generate a connected search graph that accounts for the risk of collision with the obstacles. Furthermore, a recursive node and link generation scheme produces a more efficient search graph without extra complexity, reducing the computational burden of the planning procedure. An optimal three-dimensional trajectory generation method is then proposed to connect the optimal discrete path produced by the planning algorithm, and the robust control policy is designed based on the cascade NLH∞ framework. The integrated framework compensates for the effects of uncertainties and disturbances while accomplishing the area coverage mission. The feasibility, robustness, and performance of the proposed framework are evaluated through Monte Carlo simulations, the PX4 Software-In-the-Loop test facility, and real-world experiments.
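
    As a rough sketch of the roadmap-based planning layer described above (a basic probabilistic roadmap, not the paper's modified risk-aware variant), the example below samples collision-free nodes, links k-nearest neighbours through a collision checker, and searches the resulting graph with Dijkstra. The sample_free and segment_free callables and all parameters are assumptions.

    import heapq
    import numpy as np

    def build_prm(sample_free, segment_free, n_nodes=200, k=8):
        """Basic PRM: random free-space nodes plus k-nearest collision-free links.
        sample_free()      -> random collision-free 3D point (assumed callable)
        segment_free(a, b) -> True if the straight segment a-b avoids obstacles
        """
        nodes = np.array([sample_free() for _ in range(n_nodes)])
        edges = {i: [] for i in range(n_nodes)}
        for i in range(n_nodes):
            dists = np.linalg.norm(nodes - nodes[i], axis=1)
            for j in np.argsort(dists)[1:k + 1]:          # k nearest neighbours
                if segment_free(nodes[i], nodes[j]):
                    edges[i].append((int(j), float(dists[j])))
                    edges[int(j)].append((i, float(dists[j])))
        return nodes, edges

    def shortest_path(edges, start, goal):
        """Dijkstra search over the roadmap (assumes the goal is reachable)."""
        dist, prev, queue = {start: 0.0}, {}, [(0.0, start)]
        while queue:
            d, u = heapq.heappop(queue)
            if u == goal:
                break
            if d > dist.get(u, float("inf")):
                continue
            for v, w in edges[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v], prev[v] = d + w, u
                    heapq.heappush(queue, (d + w, v))
        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        return path[::-1]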

    A practical guide to multi-objective reinforcement learning and planning

    Real-world sequential decision-making tasks are generally complex, requiring trade-offs between multiple, often conflicting, objectives. Despite this, the majority of research in reinforcement learning and decision-theoretic planning either assumes only a single objective or assumes that multiple objectives can be adequately handled via a simple linear combination. Such approaches may oversimplify the underlying problem and hence produce suboptimal results. This paper serves as a guide to the application of multi-objective methods to difficult problems. It is aimed at researchers who are already familiar with single-objective reinforcement learning and planning methods and who wish to adopt a multi-objective perspective on their research, as well as at practitioners who encounter multi-objective decision problems in practice. It identifies the factors that may influence the nature of the desired solution and illustrates by example how these factors influence the design of multi-objective decision-making systems for complex problems. © 2022, The Author(s)
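
    A toy illustration of the simplification the paper warns about: collapsing a vector of objective returns into one scalar with a fixed weight vector, versus keeping the vector and comparing policies by Pareto dominance. The policies, objectives, and weights below are purely illustrative.

    import numpy as np

    def scalarize(returns, weights):
        """Single-objective shortcut: collapse the objective vector to one number."""
        return float(np.dot(returns, weights))

    def pareto_dominates(a, b):
        """a dominates b if it is at least as good on every objective
        and strictly better on at least one."""
        a, b = np.asarray(a), np.asarray(b)
        return bool(np.all(a >= b) and np.any(a > b))

    # Two candidate policies scored on (throughput, negated energy use).
    returns = {"A": np.array([10.0, -3.0]), "B": np.array([8.0, -1.0])}
    weights = np.array([0.5, 0.5])

    print({name: scalarize(r, weights) for name, r in returns.items()})  # both 3.5
    print(pareto_dominates(returns["A"], returns["B"]))                  # False: neither dominates

    Under these particular weights the two policies look identical, even though they embody a genuine trade-off that the scalarization cannot express; this is the kind of information loss a multi-objective treatment avoids.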

    Towards a model for managing uncertainty in logistics operations – A simulation modeling perspective

    Uncertainty rules supply chains. Unexpected changes constantly occur at all levels: strategically, through globalization, the introduction of novel technology, mergers and acquisitions, and volatile markets; and operationally, through demand fluctuations and events such as late arrival of inbound material, equipment breakdowns, and quality problems. The problem of uncertainty is growing as the industry's focus on cost reduction and efficiency tends to stretch supply chains to become longer and leaner, making them more vulnerable to disturbances. The aim of this thesis is to explore strategies for evaluating and managing uncertainties in a logistics context, with the objectives "to propose a method for modeling and analyzing the dynamics of logistics systems with an emphasis on risk management aspects" and "to explore the impact of dynamic planning and execution in a logistics system". Three main strategies for handling uncertainties are discussed: robustness, reliability, and resilience. All three strategies carry an additional cost that must be weighed against the cost and risk of logistical disruptions. As an aid in making this trade-off, a hybrid simulation approach based on discrete-event simulation and Monte Carlo simulation is proposed. A combined analytical and simulation approach is further used to explore the impact of dynamic planning and execution in a solid waste management case. Finally, a draft framework for how uncertainty can be managed in a logistics context is presented, along with the key reasons why the proposed simulation approach has proven useful in the context of logistics systems.
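
    A minimal sketch, under assumed distributions and parameters, of the hybrid idea: a single-machine discrete-event style run with random arrivals, processing times, and breakdown disruptions, wrapped in Monte Carlo replications to estimate the spread of order lead times. It is not the thesis's model, only an illustration of combining the two simulation techniques.

    import random

    def simulate_one_run(n_orders=200, arrival_rate=1.0, service_rate=1.2,
                         breakdown_prob=0.05, repair_time=5.0):
        """One single-machine run: random arrivals, queueing, and breakdown delays."""
        clock, machine_free_at, lead_times = 0.0, 0.0, []
        for _ in range(n_orders):
            clock += random.expovariate(arrival_rate)     # next order arrives
            start = max(clock, machine_free_at)           # wait if the machine is busy
            service = random.expovariate(service_rate)
            if random.random() < breakdown_prob:          # random disruption event
                service += repair_time
            machine_free_at = start + service
            lead_times.append(machine_free_at - clock)    # waiting time + processing time
        return sum(lead_times) / len(lead_times)

    def monte_carlo(n_replications=500):
        """Replicate the run to estimate the median and 95th-percentile mean lead time."""
        results = sorted(simulate_one_run() for _ in range(n_replications))
        return results[len(results) // 2], results[int(0.95 * len(results))]

    print(monte_carlo())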

    Quality-Aware Learning to Prioritize Test Cases

    Software applications evolve at a rapid rate because of continuous functionality extensions, changes in requirements, optimization of code, and fixes of faults. Moreover, modern software is often composed of components engineered in different programming languages by different internal or external teams. During this evolution, it is crucial to continuously detect unintentionally injected faults and to continuously release new features. Software testing aims at reducing this risk by running a suite of test cases regularly or at each change of the source code. However, the large number of test cases often makes it infeasible to run all of them. Automated test case prioritization and selection techniques have been studied in order to reduce the cost and improve the efficiency of testing tasks. However, the current state-of-the-art techniques remain limited in several respects. First, existing test prioritization and selection techniques often assume that faults are equally distributed across the software components, which can lead to spending most of the testing budget on components unlikely to fail rather than on those most likely to contain faults. Second, the existing techniques share a scalability problem, not only in terms of the size of the selected test suite but also in terms of the round-trip time between code commits and engineer feedback on test case failures in Continuous Integration (CI) development environments. Finally, it is hard to algorithmically capture the domain knowledge of the human testers, which is crucial in testing and release cycles. This thesis is a new take on the old problem of reducing the cost of software testing in these regards, presenting a data-driven lightweight approach for test case prioritization and execution scheduling that is used (i) during CI cycles for quick and resource-optimal feedback to engineers, and (ii) during release planning to capture the testers' domain knowledge and release requirements. Our approach combines software quality metrics with code churn metrics to build a regression model that predicts the fault density of each component and a classification model that discriminates faulty from non-faulty components. Both models are used to guide the testing effort to the components likely to contain the largest number of faults. The predictive models have been validated on eight industrial automotive software applications at Daimler, showing a classification accuracy of 89% and an accuracy of 85.7% for the regression model. The thesis then develops a test case prioritization model based on features of the code change, the test execution history, and the component development history. The model reduces the cost of CI by predicting whether a particular code change should trigger the individual test suites and their corresponding test cases. In order to algorithmically capture the domain knowledge and preferences of the tester, our approach includes a test case execution scheduling model that consumes the tester's preferences in the form of a probabilistic graph and solves the optimal test budget allocation problem, both online in the context of CI cycles and offline when planning a release. Finally, the thesis presents a theoretical cost model that describes when the prioritization and scheduling approach is worthwhile.
    The overall approach is validated on two industrial analytical applications in the area of energy management and predictive maintenance, showing that over 95% of the test failures are still reported back to the engineers while only 43% of the total available test cases are executed.
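
    A minimal sketch of the data-driven prioritization idea, with a plain least-squares fit standing in for the thesis's regression model: churn and quality metrics per component predict fault density, and each test is ranked by the highest predicted density among the components it covers. The feature set, component names, and coverage map are illustrative assumptions.

    import numpy as np

    # Illustrative per-component features: [lines changed, past faults, complexity].
    X = np.array([[120, 4, 30], [15, 0, 12], [300, 9, 55], [40, 1, 20]], dtype=float)
    y = np.array([6.0, 0.5, 11.0, 1.0])            # observed fault densities

    # Least-squares fit of a linear fault-density predictor (a stand-in for the
    # thesis's regression model over quality and churn metrics).
    A = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict_density(features):
        return float(np.dot(np.append(features, 1.0), coef))

    components = {"parser": [200, 5, 40], "ui": [10, 0, 15], "db": [90, 2, 25]}
    coverage = {"test_parse_config": ["parser"],
                "test_render_menu": ["ui"],
                "test_save_record": ["db", "parser"]}

    def priority(test):
        # A test inherits the highest predicted fault density among the components it covers.
        return max(predict_density(components[c]) for c in coverage[test])

    for test in sorted(coverage, key=priority, reverse=True):
        print(test, round(priority(test), 2))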

    Secure and cost-effective operation of low carbon power systems under multiple uncertainties

    Power system decarbonisation is driving the rapid deployment of renewable energy sources (RES) like wind and solar at the transmission and distribution levels. Their differences from the synchronous thermal plants they are displacing make secure and efficient grid operation challenging. Frequency stability is of particular concern due to the current lack of provision of frequency ancillary services, such as inertia or response, from RES generators. Furthermore, the weather dependency of RES generation, coupled with the proliferation of distributed energy resources (DER) like small-scale solar or electric vehicles, permeates future low-carbon systems with uncertainty under which legacy scheduling methods are inadequate. Overly cautious approaches to this uncertainty can lead to inefficient and expensive systems, whilst naive methods jeopardise system security. This thesis significantly advances the frequency-constrained scheduling literature by developing frameworks that explicitly account for multiple new uncertainties, in addition to RES forecast uncertainty, which is the exclusive focus of most previous works. The frameworks take the form of convex constraints that are useful in many market and scheduling problems. The constraints equip system operators with tools to explicitly guarantee their preferred level of system security whilst unlocking substantial value from emerging and abundant DERs. A major contribution is to address the exclusion of DERs from the provision of ancillary services due to the intrinsic uncertainty of their aggregation. This is done by incorporating the uncertainty into the system frequency dynamics, from which deterministic convex constraints are derived. In addition to managing uncertainty so that emerging DERs can provide legacy frequency services, a novel frequency containment service is designed. The framework allows a small amount of load shedding to assist with frequency containment during high-RES, low-inertia periods. The expected cost of this service is probabilistic, as it is proportional to the probability of a contingency occurring. The framework optimally balances the potentially higher expected cost of an outage against the day-to-day operational cost benefits of lower ancillary service requirements. The developed frameworks are applied extensively to several case studies, which validate their security and demonstrate their significant economic and emission-saving benefits.
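
    A toy illustration, with made-up numbers, of the trade-off the thesis formalises: holding more frequency response costs money every hour but reduces the expected cost of shedding load if the largest infeed is lost. The prices, probability, and contingency size below are assumptions, not values from the thesis, and the convex frequency constraints themselves are not modelled here.

    # All numbers below are illustrative assumptions.
    contingency_prob = 1e-3     # probability of losing the largest infeed in a given hour
    largest_infeed_mw = 1000.0  # size of the contingency (MW)
    response_price = 5.0        # cost of holding 1 MW of frequency response for an hour
    voll = 20_000.0             # value of lost load per MWh shed

    def expected_hourly_cost(response_mw):
        shed_mw = max(0.0, largest_infeed_mw - response_mw)   # uncovered loss must be shed
        holding_cost = response_price * response_mw
        expected_shed_cost = contingency_prob * voll * shed_mw
        return holding_cost + expected_shed_cost

    # Scan candidate response volumes and keep the cheapest in expectation.
    best = min(range(0, 1001, 50), key=expected_hourly_cost)
    print(best, round(expected_hourly_cost(best), 1))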

    Complexity challenges in ATM

    After more than four years of activity, the ComplexWorld Network, together with the projects and PhDs covered under the SESAR long-term research umbrella, has developed sound research material contributing to progress beyond the state of the art in fields such as resilience, uncertainty, multi-agent systems, metrics, and data science. The achievements of the ComplexWorld stakeholders have also led to the identification of new challenges that need to be addressed in the future. In order to pave the way for complexity science research in Air Traffic Management (ATM) in the coming years, ComplexWorld requested external assessments of how the challenges have been covered and where gaps remain. For that purpose, ComplexWorld, with the support of EUROCONTROL, established an expert panel to review selected documentation developed by the network and provide an assessment on their respective topics of expertise.