4 research outputs found

    A digital twin mixed-reality system for testing future advanced air mobility concepts: a prototype

    The UK Future Flight Vision and Roadmap defines how aviation in the UK is envisioned to develop by 2030. As part of the Future Flight demonstration segment, project HADO (High-intensity Autonomous Drone Operations) will develop, test, and deploy fully automated Unmanned Aircraft System (UAS) operations at London Heathrow Airport. The resource-demanding nature of real-world tests, however, makes the development of reliable and efficient virtual-environment-based testing methods indispensable for the evolution of such operations. Nonetheless, building a high-fidelity, real-time virtual environment that enables the safe, scalable, and sustainable development, verification, and validation of UAS operations remains a daunting task. Notably, the need to integrate physical and virtual elements with a high degree of correlation presents a significant challenge. Consequently, as part of the synthetic test environment work package within the HADO project, this paper proposes a Digital Twin (DT) system to enable mixed-reality tests in the context of autonomous UAS operations. The system connects the physical world to a digital counterpart made up of five distinct layers and several digital elements that support enhanced mixed-reality functionality. The paper describes how the static layers of the synthetic test environment are built, and presents a DT prototype that supports mixed-reality test capabilities. In particular, the ability to inject virtual obstacles into physical test environments is demonstrated, highlighting how the sharp boundaries between virtual environments and reality can be blurred for safe, flexible, efficient, and effective testing of UAS operations.
    UKRI: 1002481
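The core mixed-reality idea in this abstract is that virtual obstacles injected into the digital twin are presented to the autonomy stack alongside physically present ones. The following is a minimal, illustrative sketch of that merging step; the names `Obstacle`, `MixedRealityScene`, and the frozen-dataclass design are assumptions for illustration, not the HADO implementation.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Obstacle:
    name: str
    position: tuple  # (x, y, z) in metres, local frame (illustrative)
    virtual: bool    # True if injected into the twin, False if physically present

@dataclass
class MixedRealityScene:
    """Toy digital-twin scene that merges physical and injected obstacles."""
    obstacles: list = field(default_factory=list)

    def inject_virtual(self, name, position):
        # An injected obstacle enters the same collection as physical ones,
        # so the UAS perception/avoidance logic under test treats both alike.
        self.obstacles.append(Obstacle(name, position, virtual=True))

    def perceived(self):
        # What the autonomy stack "sees": physical and virtual, merged.
        return list(self.obstacles)
```

In this toy model, blurring the boundary between reality and simulation amounts to the perception interface returning one merged list rather than distinguishing obstacle provenance.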

    Co-simulation digital twin framework for testing future advanced air mobility concepts: a study with BlueSky and AirSim

    The UK Future Flight Vision and Roadmap outlines the anticipated development of aviation in the UK by 2030. As part of the Future Flight demonstration segment, project HADO (High-intensity Autonomous Drone Operations) will develop, test, and deploy fully automated unmanned aircraft system (UAS) operations at London Heathrow Airport. Cranfield University is leading the synthetic test environment development within the HADO project, and a digital twin (DT) prototype was developed to enable mixed-reality tests for autonomous UAS operations. This paper enhances the existing DT by introducing new co-simulation capabilities. Specifically, a co-simulation DT framework for autonomous UAS operations is proposed and tested through a demonstrative use case based on BlueSky and AirSim. The prototype integrates the traffic simulation capabilities of BlueSky with the 3D simulation capabilities of AirSim to efficiently extend the simulation capacity of the DT. Notably, the co-simulation framework can leverage the 3D visualization modules, UAS dynamics, and sensor models of external simulation tools to support a more realistic, high-fidelity simulation environment. Overall, the proposed co-simulation method can interface several simulation tools within a DT, thereby incorporating different communication protocols and realistic visualization capabilities. This creates unprecedented opportunities to combine different software applications and leverage the benefits of each tool.
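The co-simulation pattern this abstract describes (a traffic simulator advancing vehicle state while a 3D simulator mirrors it for visualisation) can be sketched as a lock-step loop. The `TrafficSim` and `VisualSim` stand-ins below are hypothetical stubs, not the BlueSky or AirSim APIs; they only illustrate the state-exchange structure of the framework.

```python
class TrafficSim:
    """Stand-in for a BlueSky-style traffic simulation (authoritative state)."""
    def __init__(self):
        self.positions = {"UAS1": (0.0, 0.0)}  # (east, north) in metres

    def step(self, dt):
        # Toy dynamics: UAS1 flies east at 5 m/s.
        x, y = self.positions["UAS1"]
        self.positions["UAS1"] = (x + 5.0 * dt, y)

class VisualSim:
    """Stand-in for an AirSim-style 3D visualiser (mirrors traffic state)."""
    def __init__(self):
        self.rendered = {}

    def set_pose(self, vehicle_id, position):
        self.rendered[vehicle_id] = position

def cosimulate(traffic, visual, dt, steps):
    """Lock-step co-simulation: traffic sim advances, 3D sim mirrors it."""
    for _ in range(steps):
        traffic.step(dt)
        for vehicle_id, position in traffic.positions.items():
            visual.set_pose(vehicle_id, position)
    return visual.rendered
```

In a real deployment, the pose hand-off would go over each tool's own protocol (BlueSky's network interface, AirSim's RPC client), which is precisely the interfacing role the DT framework plays.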

    Developing a digital twin for testing multi-agent systems in advanced air mobility: a case study of Cranfield University and airport

    Emerging unmanned aircraft system (UAS) and advanced air mobility (AAM) ecosystems rely on the development, certification and deployment of new and potentially intelligent technologies and algorithms. To promote a more efficient development life cycle, this work presents a digital twin architecture and environment to support the rapid prototyping and testing of multi-agent solutions for UAS and AAM applications. It leverages the capabilities of Microsoft AirSim and Cesium as plugins within the Unreal Engine 3D visualisation tool, and consolidates the digital environment with a flexible and scalable Python-based architecture. Moreover, the architecture supports hardware-in-the-loop (HIL) and mixed-reality features for enhanced testing capabilities. The system is comprehensively documented and demonstrated through a series of use cases, deployed within a custom digital environment comprising both indoor and outdoor areas at Cranfield University and Airport. These include collaborative surveillance, UTM flight authorisation and UTM conformance monitoring experiments, which showcase the modularity, scalability and functionality of the proposed architecture. All 3D models and experimental observations are critically evaluated and shown to exhibit promising results. The work therefore represents a critical step forward in the development of a robust digital twin for UAS and AAM applications.
    UKRI: 1002481
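One of the use cases named above, UTM conformance monitoring, reduces to checking a vehicle's track against its authorised operating volume. A minimal sketch of that check follows; the axis-aligned-box volume, the function names, and the track format are illustrative assumptions, not the paper's actual data model.

```python
def within_volume(position, volume):
    """Check a 3D position against an authorised operating volume,
    modelled here (as a simplification) as an axis-aligned box."""
    (xmin, ymin, zmin), (xmax, ymax, zmax) = volume
    x, y, z = position
    return xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax

def conformance_monitor(track, volume):
    """Return the first timestamp at which a track leaves its authorised
    volume, or None if the whole track conforms.

    track: iterable of (timestamp, (x, y, z)) samples.
    """
    for timestamp, position in track:
        if not within_volume(position, volume):
            return timestamp
    return None
```

A real UTM conformance service would use geodetic coordinates and richer 4D volumes, but the monitor's structure (stream of telemetry samples tested against an authorisation) is the same.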

    Autonomous navigation with taxiway crossings identification using camera vision and airport map

    With the increasing demands of unmanned aerial vehicle (UAV) operations envisioned for the future of aviation, the number of pilots will be much lower than the number of drones, necessitating an increased level of autonomy in drones to alleviate workload. Autonomous UAV taxiing enables a drone to move on the ground, specifically from the gate to the runway and vice versa, without human intervention. This study presents a lightweight vision-based autonomous taxiway navigation system, exploring the fusion of a camera feed from under the nose with airport map data to offer guidance and navigation. A sliding window mechanism is applied in centreline identification to detect line divergence. Centreline representations, including divergence, direction and heading, are cross-referenced with the airport database for localisation and generating navigation solutions. A simple proportional integral derivative (PID) controller is developed over aircraft dynamic models aligned with Eagle Dynamics' Digital Combat Simulator to demonstrate the centreline-following function. The overall system performance is assessed through simulations encompassing individual functionality tests, including a centreline extraction test, line matching test, line-to-follow test, generalisation capability test, and computational complexity test. The evaluations indicate the promising potential of camera vision in enabling autonomous UAV taxiing, with a 71% success rate in detecting the correct line to follow and the remaining 29% detected as background. The proposed system also suggests a high generalisation capability, with more than a 67% success rate when tested over other paths. The source code of this proposition is open-sourced at https://github.com/DelQuentin/TaxiEye
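The centreline-following function described above closes the loop with a PID controller on the vehicle's lateral offset from the detected line. The sketch below shows that standard discrete-PID structure; the gains, the `update` interface, and the toy plant in the usage note are illustrative assumptions, not the TaxiEye implementation.

```python
class PID:
    """Discrete PID controller on the lateral offset (metres) from the
    detected taxiway centreline. Output is a corrective steering command."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral term and difference the error for the
        # derivative term; the first call has no previous error to difference.
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Closing the loop on a toy first-order plant (offset reduced in proportion to the command each step) drives the lateral offset toward the centreline, which is the behaviour the line-to-follow test exercises.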