Sense-Assess-eXplain (SAX): Building Trust in Autonomous Vehicles in Challenging Real-World Driving Scenarios
This paper discusses ongoing work in demonstrating research in mobile
autonomy in challenging driving scenarios. In our approach, we address
fundamental technical issues to overcome critical barriers to assurance and
regulation for large-scale deployments of autonomous systems. To this end, we
present how we build robots that (1) can robustly sense and interpret their
environment using traditional as well as unconventional sensors; (2) can assess
their own capabilities; and (3), vitally for the purposes of assurance and trust,
can provide causal explanations of their interpretations and assessments. As it
is essential that robots are safe and trusted, we design, develop, and
demonstrate fundamental technologies in real-world applications to overcome
critical barriers which impede the current deployment of robots in economically
and socially important areas. Finally, we describe ongoing work in the
collection of an unusual, rare, and highly valuable dataset.

Comment: accepted for publication at the IEEE Intelligent Vehicles Symposium (IV), Workshop on Ensuring and Validating Safety for Automated Vehicles (EVSAV), 2020, project URL: https://ori.ox.ac.uk/projects/sense-assess-explain-sa
Fit for purpose? Predicting perception performance based on past experience
This paper explores the idea of predicting the likely performance of a robot’s perception system based on past experience in the same workspace. In particular, we propose to build a place-specific model of perception performance from observations gathered over time. We evaluate our method in a classical decision-making scenario in which the robot must choose when and where to drive autonomously, using 60 km of driving data from an urban environment. We demonstrate that leveraging visual appearance within a state-of-the-art navigation framework increases the accuracy of our performance predictions.
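The core idea of a place-specific performance model can be sketched minimally: record past perception-performance observations per place, predict future performance as their mean, and only drive autonomously where the prediction clears a threshold. The class name, the default prior, and the threshold value below are illustrative assumptions, not the paper's actual model (which additionally leverages visual appearance):

```python
from collections import defaultdict


class PlaceSpecificPerformanceModel:
    """Toy place-specific model: predict perception performance at a
    place as the mean of past performance observations recorded there."""

    def __init__(self, prior=0.5):
        self.prior = prior  # assumed prediction for places with no history
        self.history = defaultdict(list)

    def observe(self, place_id, performance):
        # Record one performance observation (e.g. a score in [0, 1]).
        self.history[place_id].append(performance)

    def predict(self, place_id):
        # Mean of past observations at this place, or the prior if unseen.
        obs = self.history[place_id]
        return sum(obs) / len(obs) if obs else self.prior

    def drive_autonomously(self, place_id, threshold=0.8):
        # Choose autonomy only where predicted performance clears the threshold.
        return self.predict(place_id) >= threshold


model = PlaceSpecificPerformanceModel()
for score in (0.9, 0.95, 0.85):
    model.observe("junction_A", score)
model.observe("tunnel_B", 0.3)

print(model.predict("junction_A"))         # mean of the three observations
print(model.drive_autonomously("junction_A"))
print(model.drive_autonomously("tunnel_B"))
```

In this sketch the decision reduces to thresholding a per-place average; the paper's contribution is to make such predictions more accurate by conditioning on visual appearance within a navigation framework.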