DESIGN AND VERIFICATION OF AUTONOMOUS SYSTEMS IN THE PRESENCE OF UNCERTAINTIES

Abstract

Autonomous systems offer hope of moving away from mechanized, unsafe, manual, and often inefficient practices. The last decade has seen several small but important steps toward making this vision a reality. These advancements have enabled limited autonomy in several domains, such as driving, factory floors, surgery, wearables, and home assistants. Nevertheless, autonomous systems are required to operate in a wide range of environments with uncertainties (viz., sensor errors, timing errors, the dynamic nature of the environment, etc.). Such environmental uncertainties, even when present in small amounts, can have a drastic impact on the safety of the system, thus hampering the goal of achieving a higher degree of autonomy, especially in safety-critical domains. To this end, this dissertation discusses formal techniques that can verify and design autonomous systems for safety, even in the presence of such uncertainties, allowing for their trustworthy deployment in the real world. Specifically, the dissertation first discusses monitoring techniques for autonomous systems from available (noisy) logs, along with safety-verification techniques for autonomous-system controllers under timing uncertainties. Second, the dissertation presents techniques for designing safe and performance-optimal autonomous systems using heterogeneous learning-based cloud computing models that can balance uncertainty in output against computation cost.