5 research outputs found

    Accountability, Responsibility and Robustness in Agent Organizations

    Accountability as a Foundation for Requirements in Sociotechnical Systems

    We understand sociotechnical systems (STSs) as uniting social and technical tiers to provide abstractions for capturing how autonomous principals interact with each other. Accountability is a foundational concept in STSs and an essential component of achieving ethical outcomes. In simple terms, accountability involves identifying who can call whom to account and who must provide an accounting of what and when. Although accountability is essential in any application involving autonomous parties, established methods do not support it. We formulate an accountability requirement as one where one principal is accountable to another regarding some conditional expectation. Our metamodel for STSs captures accountability requirements as relational constructs inspired by legal concepts such as commitments, authorization, and prohibition. We apply our metamodel to a healthcare process and show how it helps address the problems of ineffective interaction identified in the original case study.
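
    The abstract formulates an accountability requirement as a directed, conditional relation between two principals. As a minimal illustrative sketch only, such a requirement could be represented as a record linking the accountable principal, the account-taker, and the conditional expectation; the class and field names below, and the healthcare example values, are hypothetical and are not the paper's metamodel or API.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Principal:
        name: str

    @dataclass(frozen=True)
    class AccountabilityRequirement:
        accountable: Principal    # the principal who must provide an accounting
        account_taker: Principal  # the principal who may call the other to account
        antecedent: str           # condition under which the expectation is in force
        consequent: str           # what the accountable principal is expected to bring about

    # Illustrative instance loosely inspired by the healthcare setting mentioned above.
    requirement = AccountabilityRequirement(
        accountable=Principal("Physician"),
        account_taker=Principal("Hospital"),
        antecedent="patient consent is recorded",
        consequent="blood transfusion is ordered and documented",
    )
    print(requirement)

    The point of the sketch is that accountability is relational: the construct names both parties and scopes the expectation with an antecedent condition, rather than attaching an obligation to a single agent in isolation.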

    Resilience, reliability, and coordination in autonomous multi-agent systems

    Acknowledgements: The research reported in this paper was funded and supported by various grants over the years: Robotics and AI in Nuclear (RAIN) Hub (EP/R026084/1); Future AI and Robotics for Space (FAIR-SPACE) Hub (EP/R026092/1); Offshore Robotics for Certification of Assets (ORCA) Hub (EP/R026173/1); the Royal Academy of Engineering under the Chair in Emerging Technologies scheme; Trustworthy Autonomous Systems “Verifiability Node” (EP/V026801); Scrutable Autonomous Systems (EP/J012084/1); Supporting Security Policy with Effective Digital Intervention (EP/P011829/1); and The International Technology Alliance in Network and Information Sciences. Peer reviewed. Postprint.