4 research outputs found

    Report on blockchain technology & legitimacy

    Get PDF
    This report synthesizes the insights explored within the ERC BlockchainGov reading group on “Legitimacy in Blockchain,” which took place bi-weekly from July 2021 until June 2022. The report investigates the role of legitimacy in blockchain systems from descriptive, conceptual, and normative perspectives. It summarizes the discussions and provides recommendations concerning the role of legitimacy in blockchain systems, drawing from the talks held by the reading group. The organizers of the reading group are part of several initiatives, including a five-year (2021-2026), EU-funded (ERC grant of €2M) project on ‘BlockchainGov’ at the CNRS (France)/EUI (Italy), a Future Fellowship project funded by the Australian Research Council on ‘Cooperation through Code’ at RMIT (Australia), and the Coalition of Automated Legal Applications (COALA). The “Legitimacy in Blockchain” report is one of a series that includes the “Blockchain Technology, Trust, and Confidence” report (De Filippi et al., 2022) and the “Blockchain Technology & Polycentric Governance” report (De Filippi et al., forthcoming).

    Open Problems in DAOs

    Full text link
    Decentralized autonomous organizations (DAOs) are a new, rapidly growing class of organizations governed by smart contracts. Here we describe how researchers can contribute to the emerging science of DAOs and other digitally-constituted organizations. From granular privacy primitives to mechanism designs to model laws, we identify high-impact problems in the DAO ecosystem where existing gaps might be tackled through a new data set or by applying tools and ideas from existing research fields such as political science, computer science, economics, law, and organizational science. Our recommendations encompass exciting research questions as well as promising business opportunities. We call on the wider research community to join the global effort to invent the next generation of organizations.

    Safe and Responsible AI in Australia Discussion Paper: ADM+S Submission

    No full text
    The ADM+S is pleased to have this opportunity to engage with an important and complex question that confronts Australia: how should the Australian federal government take action – regulatory or otherwise – to promote artificial intelligence (and automated decision-making) that is safe and responsible? In our view, ‘responsible’ AI must also be inclusive, accountable, and genuinely beneficial – for Australia’s people, society, economy, and environment. In this submission – which is the product of research, inputs, and debate across our multi-disciplinary ARC Centre of Excellence on Automated Decision-Making and Society – we address this question in the following way.