8,437 research outputs found

    From Social Simulation to Integrative System Design

    Full text link
    As the recent financial crisis showed, there is today a strong need to gain an "ecological perspective" on all relevant interactions in socio-economic-techno-environmental systems. For this, we suggest setting up a network of Centers for Integrative Systems Design, which would be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They would be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network would each focus on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and the economy and integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, the spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public. Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c

    Measuring, Monitoring and Managing Legal Complexity

    Get PDF
    The American legal system is often accused of being “too complex.” For example, most Americans believe the Tax Code is too complex. But what does that mean, and how would one prove the Tax Code is too complex? Both the descriptive claim that an element of law is complex and the normative claim that it is too complex should be empirically testable hypotheses. Yet, in fact, very little is known about how to measure legal complexity, much less how to monitor and manage it. Legal scholars have begun to employ the science of complex adaptive systems, also known as complexity science, to probe these kinds of descriptive and normative questions about the legal system. This body of work has focused primarily on developing theories of legal complexity and positing reasons for, and ways of, managing it. Legal scholars thus have skipped the hard part—developing quantitative metrics and methods for measuring and monitoring law’s complexity. But the theory of legal complexity will remain stuck in theory until it moves to the empirical phase of study. Thinking about ways of managing legal complexity is pointless if there is no yardstick for deciding how complex the law should be. In short, the theory of legal complexity cannot be put to work without more robust empirical tools for identifying and tracking complexity in legal systems. This Article explores legal complexity at a depth not previously undertaken in legal scholarship. First, the Article orients the discussion by briefly reviewing complexity science scholarship to develop descriptive, prescriptive, and ethical theories of legal complexity. The Article then shifts to the empirical front, identifying potentially useful metrics and methods for studying legal complexity. It draws from complexity science to develop methods that have been or might be applied to measure different features of legal complexity. 
Next, the Article proposes methods for monitoring legal complexity over time, in particular by conceptualizing what we call Legal Maps—a multi-layered, active representation of the legal system network at work. Finally, the Article concludes with a preliminary examination of how the measurement and monitoring techniques could inform interventions designed to manage legal complexity by using currently available machine learning and user interface design technologies
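    As an illustrative sketch of what a quantitative legal-complexity metric might look like (this is not a method taken from the Article; the section labels and the density measure below are hypothetical), cross-references between code sections can be treated as a directed network whose edge density serves as a crude structural-complexity score:

    ```python
    # Hypothetical cross-reference graph: each section maps to the
    # sections it explicitly cites (labels are illustrative only).
    crossrefs = {
        "sec_1": ["sec_2", "sec_3"],
        "sec_2": ["sec_3"],
        "sec_3": ["sec_1"],
        "sec_4": [],
    }

    def structural_complexity(graph):
        """Edge density of the cross-reference network: the average
        number of outgoing citations per section. Denser interconnection
        is one crude proxy for structural legal complexity."""
        if not graph:
            return 0.0
        edges = sum(len(targets) for targets in graph.values())
        return edges / len(graph)

    print(structural_complexity(crossrefs))  # 4 edges over 4 sections -> 1.0
    ```

    A real study would extract such a graph from statutory text and track the metric over time, in the spirit of the multi-layered network representation the Article calls Legal Maps.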

    Complexity challenges in ATM

    Get PDF
    After more than four years of activity, the ComplexWorld Network, together with the projects and PhDs covered under the SESAR long-term research umbrella, has developed sound research material contributing to progress beyond the state of the art in fields such as resilience, uncertainty, multi-agent systems, metrics and data science. The achievements made by the ComplexWorld stakeholders have also led to the identification of new challenges that need to be addressed in the future. In order to pave the way for complexity science research in Air Traffic Management (ATM) in the coming years, ComplexWorld requested external assessments of how these challenges have been covered and where gaps remain. For that purpose, ComplexWorld, with the support of EUROCONTROL, established an expert panel to review selected documentation developed by the network and provide an assessment on their respective topics of expertise.

    Mean-Field Theory of Meta-Learning

    Full text link
    We discuss here the mean-field theory for a cellular automata model of meta-learning. Meta-learning is the process of combining the outcomes of individual learning procedures in order to reach a final decision with higher accuracy than any single learning method. Our method is constructed from an ensemble of interacting learning agents that acquire and process incoming information using various types, or different versions, of machine learning algorithms. The abstract learning space in which all agents are located is constructed here using a fully connected model that couples all agents with random strength values. The cellular automata network simulates the higher-level integration of information acquired from the independent learning trials. The final classification of incoming input data is therefore defined as the stationary state of the meta-learning system under a simple majority rule, yet minority clusters that share the opposite classification outcome can be observed in the system. Therefore, the probability of selecting the proper class for a given input can be estimated even without prior knowledge of its affiliation. Fuzzy logic can easily be introduced into the system, even if the learning agents are built from simple binary classification machine learning algorithms, by calculating the percentage of agreeing agents. Comment: 23 pages
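    The majority-rule dynamics described in this abstract can be sketched as follows. This is a minimal illustration assuming synchronous updates, binary (+1/-1) agent states, and uniformly random positive couplings; the agent count and iteration limit are arbitrary choices, not values taken from the paper:

    ```python
    import random

    random.seed(0)
    N = 15  # number of learning agents (arbitrary)

    # Initial binary classifications (+1 / -1) from independent learners
    states = [random.choice([-1, 1]) for _ in range(N)]

    # Fully connected network coupling all agents with random strengths
    J = [[random.random() for _ in range(N)] for _ in range(N)]

    def sweep(states, J):
        """One synchronous, coupling-weighted majority-rule update."""
        new = []
        for i in range(len(states)):
            field = sum(J[i][j] * s for j, s in enumerate(states) if j != i)
            new.append(1 if field > 0 else -1)
        return new

    # Iterate until a stationary state is reached
    for _ in range(50):
        nxt = sweep(states, J)
        if nxt == states:
            break
        states = nxt

    decision = 1 if states.count(1) * 2 > len(states) else -1
    # Fraction of agreeing agents: a proxy for classification confidence
    confidence = max(states.count(1), states.count(-1)) / len(states)
    ```

    With all-positive couplings the stationary state is typically full consensus; weaker or mixed couplings would be needed to reproduce the persistent minority clusters the paper describes.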

    Cyber-Physical Threat Intelligence for Critical Infrastructures Security

    Get PDF
    Modern critical infrastructures comprise many interconnected cyber and physical assets, and as such are large-scale cyber-physical systems. Hence, the conventional approach of securing these infrastructures by addressing cyber security and physical security separately is no longer effective. Rather, more integrated approaches that address the security of cyber and physical assets at the same time are required. This book presents integrated (i.e., cyber and physical) security approaches and technologies for the critical infrastructures that underpin our societies. Specifically, it introduces advanced techniques for threat detection, risk assessment and security information sharing, based on leading-edge technologies like machine learning, security knowledge modelling, IoT security and distributed ledger infrastructures. Likewise, it presents how established security technologies like Security Information and Event Management (SIEM), pen-testing, vulnerability assessment and security data analytics can be used in the context of integrated Critical Infrastructure Protection. The novel methods and techniques of the book are exemplified in case studies involving critical infrastructures in four industrial sectors, namely finance, healthcare, energy and communications. The peculiarities of critical infrastructure protection in each of these sectors are discussed and addressed based on sector-specific solutions. The advent of the fourth industrial revolution (Industry 4.0) is expected to increase the cyber-physical nature of critical infrastructures as well as their interconnection in the scope of sectorial and cross-sector value chains. Therefore, the demand for solutions that foster the interplay between cyber and physical security, and enable Cyber-Physical Threat Intelligence, is likely to explode. In this book, we shed light on the structure of such integrated security systems, as well as on the technologies that underpin their operation.
    We hope that Security and Critical Infrastructure Protection stakeholders will find the book useful when planning their future security strategies.

    Consistency and Coherence of Turtle Conservation Policies in Venu Island Wildlife Sanctuary, Kaimana, West Papua

    Get PDF
    Policy management of turtle conservation in the Venu Island Wildlife Sanctuary, Kaimana, West Papua is indispensable. This study therefore examined turtle conservation policy and its implementation, with respect to both consistency and coherence. The analytical methods used are content analysis and simple descriptive statistics. The turtle conservation management policies were found to be inconsistent in their implementation. Similarly, the coherence between turtle conservation management policy and broader government policy was found to be lacking, as the latter is oriented more toward "economic growth" than toward turtle conservation management. This is due to weak management, including weaknesses in communication, resources, attitudes and behavior, and bureaucratic structures. A "Multi-stakeholder Management Authority" was identified as a form of collaborative approach to carrying out turtle conservation among institutional managers and other key stakeholders. Consolidation of these institutions needs to be carried out by non-governmental organizations, i.e., Conservation International (Indonesia) Corridor Kaimana, and local government, i.e., the Department of Marine and Fisheries and the Department of Tourism and Culture, together with Indigenous Peoples. Policy integration among sectors, pursued with consistency and coherence, should therefore build synergy with policies for the sustainable management of turtle conservation in the Venu Island Wildlife Sanctuary, Kaimana, West Papua.