
    Detecting Evolving Patterns of Self-Organizing Networks by Flow Hierarchy Measurement

    Hierarchies occur widely in evolving self-organizing ecological, biological, technological and social networks, but detecting and comparing hierarchies is difficult. Here we present a metric and technique to quantitatively assess the extent to which self-organizing directed networks exhibit a flow hierarchy, a commonly observed but theoretically overlooked form of hierarchy in networks. We show that ecological, neurobiological, economic and information-processing networks are generally more hierarchical than comparable random networks. We further find that the degree of hierarchy has increased over the course of the evolution of the Linux kernel, confirming an early hypothesis by Herbert Simon on the emergence of hierarchy in evolutionary processes. Taken together, our results suggest that hierarchy is a central organizing feature of real-world evolving networks, and that measuring hierarchy opens the way to understanding the structural regimes and evolutionary patterns of self-organizing networks. Our measurement technique makes it possible to objectively compare the hierarchies of different networks, and of different evolutionary stages of a single network, and can be applied to any complex system that can be represented as a directed network.
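    The abstract does not reproduce the metric's definition, but the flow-hierarchy measure in this line of work is commonly defined as the fraction of a directed network's edges that do not lie on any cycle, i.e. edges not inside a nontrivial strongly connected component. A minimal sketch under that assumption (function names here are illustrative, not from the paper):

```python
from collections import defaultdict

def strongly_connected_components(nodes, edges):
    """Iterative Tarjan SCC for a directed graph given as an edge list."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    index, low, on_stack, stack, sccs = {}, {}, set(), [], []
    counter = [0]
    for root in nodes:
        if root in index:
            continue
        index[root] = low[root] = counter[0]; counter[0] += 1
        stack.append(root); on_stack.add(root)
        work = [(root, iter(adj[root]))]
        while work:
            node, it = work[-1]
            advanced = False
            for w in it:
                if w not in index:
                    # Tree edge: descend into w.
                    index[w] = low[w] = counter[0]; counter[0] += 1
                    stack.append(w); on_stack.add(w)
                    work.append((w, iter(adj[w])))
                    advanced = True
                    break
                elif w in on_stack:
                    # Back edge within the current component.
                    low[node] = min(low[node], index[w])
            if advanced:
                continue
            work.pop()
            if work:
                parent = work[-1][0]
                low[parent] = min(low[parent], low[node])
            if low[node] == index[node]:
                comp = set()
                while True:
                    w = stack.pop(); on_stack.discard(w)
                    comp.add(w)
                    if w == node:
                        break
                sccs.append(comp)
    return sccs

def flow_hierarchy(nodes, edges):
    """Fraction of edges that lie on no directed cycle.

    An edge lies on a cycle iff it is a self-loop or both endpoints
    share a strongly connected component of size > 1."""
    comp_of = {}
    for i, comp in enumerate(strongly_connected_components(nodes, edges)):
        for n in comp:
            comp_of[n] = (i, len(comp))
    cyclic = sum(1 for u, v in edges
                 if u == v or (comp_of[u][0] == comp_of[v][0]
                               and comp_of[u][1] > 1))
    return 1.0 - cyclic / len(edges)
```

    A pure DAG scores 1.0 (perfectly hierarchical); a single cycle containing every edge scores 0.0, with real networks falling in between.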

    Measuring, Monitoring and Managing Legal Complexity

    The American legal system is often accused of being “too complex.” For example, most Americans believe the Tax Code is too complex. But what does that mean, and how would one prove the Tax Code is too complex? Both the descriptive claim that an element of law is complex and the normative claim that it is too complex should be empirically testable hypotheses. Yet, in fact, very little is known about how to measure legal complexity, much less how to monitor and manage it. Legal scholars have begun to employ the science of complex adaptive systems, also known as complexity science, to probe these kinds of descriptive and normative questions about the legal system. This body of work has focused primarily on developing theories of legal complexity and positing reasons for, and ways of, managing it. Legal scholars thus have skipped the hard part—developing quantitative metrics and methods for measuring and monitoring law’s complexity. But the theory of legal complexity will remain stuck in theory until it moves to the empirical phase of study. Thinking about ways of managing legal complexity is pointless if there is no yardstick for deciding how complex the law should be. In short, the theory of legal complexity cannot be put to work without more robust empirical tools for identifying and tracking complexity in legal systems. This Article explores legal complexity at a depth not previously undertaken in legal scholarship. First, the Article orients the discussion by briefly reviewing complexity science scholarship to develop descriptive, prescriptive, and ethical theories of legal complexity. The Article then shifts to the empirical front, identifying potentially useful metrics and methods for studying legal complexity. It draws from complexity science to develop methods that have been or might be applied to measure different features of legal complexity. 
Next, the Article proposes methods for monitoring legal complexity over time, in particular by conceptualizing what we call Legal Maps—a multi-layered, active representation of the legal system network at work. Finally, the Article concludes with a preliminary examination of how the measurement and monitoring techniques could inform interventions designed to manage legal complexity by using currently available machine learning and user interface design technologies.
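    The Article argues for quantitative metrics without prescribing a formula. As one illustrative sketch of the kind of measurement it contemplates (not a method from the Article itself), structural statistics can be computed over a statute cross-reference network, with sections as nodes and citations as directed edges; all names below are hypothetical:

```python
def network_complexity_metrics(sections, cross_refs):
    """Simple structural statistics for a statute cross-reference network.

    sections: list of section identifiers (nodes).
    cross_refs: list of (citing_section, cited_section) pairs (directed edges).
    """
    n, m = len(sections), len(cross_refs)
    # Density: realized cross-references over all possible ordered pairs.
    density = m / (n * (n - 1)) if n > 1 else 0.0
    avg_out_degree = m / n if n else 0.0
    return {"sections": n,
            "cross_references": m,
            "density": density,
            "avg_out_degree": avg_out_degree}
```

    Tracking such statistics across successive versions of a code is one way the "monitoring over time" the Article proposes could be operationalized.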

    Field-control, phase-transitions, and life's emergence

    Instances of critical-like characteristics in living systems at each organizational level, as well as the spontaneous emergence of computation (Langton), indicate the relevance of self-organized criticality (SOC). But extrapolating from complex bio-systems to life's origins brings up a paradox: how could simple organics--lacking the 'soft matter' response properties of today's bio-molecules--have dissipated energy from primordial reactions in a controlled manner for their 'ordering'? Nevertheless, a causal link from life's macroscopic irreversible dynamics to the microscopic reversible laws of statistical mechanics is indicated via the 'functional takeover' of a soft magnetic scaffold by organics (cf. Cairns-Smith's 'crystal scaffold'). A field-controlled structure offers a mechanism for bootstrapping--bottom-up assembly with top-down control: its super-paramagnetic components obey reversible dynamics, but its dissipation of H-field energy for aggregation breaks time-reversal symmetry. The responsive adjustments of the controlled (host) mineral system to environmental changes would bring about mutual coupling between random organic sets supported by it; here the generation of long-range correlations within organic (guest) networks could include SOC-like mechanisms. And such cooperative adjustments enable the selection of the functional configuration by altering the inorganic network's capacity to assist a spontaneous process. A non-equilibrium dynamics could now drive the kinetically oriented system towards a series of phase transitions, with appropriate organic replacements 'taking over' its functions. (Comment: 54 pages, PDF file)
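    Self-organized criticality is conventionally illustrated with the Bak-Tang-Wiesenfeld sandpile: slow external driving plus a local threshold-relaxation rule drives the system to a critical state with avalanches at all scales, without any parameter tuning. A minimal sketch of that toy model (illustrative only, not the paper's proposed mechanism):

```python
import random

def topple(grid, size, threshold=4):
    """Relax a 2D Bak-Tang-Wiesenfeld sandpile in place; return avalanche size.

    A site at or above the threshold sheds `threshold` grains, one to each
    neighbor; grains crossing the boundary are lost (open boundary = dissipation).
    """
    avalanche = 0
    unstable = [(r, c) for r in range(size) for c in range(size)
                if grid[r][c] >= threshold]
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < threshold:
            continue  # already relaxed via an earlier pop
        grid[r][c] -= threshold
        avalanche += 1
        if grid[r][c] >= threshold:
            unstable.append((r, c))  # still over threshold: topple again later
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < size and 0 <= nc < size:
                grid[nr][nc] += 1
                if grid[nr][nc] >= threshold:
                    unstable.append((nr, nc))
    return avalanche

def drive(steps, size=10, seed=0):
    """Slow driving: add one grain at a random site per step; record avalanches."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(steps):
        r, c = rng.randrange(size), rng.randrange(size)
        grid[r][c] += 1
        sizes.append(topple(grid, size))
    return sizes
```

    After a transient, avalanche sizes in such models are distributed over many scales, which is the 'critical-like' signature the abstract refers to.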

    A Critical Analysis of Payload Anomaly-Based Intrusion Detection Systems

    Examining payload content is an important aspect of network security, particularly in today's volatile computing environment. An Intrusion Detection System (IDS) that analyzes only packet header information cannot adequately secure a network against malicious attacks. The alternative is to perform deep-packet analysis using n-gram language parsing and neural network technology. Self-Organizing Map (SOM), PAYL over Self-Organizing Maps for Intrusion Detection (POSEIDON), Anomalous Payload-based Network Intrusion Detection (PAYL), and Anagram are next-generation unsupervised payload anomaly-based IDSs. This study examines the efficacy of each system using the design-science research methodology. A collection of quantitative data and qualitative features exposes their strengths and weaknesses.
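    As published, Anagram trains on byte n-grams of normal traffic (stored compactly in Bloom filters) and scores a new packet by the fraction of its n-grams never seen during training. A simplified sketch of that scheme, substituting a plain Python set for the Bloom filter:

```python
def ngrams(payload, n=3):
    """Yield the overlapping byte n-grams of a payload."""
    return (payload[i:i + n] for i in range(len(payload) - n + 1))

class NgramAnomalyDetector:
    """Anagram-style detector: train on normal payloads, then score a
    payload by the fraction of its n-grams unseen in training
    (0.0 = entirely familiar, 1.0 = entirely novel)."""

    def __init__(self, n=3):
        self.n = n
        self.seen = set()

    def train(self, payloads):
        for p in payloads:
            self.seen.update(ngrams(p, self.n))

    def score(self, payload):
        grams = list(ngrams(payload, self.n))
        if not grams:
            return 0.0
        unseen = sum(1 for g in grams if g not in self.seen)
        return unseen / len(grams)
```

    Higher-order n-grams (n of 5 and above) capture more byte context and are harder for an attacker to mimic, at the cost of a larger model, which is why the real system uses Bloom filters rather than explicit sets.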

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing with the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenge have not been recognized, and its own methodology has not been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and science paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing. (Comment: 59 pages)

    Cultural Transformation in Health Care

    Describes the role of organizational culture in healthcare organizations and recommends strategies for innovative approaches to improve the overall performance of the U.S. healthcare system.