Heterogeneous attachment strategies optimize the topology of dynamic wireless networks
In optimizing the topology of wireless networks built of a dynamic set of
spatially embedded agents, there are many trade-offs to be dealt with. The
network should preferably be as small (in the sense that the average, or
maximal, path length is short) as possible, it should be robust to failures, not
consume too much power, and so on. In this paper, we investigate simple models
of how agents can choose their neighbors in such an environment. In our model
of attachment, we can tune from one situation where agents prefer to attach to
others in closest proximity, to a situation where distance is ignored (and thus
attachments can be made to agents further away). We evaluate this scenario with
several performance measures and find that the optimal topologies, for most of
the quantities, are obtained for strategies resulting in a mix of mostly local
and a few random connections.
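The tunable attachment strategy described in this abstract can be sketched as a toy model: each spatially embedded agent links to a fixed number of others with probability weighted by distance to the power -alpha, so a large alpha gives purely local attachment while alpha = 0 ignores distance and yields random links. All names and parameters below are illustrative assumptions, not the paper's actual model.

```python
import math
import random

def build_network(n=100, degree=4, alpha=2.0, seed=1):
    """Connect each of n agents, scattered in the unit square, to
    `degree` others chosen with probability ~ distance**(-alpha).

    alpha large  -> agents prefer their closest neighbours;
    alpha == 0   -> distance is ignored (uniformly random links).
    Illustrative sketch; not the paper's notation.
    """
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    edges = set()
    for i in range(n):
        others = [j for j in range(n) if j != i]
        # attachment weight decays with distance when alpha > 0
        weights = [math.dist(pos[i], pos[j]) ** -alpha if alpha else 1.0
                   for j in others]
        for j in rng.choices(others, weights=weights, k=degree):
            edges.add(frozenset((i, j)))
    return pos, edges
```

Sweeping alpha between these two extremes reproduces the trade-off the abstract studies: mostly local links with a few long-range ones.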
Topological Performance Measures as Surrogates for Physical Flow Models for Risk and Vulnerability Analysis for Electric Power Systems
Critical infrastructure systems must be both robust and resilient in order to
ensure the functioning of society. To improve the performance of such systems,
we often use risk and vulnerability analysis to find and address system
weaknesses. A critical component of such analyses is the ability to accurately
determine the negative consequences of various types of failures in the system.
Numerous mathematical and simulation models exist which can be used to this
end. However, there are relatively few studies comparing the implications of
using different modeling approaches in the context of comprehensive risk
analysis of critical infrastructures. Thus, in this paper, we suggest a
classification of these models, which span from simple topologically-oriented
models to advanced physical flow-based models. Here, we focus on electric power
systems and present a study aimed at understanding the tradeoffs between
simplicity and fidelity in models used in the context of risk analysis.
Specifically, the purpose of this paper is to compare performance measures
achieved with a spectrum of approaches typically used for risk and
vulnerability analysis of electric power systems and evaluate if more
simplified topological measures can be combined using statistical methods to be
used as a surrogate for physical flow models. The results of our work provide
guidance as to appropriate models or combinations of models to use when
analyzing large-scale critical infrastructure systems, where simulation times
quickly become insurmountable when using more advanced models, severely
limiting the extent of analyses that can be performed.
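The surrogate idea in this abstract, combining simple topological measures statistically so they stand in for an expensive physical flow model, can be illustrated with an ordinary-least-squares fit. The measures, weights, and data below are entirely hypothetical synthetic stand-ins, not the paper's data.

```python
import numpy as np

# Hypothetical data: each row is one failure scenario; the two
# columns are topological measures (e.g. largest-component fraction,
# network efficiency); y is the consequence a physical flow model
# would report. All numbers are synthetic for illustration.
rng = np.random.default_rng(0)
X = rng.random((200, 2))
true_w = np.array([0.7, 0.3])
y = X @ true_w + 0.01 * rng.standard_normal(200)

# Fit the surrogate: consequence ~ weighted mix of topological
# measures plus an intercept, via ordinary least squares.
A = np.c_[X, np.ones(len(X))]
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

If the fitted surrogate explains most of the flow model's variance (high R-squared), the cheap topological measures can replace the slow simulation in large scenario sweeps, which is exactly the trade-off the paper evaluates.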
Biological Robustness: Paradigms, Mechanisms, and Systems Principles
Robustness has been studied through the analysis of data sets, simulations, and a variety of experimental techniques that each have their own limitations but together confirm the ubiquity of biological robustness. Recent trends suggest that different types of perturbation (e.g., mutational, environmental) are commonly stabilized by similar mechanisms, and system sensitivities often display a long-tailed distribution with relatively few perturbations representing the majority of sensitivities. Conceptual paradigms from network theory, control theory, complexity science, and natural selection have been used to understand robustness; however, each paradigm has a limited scope of applicability, and there has been little discussion of the conditions that determine this scope or the relationships between paradigms. Systems properties such as modularity, bow-tie architectures, degeneracy, and other topological features are often positively associated with robust traits; however, common underlying mechanisms are rarely mentioned. For instance, many system properties support robustness through functional redundancy or through response diversity with responses regulated by competitive exclusion and cooperative facilitation. Moreover, few studies compare and contrast alternative strategies for achieving robustness such as homeostasis, adaptive plasticity, environment shaping, and environment tracking. These strategies share similarities in their utilization of adaptive and self-organization processes that are not well appreciated yet might be suggestive of reusable building blocks for generating robust behavior.
Efficient vasculature investment in tissues can be determined without global information
Cells are the fundamental building blocks of organs and tissues. Information and mass flow through cellular contacts in these structures is vital for the orchestration of organ function. Constraints imposed by packing and cell immobility limit intercellular communication, particularly as organs and organisms scale up to greater sizes. In order to transcend transport limitations, delivery systems including vascular and respiratory systems evolved to facilitate the movement of matter and information. The construction of these delivery systems has an associated cost, as vascular elements do not perform the metabolic functions of the organs they are part of. This study investigates a fundamental trade-off in vascularization in multicellular tissues: the reduction of path lengths for communication versus the cost associated with producing vasculature. Biologically realistic generative models, using multicellular templates of different dimensionalities, revealed a limited advantage to the vascularization of two-dimensional tissues. Strikingly, scale-free improvements in transport efficiency can be achieved even in the absence of global knowledge of tissue organization. A point of diminishing returns in the investment of additional vascular tissue to the increased reduction of path length in 2.5- and three-dimensional tissues was identified. Applying this theory to experimentally determined biological tissue structures, we show the possibility of a co-dependency between the method used to limit path length and the organization of cells it acts upon. These results provide insight as to why tissues are or are not vascularized in nature, the robustness of developmental generative mechanisms, and the extent to which vasculature is advantageous in the support of organ function.
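The path-length-versus-investment trade-off in this abstract can be caricatured on a small two-dimensional sheet of cells: adding a "vessel" hub that taps every few cells shortens mean communication paths at a cost proportional to the number of taps. The grid model, hub construction, and cost measure are toy assumptions for illustration, not the study's generative models.

```python
from collections import deque
from itertools import product

def grid_adj(n):
    """4-neighbour adjacency for an n x n sheet of cells."""
    adj = {c: set() for c in product(range(n), repeat=2)}
    for (x, y) in adj:
        for nb in ((x + 1, y), (x, y + 1)):
            if nb in adj:
                adj[(x, y)].add(nb)
                adj[nb].add((x, y))
    return adj

def mean_path_from(adj, src):
    """Mean breadth-first-search distance from src to all other nodes."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return sum(dist.values()) / (len(dist) - 1)

def vascularize(adj, spacing):
    """Add a hub ('vessel') linked to every `spacing`-th cell; return
    the new adjacency and the number of vascular links (the cost)."""
    taps = [c for c in adj if c[0] % spacing == 0 and c[1] % spacing == 0]
    new = {k: set(v) for k, v in adj.items()}
    new["vessel"] = set(taps)
    for t in taps:
        new[t] = new[t] | {"vessel"}
    return new, len(taps)
```

Comparing `mean_path_from` before and after `vascularize` at different spacings exhibits the diminishing returns the study identifies: denser tapping costs more vascular links for ever-smaller path-length gains.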
Statistical Physics of Design
Modern life increasingly relies on complex products that perform a variety of functions. The key difficulty of creating such products lies not in the manufacturing process, but in the design process. However, design problems are typically driven by multiple contradictory objectives and different stakeholders, have no obvious stopping criteria, and frequently prevent construction of prototypes or experiments. Such ill-defined, or "wicked" problems cannot be "solved" in the traditional sense with optimization methods. Instead, modern design techniques are focused on generating knowledge about the alternative solutions in the design space.
In order to facilitate such knowledge generation, in this dissertation I develop the "Systems Physics" framework that treats the emergent structures within the design space as physical objects that interact via quantifiable forces. Mathematically, Systems Physics is based on maximal entropy statistical mechanics, which allows both drawing conceptual analogies between design problems and collective phenomena and performing numerical calculations to gain quantitative understanding. Systems Physics operates via a Model-Compute-Learn loop, with each step refining our thinking of design problems.
I demonstrate the capabilities of Systems Physics in two very distinct case studies: Naval Engineering and self-assembly. For the Naval Engineering case, I focus on an established problem of arranging shipboard systems within the available hull space. I demonstrate the essential trade-off between minimizing the routing cost and maximizing the design flexibility, which can lead to abrupt phase transitions. I show how the design space can break into several locally optimal architecture classes that have very different robustness to external couplings. I illustrate how the topology of the shipboard functional network enters a tight interplay with the spatial constraints on placement. For the self-assembly problem, I show that the topology of self-assembled structures can be reliably encoded in the properties of the building blocks so that the structure and the blocks can be jointly designed.
The work presented here provides both conceptual and quantitative advancements. In order to properly port the language and the formalism of statistical mechanics to the design domain, I critically re-examine such foundational ideas as system-bath coupling, coarse graining, particle distinguishability, and direct and emergent interactions. I show that the design space can be packed into a special information structure, a tensor network, which allows seamless transition from graphical visualization to sophisticated numerical calculations.
This dissertation provides the first quantitative treatment of the design problem that is not reduced to the narrow goals of mathematical optimization. Using a statistical mechanics perspective allows me to move beyond the dichotomy of "forward" and "inverse" design and frame design as a knowledge generation process instead. Such framing opens the way to further studies of the design space structures and the time- and path-dependent phenomena in design. The present work also benefits from, and contributes to, the philosophical interpretations of statistical mechanics developed by the soft matter community in the past 20 years. The discussion goes far beyond physics and engages with literature from materials science, naval engineering, optimization problems, design theory, network theory, and economic complexity.
PhD, Physics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/163133/1/aklishin_1.pd
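The maximum-entropy machinery this dissertation builds on can be shown in miniature: enumerate a tiny arrangement design space, score each arrangement by a routing cost, and form the Boltzmann (maximum-entropy) distribution over arrangements, whose temperature parameter trades off routing cost against design flexibility. The locations, connections, and costs below are toy assumptions, not the dissertation's actual model.

```python
import math
from itertools import permutations

# Toy "design space": place 3 shipboard systems at 3 hull locations.
# System 1 must route to systems 0 and 2; the routing cost of an
# arrangement is the summed distance between connected systems.
# Illustrative numbers, not from the dissertation.
locations = [0.0, 1.0, 2.5]
connections = [(0, 1), (1, 2)]

def routing_cost(arrangement):
    """arrangement[s] is the location index assigned to system s."""
    return sum(abs(locations[arrangement[a]] - locations[arrangement[b]])
               for a, b in connections)

def boltzmann(temperature):
    """Maximum-entropy distribution over arrangements at a fixed mean
    routing cost; low temperature concentrates on cheap designs,
    high temperature keeps many alternatives alive (flexibility)."""
    arrangements = list(permutations(range(3)))
    weights = [math.exp(-routing_cost(a) / temperature) for a in arrangements]
    Z = sum(weights)  # partition function
    return {a: w / Z for a, w in zip(arrangements, weights)}
```

At low temperature the distribution collapses onto the cheapest arrangements (here, the ones placing system 1 at the middle location), mirroring the cost-versus-flexibility trade-off and the abrupt transitions discussed in the Naval Engineering case study.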
On the strengths of connectivity and robustness in general random intersection graphs
Random intersection graphs have received much attention for nearly two
decades, and currently have a wide range of applications ranging from key
predistribution in wireless sensor networks to modeling social networks. In
this paper, we investigate the strengths of connectivity and robustness in a
general random intersection graph model. Specifically, we establish sharp
asymptotic zero-one laws for k-connectivity and k-robustness, as well as
the asymptotically exact probability of k-connectivity, for any positive
integer k. The k-connectivity property quantifies how resilient the
connectivity of a graph is against node or edge failures. On the other hand,
k-robustness measures the effectiveness of local diffusion strategies (that
do not use global graph topology information) in spreading information over the
graph in the presence of misbehaving nodes. In addition to presenting the
results under the general random intersection graph model, we consider two
special cases of the general model, a binomial random intersection graph and a
uniform random intersection graph, which both have numerous applications as
well. For these two specialized graphs, our results on asymptotically exact
probabilities of k-connectivity and asymptotic zero-one laws for
k-robustness are also novel in the literature.
Comment: This paper about random graphs appears in IEEE Conference on Decision
and Control (CDC) 2014, the premier conference in control theory.
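One of the two special cases named in the abstract, the binomial random intersection graph, is straightforward to simulate: each node independently picks each of m objects with probability p, and two nodes are adjacent iff their object sets intersect. The sketch below generates such a graph and checks ordinary (1-)connectivity by breadth-first search; the parameter names are assumptions, and testing k-connectivity or k-robustness for general k is beyond this toy.

```python
import random
from collections import deque

def binomial_rig(n, m, p, seed=0):
    """Binomial random intersection graph: node i keeps each of m
    objects with probability p; nodes are adjacent iff they share
    at least one object (as in key predistribution schemes)."""
    rng = random.Random(seed)
    assigned = [{obj for obj in range(m) if rng.random() < p}
                for _ in range(n)]
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if assigned[i] & assigned[j]:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def is_connected(adj):
    """Breadth-first search from node 0; connected iff all reached."""
    seen = {0}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u] - seen:
            seen.add(v)
            queue.append(v)
    return len(seen) == len(adj)
```

Sweeping p for fixed n and m and recording the fraction of connected samples exhibits empirically the sharp zero-one transition that the paper establishes analytically.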
Distributed Integrated Circuits: An Alternative Approach to High-Frequency Design
Distributed integrated circuits are presented as a methodology to design high-frequency communication building blocks. Distributed circuits operate based on multiple parallel signal paths working in synchronization that can be used to enhance the frequency of operation, combine power, and enhance the robustness of the design. These multiple signal paths usually result in strong couplings inside the circuit that necessitate
a treatment spanning architecture, circuits, devices, and electromagnetic levels of abstraction.