995 research outputs found

    A contrasting look at self-organization in the Internet and next-generation communication networks

    This article examines contrasting notions of self-organization in the Internet and next-generation communication networks, by reviewing in some detail recent evidence regarding several of the more popular attempts to explain prominent features of Internet structure and behavior as "emergent phenomena." In these examples, what might appear to the nonexpert as "emergent self-organization" in the Internet actually results from well-conceived (albeit perhaps ad hoc) design, with explanations that are mathematically rigorous, in agreement with engineering reality, and fully consistent with network measurements. These examples serve as concrete starting points from which networking researchers can assess whether or not explanations involving self-organization are relevant or appropriate in the context of next-generation communication networks, while also highlighting the main differences between approaches to self-organization that are rooted in engineering design vs. those inspired by statistical physics.

    An appreciative inquiry into the transformative learning experiences of students in a family literacy project

    Educational discourse has often struggled to genuinely move beyond deficit-based language. Even action research, a predominant model for teacher development, starts with the identification of a problem (Cardno 2003). It would appear that the vocabulary for a hope-filled discourse which captures the imagination and influences our future educational activity seems to have escaped us. Moreover, we seem bereft of educational contexts where the experience for students is holistic and transformative.

    Diversity of graphs with highly variable connectivity

    A popular approach for describing the structure of many complex networks focuses on graph theoretic properties that characterize their large-scale connectivity. While it is generally recognized that such descriptions based on aggregate statistics do not uniquely characterize a particular graph and also that many such statistical features are interdependent, the relationship between competing descriptions is not entirely understood. This paper lends perspective on this problem by showing how the degree sequence and other constraints (e.g., connectedness, no self-loops or parallel edges) on a particular graph play a primary role in dictating many features, including its correlation structure. Building on recent work, we show how a simple structural metric characterizes key differences between graphs having the same degree sequence. More broadly, we show how the (often implicit) choice of a background set against which to measure graph features has serious implications for the interpretation and comparability of graph theoretic descriptions.
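    The diversity the abstract describes can be demonstrated with degree-preserving edge rewiring, a standard construction (an illustration of the general idea, not the authors' method): repeated double-edge swaps leave every node's degree fixed while changing the graph's other structural features.

    ```python
    import random

    def double_edge_swap(edges, n_swaps, seed=0):
        """Degree-preserving rewiring: repeatedly replace edge pairs
        (a, b), (c, d) with (a, c), (b, d).  Every node keeps its degree,
        yet degree correlations and other aggregate features drift, so one
        degree sequence corresponds to many structurally different graphs."""
        rng = random.Random(seed)
        edges = [tuple(e) for e in edges]
        present = {frozenset(e) for e in edges}
        for _ in range(n_swaps):
            i, j = rng.sample(range(len(edges)), 2)
            a, b = edges[i]
            c, d = edges[j]
            if len({a, b, c, d}) < 4:
                continue  # would create a self-loop
            if frozenset((a, c)) in present or frozenset((b, d)) in present:
                continue  # would create a parallel edge
            present -= {frozenset((a, b)), frozenset((c, d))}
            present |= {frozenset((a, c)), frozenset((b, d))}
            edges[i], edges[j] = (a, c), (b, d)
        return edges

    # Example: rewire a small graph; the degree sequence is unchanged.
    g = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2), (1, 3)]
    print(double_edge_swap(g, 100))
    ```

    Because swaps that would introduce self-loops or parallel edges are rejected, the output stays inside the constrained background set of simple graphs mentioned above.
    
    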

    Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms to create robustness. However, this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.

    Mathematics and the Internet: A Source of Enormous Confusion and Great Potential

    Graph theory models the Internet mathematically, and a number of plausible, mathematically interesting network models for the Internet have been developed and studied. Simultaneously, Internet researchers have developed methodology to use real data to validate, or invalidate, proposed Internet models. The authors look at these parallel developments, particularly as they apply to scale-free network models of the preferential attachment type.
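    For readers unfamiliar with the model class under scrutiny, a minimal sketch of the generic preferential attachment mechanism (an illustration only, not any specific model from the literature):

    ```python
    import random

    def preferential_attachment(n, m, seed=0):
        """Barabasi-Albert-style generator: each new node attaches m edges
        to distinct existing nodes chosen with probability proportional to
        current degree, realized by uniform sampling from a list in which
        each node appears once per unit of degree."""
        rng = random.Random(seed)
        edges = []
        repeated = []              # degree-weighted node list
        targets = list(range(m))   # first arrival links to the m seed nodes
        for new in range(m, n):
            for t in targets:
                edges.append((new, t))
            repeated.extend(targets)
            repeated.extend([new] * m)
            targets = []
            while len(targets) < m:  # m distinct degree-proportional picks
                t = rng.choice(repeated)
                if t not in targets:
                    targets.append(t)
        return edges

    # n = 200 nodes, m = 2 edges per arrival: (200 - 2) * 2 = 396 edges
    print(len(preferential_attachment(200, 2)))  # 396
    ```

    Graphs grown this way exhibit the heavy-tailed degree sequences whose interpretation, as the abstract notes, has been a source of both confusion and potential.
    
    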

    The magnitude distribution of earthquakes near Southern California faults

    We investigate seismicity near faults in the Southern California Earthquake Center Community Fault Model. We search for anomalously large events that might be signs of a characteristic earthquake distribution. We find that seismicity near major fault zones in Southern California is well modeled by a Gutenberg-Richter distribution, with no evidence of characteristic earthquakes within the resolution limits of the modern instrumental catalog. However, the b value of the locally observed magnitude distribution is found to depend on distance to the nearest mapped fault segment, which suggests that earthquakes nucleating near major faults are likely to have larger magnitudes relative to earthquakes nucleating far from major faults.
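    The b value discussed here is commonly estimated with the Aki-Utsu maximum-likelihood formula; a minimal sketch under that standard assumption (not the authors' code), checked against a synthetic Gutenberg-Richter catalog:

    ```python
    import math
    import random

    def b_value_aki(magnitudes, m_c, dm=0.0):
        """Aki-Utsu maximum-likelihood estimate of the Gutenberg-Richter
        b value from magnitudes at or above the completeness threshold m_c.
        dm is the catalog's magnitude-binning width; the standard correction
        shifts the threshold by dm/2 for binned magnitudes."""
        m = [x for x in magnitudes if x >= m_c]
        mean_m = sum(m) / len(m)
        return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

    # Synthetic Gutenberg-Richter catalog with true b = 1.0 above m_c = 2.0:
    # magnitudes above the threshold are exponential with rate b * ln(10).
    random.seed(0)
    mags = [2.0 + random.expovariate(1.0 * math.log(10)) for _ in range(50000)]
    print(b_value_aki(mags, m_c=2.0))  # close to 1.0
    ```

    A distance-dependent b value of the kind the abstract reports would show up as systematically different estimates when this formula is applied to near-fault and far-fault subsets of a catalog.
    
    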

    More "normal" than normal: scaling distributions and complex systems

    One feature of many naturally occurring or engineered complex systems is tremendous variability in event sizes. To account for it, the behavior of these systems is often described using power law relationships or scaling distributions, which tend to be viewed as "exotic" because of their unusual properties (e.g., infinite moments). An alternate view is based on mathematical, statistical, and data-analytic arguments and suggests that scaling distributions should be viewed as "more normal than normal". In support of this latter view that has been advocated by Mandelbrot for the last 40 years, we review in this paper some relevant results from probability theory and illustrate a powerful statistical approach for deciding whether the variability associated with observed event sizes is consistent with an underlying Gaussian-type (finite variance) or scaling-type (infinite variance) distribution. We contrast this approach with traditional model fitting techniques and discuss its implications for future modeling of complex systems.
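    One standard diagnostic in this spirit (an illustration, not the authors' specific statistical approach) is the Hill estimator of the tail index, which behaves very differently on scaling-type and finite-variance data:

    ```python
    import math
    import random

    def hill_alpha(samples, k):
        """Hill estimator of the tail index alpha from the k largest samples.
        For a scaling (Pareto-type) tail the estimate is stable in k; for
        light (Gaussian/exponential-type) tails it is larger and drifts
        with the choice of threshold."""
        xs = sorted(samples, reverse=True)[:k + 1]
        x_ref = xs[k]  # (k+1)-th largest value, used as the tail threshold
        return k / sum(math.log(x / x_ref) for x in xs[:k])

    random.seed(1)
    pareto = [random.paretovariate(1.5) for _ in range(20000)]  # infinite variance
    expo = [random.expovariate(1.0) for _ in range(20000)]      # light tail
    print(hill_alpha(pareto, k=1000))  # close to the true alpha = 1.5
    print(hill_alpha(expo, k=1000))    # much larger, threshold-dependent
    ```

    An estimated alpha below 2 that stays stable as k varies is consistent with an infinite-variance scaling distribution; estimates that climb as the threshold rises point to a finite-variance model.
    
    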

    Towards a Theory of Scale-Free Graphs: Definition, Properties, and Implications (Extended Version)

    Although the "scale-free" literature is large and growing, it gives neither a precise definition of scale-free graphs nor rigorous proofs of many of their claimed properties. In fact, it is easily shown that the existing theory has many inherent contradictions and verifiably false claims. In this paper, we propose a new, mathematically precise, and structural definition of the extent to which a graph is scale-free, and prove a series of results that recover many of the claimed properties while suggesting the potential for a rich and interesting theory. With this definition, scale-free (or its opposite, scale-rich) is closely related to other structural graph properties such as various notions of self-similarity (or respectively, self-dissimilarity). Scale-free graphs are also shown to be the likely outcome of random construction processes, consistent with the heuristic definitions implicit in existing random graph approaches. Our approach clarifies much of the confusion surrounding the sensational qualitative claims in the scale-free literature, and offers rigorous and quantitative alternatives. (Comment: 44 pages, 16 figures. The primary version is to appear in Internet Mathematics, 2005.)
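    The structural definition proposed in this line of work is based on the s-metric, s(g) = Σ_{(i,j)∈E} d_i d_j, where d_i is the degree of node i; a minimal sketch on toy graphs chosen for illustration:

    ```python
    from collections import Counter

    def s_metric(edges):
        """s(g) = sum over edges (i, j) of d_i * d_j.  Among graphs with the
        same degree sequence, high s(g) means high-degree nodes attach to
        one another; low s(g) means hubs attach mostly to low-degree
        nodes."""
        deg = Counter()
        for i, j in edges:
            deg[i] += 1
            deg[j] += 1
        return sum(deg[i] * deg[j] for i, j in edges)

    # Two toy graphs sharing the degree sequence (3, 2, 2, 2, 1):
    g_high = [(0, 1), (0, 2), (0, 3), (1, 2), (3, 4)]  # hub meets degree-2 nodes
    g_low = [(0, 1), (0, 2), (0, 4), (1, 3), (2, 3)]   # hub meets the leaf
    print(s_metric(g_high), s_metric(g_low))  # 24 23
    ```

    Normalizing s(g) by its maximum over all graphs with the same degree sequence yields the measure of how scale-free (versus scale-rich) a given graph is.
    
    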

    On-field player workload exposure and knee injury risk monitoring via deep learning

    In sports analytics, an understanding of accurate on-field 3D knee joint moments (KJM) could provide an early warning system for athlete workload exposure and knee injury risk. Traditionally, this analysis has relied on captive laboratory force plates and associated downstream biomechanical modeling, and many researchers have approached the problem of portability by extrapolating models built on linear statistics. An alternative approach would be to capitalize on recent advances in deep learning. In this study, using the pre-trained CaffeNet convolutional neural network (CNN) model, multivariate regression models mapping marker-based motion capture to 3D KJM were compared for three sports-related movement types. The strongest overall mean correlation to source modeling of 0.8895 was achieved over the initial 33% of stance phase for sidestepping. The accuracy of these mean predictions of the three critical KJM associated with anterior cruciate ligament (ACL) injury demonstrates the feasibility of on-field knee injury assessment using deep learning in lieu of laboratory embedded force plates. This multidisciplinary research approach significantly advances machine representation of real-world physical models with practical application for both community and professional level athletes.

    Optimal vaccination in a stochastic epidemic model of two non-interacting populations

    Developing robust, quantitative methods to optimize resource allocations in response to epidemics has the potential to save lives and minimize health care costs. In this paper, we develop and apply a computationally efficient algorithm that enables us to calculate the complete probability distribution for the final epidemic size in a stochastic Susceptible-Infected-Recovered (SIR) model. Based on these results, we determine the optimal allocations of a limited quantity of vaccine between two non-interacting populations. We compare the stochastic solution to results obtained for the traditional, deterministic SIR model. For intermediate quantities of vaccine, the deterministic model is a poor estimate of the optimal strategy for the more realistic, stochastic case. (Comment: 21 pages, 7 figures.)
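    For a small population, the final-size distribution of a stochastic SIR model can be computed exactly by dynamic programming on the embedded jump chain; a minimal sketch of this standard approach (not necessarily the authors' algorithm):

    ```python
    from collections import defaultdict

    def final_size_dist(s0, i0, beta, gamma, n):
        """Exact distribution of the final epidemic size (initially
        susceptible individuals ever infected) in a stochastic SIR model
        with infection rate beta*S*I/n and recovery rate gamma*I.  From
        state (S, I) = (s, i) with i > 0, the next event is an infection
        with probability beta*s / (beta*s + gamma*n), independent of i."""
        prob = defaultdict(float)
        prob[(s0, i0)] = 1.0
        final = {}
        for s in range(s0, -1, -1):           # infections only decrease s
            for i in range(i0 + (s0 - s), 0, -1):
                p = prob[(s, i)]
                if p == 0.0:
                    continue
                p_inf = beta * s / (beta * s + gamma * n) if s > 0 else 0.0
                prob[(s - 1, i + 1)] += p * p_inf       # infection: S-1, I+1
                prob[(s, i - 1)] += p * (1.0 - p_inf)   # recovery: I-1
            final[s0 - s] = prob[(s, 0)]  # absorbed with s susceptibles left
        return final

    # Tiny check: one susceptible, one infective, beta = gamma, n = 2:
    # P(final size 0) = 2/3 and P(final size 1) = 1/3.
    print(final_size_dist(1, 1, 1.0, 1.0, 2))
    ```

    Running this for each candidate split of the vaccine stockpile (vaccination reducing each population's initial s0) gives the full outcome distributions on which a stochastic optimal allocation can be based.
    
    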