756 research outputs found
An appreciative inquiry into the transformative learning experiences of students in a family literacy project
Educational discourse has often struggled to genuinely move beyond deficit-based language. Even action research, a predominant model for teacher development, starts with the identification of a problem (Cardno 2003). The vocabulary for a hope-filled discourse which captures the imagination and influences our future educational activity seems to have escaped us. Moreover, we seem bereft of educational contexts where the experience for students is holistic and transformative.
A contrasting look at self-organization in the Internet and next-generation communication networks
This article examines contrasting notions of self-organization in the Internet and next-generation communication networks, by reviewing in some detail recent evidence regarding several of the more popular attempts to explain prominent features of Internet structure and behavior as "emergent phenomena." In these examples, what might appear to the nonexpert as "emergent self-organization" in the Internet actually results from well conceived (albeit perhaps ad hoc) design, with explanations that are mathematically rigorous, in agreement with engineering reality, and fully consistent with network measurements. These examples serve as concrete starting points from which networking researchers can assess whether or not explanations involving self-organization are relevant or appropriate in the context of next-generation communication networks, while also highlighting the main differences between approaches to self-organization that are rooted in engineering design vs. those inspired by statistical physics.
Diversity of graphs with highly variable connectivity
A popular approach for describing the structure of many complex networks focuses on graph theoretic properties that characterize their large-scale connectivity. While it is generally recognized that such descriptions based on aggregate statistics do not uniquely characterize a particular graph and also that many such statistical features are interdependent, the relationship between competing descriptions is not entirely understood. This paper lends perspective on this problem by showing how the degree sequence and other constraints (e.g., connectedness, no self-loops or parallel edges) on a particular graph play a primary role in dictating many features, including its correlation structure. Building on recent work, we show how a simple structural metric characterizes key differences between graphs having the same degree sequence. More broadly, we show how the (often implicit) choice of a background set against which to measure graph features has serious implications for the interpretation and comparability of graph theoretic descriptions.
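To make the idea of such a structural metric concrete, the following Python sketch (assuming the networkx library; the metric shown, the sum of degree products over edges, is an illustrative choice and not necessarily the exact metric developed in the paper) compares two graphs that share a degree sequence but differ in how their high-degree nodes are interconnected.

```python
# Sketch: comparing graphs with identical degree sequences via a simple
# degree-based structural metric (sum of d_i * d_j over all edges).
# Assumes networkx is available; the metric is illustrative only.
import networkx as nx

def s_metric(G):
    """Sum of degree products over edges; larger values indicate that
    high-degree nodes tend to connect to other high-degree nodes."""
    deg = dict(G.degree())
    return sum(deg[u] * deg[v] for u, v in G.edges())

# Two graphs with the same degree sequence can differ substantially in this
# metric after degree-preserving rewiring.
G1 = nx.barabasi_albert_graph(200, 2, seed=1)
G2 = G1.copy()
nx.double_edge_swap(G2, nswap=500, max_tries=10000, seed=1)  # preserves degrees

print("degree sequences equal:",
      sorted(d for _, d in G1.degree()) == sorted(d for _, d in G2.degree()))
print("s(G1) =", s_metric(G1), " s(G2) =", s_metric(G2))
```

The point of the comparison is that aggregate statistics such as the degree sequence leave considerable structural diversity unresolved, which a metric of this kind makes visible.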
Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures
There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms to create robustness. However, this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.
Mathematics and the Internet: A Source of Enormous Confusion and Great Potential
Graph theory models the Internet mathematically, and a number of plausible, mathematically interesting network models for the Internet have been developed and studied. Simultaneously, Internet researchers have developed methodology to use real data to validate, or invalidate, proposed Internet models. The authors look at these parallel developments, particularly as they apply to scale-free network models of the preferential attachment type.
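As a generic illustration of the model class referred to here (not the authors' specific models or their validation methodology), the following Python sketch, assuming the networkx library, generates a preferential-attachment graph and tabulates its degree distribution.

```python
# Sketch: a generic preferential-attachment (Barabasi-Albert) graph and its
# empirical degree distribution. Illustrates the model type only; the paper's
# specific models and validation methodology are not reproduced here.
import collections
import networkx as nx

G = nx.barabasi_albert_graph(n=10000, m=2, seed=42)
degree_counts = collections.Counter(d for _, d in G.degree())

# Print counts for the smallest degrees; in preferential-attachment models
# the degree distribution has an approximate power-law tail.
for k in sorted(degree_counts)[:10]:
    print(f"degree {k}: {degree_counts[k]} nodes")
```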
The magnitude distribution of earthquakes near Southern California faults
We investigate seismicity near faults in the Southern California Earthquake Center Community Fault Model. We search for anomalously large events that might be signs of a characteristic earthquake distribution. We find that seismicity near major fault zones in Southern California is well modeled by a Gutenberg-Richter distribution, with no evidence of characteristic earthquakes within the resolution limits of the modern instrumental catalog. However, the b value of the locally observed magnitude distribution is found to depend on distance to the nearest mapped fault segment, which suggests that earthquakes nucleating near major faults are likely to have larger magnitudes relative to earthquakes nucleating far from major faults.
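For background on the b value discussed here, the Gutenberg-Richter relation log10 N(>=M) = a - b*M implies that magnitudes above a completeness threshold Mc are approximately exponentially distributed, which yields the classical maximum-likelihood estimate b = log10(e) / (mean(M) - Mc) (Aki, 1965). The Python sketch below applies this estimator to a synthetic catalog; the catalog and the value of Mc are illustrative assumptions and do not reproduce the paper's data or method.

```python
# Sketch: maximum-likelihood b-value estimate for a Gutenberg-Richter
# magnitude distribution (Aki, 1965): b = log10(e) / (mean(M) - Mc).
# The synthetic catalog and completeness magnitude Mc are assumptions
# for illustration only.
import math
import random

random.seed(0)
b_true, Mc = 1.0, 2.5

# Gutenberg-Richter magnitudes above Mc are exponentially distributed
# with rate beta = b * ln(10).
beta = b_true * math.log(10)
magnitudes = [Mc + random.expovariate(beta) for _ in range(5000)]

mean_mag = sum(magnitudes) / len(magnitudes)
b_est = math.log10(math.e) / (mean_mag - Mc)
print(f"estimated b value: {b_est:.3f} (true value {b_true})")
```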
More "normal" than normal: scaling distributions and complex systems
One feature of many naturally occurring or engineered complex systems is tremendous variability in event sizes. To account for it, the behavior of these systems is often described using power law relationships or scaling distributions, which tend to be viewed as "exotic" because of their unusual properties (e.g., infinite moments). An alternate view is based on mathematical, statistical, and data-analytic arguments and suggests that scaling distributions should be viewed as "more normal than normal". In support of this latter view that has been advocated by Mandelbrot for the last 40 years, we review in this paper some relevant results from probability theory and illustrate a powerful statistical approach for deciding whether the variability associated with observed event sizes is consistent with an underlying Gaussian-type (finite variance) or scaling-type (infinite variance) distribution. We contrast this approach with traditional model fitting techniques and discuss its implications for future modeling of complex systems.
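One elementary diagnostic in this spirit (illustrative only; the paper develops its own, more careful statistical approach) is the max-to-sum ratio: for p = 2, max(X_i^2) / sum(X_i^2) tends to zero as the sample grows when the variance is finite, but remains non-negligible for infinite-variance, scaling-type data. The Python sketch below contrasts the two cases on synthetic samples.

```python
# Sketch: a simple max-to-sum diagnostic for heavy tails. For p = 2, the ratio
# max(X_i^2) / sum(X_i^2) shrinks toward 0 with growing n when the variance is
# finite, but stays non-negligible for infinite-variance (scaling) data.
# Illustrative diagnostic only, not the paper's specific procedure.
import random

random.seed(1)

def max_to_sum_ratio(sample, p=2):
    powered = [abs(x) ** p for x in sample]
    return max(powered) / sum(powered)

n = 100_000
gaussian = [random.gauss(0, 1) for _ in range(n)]
# Pareto with tail index alpha = 1.5 has infinite variance.
pareto = [random.paretovariate(1.5) for _ in range(n)]

print("Gaussian-type sample:", round(max_to_sum_ratio(gaussian), 4))
print("Scaling-type sample: ", round(max_to_sum_ratio(pareto), 4))
```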
Correlation Between Computed Contact Parameters and Wear Patterns on a Retrieved UHMWPE Tibial Insert
Throughout the life of a total knee arthroplasty implant, repeated loading causes wear on the contact surfaces. Attempts have been made in the past to predict locations of wear through computational modeling and physical testing. This study examines a method that uses computer modeling techniques to describe the kinematics of an implant and then uses those kinematic data to find areas of contact and internal shear stress that correlate with observed wear damage. A retrieved cruciate-retaining knee implant was reverse engineered, analyzed in one cycle of simulated gait using multibody dynamics, and aligned according to the resulting kinematic data for finite element analysis. Results showed a correlation between the predicted areas of contact and internal shear stresses and the observed wear damage.
Connecting Carrier's Liability for Loss or Damage to Shipments
Is a carrier liable for a shipment it did not receive? What is the situation when a carrier receives only part of the goods from the preceding carrier, or when it receives them all but in damaged condition? How is the carrier's liability affected if the damage is latent or patent? Discussion of these questions will be limited to shipments in interstate commerce and in three basic areas: (1) carrier's common law liability, (2) effect of federal enactments, and (3) establishment of a prima facie case.
Contributory Negligence in Medical Malpractice
Three categories of cases have been noted out of the mass of factually individualistic ones concerning medical malpractice and contributory negligence. The first, where a breach of duty owed the patient by the physician is lacking, involves an injury produced by the patient's own negligence. In the second, the patient's negligence directly contributes to the severity of an injury already present because of the physician's negligence. The plaintiff-patient's damages are not mitigated but rather entirely precluded in light of his acts. Thus a plea of contributory negligence is a complete defense. The third category includes those cases where a time lag exists between the separate negligent acts, each of which produces significant injury. The physician is chargeable only with the consequences of his own negligence, not the subsequent acts of his patient. Recognizing these generalized differences, it would be beneficial to an understanding of this field of tort law to review the nature of contributory negligence and its application to medical malpractice litigation. Cases will follow to crystallize some of this fundamental law.