Towards a Structural View of Resilience
The result of resilience is persistence: the maintenance
of certain characteristic behavioral properties in the face
of stress, strain and surprise. But the origins of this resilient
behavior lie in the structure of the systems which
concern us. Our need as policy analysts may only be one of
comparative measures: Which system is more resilient? But
as active designers -- as engineers, managers, or responsible
policy advisors -- we need to be able to say what mechanisms
or relationships make a system resilient, and what actions we
can take to make it more or less so.
This need for a causal view of resilience led us to a
search for persistence-promoting (or "resilient") mechanisms
and relationships in a variety of natural and man-made systems.
A Case Study of Forest Ecosystem Pest Management
The boreal forests of North America have, for centuries, experienced periodic outbreaks of a defoliating insect called the Spruce Budworm. In any one outbreak cycle a major proportion of the mature softwood forest in affected areas can die, with major consequences for the economy and employment of regions like New Brunswick, which are highly dependent on the forest industry. An extensive insecticide spraying programme initiated in New Brunswick in 1951 has succeeded in minimizing tree mortality, but at the price of maintaining incipient outbreak conditions over an area considerably more extensive than in the past. The present management approach is, therefore, particularly sensitive to unexpected shifts in economic, social and regulatory constraints, and to unanticipated behavior of the forest ecosystem.
Most major environmental problems in the world today are characterized by similar basic ingredients: high variability in space and time, large scale, and a troubled management history. Because of their enormous complexity there has been little concerted effort to apply systems analysis techniques to the coordinated development of effective descriptions of, and prescriptions for, such problems. The Budworm-forest system seemed to present an admirable focus for a case study with two objectives. The first, of course, was to attempt to develop sets of alternate policies appropriate for the specific problem. But the more significant purpose was to see just how far we could stretch the state of the art capabilities in ecology, modeling, optimization, policy design and evaluation to apply them to complex ecosystem management problems.
Three principal issues in any resource or environmental problem challenge existing techniques. The first is complexity: the resources that provide the food, fibre and recreational opportunities for society are integral parts of ecosystems characterized by complex interrelationships of many species among each other and with the land, water and climate in which they live. The interactions of these systems are highly non-linear and have a significant spatial component. Events at any one point in space, just as at any moment in time, can affect events at other points in space and time. The resulting high order of dimensionality becomes all the more significant as these ecological systems couple with complex social and economic ones.
The second prime challenge is that we have only partial knowledge of the variables and relationships governing the systems. A large body of theoretical and experimental analysis and data has led to an identification of the general form and kind of functional relations existing between organisms. But only occasionally is there a rich body of data specific to any one situation. To develop an analysis which implicitly or explicitly presumes sufficient knowledge is therefore to guarantee management policies that become more the source of the problem than the source of the solution. In a particularly challenging way, present ecological management situations require concepts and techniques which cope creatively with the uncertainties and unknowns that in fact pervade most of our major social, economic and environmental problems.
The third and final challenge reflects the previous two: How can we design policies that achieve specific social objectives and yet are still "robust"? Policies which, once set in play, produce intelligently linked ecological, social and economic systems that can absorb the unexpected events and unknowns that will inevitably appear. These "unexpecteds" might be the one-in-a-thousand-year drought that perversely occurs this year; the appearance or disappearance of key species; the emergence of new economic and regulatory constraints; or the shift of societal objectives. We must learn to design in a way which shifts our emphasis away from minimizing the probability of failure, towards minimizing the cost of those failures which will inevitably occur.
Process Models, Equilibrium Structures, and Population Dynamics: On the Formulation and Testing of Realistic Theory in Ecology
This paper addresses problems in the formulation and testing of theory to relate structure and dynamic behaviour in complex natural ecosystems. Detailed studies of spruce budworm-coniferous forest interactions in eastern Canada provide a background for the analysis. We argue that the mixed spatial and temporal scales, low density phenomena, and nonlinear interactions characteristic of most ecosystems severely limit traditional statistical approaches to theory building, while rendering most kinds of observational data irrelevant to theory evaluation and testing. We describe an alternative tradition:
1. Cast the theory as a set of "dynamic life tables", bound together by basic ecological process modules; apply available data and field experience to the parameterization of these modules.
2. Compute the consequences of the resulting theory under a wide range of conditions: quantitatively through numerical simulation and qualitatively through the use of topological manifolds.
3. Employ the manifolds to identify key structure- (as opposed to parameter-) dependent predictions of the theory. Compare these with observation, emphasizing behaviour of the system and its theory in extreme natural or experimental situations.
Lessons for Ecological Policy Design: A Case Study of Ecosystem Management
This paper explores the prospects for combining elements of the ecological and policy sciences to form a substantive and effective science of ecological policy design. This exploration is made through a case study whose specific focus is the management problem posed by competition between man and an insect (the spruce budworm, Choristoneura fumiferana) for utilization of coniferous forests in the Canadian Province of New Brunswick.
High-resolution investigations of Transverse Aeolian Ridges on Mars
Transverse Aeolian Ridges (TARs) are the most pervasive aeolian feature on Mars. Their small size requires high-resolution data for thorough analyses. We have utilized Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) images, along with MRO Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) spectroscopic data, to study TARs in detail. TAR deposits, along with related dark dune material and layered terrains, have been mapped in six study areas in order to identify sediment pathways and determine whether TARs are sourced locally or from global wind-borne materials. TAR morphology and orientation were mapped in grids within each study area; the results show that TARs are probably locally sourced. We constructed four HiRISE Digital Terrain Models (DTMs) to measure TAR heights, widths, spacing, areas, and symmetry, and to calculate sediment volumes. Results show that TARs have average heights of ∼1.5 m, are very symmetrical, and are similar in form to terrestrial megaripples. Orthorectified HiRISE images taken 3 years apart were analyzed for TAR movement, and none was found. Superposed craters on equatorial TARs give ages of ∼2 Ma, suggesting that these are relatively ancient and generally inactive aeolian deposits. CRISM data were analyzed over TAR deposits, dark dune material, and light-toned terrains. Although the surfaces were somewhat obscured by dust cover, the results did not show any remarkable difference between TARs and other deposits. We conclude that TARs may be sourced from local materials and form in a similar way to terrestrial megaripples.
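The ridge heights and crest spacings reported above come from DTM measurements. A minimal sketch of how such metrics could be pulled from a 1-D elevation transect (the profile values, sample spacing, and function name here are all invented for illustration, not the authors' actual pipeline):

```python
# Hypothetical sketch: estimate mean ridge height and crest spacing from a
# 1-D elevation profile sampled along a transect across a ripple field.
# Assumes the profile contains at least two crests and one trough.

def crest_trough_metrics(profile, dx):
    """profile: elevations (m) sampled every dx metres along the transect."""
    # Local maxima are crests, local minima are troughs.
    crests = [i for i in range(1, len(profile) - 1)
              if profile[i] > profile[i - 1] and profile[i] > profile[i + 1]]
    troughs = [i for i in range(1, len(profile) - 1)
               if profile[i] < profile[i - 1] and profile[i] < profile[i + 1]]
    # Height: mean crest elevation minus mean trough elevation.
    height = (sum(profile[i] for i in crests) / len(crests)
              - sum(profile[i] for i in troughs) / len(troughs))
    # Spacing: mean distance between successive crests.
    spacing = dx * (crests[-1] - crests[0]) / (len(crests) - 1)
    return height, spacing

# Synthetic profile with three 1.5 m crests spaced 40 m apart (dx = 20 m).
height, spacing = crest_trough_metrics([0, 1.5, 0, 1.5, 0, 1.5, 0], 20.0)
print(height, spacing)  # 1.5 40.0
```

Real DTM transects would of course need smoothing and noise thresholds before peak picking; this only illustrates the geometry of the measurement.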
A renormalizable SO(10) GUT scenario with spontaneous CP violation
We consider fermion masses and mixings in a renormalizable SUSY SO(10) GUT with Yukawa couplings of scalar fields in the representations 10 + 120 + 126-bar. We investigate a scenario defined by the following assumptions: (i) a single large scale in the theory, the GUT scale; (ii) small neutrino masses generated by the type I seesaw mechanism with negligible type II contributions; (iii) a suitable form of spontaneous CP breaking which induces hermitian mass matrices for all fermion mass terms of the Dirac type. Our assumptions define an 18-parameter scenario for the fermion mass matrices, matching 18 experimentally known observables. Performing a numerical analysis, we find excellent fits to all observables for both the normal and the inverted neutrino mass spectrum.
Comment: 16 pages, two eps figures.
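As a reference point for assumption (ii), the standard type I seesaw relation, with type II contributions neglected as the abstract assumes, suppresses the light neutrino masses by the heavy right-handed Majorana scale:

```latex
% Type I seesaw: light neutrino mass matrix m_nu from the Dirac mass
% matrix m_D and the right-handed Majorana mass matrix M_R, whose scale
% is set by the single large scale of the scenario, M_GUT.
m_\nu \simeq -\, m_D \, M_R^{-1} \, m_D^{T}, \qquad M_R \sim M_{\mathrm{GUT}}
```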
Project Status Report: Ecology and Environment Project
We present here the extended outline and copies of the illustrations used in the Status Report of the IIASA Ecology and Environment Project, presented at Schloss Laxenburg on 21 June 1974.
Section 1, "General Review", is covered in the outline. Section 2, "A Case Study of Ecosystem Management", is the subject of a major monograph now in preparation. Section 3, on Selected Conceptual Developments, is in part documented in IIASA Research Reports RR-73-3 and RR-74-3.
Mode of presentation and mortality amongst patients hospitalized with heart failure? A report from The First Euro Heart Failure Survey
Background:
Heart failure (HF) is heterogeneous in aetiology, pathophysiology, and presentation. Despite this diversity, clinical trials of patients hospitalized for HF treat the condition as a single entity, which may be one reason for repeated failures.
Methods:
The first EuroHeart Failure Survey screened consecutive deaths and discharges of patients with suspected heart failure during 2000–2001. Patients were sorted into seven mutually exclusive hierarchical presentations: (1) with cardiac arrest/ventricular arrhythmia; (2) with acute coronary syndrome; (3) with rapid atrial fibrillation; (4) with acute breathlessness; (5) with other symptoms/signs such as peripheral oedema; (6) with stable symptoms; and (7) others in whom the contribution of HF to admission was not clear.
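The key property of the survey's seven groups is that they are hierarchical and mutually exclusive: a patient is assigned to the first presentation that applies, so cardiac arrest outranks acute coronary syndrome, which outranks rapid atrial fibrillation, and so on. A minimal sketch of that assignment rule (the field names and criteria here are invented placeholders, not the survey's actual case definitions):

```python
# Hypothetical sketch: hierarchical, mutually exclusive classification.
# A patient record is assigned to the FIRST category whose criterion
# matches, mirroring the survey's ordering of presentations 1-6;
# anything left over falls into group 7 ("contribution of HF unclear").

CATEGORIES = [
    ("cardiac_arrest_or_VA", lambda p: p.get("cardiac_arrest") or p.get("ventricular_arrhythmia")),
    ("acute_coronary_syndrome", lambda p: p.get("acs")),
    ("rapid_atrial_fibrillation", lambda p: p.get("rapid_af")),
    ("acute_breathlessness", lambda p: p.get("acute_breathlessness")),
    ("other_symptoms_signs", lambda p: p.get("peripheral_oedema")),
    ("stable_symptoms", lambda p: p.get("stable")),
]

def classify(patient):
    for name, criterion in CATEGORIES:
        if criterion(patient):
            return name
    return "unclear_contribution"  # group 7

# A patient with both ACS and rapid AF lands in the higher-ranked group.
print(classify({"acs": True, "rapid_af": True}))  # acute_coronary_syndrome
```

Because the hierarchy is fixed, every patient receives exactly one label, which is what lets the seven groups partition all 10,701 enrolled patients.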
Results:
The 10,701 patients enrolled were classified into the above seven presentations as follows: 260 (2%), 560 (5%), 799 (8%), 2479 (24%), 1040 (10%), 703 (7%), and 4691 (45%) for which index-admission mortality was 26%, 20%, 10%, 8%, 6%, 6%, and 4%, respectively. Compared to those in group 7, the hazard ratios for death during the index admission were 4.9 (p ≤ 0.001), 4.0 (p < 0.001), 2.2 (p < 0.001), 2.1 (p < 0.001), 1.4 (p < 0.04) and 1.4 (p = 0.04), respectively. These differences were no longer statistically significant by 12 weeks.
Conclusion:
There is great diversity in the presentation of heart failure, associated with very different short-term outcomes. Only a minority of hospitalizations for suspected heart failure involve acute breathlessness. This should be taken into account in the design of future clinical trials.
Unique Molecular Identifiers and Multiplexing Amplicons Maximize the Utility of Deep Sequencing To Critically Assess Population Diversity in RNA Viruses
Next generation sequencing (NGS)/deep sequencing has become an important tool in the study of viruses. The use of unique molecular identifiers (UMIs) can overcome the limitations of PCR errors and PCR-mediated recombination and reveal the true sampling depth of the viral population being sequenced in an NGS experiment. The enhanced sequence data produced by this approach represent an ideal tool for studying both high- and low-abundance drug resistance mutations and, more generally, for exploring the genetic structure of viral populations. Central to the use of the UMI/Primer ID approach is the creation of a template consensus sequence (TCS) for each genome sequenced. Here we describe a series of experiments to validate several aspects of the Multiplexed Primer ID (MPID) sequencing approach using the MiSeq platform. We have evaluated how multiplexing of cDNA synthesis and amplicons affects the sampling depth of the viral population for each individual cDNA and amplicon, to understand the trade-off between broader genome coverage and maximal sequencing depth. We have validated the reproducibility of the MPID assay in the detection of minority mutations in viral genomes. We have also examined the determinants that allow sequencing reads of PCR recombinants to contaminate the final TCS data set, and show how such contamination can be limited. Finally, we provide several examples where we have applied MPID to analyze features of minority variants, and describe limits on their detection in viral populations of HIV-1 and SARS-CoV-2, demonstrating the generalizable utility of this approach with any RNA virus.
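The core of the Primer ID idea is that all reads carrying the same UMI derive from one template molecule, so a per-position majority vote across them collapses PCR and sequencing errors into a single template consensus sequence. A minimal sketch of that step (function names, the minimum-read cutoff, and the toy reads are illustrative assumptions, not the published pipeline):

```python
from collections import Counter, defaultdict

# Hypothetical sketch of TCS construction: group reads by their UMI tag,
# then take the majority base at each position within each group.

def consensus(reads):
    """Majority base at each position across equal-length reads."""
    return "".join(Counter(bases).most_common(1)[0][0] for bases in zip(*reads))

def build_tcs(tagged_reads, min_reads=3):
    """tagged_reads: iterable of (umi, read) pairs.
    Emit a TCS only for UMIs sampled by at least min_reads reads, so a
    lone erroneous read can always be outvoted."""
    by_umi = defaultdict(list)
    for umi, read in tagged_reads:
        by_umi[umi].append(read)
    return {umi: consensus(reads)
            for umi, reads in by_umi.items() if len(reads) >= min_reads}

reads = [("AACGT", "ACGTA"), ("AACGT", "ACGTA"), ("AACGT", "ACGAA")]
print(build_tcs(reads))  # {'AACGT': 'ACGTA'} -- the single error is outvoted
```

The number of distinct UMIs surviving the cutoff, rather than the raw read count, is what reveals the true sampling depth of the population.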
A ballistic motion disrupted by quantum reflections
I study a Lindblad dynamics modeling a quantum test particle in a Dirac comb that collides with particles from a background gas. The main result is a homogenization theorem in an adiabatic limiting regime involving large initial momentum for the test particle. Over the time interval considered, the particle would exhibit essentially ballistic motion if either the singular periodic potential or the kicks from the gas were removed. However, the particle behaves diffusively when both sources of forcing are present. The conversion of the motion from ballistic to diffusive is generated by occasional quantum reflections that result when the test particle's momentum is driven through a collision near to an element of the half-spaced reciprocal lattice of the Dirac comb.
Comment: 54 pages. I rewrote the introduction and simplified some of the presentation.