
    Utilizing remote sensing of Thematic Mapper data to improve our understanding of estuarine processes and their influence on the productivity of estuarine-dependent fisheries

    The land-water interface of coastal marshes may influence the production of estuarine-dependent fisheries more than the area of these marshes does. To test this hypothesis, a spatial model was created to explore the dynamic relationship between marshland-water interface and the level of disintegration in the decaying coastal marshes of Louisiana's Barataria, Terrebonne, and Timbalier basins. When the model was calibrated with Landsat Thematic Mapper satellite imagery, a parabolic relationship was found between land-water interface and marsh disintegration. Aggregated simulation data suggest that interface in the study area will soon reach its maximum and then decline. A statistically significant positive linear relationship was found between brown shrimp catch and total interface length over the past 28 years. This relationship suggests that shrimp yields will decline when interface declines, possibly beginning about 1995.
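    The two relationships described above, a parabolic dependence of interface length on disintegration and a positive linear relation between interface and shrimp catch, can be illustrated with a minimal sketch. All numbers here are synthetic stand-ins; the study's actual data and fitted coefficients are not reproduced.

```python
import random

# Toy parabola: interface length peaks when roughly half the marsh has
# converted to water, then declines toward zero as land disappears.
d_vals = [i / 100 for i in range(101)]          # fraction of marsh lost
interface = [4 * d * (1 - d) for d in d_vals]   # maximum at d = 0.5
peak_d = d_vals[interface.index(max(interface))]

# Ordinary least-squares line: shrimp catch vs interface length, fit to
# 28 synthetic "annual" observations with a built-in positive trend.
rng = random.Random(0)
iface_obs = [100 + 2 * t + rng.gauss(0, 3) for t in range(28)]
catch = [5 * x + rng.gauss(0, 20) for x in iface_obs]

mx = sum(iface_obs) / len(iface_obs)
my = sum(catch) / len(catch)
slope = (sum((x - mx) * (y - my) for x, y in zip(iface_obs, catch))
         / sum((x - mx) ** 2 for x in iface_obs))
```

A positive fitted slope is what links the two findings: if interface length passes its parabolic maximum and declines, the regression predicts catch declines with it.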

    Utilizing remote sensing of Thematic Mapper data to improve our understanding of estuarine processes and their influence on the productivity of estuarine-dependent fisheries

    The purpose of the project is to refine and validate a probabilistic spatial computer model through the analysis of Thematic Mapper imagery. The model is designed to determine how the interface between marshland and water changes as marshland is converted to water in a disintegrating marsh. Coastal marshland in Louisiana is disintegrating at a rate of approximately 40 square miles per year, and fisheries managers need an evaluation of the potential impact of this loss on the landings of estuarine-dependent fisheries. Understanding how marshland-water interface changes as coastal marshland is lost is essential to evaluating fisheries effects, because several studies suggest that the production of estuarine-dependent fish and shellfish may be more closely related to the interface between marshland and water than to the acreage of marshland. The need to address this practical problem has provided an opportunity to apply some scientifically interesting new techniques to the analysis of satellite imagery. Progress with the development of these techniques is the subject of this report.
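    The kind of stochastic spatial model described here can be sketched as a simple cellular simulation. This is a hypothetical toy, not the project's actual model: land cells on a grid convert to water one at a time along the land-water frontier, and the interface length is tracked as the marsh disintegrates. The interface rises, peaks, and then falls back to zero.

```python
import random

def interface_length(grid):
    """Count land-water edges between 4-neighbour cells (1 = land, 0 = water)."""
    n = len(grid)
    edges = 0
    for i in range(n):
        for j in range(n):
            if j + 1 < n and grid[i][j] != grid[i][j + 1]:
                edges += 1
            if i + 1 < n and grid[i][j] != grid[i + 1][j]:
                edges += 1
    return edges

def water_adjacent(grid, i, j):
    """True if cell (i, j) has at least one water neighbour."""
    n = len(grid)
    return any(grid[i + di][j + dj] == 0
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
               if 0 <= i + di < n and 0 <= j + dj < n)

rng = random.Random(1)
n = 30
grid = [[1] * n for _ in range(n)]   # solid marsh...
grid[0][0] = 0                       # ...with a single water seed
history = [interface_length(grid)]
land = n * n - 1
while land > 0:
    # convert one random land cell on the land-water frontier to water
    frontier = [(i, j) for i in range(n) for j in range(n)
                if grid[i][j] == 1 and water_adjacent(grid, i, j)]
    i, j = rng.choice(frontier)
    grid[i][j] = 0
    land -= 1
    history.append(interface_length(grid))

peak = history.index(max(history))   # interface rises, peaks, then declines
```

The interior peak in `history` is the qualitative behaviour the abstract's parabolic relationship refers to: interface is low both when the marsh is intact and when it is gone.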

    Utilizing remote sensing of Thematic Mapper data to improve our understanding of estuarine processes and their influence on the productivity of estuarine-dependent fisheries

    Landsat Thematic Mapper (TM) data are being used to refine and validate a stochastic spatial computer model to be applied to coastal resource management problems in Louisiana. Two major aspects of the research are: (1) the measurement of the area of land (or emergent vegetation) and water and the length of the interface between land and water in TM imagery of selected coastal wetlands (sample marshes); and (2) the comparison of spatial patterns of land and water in the sample marshes of the imagery to those in marshes simulated by a computer model. In addition to activities in these two areas, the potential use of a published autocorrelation statistic is analyzed.
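    The report does not name the published autocorrelation statistic, so as an illustrative stand-in, here is Moran's I, a standard spatial autocorrelation measure, computed on a binary land/water grid with rook (4-neighbour) adjacency. Clumped land and water score near +1; a checkerboard scores near -1.

```python
def morans_i(grid):
    """Moran's I spatial autocorrelation on a square binary grid,
    rook adjacency, unit weights. Illustrative stand-in only."""
    n = len(grid)
    mean = sum(sum(row) for row in grid) / (n * n)
    dev = [[v - mean for v in row] for row in grid]
    num, w = 0.0, 0
    for i in range(n):
        for j in range(n):
            for di, dj in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < n:
                    num += dev[i][j] * dev[ii][jj]   # neighbour cross-products
                    w += 1                           # total weight
    den = sum(d * d for row in dev for d in row)
    return (n * n / w) * (num / den)

n = 8
# Clumped pattern: left half land (1), right half water (0).
clumped = [[1 if j < n // 2 else 0 for j in range(n)] for i in range(n)]
# Checkerboard: maximal land-water interleaving.
checker = [[(i + j) % 2 for j in range(n)] for i in range(n)]
```

Comparing such a statistic between imagery and simulated marshes is one way to test whether the model reproduces the observed spatial pattern, not just the land/water areas.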

    Analytic Behaviour of Competition among Three Species

    We analyse the classical model of competition between three species studied by May and Leonard (SIAM J Appl Math 29 (1975) 243-256) with the approaches of singularity analysis and symmetry analysis to identify values of the parameters for which the system is integrable. We observe some striking relations between the critical values arising from the dynamical-systems approach and those arising from the singularity and symmetry analyses. Comment: 14 pages, to appear in Journal of Nonlinear Mathematical Physics.
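    For concreteness, the May-Leonard system is the cyclic three-species Lotka-Volterra competition model. The sketch below integrates it with a plain explicit-Euler step (parameter values and step size are illustrative choices, not taken from the paper). For alpha + beta < 2 with alpha < 1, trajectories converge to the symmetric interior equilibrium x* = 1 / (1 + alpha + beta).

```python
def may_leonard_step(x, alpha, beta, dt):
    """One explicit-Euler step of the May-Leonard system:
    dx_i/dt = x_i (1 - x_i - alpha*x_{i+1} - beta*x_{i+2}), indices cyclic."""
    x1, x2, x3 = x
    d1 = x1 * (1 - x1 - alpha * x2 - beta * x3)
    d2 = x2 * (1 - x2 - alpha * x3 - beta * x1)
    d3 = x3 * (1 - x3 - alpha * x1 - beta * x2)
    return (x1 + dt * d1, x2 + dt * d2, x3 + dt * d3)

# Stable-coexistence regime: alpha + beta < 2, so the interior
# equilibrium x* = 1 / (1 + alpha + beta) attracts the trajectory.
alpha, beta = 0.5, 0.5
x = (0.2, 0.3, 0.4)
for _ in range(20_000):                  # integrate to t = 200
    x = may_leonard_step(x, alpha, beta, 0.01)
x_star = 1 / (1 + alpha + beta)          # 0.5 here
```

The famous heteroclinic cycling of the model appears in other parameter regimes (alpha + beta > 2); the integrable cases the paper identifies are special parameter values within this family.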

    Is the dynamics of open quantum systems always linear?

    We study the influence of the preparation of an open quantum system on its reduced time evolution. In contrast to the frequently considered case of an initial preparation where the total density matrix factorizes into a product of a system density matrix and a bath density matrix, the time evolution is generally no longer governed by a linear map, nor is this map affine. Put differently, the evolution is truly nonlinear and cannot be cast into the form of a linear map plus a term that is independent of the initial density matrix of the open quantum system. As a consequence, the inhomogeneity that emerges in formally exact generalized master equations is in fact a nonlinear term that vanishes for a factorizing initial state. The general results are elucidated with the example of two interacting spins prepared at thermal equilibrium with one spin subjected to an external field. The second spin represents the environment. The field allows the preparation of mixed density matrices of the first spin that can be represented as a convex combination of two limiting pure states, i.e. the preparable reduced density matrices make up a convex set. Moreover, the map from these reduced density matrices onto the corresponding density matrices of the total system is affine only for vanishing coupling between the spins. In general, the set of the accessible total density matrices is nonconvex. Comment: 19 pages, 3 figures; minor changes to improve readability; discussion of Mori's linear regime and references added.

    Better Nonlinear Models from Noisy Data: Attractors with Maximum Likelihood

    A new approach to nonlinear modelling is presented which, by incorporating the global behaviour of the model, lifts shortcomings of both least squares and total least squares parameter estimates. Although ubiquitous in practice, a least squares approach is fundamentally flawed in that it assumes independent, normally distributed (IND) forecast errors: nonlinear models will not yield IND errors even if the noise is IND. A new cost function is obtained via the maximum likelihood principle; superior results are illustrated both for small data sets and infinitely long data streams. Comment: RevTeX, 11 pages, 4 figures.
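    The paper's maximum-likelihood cost function is not reproduced here, but the premise it starts from, that one-step least squares is the naive baseline for fitting a nonlinear map to noisy data, can be sketched with the logistic map. The parameter value, noise level, and grid search below are all illustrative choices.

```python
import random

def logistic(a, x):
    return a * x * (1 - x)

def one_step_cost(a, series):
    """Ordinary least-squares cost: sum of squared one-step prediction errors."""
    return sum((series[i + 1] - logistic(a, series[i])) ** 2
               for i in range(len(series) - 1))

# Clean trajectory of the logistic map, then noisy observations of it.
rng = random.Random(2)
a_true = 3.7
x, clean = 0.3, []
for _ in range(500):
    x = logistic(a_true, x)
    clean.append(x)
noisy = [v + rng.gauss(0, 0.02) for v in clean]

# Grid search for the one-step least-squares parameter estimate.
grid = [3.5 + 0.001 * k for k in range(401)]
a_clean = min(grid, key=lambda a: one_step_cost(a, clean))
a_noisy = min(grid, key=lambda a: one_step_cost(a, noisy))
```

On the clean series the one-step cost recovers the true parameter exactly; with observation noise the estimate is perturbed, and (as the abstract argues) the residuals are no longer IND, which is the flaw the maximum-likelihood cost is designed to fix.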

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, the translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.

    Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.

    Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.

    Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and the collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

    Epidemic dynamics in finite size scale-free networks

    Many real networks present a bounded scale-free behavior with a connectivity cut-off due to physical constraints or a finite network size. We study epidemic dynamics in bounded scale-free networks with soft and hard connectivity cut-offs. The finite size effects introduced by the cut-off induce an epidemic threshold that approaches zero with increasing size. The induced epidemic threshold is very small even at a relatively small cut-off, showing that neglecting connectivity fluctuations in bounded scale-free networks leads to a strong overestimation of the epidemic threshold. We provide the expression for the infection prevalence and discuss its finite size corrections. The present work shows that the highly heterogeneous nature of scale-free networks does not allow the use of homogeneous approximations even for systems with a relatively small number of nodes. Comment: 4 pages, 2 EPS figures.
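    The standard mean-field result behind this kind of analysis is that the SIS epidemic threshold on an uncorrelated network is lambda_c = <k> / <k^2>. A minimal sketch (exponent, minimum degree, and cut-off values are illustrative, not the paper's) shows the threshold shrinking as the hard cut-off grows, since <k^2> keeps growing for a scale-free degree distribution:

```python
def sis_threshold(gamma, k_min, k_cut):
    """Mean-field SIS epidemic threshold lambda_c = <k> / <k^2> for an
    uncorrelated network with P(k) ~ k^-gamma, k_min <= k <= k_cut."""
    ks = range(k_min, k_cut + 1)
    norm = sum(k ** -gamma for k in ks)          # normalisation of P(k)
    k1 = sum(k ** (1 - gamma) for k in ks) / norm   # <k>
    k2 = sum(k ** (2 - gamma) for k in ks) / norm   # <k^2>
    return k1 / k2

# Threshold vs hard connectivity cut-off, gamma = 3 (where <k^2>
# diverges logarithmically as the cut-off is removed).
cutoffs = (10, 100, 1000, 10000)
thresholds = [sis_threshold(3.0, 3, kc) for kc in cutoffs]
```

The monotone decay of `thresholds` with the cut-off is the point of the abstract: even modest cut-offs leave a threshold far below the homogeneous-network estimate lambda_c = 1 / <k>.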