12,337 research outputs found

    The Role of Landscape Connectivity in Planning and Implementing Conservation and Restoration Priorities. Issues in Ecology

    Landscape connectivity, the extent to which a landscape facilitates the movements of organisms and their genes, faces critical threats from both fragmentation and habitat loss. Many conservation efforts focus on protecting and enhancing connectivity to offset the impacts of habitat loss and fragmentation on biodiversity conservation, and to increase the resilience of reserve networks to potential threats associated with climate change. Loss of connectivity can reduce the size and quality of available habitat, impede and disrupt movement (including dispersal) to new habitats, and affect seasonal migration patterns. These changes can lead, in turn, to detrimental effects for populations and species, including decreased carrying capacity, population declines, loss of genetic variation, and ultimately species extinction. Measuring and mapping connectivity is facilitated by a growing number of quantitative approaches that can integrate large amounts of information about organisms’ life histories, habitat quality, and other features essential to evaluating connectivity for a given population or species. However, identifying effective approaches for maintaining and restoring connectivity poses several challenges, and our understanding of how connectivity should be designed to mitigate the impacts of climate change is, as yet, in its infancy. 
Scientists and managers must confront and overcome several challenges inherent in evaluating and planning for connectivity, including:
• characterizing the biology of focal species;
• understanding the strengths and the limitations of the models used to evaluate connectivity;
• considering spatial and temporal extent in connectivity planning;
• using caution in extrapolating results outside of observed conditions;
• considering non-linear relationships that can complicate assumed or expected ecological responses;
• accounting and planning for anthropogenic change in the landscape;
• using well-defined goals and objectives to drive the selection of methods used for evaluating and planning for connectivity;
• communicating to the general public, in clear and meaningful language, the importance of connectivity, so as to improve awareness and strengthen policies for ensuring conservation.
Several aspects of connectivity science deserve additional attention in order to improve the effectiveness of design and implementation. Research on species persistence, behavioral ecology, and community structure is needed to reduce the uncertainty associated with connectivity models. Evaluating and testing connectivity responses to climate change will be critical to achieving conservation goals in the face of the rapid changes that will confront many communities and ecosystems. All of these potential areas of advancement will fall short of conservation goals if we do not effectively incorporate human activities into connectivity planning. While this Issue identifies substantial uncertainties in mapping connectivity and evaluating resilience to climate change, it is also clear that integrating human and natural landscape conservation planning to enhance habitat connectivity is essential for biodiversity conservation.
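One common family of quantitative connectivity approaches treats habitat patches as nodes of a graph and asks which patches remain mutually reachable for a dispersing organism. The following is a minimal, hypothetical sketch of that idea only; the patch coordinates, distance units, and dispersal threshold are invented for illustration and are not taken from the Issue:

```python
import math

def connectivity_components(patches, max_dispersal):
    """Group habitat patches into connected clusters.

    patches: list of (x, y) patch centroids (hypothetical units, e.g. km).
    max_dispersal: distance below which two patches are assumed linked
    (a stand-in for a species-specific dispersal ability).
    Returns a list of clusters, each a set of patch indices.
    """
    n = len(patches)
    # Build adjacency: an edge means organisms can plausibly move between patches.
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(patches[i], patches[j]) <= max_dispersal:
                adj[i].append(j)
                adj[j].append(i)
    # Depth-first search extracts the connected components (clusters).
    seen, clusters = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v])
        seen |= comp
        clusters.append(comp)
    return clusters
```

Fragmentation then shows up directly as an increase in the number of clusters: habitat that was one reachable network splits into isolated pieces.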

    Hunter-gatherers in a howling wilderness: Neoliberal capitalism as a language that speaks itself

    The 'self-referential' character of evolutionary process noted by Goldenfeld and Woese (2010) can be restated in the context of a generalized Darwinian theory applied to economic process through a 'language' model: the underlying inherited and learned culture of the firm, the short-time cognitive response of the firm to patterns of threat and opportunity that is sculpted by that culture, and the embedding socioeconomic environment are represented as interacting information sources constrained by the asymptotic limit theorems of information theory. If unregulated, the larger, compound source that characterizes high-probability evolutionary paths of this composite then becomes, literally, a self-dynamic language that speaks itself. Such a structure is, for those enmeshed in it, more akin to a primitive hunter-gatherer society at the mercy of internal ecological dynamics than to, say, a neolithic agricultural community in which a highly ordered, deliberately adapted ecosystem is consciously farmed so as to match its productivity to human needs.
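The asymptotic limit theorems invoked here constrain a stationary information source through its entropy rate. As a toy illustration only (the two-state transition matrix is invented, not a model of any firm or economy), the entropy rate of an ergodic Markov source is H = -Σ_i π_i Σ_j P_ij log₂ P_ij, with π the stationary distribution:

```python
import math

def stationary_distribution(P, iters=1000):
    """Power-iterate a row-stochastic matrix toward its stationary
    distribution (assumes an ergodic chain; a sketch, not a general solver)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """Entropy rate (bits per symbol) of a stationary Markov information source."""
    pi = stationary_distribution(P)
    return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                for i in range(len(P)) for j in range(len(P)) if P[i][j] > 0)
```

A maximally unpredictable two-state source (all transitions equally likely) attains the upper bound of one bit per symbol; more structured sources fall below it.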

    Robustness - a challenge also for the 21st century: A review of robustness phenomena in technical, biological and social systems as well as robust approaches in engineering, computer science, operations research and decision aiding

    Notions of robustness come in many facets. They originate in different disciplines and reflect different worldviews, and consequently they often contradict each other, which makes the term hard to apply in a general context. Robustness approaches are often limited to the specific problems for which they were developed, so notions and definitions may prove wrong when transferred to another domain of validity, i.e. another context. A definition can be correct in a specific context and yet fail to hold in another. Therefore, in order to speak of robustness we must specify the domain of validity: the system, property and uncertainty of interest. As proved by Ho et al. in an optimization context with finite and discrete domains, without prior knowledge about the problem there exists no solution whatsoever that is more robust than any other. As with the No Free Lunch Theorems for Optimization (NFLTs), we have to exploit the problem structure in order to make a solution more robust. This optimization problem is directly linked to a robustness/fragility tradeoff that has been observed in many contexts, e.g. the 'robust, yet fragile' property of HOT (Highly Optimized Tolerance) systems. A further issue is that robustness is tightly bound to other phenomena, such as complexity, which themselves lack a clear definition or theoretical framework. Consequently, this review tries to identify common aspects across many different approaches and phenomena rather than to build a general theory of robustness, which may not exist anyway, because complex phenomena often need to be described from a pluralistic view in order to address as many aspects as possible. First, robustness problems from many different disciplines are reviewed. Second, common aspects of these are discussed, in particular the relationship between functional and structural properties.
This paper argues that robustness phenomena are also a challenge for the 21st century. Robustness is a useful quality of a model or system, understood as the 'maintenance of some desired system characteristics despite fluctuations in the behaviour of its component parts or its environment' (see Carlson and Doyle, 2002, p. 2). We define robustness phenomena as solutions with balanced tradeoffs, and robust design principles and robustness measures as means to balance tradeoffs.
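The robustness/fragility tradeoff can be made concrete with a toy decision problem (all options and payoff values below are hypothetical): a nominally optimal choice maximizes mean payoff across scenarios, while a robust maximin choice guards the worst case at the cost of average performance:

```python
def choose(options, payoff, rule):
    """Select an option under scenario uncertainty.

    payoff: dict option -> list of payoffs, one per scenario (hypothetical).
    rule: 'mean' for nominal optimality, 'maximin' for robust worst-case choice.
    """
    if rule == "mean":
        score = lambda o: sum(payoff[o]) / len(payoff[o])
    elif rule == "maximin":
        score = lambda o: min(payoff[o])  # guard the worst case
    else:
        raise ValueError(rule)
    return max(options, key=score)

# A 'robust, yet fragile' flavoured example: option A wins on average but
# collapses in one scenario; option B sacrifices mean payoff for robustness.
payoff = {"A": [10, 10, 1], "B": [6, 6, 6]}
```

Neither rule dominates in general; which choice is "more robust" depends on the specified uncertainty, echoing the NFLT-style result above.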

    Exploiting Large Neuroimaging Datasets to Create Connectome-Constrained Approaches for more Robust, Efficient, and Adaptable Artificial Intelligence

    Despite the progress in deep learning networks, efficient learning at the edge (enabling adaptable, low-complexity machine learning solutions) remains a critical need for defense and commercial applications. We envision a pipeline to utilize large neuroimaging datasets, including maps of the brain which capture neuron and synapse connectivity, to improve machine learning approaches. We have pursued different approaches within this pipeline structure. First, as a demonstration of data-driven discovery, the team has developed a technique for discovery of repeated subcircuits, or motifs. These were incorporated into a neural architecture search approach to evolve network architectures. Second, we have conducted analysis of the heading direction circuit in the fruit fly, which performs fusion of visual and angular velocity features, to explore augmenting existing computational models with new insight. Our team discovered a novel pattern of connectivity, implemented a new model, and demonstrated sensor fusion on a robotic platform. Third, the team analyzed circuitry for memory formation in the fruit fly connectome, enabling the design of a novel generative replay approach. Finally, the team has begun analysis of connectivity in mammalian cortex to explore potential improvements to transformer networks. These constraints increased network robustness on the most challenging examples in the CIFAR-10-C computer vision robustness benchmark task, while reducing learnable attention parameters by over an order of magnitude. Taken together, these results demonstrate multiple potential approaches to utilize insight from neural systems for developing robust and efficient machine learning techniques. Comment: 11 pages, 4 figures
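The motif-discovery step mentioned above searches a connectivity graph for repeated subcircuits. As a minimal sketch, not the team's actual technique, the following counts one classic three-node motif, the feed-forward loop (a→b, b→c, a→c with no closing edges), in a small directed graph whose edges are invented for illustration:

```python
from itertools import permutations

def count_feedforward_loops(edges):
    """Count feed-forward loops in a directed graph.

    edges: iterable of (u, v) directed edges. The feed-forward loop is a
    motif reported in both gene-regulatory and neural circuit analyses.
    """
    eset = set(edges)
    nodes = {u for e in edges for u in e}
    count = 0
    for a, b, c in permutations(nodes, 3):
        # Require the three feed-forward edges...
        if (a, b) in eset and (b, c) in eset and (a, c) in eset:
            # ...and exclude reverse edges that would make the triad cyclic.
            if (c, a) not in eset and (b, a) not in eset and (c, b) not in eset:
                count += 1
    return count
```

Motif counts like this, compared against randomized graphs, are one way repeated subcircuits are flagged as statistically over-represented before feeding them into an architecture search.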

    Without magic bullets: the biological basis for public health interventions against protein folding disorders

    Protein folding disorders of aging like Alzheimer's and Parkinson's diseases currently present intractable medical challenges. 'Small molecule' interventions - drug treatments - often have, at best, palliative impact, failing to alter disease course. The design of individual or population level interventions will likely require a deeper understanding of protein folding and its regulation than currently provided by contemporary 'physics' or culture-bound medical magic bullet models. Here, a topological rate distortion analysis is applied to the problem of protein folding and regulation that is similar in spirit to Tlusty's (2010a) elegant exploration of the genetic code. The formalism produces large-scale, quasi-equilibrium 'resilience' states representing normal and pathological protein folding regulation under a cellular-level cognitive paradigm similar to that proposed by Atlan and Cohen (1998) for the immune system. Generalization to long times produces diffusion models of protein folding disorders in which epigenetic or life history factors determine the rate of onset of regulatory failure, in essence, a premature aging driven by familiar synergisms between disjunctions of resource allocation and need in the context of socially or physiologically toxic exposures and chronic powerlessness at individual and group scales. Application of an HPA axis model is made to recent observed differences in Alzheimer's onset rates in White and African American subpopulations as a function of an index of distress-proneness.
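The rate distortion analysis referred to above builds on the classical rate-distortion function, which can be computed numerically by the Blahut-Arimoto iteration. The sketch below is a standard textbook illustration for a binary source with Hamming distortion, not the topological analysis of the paper:

```python
import math

def blahut_arimoto(p, d, beta, iters=200):
    """Blahut-Arimoto iteration for a point on the rate-distortion curve.

    p: source distribution over symbols x.
    d: distortion matrix d[x][y].
    beta: Lagrange multiplier trading rate against distortion.
    Returns (rate_in_bits, expected_distortion).
    """
    nx, ny = len(p), len(d[0])
    q = [1.0 / ny] * ny  # output marginal, start uniform
    for _ in range(iters):
        # Conditional q(y|x) minimizing rate for the current output marginal.
        cond = []
        for x in range(nx):
            w = [q[y] * math.exp(-beta * d[x][y]) for y in range(ny)]
            z = sum(w)
            cond.append([wy / z for wy in w])
        # Update the output marginal from the new conditional.
        q = [sum(p[x] * cond[x][y] for x in range(nx)) for y in range(ny)]
    rate = sum(p[x] * cond[x][y] * math.log2(cond[x][y] / q[y])
               for x in range(nx) for y in range(ny) if cond[x][y] > 0)
    dist = sum(p[x] * cond[x][y] * d[x][y] for x in range(nx) for y in range(ny))
    return rate, dist
```

For a uniform binary source under Hamming distortion the iteration recovers the known closed form R(D) = 1 - H(D), which is a useful sanity check on the implementation.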

    Many-Task Computing and Blue Waters

    This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters system, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including understanding aspects of MTC applications that can be used to characterize the domain and understanding the implications of these aspects to middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, by definition MTC applications are structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications. In particular, different engineering constraints for hardware and software must be met in order to support these applications. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication intensive or data intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited for MTC applications. However, HPC systems often lack a dynamic resource-provisioning feature, are not ideal for task communication via the file system, and have an I/O system that is not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
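The defining structure of an MTC application described above, a graph of discrete tasks whose explicit input and output dependencies form the edges, can be sketched as a minimal sequential executor (task names are invented; a real MTC system would dispatch ready tasks in parallel and worry about dispatch overhead, which this sketch ignores):

```python
from collections import deque

def run_task_graph(tasks, deps):
    """Execute a many-task graph in dependency order (sequential sketch).

    tasks: dict name -> zero-argument callable producing that task's result.
    deps: dict name -> list of prerequisite task names (the graph edges).
    Returns dict name -> result. Raises ValueError on dependency cycles.
    """
    indeg = {t: len(deps.get(t, [])) for t in tasks}
    children = {t: [] for t in tasks}
    for t, prereqs in deps.items():
        for p in prereqs:
            children[p].append(t)
    ready = deque(t for t, k in indeg.items() if k == 0)
    results = {}
    while ready:
        t = ready.popleft()
        results[t] = tasks[t]()  # dispatch point; parallel in a real system
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    if len(results) != len(tasks):
        raise ValueError("cycle in task graph")
    return results
```

The `ready` queue is where the report's engineering concerns bite: with very short tasks, the cost of each dispatch and of moving task outputs through the I/O system dominates unless the middleware is built for it.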

    Invariant Distribution of Promoter Activities in Escherichia coli

    Cells need to allocate their limited resources to express a wide range of genes. To understand how Escherichia coli partitions its transcriptional resources between its different promoters, we employ a robotic assay using a comprehensive reporter strain library for E. coli to measure promoter activity on a genomic scale at high-temporal resolution and accuracy. This allows continuous tracking of promoter activity as cells change their growth rate from exponential to stationary phase in different media. We find a heavy-tailed distribution of promoter activities, with promoter activities spanning several orders of magnitude. While the shape of the distribution is almost completely independent of the growth conditions, the identity of the promoters expressed at different levels does depend on them. Translation machinery genes, however, keep the same relative expression levels in the distribution across conditions, and their fractional promoter activity tracks growth rate tightly. We present a simple optimization model for resource allocation which suggests that the observed invariant distributions might maximize growth rate. These invariant features of the distribution of promoter activities may suggest design constraints that shape the allocation of transcriptional resources.
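The invariance described above, a stable distribution shape with changing promoter identities, can be mimicked with a toy simulation. All of this is invented for illustration: the log-normal choice, the sample size, and the reshuffling stand in for the paper's heavy-tailed empirical distribution and condition-dependent promoter identities:

```python
import random

def fractional_activities(raw):
    """Normalize raw promoter activities to fractions of the total budget."""
    total = sum(raw)
    return [a / total for a in raw]

random.seed(0)
n = 500
# Hypothetical heavy-tailed (log-normal) promoter activities in one condition;
# the second condition reshuffles which promoter gets which activity level.
cond_a = [random.lognormvariate(0, 2) for _ in range(n)]
cond_b = cond_a[:]
random.shuffle(cond_b)

frac_a = sorted(fractional_activities(cond_a), reverse=True)
frac_b = sorted(fractional_activities(cond_b), reverse=True)
# Rank-ordered (shape) distributions coincide although promoter identities
# differ between the two conditions, and activities span orders of magnitude.
assert frac_a == frac_b
assert max(frac_a) / min(frac_a) > 1e3
```

The toy separates the two observables the abstract distinguishes: the sorted fractional activities capture the distribution's shape, while the per-promoter assignment carries the condition-dependent identities.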