3,884 research outputs found

    The number and probability of canalizing functions

    Canalizing functions have important applications in physics and biology. For example, they represent a mechanism capable of stabilizing chaotic behavior in Boolean network models of discrete dynamical systems. When comparing the class of canalizing functions to other classes of functions with respect to their evolutionary plausibility as emergent control rules in genetic regulatory systems, it is informative to know the number of canalizing functions with a given number of input variables. This is also important in the context of using the class of canalizing functions as a constraint during the inference of genetic networks from gene expression data. To this end, we derive an exact formula for the number of canalizing Boolean functions of n variables. We also derive a formula for the probability that a random Boolean function is canalizing for any given bias p of taking the value 1. In addition, we consider the number and probability of Boolean functions that are canalizing for exactly k variables. Finally, we provide an algorithm for randomly generating canalizing functions with a given bias p and any number of variables, which is needed for Monte Carlo simulations of Boolean networks.
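The defining property (some input variable has a value that fixes the output regardless of the other inputs) can be checked by brute force over the truth table. A minimal sketch; the function name and representation are illustrative, not from the paper:

```python
from itertools import product

def is_canalizing(f, n):
    """Check whether a Boolean function f of n variables is canalizing.

    f maps an n-tuple of 0/1 inputs to 0 or 1. A function is canalizing
    if some input i has a value a that forces the output to a constant
    regardless of the remaining inputs.
    """
    for i in range(n):
        for a in (0, 1):
            outputs = {f(x) for x in product((0, 1), repeat=n) if x[i] == a}
            if len(outputs) == 1:  # output is constant when x_i == a
                return True
    return False

# AND is canalizing (x_0 = 0 forces the output to 0); XOR is not.
```

This exhaustive check costs O(n * 2^n) evaluations, which is fine for the small n typical of Boolean network models.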

    Techniques for clustering gene expression data

    Many clustering techniques have been proposed for the analysis of gene expression data obtained from microarray experiments. However, choosing a suitable method for a given experimental dataset is not straightforward. Common approaches do not translate well across datasets and fail to take account of the data profile. This review surveys state-of-the-art applications that recognise these limitations and implement procedures to overcome them. It provides a framework for the evaluation of clustering in gene expression analyses. The nature of microarray data is discussed briefly, and selected examples are presented for the clustering methods considered.
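As a concrete baseline for the methods surveyed, a minimal k-means clustering of expression profiles can be sketched as follows (the toy data and parameters are illustrative; real analyses need normalization, a distance suited to the data profile, and validation of the number of clusters):

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means over expression profiles (lists of floats).

    Returns one cluster label per profile. Illustrative only; not a
    substitute for the evaluation framework the review describes.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each profile to the nearest centre (Euclidean distance).
        labels = [min(range(k), key=lambda j: math.dist(p, centers[j]))
                  for p in points]
        # Recompute each centre as the mean of its assigned profiles.
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centers[j] = [sum(c) / len(members) for c in zip(*members)]
    return labels

profiles = [[0.1, 0.2], [0.0, 0.1], [5.0, 5.1], [5.2, 4.9]]
labels = kmeans(profiles, 2)
```

On this toy input the first two profiles end up in one cluster and the last two in the other, whichever labels the random initialization assigns.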

    Forest fire severity mapping by combining a spectral mixture model and object-based classification

    This study presents an accurate and fast methodology for evaluating fire severity classes in large forest fires. A single Landsat Enhanced Thematic Mapper multispectral image was used to map three fire severity classes (high, moderate and low) with a combined approach based on a spectral mixture model and object-based image analysis. A large wildfire in the northwest of Spain was used to test the model. Fraction images obtained by Landsat unmixing were used as input data in the object-based image analysis, in which a multilevel segmentation and a subsequent classification were carried out using membership functions. This method was compared with simpler ones to evaluate its suitability for distinguishing between the three fire severity classes mentioned above. McNemar's test was used to evaluate the statistical significance of the differences between the approaches tested in this study. The combined approach achieved the highest accuracy, reaching 97.32% overall accuracy and a kappa index of agreement of 95.96%, and improved the accuracy of the individual classes.
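McNemar's test, used above to compare classification approaches, reduces to a simple statistic on the discordant pairs. A minimal sketch with the standard continuity correction (the example counts are hypothetical, not from the study):

```python
def mcnemar(b, c):
    """McNemar's chi-squared statistic with continuity correction.

    b: cases the first classifier got right and the second got wrong;
    c: the reverse. Concordant cases do not enter the statistic.
    Values above 3.84 indicate a significant difference between the
    two classifiers at the 0.05 level (1 degree of freedom).
    """
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# Hypothetical example: 40 disagreements favour one method, 10 the other.
stat = mcnemar(40, 10)  # 16.82, well above the 3.84 threshold
```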

    How Sample Completeness Affects Gamma-Ray Burst Classification

    Unsupervised pattern recognition algorithms support the existence of three gamma-ray burst classes: Class I (long, large-fluence bursts of intermediate spectral hardness), Class II (short, small-fluence, hard bursts), and Class III (soft bursts of intermediate durations and fluences). The algorithms surprisingly assign larger membership to Class III than to either of the other two classes. A known systematic bias has previously been used to explain the existence of Class III in terms of Class I; this bias allows the fluences and durations of some bursts to be underestimated (Hakkila et al., ApJ 538, 165, 2000). We show that this bias primarily affects only the longest bursts and cannot explain the bulk of the Class III properties. We resolve the question of Class III existence by demonstrating how samples obtained using standard trigger mechanisms fail to preserve the duration characteristics of small peak flux bursts. Sample incompleteness is thus primarily responsible for the existence of Class III. To avoid this incompleteness, we show how a new dual-timescale peak flux can be defined in terms of peak flux and fluence. The dual-timescale peak flux preserves the duration distribution of faint bursts and correlates better with spectral hardness (and presumably redshift) than either peak flux or fluence. The techniques presented here are generic and applicable to studies of other transient events. The results also indicate that pattern recognition algorithms are sensitive to sample completeness; this can influence the study of large astronomical databases such as those found in a Virtual Observatory. Comment: 29 pages, 6 figures, 3 tables; accepted for publication in The Astrophysical Journal.

    Techniques for automatic large scale change analysis of temporal multispectral imagery

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
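The pixel-level "magnitude of change" map that the dissertation treats as an insufficient baseline can be sketched directly; the image representation and names here are illustrative, not the dissertation's implementation:

```python
import math

def change_magnitude(img_a, img_b):
    """Per-pixel spectral change magnitude between two co-registered images.

    Each image is a rows x cols grid of band-value tuples. This is the
    simple magnitude-of-change map that becomes unusable on its own at
    high resolution, shown here only as a baseline.
    """
    return [[math.dist(pa, pb) for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

# Toy two-band scene: only the second pixel changes.
before = [[(10, 20), (10, 20)]]
after_img = [[(10, 20), (13, 24)]]
mag = change_magnitude(before, after_img)
```

Everything beyond this baseline (the metrics, segmentation into regions, and typing of change) is where the dissertation's contribution lies.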

    A maritime decision support system to assess risk in the presence of environmental uncertainties: the REP10 experiment

    The aim of this work is to report on an activity carried out during the 2010 Recognized Environmental Picture experiment, held in the Ligurian Sea during summer 2010. The activity was the first at-sea test of the recently developed decision support system (DSS) for operation planning, which had previously been tested in an artificial experiment. The DSS assesses the impact of both environmental conditions (meteorological and oceanographic) and non-environmental conditions (such as traffic density maps) on people and assets involved in the operation and helps in deciding a course of action that allows safer operation. More precisely, the environmental variables (such as wind speed, current speed and significant wave height) taken as input by the DSS are the ones forecasted by a super-ensemble model, which fuses the forecasts provided by multiple forecasting centres. The uncertainties associated with the DSS's inputs (generally due to disagreement between forecasts) are propagated through to the DSS's output using the unscented transform. In this way, the system is not only able to provide a traffic light map (run/not run the operation), but also to specify the confidence level associated with each action. This feature was tested on a particular type of operation with underwater gliders: the glider surfacing for data transmission. It is also shown how the availability of a glider path prediction tool provides surfacing options along the predicted path. The applicability to different operations is demonstrated by applying the same system to support diver operations.
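The unscented transform mentioned above propagates a distribution through a nonlinear function by evaluating it at a small set of sigma points. A minimal one-dimensional sketch (the risk function and parameter values are hypothetical, not the operational DSS):

```python
import math

def unscented_transform_1d(mean, var, f, kappa=2.0):
    """Propagate a 1-D Gaussian (mean, var) through a nonlinear f.

    Uses the three standard sigma points for n = 1: the mean plus two
    points spread by sqrt((n + kappa) * var). Returns the transformed
    mean and variance. Illustrative sketch of pushing forecast
    uncertainty through a risk function.
    """
    n = 1
    spread = math.sqrt((n + kappa) * var)
    points = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa),
               1.0 / (2 * (n + kappa)),
               1.0 / (2 * (n + kappa))]
    ys = [f(p) for p in points]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# Sanity check: a linear map reproduces the exact transformed moments.
m, v = unscented_transform_1d(2.0, 0.25, lambda x: 3 * x + 1)
```

For a linear function the transform is exact (here mean 7.0, variance 2.25); its value lies in handling nonlinear risk functions without derivatives.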

    Dynamics of phytoplankton community composition in the western Gulf of Maine

    This dissertation is founded on the importance of phytoplankton community composition to marine biogeochemistry and ecosystem processes and motivated by the need to understand their distributions on regional to global scales. The ultimate goal was to predict surface phytoplankton communities using satellite remote sensing by relating marine habitats--defined through a statistical description of environmental properties--to different phytoplankton communities. While phytoplankton community composition is governed by the interplay of abiotic and biotic interactions, the strategy adopted here was to focus on the physical abiotic factors. This allowed for the detection of habitats from ocean satellites based on abiotic factors that were linked to associated phytoplankton communities. The research entailed three studies that addressed different aspects of the main goal using a dataset collected in the western Gulf of Maine over a 3-year period. The first study evaluated a chemotaxonomic method that quantified phytoplankton composition from pigment data. This enabled the characterization of three phytoplankton communities, which were defined by the relative abundance of diatoms and flagellates. The second study examined the cycles of these communities along with environmental variables, and the results revealed that the three phytoplankton communities exhibited an affinity to different hydrographic regimes. The third study focused on the implementation of a classifier that predicted phytoplankton communities from environmental variables. Its ability to differentiate communities dominated by diatoms versus flagellates was shown to be high. However, the increase in data imprecision when using satellite data led to lowered performance and favored an approach that incorporated fuzzy logic. The fuzzy method is well suited to characterize the uncertainties in phytoplankton community prediction, and provides a measure of confidence on predicted communities. 
The final product of the overall dissertation was a time series of maps generated from satellite observations depicting the likelihood of the three phytoplankton communities. This dissertation achieved its main goal and, moreover, demonstrated that improvements in the predictive power of the method can be achieved with increased precision and more advanced satellite-derived products. The results of this research can benefit present bio-optical and primary productivity models, as well as ecosystem-based models of the marine environment.
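A fuzzy classification of the kind described can be sketched as normalized inverse distances to community prototypes, so that memberships sum to one and express confidence rather than a hard label (the prototype values and the scheme itself are hypothetical, not the dissertation's classifier):

```python
def fuzzy_memberships(value, prototypes, eps=1e-9):
    """Fuzzy membership of an environmental observation in each community.

    prototypes: {community_name: prototype_value} for one environmental
    variable (e.g. temperature). Memberships are normalized inverse
    distances to each prototype; eps avoids division by zero when the
    observation coincides with a prototype.
    """
    inv = {k: 1.0 / (abs(value - p) + eps) for k, p in prototypes.items()}
    total = sum(inv.values())
    return {k: v / total for k, v in inv.items()}

# Hypothetical prototypes: an 8.0 observation sits near the first one.
m = fuzzy_memberships(8.0, {"diatoms": 7.0, "flagellates": 15.0})
```

A hard classifier would report only the winning label; the fuzzy memberships additionally convey how confident that call is, which is the property the dissertation exploits when satellite inputs are imprecise.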