
    Evolutionary approaches to signal decomposition in an application service management system

    The increased demand for autonomous control in enterprise information systems has generated interest in efficient global search methods for multivariate datasets, in order to search for original elements in time-series patterns and build causal models of system interactions, utilization dependencies, and performance characteristics. In this context, activity-signal deconvolution is a necessary step toward effective adaptive control in Application Service Management. The paper investigates the potential of population-based metaheuristic algorithms, particularly variants of particle swarm optimization, genetic algorithms, and differential evolution, for activity-signal deconvolution when the application performance model is unknown a priori. In our approach, the Application Service Management system is treated as a black or grey box, and activity-signal deconvolution is formulated as a search problem: decomposing time-series that outline relations between action signals and the utilization and execution time of resources. Experiments are conducted using a queue-based computing system model as a test-bed under different load conditions and search configurations. Special attention was paid to high-dimensional scenarios, testing effectiveness for large-scale multivariate data analyses that must obtain a near-optimal signal decomposition solution in a short time. The experimental results reveal the benefits, qualities, and drawbacks of the various metaheuristic strategies selected for a given signal deconvolution problem, and confirm the potential of evolutionary-type search to explore the search space effectively even in high-dimensional cases. The approach and the algorithms investigated can be useful in supporting human administrators, or in enhancing the effectiveness of feature extraction schemes that feed the decision blocks of autonomous controllers.
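    The search formulation above (decomposing an observed utilization trace into contributions of individual action signals) can be illustrated with a minimal differential evolution loop, one of the metaheuristic families the paper compares. Everything below is a hedged sketch: the basis signals, the mixing-weight model, and the DE parameters are illustrative assumptions, not the paper's actual system model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: the observed trace is assumed to be a weighted
    # mixture of known per-action activity signals plus measurement noise.
    n_samples, n_actions = 200, 5
    basis = rng.random((n_actions, n_samples))      # per-action signals
    true_w = np.array([0.5, 1.2, 0.0, 2.0, 0.7])    # unknown mixing weights
    observed = true_w @ basis + 0.01 * rng.standard_normal(n_samples)

    def fitness(w):
        """Reconstruction error of a candidate weight vector."""
        return np.sum((w @ basis - observed) ** 2)

    # Minimal DE/rand/1/bin over the weight vector.
    pop_size, F, CR, generations = 30, 0.8, 0.9, 300
    pop = rng.uniform(0.0, 3.0, (pop_size, n_actions))
    costs = np.array([fitness(p) for p in pop])

    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + F * (b - c)                # differential mutation
            cross = rng.random(n_actions) < CR      # binomial crossover mask
            cross[rng.integers(n_actions)] = True   # force one mutated gene
            trial = np.where(cross, mutant, pop[i])
            if (tc := fitness(trial)) < costs[i]:   # greedy selection
                pop[i], costs[i] = trial, tc

    best = pop[np.argmin(costs)]
    print(np.round(best, 2))  # approximately true_w
    ```

    The same fitness function could be plugged into a particle swarm or genetic algorithm loop, which is essentially the comparison the paper carries out on its queue-based test-bed.
    
    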

    Experimental Comparison and Evaluation of the Affymetrix Exon and U133Plus2 GeneChip Arrays

    Affymetrix exon arrays offer scientists the only solution for exon-level expression profiling at the whole-genome scale on a single array. These arrays feature a new chip design with no mismatch probes and a radically new random-primed protocol to generate sense DNA targets along the entire length of the transcript. In addition to these changes, a limited number of validating experiments and virtually no experimental data to rigorously address the comparability of all-exon arrays with conventional 3'-arrays result in a natural reluctance to replace conventional expression arrays with the new all-exon platform. Using commercially available Affymetrix arrays, we assess the performance of the Human Exon 1.0 ST (HuEx) and U133 Plus 2.0 (U133Plus2) platforms directly through a series of 'spike-in' hybridizations containing 25 transcripts in the presence of a fixed eukaryotic background. Specifically, we compare the measures of expression for HuEx and U133Plus2 arrays to evaluate the precision of these measures as well as the specificity and sensitivity of their ability to detect differential expression. This study presents an experimental comparison and systematic cross-validation of Affymetrix exon arrays and establishes high comparability of expression changes and probe performance characteristics between Affymetrix conventional and exon arrays. In addition, this study offers a reliable benchmark data set for the comparison of competing exon expression measures, the selection of methods suitable for mapping exon array measures to the wealth of previously generated microarray data, and the development of more advanced methods for exon- and transcript-level expression summarization.
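    The spike-in design makes the evaluation logic simple: because the truly changed transcripts are known, each platform's differential-expression calls can be scored directly. A minimal sketch of that scoring, with entirely made-up truth and call vectors (the real study used 25 spike-in transcripts, but these particular values are illustrative only):

    ```python
    import numpy as np

    def sensitivity_specificity(truth, calls):
        """Score detection calls against known spike-in ground truth."""
        truth, calls = np.asarray(truth, bool), np.asarray(calls, bool)
        tp = np.sum(truth & calls)      # truly changed, called changed
        tn = np.sum(~truth & ~calls)    # unchanged, called unchanged
        return float(tp / np.sum(truth)), float(tn / np.sum(~truth))

    truth = [True] * 5 + [False] * 20             # 5 of 25 spike-ins changed
    calls = [True, True, True, True, False] + [False] * 19 + [True]
    print(sensitivity_specificity(truth, calls))  # (0.8, 0.95)
    ```

    Sweeping the call threshold over a range of fold-change cutoffs and plotting these two numbers yields the ROC-style comparison typically used to rank competing expression measures.
    
    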

    Interaction imaging with amplitude-dependence force spectroscopy

    Knowledge of surface forces is the key to understanding a large number of processes in fields ranging from physics to materials science and biology. The most common method to study surfaces is dynamic atomic force microscopy (AFM). Dynamic AFM has been enormously successful in imaging surface topography, even to atomic resolution, but the force between the AFM tip and the surface remains unknown during imaging. Here, we present a new approach that combines high-accuracy force measurements and high-resolution scanning. The method, called amplitude-dependence force spectroscopy (ADFS), is based on the amplitude dependence of the cantilever's response near resonance and allows for separate determination of both conservative and dissipative tip-surface interactions. We use ADFS to quantitatively study and map the nano-mechanical interaction between the AFM tip and heterogeneous polymer surfaces. ADFS is compatible with commercial atomic force microscopes and we anticipate its widespread use in taking AFM toward quantitative microscopy.

    On the Adaptive Partition Approach to the Detection of Multiple Change-Points

    With an adaptive partition procedure, we can partition a “time course” into consecutive non-overlapping intervals such that the population means/proportions of the observations in two adjacent intervals are significantly different at a given significance level. However, the widely used recursive combination or partition procedures do not guarantee a global optimization. We propose a modified dynamic programming algorithm to achieve a global optimization. Our method provides consistent estimation results. In a comprehensive simulation study, our method shows improved performance when compared to the recursive combination/partition procedures. In practice, the significance level can be determined based on a cross-validation procedure. As an application, we consider the well-known Pima Indian Diabetes data. We explore the relationship between diabetes risk and several important variables, including the plasma glucose concentration, body mass index and age.
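    The key point of the abstract is that greedy recursive splitting/merging can miss the globally optimal partition, while dynamic programming cannot. A minimal sketch of an exact DP segmentation (this is a generic least-squares change-point DP under assumed segment-mean structure, not the authors' modified algorithm) looks like:

    ```python
    import numpy as np

    def optimal_partition(x, k):
        """Exact DP: split series x into k consecutive segments minimizing
        the total within-segment sum of squared deviations from the mean."""
        n = len(x)
        # Prefix sums make each segment cost O(1) to evaluate.
        s = np.concatenate(([0.0], np.cumsum(x)))
        s2 = np.concatenate(([0.0], np.cumsum(np.square(x))))

        def seg_cost(i, j):  # cost of segment x[i:j]
            return s2[j] - s2[i] - (s[j] - s[i]) ** 2 / (j - i)

        cost = np.full((k + 1, n + 1), np.inf)
        cut = np.zeros((k + 1, n + 1), dtype=int)
        cost[0, 0] = 0.0
        for seg in range(1, k + 1):
            for j in range(seg, n + 1):
                for i in range(seg - 1, j):
                    c = cost[seg - 1, i] + seg_cost(i, j)
                    if c < cost[seg, j]:
                        cost[seg, j] = c
                        cut[seg, j] = i
        # Backtrack to recover the interior change-points.
        bounds, j = [], n
        for seg in range(k, 0, -1):
            bounds.append(j)
            j = cut[seg, j]
        return sorted(int(b) for b in bounds[1:])

    x = np.concatenate([np.zeros(30), 5 * np.ones(40), 2 * np.ones(30)])
    print(optimal_partition(x, 3))  # -> [30, 70]
    ```

    A greedy binary-splitting procedure applied to the same data can lock in a suboptimal first cut and never recover, which is exactly the failure mode the global DP avoids (at O(k·n²) cost here).
    
    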

    Why do dogs (Canis familiaris) select the empty container in an observational learning task?

    Many argue that dogs show a unique susceptibility to human communicative signals, which makes them suitable for engaging in complex co-operation with humans. It has also been revealed that socially provided information is particularly effective in influencing the behaviour of dogs, even when the human’s action demonstration conveys an inefficient or mistaken solution to a task. It is unclear, however, how the communicative nature of the demonstration context and the presence of the human demonstrator affect the dogs’ object-choice behaviour in observational learning situations. In order to unfold the effects of these factors, 76 adult pet dogs could observe a communicative or a non-communicative demonstration in which the human retrieved a tennis ball from under an opaque container while manipulating another, distant and obviously empty (transparent) one. Subjects were then allowed to choose either in the presence of the demonstrator or after she left the room. Results showed a significant main effect of the demonstration context (presence or absence of the human’s communicative signals), and we also found some evidence for a response-modifying effect of the presence of the human demonstrator during the dogs’ choice. That is, dogs predominantly chose the baited container, but if the demonstration context was communicative and the human was present during the dogs’ choice, the subjects’ tendency to select the baited container was reduced. In agreement with the studies showing sensitivity to human communicative signals in dogs, these findings point to a special form of social influence in observational learning situations when it comes to learning about causally opaque and less efficient (compared to what comes naturally to the dog) action demonstrations.

    How to determine life expectancy change of air pollution mortality: a time series study

    BACKGROUND: Information on life expectancy (LE) change is of great concern for policy makers, as evidenced by discussions of the "harvesting" (or "mortality displacement") issue, i.e. how large an LE loss corresponds to the mortality results of time series (TS) studies. Whereas loss of LE attributable to chronic air pollution exposure can be determined from cohort studies using life table methods, conventional TS studies have identified only deaths due to acute exposure during the immediate past (typically the preceding one to five days), and they provide no information about the LE loss per death. METHODS: We show how to obtain information on population-average LE loss by extending the observation window (largest "lag") of TS to include a sufficient number of "impact coefficients" for past exposures ("lags"). We test several methods for determining these coefficients. Once all of the coefficients have been determined, the LE change is calculated as the time integral of the relative risk change after a permanent step change in exposure. RESULTS: The method is illustrated with results for daily data of non-accidental mortality from Hong Kong for 1985-2005, regressed against PM10 and SO2 with observation windows up to 5 years. The majority of the coefficients are statistically significant. The magnitude of the SO2 coefficients is comparable to that of the PM10 coefficients. But a window of 5 years is not sufficient, so the results for LE change are only a lower bound; this bound is consistent with what is implied by other studies of long-term impacts. CONCLUSIONS: A TS analysis can determine the LE loss, but if the observation window is shorter than the relevant exposures one obtains only a lower bound.
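    The central computation described in the Methods (LE change as the time integral of the relative-risk change after a permanent step in exposure) can be sketched in a few lines. The lag coefficients, step size, and window below are invented for illustration and carry no relation to the Hong Kong estimates:

    ```python
    import numpy as np

    # Hypothetical distributed-lag coefficients beta[l]: fractional change
    # in daily mortality per unit of exposure at lag l days (made-up values).
    beta = np.array([8e-4, 5e-4, 3e-4, 2e-4, 1e-4])
    dt = 1 / 365.25        # one day, expressed in years
    delta_x = 10.0         # permanent step change in exposure

    # After a permanent step, the relative-risk change at time t is the
    # cumulative sum of the lag coefficients up to t, scaled by the step.
    rr_change = delta_x * np.cumsum(beta)

    # LE change = time integral of the relative-risk change. Truncating the
    # integral at the observation window gives only a lower bound, exactly
    # as the abstract cautions.
    le_loss_years = np.sum(rr_change) * dt
    print(round(le_loss_years * 365.25, 3))  # days of LE lost: 0.074
    ```

    Extending the window (more lags in `beta`) can only add non-negative area under `rr_change`, which is why a short window yields a lower bound rather than the full LE loss.
    
    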

    TreeDyn: towards dynamic graphics and annotations for analyses of trees

    BACKGROUND: Analyses of biomolecules for biodiversity, phylogeny or structure/function studies often use graphical tree representations. Many powerful tree editors are now available, but existing tree visualization tools make little use of meta-information related to the entities under study, such as taxonomic descriptions or gene functions, which can hardly be encoded within the tree itself (at least with the popular tree formats). Consequently, tedious manual analysis and post-processing of the tree graphics are required if one needs to use external information for displaying or investigating trees. RESULTS: We have developed TreeDyn, a tool using annotations and dynamic graphical methods for editing and analyzing multiple trees. The main features of TreeDyn are 1) the management of multiple windows and multiple trees per window, 2) the export of graphics to several standard file formats, with or without HTML encapsulation, and a new format called TGF, which enables saving and restoring graphical analyses, 3) the projection of texts or symbols facing leaf labels or linked to nodes, through manual pasting or the use of annotation files, 4) the highlighting of graphical elements after querying leaf labels (or annotations) or by selection of graphical elements and information extraction, 5) the highlighting of targeted trees according to a source tree browsed by the user, 6) powerful scripts for automating repetitive graphical tasks, 7) a command-line interpreter enabling the use of TreeDyn through CGI scripts for online building of trees, and 8) a library of packages dedicated to specific research fields involving trees. CONCLUSION: TreeDyn is a tree visualization and annotation tool that uses meta-information, through dynamic graphical operators or scripting, to help analyze and annotate single trees or tree collections.

    rnaSeqMap: a Bioconductor package for RNA sequencing data exploration

    BACKGROUND: The throughput of commercially available sequencers has recently increased significantly. It has reached the point where measuring RNA expression by the depth of coverage has become feasible even for the largest genomes. The development of software tools constantly follows the progress of the biological hardware. In particular, RNA sequencing software can be regarded as comprising genome browsers, exon junction tools and statistical tools operating on counts of reads in predefined regions. The library rnaSeqMap, freely available via Bioconductor, is RNA sequencing software that is independent of any biological hardware platform. It is based upon the standard Bioconductor infrastructure for sequencing data and includes several novel features focused on deeper understanding of coverage expression profiles and discovery of novel transcription regions. RESULTS: rnaSeqMap is a toolbox for analyses that may be performed with the use of gene annotations or, alternatively, in an unsupervised mode on any genomic region to find novel or non-standard transcripts. The data back-end may be a MySQL database or a set of files in standard BAM format. The processing in R can be run on a machine without any particular hardware requirements, and scales linearly with the number of genomic loci and the number of samples analyzed. The main features of rnaSeqMap include coverage operations, discovering irreducible regions of high expression, significance search and splicing analyses with nucleotide granularity. CONCLUSIONS: This software may be used for a range of applications related to RNA sequencing by building customized analysis pipelines. Its applicability and precision are expected to increase in parallel with the progress of genome coverage in sequencers.
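    The idea of scanning per-nucleotide coverage for contiguous regions of high expression can be illustrated with a short sketch. This is a generic run-detection routine, not the rnaSeqMap API (the package itself is an R/Bioconductor library); the coverage values and threshold are invented:

    ```python
    import numpy as np

    def high_coverage_regions(coverage, threshold):
        """Return (start, end) half-open intervals where per-nucleotide
        coverage stays at or above the threshold."""
        above = np.asarray(coverage) >= threshold
        # Differencing a 0-padded indicator marks run boundaries:
        # +1 where a region starts, -1 one past where it ends.
        edges = np.diff(np.concatenate(([0], above.astype(int), [0])))
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)
        return [(int(s), int(e)) for s, e in zip(starts, ends)]

    cov = [0, 0, 4, 7, 9, 8, 1, 0, 6, 6, 2]
    print(high_coverage_regions(cov, 5))  # -> [(3, 6), (8, 10)]
    ```

    Because the scan is a single vectorized pass over the coverage track, its cost grows linearly with the number of nucleotides inspected, consistent with the linear scaling claimed for the package's region operations.
    
    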

    Patient dose reduction during voiding cystourethrography

    Voiding cystourethrography (VCUG) is a commonly performed examination in a pediatric uroradiology practice. This article contains suggestions on how the radiation dose to a child from VCUG can be made ‘as low as reasonably achievable’ (ALARA). The pediatric radiologist should consider the appropriateness of the clinical indication before performing VCUG and utilize radiation exposure techniques and parameters during VCUG that reduce radiation exposure to a child. The medical physicist and the fluoroscope manufacturer can also work together to optimize a pulsed-fluoroscopy unit and further reduce the radiation exposure. Laboratory and clinical research is necessary to investigate methods that reduce radiation exposure during VCUG, and current research is presented here.