
    WaveCNV: allele-specific copy number alterations in primary tumors and xenograft models from next-generation sequencing.

    Motivation: Copy number variations (CNVs) are a major source of genomic variability and are especially significant in cancer. Until recently, microarray technologies have been used to characterize CNVs in genomes. However, advances in next-generation sequencing technology offer significant opportunities to deduce copy number directly from genome sequencing data. Unfortunately, cancer genomes differ from normal genomes in several respects that make them far less amenable to copy number detection. For example, cancer genomes are often aneuploid and an admixture of diploid/non-tumor cell fractions. In addition, patient-derived xenograft models can be laden with mouse contamination that strongly affects accurate assignment of copy number. Hence, there is a need for analytical tools that take cancer-specific parameters into account when detecting CNVs directly from genome sequencing data. Results: We have developed WaveCNV, a software package that identifies copy number alterations by detecting CNV breakpoints using translation-invariant discrete wavelet transforms and assigns digitized copy numbers to each event using next-generation sequencing data. We also assign alleles, specifying the chromosomal ratio following duplication/loss. We verified the copy number calls against both microarray (correlation coefficient 0.97) and quantitative polymerase chain reaction (correlation coefficient 0.94) data and found them to be highly concordant. We demonstrate the method's utility on pancreatic primary tumor and xenograft sequencing data. Availability and implementation: Source code and executables are available at https://github.com/WaveCNV. The segmentation algorithm is implemented in MATLAB, and copy number assignment is implemented in Perl. Contact: [email protected]. Supplementary information: Supplementary data are available at Bioinformatics online.
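    The core steps the abstract describes, breakpoint detection on a binned read-depth signal with a translation-invariant wavelet transform followed by digitized copy-number assignment per segment, can be sketched as follows. This is not the WaveCNV implementation distributed at the repository above; it is a minimal Python illustration using pywt, and the wavelet, thresholds, and synthetic data are all assumptions.

```python
# Minimal sketch (not the WaveCNV code): breakpoint detection on a binned
# tumor/normal read-depth ratio track using a shift-invariant wavelet
# transform, followed by digitized copy-number assignment per segment.
import numpy as np
import pywt

def call_segments(depth_ratio, wavelet="haar", level=4, z_thresh=4.0):
    """Return (candidate breakpoints, per-segment integer copy numbers)."""
    n = len(depth_ratio)
    pad = (-n) % (2 ** level)                      # SWT needs a multiple of 2**level
    x = np.pad(depth_ratio, (0, pad), mode="edge")

    # Stationary (translation-invariant) wavelet transform; summing detail
    # energy across levels makes sharp level shifts stand out at their position.
    coeffs = pywt.swt(x, wavelet, level=level)
    detail_energy = np.sum([cD ** 2 for _, cD in coeffs], axis=0)[:n]

    # Candidate breakpoints: bins whose detail energy is an outlier
    # (adjacent detections would be merged in a real pipeline).
    z = (detail_energy - detail_energy.mean()) / (detail_energy.std() + 1e-12)
    breakpoints = np.flatnonzero(z > z_thresh)

    # Digitized copy number per segment: round 2 * median ratio (diploid baseline).
    bounds = np.concatenate(([0], breakpoints, [n]))
    copy_numbers = [int(round(2 * np.median(depth_ratio[a:b])))
                    for a, b in zip(bounds[:-1], bounds[1:]) if b > a]
    return breakpoints, copy_numbers

# Example on synthetic data: a single-copy gain over bins 400-700.
ratio = np.r_[np.ones(400), 1.5 * np.ones(300), np.ones(300)]
ratio += np.random.default_rng(0).normal(0, 0.05, ratio.size)
print(call_segments(ratio))
```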

    Development of a self-consolidating earth concrete (SCEC) with improved multifunctional performance for green construction

    Abstract: Earth construction has been used as a sustainable building method since ancient times. Rammed earth (RE) walls are among the most common elements of earth construction. The technique is regarded as green construction because it uses locally available materials with low environmental impact and good thermal performance. Earthen construction was neglected for a period; however, it has attracted renewed interest in recent years as a way to meet environmental constraints. Despite all the advantages of this construction material, the casting process can be very time- and energy-consuming because of the dynamic compaction it requires. This study aimed to evaluate the feasibility of achieving a self-consolidating earth concrete (SCEC), identifying its potential challenges and investigating its rheo-thermomechanical performance. The first challenge is that the fine particles in earth, especially clay, can hinder the flowability of SCEC. The second challenge is the prolonged setting time resulting from the low cement content. The last challenge comes from the diversity of earths, whose differing behaviors make it difficult to propose a comprehensive guideline for designing SCEC. The study was conducted in five complementary phases. Phase 1 consisted of a comprehensive literature review covering the advantages and disadvantages of earthen construction; potential solutions were also introduced to achieve SCEC and address the shortcomings of earth materials. The main objective of Phase 2 was to understand the efficiency of different chemical admixtures in the presence of various ternary powder systems (i.e., clay, silt, and cement). The Taguchi method was used to evaluate the workability and compressive strength of self-consolidating earth paste (SCEP) mixtures and to assess the significance of the influencing parameters, including clay type, cement content, water-to-powder ratio (W/P), cement-to-clay ratio (Ce/Cl), and admixture type. Several statistical models were introduced to predict the workability and compressive strength of SCEP mixtures; these models were also used to establish a new guideline for designing SCEP based on the characterization parameters of the earth (i.e., Atterberg limits). A new concrete-equivalent mortar (CEM) approach was introduced in Phase 3, in which CEM and concrete mixtures were investigated to verify the feasibility of SCEC. The workability, compressive strength, and drying shrinkage of the investigated mixtures were assessed. Besides evaluating the influence of each parameter on each response, this multiscale investigation made it possible to propose relationships for predicting CEM and SCEC performance from their corresponding paste and CEM mixtures, respectively. The rheology of paste, CEM, and SCEC mixtures was investigated in Phase 4: the different mixture constituents and admixture types led to distinct rheological behaviors, and new relationships between the rheological parameters and workability were recommended for paste, CEM, and concrete. In the last phase (Phase 5), the hygrothermal and microstructural characteristics of selected SCEC mixtures were investigated. This novel material exhibits a new microstructural system and hence a hygrothermal performance different from that of conventional earthen materials.
The hygrothermal tests included sorption isotherms, water vapor and gas permeability, heat capacity, and thermal performance (i.e., conductivity, diffusivity, and effusivity). Together with the pore size distributions, these results yielded meaningful correlations between the cumulative pore volumes in different size classes and the permeability, drying shrinkage, and compressive strength of the investigated SCEC mixtures.
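As a rough illustration of the Taguchi-style analysis mentioned above, the sketch below computes a larger-is-better signal-to-noise ratio from compressive-strength replicates and ranks two mixture parameters by the spread of their level means. It is not from the thesis; the factors, levels, and strength values are hypothetical.

```python
# Hypothetical sketch of a Taguchi-style analysis: rank mixture parameters by
# their effect on a larger-is-better S/N ratio computed from compressive
# strength results. All data below are illustrative placeholders.
import numpy as np
import pandas as pd

# Tiny two-factor design, two levels each, with two strength replicates per run.
runs = pd.DataFrame({
    "cement_content": ["low", "low", "high", "high"],
    "water_to_powder": [0.45, 0.55, 0.45, 0.55],
    "strength_MPa": [[8.1, 8.4], [6.9, 7.2], [12.3, 12.8], [10.1, 10.6]],
})

# Larger-is-better S/N ratio: -10 * log10(mean(1 / y^2)).
runs["sn_ratio"] = runs["strength_MPa"].apply(
    lambda y: -10 * np.log10(np.mean(1.0 / np.square(y)))
)

# Mean S/N per factor level; the factor with the largest range (delta) matters most.
for factor in ["cement_content", "water_to_powder"]:
    level_means = runs.groupby(factor)["sn_ratio"].mean()
    print(factor, "delta =", round(level_means.max() - level_means.min(), 2))
```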

    Densifying Hydration Products of Alite by a Bio-Inspired Admixture

    A bio-inspired, plant-derived polyphenol, tannic acid (TA), was identified as a renewable admixture for improving the compressive strength of concrete. Aiming to understand the mechanism responsible for this strength improvement, this study examines how TA mediates the hydration of tricalcium silicate (alite). Experiments show that TA can form complexes with calcium ions through chelation, retarding the hydration of alite and changing the hydration products. In particular, X-ray diffraction analysis shows that TA makes calcium hydroxide grow preferentially on the [0 0 1] face. Fourier-transform infrared spectroscopy and 29Si MAS NMR results reveal that the mean chain length of calcium silicate hydrate (C-S-H) is reduced by TA. More importantly, mercury intrusion porosimetry shows that pores with sizes near 30 nm were almost eliminated by adding TA, leading to a higher elastic modulus of the produced C-S-H and a higher compressive strength of the resulting concrete.

    Detection and quantification of delamination in concrete via time-lapse thermography with machine learning

    This study developed a framework to automatically extract sub-surface defects from time-lapse thermography (TLT) images of reinforced concrete bridge components. Traditional approaches to processing TLT data typically require manual interventions that do not scale easily to a large network of concrete bridges. A backbone of robust algorithms for detecting and analyzing deep sub-surface defects in concrete is needed to support condition assessment of concrete structures such as bridges. The current study leverages advances in adaptive signal and image processing to develop a fully automated TLT data processing pipeline capable of efficiently detecting defects at different depths in concrete. The methodology decomposes raw TLT datasets into narrow-band time-frequency components via a multiscale data analysis approach, the wavelet transform. The resulting decomposed modes are mined to extract defect information using thermal contrast enhancement routines. An objective measure of effectiveness based on signal-to-noise ratio was developed and used to compare the current framework with traditional approaches to processing TLT data. Active contour models were also designed to automatically extract the boundary locations and geometric properties of the sub-surface defects. The results show that the detection of deeper defects (3 in. and beyond) can be improved by analyzing the time-frequency response of surface temperature variations over a period of time. Compared with traditional lock-in algorithms and conventional infrared thermography images, the proposed framework is more effective at removing noisy information and produces images with greater contrast between intact and defective areas of concrete. Furthermore, a new process has been established to predict depths of delamination in reinforced concrete bridge components. Previous work adopted traditional approaches to quantify depths in active thermography, which mainly depend on models expressed as functions of time, frequency, phase contrast, and the material properties of the specimens. This work deals with passive thermography, which is affected by several environmental parameters such as solar heating, time of day, wind speed, cloud cover, and shadows. Machine learning (ML) was employed to estimate defect depths in a concrete block. Features such as phases, amplitudes, and frequencies were extracted using the Fast Fourier Transform (FFT) during the analysis stage. Additional minor sub-features, for instance averages of and differences between the maxima and minima of these features, were added to the ML analysis to attain acceptable learning performance. Support vector machine (SVM) and k-nearest neighbor (KNN) classifiers were trained using cross-validation with different numbers of folds and holdout validation. The resulting models achieved improved accuracy in estimating delamination depths in the concrete specimens, with good agreement.
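    A minimal sketch of the FFT-feature and classifier stage described above (not the authors' code): amplitude and phase features from each pixel's temperature time series, a couple of max/min sub-features, and SVM and KNN classifiers scored with k-fold cross-validation. The synthetic data, feature choices, and hyperparameters are assumptions.

```python
# Hypothetical sketch of the FFT-feature + SVM/KNN stage described above.
# Each sample is a surface-temperature time series for one pixel; the label is
# a discretized delamination depth. All data here are synthetic placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def fft_features(ts):
    """Amplitude and phase of the lowest non-DC FFT bins, plus max/min sub-features."""
    spec = np.fft.rfft(ts - ts.mean())
    amp, phase = np.abs(spec[1:4]), np.angle(spec[1:4])
    extras = [ts.max() - ts.min(), (ts.max() + ts.min()) / 2.0]   # "minor" sub-features
    return np.concatenate([amp, phase, extras])

# Synthetic stand-in dataset: 200 pixel time series, 3 depth classes.
rng = np.random.default_rng(1)
labels = rng.integers(0, 3, size=200)
t = np.linspace(0, 1, 96)
series = [np.exp(-(1 + d) * t) + 0.05 * rng.normal(size=t.size) for d in labels]
X = np.vstack([fft_features(ts) for ts in series])

for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, labels, cv=5)   # k-fold cross-validation
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```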

    Topics on Multiresolution Signal Processing and Bayesian Modeling with Applications in Bioinformatics

    Analysis of multi-resolution signals and time-series data has wide applications in biology, medicine, engineering, and other fields. In many cases, the large-scale (low-frequency) features of a signal, including basic descriptive statistics, trends, and smoothed functional estimates, do not carry useful information about the phenomenon of interest. On the other hand, the study of small-scale (high-frequency) features that look like noise may be more informative, even though extracting such features is not always straightforward. This dissertation addresses some of the issues pertaining to high-frequency feature extraction and the denoising of noisy signals. A further topic is the integration of genomic data with transatlantic voyage records of enslaved people from Africa to determine the ancestral origins of Afro-Americans.
    Chapter 2. Assessment of Scaling by Auto-Correlation Shells. In this chapter, we use the Auto-Correlation (AC) Shell to propose a feature extraction method that can effectively capture small-scale information in a signal. The AC Shell is a redundant, shift-invariant, and symmetric representation of the signal obtained from the auto-correlation function of compactly supported wavelets. The small-scale features are extracted by computing the energy of the AC Shell coefficients at different levels of decomposition, as well as the slope of the line fitted to these energy values in the AC Shell spectra. We discuss the theoretical properties and verify them using extensive simulations. We compare the features extracted from AC Shells with those of wavelets in terms of bias, variance, and mean square error (MSE). The results indicate that the AC Shell features tend to have smaller variance and hence are more reliable. Moreover, to show its effectiveness, we validate the feature extraction method in a classification setting, identifying patients with ovarian cancer through the analysis of their blood mass spectra. For this study, we use the features extracted by AC Shell spectra together with a support vector machine classifier to distinguish control from cancer cases.
    Chapter 3. Bayesian Binary Regressions in Wavelet-based Function Estimation. Wavelet shrinkage has been widely used in nonparametric statistics and signal processing for a variety of purposes, including denoising noisy signals and images, dimension reduction, and variable/feature selection. Although traditional wavelet shrinkage methods are effective and popular, they have one major drawback: the shrinkage process relies only on the information in the coefficient being thresholded, and the information contained in the neighboring coefficients is ignored. Similarly, standard AC Shell denoising methods shrink the empirical coefficients independently by comparing their magnitudes with a threshold value, so the other coefficients have no influence on the behavior of a particular coefficient. However, because of the redundant representation of signals and coefficients obtained by AC Shells, the dependency between neighboring coefficients and the amount of information they share increase. It is therefore important to propose a thresholding approach for AC Shell coefficients that takes the information of neighboring coefficients into account. In this chapter, we develop a new Bayesian denoising approach for AC Shell coefficients that integrates logistic regression, universal thresholding, and Bayesian inference.
We validate the proposed method using extensive simulations with various types of smooth and non-smooth signals. The results indicate that, for all signal types, including the neighboring coefficients improves the denoising process, resulting in lower MSEs. Moreover, we apply the proposed methodology to a case study of denoising Atomic Force Microscopy (AFM) signals that measure the adhesion strength between two materials at the nano-newton scale, in order to correctly identify the cantilever detachment point.
    Chapter 4. Bayesian Method in Combining Genetic and Historical Records of the Transatlantic Slave Trade in the Americas. Between 1515 and 1865, more than 12 million people were enslaved and forcibly moved from Africa to North and Latin America. Shipping documents recorded the origins and disembarkation of enslaved people. Traditionally, genealogy has been studied through the exploration of historical records, family trees, and birth certificates. Recent advances in genetics have revolutionized genealogy and made it more accurate. Although genetic methods can provide continental differentiation, they have poor spatial resolution, which makes it hard to localize ancestry assignment because the markers are distributed across different sub-continental regions. To overcome these drawbacks, this chapter proposes a hybrid approach that combines genetic marker results with the historical records of the transatlantic voyages of enslaved people. Adding the voyage data provides substantially increased resolution in ancestry assignment within a Bayesian modeling framework. The proposed framework uses voyage data from the historical records available in the transatlantic slave trade database as prior probabilities and combines them with the genetic markers of Afro-Americans, treated as the likelihood, to estimate the posterior (updated) probabilities of their ancestry assignments to geographical regions in Africa. We applied the proposed methodology to 60 Afro-American individuals and show that the prior information increased the assignment probabilities obtained from the posterior distributions for some of the regions.
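    A minimal sketch of the wavelet-spectra feature idea from Chapter 2: energy per detail level and the slope of a line fitted to the log-energies. Since the AC Shell transform is not available in standard libraries, an ordinary discrete wavelet transform from pywt is used here as a stand-in; the wavelet and decomposition depth are arbitrary choices.

```python
# Sketch of wavelet-spectra features: per-level detail energy plus the slope of
# a line fitted to the log2-energies, using a standard DWT as a stand-in for
# the AC Shell representation described in the dissertation.
import numpy as np
import pywt

def wavelet_spectra_features(signal, wavelet="sym4", level=6):
    coeffs = pywt.wavedec(signal, wavelet, level=level)   # [cA_n, cD_n, ..., cD_1]
    details = coeffs[1:]                                   # detail bands, coarse -> fine
    energies = np.array([np.mean(np.square(d)) for d in details])
    levels = np.arange(level, 0, -1)                       # matching decomposition levels
    slope = np.polyfit(levels, np.log2(energies), 1)[0]    # spectral (scaling) slope
    return energies, slope

# Example on white noise: per-level energies are roughly flat, so the slope is near zero.
rng = np.random.default_rng(0)
white = rng.normal(size=2048)
print("white-noise slope:", round(wavelet_spectra_features(white)[1], 2))
```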

    Wavelets and their use

    This review is intended as a practical guide for those who want to apply discrete wavelets in their work. The notion of wavelets and their use in practical computing and various applications are briefly described; rigorous proofs of mathematical statements are omitted, and the reader is referred to the corresponding literature. Multiresolution analysis and the fast wavelet transform have become standard procedures for dealing with discrete wavelets. The proper choice of a wavelet and the use of nonstandard matrix multiplication are often crucial for achieving a given goal. Analyzing various functions with the help of wavelets makes it possible to reveal fractal structures, singularities, etc. The wavelet transform of operator expressions helps solve some equations. In practical applications one often deals with discretized functions, and the stability of the wavelet transform and of the corresponding numerical algorithms becomes important. After discussing these topics we turn to practical applications of the wavelet machinery; they are so numerous that we limit ourselves to a few examples. The authors would be grateful for any comments that improve this review and move us closer to the goal proclaimed in the first sentence of the abstract. Comment: 63 pages with 22 ps-figures, to be published in Physics-Uspekhi
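    To make the fast wavelet transform mentioned above concrete, here is a one-level Haar analysis and synthesis step in Python (not from the review): averaging and differencing filters with exact reconstruction. Recursing on the approximation coefficients yields the full multiresolution pyramid.

```python
# Minimal illustration: one level of the fast wavelet transform with the Haar
# filters, and its inverse, demonstrating perfect reconstruction.
import numpy as np

def haar_step(x):
    """Split x (even length) into approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: scaled local averages
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: scaled local differences
    return a, d

def inverse_haar_step(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_step(x)                              # recursing on `a` gives deeper levels
print(np.allclose(inverse_haar_step(a, d), x))   # True: perfect reconstruction
```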