
    CCA: An R Package to Extend Canonical Correlation Analysis

    Canonical correlation analysis (CCA) is an exploratory statistical method to highlight correlations between two data sets acquired on the same experimental units. The cancor() function in R (R Development Core Team 2007) performs the core computations, but further work was required to provide the user with additional tools to facilitate the interpretation of the results. We implemented an R package, CCA, freely available from the Comprehensive R Archive Network (CRAN, http://CRAN.R-project.org/), to provide numerical and graphical outputs and to enable the user to handle missing values. The CCA package also includes a regularized version of CCA to deal with data sets containing more variables than units. Illustrations are given through the analysis of a data set from a nutrigenomic study in the mouse.
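    As a minimal illustration of the computation underlying CCA (analogous to what cancor() produces, not the CCA package itself), the canonical correlations are the singular values of the whitened cross-covariance matrix. The Python sketch below uses synthetic data and assumes more units than variables; the regularized variant mentioned above would add ridge terms to the block covariances.

```python
# Minimal sketch of the core CCA computation; variable names and data are illustrative.
import numpy as np

def canonical_correlations(X, Y):
    """Return canonical correlations between the column blocks X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1)
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # Inverse square root via eigendecomposition (assumes full-rank blocks,
        # i.e. more units than variables; regularized CCA would add ridge terms).
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)  # singular values = canonical correlations

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
Y = X[:, :2] @ rng.normal(size=(2, 3)) + 0.5 * rng.normal(size=(50, 3))
print(canonical_correlations(X, Y))
```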

    Clustering Time-Series Gene Expression Data Using Smoothing Spline Derivatives

    Microarray data acquired during time-course experiments allow the temporal variations in gene expression to be monitored. An original postprandial fasting experiment was conducted in the mouse and the expression of 200 genes was monitored with a dedicated macroarray at 11 time points between 0 and 72 hours of fasting. The aim of this study was to provide a relevant clustering of gene expression temporal profiles. This was achieved by focusing on the shapes of the curves rather than on the absolute level of expression. Specifically, we combined spline smoothing and first-derivative computation with hierarchical and partitioning clustering. A heuristic approach was proposed to tune the spline smoothing parameter using both statistical and biological considerations. Clusters are illustrated a posteriori through principal component analysis and heatmap visualization. Most results were found to be in agreement with the literature on the effects of fasting on the mouse liver and provide promising directions for future biological investigations.
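    The general idea of clustering on curve shape can be sketched as follows (a simplified illustration with synthetic profiles; the paper's smoothing-parameter heuristic and partitioning step are not reproduced, and the smoothing factor below is a placeholder): each profile is smoothed with a spline, its first derivative is evaluated on a common grid, and the derivative curves are clustered hierarchically.

```python
# Illustrative sketch: spline smoothing + first-derivative clustering of
# time-course profiles (synthetic data; smoothing factor s is a placeholder).
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
time = np.array([0, 3, 6, 9, 12, 18, 24, 36, 48, 60, 72], dtype=float)  # hours
profiles = np.vstack(
    [np.sin(time / 20.0) + 0.1 * rng.normal(size=time.size) for _ in range(10)] +
    [np.exp(-time / 30.0) + 0.1 * rng.normal(size=time.size) for _ in range(10)]
)

grid = np.linspace(time.min(), time.max(), 100)
derivatives = np.array([
    UnivariateSpline(time, y, s=0.5).derivative(1)(grid)  # d/dt of the smoothed curve
    for y in profiles
])

# Hierarchical (Ward) clustering on the derivative curves: shapes, not levels.
labels = fcluster(linkage(derivatives, method="ward"), t=2, criterion="maxclust")
print(labels)
```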

    Characterization of plug and slug multiphase flows by means of image analysis

    Multiphase flow is involved in a wide range of applications, and among the flow patterns that a multiphase mixture may develop, the intermittent one is particularly complex both in behaviour and for analysis. Experimental analysis of the characteristics of the flow structures (plugs and slugs) therefore remains essential for a detailed description of the phenomenon. In this work, an image-based technique for determining plug/slug characteristics was applied to air-water, oil-air and three-phase oil-water-air flows in horizontal ducts with different diameters, with superficial velocities of the phases in the range 0.2-2.1 m/s. The technique is based on the acquisition of a video of the flow and the conversion of each frame (or part of it) into a Boolean signal, in which the non-zero part represents the structure of interest. Concatenating these signals along their singleton dimension creates a space-time representation of the flow, from which information about the flow velocities, the structure lengths and frequencies, and the void fraction can be extracted. The focus here is particularly on the performance of the technique when using high-speed videos. The results were also compared with the predictions of the drift-flux model.
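    A hedged sketch of this space-time processing on synthetic data (the frame rate, pixel pitch, plug geometry and probe positions below are assumptions, not the paper's setup): each frame is reduced to a Boolean signal, the signals are stacked over time, and the structure velocity follows from the lag that best aligns two probe locations in the space-time matrix.

```python
# Illustrative sketch of the space-time technique on synthetic frames.
import numpy as np

fps = 1000.0          # frames per second (high-speed video, assumed)
pixel_size = 2.0e-4   # metres per pixel along the duct axis (assumed)
n_frames, n_pixels = 400, 300

# Synthetic video: a "plug" of length 60 px travelling at 0.5 m/s.
true_velocity = 0.5
space_time = np.zeros((n_frames, n_pixels), dtype=bool)
for t in range(n_frames):
    front = int(true_velocity * t / fps / pixel_size)
    space_time[t, max(front - 60, 0):front] = True   # Boolean signal for this frame

# Velocity from the lag maximising the cross-correlation between two probe columns.
x0, x1 = 50, 250
a = space_time[:, x0].astype(float) - space_time[:, x0].mean()
b = space_time[:, x1].astype(float) - space_time[:, x1].mean()
corr = np.correlate(b, a, mode="full")
lag_frames = corr.argmax() - (n_frames - 1)
velocity = (x1 - x0) * pixel_size / (lag_frames / fps)
void_fraction = space_time.mean()   # fraction of "structure" pixels in the space-time map

print(f"estimated velocity: {velocity:.3f} m/s, void fraction: {void_fraction:.2f}")
```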

    The early recognition of environmental impacts

    In developing countries, problems concerning water quality have worsened during the last decade. While in industrialized countries the traditional and modern types of water pollution (e.g. domestic, industrial, nutrients) emerged over a 100-year period, in developing countries they have occurred within one generation [WHO, 1989]. Short-term technical measures have important immediate effects, but for achieving sustainability it is critical to develop tools for long-term planning which allow a better understanding of how different strategies affect outcomes and how strategies are sensitive to different levels and types of financing [Bower, 1989]. In industrialized countries, the method of Material Flux Analysis (MFA) has been shown to be a suitable instrument for the early recognition of environmental problems and the evaluation of environmental measures [Baccini and Brunner, 1991]. It has been shown that it is possible to combine data from market research with data from urban waste management to observe the metabolic dynamics of a region [Baccini et al. 1993]. However, this method has not yet been applied in developing countries because of low data availability and poor data quality. The aim of this paper is to show how the method of MFA was applied to a region in a developing country with regard to water resource management.

    Incremental Material Flow Analysis with Bayesian Inference

    Material Flow Analysis (MFA) is widely used to study the life-cycles of materials from production, through use, to reuse, recycling or disposal, in order to identify environmental impacts and opportunities to address them. However, development of this type of analysis is often constrained by limited data, which may be uncertain, contradictory, missing or over-aggregated. This article proposes a Bayesian approach, in which uncertain knowledge about material flows is described by probability distributions. If little data is initially available, the model predictions will be rather vague. As new data is acquired, it is systematically incorporated to reduce the level of uncertainty. After reviewing previous approaches to uncertainty in MFA, the Bayesian approach is introduced, and a general recipe for its application to Material Flow Analysis is developed. This is applied to map global production of steel, using Markov Chain Monte Carlo simulations. As well as aiding the analyst, who can get started in the face of incomplete data, this incremental approach to MFA also supports efforts to improve communication of results by transparently accounting for uncertainty throughout. Engineering and Physical Sciences Research Council, grant numbers EP/K039326/1, EP/N02351x/
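    As a toy illustration of the incremental idea (not the paper's steel model, priors or data, none of which are given in the abstract), the sketch below treats a single process whose input splits into two outputs, keeps the mass balance exact by construction, places vague priors on the unknowns, and updates them with a few noisy observations using a random-walk Metropolis sampler.

```python
# Toy sketch of Bayesian MFA with a random-walk Metropolis sampler (NumPy only).
# Mass balance: input = out1 + out2, with out1 = share * input.
# Observations, priors and error levels below are invented for illustration.
import numpy as np

obs = {"input": 120.0, "out1": 45.0}    # hypothetical measurements (Mt/yr)
rel_err = 0.10                          # assumed 10% relative measurement error

def log_prior(log_input, logit_share):
    # Vague priors: input ~ LogNormal(log 100, 1); share ~ Uniform(0, 1) on the logit scale.
    lp = -0.5 * ((log_input - np.log(100.0)) / 1.0) ** 2
    share = 1.0 / (1.0 + np.exp(-logit_share))
    lp += np.log(share) + np.log(1.0 - share)        # Jacobian of the logit transform
    return lp

def log_likelihood(log_input, logit_share):
    total = np.exp(log_input)
    share = 1.0 / (1.0 + np.exp(-logit_share))
    out1 = share * total                              # mass balance holds by construction
    lp = -0.5 * ((obs["input"] - total) / (rel_err * total)) ** 2
    lp += -0.5 * ((obs["out1"] - out1) / (rel_err * out1)) ** 2
    return lp

def metropolis(n_steps=20000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.array([np.log(100.0), 0.0])            # start at the prior mode
    logp = log_prior(*theta) + log_likelihood(*theta)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=2)
        logp_prop = log_prior(*prop) + log_likelihood(*prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        samples.append(theta.copy())
    return np.array(samples[n_steps // 2:])           # discard burn-in

draws = metropolis()
total = np.exp(draws[:, 0])
share = 1.0 / (1.0 + np.exp(-draws[:, 1]))
print("input flow:", total.mean(), "+/-", total.std())
print("share to out1:", share.mean(), "+/-", share.std())
```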

    Weight‐of‐Evidence Approach for Assessing Removal of Metals from the Water Column for Chronic Environmental Hazard Classification

    The United Nations and the European Union have developed guidelines for the assessment of long-term (chronic) chemical environmental hazards. This approach recognizes that these hazards are often related to spillage of chemicals into freshwater environments. The goal of the present study was to examine the concept of metal ion removal from the water column in the context of hazard assessment and classification. We propose a weight-of-evidence approach that assesses several aspects of metals, including the intrinsic properties of metals, the rate at which metals bind to particles in the water column and settle, the transformation of metals to nonavailable and nontoxic forms, and the potential for remobilization of metals from sediment. We developed a test method to quantify metal removal in aqueous systems: the extended transformation/dissolution protocol (T/DP-E). The method is based on that of the Organisation for Economic Co-operation and Development (OECD). The key element of the protocol extension is the addition of substrate particles (as found in nature), allowing the removal processes to occur. The present study focused on extending this test to support the assessment of metal removal from aqueous systems, equivalent to the concept of "degradability" for organic chemicals. Although the technical aspects of our proposed method are different from the OECD method for organics, its use for hazard classification is equivalent. Models were developed providing mechanistic insight into processes occurring during the T/DP-E method. Some metals, such as copper, rapidly fell below the 70% threshold criterion (within 96 h), whereas others, such as strontium, did not. A variety of method variables were evaluated and optimized to allow for a reproducible, realistic hazard classification method that mimics reasonable worst-case scenarios. We propose that this method be standardized for OECD hazard classification via round-robin (ring) testing to ascertain its intra- and interlaboratory variability. Environ Toxicol Chem 2019;38:1839–1849. © 2019 SETAC.
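    The abstract does not specify the form of the mechanistic models, but the 70% removal criterion itself can be illustrated with a simple first-order removal fit to a dissolved-metal time series; the data and rate constant below are invented and are not the T/DP-E results.

```python
# Illustrative first-order removal check against a 70% threshold criterion.
import numpy as np

t = np.array([0.0, 24.0, 48.0, 96.0, 168.0, 672.0])   # hours
c = np.array([100.0, 55.0, 31.0, 10.0, 2.0, 0.1])     # % of initial dissolved metal (invented)

# Fit ln(C/C0) = -k t by least squares (assumes first-order binding/settling kinetics).
k = -np.polyfit(t, np.log(c / c[0]), 1)[0]
t70 = np.log(1.0 / 0.30) / k          # time at which 70% removal (30% remaining) is reached

print(f"fitted rate constant: {k:.4f} 1/h, 70% removal after ~{t70:.0f} h")
print("meets rapid-removal criterion within 96 h:", t70 <= 96.0)
```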

    Mapping and monitoring carbon stocks with satellite observations: a comparison of methods

    Mapping and monitoring carbon stocks in forested regions of the world, particularly the tropics, has attracted a great deal of attention in recent years, as deforestation and forest degradation account for up to 30% of anthropogenic carbon emissions and are now included in climate change negotiations. We review the potential for satellites to measure carbon stocks, specifically aboveground biomass (AGB), and provide an overview of a range of approaches that have been developed and used to map AGB across a diverse set of conditions and geographic areas. We provide a summary of the types of remote sensing measurements relevant to mapping AGB and assess the relative merits and limitations of each. We then provide an overview of traditional techniques of mapping AGB based on ascribing field measurements to vegetation or land cover type classes, and describe their merits and limitations relative to recent data mining algorithms used in an approach based on the direct utilization of remote sensing measurements, whether optical or lidar reflectance, or radar backscatter. We conclude that while satellite remote sensing has often been discounted as inadequate for the task, attempts to map AGB without satellite imagery are insufficient. Moreover, the direct remote sensing approach provided more coherent maps of AGB than traditional approaches. We demonstrate this with a case study focused on continental Africa and discuss the work in the context of reducing uncertainty for carbon monitoring and markets.
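    A hedged sketch of the direct, data-mining approach contrasted with class-based mapping (the regressor choice, feature set and data below are illustrative assumptions, not the algorithms or imagery used in the study): field-plot AGB is regressed on per-pixel remote sensing measurements and the fitted model is then applied wall-to-wall.

```python
# Sketch of the "direct" AGB mapping approach: regress field-plot biomass on
# per-pixel remote sensing measurements, then predict for every pixel.
# Features (optical bands, radar backscatter, lidar height, texture) and data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_plots = 500
features = rng.normal(size=(n_plots, 5))   # e.g. red, NIR, HV backscatter, lidar height, texture
agb = 80 + 40 * features[:, 3] + 10 * features[:, 1] + 5 * rng.normal(size=n_plots)  # Mg/ha, synthetic

X_train, X_test, y_train, y_test = train_test_split(features, agb, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))

# Wall-to-wall prediction: apply the fitted model to every pixel's feature vector.
pixels = rng.normal(size=(10_000, 5))
agb_map = model.predict(pixels)
```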