1,078 research outputs found

    "Hook"-calibration of GeneChip-microarrays: Theory and algorithm

    Abstract Background: The improvement of microarray calibration methods is an essential prerequisite for quantitative expression analysis. This issue requires the formulation of an appropriate model describing the basic relationship between the probe intensity and the specific transcript concentration in a complex environment of competing interactions, the estimation of the magnitude of these effects and their correction using the intensity information of a given chip, and, finally, the development of practicable algorithms which judge the quality of a particular hybridization and estimate the expression degree from the intensity values. Results: We present the so-called hook-calibration method, which co-processes the log-difference (delta) and log-sum (sigma) of the perfect match (PM) and mismatch (MM) probe intensities. The MM probes are utilized as an internal reference which is subject to the same hybridization law as the PM probes, albeit with modified characteristics. After sequence-specific affinity correction, the method fits the Langmuir adsorption model to the smoothed delta-versus-sigma plot. The geometrical dimensions of this so-called hook curve characterize the particular hybridization in terms of simple geometric parameters which provide information about the mean non-specific background intensity, the saturation value, the mean PM/MM sensitivity gain and the fraction of absent probes. This graphical summary establishes a metric system for expression estimates in natural units such as the mean binding constants and the occupancy of the probe spots. The method is single-chip based, i.e. it uses the intensities of each selected chip separately. Conclusion: The hook method corrects the raw intensities for non-specific background hybridization in a sequence-specific manner, for the potential saturation of the probe spots with bound transcripts and for the sequence-specific binding of specific transcripts. The obtained chip characteristics, in combination with the sensitivity-corrected probe-intensity values, provide expression estimates scaled in natural units given by the binding constants of the particular hybridization.
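    The delta-versus-sigma coordinates at the heart of the method can be sketched as follows. This is a minimal illustration only, assuming PM/MM intensities as NumPy arrays; the function name and smoothing window are hypothetical, and the sequence-specific affinity correction and the Langmuir fit of the full method are omitted.

    ```python
    import numpy as np

    def hook_coordinates(pm, mm, window=101):
        """Compute smoothed hook-plot coordinates from PM/MM intensities.

        delta = log10(PM) - log10(MM)          (log-difference)
        sigma = 0.5 * (log10(PM) + log10(MM))  (log-sum, mean log intensity)

        Probes are sorted by sigma, and delta is smoothed with a moving
        average along the sigma axis to trace out the hook-shaped curve.
        """
        log_pm, log_mm = np.log10(pm), np.log10(mm)
        delta = log_pm - log_mm
        sigma = 0.5 * (log_pm + log_mm)
        order = np.argsort(sigma)
        sigma_sorted, delta_sorted = sigma[order], delta[order]
        # simple moving-average smoothing of delta
        kernel = np.ones(window) / window
        delta_smooth = np.convolve(delta_sorted, kernel, mode="same")
        return sigma_sorted, delta_smooth
    ```

    The geometric parameters described in the abstract (background level, saturation value, PM/MM gain) would then be read off a Langmuir-model fit to this smoothed curve.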

    "Hook"-calibration of GeneChip-microarrays: Chip characteristics and expression measures

    Abstract Background: Microarray experiments rely on several critical steps that may introduce biases and uncertainty into downstream analyses. These steps include mRNA sample extraction, amplification and labelling, hybridization, and scanning, each of which can cause chip-specific systematic variations at the raw-intensity level. The chosen array type and the currency of the genomic information probed on the chip also affect the quality of the expression measures. In the accompanying publication we presented the theory and algorithm of the so-called hook method, which aims at correcting expression data for systematic biases using a series of new chip characteristics. Results: In this publication we summarize the essential chip characteristics provided by this method, analyze special benchmark experiments to estimate transcript-related expression measures and illustrate the power of the method to detect and quantify the quality of a particular hybridization. It is shown that our single-chip approach provides expression measures that respond linearly to changes in transcript concentration over three orders of magnitude. In addition, the method calculates a detection call judging the relation between the signal and the detection limit of the particular measurement. The performance of the method in the context of different chip generations and probe-set assignments is illustrated. The hook method characterizes RNA quality in terms of the 3'/5'-amplification bias and the sample-specific calling rate. We show that proper judgement of these effects requires the disentanglement of non-specific and specific hybridization, which, otherwise, can lead to misinterpretations of expression changes. The consequences of modifying probe/target interactions by either changing the labelling protocol or substituting RNA with DNA targets are demonstrated. Conclusion: The single-chip based hook method provides accurate expression estimates and chip-summary characteristics using the natural metrics given by the hybridization reaction, with the potential to establish new standards for microarray quality control and calibration.

    CRANKITE: a fast polypeptide backbone conformation sampler

    Background: CRANKITE is a suite of programs for simulating backbone conformations of polypeptides and proteins. The core of the suite is an efficient Metropolis Monte Carlo sampler of backbone conformations in continuous three-dimensional space in atomic detail. Methods: In contrast to other programs relying on local Metropolis moves in the space of dihedral angles, our sampler utilizes local crankshaft rotations of rigid peptide bonds in Cartesian space. Results: The sampler allows fast simulation and analysis of secondary structure formation and conformational changes for proteins of average length
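    The crankshaft move described above can be illustrated with a minimal sketch (this is not CRANKITE's actual implementation; the geometry uses the standard Rodrigues rotation formula and the textbook Metropolis criterion, and the energy function is left abstract):

    ```python
    import numpy as np

    def crankshaft_rotate(coords, i, j, angle):
        """Rotate atoms strictly between pivots i and j about the i-j axis.

        The pivot atoms stay fixed, so the perturbation is local in
        Cartesian space, which is what makes the move efficient.
        """
        axis = coords[j] - coords[i]
        axis = axis / np.linalg.norm(axis)
        rotated = coords.copy()
        for k in range(i + 1, j):
            v = coords[k] - coords[i]
            # Rodrigues' rotation formula
            v_rot = (v * np.cos(angle)
                     + np.cross(axis, v) * np.sin(angle)
                     + axis * np.dot(axis, v) * (1.0 - np.cos(angle)))
            rotated[k] = coords[i] + v_rot
        return rotated

    def metropolis_step(coords, energy_fn, beta, rng):
        """One Metropolis trial using a random crankshaft rotation."""
        n = len(coords)
        i = rng.integers(0, n - 2)
        j = rng.integers(i + 2, n)
        trial = crankshaft_rotate(coords, i, j,
                                  rng.uniform(-np.pi / 6, np.pi / 6))
        d_e = energy_fn(trial) - energy_fn(coords)
        if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
            return trial
        return coords
    ```

    Because the pivots are fixed and the intermediate atoms rotate rigidly, bond geometry between consecutive rotated atoms is preserved exactly, unlike a naive random displacement.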

    The Global Omnivore: Identifying Musical Taste Groups in Austria, England, Israel and Serbia

    This research offers a unique opportunity to revisit the omnivore hypothesis under a unified method of cross-national analysis. To accomplish this, we interpret omnivorism as a special case of cultural eclecticism (Ollivier, 2008; Ollivier, Gauthier and Truong, 2009). Our methodological approach incorporates the simultaneous analysis of locally produced and globally known musical genres. Its objective is to verify whether cultural omnivorism is a widespread phenomenon, and to determine to what extent any conclusions can be generalised across countries with different social structures and different levels of cultural openness. To truly understand the scope of the omnivorism hypothesis, we argue that it is essential to perform a cross-national comparison, testing the hypothesis within a range of social, political and cultural contexts and reflecting different historical and cultural repertoires (Lamont, 1992).

    Hazard Analysis of Critical Control Points Assessment as a Tool to Respond to Emerging Infectious Disease Outbreaks

    Highly pathogenic avian influenza virus (HPAI) strain H5N1 has had direct and indirect economic impacts arising from direct mortality and control programmes in over 50 countries reporting poultry outbreaks. HPAI H5N1 is now reported as the most widespread and expensive zoonotic disease recorded and continues to pose a global health threat. The aim of this research was to assess the potential of utilising Hazard Analysis of Critical Control Points (HACCP) assessments in providing a framework for a rapid response to emerging infectious disease outbreaks. This novel approach applies a scientific process, widely used in food production systems, to assess risks related to a specific emerging health threat within a known zoonotic disease hotspot. We conducted a HACCP assessment for HPAI viruses within Vietnam's domestic poultry trade and relate our findings to the existing literature. Our HACCP assessment identified poultry flock isolation, transportation, slaughter, preparation and consumption as critical control points for Vietnam's domestic poultry trade. Introduction of the preventative measures highlighted through this HACCP evaluation would reduce the risks posed by HPAI viruses and the pressure on the national economy. We conclude that this HACCP assessment provides compelling evidence of the role HACCP analyses could play in initiating rapid responses to emerging infectious disease outbreaks.

    Ranking differentially expressed genes from Affymetrix gene expression data: methods with reproducibility, sensitivity, and specificity

    Abstract Background: To identify differentially expressed genes (DEGs) from microarray data, users of the Affymetrix GeneChip system need to select both a preprocessing algorithm to obtain expression-level measurements and a way of ranking genes to obtain the most plausible candidates. We recently recommended suitable combinations of a preprocessing algorithm and gene ranking method that can be used to identify DEGs with a higher level of sensitivity and specificity. However, in addition to these recommendations, researchers also want to know which combinations enhance reproducibility. Results: We compared eight conventional methods for ranking genes: weighted average difference (WAD), average difference (AD), fold change (FC), rank products (RP), moderated t statistic (modT), significance analysis of microarrays (samT), shrinkage t statistic (shrinkT), and intensity-based moderated t statistic (ibmT), with six preprocessing algorithms (PLIER, VSN, FARMS, multi-mgMOS (mmgMOS), MBEI, and GCRMA). A total of 36 real experimental datasets were evaluated on the basis of the area under the receiver operating characteristic curve (AUC) as a measure of both sensitivity and specificity. We found that the RP method performed well for VSN-, FARMS-, MBEI-, and GCRMA-preprocessed data, and the WAD method performed well for mmgMOS-preprocessed data. Our analysis of the MicroArray Quality Control (MAQC) project's datasets showed that the FC-based gene ranking methods (WAD, AD, FC, and RP) had a higher level of reproducibility: the percentages of overlapping genes (POGs) across different sites for the FC-based methods were higher overall than those for the t-statistic-based methods (modT, samT, shrinkT, and ibmT). In particular, POG values for WAD were the highest overall among the FC-based methods irrespective of the choice of preprocessing algorithm. Conclusion: Our results demonstrate that to increase sensitivity, specificity, and reproducibility in microarray analyses, we need to select suitable combinations of preprocessing algorithms and gene ranking methods. We recommend the use of FC-based methods, in particular RP or WAD.
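    For reference, the WAD statistic named above can be sketched as follows. This is a minimal illustration under the assumption of genes-by-samples log-expression arrays for two conditions; it follows the published idea of weighting the average log-difference by the relative average signal, but the array layout and function name are this sketch's own.

    ```python
    import numpy as np

    def wad(log_expr_a, log_expr_b):
        """Weighted average difference (WAD) gene-ranking statistic.

        log_expr_a, log_expr_b: (genes x samples) arrays of log-scale
        expression for the two conditions. The average difference (AD)
        is weighted by the relative average signal, so highly expressed
        genes with consistent differences rank highest.
        """
        mean_a = log_expr_a.mean(axis=1)
        mean_b = log_expr_b.mean(axis=1)
        ad = mean_a - mean_b                      # average difference
        avg = (mean_a + mean_b) / 2.0             # average log signal
        # relative weight in [0, 1] across all genes
        weight = (avg - avg.min()) / (avg.max() - avg.min())
        return ad * weight
    ```

    Genes would then be ranked by the absolute value of this score; a plain FC or AD ranking corresponds to dropping the weight term.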

    Influence of long-range dipolar interactions on the phase stability and hysteresis shapes of ferroelectric and antiferroelectric multilayers

    Phase transition and field-driven hysteresis evolution of a two-dimensional Ising grid consisting of ferroelectric-antiferroelectric multilayers that take into account the long-range dipolar interactions were simulated by a Monte Carlo method. Simulations were carried out for a 1+1 bilayer and a 5+5 superlattice. Phase stabilities of components comprising the structures with an electrostatic-like coupling term were also studied. An electrostatic-like coupling, in the absence of an applied field, can drive the ferroelectric layers towards 180° domains with very flat domain interfaces, mainly due to the competition between this term and the dipole-dipole interaction. The antiferroelectric layers do not undergo an antiferroelectric-to-ferroelectric transition under the influence of an electrostatic-like coupling between layers, as the ferroelectric layer splits into periodic domains at the expense of the domain wall energy. The long-range interactions become significant near the interfaces. For high-periodicity structures with several interfaces, the interlayer long-range interactions substantially impact the configuration of the ferroelectric layers, while the antiferroelectric layers remain quite stable unless these layers are near the Néel temperature. In systems investigated with several interfaces, the hysteresis loops do not exhibit a clear presence of antiferroelectricity that could be expected in the presence of anti-parallel dipoles, i.e., the switching takes place abruptly. Some recent experimental observations in ferroelectric-antiferroelectric multilayers are discussed, where we conclude that the different electrical properties of bilayers and superlattices are not due to strain effects alone but also to long-range interactions. The latter manifest themselves particularly in superlattices, where layers are periodically exposed to each other at the interfaces.
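    A Metropolis update for an Ising grid with a long-range dipolar term might be sketched as below. This is a heavily simplified illustration, not the paper's model: the dipolar kernel is taken as an isotropic 1/r^3 decay truncated at a cutoff, and the electrostatic-like interlayer coupling, applied field, and multilayer geometry are all omitted.

    ```python
    import numpy as np

    def site_energy(spins, i, j, J=1.0, D=0.1, cutoff=5):
        """Energy terms involving site (i, j): nearest-neighbour exchange
        plus a truncated dipolar sum decaying as 1/r^3 (simplified,
        isotropic form; no Ewald-type treatment of the long-range tail)."""
        n, m = spins.shape
        s = spins[i, j]
        # nearest-neighbour exchange with periodic boundaries
        e = -J * s * (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
                      + spins[i, (j + 1) % m] + spins[i, (j - 1) % m])
        # long-range dipolar contribution within the cutoff radius
        for di in range(-cutoff, cutoff + 1):
            for dj in range(-cutoff, cutoff + 1):
                if di == 0 and dj == 0:
                    continue
                r2 = di * di + dj * dj
                if r2 <= cutoff * cutoff:
                    e += D * s * spins[(i + di) % n, (j + dj) % m] / r2 ** 1.5
        return e

    def metropolis_sweep(spins, beta, rng, J=1.0, D=0.1):
        """One sweep: attempt n*m single-spin flips at random sites."""
        n, m = spins.shape
        for _ in range(n * m):
            i, j = rng.integers(0, n), rng.integers(0, m)
            # flipping s negates every term containing it, so dE = -2 * e_site
            d_e = -2.0 * site_energy(spins, i, j, J, D)
            if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
                spins[i, j] *= -1
        return spins
    ```

    Hysteresis loops would then be traced by adding a field term -E*s to the site energy and cycling E while recording the mean polarization.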

    On staying grounded and avoiding Quixotic dead ends

    The 15 articles in this special issue on The Representation of Concepts illustrate the rich variety of theoretical positions and supporting research that characterize the area. Although much agreement exists among contributors, much disagreement exists as well, especially about the roles of grounding and abstraction in conceptual processing. I first review theoretical approaches raised in these articles that I believe are Quixotic dead ends, namely, approaches that are principled and inspired but likely to fail. In the process, I review various theories of amodal symbols, their distortions of grounded theories, and fallacies in the evidence used to support them. Incorporating further contributions across articles, I then sketch a theoretical approach that I believe is likely to be successful, which includes grounding, abstraction, flexibility, explaining classic conceptual phenomena, and making contact with real-world situations. This account further proposes that (1) a key element of grounding is neural reuse, (2) abstraction takes the forms of multimodal compression, distilled abstraction, and distributed linguistic representation (but not amodal symbols), and (3) flexible context-dependent representations are a hallmark of conceptual processing