
    Enabling High-Dimensional Hierarchical Uncertainty Quantification by ANOVA and Tensor-Train Decomposition

    Hierarchical uncertainty quantification can reduce the computational cost of stochastic circuit simulation by employing spectral methods at different levels. This paper presents an efficient framework for hierarchically simulating challenging stochastic circuits/systems that include high-dimensional subsystems. Because of the high parameter dimensionality, it is difficult both to extract surrogate models at the low level of the design hierarchy and to handle them in the high-level simulation. In this paper, we develop an ANOVA-based stochastic circuit/MEMS simulator to efficiently extract the surrogate models at the low level. To avoid the curse of dimensionality, we employ tensor-train decomposition at the high level to construct the basis functions and Gauss quadrature points. As a demonstration, we verify our algorithm on a stochastic oscillator with four MEMS capacitors and 184 random parameters. This challenging example is simulated efficiently by our simulator at the cost of only 10 minutes in MATLAB on a regular personal computer.
    Comment: 14 pages (IEEE double column), 11 figures; accepted by IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems.
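    As a generic illustration of the kind of tensor-train compression used at the high level, the sketch below implements a plain TT-SVD in Python/NumPy. It is not the paper's code; the truncation tolerance, mode sizes, and the rank-1 test tensor are assumptions for the example.

```python
# Minimal TT-SVD sketch (illustrative, not the paper's implementation).
# A full tensor is compressed into tensor-train cores by sequential
# truncated SVDs; tolerance and test tensor are assumptions.
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Return TT cores G_k with shapes (r_{k-1}, n_k, r_k)."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = max(1, int(np.sum(s > eps * s[0])))   # drop small singular values
        cores.append(U[:, :new_rank].reshape(rank, dims[k], new_rank))
        mat = (s[:new_rank, None] * Vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

# Usage: compress a 4-way rank-1 tensor and verify the reconstruction.
rng = np.random.default_rng(0)
v1, v2, v3, v4 = (rng.standard_normal(6) for _ in range(4))
full = np.einsum('i,j,k,l->ijkl', v1, v2, v3, v4)
cores = tt_svd(full)
recon = np.einsum('aib,bjc,ckd,dle->ijkl', *cores)
print(np.linalg.norm(full - recon))   # close to machine precision
```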

    Review of the mathematical foundations of data fusion techniques in surface metrology

    The recent proliferation of engineered surfaces, including freeform and structured surfaces, is challenging current metrology techniques. Measurement using multiple sensors has been proposed to achieve benefits, mainly an enhanced spatial frequency bandwidth, that a single sensor cannot provide. When using data from different sensors, a process of data fusion is required, and there is much active research in this area. In this paper, current data fusion methods and applications are reviewed, with a focus on the mathematical foundations of the subject. Common research questions in the fusion of surface metrology data are raised and potential fusion algorithms are discussed.
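    As a minimal sketch of one common fusion approach (not necessarily one of the algorithms reviewed), the example below performs pixel-wise inverse-variance weighted fusion of two co-registered surface height maps. The array sizes, noise levels, and the simulated surface are assumptions.

```python
# Illustrative sketch: inverse-variance weighted fusion of two height maps.
import numpy as np

def fuse_inverse_variance(z1, var1, z2, var2):
    """Fuse two co-registered height maps with (per-pixel or scalar) noise variances."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)          # variance of the fused estimate
    return z_fused, var_fused

# Usage: simulate two measurements of the same surface with different noise.
rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, 4 * np.pi, 256))[None, :] * np.ones((256, 1))
z_a = truth + rng.normal(0.0, 0.05, truth.shape)   # lower-noise sensor
z_b = truth + rng.normal(0.0, 0.20, truth.shape)   # noisier sensor
z_f, var_f = fuse_inverse_variance(z_a, 0.05**2, z_b, 0.20**2)
print(np.std(z_f - truth), np.sqrt(var_f))          # fused error vs predicted
```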

    Genomics clarifies taxonomic boundaries in a difficult species complex.

    Efforts to taxonomically delineate species are often confounded with conflicting information and subjective interpretation. Advances in genomic methods have resulted in a new approach to taxonomic identification that stands to greatly reduce much of this conflict. This approach is ideal for species complexes, where divergence times are recent (evolutionarily) and lineages less well defined. The California Roach/Hitch fish species complex is an excellent example, experiencing a convoluted geologic history, diverse habitats, conflicting species designations and potential admixture between species. Here we use this fish complex to illustrate how genomics can be used to better clarify and assign taxonomic categories. We performed restriction-site associated DNA (RAD) sequencing on 255 Roach and Hitch samples collected throughout California to discover and genotype thousands of single nucleotide polymorphism (SNPs). Data were then used in hierarchical principal component, admixture, and FST analyses to provide results that consistently resolved a number of ambiguities and provided novel insights across a range of taxonomic levels. At the highest level, our results show that the CA Roach/Hitch complex should be considered five species split into two genera (4 + 1) as opposed to two species from distinct genera (1 +1). Subsequent levels revealed multiple subspecies and distinct population segments within identified species. At the lowest level, our results indicate Roach from a large coastal river are not native but instead introduced from a nearby river. Overall, this study provides a clear demonstration of the power of genomic methods for informing taxonomy and serves as a model for future studies wishing to decipher difficult species questions. By allowing for systematic identification across multiple scales, taxonomic structure can then be tied to historical and contemporary ecological, geographic or anthropogenic factors
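    To illustrate the style of analysis named in the abstract, the sketch below runs a principal component analysis on a genotype matrix (individuals x SNPs coded 0/1/2) via SVD. It is not the study's pipeline; the simulated genotypes, sample sizes, and allele-frequency differences are assumptions.

```python
# Illustrative sketch: PCA of a 0/1/2 genotype matrix to separate two groups.
import numpy as np

rng = np.random.default_rng(2)
n_ind, n_snp = 40, 500
# two simulated "populations" with shifted allele frequencies per SNP
freq_a = rng.uniform(0.05, 0.95, n_snp)
freq_b = np.clip(freq_a + rng.normal(0.0, 0.15, n_snp), 0.01, 0.99)
geno = np.vstack([
    rng.binomial(2, freq_a, (n_ind // 2, n_snp)),   # population A
    rng.binomial(2, freq_b, (n_ind // 2, n_snp)),   # population B
]).astype(float)

# centre each SNP, then take the leading principal components per individual
geno -= geno.mean(axis=0)
U, s, Vt = np.linalg.svd(geno, full_matrices=False)
pcs = U[:, :2] * s[:2]   # PC1 should separate the two simulated groups
```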

    Calculation of Generalized Polynomial-Chaos Basis Functions and Gauss Quadrature Rules in Hierarchical Uncertainty Quantification

    Stochastic spectral methods are efficient techniques for uncertainty quantification. Recently they have shown excellent performance in the statistical analysis of integrated circuits. In stochastic spectral methods, one needs to determine a set of orthonormal polynomials and a proper numerical quadrature rule. The former are used as the basis functions in a generalized polynomial chaos expansion; the latter is used to compute the integrals involved in stochastic spectral methods. Obtaining such information requires knowing the density function of the random input a priori. However, individual system components are often described by surrogate models rather than density functions. In order to apply stochastic spectral methods in hierarchical uncertainty quantification, we first propose to construct physically consistent closed-form density functions by two monotone interpolation schemes. Then, by exploiting the special forms of the obtained density functions, we determine the generalized polynomial-chaos basis functions and the Gauss quadrature rules that are required by a stochastic spectral simulator. The effectiveness of our proposed algorithm is verified by both synthetic and practical circuit examples.
    Comment: Published by IEEE Trans CAD in May 201
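    As a generic illustration of how orthonormal-polynomial recurrence coefficients and Gauss quadrature rules can be obtained from a given density (a standard route via a discretized Stieltjes procedure plus Golub-Welsch, not necessarily the paper's algorithm), consider the sketch below. The Gaussian test density and grid resolution are assumptions.

```python
# Illustrative sketch: recurrence coefficients and Gauss quadrature from a density.
import numpy as np

def stieltjes(x, w, n):
    """Recurrence coefficients alpha[0..n-1], beta[0..n-1] of the monic
    orthogonal polynomials of the discrete measure sum_i w[i]*delta(x - x[i])."""
    alpha, beta = np.zeros(n), np.zeros(n)
    p_prev, p_curr = np.zeros_like(x), np.ones_like(x)
    beta[0] = np.sum(w)                                  # zeroth moment
    norm_prev = beta[0]
    for k in range(n):
        norm = np.sum(w * p_curr**2)
        alpha[k] = np.sum(w * x * p_curr**2) / norm
        if k > 0:
            beta[k] = norm / norm_prev
        # three-term recurrence p_{k+1} = (x - alpha_k) p_k - beta_k p_{k-1}
        p_next = (x - alpha[k]) * p_curr - (beta[k] if k > 0 else 0.0) * p_prev
        p_prev, p_curr, norm_prev = p_curr, p_next, norm
    return alpha, beta

def golub_welsch(alpha, beta):
    """Gauss nodes/weights from the symmetric tridiagonal Jacobi matrix."""
    J = np.diag(alpha) + np.diag(np.sqrt(beta[1:]), 1) + np.diag(np.sqrt(beta[1:]), -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = beta[0] * vecs[0, :]**2
    return nodes, weights

# Usage: 3-point Gauss rule for a standard Gaussian density discretized on a grid.
xg = np.linspace(-8.0, 8.0, 20001)
wg = np.exp(-xg**2 / 2) / np.sqrt(2 * np.pi) * (xg[1] - xg[0])
a, b = stieltjes(xg, wg, 3)
nodes, weights = golub_welsch(a, b)   # ~ (-1.732, 0, 1.732) and (1/6, 2/3, 1/6)
```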

    Holographic Reduced Representations for Oscillator Recall: A Model of Phonological Production

    This paper describes a new computational model of phonological production, Holographic Reduced Representations for Oscillator Recall, or HORROR. HORROR's architecture accounts for phonological speech error patterns by combining the hierarchical oscillating context signal of the OSCAR serial-order model (Vousden et al., 2000; Brown et al., 2000) with a holographic associative memory (Plate, 1995). The resulting model is novel in a number of ways. Most importantly, all of the noise needed to generate errors is intrinsic to the system, instead of being generated by an external process. The model features fully-distributed hierarchical phoneme representations and a single distributed associative memory. Using fewer parameters and a more parsimonious design than OSCAR, HORROR accounts for error type proportions, the syllable-position constraint, and other constraints seen in the human speech error data.
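    For readers unfamiliar with holographic reduced representations, the sketch below shows the core operations the model builds on: binding by circular convolution and approximate unbinding by circular correlation (Plate, 1995). It is an illustration, not the HORROR code; the vector dimension and random role/filler items are assumptions.

```python
# Illustrative sketch of HRR binding/unbinding via FFT.
import numpy as np

def bind(a, b):
    """Circular convolution: the HRR binding operator."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(trace, cue):
    """Approximate inverse: circular correlation of the trace with the cue."""
    return np.real(np.fft.ifft(np.fft.fft(trace) * np.conj(np.fft.fft(cue))))

# Usage: store two role-filler pairs in one distributed trace, then recall one.
n = 1024
rng = np.random.default_rng(3)
role1, role2, phon1, phon2 = (rng.normal(0.0, 1.0 / np.sqrt(n), n) for _ in range(4))
memory = bind(role1, phon1) + bind(role2, phon2)    # superposed associations
recalled = unbind(memory, role1)                    # noisy version of phon1
# cleanup memory: pick the stored item most similar to the noisy recall
best = max([phon1, phon2], key=lambda v: np.dot(v, recalled))
print(best is phon1)   # True with high probability at this dimensionality
```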