
    Non-parametric Bayesian modeling of complex networks

    Modeling structure in complex networks using Bayesian non-parametrics makes it possible to specify flexible model structures and to infer the adequate model complexity from the observed data. This paper provides a gentle introduction to non-parametric Bayesian modeling of complex networks: using an infinite mixture model as a running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced non-parametric models for complex networks can be derived and point out relevant literature.
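
    To make the running example concrete, here is a minimal sketch of the infinite-limit mixture model for networks (the infinite relational model) with collapsed Gibbs sampling, assuming a symmetric binary adjacency matrix, a Chinese-restaurant-process prior on block assignments, and conjugate Beta-Bernoulli block link probabilities. The function names (log_joint, gibbs_sweep), the hyperparameters, and the toy graph are illustrative choices of ours, not the paper's code.

        import numpy as np
        from scipy.special import betaln

        def log_joint(A, z, a=1.0, b=1.0):
            # Collapsed Beta-Bernoulli log-likelihood of a symmetric binary
            # graph under block assignment z (link probabilities integrated out).
            K = z.max() + 1
            ll = 0.0
            for k in range(K):
                for l in range(k, K):
                    mask = np.outer(z == k, z == l)
                    if k == l:
                        mask = np.triu(mask, 1)   # count each within-block pair once
                    n_pairs = mask.sum()
                    n_links = A[mask].sum()
                    ll += betaln(a + n_links, b + n_pairs - n_links) - betaln(a, b)
            return ll

        def gibbs_sweep(A, z, alpha=1.0, a=1.0, b=1.0):
            # One collapsed Gibbs sweep with a Chinese-restaurant-process prior:
            # each node may join an existing block or open a new one.
            for i in range(len(z)):
                z[i] = -1                          # detach node i
                keep = z >= 0
                _, z[keep] = np.unique(z[keep], return_inverse=True)  # compact labels
                K = z.max() + 1
                sizes = np.bincount(z[keep], minlength=K + 1)
                logp = np.empty(K + 1)
                for k in range(K + 1):             # existing blocks, then a new one
                    z[i] = k
                    prior = np.log(sizes[k]) if k < K else np.log(alpha)
                    logp[k] = prior + log_joint(A, z, a, b)
                p = np.exp(logp - logp.max())
                z[i] = np.random.choice(K + 1, p=p / p.sum())
            return z

        # toy symmetric graph and a short MCMC run
        rng = np.random.default_rng(0)
        A = rng.random((30, 30)) < 0.15
        A = np.triu(A, 1); A = (A | A.T).astype(int)
        z = np.zeros(30, dtype=int)
        for _ in range(50):
            z = gibbs_sweep(A, z)

    The new-block candidate is what realizes the "infinite limit" of the finite mixture: the number of blocks is not fixed in advance but inferred from the data.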

    Finding Structural Information of RF Power Amplifiers using an Orthogonal Non-Parametric Kernel Smoothing Estimator

    A non-parametric technique for modeling the behavior of RF power amplifiers is presented. The technique relies on the principles of density estimation with the kernel method and is suited for power amplifier modeling. The proposed methodology transforms the input domain into an orthogonal memory domain, in which non-parametric static functions are identified using the kernel estimator. These orthogonal, non-parametric functions can be fitted with any desired mathematical structure, which facilitates their implementation. Furthermore, owing to the orthogonality, the non-parametric functions can be analyzed and discarded individually, which simplifies the pruning of basis functions and provides a tradeoff between complexity and performance. The results show that the methodology can be employed to model power amplifiers, yielding error performance similar to that of state-of-the-art parametric models. Moreover, a parameter-efficient model structure with 6 coefficients was derived for a Doherty power amplifier, significantly reducing the computational complexity of deployment. Finally, the methodology can also be exploited in digital linearization techniques.
    Comment: Matlab sample code (15 MB): https://dl.dropboxusercontent.com/u/106958743/SampleMatlabKernel.zi
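
    As a minimal illustration of the kernel-estimation step only (the paper's orthogonal memory-domain transform is not reproduced here), the sketch below fits a memoryless AM/AM characteristic with a Nadaraya-Watson kernel regressor. The tanh compression curve, bandwidth h, and sample sizes are our toy assumptions.

        import numpy as np

        def kernel_regression(x_train, y_train, x_eval, h=0.05):
            # Nadaraya-Watson estimator with a Gaussian kernel: a non-parametric
            # estimate of a static map y = f(x) from noisy samples.
            w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
            return (w @ y_train) / w.sum(axis=1)

        # toy PA-like AM/AM curve: mild gain compression at high input amplitude
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, 2000)                       # input envelope
        y = np.tanh(2.5 * x) + 0.02 * rng.standard_normal(x.size)
        grid = np.linspace(0.0, 1.0, 100)
        f_hat = kernel_regression(x, y, grid)                 # estimated nonlinearity

    Once such a non-parametric curve is recovered, it can be refitted with any convenient parametric basis, which is the flexibility the abstract highlights.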

    Kneadings, Symbolic Dynamics and Painting Lorenz Chaos. A Tutorial

    A new computational technique based on a symbolic description using kneading invariants is proposed and verified for the exploration of dynamical and parametric chaos in a few exemplary systems with the Lorenz attractor. The technique allows for uncovering the stunning complexity and universality of bi-parametric structures and for detecting their organizing centers, codimension-two T-points and separating saddles, in kneading-based scans of the iconic Lorenz equation from hydrodynamics, a normal model from mathematics, and a laser model from nonlinear optics.
    Comment: Journal of Bifurcations and Chaos, 201
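
    A rough sketch of the symbolic step: integrate the Lorenz system from a point just off the saddle at the origin (a crude stand-in for its unstable separatrix), record which wing of the attractor each loop visits, and truncate the resulting kneading series. The symbol convention (sign of x at local maxima of z), the initial offset, and the truncation length are our illustrative choices, not the paper's exact procedure.

        import numpy as np
        from scipy.integrate import solve_ivp

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        def kneading_symbols(x0=1e-3, n_symbols=20, t_max=100.0):
            # Each local maximum of z marks a completed loop around one of the
            # two equilibria; the sign of x there gives a binary symbol.
            sol = solve_ivp(lorenz, (0.0, t_max), [x0, 0.0, 0.0], max_step=0.01)
            x, z = sol.y[0], sol.y[2]
            peaks = np.where((z[1:-1] > z[:-2]) & (z[1:-1] > z[2:]))[0] + 1
            return (x[peaks[:n_symbols]] > 0).astype(int)

        def kneading_invariant(symbols, q=0.5):
            # Truncated formal power series P(q) = sum_n kappa_n * q**n;
            # comparing P over a parameter grid yields bi-parametric scans.
            return float(np.sum(symbols * q ** np.arange(len(symbols))))

        P = kneading_invariant(kneading_symbols())

    Sweeping two parameters (e.g. rho and sigma) and color-coding P is what produces the "painted" bi-parametric chaos scans the title refers to.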

    Parametric PDEs: Sparse or Low-Rank Approximations?

    We consider adaptive approximations of the parameter-to-solution map for elliptic operator equations depending on a large or infinite number of parameters, comparing approximation strategies of different degrees of nonlinearity: sparse polynomial expansions, general low-rank approximations separating spatial and parametric variables, and hierarchical tensor decompositions separating all variables. We describe corresponding adaptive algorithms based on a common generic template and show their near-optimality with respect to natural approximability assumptions for each type of approximation. A central ingredient in the resulting bounds for the total computational complexity is a set of new operator compression results for the case of infinitely many parameters. We conclude with a comparison of the complexity estimates based on the actual approximability properties of classes of parametric model problems, which shows that the computational costs of optimized low-rank expansions can be significantly lower or higher than those of sparse polynomial expansions, depending on the particular type of parametric problem.
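
    The two approximability notions being compared can be probed numerically on a toy parameter-to-solution map. The sketch below (our illustration, not the paper's algorithms) builds a snapshot matrix for a simple map u(x, y), then inspects the singular-value decay, which governs low-rank separation of spatial and parametric variables, against the decay of Legendre coefficients in the parameter, which governs sparse polynomial expansion.

        import numpy as np

        # toy parameter-to-solution map: u(x, y) = 1 / (1 + 0.9 * y * x), y in [0, 1]
        x = np.linspace(0.0, 1.0, 200)                 # "spatial" grid
        y = np.linspace(0.0, 1.0, 100)                 # parameter samples
        U = 1.0 / (1.0 + 0.9 * np.outer(y, x))         # one row per parameter value

        # low-rank view: singular-value decay bounds the rank-r separation error
        s = np.linalg.svd(U, compute_uv=False)

        # sparse-polynomial view: decay of Legendre coefficients in y (mapped to [-1, 1])
        deg = 15
        coeffs = np.polynomial.legendre.legfit(2 * y - 1, U, deg)
        coeff_norms = np.linalg.norm(coeffs, axis=1)   # one norm per polynomial degree

        print("leading singular values:", s[:5])
        print("Legendre coefficient norms:", coeff_norms[:5])

    Whichever sequence decays faster indicates which strategy is cheaper for that problem class, mirroring the paper's conclusion that the comparison can go either way.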

    Variational Inference of Joint Models using Multivariate Gaussian Convolution Processes

    We present a non-parametric prognostic framework for individualized event prediction based on joint modeling of both longitudinal and time-to-event data. Our approach exploits a multivariate Gaussian convolution process (MGCP) to model the evolution of longitudinal signals and a Cox model for the time-to-event data, with the longitudinal signals modeled through the MGCP. Taking advantage of the unique structure imposed by convolved processes, we provide a variational inference framework to simultaneously estimate parameters in the joint MGCP-Cox model. This significantly reduces computational complexity and safeguards against model overfitting. Experiments on synthetic and real-world data show that the proposed framework outperforms state-of-the-art approaches built on two-stage inference and strong parametric assumptions.
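
    To illustrate the convolved-process construction only (the Cox mapping and the variational bound are not shown), the sketch below builds the joint covariance of two longitudinal signals that share one latent white-noise process, each smoothed by a Gaussian kernel; under that assumption the cross-covariance has the closed form coded here. The variances, length-scales, and grids are our toy choices.

        import numpy as np

        def mgcp_cross_cov(x1, x2, v1, v2, l1, l2):
            # cov(f1(x), f2(x')) for f_i = (Gaussian kernel_i) * (white noise):
            # a Gaussian in x - x' with combined length-scale l1^2 + l2^2.
            a = l1 ** 2 + l2 ** 2
            scale = v1 * v2 * np.sqrt(2.0 * np.pi * l1 ** 2 * l2 ** 2 / a)
            d = x1[:, None] - x2[None, :]
            return scale * np.exp(-d ** 2 / (2.0 * a))

        # joint covariance over two signals observed on different time grids
        t1 = np.linspace(0.0, 1.0, 30)
        t2 = np.linspace(0.0, 1.0, 25)
        K = np.block([[mgcp_cross_cov(t1, t1, 1.0, 1.0, 0.1, 0.1),
                       mgcp_cross_cov(t1, t2, 1.0, 0.8, 0.1, 0.2)],
                      [mgcp_cross_cov(t2, t1, 0.8, 1.0, 0.2, 0.1),
                       mgcp_cross_cov(t2, t2, 0.8, 0.8, 0.2, 0.2)]])

    The shared latent process is what couples the signals: the off-diagonal blocks let sparse observations of one signal borrow strength from the other, which is the mechanism the joint model exploits for individualized prediction.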

    Holographic Complexity of Einstein-Maxwell-Dilaton Gravity

    We study the holographic complexity of Einstein-Maxwell-Dilaton gravity using the recently proposed "complexity = volume" and "complexity = action" dualities. The model we consider has a ground state that is represented in the bulk by a so-called hyperscaling-violating geometry. We calculate the action growth of the Wheeler-DeWitt patch of the corresponding black hole solution at non-zero temperature and find that, in the presence of hyperscaling violation, there is a parametric enhancement of the action growth rate. We partially match this behavior to simple tensor network models that can capture aspects of hyperscaling violation. We also exhibit the switchback effect in complexity growth using shockwave geometries and comment on a subtlety of our action calculations when the metric is discontinuous at a null surface.
    Comment: 30 pages; v2: Fixed a technical error. The corrected result no longer has a logarithmic divergence in the action growth rate associated with the singularity. The conjectured complexity growth rate now also matches better with the tensor network model
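
    For reference, the two dualities the abstract invokes are usually stated as below; this is standard background rather than a result of the paper, and normalization conventions (in particular the length scale l) vary across the literature.

        % "complexity = volume": maximal-volume bulk slice B anchored on the
        % boundary time slice \Sigma; "complexity = action": on-shell action
        % of the Wheeler-DeWitt patch. Normalizations are convention-dependent.
        \mathcal{C}_V(\Sigma) = \max_{\partial B = \Sigma} \frac{\mathrm{Vol}(B)}{G_N\,\ell},
        \qquad
        \mathcal{C}_A = \frac{I_{\mathrm{WdW}}}{\pi\hbar}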