
    Hierarchical Models for Independence Structures of Networks

    We introduce a new family of network models, called hierarchical network models, that allow us to represent explicitly the stochastic dependence among the dyads (random ties) of the network. In particular, each member of this family can be associated with a graphical model defining conditional independence clauses among the dyads of the network, called the dependency graph. Every network model with a dyadic independence assumption can be generalized to construct members of this new family. Using this new framework, we generalize the Erdős-Rényi and beta-models to create hierarchical Erdős-Rényi and beta-models. We describe various methods for parameter estimation as well as simulation studies for models with sparse dependency graphs. Comment: 19 pages, 7 figures.
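
    As a rough illustration of the dyad-level dependence described above (not the paper's hierarchical construction), the toy sketch below first samples ties independently, Erdős-Rényi style, and then from a simple pairwise model whose dependency graph links dyads sharing a node; the "shared node" rule and the parameters theta0, theta1 are illustrative assumptions.

```python
# Toy illustration (not the paper's construction): the six dyads of a 4-node
# network, first sampled independently (Erdos-Renyi style), then from a simple
# pairwise model whose dependency graph links dyads that share a node, sampled
# by Gibbs sweeps. theta0 and theta1 are illustrative parameters.
import itertools
import math
import random

random.seed(0)
nodes = range(4)
dyads = list(itertools.combinations(nodes, 2))          # the random ties

# Dyadic independence: classical Erdos-Renyi with tie probability p.
p = 0.3
er_sample = {d: int(random.random() < p) for d in dyads}

# Dependency graph: two dyads are neighbours if they share a node, so their
# states are conditionally dependent given the rest.
def neighbours(d):
    return [e for e in dyads if e != d and set(e) & set(d)]

# Gibbs sampling from a toy pairwise model:
# P(tie_d = 1 | rest) is logistic in the number of neighbouring ties present.
theta0, theta1 = -1.0, 0.5
state = {d: 0 for d in dyads}
for sweep in range(200):
    for d in dyads:
        logit = theta0 + theta1 * sum(state[e] for e in neighbours(d))
        state[d] = int(random.random() < 1.0 / (1.0 + math.exp(-logit)))

print("independent dyads:", er_sample)
print("dependent dyads:  ", state)
```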

    Bayesian optimization of the PC algorithm for learning Gaussian Bayesian networks

    The PC algorithm is a popular method for learning the structure of Gaussian Bayesian networks. It carries out statistical tests to determine absent edges in the network. It is hence governed by two parameters: (i) the type of test, and (ii) its significance level. These parameters are usually set to values recommended by an expert. Nevertheless, such an approach can suffer from human bias, leading to suboptimal reconstruction results. In this paper we consider a more principled approach for choosing these parameters in an automatic way. For this we optimize a reconstruction score evaluated on a set of different Gaussian Bayesian networks. This objective is expensive to evaluate and lacks a closed-form expression, which means that Bayesian optimization (BO) is a natural choice. BO methods use a model to guide the search and are hence able to exploit smoothness properties of the objective surface. We show that the parameters found by a BO method outperform those found by a random search strategy and the expert recommendation. Importantly, we have found that an often overlooked statistical test provides the best overall reconstruction results.
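
    A minimal sketch of this kind of search, assuming a Gaussian-process surrogate with an upper-confidence-bound acquisition and tuning only the significance level; reconstruction_score is a hypothetical placeholder for the expensive objective (in the paper both the test type and its level are optimized, over a set of benchmark networks).

```python
# Minimal Bayesian-optimization sketch for tuning the PC algorithm's
# significance level alpha. `reconstruction_score` is a hypothetical stand-in
# for the expensive objective (e.g. an average reconstruction score of the
# recovered structures against known benchmark networks).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def reconstruction_score(alpha: float) -> float:
    # Placeholder: run the PC algorithm at this alpha on a benchmark suite
    # and return an average reconstruction score (higher is better).
    return -((np.log10(alpha) + 2.0) ** 2) + 0.1 * np.random.randn()

log_alpha_grid = np.linspace(-4, -0.3, 200).reshape(-1, 1)   # alpha in [1e-4, 0.5]
X, y = [], []

# A few random evaluations to start, then alternate GP fit / acquisition.
rng = np.random.default_rng(0)
for x0 in rng.choice(log_alpha_grid.ravel(), size=3, replace=False):
    X.append([x0]); y.append(reconstruction_score(10 ** x0))

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(15):
    gp.fit(np.array(X), np.array(y))
    mu, sd = gp.predict(log_alpha_grid, return_std=True)
    ucb = mu + 2.0 * sd                        # upper-confidence-bound acquisition
    x_next = log_alpha_grid[np.argmax(ucb)][0]
    X.append([x_next]); y.append(reconstruction_score(10 ** x_next))

best = 10 ** np.array(X).ravel()[int(np.argmax(y))]
print(f"best alpha found: {best:.4g}")
```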

    Localizing the Latent Structure Canonical Uncertainty: Entropy Profiles for Hidden Markov Models

    This report addresses state inference for hidden Markov models. These models rely on unobserved states, which often have a meaningful interpretation. This makes it necessary to develop diagnostic tools for quantification of state uncertainty. The entropy of the state sequence that explains an observed sequence for a given hidden Markov chain model can be considered as the canonical measure of state sequence uncertainty. This canonical measure is not reflected by the classic multivariate state profiles computed by the smoothing algorithm, which summarize the possible state sequences. Here, we introduce a new type of profile with the following properties: (i) these profiles of conditional entropies are a decomposition of the canonical measure of state sequence uncertainty along the sequence and make it possible to localize this uncertainty, (ii) these profiles are univariate and thus remain easily interpretable on tree structures. We show how to extend the smoothing algorithms for hidden Markov chain and tree models to compute these entropy profiles efficiently. Comment: Submitted to the Journal of Machine Learning Research; No. RR-7896 (2012).
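
    For the chain case, the decomposition referred to above can be written as H(S_1..T | X_1..T) = H(S_1 | X) + sum over t >= 2 of H(S_t | S_{t-1}, X), which holds because the state sequence remains Markovian conditionally on the observations. The numpy sketch below computes this profile for an assumed toy HMM; it illustrates the idea only, not the authors' tree-extended algorithm.

```python
# Entropy profile sketch: each term of
#   H(S_1..T | X) = H(S_1 | X) + sum_{t>=2} E[ H(S_t | S_{t-1}, X) ]
# localizes the canonical state-sequence uncertainty at one time step.
# The 2-state, 2-symbol HMM below is purely illustrative.
import numpy as np

A = np.array([[0.9, 0.1], [0.2, 0.8]])      # transition matrix
B = np.array([[0.8, 0.2], [0.3, 0.7]])      # emission matrix
pi = np.array([0.5, 0.5])
obs = [0, 0, 1, 1, 0]

T, K = len(obs), len(pi)
alpha = np.zeros((T, K)); beta = np.ones((T, K))
alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):                       # forward pass
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
for t in range(T - 2, -1, -1):              # backward pass
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

gamma = alpha * beta
gamma /= gamma.sum(axis=1, keepdims=True)          # P(S_t | X), smoothed profiles

profile = np.zeros(T)
profile[0] = -np.sum(gamma[0] * np.log(gamma[0]))  # H(S_1 | X)
for t in range(1, T):
    # xi[i, j] = P(S_{t-1}=i, S_t=j | X)
    xi = alpha[t - 1][:, None] * A * (B[:, obs[t]] * beta[t])[None, :]
    xi /= xi.sum()
    cond = xi / gamma[t - 1][:, None]              # P(S_t | S_{t-1}, X)
    profile[t] = -np.sum(xi * np.log(cond))        # E[ H(S_t | S_{t-1}, X) ]

print("entropy profile:", profile)
print("canonical state-sequence entropy:", profile.sum())
```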

    Bayesian Networks for Max-linear Models

    We study Bayesian networks based on max-linear structural equations as introduced in Gissibl and Klüppelberg [16] and provide a summary of their independence properties. In particular we emphasize that distributions for such networks are generally not faithful to the independence model determined by their associated directed acyclic graph. In addition, we consider some of the basic issues of estimation and discuss generalized maximum likelihood estimation of the coefficients, using the concept of a generalized likelihood ratio for non-dominated families as introduced by Kiefer and Wolfowitz [21]. Finally we argue that the structure of a minimal network can asymptotically be identified completely from observational data. Comment: 18 pages.
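
    As a small illustration of what a max-linear structural equation model looks like, the sketch below simulates X_i = max( max over parents j of c_ij * X_j, Z_i ) in topological order; the DAG, the coefficients, and the Fréchet innovations are assumptions made for the sketch, and the estimation discussed in the paper is not shown.

```python
# Sketch of a max-linear structural equation model on a small DAG,
#   X_i = max( max_{j in pa(i)} c_ij * X_j , Z_i ),
# simulated in topological order with independent Frechet(1) innovations
# (unit weight on the innovation for simplicity).
import numpy as np

rng = np.random.default_rng(1)
parents = {1: [], 2: [1], 3: [1], 4: [2, 3]}        # DAG: 1 -> 2, 1 -> 3, {2,3} -> 4
c = {(2, 1): 0.5, (3, 1): 0.8, (4, 2): 0.7, (4, 3): 0.4}

def simulate(n: int) -> np.ndarray:
    Z = 1.0 / rng.exponential(size=(n, 4))          # standard Frechet(1) noise
    X = np.zeros((n, 4))
    for i in sorted(parents):                       # topological order
        terms = [c[(i, j)] * X[:, j - 1] for j in parents[i]]
        X[:, i - 1] = np.maximum.reduce(terms + [Z[:, i - 1]])
    return X

print(simulate(5))
```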

    A Bayesian Approach to Inverse Quantum Statistics

    A nonparametric Bayesian approach is developed to determine quantum potentials from empirical data for quantum systems at finite temperature. The approach combines the likelihood model of quantum mechanics with a priori information over potentials implemented in the form of stochastic processes. Its specific advantages are the ability to deal with heterogeneous data and to express a priori information explicitly, i.e., directly in terms of the potential of interest. A numerical solution in the maximum a posteriori approximation was feasible for one-dimensional problems. Using correct a priori information turned out to be essential. Comment: 4 pages, 6 figures, RevTeX.
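
    A rough sketch of this kind of inverse problem under simplifying assumptions: a discretized 1D Hamiltonian with hbar = m = 1, the finite-temperature density as the likelihood, a quadratic smoothness prior standing in for the stochastic-process prior, and illustrative values for the grid, temperature, and prior weight. The paper's formulation is more general than this.

```python
# Rough MAP sketch for the inverse problem: recover a 1D potential V(x) from
# positions drawn from the finite-temperature quantum density
#   p(x | V) = sum_n exp(-beta * E_n) |psi_n(x)|^2 / Z,
# with a quadratic smoothness prior. Grid, temperature beta, prior weight lam
# and the "true" harmonic potential are all illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

N, L, beta, lam = 60, 6.0, 1.0, 5.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
lap = (np.diag(np.full(N - 1, 1.0), -1) - 2.0 * np.eye(N)
       + np.diag(np.full(N - 1, 1.0), 1)) / dx**2       # finite-difference Laplacian

def density(V):
    H = -0.5 * lap + np.diag(V)                         # hbar = m = 1
    E, psi = np.linalg.eigh(H)
    w = np.exp(-beta * (E - E[0]))
    w /= w.sum()
    return (w * psi**2).sum(axis=1) / dx                # thermal density on the grid

# Simulated data: a histogram of 400 positions drawn from the true density.
rng = np.random.default_rng(2)
counts = rng.multinomial(400, density(0.5 * x**2) * dx)

def neg_log_posterior(V):
    p = np.maximum(density(V), 1e-12)
    log_lik = (counts * np.log(p)).sum()
    log_prior = -lam * (np.diff(V)**2).sum()            # smoothness prior
    return -(log_lik + log_prior)

res = minimize(neg_log_posterior, x0=np.zeros(N),
               method="L-BFGS-B", options={"maxiter": 50})
V_map = res.x - res.x.min()    # the potential is only determined up to a constant
print(V_map)
```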

    A Neural Circuit Arbitrates between Persistence and Withdrawal in Hungry Drosophila

    In pursuit of food, hungry animals mobilize significant energy resources and overcome exhaustion and fear. How need and motivation control the decision to continue or change behavior is not understood. Using a single-fly treadmill, we show that hungry flies persistently track a food odor and increase their effort over repeated trials in the absence of reward, suggesting that need dominates negative experience. We further show that odor tracking is regulated by two mushroom body (MB) output neurons (MBONs) connecting the MB to the lateral horn. These MBONs, together with dopaminergic neurons and Dop1R2 signaling, control behavioral persistence. Conversely, an octopaminergic neuron, VPM4, which directly innervates one of the MBONs, acts as a brake on odor tracking by connecting feeding and olfaction. Together, our data suggest a function for the MB in internal state-dependent expression of behavior that can be suppressed by external inputs conveying a competing behavioral drive.

    Microscopic Origin of Quantum Chaos in Rotational Damping

    The rotational spectrum of ^{168}Yb is calculated by diagonalizing different effective interactions within the basis of unperturbed rotational bands provided by the cranked shell model. A transition between order and chaos, taking place in the energy region between 1 and 2 MeV above the yrast line, is observed and associated with the onset of rotational damping. It can be related to the higher multipole components of the force acting among the unperturbed rotational bands. Comment: 7 pages, plain TeX, YITP/K-99
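
    The order-to-chaos diagnostic itself can be illustrated generically with level statistics: unperturbed band energies mixed by a residual interaction of growing strength drive the mean level-spacing ratio from the Poisson value (about 0.39) toward the GOE value (about 0.53). The sketch below is a schematic stand-in with assumed parameters, not the cranked-shell-model calculation of the paper.

```python
# Generic toy illustration of an order-to-chaos transition diagnosed through
# level statistics: unperturbed "band" energies mixed by an off-diagonal
# residual interaction of increasing strength v. The mean spacing ratio <r>
# moves from ~0.39 (Poisson, regular) toward ~0.53 (GOE, chaotic).
import numpy as np

rng = np.random.default_rng(3)
N = 400
E0 = np.sort(rng.uniform(0.0, N, size=N))       # unperturbed band energies

def mean_spacing_ratio(v: float) -> float:
    V = v * rng.standard_normal((N, N))
    H = np.diag(E0) + (V + V.T) / 2.0           # symmetric residual interaction
    E = np.linalg.eigvalsh(H)
    s = np.diff(E)
    r = np.minimum(s[1:], s[:-1]) / np.maximum(s[1:], s[:-1])
    return r.mean()

for v in (0.0, 0.1, 0.5, 2.0):
    print(f"v = {v:4.1f}  <r> = {mean_spacing_ratio(v):.3f}")
```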

    Steinberg modules and Donkin pairs

    We prove that in positive characteristic a module with good filtration for a group of type E6 restricts to a module with good filtration for a subgroup of type F4. (Recall that a filtration of a module for a semisimple algebraic group is called good if its layers are dual Weyl modules.) Our result confirms a conjecture of Brundan for one more case. The method relies on the canonical Frobenius splittings of Mathieu. Next we settle the remaining cases, in characteristic not 2, with a computer-aided variation on the old method of Donkin. Comment: 16 pages; proof of Brundan's conjecture added.

    Semiclassical Quantisation Using Diffractive Orbits

    Diffraction, in the context of semiclassical mechanics, describes the manner in which quantum mechanics smooths over discontinuities in the classical mechanics. An important example is a billiard with sharp corners; its semiclassical quantisation requires the inclusion of diffractive periodic orbits in addition to classical periodic orbits. In this paper we construct the corresponding zeta function and apply it to a scattering problem which has only diffractive periodic orbits. We find that the resonances are accurately given by the zeros of the diffractive zeta function. Comment: RevTeX document; submitted to PRL; figures available on request.
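
    Schematically, a zeta function of this kind assigns each (here purely diffractive) primitive periodic orbit p a weight built from the diffraction coefficients at the corners it visits and free propagation between them, and the resonances are located at its complex zeros. The form below is only a sketch, with prefactors and phase conventions left implicit since these depend on the authors' normalization.

```latex
% Schematic only: prefactors and phase conventions are deliberately omitted.
\[
  \frac{1}{\zeta(k)} \;=\; \prod_{p} \bigl(1 - t_p\bigr),
  \qquad
  t_p \;\propto\; \prod_{j=1}^{n_p} d_j \, G_0(k,\ell_j),
\]
% where p runs over primitive diffractive periodic orbits, n_p is the number of
% diffraction points visited by p, d_j is the diffraction coefficient at the
% j-th corner, \ell_j the length of the segment to the next corner, and G_0 the
% free Green's function along that segment.
```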