    Bayesian interpretation of periodograms

    The usual nonparametric approach to spectral analysis is revisited within the regularization framework. Both usual and windowed periodograms are obtained as the squared modulus of the minimizer of regularized least squares criteria. Then, particular attention is paid to their interpretation within the Bayesian statistical framework. Finally, the question of unsupervised hyperparameter and window selection is addressed. It is shown that the maximum likelihood solution is both formally achievable and practically useful.
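
    As a point of reference for the estimator the abstract revisits, the ordinary periodogram is the squared modulus of the signal's DFT. The sketch below is our own toy illustration of that basic form; the scaling convention and names are our choices, not the paper's:

```python
import numpy as np

def periodogram(x, fs=1.0):
    """Ordinary periodogram: squared modulus of the DFT.

    Illustrative sketch of the usual nonparametric estimator the
    abstract revisits; the scaling convention is our own choice.
    """
    n = len(x)
    X = np.fft.rfft(x)
    psd = (np.abs(X) ** 2) / (fs * n)   # power spectral density estimate
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

# A 50 Hz tone sampled at 1 kHz should peak at the bin nearest 50 Hz.
fs = 1000.0
t = np.arange(1024) / fs
x = np.sin(2 * np.pi * 50.0 * t)
freqs, psd = periodogram(x, fs=fs)
peak = freqs[np.argmax(psd)]
```

    The regularized and windowed variants discussed in the paper change the least squares criterion whose minimizer is squared, not this basic recipe.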

    Discriminating different classes of biological networks by analyzing the graphs spectra distribution

    The brain's structural and functional systems, protein-protein interaction, and gene networks are examples of biological systems that share some features of complex networks, such as highly connected nodes, modularity, and small-world topology. Recent studies indicate that some pathologies present topological network alterations relative to norms seen in the general population. Therefore, methods to discriminate the processes that generate the different classes of networks (e.g., normal and disease) might be crucial for the diagnosis, prognosis, and treatment of the disease. It is known that several topological properties of a network (graph) can be described by the distribution of the spectrum of its adjacency matrix. Moreover, large networks generated by the same random process have the same spectrum distribution, allowing us to use it as a "fingerprint". Based on this relationship, we propose the entropy of a graph spectrum to measure the "uncertainty" of a random graph and the Kullback-Leibler and Jensen-Shannon divergences between graph spectra to compare networks. We also introduce general methods for model selection and network model parameter estimation, as well as a statistical procedure to test the nullity of divergence between two classes of complex networks. Finally, we demonstrate the usefulness of the proposed methods by applying them to (1) protein-protein interaction networks of different species and (2) networks derived from children diagnosed with Attention Deficit Hyperactivity Disorder (ADHD) and typically developing children. We conclude that scale-free networks best describe all the protein-protein interactions. Also, we show that our proposed measures succeeded in identifying topological changes in the network while other commonly used measures (number of edges, clustering coefficient, average path length) failed.
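
    The spectral "fingerprint" idea above can be sketched numerically: estimate each graph's spectrum distribution, then compare distributions with entropy and Jensen-Shannon divergence. The sketch below uses simple histogram density estimates and Erdős–Rényi toy graphs; all function names, bin choices and parameter values are ours, not the paper's:

```python
import numpy as np

def er_adj(rng, n, prob):
    """Adjacency matrix of an Erdos-Renyi random graph."""
    a = np.triu((rng.random((n, n)) < prob).astype(float), 1)
    return a + a.T

def spectral_density(adj, bins):
    """Histogram estimate of the adjacency-spectrum distribution."""
    eigs = np.linalg.eigvalsh(adj)
    hist, _ = np.histogram(eigs, bins=bins)
    return hist / hist.sum()

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def js_divergence(p, q):
    """Jensen-Shannon divergence between two binned spectra."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(0)
bins = np.linspace(-10, 10, 41)
p = spectral_density(er_adj(rng, 100, 0.1), bins)  # two graphs from
q = spectral_density(er_adj(rng, 100, 0.1), bins)  # the same model
r = spectral_density(er_adj(rng, 100, 0.4), bins)  # a denser model
```

    Spectra drawn from the same random-graph model stay close in Jensen-Shannon divergence, while spectra from different models lie farther apart, which is the basis for the model selection and testing procedures the abstract describes.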

    A Bayesian geoadditive relative survival analysis of registry data on breast cancer mortality

    In this paper we develop a so-called relative survival analysis, which is used to model the excess risk of a certain subpopulation relative to the natural mortality risk, i.e. the base risk present in the whole population. Such models are typically used in clinical studies that aim at identifying prognostic factors for disease-specific mortality when data on specific causes of death are not available. Our work has been motivated by continuous-time, spatially referenced survival data on breast cancer where causes of death are not known. This paper extends the analyses presented in Sauleau et al. (2007), where those data are analysed via a geoadditive, semiparametric approach, however without allowing for natural mortality. The usefulness of the relative survival approach is supported by means of a simulated data set.
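
    The core identity behind such models is that the observed hazard decomposes into the natural (population) hazard plus an excess hazard, so relative survival equals the survival implied by the excess hazard alone. A toy numerical check, with all rates invented for illustration:

```python
import numpy as np

# Toy check of the decomposition behind relative survival models:
# observed hazard = natural (population) hazard + excess hazard, so
# relative survival (observed over expected survival) equals the
# survival implied by the excess hazard alone. All rates are invented.
t = np.linspace(0.0, 10.0, 1001)
dt = t[1] - t[0]
h_pop = np.full_like(t, 0.02)        # constant natural mortality hazard
h_excess = 0.05 * np.exp(-0.3 * t)   # decaying disease-related excess
h_obs = h_pop + h_excess

def survival(h):
    """Survival curve from a hazard sampled on the grid t."""
    return np.exp(-np.cumsum(h) * dt)

s_rel = survival(h_obs) / survival(h_pop)
# s_rel matches survival(h_excess) up to discretization error
```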

    The Principle of Proportionality in European Union Law as a Prerequisite for Penalization

    The paper analyses the principle of proportionality, which is widely applied in the EU legal order and is therefore one of the fundamental principles of the system of the European Union. It is one of the legal principles that govern decision-making processes and common strategic objectives, and it applies when establishing European Union legislation and transposing it into national law, including in the area of criminal law, although current analyses do not often discuss this aspect. Due to its complexity and significance for the processes of establishing and applying the law, the principle of proportionality requires detailed and separate discussion, especially in the context of European criminal law.

    The Physics of Communicability in Complex Networks

    A fundamental problem in the study of complex networks is to provide quantitative measures of correlation and information flow between different parts of a system. To this end, several notions of communicability have been introduced and applied to a wide variety of real-world networks in recent years. Several such communicability functions are reviewed in this paper. It is emphasized that communication and correlation in networks can take place through many more routes than the shortest paths, a fact that may not have been sufficiently appreciated in previously proposed correlation measures. In contrast to these, the communicability measures reviewed in this paper are defined by taking into account all possible routes between two nodes, assigning smaller weights to longer ones. This point of view naturally leads to the definition of communicability in terms of matrix functions, such as the exponential, resolvent, and hyperbolic functions, in which the matrix argument is either the adjacency matrix or the graph Laplacian associated with the network. Considerable insight into communicability can be gained by modeling a network as a system of oscillators and deriving physical interpretations, both classical and quantum-mechanical, of various communicability functions. Applications of communicability measures to the analysis of complex systems are illustrated on a variety of biological, physical and social networks. The last part of the paper is devoted to a review of the notion of locality in complex networks and to computational aspects that, by exploiting sparsity, can greatly reduce the computational effort for the calculation of communicability functions for large networks. Comment: Review Article. 90 pages, 14 figures. Contents: Introduction; Communicability in Networks; Physical Analogies; Comparing Communicability Functions; Communicability and the Analysis of Networks; Communicability and Localization in Complex Networks; Computability of Communicability Functions; Conclusions and Perspective
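
    The matrix-function definition reviewed above is easy to state concretely: for the exponential case, the communicability between nodes p and q is (e^A)_pq, which counts walks of length k with weight 1/k!. A minimal numpy sketch on a toy four-node graph of our own, not one from the review:

```python
import numpy as np

# Communicability sketch: G_pq = (e^A)_pq sums walks from p to q,
# weighting walks of length k by 1/k!. The four-node graph below is
# our own toy example, not one from the review.
A = np.array([
    [0., 1., 1., 0.],
    [1., 0., 1., 0.],
    [1., 1., 0., 1.],
    [0., 0., 1., 0.],
])  # triangle 0-1-2 with node 3 hanging off node 2

# Matrix exponential via the spectral decomposition of the
# (symmetric) adjacency matrix: e^A = V diag(e^lambda) V^T
vals, vecs = np.linalg.eigh(A)
G = vecs @ np.diag(np.exp(vals)) @ vecs.T

# Nodes 0 and 1 share a direct edge and many short walks, so they
# communicate more strongly than nodes 0 and 3, two steps apart.
```

    The diagonal entries of G are the subgraph centralities of the nodes, and swapping the exponential for a resolvent or hyperbolic function changes only the walk weights.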

    A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    For a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise (HVSRN) have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, several schemes for inversion of the full HVSRN curve for near-surface surveying have been developed over the last decade. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently established connection between the HVSRN and the elastodynamic Green's function which arises from ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the imaginary parts of the Green's functions by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves. The stability of the algorithm at high frequencies is preserved by means of an adaptation of Wang's orthonormalization method to the calculation of dispersion curves, surface-wave medium responses and contributions of body waves. This code has been combined with a variety of inversion methods to make up a powerful tool for passive seismic surveying. Comment: Published in Computers & Geosciences 97, 67-7
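
    The observable being inverted can be sketched independently of the forward model: the H/V ratio is the square root of horizontal over vertical spectral power. The toy function below is our own simplification of the measured quantity only; the paper's code computes the theoretical ratio from Green's functions, and real processing adds window averaging and smoothing:

```python
import numpy as np

def hvsr(north, east, vertical, fs):
    """H/V spectral ratio from three-component ambient noise.

    Minimal sketch of the measured quantity only: the ratio of
    horizontal to vertical spectral power. Window averaging,
    smoothing and the DFA-based forward model of the paper are
    all omitted; names and conventions are ours.
    """
    def power(x):
        return np.abs(np.fft.rfft(x)) ** 2
    pn, pe, pz = power(north), power(east), power(vertical)
    freqs = np.fft.rfftfreq(len(vertical), d=1.0 / fs)
    return freqs, np.sqrt((pn + pe) / pz)

# Toy input: three channels of synthetic noise.
rng = np.random.default_rng(1)
noise = rng.standard_normal((3, 4096))
freqs, ratio = hvsr(noise[0], noise[1], noise[2], fs=100.0)
```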

    Machine Learning Approach to Personality Type Prediction Based on the Myers–Briggs Type Indicator®

    Neuro-Linguistic Programming (NLP) is a collection of techniques for personality development. Meta programmes, which are habitual ways of inputting, sorting and filtering the information found in the world around us, are a vital factor in NLP. Differences in meta programmes result in significant differences in behaviour from one person to another. Personality types can be recognized by utilizing and analysing meta programmes. There are different methods to predict personality types based on meta programmes. The Myers–Briggs Type Indicator® (MBTI) is currently considered one of the most popular and reliable methods. In this study, a new machine learning method has been developed for personality type prediction based on the MBTI. The performance of the new methodology presented in this study has been compared to other existing methods, and the results show better accuracy and reliability. The results of this study can assist NLP practitioners and psychologists with regard to the identification of personality types and associated cognitive processes.

    The Copula Approach to Sample Selection Modelling: An Application to the Recreational Value of Forests

    The sample selection model is based upon a bivariate or a multivariate structure, and distributional assumptions are in this context more severe than in univariate settings, due to the limited availability of tractable multivariate distributions. While the standard FIML estimation of the selectivity model assumes normality of the joint distribution, alternative approaches require less stringent distributional hypotheses. As shown by Smith (2003), copulas allow great flexibility also in FIML models. The copula model is very useful in situations where the applied researcher has a prior on the distributional form of the margins, since it allows separating their modelling from that of the dependence structure. In the present paper the copula approach to sample selection is first compared to the semiparametric approach and to the standard FIML, bivariate normal model, in an illustrative application on female work data. Then its performance is analysed more thoroughly in an application to Contingent Valuation data on recreational values of forests.
    Keywords: Contingent valuation, Selectivity bias, Bivariate models, Copulas
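
    The separation the copula approach exploits (margins chosen freely, dependence supplied by the copula) can be sketched by simulation. The snippet below draws from a Gaussian copula with illustrative margins; every parameter value and margin choice is ours, not the paper's:

```python
import numpy as np
from scipy.stats import expon, gamma, norm

# Sketch of the copula idea: pick the margins and the dependence
# structure separately. A Gaussian copula ties together an
# exponential and a gamma margin; every parameter value and margin
# choice here is illustrative, not taken from the paper.
rng = np.random.default_rng(42)
rho, n = 0.6, 10_000

# Correlated standard normals -> uniforms via the normal CDF
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = norm.cdf(z)

# Inverse-CDF transform to the chosen margins
x = expon.ppf(u[:, 0], scale=2.0)   # e.g. a selection-equation margin
y = gamma.ppf(u[:, 1], a=2.0)       # e.g. an outcome-equation margin
```

    Each margin keeps its chosen distribution while the two variables remain dependent, which is exactly the flexibility the FIML copula model uses.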