3 research outputs found

    Detecting Emerging Areas in Social Streams

    Detecting emerging areas has become a topic of interest due to the rapid development of social networks. Because the information exchanged in social network posts includes not only text but also images, URLs and videos, conventional term-frequency-based approaches may not be appropriate in this context. The emergence of new areas is instead captured through the social aspects of these networks: the goal is to detect emerging areas from hundreds of users based on the responses in social network posts. A probability model of the mentioning behavior in social networks is proposed, built on the number of mentions per post and on the users occurring in those mentions. The basic assumption is that a newly emerging topic is something people feel like discussing, commenting on or forwarding to their friends. In the proposed system the link-anomaly model is combined with word-based and text-based approaches. DOI: 10.17762/ijritcc2321-8169.15039
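    The abstract does not specify the exact form of the probability model, so the sketch below is only a minimal illustration under assumed distributions: a geometric model for the number of mentions per post and a smoothed frequency model for which users get mentioned, with the anomaly score of a new post taken as its negative log-likelihood. The function name, the smoothing constants `alpha`, `beta`, `gamma` and the data layout are all hypothetical.

```python
import math
from collections import Counter

def link_anomaly_score(history_counts, history_mentions, post_mention_users,
                       alpha=0.5, beta=0.5, gamma=0.5):
    """Anomaly score of a post's mentions given a user's past posting history.

    history_counts     : mention counts of the user's past posts
    history_mentions   : user IDs mentioned across those past posts
    post_mention_users : user IDs mentioned in the new post
    alpha, beta, gamma : smoothing constants (illustrative choices)
    """
    n = len(history_counts)          # number of past posts
    m = sum(history_counts)          # total number of past mentions
    k = len(post_mention_users)      # mentions in the new post

    # Probability of observing k mentions, using a smoothed geometric model.
    q = (m + beta) / (m + n + alpha + beta)
    p_k = (q ** k) * (n + alpha) / (m + n + alpha + beta)

    # Probability of mentioning each user, proportional to past frequency,
    # with a small mass reserved for previously unseen users.
    freq = Counter(history_mentions)
    total = sum(freq.values())
    p_users = 1.0
    for u in post_mention_users:
        p_users *= (freq.get(u, 0) + gamma) / (total + gamma * (len(freq) + 1))

    # Higher score = more surprising (anomalous) mentioning behavior.
    return -math.log(p_k * p_users)
```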

    On Statistical Modelling and Hypothesis Testing by Information Theoretic Methods

    The main objective of this thesis is to study various information theoretic methods and criteria in the context of statistical model selection. The focus of this research is on Rissanen’s Minimum Description Length (MDL) principle and its variants, with a special emphasis on the Normalized Maximum Likelihood (NML). We extend the Rissanen methodology for coping with infinite parametric complexity and discuss two particular cases. This is applied to derive four NML criteria, whose performance is then investigated. Furthermore, we establish the connection between Stochastic Complexity (SC), defined as the minus logarithm of the NML, and other model selection criteria. We also study the use of information theoretic criteria (ITC) for selecting the order of autoregressive (AR) models in the presence of nonstationarity. In particular, we give a modified version of Sequentially NML (SNML) for the case when the model parameters are estimated by the forgetting factor LS algorithm. Another contribution of the thesis is in connection with the new approach to composite hypothesis testing using Optimally Distinguishable Distributions (ODD). The ODD-detector for subspace signals in Gaussian noise is introduced and its performance is evaluated. Additionally, we exploit the Kolmogorov Structure Function (KSF) to derive a new criterion for cepstral nulling, which has recently been applied to the problem of periodogram smoothing. Finally, the problem of fairness in multiaccess communication systems is investigated and a new method is proposed. The new approach is based on partitioning the network into subnetworks and employing two different multiple-access schemes within and across subnetworks. An algorithm is also introduced for optimally selecting the subnetworks so as to achieve max-min fairness.
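    The NML-based criteria derived in the thesis are not spelled out in the abstract. As a minimal, generic illustration of AR order selection by an information theoretic criterion, the sketch below fits AR(p) models by ordinary least squares and picks the order that minimizes the familiar BIC; the function name and the choice of BIC are assumptions, not the criteria developed in the thesis.

```python
import numpy as np

def ar_order_by_bic(x, max_order=10):
    """Select an AR model order with an information theoretic criterion (BIC here).

    AR(p) parameters are fitted by ordinary least squares on the regression
    x[t] = a1*x[t-1] + ... + ap*x[t-p] + e[t].
    """
    x = np.asarray(x, dtype=float)
    best_order, best_bic = 0, np.inf
    for p in range(1, max_order + 1):
        N = len(x) - p                      # number of usable samples
        # Regression matrix of lagged samples: column i holds x[t-i-1].
        X = np.column_stack([x[p - i - 1:p - i - 1 + N] for i in range(p)])
        y = x[p:]
        a, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ a) ** 2)  # residual variance estimate
        bic = N * np.log(sigma2) + p * np.log(N)
        if bic < best_bic:
            best_order, best_bic = p, bic
    return best_order
```

    With a forgetting factor LS estimate, as mentioned in the abstract for the nonstationary case, the batch least squares fit would be replaced by a recursive update, while the selection over candidate orders keeps the same form.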

    Greedy adaptive algorithms for sparse representations

    A vector or matrix is said to be sparse if the number of non-zero elements is significantly smaller than the number of zero elements. In estimation theory the vector of model parameters may be known in advance to have a sparse structure, and solving the estimation problem while taking this constraint into account can substantially improve the accuracy of the solution. The theory of sparse models has advanced significantly in recent years, providing many results that can guarantee certain properties of the sparse solutions. These performance guarantees can be very powerful in applications and have no correspondent in the estimation theory for non-sparse models. Model sparsity is an inherent characteristic of many applications (image compression, wireless channel estimation, direction of arrival estimation) in signal processing and related areas. Due to continuous technological advances that allow faster numerical computations, optimization problems too complex to be solved in the past can now be addressed while also considering sparsity constraints. However, an exhaustive search for sparse solutions generally requires a combinatorial search for the correct support, a very limiting factor due to the huge numerical complexity. This motivated a growing interest in developing batch sparsity-aware algorithms over the past twenty years. More recently, the main goal of the continuing research on sparsity has been the quest for faster, less computationally intensive, adaptive methods able to recursively update the solution.
    In this thesis we present several such algorithms. They are greedy in nature and minimize the least squares criterion under the constraint that the solution is sparse. Similarly to other greedy sparse methods, two main steps are performed once new data are available: update the sparse support by changing the positions that contribute to the solution, and compute the coefficients that minimize the least squares criterion restricted to the current support. Two classes of adaptive algorithms are proposed. The first is derived from the batch matching pursuit algorithm. It uses a coordinate descent approach to update the solution, each coordinate being selected by a criterion similar to the one used by matching pursuit. We devised two algorithms that use a cyclic update strategy to improve the solution at each time instant. Since the solution support and the coefficient values are assumed to vary slowly, a faster and better performing approach is later proposed by spreading the coordinate descent update in time. It was also adapted to work in a distributed setup in which different nodes communicate with their neighbors to improve their local solutions towards a global optimum. The second class can be linked to batch orthogonal least squares. These algorithms maintain a partial QR decomposition with pivoting and require a permutation-based support selection strategy to ensure low complexity while allowing the tracking of slow variations in the support. Two versions of the algorithm are proposed; they allow past data to be forgotten by using an exponential window or a sliding window, respectively. The former was modified to improve the solution in a structured sparsity case, when the solution is group sparse. We also propose mechanisms for estimating the sparsity level online, based on information theoretic criteria, namely the predictive least squares and the Bayesian information criterion.
    The main contributions are the development of the adaptive greedy algorithms and the use of the information theoretic criteria, which enable the algorithms to behave robustly. The algorithms have good performance, require limited prior information and are computationally efficient. Generally, the configuration parameters, if any, can be easily chosen as a tradeoff between the stationary error and the convergence speed.
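    The abstract describes the common two-step structure (support update, then least squares restricted to the support) without giving the adaptive recursions, so the following is only a batch sketch in the spirit of (orthogonal) matching pursuit built on that structure; the function name, arguments and stopping rule (a fixed sparsity level k) are illustrative assumptions.

```python
import numpy as np

def greedy_sparse_ls(A, y, k):
    """Greedy sparse least squares in the spirit of (orthogonal) matching pursuit.

    At every iteration the column of A most correlated with the current
    residual joins the support, and the coefficients are re-estimated by
    least squares restricted to that support.
    A : (m, n) dictionary / regression matrix
    y : (m,) observation vector
    k : target sparsity level (number of non-zero coefficients)
    """
    m, n = A.shape
    support = []
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(k):
        # Support update: pick the column best aligned with the residual.
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0          # do not reselect existing positions
        support.append(int(np.argmax(correlations)))
        # Coefficient update: least squares on the current support only.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coeffs
        residual = y - A[:, support] @ coeffs
    return x, support
```

    The adaptive algorithms of the thesis would instead refresh such a solution recursively as new data arrive, e.g. by spreading coordinate descent steps over time or by maintaining a partial QR decomposition with pivoting, rather than re-solving the restricted least squares problem from scratch as done above.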