
    Improving Recommendation Quality by Merging Collaborative Filtering and Social Relationships

    Matrix Factorization techniques have been successfully applied to raise the quality of suggestions generated by Collaborative Filtering Systems (CFSs). Traditional CFSs based on Matrix Factorization operate on the ratings provided by users and have recently been extended to incorporate demographic aspects such as age and gender. In this paper we propose to merge CF techniques based on Matrix Factorization with information about social friendships in order to provide users with more accurate suggestions and rankings of items of interest to them. The proposed approach has been evaluated on a real-life online social network; the experimental results show an improvement over existing CF approaches. A detailed comparison with related literature is also presented.
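As an illustration of the general idea (not the authors' exact model), a minimal sketch of rating-based matrix factorization with an added social-regularization term that pulls a user's latent factors toward those of their friends; the rating matrix, friendship graph, and all hyperparameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 4, 5, 2
R = np.array([[5, 3, 0, 1, 0],
              [4, 0, 0, 1, 0],
              [1, 1, 0, 5, 0],
              [0, 1, 5, 4, 0]], dtype=float)   # 0 = unrated
friends = {0: [1], 1: [0], 2: [3], 3: [2]}     # hypothetical friendship graph

U = rng.normal(scale=0.1, size=(n_users, k))   # user latent factors
V = rng.normal(scale=0.1, size=(n_items, k))   # item latent factors
lr, reg, soc = 0.01, 0.05, 0.05                # learning rate, L2, social weight

for _ in range(500):
    for u in range(n_users):
        for i in range(n_items):
            if R[u, i] == 0:
                continue
            err = R[u, i] - U[u] @ V[i]
            # social term: pull u's factors toward the mean of its friends'
            pull = U[u] - np.mean([U[f] for f in friends[u]], axis=0)
            U[u] = U[u] + lr * (err * V[i] - reg * U[u] - soc * pull)
            V[i] = V[i] + lr * (err * U[u] - reg * V[i])

pred = U @ V.T                                  # predicted rating matrix
```

With the social weight set to zero this reduces to plain regularized matrix factorization, which makes the friendship term easy to ablate.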

    Multiscale statistical analysis of coronal solar activity

    Multi-filter images of the solar corona are used to obtain temperature maps, which are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multiscale behavior presents distinct statistical properties in each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis. Comment: 24 pages, 18 figures
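The decomposition step described above can be sketched in a few lines on synthetic "temperature maps" (this illustrates POD via the SVD only, not the paper's full multiscale statistical analysis): each map becomes one flattened snapshot column, the left singular vectors are the POD spatial modes, and the squared singular values give each mode's energy fraction:

```python
import numpy as np

rng = np.random.default_rng(1)
nx, ny, n_snap = 16, 16, 10
x = np.linspace(0, 2 * np.pi, nx)
y = np.linspace(0, 2 * np.pi, ny)
X, Y = np.meshgrid(x, y)

# each synthetic "temperature map" becomes one flattened snapshot column
snapshots = np.column_stack([
    (np.sin(X + 0.3 * t) * np.cos(Y) + 0.05 * rng.normal(size=(ny, nx))).ravel()
    for t in range(n_snap)
])

# left singular vectors = POD spatial modes; singular values = mode energies
modes, sing_vals, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = sing_vals ** 2 / np.sum(sing_vals ** 2)
```

Comparing the shape of the `energy` spectrum between regions is one simple way such a decomposition can characterize activity levels.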

    LATENT SEMANTIC ANALYSIS (LSA) AND AUTOMATIC TEXT SUMMARIZATION (ATS) FOR OPTIMIZING COVID-19 ARTICLE SEARCH

    Abstract: The Covid-19 pandemic has made people around the world aware of the importance of maintaining health and of changing habits and lifestyles to become healthier. Clear, correct and precise information is indispensable for providing insight into this respiratory virus, and digital media are widely used by the public to find information about it. Health topics about Covid-19 were collected from several sites by scraping, and the retrieved data were processed into automatic summaries using Latent Semantic Analysis (LSA), a method that helps uncover the hidden meaning of a collection of sentences. Summary formation is assisted by the cross method. The system also provides article search so that users can find the right information. The results of this study show that the LSA method assisted by the cross method can generate automatic summaries well: the average f-measure and recall values are 90.68% and 85%, with a training-to-test data split of 90:10. Data collected during February-June 2020 comprised 120 training documents and 12 test documents. Testing was done with a compression rate of 30%. Keywords: automatic summary, health article, scraping, latent semantic analysis, singular value decomposition, cross method
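A toy sketch of the LSA summarization pipeline, with a cross-method-style filtering step (zeroing concept weights below each concept row's average before scoring sentences); the sentences, vocabulary, and scoring details are invented for illustration:

```python
import numpy as np

sentences = [
    "the virus spreads through respiratory droplets",
    "washing hands reduces the spread of the virus",
    "the weather was pleasant yesterday",
]
vocab = sorted({w for s in sentences for w in s.split()})
# term-sentence count matrix
A = np.array([[s.split().count(w) for s in sentences] for w in vocab], float)

U, S, Vt = np.linalg.svd(A, full_matrices=False)
# cross-method-style step: keep only cells at or above each concept row's mean
Vabs = np.abs(Vt)
Vf = np.where(Vabs >= Vabs.mean(axis=1, keepdims=True), Vabs, 0.0)
# sentence score: length of its filtered, singular-value-weighted concept vector
scores = np.sqrt(((S[:, None] * Vf) ** 2).sum(axis=0))
summary = sentences[int(np.argmax(scores))]
```

A real summarizer would select the top sentences up to the target compression rate (30% in the study) instead of a single sentence.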

    Credit Scoring for M-Shwari using Hidden Markov Model

    The introduction of the mobile-based micro-credit facility M-Shwari has heightened the need to develop a proper decision support system to classify customers based on their credit scores. This need arises from the lack of proper information on the poor and unbanked, who are locked out of the formal banking sector. A classification technique, the hidden Markov model, is used. The customers' scanty deposit and withdrawal dynamics in the M-Shwari account estimate the credit risk factors used in training the hidden Markov model. The data are generated through simulation, and customers are categorized in terms of their credit scores and credit quality levels. The model classifies over 80 percent of the customers as having average or good credit quality. This approach offers a simple and novel method to cater for the unbanked and poor with minimal or no financial history, thus increasing financial inclusion in Kenya.

    Simple stopping criteria for information theoretic feature selection

    Feature selection aims to select the smallest feature subset that yields the minimum generalization error. In the rich feature-selection literature, information theory-based approaches seek a subset of features such that the mutual information between the selected features and the class labels is maximized. Despite the simplicity of this objective, several open optimization problems remain. These include, for example, the automatic determination of the optimal subset size (i.e., the number of features) or a stopping criterion if a greedy search strategy is adopted. In this paper, we suggest two stopping criteria that simply monitor the conditional mutual information (CMI) among groups of variables. Using the recently developed multivariate matrix-based Renyi's \alpha-entropy functional, which can be estimated directly from data samples, we show that the CMI among groups of variables can be computed easily without any decomposition or approximation, making our criteria easy to implement and to integrate seamlessly into any existing information theoretic feature selection method with a greedy search strategy. Comment: Paper published in the journal Entropy
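A sketch of such a stopping rule on discrete toy features, using a plain counting estimate of conditional mutual information (the paper instead uses a matrix-based Renyi's alpha-entropy estimator, which also handles continuous data): greedy forward selection halts once the conditional gain drops below a threshold. All feature constructions and the threshold are invented:

```python
import numpy as np

def entropy(cols):
    """Shannon entropy (bits) of the joint distribution of the given columns."""
    _, counts = np.unique(cols, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def cmi(x, y, z):
    """I(X; Y | Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z), by counting."""
    if z.shape[1] == 0:
        return entropy(x) + entropy(y) - entropy(np.column_stack([x, y]))
    return (entropy(np.column_stack([x, z])) + entropy(np.column_stack([y, z]))
            - entropy(np.column_stack([x, y, z])) - entropy(z))

rng = np.random.default_rng(2)
n = 500
y = rng.integers(0, 2, size=(n, 1))                      # class labels
informative = y.copy()                                   # exact copy of label
noisy = np.where(rng.random((n, 1)) < 0.8, y, 1 - y)     # 80% agreement
noise = rng.integers(0, 2, size=(n, 1))                  # independent of label
X = np.column_stack([informative, noisy, noise])

selected = np.empty((n, 0), dtype=int)
remaining, order, threshold = [0, 1, 2], [], 0.05
while remaining:
    gains = [cmi(X[:, [j]], y, selected) for j in remaining]
    best = int(np.argmax(gains))
    if gains[best] < threshold:     # stopping criterion: negligible new info
        break
    order.append(remaining.pop(best))
    selected = np.column_stack([selected, X[:, [order[-1]]]])
```

Once the perfectly informative feature is selected, the conditional gain of the redundant and irrelevant features is (near) zero, so the search stops without ever fixing the subset size in advance.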

    Augmented Computational Design: Methodical Application of Artificial Intelligence in Generative Design

    This chapter presents methodological reflections on the necessity and utility of artificial intelligence in generative design. Specifically, the chapter discusses how generative design processes can be augmented by AI to deliver on a few outcomes of interest or performance indicators while dealing with hundreds or thousands of small decisions. The core of the performance-based generative design paradigm is making statistical or simulation-driven associations between these choices and consequences for mapping and navigating such a complex decision space. The chapter discusses promising directions in artificial intelligence for augmenting decision-making processes in architectural design for mapping and navigating complex design spaces. Comment: This is the author's version of the book chapter Augmented Computational Design: Methodical Application of Artificial Intelligence in Generative Design. In Artificial Intelligence in Performance-Driven Design: Theories, Methods, and Tools Towards Sustainability, edited by Narjes Abbasabadi and Mehdi Ashayeri. Wiley, 202

    Blind source separation for clutter and noise suppression in ultrasound imaging: review for different applications

    Blind source separation (BSS) refers to a number of signal processing techniques that decompose a signal into several 'source' signals. In recent years, BSS has been increasingly employed for the suppression of clutter and noise in ultrasonic imaging. In particular, its ability to separate sources based on measures of independence rather than their temporal or spatial frequency content makes BSS a powerful filtering tool for data in which the desired and undesired signals overlap in the spectral domain. The purpose of this work was to review existing BSS methods and their potential in ultrasound imaging. Furthermore, we tested and compared the effectiveness of these techniques in the fields of contrast-ultrasound super-resolution, contrast quantification, and speckle tracking. For all applications, this was done in silico, in vitro, and in vivo. We found that the critical step in BSS filtering is the identification of the components containing the desired signal, and we highlight the value of a priori domain knowledge in defining effective criteria for signal component selection.
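As a concrete, heavily simplified instance of the component-selection step discussed above, a stand-in SVD-based separation on synthetic channel data (SVD filtering is one common relative of the BSS methods reviewed, not the paper's own algorithm): the strong, spatially coherent "tissue" clutter dominates the leading singular component, so dropping that component leaves the weaker "blood" signal. The rank-1 clutter model and all amplitudes are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch, n_t = 32, 200
t = np.linspace(0, 1, n_t)

# strong, slow, spatially uniform "tissue" clutter (rank-1 by construction)
tissue = 5.0 * np.outer(np.ones(n_ch), np.sin(2 * np.pi * 2 * t))
# weak, fast "blood" signal with random per-channel weights
blood = np.outer(rng.normal(size=n_ch), np.sin(2 * np.pi * 40 * t))
noise = 0.1 * rng.normal(size=(n_ch, n_t))
data = tissue + blood + noise

U, s, Vt = np.linalg.svd(data, full_matrices=False)
n_clutter = 1                      # component selection: drop the top mode
filtered = U[:, n_clutter:] @ np.diag(s[n_clutter:]) @ Vt[n_clutter:, :]
```

The hard part in practice is exactly the review's point: deciding which components carry the desired signal, here reduced to a hand-picked `n_clutter`.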

    Multilayer Networks

    Full text link
    In most natural and engineered systems, a set of entities interact with each other in complicated patterns that can encompass multiple types of relationships, change in time, and include other types of complications. Such systems include multiple subsystems and layers of connectivity, and it is important to take such "multilayer" features into account to try to improve our understanding of complex systems. Consequently, it is necessary to generalize "traditional" network theory by developing (and validating) a framework and associated tools to study multilayer systems in a comprehensive fashion. The origins of such efforts date back several decades and arose in multiple disciplines, and now the study of multilayer networks has become one of the most important directions in network science. In this paper, we discuss the history of multilayer networks (and related concepts) and review the exploding body of work on such networks. To unify the disparate terminology in the large body of recent work, we discuss a general framework for multilayer networks, construct a dictionary of terminology to relate the numerous existing concepts to each other, and provide a thorough discussion that compares, contrasts, and translates between related notions such as multilayer networks, multiplex networks, interdependent networks, networks of networks, and many others. We also survey and discuss existing data sets that can be represented as multilayer networks. We review attempts to generalize single-layer-network diagnostics to multilayer networks. We also discuss the rapidly expanding research on multilayer-network models and notions like community structure, connected components, tensor decompositions, and various types of dynamical processes on multilayer networks. We conclude with a summary and an outlook. Comment: Working paper; 59 pages, 8 figures
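One common concrete encoding used in this literature is the supra-adjacency matrix, sketched here for a toy two-layer multiplex network: diagonal blocks hold each layer's adjacency matrix, and off-diagonal blocks couple each node to its replica in the other layer with strength omega (the layers and coupling value are invented):

```python
import numpy as np

n = 3                                   # nodes per layer
layer1 = np.array([[0, 1, 0],           # adjacency of layer 1
                   [1, 0, 1],
                   [0, 1, 0]])
layer2 = np.array([[0, 0, 1],           # adjacency of layer 2
                   [0, 0, 1],
                   [1, 1, 0]])
omega = 1.0                             # inter-layer coupling strength
coupling = omega * np.eye(n)            # each node linked to its own replica

# supra-adjacency matrix of the two-layer multiplex network
supra = np.block([[layer1, coupling],
                  [coupling, layer2]])
```

Single-layer diagnostics (spectra, components, centralities) can then be applied to `supra` directly, which is one route by which single-layer tools generalize to the multilayer setting.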

    Dimensionality Reduction of Hyperspectral Imagery Using Random Projections

    Hyperspectral imagery is often associated with high storage and transmission costs. Dimensionality reduction aims to reduce the time and space complexity of hyperspectral imagery by projecting data into a low-dimensional space such that all the important information in the data is preserved. Dimensionality-reduction methods based on transforms are widely used and give a data-dependent representation that is unfortunately costly to compute. Recently, there has been a growing interest in data-independent representations for dimensionality reduction; of particular prominence are random projections, which are attractive due to their computational efficiency and simplicity of implementation. This dissertation concentrates on exploring the realm of computationally fast and efficient random projections by considering projections based on a random Hadamard matrix. These Hadamard-based projections are offered as an alternative to more widely used random projections based on dense Gaussian matrices. Such Hadamard matrices are then coupled with a fast singular value decomposition in order to implement a two-stage dimensionality reduction that marries the computational benefits of the data-independent random projection to the structure-capturing capability of the data-dependent singular value transform. Finally, random projections are applied in conjunction with nonnegative least squares to provide a computationally lightweight methodology for the well-known spectral-unmixing problem. Overall, it is seen that random projections offer a computationally efficient framework for dimensionality reduction that permits hyperspectral-analysis tasks such as unmixing and classification to be conducted in a lower-dimensional space without sacrificing analysis performance while reducing computational costs significantly.
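A toy sketch of the two-stage idea: a sign-randomized, row-subsampled Hadamard projection (data-independent) followed by an exact SVD of the small sketch (data-dependent). The Sylvester construction, sizes, and sketching details are illustrative and not the dissertation's implementation:

```python
import numpy as np

def hadamard(m):
    """Hadamard matrix via the Sylvester construction; m must be a power of two."""
    H = np.array([[1.0]])
    while H.shape[0] < m:
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(4)
n_pixels, n_bands, k = 64, 32, 8       # toy stand-in for a hyperspectral cube
data = rng.normal(size=(n_pixels, n_bands))

# stage 1: sign-randomized, row-subsampled Hadamard projection of the pixels
H = hadamard(n_pixels)
signs = rng.choice([-1.0, 1.0], size=n_pixels)
rows = rng.choice(n_pixels, size=2 * k, replace=False)
sketch = (H[rows] * signs) @ data / np.sqrt(2 * k)

# stage 2: exact SVD of the small sketch yields a data-dependent band basis
_, _, Vt = np.linalg.svd(sketch, full_matrices=False)
reduced = data @ Vt[:k].T              # k-dimensional representation per pixel
```

The expensive SVD runs only on the small sketch, which is the computational point of pairing the random projection with the singular value transform.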

    Learning the Parametric Transfer Function of Unitary Operations for Real-Time Evaluation of Manufacturing Processes Involving Operations Sequencing

    To better design manufacturing processes, surrogate models have been widely considered, in which the effect of different material and process parameters is captured through a parametric solution. The latter contains the solution of the model describing the system under study for any choice of the selected parameters. These surrogate models, also known as meta-models, virtual charts or computational vademecums in the context of model order reduction, have been successfully employed in a variety of industrial applications. However, they face a major difficulty when the number of parameters grows: processes involving trajectories or sequencing entail a combinatorial explosion (the curse of dimensionality), not only because of the number of possible combinations but also because of the number of parameters needed to describe the process. The present paper proposes a promising route for circumventing, or at least alleviating, that difficulty. The proposed technique consists of a parametric transfer function that, once learned, allows inferring, from a given state, the new state after the application of a unitary operation, defined as one step in the sequenced process. Thus, any sequencing can be evaluated almost in real time by chaining that unitary transfer function, whose output becomes the input of the next operation. The benefits and potential of such a technique are illustrated on a problem of industrial relevance: the deformation induced in a structural part when printing a series of stiffeners on it.
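The chaining idea can be sketched with a stand-in "transfer function" (here a fixed linear map playing the role of the learned regressor, with invented coefficients): each unitary operation maps the current state and that step's parameter to the next state, and a full sequence is evaluated by composing the map, one cheap call per operation:

```python
import numpy as np

def transfer(state, p):
    """One unitary operation: next state from current state and step parameter p.
    A and b stand in for a learned parametric regressor (illustrative values)."""
    A = np.array([[0.9, 0.1],
                  [0.0, 0.95]])
    b = np.array([p, 0.5 * p])
    return A @ state + b

state = np.zeros(2)              # initial part state (e.g. a distortion measure)
sequence = [1.0, 0.5, 2.0]       # parameter of each printed stiffener, in order
for p in sequence:               # the output of one operation feeds the next
    state = transfer(state, p)
```

Because each step reuses the same learned map, evaluating a new sequencing costs one function call per operation rather than one full simulation per candidate sequence.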