
    Spatio-Temporal Multiway Data Decomposition Using Principal Tensor Analysis on k-Modes: The R Package PTAk

    The purpose of this paper is to describe the R package PTAk and how the spatio-temporal context can be taken into account in the analyses. Essentially, PTAk() is a multiway, multidimensional method to decompose a multi-entry data array, seen mathematically as a tensor of any order. The PTA-k-modes method proposes a way of generalizing the SVD (singular value decomposition), as well as some other well-known methods included in the R package, such as PARAFAC/CANDECOMP and the PCA-n-modes or Tucker-n model. The example datasets cover different domains with various spatio-temporal characteristics and issues: (i) medical imaging in neuropsychology with a functional MRI (magnetic resonance imaging) study, (ii) pharmaceutical research with a pharmacodynamic study using EEG (electroencephalographic) data for a central nervous system (CNS) drug, and (iii) geographical information systems (GIS) with a climatic dataset that characterizes arid and semi-arid variations. All the methods implemented in the R package PTAk also support non-identity metrics, as well as penalizations during the optimization process. This flexibility, together with pre-processing facilities, makes PTAk a framework for devising extensions of multidimensional methods such as correspondence analysis, discriminant analysis, and multidimensional scaling, also enabling spatio-temporal constraints.
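    To make the generalization concrete, here is a minimal NumPy sketch of the idea underlying such decompositions: the best rank-1 approximation of a 3-way tensor, found by alternating power iteration, is the tensor analogue of extracting the leading singular triple in an SVD. This illustrates the principle only; it is not the PTAk package's actual algorithm, and the array shapes are hypothetical.

```python
import numpy as np

def rank1_tensor_approx(T, n_iter=100):
    """Best rank-1 approximation of a 3-way tensor by alternating
    power iteration -- a tensor analogue of the leading SVD pair.
    Illustration only; not the PTAk package's implementation."""
    u = np.random.rand(T.shape[0])
    v = np.random.rand(T.shape[1])
    w = np.random.rand(T.shape[2])
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    sigma = np.einsum('ijk,i,j,k->', T, u, v, w)  # the "singular value"
    return sigma, u, v, w

# Hypothetical space x space x time array, e.g. a coarse climate grid
T = np.random.rand(4, 5, 6)
sigma, u, v, w = rank1_tensor_approx(T)
print(sigma)
```

    A full PARAFAC/CANDECOMP or Tucker model repeats this kind of step over several components and modes; PTAk additionally allows non-identity metrics and penalizations in the optimization.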

    Machine Learning-Based Ontology Mapping Tool to Enable Interoperability in Coastal Sensor Networks

    Ontologies are widely used for data integration tasks and for resolving information heterogeneity on the web because of their capability to give explicit meaning to information. The growing need to resolve heterogeneities between different information systems within a domain of interest has led to the rapid development of individual ontologies by different organizations. Each of these ontologies, designed for a particular task, can be a representation unique to its project's needs. Integrating distributed, heterogeneous ontologies by finding semantic correspondences between their concepts has therefore become the key to achieving interoperability among different representations. In this thesis, an instance-based ontology-matching algorithm is proposed to enable data integration tasks in ocean sensor networks, whose data are highly heterogeneous in syntax, structure, and semantics. It provides a solution to the ontology mapping problem in such systems based on machine-learning and string-based methods.
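    As a rough illustration of the string-based side of such matching (the thesis's actual algorithm also uses machine learning over instance data), the following Python sketch scores candidate correspondences between two hypothetical sensor vocabularies using a token-level Jaccard similarity; all labels and the threshold are invented for the example.

```python
def token_jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the token sets of two concept labels."""
    ta = set(a.lower().replace('_', ' ').split())
    tb = set(b.lower().replace('_', ' ').split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def match_concepts(onto_a, onto_b, threshold=0.5):
    """Propose label correspondences whose similarity meets a threshold."""
    return [(a, b, s) for a in onto_a for b in onto_b
            if (s := token_jaccard(a, b)) >= threshold]

# Hypothetical coastal sensor-network vocabularies
onto_a = ["sea_surface_temperature", "wind_speed", "salinity"]
onto_b = ["surface temperature of sea", "wind velocity", "water salinity"]
print(match_concepts(onto_a, onto_b))
```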

    Unified and Conceptual Context Analysis in Ubiquitous Environments

    This article presents an original approach for the analysis of context information in ubiquitous environments. Large volumes of heterogeneous data are now collected, such as location, temperature, etc. This "environmental" context may be enriched by data related to users, e.g., their activities or applications. We propose a unified analysis and correlation of all these dimensions of context in order to measure their impact on user activities. Formal Concept Analysis and association rules are used to discover non-trivial relationships between context elements and activities which could otherwise seem independent. Our goal is to make optimal use of available data in order to understand user behavior and eventually make recommendations. In this paper, we describe our general methodology for context analysis and illustrate it on an experiment conducted on real data collected by a capture system. With this methodology, it is possible to identify correlations between context elements and user applications, making it possible to recommend those applications to users in similar situations.
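    A minimal Python sketch of the rule-mining step described above: from hypothetical usage traces pairing context elements with the application used, it derives single-antecedent rules "context element => application" with their confidence. A real FCA-based analysis builds a concept lattice first; this toy version only illustrates the correlation being sought.

```python
# Hypothetical usage traces: each is a set of context elements plus
# the application the user ran in that situation.
traces = [
    {"location=office", "time=morning", "app=mail"},
    {"location=office", "time=morning", "app=mail"},
    {"location=home", "time=evening", "app=media"},
    {"location=office", "time=afternoon", "app=mail"},
]

def rules(traces, min_conf=0.7):
    """Single-antecedent rules 'context element => app' with confidence."""
    items = set().union(*traces)
    ctx = [i for i in items if not i.startswith("app=")]
    apps = [i for i in items if i.startswith("app=")]
    out = []
    for c in ctx:
        support_c = sum(c in t for t in traces)
        for a in apps:
            both = sum(c in t and a in t for t in traces)
            if support_c and both / support_c >= min_conf:
                out.append((c, a, both / support_c))
    return out

for c, a, conf in rules(traces):
    print(f"{c} => {a}  (conf={conf:.2f})")
```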

    Formal Concept Analysis for Digital Ecosystem

    Formal Concept Analysis (FCA) is an effective tool for data analysis and knowledge discovery. The concept lattice, which derives from mathematical order theory and lattice theory, is the core of FCA. Research in many areas shows that the concept lattice structure is an effective platform for data mining, machine learning, information retrieval, software engineering, etc. This paper offers a brief overview of FCA, proposes applying FCA as a tool for the analysis and visualization of data in digital ecosystems, and also discusses applications of data mining to digital ecosystems.
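    For readers unfamiliar with FCA, the following Python sketch shows its core machinery on a toy formal context: the two derivation operators (extent and intent) and a brute-force enumeration of the formal concepts that make up the lattice. The objects and attributes are invented for illustration.

```python
from itertools import combinations

# A small formal context: objects and their attributes (hypothetical)
objects = {"service1": {"web", "api"},
           "service2": {"web", "mobile"},
           "service3": {"api", "mobile"}}
attributes = set().union(*objects.values())

def intent(objs):
    """Attributes shared by all objects in objs (the derivation operator)."""
    return set.intersection(*(objects[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o, a in objects.items() if attrs <= a}

# Enumerate formal concepts: pairs (A, B) with extent(B) = A, intent(A) = B
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        A = extent(set(attrs))
        B = intent(A)
        concepts.add((frozenset(A), frozenset(B)))
for A, B in sorted(concepts, key=lambda c: len(c[0])):
    print(set(A) or "{}", "<->", set(B) or "{}")
```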

    Refinement Strategies for Correlating Context and User Behavior in Pervasive Information Systems

    Large amounts of traces can be collected by Pervasive Information Systems, reflecting users' actions and the context in which these actions were performed (location, date, time, network connection, etc.). This article proposes refinement strategies using different frequency measurements on contextual elements in order to better analyze the impact of these elements on user behavior. These strategies are based on data mining and Formal Concept Analysis (FCA) and are used to refine input data in order to identify the context elements that strongly influence user behavior. We take context analysis further by combining FCA with semantic distance measures calculated over a context ontology. The proposed context analysis is then evaluated in experiments with real data. The novelty of this work lies in these refinement strategies, which can lead to a better understanding of context impact; such understanding is an important step towards personalization and recommendation features.
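    The semantic distance mentioned above can be illustrated with a small Python sketch: an edge-counting distance between context elements, computed through their lowest common ancestor in a toy context ontology. The taxonomy below is hypothetical, and the paper's exact distance measure may differ.

```python
# Hypothetical context ontology as child -> parent edges
parent = {
    "office": "workplace", "meeting_room": "workplace",
    "workplace": "location", "home": "location",
    "morning": "time", "evening": "time",
    "location": "context", "time": "context",
}

def ancestors(node):
    """Path from node up to the root, inclusive."""
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def semantic_distance(a, b):
    """Edge-count distance through the lowest common ancestor."""
    pa, pb = ancestors(a), ancestors(b)
    common = next(x for x in pa if x in pb)
    return pa.index(common) + pb.index(common)

print(semantic_distance("office", "meeting_room"))  # 2: siblings
print(semantic_distance("office", "evening"))       # 5: via 'context'
```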

    Automation Process for Morphometric Analysis of Volumetric CT Data from Pulmonary Vasculature in Rats

    With advances in medical imaging scanners, it has become commonplace to generate large multidimensional datasets, which require tools for rapid, thorough analysis. To address this need, we have developed an automated algorithm for morphometric analysis that incorporates the A Visualization Workshop computational and image-processing libraries for three-dimensional segmentation, vascular tree generation, and structural hierarchical ordering, with a two-stage numeric optimization procedure for estimating vessel diameters. We combine this new technique with our mathematical models of pulmonary vascular morphology to quantify structural and functional attributes of lung arterial trees. Our physiological studies require repeated measurements of vascular structure to determine differences in vessel biomechanical properties between animal models of pulmonary disease. Automation provides many advantages, including significantly improved speed and minimized operator interaction and bias. The results are validated by comparison with previously published rat pulmonary arterial micro-CT data analysis techniques, in which vessels were manually mapped and measured with intensive operator intervention.
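    The abstract does not specify the hierarchical ordering scheme; a common choice for arterial trees is Strahler ordering, sketched below in Python on a toy vessel tree. This is an assumption-laden illustration of the ordering step, not the authors' pipeline.

```python
# Toy vessel tree: segment -> list of child segments (hypothetical)
children = {
    "root": ["a", "b"],
    "a": ["a1", "a2"],
    "b": [],
    "a1": [], "a2": [],
}

def strahler(node):
    """Strahler order: terminal segments get 1; a parent's order
    increases only where two equal-order branches meet."""
    if not children[node]:
        return 1
    orders = sorted((strahler(c) for c in children[node]), reverse=True)
    if len(orders) >= 2 and orders[0] == orders[1]:
        return orders[0] + 1
    return orders[0]

print({n: strahler(n) for n in children})
# {'root': 2, 'a': 2, 'b': 1, 'a1': 1, 'a2': 1}
```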

    Automating embedded analysis capabilities and managing software complexity in multiphysics simulation part I: template-based generic programming

    An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities needed by many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
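    The C++ mechanism described here (as used by Trilinos packages such as Sacado) can be illustrated by a Python analogue: operator overloading on a dual-number type lets an unchanged calculation also propagate derivatives, one of the "additional quantities" such embedding enables. The class below is a sketch of the idea, not the Trilinos implementation.

```python
class Dual:
    """Forward-mode AD value: carries f and df/dx through arithmetic
    via operator overloading -- the same mechanism the C++ templates
    exploit to augment an existing calculation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def residual(x):
    """A generic calculation: written once, works for floats or Duals."""
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)   # seed dx/dx = 1
r = residual(x)
print(r.val, r.der)  # 17.0 and d/dx = 6x + 2 = 14.0
```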