Localizing the Common Action Among a Few Videos
This paper strives to localize the temporal extent of an action in a long untrimmed video. Where existing work leverages many examples annotated with their start, their end, and/or the class of the action during training, we propose few-shot common action localization. The start and end of an action in a long untrimmed video are determined based on just a handful of trimmed video examples containing the same action, without knowing their common class label. To address this task, we introduce a new 3D convolutional network architecture able to align representations from the support videos with the relevant query video segments. The network contains: (i) a mutual enhancement module to simultaneously complement the representation of the few trimmed support videos and the untrimmed query video; (ii) a progressive alignment module that iteratively fuses the support videos into the query branch; and (iii) a pairwise matching module to weigh the importance of different support videos. Evaluation of few-shot common action localization in untrimmed videos containing a single or multiple action instances demonstrates the effectiveness and general applicability of our proposal.
Comment: ECCV 202
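As a rough illustration of the pairwise matching idea, the sketch below weighs a handful of support clips by their best cosine match against the query segments, then scores each segment against the weighted support prototype. It is a hypothetical simplification in PyTorch, assuming pooled 3D-CNN features are already available; the function names, shapes, and fusion rule are illustrative, not the paper's architecture.

```python
import torch
import torch.nn.functional as F

def pairwise_matching(support, query):
    """Weigh support videos by their agreement with the query.

    Hypothetical stand-in for a pairwise matching module: each support
    clip is weighted by its best cosine match against the query segments.
    support: (k, d) pooled features of k trimmed support videos
    query:   (t, d) features of t query video segments
    """
    sim = F.cosine_similarity(
        support.unsqueeze(1), query.unsqueeze(0), dim=-1)   # (k, t)
    return torch.softmax(sim.max(dim=1).values, dim=0)      # (k,)

def localize(support, query):
    """Score each query segment against the weighted support prototype."""
    weights = pairwise_matching(support, query)
    prototype = (weights.unsqueeze(1) * support).sum(dim=0)      # (d,)
    scores = F.cosine_similarity(query, prototype.unsqueeze(0))  # (t,)
    return scores  # threshold / group consecutive segments for start-end

# toy usage: 5 support clips, 100 query segments, 512-d features
support = torch.randn(5, 512)
query = torch.randn(100, 512)
print(localize(support, query).shape)  # torch.Size([100])
```

In the paper, the mutual enhancement and progressive alignment modules refine both branches before any such matching; here the features are simply taken as given.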
Graph Layouts by t-SNE
We propose tsNET, a new graph layout method based on a modification of the t-distributed Stochastic Neighbor Embedding (t-SNE) dimensionality reduction technique. Although t-SNE is one of the best techniques for visualizing high-dimensional data as 2D scatterplots, it has not been used in the context of classical graph layout. tsNET represents a graph with a distance matrix, which together with a modified t-SNE cost function results in desirable layouts. We evaluate our method by a formal comparison with state-of-the-art methods, both visually and via established quality metrics, on a comprehensive benchmark containing real-world and synthetic graphs. As evidenced by the quality metrics and visual inspection, tsNET produces excellent layouts.
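The core idea, running t-SNE on a graph-theoretic distance matrix, can be sketched in a few lines. The snippet below is a minimal approximation using networkx and scikit-learn; it applies plain t-SNE to shortest-path distances rather than the paper's modified cost function, so it illustrates the concept rather than tsNET itself.

```python
import networkx as nx
import numpy as np
from sklearn.manifold import TSNE

def tsne_layout(G, perplexity=30, seed=0):
    """Lay out a graph by running t-SNE on its shortest-path distances."""
    nodes = list(G.nodes())
    n = len(nodes)
    lengths = dict(nx.all_pairs_shortest_path_length(G))
    D = np.zeros((n, n))
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            D[i, j] = lengths[u][v]
    # precomputed distances require random (not PCA) initialization
    pos = TSNE(n_components=2, metric="precomputed", init="random",
               perplexity=perplexity, random_state=seed).fit_transform(D)
    return {u: pos[i] for i, u in enumerate(nodes)}

# toy usage on a small grid graph
G = nx.grid_2d_graph(6, 6)
layout = tsne_layout(G, perplexity=10)
```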
Non-equilibrium relaxation of hot states in organic semiconductors: Impact of mode-selective excitation on charge transfer.
The theoretical study of open quantum systems strongly coupled to a vibrational environment remains computationally challenging due to the strongly non-Markovian characteristics of the dynamics. We study this problem in the case of a molecular dimer of the organic semiconductor tetracene, the exciton states of which are strongly coupled to a few hundred molecular vibrations. To do so, we employ a previously developed tensor network approach based on the formalism of matrix product states. By analyzing the entanglement structure of the system wavefunction, we can expand it in a tree tensor network state, which allows us to perform a fully quantum mechanical time evolution of the exciton-vibrational system, including the effect of 156 molecular vibrations. We simulate the dynamics of hot states, i.e., states resulting from excess-energy photoexcitation, by constructing various initial bath states, and show that the exciton system indeed has a memory of those initial configurations. In particular, the specific pathway of vibrational relaxation is shown to strongly affect the quantum coherence between exciton states on time scales relevant for the ultrafast dynamics of application-relevant processes such as charge transfer. The preferential excitation of low-frequency modes leads to a limited number of relaxation pathways, thus "protecting" quantum coherence and leading to a significant increase in the charge transfer yield in the dimer structure.
A.M.A. acknowledges the support of the Engineering and Physical Sciences Research Council (EPSRC) for funding under Grant No. EP/L015552/1.
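A faithful reproduction of the tree tensor network simulation is beyond a short snippet, but the mechanism being probed, memory of the initial vibrational state in the exciton coherence, can be illustrated with a deliberately tiny stand-in: a two-site Holstein-type dimer coupled to a single effective mode, propagated exactly in a truncated Fock space. All parameter values below are illustrative assumptions, not tetracene parameters.

```python
import numpy as np

# Toy Holstein-type dimer: two exciton sites, one effective vibrational
# mode, as a drastically simplified stand-in for the 156-mode tensor
# network simulation. Parameter values are illustrative only.
N = 12                      # vibrational Fock-space truncation
E1, E2, J = 0.0, 0.1, 0.05  # site energies and electronic coupling
w, g = 0.17, 0.05           # mode frequency and exciton-vibration coupling

a = np.diag(np.sqrt(np.arange(1, N)), 1)             # annihilation operator
s1, s2 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2, IN = np.eye(2), np.eye(N)

H = (np.kron(E1 * s1 + E2 * s2 + J * sx, IN)
     + np.kron(I2, w * a.T @ a)
     + g * np.kron(s1 - s2, a + a.T))                # mode modulates the gap

def evolve_coherence(n0, times):
    """Exciton coherence vs time, starting on site 1 with the mode in
    Fock state n0 ("hot" initial bath state if n0 > 0)."""
    psi0 = np.kron([1.0, 0.0], np.eye(N)[n0])
    evals, V = np.linalg.eigh(H)
    c0 = V.conj().T @ psi0
    out = []
    for t in times:
        psi = V @ (np.exp(-1j * evals * t) * c0)
        m = psi.reshape(2, N)
        rho = m @ m.conj().T          # reduced exciton density matrix
        out.append(abs(rho[0, 1]))    # inter-site coherence magnitude
    return np.array(out)

times = np.linspace(0, 500, 200)      # hbar = 1 units
cold = evolve_coherence(0, times)
hot = evolve_coherence(3, times)      # excess vibrational energy
```

Even in this toy model, the cold and hot trajectories differ, which is the qualitative sense in which the exciton system "remembers" its initial bath configuration.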
Application of Machine Learning Techniques to Parameter Selection for Flight Risk Identification
In recent years, the use of data mining and machine learning techniques for safety analysis, incident and accident investigation, and fault detection has gained traction in the aviation community. Flight data collected from recording devices contains a large number of heterogeneous parameters, sometimes reaching thousands on modern commercial aircraft. More data is collected continuously, adding to the ever-increasing pool of data available for safety analysis. However, not all of the collected parameters are important from a risk and safety analysis perspective. Similarly, using thousands of parameters collected at high frequency may not be computationally tractable for modern analysis techniques such as machine learning. As such, an intelligent and repeatable methodology to select a reduced set of significant parameters is required to allow safety analysts to focus on the right parameters for risk identification. In this paper, a step-by-step methodology is proposed to down-select a reduced set of parameters that can be used for safety analysis. First, correlation analysis is conducted to remove highly correlated, duplicate, or redundant parameters from the data set. Second, a pre-processing step removes metadata and empty parameters; this step also incorporates requirements imposed by regulatory bodies, such as the Federal Aviation Administration, and by subject matter experts to further trim the list of parameters. Third, a clustering algorithm is used to group similar flights and identify abnormal operations and anomalies. A retrospective analysis is conducted on the clusters to identify their characteristics and impact on flight safety. Finally, analysis-of-variance techniques are used to identify which parameters were significant in the formation of the clusters. Visualization dashboards were created to analyze the cluster characteristics and parameter significance. This methodology is employed on data from the approach phase of a representative single-aisle aircraft to demonstrate its application and robustness across heterogeneous data sets. It is envisioned that this methodology can be further extended to other phases of flight and aircraft.
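A minimal sketch of the four steps might look as follows in Python, assuming a flights-by-parameters DataFrame. The correlation threshold, the choice of KMeans (the paper does not commit to a specific clustering algorithm), and the column handling are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
import pandas as pd
from scipy.stats import f_oneway
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def down_select(df, corr_thresh=0.95, n_clusters=4):
    """Hypothetical sketch of the parameter down-selection pipeline."""
    # Step 1: drop one parameter from each highly correlated pair
    corr = df.corr().abs()
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    drop = [c for c in upper.columns if (upper[c] > corr_thresh).any()]
    df = df.drop(columns=drop)
    # Step 2: pre-processing -- remove empty and constant parameters
    df = df.dropna(axis=1, how="all")
    df = df.loc[:, df.nunique() > 1]
    # Step 3: cluster flights to surface abnormal operations
    X = StandardScaler().fit_transform(df.fillna(df.mean()))
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
    # Step 4: one-way ANOVA -- which parameters separate the clusters?
    pvals = {c: f_oneway(*[df[c][labels == k]
                           for k in range(n_clusters)]).pvalue
             for c in df.columns}
    significant = sorted(pvals, key=pvals.get)
    return df, labels, significant
```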
IdeaHound: Improving Large-scale Collaborative Ideation with Crowd-powered Real-time Semantic Modeling
Prior work on creativity support tools demonstrates how a computational semantic model of a solution space can enable interventions that substantially improve the number, quality, and diversity of ideas. However, automated semantic modeling often falls short when people contribute short text snippets or sketches. Innovation platforms can employ humans to provide semantic judgments to construct a semantic model, but this relies on external workers completing a large number of tedious micro-tasks. This requirement threatens both accuracy (external workers may lack the expertise and context to make accurate semantic judgments) and scalability (external workers are costly). In this paper, we introduce IDEAHOUND, an ideation system that seamlessly integrates the task of defining semantic relationships among ideas into the primary task of idea generation. The system combines implicit human actions with machine learning to create a computational semantic model of the emerging solution space. The integrated nature of these judgments allows IDEAHOUND to leverage the expertise and efforts of participants who are already motivated to contribute to idea generation, overcoming the scalability issues inherent to existing approaches. Our results show that participants were equally willing to use (and just as productive using) IDEAHOUND compared to a conventional platform that did not require organizing ideas. Our integrated crowdsourcing approach also creates a more accurate semantic model than an existing crowdsourced approach performed by external crowds. We demonstrate how this model enables helpful creative interventions: providing diverse inspirational examples, providing similar ideas for a given idea, and providing a visual overview of the solution space.
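To make the implicit-judgment mechanism concrete, here is a hypothetical sketch of how spatial arrangements could be aggregated into a semantic model: distances between ideas that a participant places on the same canvas are treated as implicit similarity judgments, averaged across arrangements, and embedded with MDS. The data format and the MDS choice are assumptions for illustration; the paper's actual system combines several kinds of implicit actions with machine learning.

```python
import numpy as np
from sklearn.manifold import MDS

def semantic_model(placements, n_ideas):
    """Aggregate implicit spatial judgments into a 2D idea embedding.

    placements: list of {idea_id: (x, y)} dicts, one per canvas
    arrangement; ideas placed close together are treated as similar.
    """
    D = np.zeros((n_ideas, n_ideas))
    counts = np.zeros((n_ideas, n_ideas))
    for arrangement in placements:
        ids = list(arrangement)
        for i in ids:
            for j in ids:
                if i != j:
                    d = np.linalg.norm(np.subtract(arrangement[i],
                                                   arrangement[j]))
                    D[i, j] += d
                    counts[i, j] += 1
    # average observed distances; pairs never co-displayed default to "far"
    fill = float(D.max()) or 1.0
    D = np.divide(D, counts, out=np.full_like(D, fill), where=counts > 0)
    np.fill_diagonal(D, 0.0)
    return MDS(n_components=2, dissimilarity="precomputed",
               random_state=0).fit_transform(D)

# toy usage: two participants arranged overlapping subsets of 4 ideas
placements = [{0: (0, 0), 1: (1, 0), 2: (5, 5)},
              {1: (0, 0), 2: (4, 3), 3: (4, 4)}]
coords = semantic_model(placements, n_ideas=4)
```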
Visual Analysis of a Cold Rolling Process Using Data-Based Modeling
International Conference on Engineering Applications of Neural Networks (13th, 2012, Coventry Univ., Otaniemi, Finland)
Interactive Visual Data Exploration with Subjective Feedback
Data visualization and iterative/interactive data mining are attracting rapidly growing attention, in both research and industry. However, integrated methods and tools that combine advanced visualization and data mining techniques are rare, and those that exist are often specialized to a single problem or domain. In this paper, we introduce a novel generic method for interactive visual exploration of high-dimensional data. In contrast to most visualization tools, it is not based on the traditional dogma of manually zooming and rotating data. Instead, the tool initially presents the user with an ‘interesting’ projection of the data and then employs data randomization with constraints to allow users to flexibly and intuitively express their interests or beliefs using visual interactions that correspond to exactly defined constraints. These constraints expressed by the user are then taken into account by a projection-finding algorithm to compute a new ‘interesting’ projection, a process that can be iterated until the user runs out of time or the constraints explain everything she needs from the data. We present the tool by means of two case studies, one controlled study on synthetic data and another on real census data. The data and software related to this paper are available at http://www.interesting-patterns.net/forsied/interactive-visual-data-exploration-with-subjective-feedback/
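The iteration loop can be caricatured in a few lines. In the sketch below, user feedback is reduced to a set of already-'explained' directions that are deflated from the data before the next maximum-variance view is computed; the actual method instead encodes constraints in a randomized background distribution, so this is an analogy, not the paper's algorithm.

```python
import numpy as np

def next_interesting_projection(X, explained_dirs):
    """Return a 2D view of X that avoids already-explained directions.

    Heavily simplified stand-in: 'interesting' is approximated by
    'most variance remaining after deflating user-explained structure'.
    """
    Xc = X - X.mean(axis=0)
    for d in explained_dirs:                  # deflate known structure
        d = d / np.linalg.norm(d)
        Xc = Xc - np.outer(Xc @ d, d)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:2].T                              # plane of remaining variance
    return X @ W, W

# toy usage: the first view is plain PCA; after the user 'explains' the
# dominant direction, the second view shows what structure is left.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ np.diag([5, 3, 1, 1, 1])
view1, W1 = next_interesting_projection(X, [])
view2, W2 = next_interesting_projection(X, [W1[:, 0]])
```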
Spectral Graph Analysis for Process Monitoring
Process monitoring is a fundamental task to support operator decisions under abnormal situations. Most process monitoring approaches, such as Principal Component Analysis and Locality Preserving Projections, are based on dimensionality reduction. In this paper, Spectral Graph Analysis Monitoring (SGAM) is introduced: a new process monitoring technique that does not require dimensionality reduction. The approach is based on spectral graph analysis theory. First, a weighted graph representation of the process measurements is developed. Second, the process behavior is parameterized by means of graph spectral features, in particular the graph algebraic connectivity and the graph spectral energy. The developed methodology is illustrated on autocorrelated and non-linear synthetic cases and applied to the well-known Tennessee Eastman process benchmark with promising results.
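A minimal sketch of the two spectral features might look as follows, assuming a Gaussian-kernel graph built over a sliding window of measurements; the paper's exact graph construction and its definition of spectral energy are not given here, so both choices below are assumptions.

```python
import numpy as np

def sgam_features(window, sigma=1.0):
    """Graph-spectral monitoring features from a window of process
    measurements (rows = samples, columns = process variables)."""
    # weighted adjacency via a Gaussian kernel on pairwise distances
    d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=-1)
    W = np.exp(-(d ** 2) / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # graph Laplacian and its eigenvalues (ascending)
    L = np.diag(W.sum(axis=1)) - W
    lam = np.linalg.eigvalsh(L)
    algebraic_connectivity = lam[1]   # Fiedler value
    # spectral energy taken as the sum of |adjacency eigenvalues|
    spectral_energy = np.abs(np.linalg.eigvalsh(W)).sum()
    return algebraic_connectivity, spectral_energy

# toy usage: features for one sliding window of a 3-variable process
rng = np.random.default_rng(1)
window = rng.normal(size=(50, 3))
lam2, energy = sgam_features(window, sigma=2.0)
```

Tracking these two scalars over consecutive windows gives a compact monitoring signal without any dimensionality reduction step.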