A markovian approach to unsupervised change detection with multiresolution and multimodality SAR data
In the framework of synthetic aperture radar (SAR) systems, current satellite missions make it possible to acquire images at very high and multiple spatial resolutions with short revisit times. This scenario conveys a remarkable potential in applications to, for instance, environmental monitoring and natural disaster recovery. In this context, data fusion and change detection methodologies play major roles. This paper proposes an unsupervised change detection algorithm for the challenging case of multimodal SAR data collected by sensors operating at multiple spatial resolutions. The method is based on Markovian probabilistic graphical models, graph cuts, linear mixtures, generalized Gaussian distributions, Gram-Charlier approximations, maximum likelihood and minimum mean squared error estimation. It benefits from the SAR images acquired at multiple spatial resolutions and with possibly different modalities on the considered acquisition times to generate an output change map at the finest observed resolution. This is accomplished by modeling the statistics of the data at the various spatial scales through appropriate generalized Gaussian distributions and by iteratively estimating a set of virtual images that are defined on the pixel grid at the finest resolution and would be collected if all the sensors could work at that resolution. A Markov random field framework is adopted to address the detection problem by defining an appropriate multimodal energy function that is minimized using graph cuts.
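The generalized Gaussian family used above to model SAR statistics can be sketched in a few lines. This is a minimal illustration of the density itself, not the authors' estimation procedure, and the parameter names are ours:

```python
import math

def ggd_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Generalized Gaussian density.

    beta = 2 recovers the Gaussian and beta = 1 the Laplacian;
    smaller beta gives the heavier tails often needed for SAR data.
    """
    coef = beta / (2 * alpha * math.gamma(1 / beta))
    return coef * math.exp(-(abs(x - mu) / alpha) ** beta)
```

With `alpha = sqrt(2) * sigma` and `beta = 2` this reduces exactly to the normal density, which is a convenient sanity check when fitting the shape parameter.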
Changepoint Detection over Graphs with the Spectral Scan Statistic
We consider the change-point detection problem of deciding, based on noisy
measurements, whether an unknown signal over a given graph is constant or is
instead piecewise constant over two connected induced subgraphs of relatively
low cut size. We analyze the corresponding generalized likelihood ratio (GLR)
statistic and relate it to the problem of finding the sparsest cut in a graph.
We develop a tractable relaxation of the GLR statistic based on the
combinatorial Laplacian of the graph, which we call the spectral scan
statistic, and analyze its properties. We show how its performance as a testing
procedure depends directly on the spectrum of the graph, and use this result to
explicitly derive its asymptotic properties on a few significant graph
topologies. Finally, we demonstrate both theoretically and by simulations that
the spectral scan statistic can outperform naive testing procedures based on
edge thresholding and testing.
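The relaxation idea can be illustrated numerically: project the observed signal onto the non-constant eigenvectors of the combinatorial Laplacian and reward energy on low-eigenvalue (small-cut) directions. The sketch below is a toy version of this, not the paper's exact statistic:

```python
import numpy as np

def spectral_scan_sketch(A, y):
    """Score how 'piecewise constant over a small cut' the signal y looks.

    Illustrative only: projects y onto the non-constant eigenvectors of
    the combinatorial Laplacian L = D - A and weights each squared
    projection inversely by its eigenvalue.
    """
    L = np.diag(A.sum(axis=1)) - A
    lam, U = np.linalg.eigh(L)       # eigenvalues in ascending order
    y = y - y.mean()                 # remove the constant component
    proj = U.T @ y
    return float(np.max(proj[1:] ** 2 / lam[1:]))

# barbell graph: two 5-cliques joined by a single bottleneck edge
A = np.zeros((10, 10))
for block in (range(5), range(5, 10)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[4, 5] = A[5, 4] = 1.0
```

On this barbell a signal that is constant on each clique but differs across the bottleneck scores high, while a globally constant signal scores zero, matching the intuition of testing "constant vs. piecewise constant over a small cut."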
F-formation Detection: Individuating Free-standing Conversational Groups in Images
Detecting groups of interacting people is a useful task in many modern
technologies, with application fields spanning from video-surveillance to
social robotics. In this paper we first furnish a
rigorous definition of group grounded in the social sciences: this allows us
to specify many kinds of groups so far neglected in the Computer Vision
literature. On top of this taxonomy, we present a detailed state of the art of
group detection algorithms. Then, as our main contribution, we present
a brand new method for the automatic detection of groups in still images, which
is based on a graph-cuts framework for clustering individuals; in particular we
are able to codify in a computational sense the sociological definition of
F-formation, which is very useful for encoding a group given only proxemic
information: position and orientation of people. We call the proposed method
Graph-Cuts for F-formation (GCFF). We show that GCFF clearly outperforms all
the state-of-the-art methods in terms of different accuracy measures (some of
them brand new), demonstrating also a strong robustness to noise and
versatility in recognizing groups of various cardinality.
Comment: 32 pages, submitted to PLOS ONE
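The proxemic intuition behind an F-formation is that each person "votes" for an o-space centre a fixed stride in front of them, and people whose votes agree share a group. The sketch below uses a greedy pass as a stand-in for the paper's graph-cuts clustering; all names and thresholds are hypothetical:

```python
import math

def ospace_centers(people, stride=1.0):
    # each person (x, y, theta) votes for a candidate o-space centre,
    # a point `stride` units along their facing direction
    return [(x + stride * math.cos(t), y + stride * math.sin(t))
            for x, y, t in people]

def greedy_groups(people, stride=1.0, radius=0.7):
    # greedy stand-in for the paper's graph-cuts clustering step:
    # people whose voted centres land close together share one o-space
    centers = ospace_centers(people, stride)
    groups = []
    for i, (cx, cy) in enumerate(centers):
        for g in groups:
            gx, gy = centers[g[0]]
            if math.hypot(cx - gx, cy - gy) < radius:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups
```

For two people facing each other two units apart, both votes land on the midpoint between them and they form one group, while a distant third person stays alone.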
Modelling of content-aware indicators for effective determination of shot boundaries in compressed MPEG videos
In this paper, a content-aware approach is proposed to design multiple test conditions for shot cut detection, organized into a multiple-phase decision tree for abrupt cut detection and a finite state machine for dissolve detection. In comparison with existing approaches, our algorithm is characterized by two categories of content difference indicators and testing. While the first category indicates the content changes that are directly used for shot cut detection, the second category indicates the contexts under which the content change occurs. As a result, indications of frame differences are tested with context awareness to make the detection of shot cuts adaptive to both content and context changes. The evaluation results announced by TRECVID 2007 indicate that our proposed algorithm achieved performance comparable to that of machine learning approaches, yet using a simpler feature set and straightforward design strategies. This validates the effectiveness of modelling content-aware indicators for decision making, which also provides a good alternative to conventional approaches in this area.
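The finite-state-machine idea for dissolve detection can be sketched as follows. The two-state design, the thresholds, and the minimum length are illustrative assumptions, not the paper's tuned test conditions:

```python
def detect_dissolve(diffs, low=0.1, high=0.5, min_len=3):
    """Toy FSM over per-frame content differences.

    NORMAL -> CANDIDATE while the difference stays in the 'gradual
    change' band (low, high); a candidate run of at least `min_len`
    frames is reported as a dissolve (start, end) interval.
    """
    state, start, dissolves = "NORMAL", None, []
    for i, d in enumerate(diffs):
        if state == "NORMAL":
            if low < d < high:
                state, start = "CANDIDATE", i
        else:  # CANDIDATE
            if not (low < d < high):
                if i - start >= min_len:
                    dissolves.append((start, i - 1))
                state = "NORMAL"
    if state == "CANDIDATE" and len(diffs) - start >= min_len:
        dissolves.append((start, len(diffs) - 1))
    return dissolves
```

An abrupt cut produces a single difference spike above `high` and is deliberately ignored here; in the paper it would instead be caught by the decision-tree branch.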
Spectral Clustering with Imbalanced Data
Spectral clustering is sensitive to how graphs are constructed from data,
particularly when proximal and imbalanced clusters are present. We show that
Ratio-Cut (RCut) or normalized cut (NCut) objectives are not tailored to
imbalanced data since they tend to emphasize cut sizes over cut values. We
propose a graph partitioning problem that seeks minimum cut partitions under
minimum size constraints on partitions to deal with imbalanced data. Our
approach parameterizes a family of graphs, by adaptively modulating node
degrees on a fixed node set, to yield a set of parameter-dependent cuts
reflecting varying levels of imbalance. The solution to our problem is then
obtained by optimizing over these parameters. We present rigorous limit cut
analysis results to justify our approach. We demonstrate the superiority of our
method through unsupervised and semi-supervised experiments on synthetic and
real data sets.
Comment: 24 pages, 7 figures. arXiv admin note: substantial text overlap with arXiv:1302.513
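The size-constrained minimum-cut objective can be made concrete by brute force on a toy graph. This illustrates the objective only, not the authors' parameterized-graph algorithm, which avoids exhaustive search:

```python
from itertools import combinations

def cut_value(A, S, n):
    # total weight of edges crossing the bipartition (S, V \ S)
    T = [v for v in range(n) if v not in S]
    return sum(A[u][v] for u in S for v in T)

def min_cut_with_size(A, min_size):
    # brute-force minimum cut subject to both sides having at least
    # `min_size` nodes (tractable only on tiny graphs)
    n = len(A)
    best = None
    for k in range(min_size, n - min_size + 1):
        for S in combinations(range(n), k):
            val = cut_value(A, set(S), n)
            if best is None or val < best[0]:
                best = (val, set(S))
    return best

# 5-clique plus one weakly attached node: the unconstrained minimum
# cut simply isolates the extra node, mimicking the imbalance problem
n = 6
A = [[0] * n for _ in range(n)]
for i in range(5):
    for j in range(5):
        if i != j:
            A[i][j] = 1
A[5][0] = A[0][5] = 1
A[5][1] = A[1][5] = 1
```

Without a size floor the optimum just peels off the weakly attached node (cut value 2); with a floor of two nodes per side, the cheapest feasible cut is strictly more expensive, which is exactly the tension between cut value and cut size that the abstract describes.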
Using Triangles to Improve Community Detection in Directed Networks
In a graph, a community may be loosely defined as a group of nodes that are
more closely connected to one another than to the rest of the graph. While
there are a variety of metrics that can be used to specify the quality of a
given community, one common theme is that flows tend to stay within
communities. Hence, we expect cycles to play an important role in community
detection. For undirected graphs, the importance of triangles -- undirected
3-cycles -- has been known for a long time and can be used to improve community
detection. In directed graphs, the situation is more nuanced. The smallest
cycle is simply two nodes with a reciprocal connection, and using information
about reciprocation has proven to improve community detection. Our new idea is
based on the four types of directed triangles that contain cycles. To identify
communities in directed networks, then, we propose an undirected edge-weighting
scheme based on the type of the directed triangles in which edges are involved.
We also propose a new metric on quality of the communities that is based on the
number of 3-cycles that are split across communities. To demonstrate the impact
of our new weighting, we use the standard METIS graph partitioning tool to
determine communities and show experimentally that the resulting communities
result in fewer 3-cycles being cut. The magnitude of the effect varies between
a 10% and a 50% reduction, and we also find evidence that this weighting scheme
improves a task where plausible ground-truth communities are known.
Comment: 10 pages, 3 figures
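A simplified version of the cycle-based weighting can be sketched as follows. It counts only the full directed 3-cycle through each edge, rather than distinguishing the four directed triangle types the paper uses:

```python
def cycle_triangle_weights(edges):
    """Weight each undirected node pair by 1 plus the number of directed
    3-cycles u -> v -> w -> u passing through the edge (u, v).

    Simplified stand-in for the paper's four-type weighting scheme.
    """
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
    weights = {}
    for u, v in edges:
        # w completes a directed 3-cycle u -> v -> w -> u
        cycles = sum(1 for w in adj.get(v, ())
                     if u in adj.get(w, ()))
        key = tuple(sorted((u, v)))
        weights[key] = weights.get(key, 1) + cycles
    return weights
```

The resulting undirected weights can be fed to a standard partitioner such as METIS, so that edges carrying directed cycle flow are more expensive to cut.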
Jamming aided Generalized Data Attacks: Exposing Vulnerabilities in Secure Estimation
Jamming refers to the deletion, corruption or damage of meter measurements
that prevents their further usage. This is distinct from adversarial data
injection that changes meter readings while preserving their utility in state
estimation. This paper presents a generalized attack regime that uses jamming
of secure and insecure measurements to greatly expand the scope of common
'hidden' and 'detectable' data injection attacks in the literature. For 'hidden'
attacks, it is shown that with jamming, the optimal attack is given by the
minimum feasible cut in a specific weighted graph. More importantly, for
'detectable' data attacks, this paper shows that the entire range of relative
costs for adversarial jamming and data injection can be divided into three
separate regions, with distinct graph-cut based constructions for the optimal
attack. Approximate algorithms for attack design are developed and their
performances are demonstrated by simulations on IEEE test cases. Further, it is
proved that preventing such attacks requires securing all grid measurements.
This work comprehensively quantifies the dual adversarial benefits of jamming:
(a) reduced attack cost and (b) increased resilience to secure measurements,
which strengthen the potency of data attacks.
Comment: 11 pages, 8 figures. A version of this will appear in HICSS 201
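The graph-cut flavour of such attack constructions rests on max-flow/min-cut duality, which a plain Edmonds-Karp routine makes concrete. The graph and capacities below are toy values, not a grid attack model:

```python
from collections import deque

def min_cut_value(cap, s, t):
    """Edmonds-Karp max-flow on an n x n capacity matrix.

    By max-flow/min-cut duality, the returned value equals the weight
    of the minimum cut separating s from t.
    """
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total
        # bottleneck capacity along the path, then augment
        v, bottleneck = t, float("inf")
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# toy 4-node network: source 0, sink 3
cap = [[0] * 4 for _ in range(4)]
cap[0][1] = 3; cap[0][2] = 2; cap[1][2] = 1
cap[1][3] = 2; cap[2][3] = 3
```

In the paper's setting the edge capacities would encode the relative costs of jamming versus injecting data on each measurement, and the minimum cut identifies the cheapest feasible attack.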