Event Shape Analysis in ALICE
Jets are the final-state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on a set of event-shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS in order to identify specific events such as jets. Through the correlation between two quantities, thrust and recoil, calculated in minimum-bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method to select specific topologies and high multiplicities. The presented results were obtained both at generator level and after reconstruction. With any kind of jet reconstruction algorithm one will in general be confronted with overlapping jets. The present method identifies regions where special jet topologies occur in an event. The aim is not to supplant the usual jet reconstruction algorithms, but rather to allow an easy selection of events to which those algorithms can then be applied. Comment: 24 pages, ALICE Note
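The central event-shape quantity here, the thrust, can be computed directly from charged-track momenta. Below is a minimal sketch, assuming tracks are given by their transverse momentum components and using a hypothetical brute-force scan over candidate axes; it is not the ALICE analysis code.

```python
# Sketch of a transverse-thrust calculation for charged tracks.
# Assumption: each track is described by its transverse momentum components (px, py).
import numpy as np

def transverse_thrust(px, py, n_angles=360):
    """Return the transverse thrust T and the thrust-axis angle (radians)."""
    px, py = np.asarray(px, float), np.asarray(py, float)
    pt_sum = np.sum(np.hypot(px, py))          # scalar sum of |p_T|
    phis = np.linspace(0.0, np.pi, n_angles)   # candidate axis directions
    # projection of every track onto every candidate axis
    proj = np.abs(np.outer(np.cos(phis), px) + np.outer(np.sin(phis), py))
    sums = proj.sum(axis=1)
    best = np.argmax(sums)
    return sums[best] / pt_sum, phis[best]

# toy two-jet-like event: two back-to-back groups of tracks
px = [2.0, 1.8, 2.1, -2.0, -1.9, -2.2]
py = [0.1, -0.2, 0.0, 0.1, -0.1, 0.2]
T, phi = transverse_thrust(px, py)
print(f"thrust = {T:.3f}, axis angle = {phi:.2f} rad")  # close to 1 for pencil-like events
```

Values of T near 1 indicate pencil-like (jetty) events, while isotropic events give lower values, which is what makes the variable useful for topology selection.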
Statistical Shape Analysis using Kernel PCA
DOI: 10.1117/12.641417. Presented at Image Processing: Algorithms and Systems, Neural Networks, and Machine Learning, 16-18 January 2006, San Jose, California, USA.
Mercer kernels are used for a wide range of image and signal processing tasks such as de-noising, clustering, and discriminant analysis. These algorithms construct their solutions in terms of expansions in a high-dimensional feature space F. However, many applications such as kernel PCA (principal component analysis) can be used more effectively if a pre-image of the projection in the feature space is available. In this paper, we propose a novel method to reconstruct a unique approximate pre-image of a feature vector and apply it to statistical shape analysis. We provide experimental results to demonstrate the advantages of kernel PCA over linear PCA for shape learning, which include, but are not limited to, the ability to learn and distinguish multiple geometries of shapes and robustness to occlusions.
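As a rough illustration of the workflow (projection into the kernel feature space followed by an approximate pre-image), the sketch below uses scikit-learn's ridge-regression-based inverse transform rather than the reconstruction method proposed in the paper; the toy "shape" vectors and all parameter values are assumptions.

```python
# Kernel PCA with an approximate pre-image (inverse transform).
# Assumption: shapes are encoded as fixed-length vectors (e.g. landmarks or
# flattened distance maps); here they are random toy vectors.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 64))          # 100 toy "shape" vectors

kpca = KernelPCA(n_components=8, kernel="rbf", gamma=0.05,
                 fit_inverse_transform=True, alpha=1e-3)
Z = kpca.fit_transform(X)               # projections in the feature space
X_rec = kpca.inverse_transform(Z)       # approximate pre-images in input space

print("reconstruction error:", np.mean((X - X_rec) ** 2))
```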
Compression for Smooth Shape Analysis
Most 3D shape analysis methods use triangular meshes to discretize both the shape and functions on it as piecewise linear functions. With this representation, shape analysis requires fine meshes to represent smooth shapes and geometric operators such as normals, curvatures, or Laplace-Beltrami eigenfunctions, at a large computational and memory cost.
We avoid this bottleneck with a compression technique that represents a smooth shape as a subdivision surface and exploits the subdivision scheme to parametrize smooth functions on that shape with a few control parameters. This compression does not affect the accuracy of the Laplace-Beltrami operator and its eigenfunctions and allows us to compute shape descriptors and shape matchings at an accuracy comparable to triangular meshes but at a fraction of the computational cost.
Our framework can also compress surfaces represented by point clouds to do shape analysis of 3D scanning data.
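For context, the spectral quantities involved can be sketched on an ordinary fine triangle mesh: the cotangent Laplace-Beltrami operator, its lowest eigenpairs, and a heat-kernel-signature style descriptor. The sketch below uses a toy planar mesh and standard formulas; it does not implement the subdivision-surface compression described above.

```python
# Cotangent Laplace-Beltrami operator and a heat-kernel-signature descriptor
# on a toy triangulated grid (standard spectral shape-analysis ingredients).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def cotangent_laplacian(V, F):
    """Return (L, M): cotangent stiffness matrix and lumped mass matrix."""
    n = len(V)
    I, J, W = [], [], []
    M = np.zeros(n)
    for a, b, c in F:
        pa, pb, pc = V[a], V[b], V[c]
        area = 0.5 * np.linalg.norm(np.cross(pb - pa, pc - pa))
        M[[a, b, c]] += area / 3.0
        for (i, j, k) in ((a, b, c), (b, c, a), (c, a, b)):
            # cotangent of the angle at vertex k, opposite edge (i, j)
            u, v = V[i] - V[k], V[j] - V[k]
            cot = np.dot(u, v) / (np.linalg.norm(np.cross(u, v)) + 1e-12)
            I += [i, j, i, j]; J += [j, i, i, j]
            W += [-0.5 * cot, -0.5 * cot, 0.5 * cot, 0.5 * cot]
    return sp.csr_matrix((W, (I, J)), shape=(n, n)), sp.diags(M)

# toy mesh: triangulated unit square grid
k = 20
xs, ys = np.meshgrid(np.linspace(0, 1, k), np.linspace(0, 1, k))
V = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(k * k)])
F = []
for r in range(k - 1):
    for c in range(k - 1):
        v = r * k + c
        F += [(v, v + 1, v + k), (v + 1, v + k + 1, v + k)]
L, M = cotangent_laplacian(V, np.array(F))

# lowest eigenpairs of the generalized problem L u = lambda M u
lam, U = spla.eigsh(L, k=10, M=M, sigma=-1e-6, which="LM")

# heat kernel signature at a few diffusion times (a common spectral descriptor)
ts = [0.1, 1.0, 10.0]
hks = np.stack([(U ** 2 * np.exp(-lam * t)).sum(axis=1) for t in ts], axis=1)
print("HKS per vertex:", hks.shape)   # (n_vertices, len(ts))
```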
Data-Driven Shape Analysis and Processing
Data-driven methods play an increasingly important role in discovering
geometric, structural, and semantic relationships between 3D shapes in
collections, and applying this analysis to support intelligent modeling,
editing, and visualization of geometric data. In contrast to traditional
approaches, a key feature of data-driven approaches is that they aggregate
information from a collection of shapes to improve the analysis and processing
of individual shapes. In addition, they are able to learn models that reason
about properties and relationships of shapes without relying on hard-coded
rules or explicitly programmed instructions. We provide an overview of the main
concepts and components of these techniques, and discuss their application to
shape classification, segmentation, matching, reconstruction, modeling and
exploration, as well as scene analysis and synthesis, through reviewing the
literature and relating the existing works with both qualitative and numerical
comparisons. We conclude our report with ideas that can inspire future research
in data-driven shape analysis and processing. Comment: 10 pages, 19 figures
Line-shape analysis of charmonium resonances
We discuss whether the new enhancements found by BES, alias the ,
, , and are true resonances. We argue that the
nearby thresholds , ,
and , as well as
the and states have a strong influence over the
observed and line-shapes. We propose an
unitarized effective Lagrangian model to study the dynamical effect of the
interaction between each known state and its closest thresholds. In
addition, we present some of our recent motivating results, using the same
model, for the resonance, where the distortion from a Breit-Wigner
line-shape is shown to result not only from the kinematic interference, but
also from the influence of the one-loops. Moreover, two
poles were found, at about 3.78 GeV and at 3.74 GeV, the second one generated
dynamically, yet contributing to the distortion of the line-shape. Comment: Proceedings of the Conference "Hadron 17", held on 25-29 September, 2017, in Salamanca, Spain
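As a generic illustration of how a nearby threshold distorts a Breit-Wigner line-shape (not the unitarized effective-Lagrangian model of the paper), one can compare a plain Breit-Wigner with a Flatte-like amplitude in which a second channel opens close to the resonance; all masses and couplings below are toy values.

```python
# Plain Breit-Wigner vs. a Flatte-like line-shape with a channel opening nearby.
# All numbers are made-up toy values for illustration only.
import numpy as np

def breit_wigner(m, m0, gamma):
    return np.abs(1.0 / (m0**2 - m**2 - 1j * m0 * gamma)) ** 2

def flatte(m, m0, g1, g2, m_th):
    # phase space of the second channel, zero below its threshold m_th
    rho2 = np.sqrt(np.clip(1.0 - (m_th / m) ** 2, 0.0, None))
    rho1 = 1.0  # take the first channel fully open (toy choice)
    return np.abs(1.0 / (m0**2 - m**2 - 1j * m0 * (g1 * rho1 + g2 * rho2))) ** 2

m = np.linspace(3.60, 3.95, 400)                         # GeV, scan across the peak
bw = breit_wigner(m, m0=3.78, gamma=0.03)
fl = flatte(m, m0=3.78, g1=0.02, g2=0.08, m_th=3.73)     # threshold just below the peak
print("peak positions (GeV):", m[np.argmax(bw)], m[np.argmax(fl)])
```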
HBT shape analysis with q-cumulants
Taking up and extending earlier suggestions, we show how two- and three-dimensional shapes of second-order HBT correlations can be described by a multivariate Edgeworth expansion around Gaussian ellipsoids, with the expansion coefficients, identified as the cumulants of the pair momentum difference q, acting as shape parameters. Off-diagonal terms dominate both the character and magnitude of the shapes. Cumulants can be measured directly, so the shape analysis has no need for fitting. Comment: 8 pages, 6 figures for a total of 29 subfigures, revtex4. Typos corrected, three missing terms added, minor text changes
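The idea that cumulants of q act directly as shape parameters can be sketched in one dimension (the paper works with the full multivariate expansion): measure the standardized third and fourth cumulants from the data and insert them into a Gaussian-times-Hermite Edgeworth correction. The toy data and the truncation order below are assumptions.

```python
# One-dimensional Edgeworth expansion around a Gaussian, with shape parameters
# (standardized cumulants) measured directly from toy "pair momentum difference" data.
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy import stats

rng = np.random.default_rng(1)
q = rng.normal(0.0, 0.05, 20_000) + 0.01 * rng.standard_exponential(20_000)  # toy, skewed

mu, sigma = q.mean(), q.std()
z = (q - mu) / sigma
k3 = stats.skew(q)                    # standardized 3rd cumulant
k4 = stats.kurtosis(q)                # excess kurtosis = standardized 4th cumulant

def edgeworth_density(z, k3, k4):
    """Gaussian density times the first Edgeworth correction terms in He_n(z)."""
    gauss = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    corr = 1.0 + hermeval(z, [0, 0, 0, k3 / 6.0]) + hermeval(z, [0, 0, 0, 0, k4 / 24.0])
    return gauss * corr

print("shape parameters (no fit needed):", k3, k4)
print("corrected density on a grid:", edgeworth_density(np.linspace(-3, 3, 5), k3, k4))
```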
Shape deformation analysis from the optimal control viewpoint
A crucial problem in shape deformation analysis is to determine a deformation
of a given shape into another one, which is optimal for a certain cost. It has
a number of applications in particular in medical imaging. In this article we
provide a new general approach to shape deformation analysis, within the
framework of optimal control theory, in which a deformation is represented as
the flow of diffeomorphisms generated by time-dependent vector fields. Using
reproducing kernel Hilbert spaces of vector fields, the general shape
deformation analysis problem is specified as an infinite-dimensional optimal
control problem with state and control constraints. In this problem, the states
are diffeomorphisms and the controls are vector fields, both of them being
subject to some constraints. The functional to be minimized is the sum of a first term, defined as a geometric norm of the control (the kinetic energy of the deformation), and a data attachment term providing a geometric distance to
the target shape. This point of view has several advantages. First, it allows
one to model general constrained shape analysis problems, which opens new
issues in this field. Second, using an extension of the Pontryagin maximum
principle, one can characterize the optimal solutions of the shape deformation
problem in a very general way as the solutions of constrained geodesic
equations. Finally, recasting general algorithms of optimal control into shape
analysis yields new efficient numerical methods in shape deformation analysis.
Overall, the optimal control point of view unifies and generalizes different
theoretical and numerical approaches to shape deformation problems, and also
allows us to design new approaches. The optimal control problems that result from this construction are infinite-dimensional and involve constraints, and are thus nonstandard. In this article we also provide a rigorous and complete analysis of the infinite-dimensional shape space problem with constraints and of its finite-dimensional approximations.
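A minimal numerical sketch of this optimal-control formulation, restricted to a few landmarks: the control is a time-dependent momentum attached to each landmark, the velocity field lives in a Gaussian reproducing kernel Hilbert space, and the cost is kinetic energy plus a quadratic data-attachment term. The Euler discretization, the derivative-free optimizer, and all sizes are toy choices, not the algorithms discussed in the article.

```python
# Landmark-based sketch of the optimal-control shape deformation problem:
# minimize kinetic energy of an RKHS velocity field plus an endpoint matching term.
import numpy as np
from scipy.optimize import minimize

SRC = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # source landmarks
TGT = np.array([[0.2, 0.1], [1.1, 0.3], [0.1, 1.2]])   # target landmarks
STEPS, SIGMA, LAM = 5, 0.7, 10.0                        # time steps, kernel width, attachment weight

def kernel(x, y):
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * SIGMA**2))

def cost(flat_momenta):
    a = flat_momenta.reshape(STEPS, *SRC.shape)   # control: momenta per time step
    x, energy, dt = SRC.copy(), 0.0, 1.0 / STEPS
    for t in range(STEPS):
        K = kernel(x, x)
        v = K @ a[t]                              # RKHS velocity at the landmarks
        energy += dt * np.sum(a[t] * v)           # kinetic energy  a^T K a
        x = x + dt * v                            # Euler step of the flow
    return energy + LAM * np.sum((x - TGT) ** 2)  # + data attachment term

res = minimize(cost, np.zeros(STEPS * SRC.size), method="BFGS")
print("final cost:", res.fun)
```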
Structural Analysis: Shape Information via Points-To Computation
This paper introduces a new hybrid memory analysis, Structural Analysis,
which combines an expressive shape analysis style abstract domain with
efficient and simple points-to style transfer functions. Using data from
empirical studies on the runtime heap structures and the programmatic idioms
used in modern object-oriented languages, we construct a heap analysis with the following characteristics: (1) it can express a rich set of structural, shape, and sharing properties which are not provided by a classic points-to analysis and which are useful for optimization and error detection applications; (2) it uses efficient, weakly-updating, set-based transfer functions which enable the analysis to be more robust and scalable than a shape analysis; and (3) it can be used as the basis for a scalable interprocedural analysis that produces precise
results in practice.
The analysis has been implemented for .Net bytecode and using this
implementation we evaluate both the runtime cost and the precision of the
results on a number of well known benchmarks and real world programs. Our
experimental evaluations show that the domain defined in this paper is capable
of precisely expressing the majority of the connectivity, shape, and sharing
properties that occur in practice and, despite the use of weak updates, the
static analysis is able to precisely approximate the ideal results. The
analysis is capable of analyzing large real-world programs (over 30K bytecodes)
in less than 65 seconds while using less than 130 MB of memory. In summary, this work presents a new type of memory analysis that advances the state of the art with respect to expressive power, precision, and scalability, and represents a new area of study on the relationships between, and the combination of, concepts from shape and points-to analyses.
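The flavor of set-based, weakly-updating transfer functions can be sketched with a toy abstract heap in which each (location, field) pair maps to a set of abstract locations and stores merge rather than overwrite; this is an illustration of the general idea, not the paper's abstract domain.

```python
# Toy abstract heap with weakly-updating, set-based transfer functions.
from collections import defaultdict

class AbstractHeap:
    def __init__(self):
        # points-to sets: (abstract location, field) -> set of abstract locations
        self.pt = defaultdict(set)

    def store(self, loc, field, targets):
        """Weak update: add targets, never remove existing ones."""
        self.pt[(loc, field)] |= set(targets)

    def load(self, locs, field):
        """Read a field over a set of possible receivers."""
        out = set()
        for loc in locs:
            out |= self.pt[(loc, field)]
        return out

    def reachable(self, roots):
        """Transitive closure, e.g. to answer sharing or shape queries."""
        seen, work = set(roots), list(roots)
        while work:
            loc = work.pop()
            for (l, _), tgts in self.pt.items():
                if l == loc:
                    for t in tgts - seen:
                        seen.add(t); work.append(t)
        return seen

heap = AbstractHeap()
heap.store("list1", "head", {"node_a"})
heap.store("node_a", "next", {"node_b"})
heap.store("node_a", "next", {"node_c"})     # weak update keeps node_b too
print(heap.load({"list1"}, "head"))          # {'node_a'}
print(heap.reachable({"list1"}))             # everything reachable from list1
```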
