Shearlet-based regularization in statistical inverse learning with an application to x-ray tomography
Statistical inverse learning theory, a field that lies at the intersection of inverse problems and statistical learning, has lately gained more and more attention. In an effort to steer this interplay more towards the variational regularization framework, convergence rates have recently been proved for a class of convex, p-homogeneous regularizers with p ∈ (1, 2], in the symmetric Bregman distance. Following this path, we take a further step towards the study of sparsity-promoting regularization and extend the aforementioned convergence rates to work with ℓ^p-norm regularization, with p ∈ (1, 2), for a special class of non-tight Banach frames, called shearlets, and possibly constrained to some convex set. The p = 1 case is approached as the limit case (1, 2) ∋ p → 1, by complementing numerical evidence with a (partial) theoretical analysis based on arguments from Γ-convergence theory. We numerically validate our theoretical results in the context of x-ray tomography, under random sampling of the imaging angles, using both simulated and measured data. This application allows us to effectively verify the theoretical decay, in addition to providing a motivation for the extension to shearlet-based regularization.
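A hedged sketch of the estimator this line of work studies (our notation, not lifted from the paper): with A the forward operator (here the X-ray transform), (x_i, y_i) the n randomly sampled angle/measurement pairs, SH the shearlet analysis operator, and C a convex constraint set, the regularized solution can be written as

  \hat{f}_\alpha \in \operatorname*{arg\,min}_{f \in \mathcal{C}} \; \frac{1}{2n} \sum_{i=1}^{n} \big( (A f)(x_i) - y_i \big)^2 + \alpha \, \| \mathcal{SH} f \|_{\ell^p}^p , \qquad p \in (1, 2],

with the sparsity-promoting p = 1 case then recovered as the limit p → 1.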
Dimension 2 condensates and Polyakov Chiral Quark Models
We address a possible relation between the expectation value of the Polyakov
loop in pure gluodynamics and full QCD based on Polyakov Chiral Quark Models
where constituent quarks and the Polyakov loop are coupled in a minimal way. To
this end we use a center symmetry breaking Gaussian model for the Polyakov loop
distribution which accurately reproduces gluodynamics data above the phase
transition in terms of a dimension-2 gluon condensate. The role played by the
quantum and local nature of the Polyakov loop is emphasized.
Comment: 3 pages, 1 figure. Talk given at the IVth International Conference on Quarks and Nuclear Physics, Madrid, June 5th-10th 2006
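The abstract does not spell out the Gaussian model, so the following is only a plausible reconstruction in our own notation (the precise prefactor is an assumption): treating the fluctuations of the gluon field A_0 entering the Polyakov loop L = (1/N_c) tr exp(i g A_0 / T) as Gaussian, the expectation value of the loop exponentiates the second moment,

  \langle L \rangle \simeq \exp\!\left( - \frac{g^2 \langle A_{0,a}^2 \rangle}{4 N_c T^2} \right),

so that lattice data for ⟨L⟩ above the phase transition can be traded for a value of the dimension-2 condensate ⟨A_{0,a}^2⟩.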
GlobVolcano Project Overview
The GlobVolcano project is part of the ESA DUE
programme. The project aims at demonstrating EO-based
services to support the Volcanological Observatories and other
mandated users (e.g. Civil Protection authorities, scientific
communities studying volcanoes) in their monitoring activities.
During the project a worldwide selection of user organizations
will cooperate with the GlobVolcano team in order to harmonize
users' requirements and to evaluate the EO-based services. The
“Osservatorio Vesuviano” of Naples (INGV-Italy) coordinates
the communications between the project and the User
Community. IPGP of Paris is responsible for the scientific
coordination and the validation activities.
The project activities are split into two phases. During the first
phase (completed in June 2008) the service infrastructure and the
interface to the users were developed. Prototype EO-based
information products have been generated and validated. Service
provision on a pre-operational basis will take place during the
second phase.
A General Optimization Technique for High Quality Community Detection in Complex Networks
Recent years have witnessed the development of a large body of algorithms for
community detection in complex networks. Most of them are based upon the
optimization of objective functions, among which modularity is the most common,
though a number of alternatives have been suggested in the scientific
literature. We present here an effective general search strategy for the
optimization of various objective functions for community detection purposes.
When applied to modularity, on both real-world and synthetic networks, our
search strategy substantially outperforms the best existing algorithms in terms
of final scores of the objective function; for description length, its
performance is on par with the original Infomap algorithm. The execution time
of our algorithm is on par with non-greedy alternatives present in the literature,
and networks of up to 10,000 nodes can be analyzed in time spans ranging from
minutes to a few hours on average workstations, making our approach readily
applicable to tasks which require the quality of partitioning to be as high as
possible, and are not limited by strict time constraints. Finally, based on the
most effective of the available optimization techniques, we compare the
performance of modularity and code length as objective functions, in terms of
the quality of the partitions one can achieve by optimizing them. To this end,
we evaluate the ability of each objective function to reconstruct the
underlying structure of a large set of synthetic and real-world networks.
Comment: Main text: 14 pages, 4 figures, 1 table. Supplementary information: 19 pages, 8 figures, 5 tables
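As a hedged illustration of the modularity objective discussed above (the standard Newman-Girvan definition; the function name and the plain-numpy implementation are our own choices, not the authors' search strategy), the following sketch scores a candidate partition of an undirected graph:

import numpy as np

def modularity(adj, labels):
    # Q = (1 / 2m) * sum_ij [A_ij - k_i * k_j / (2m)] * delta(c_i, c_j)
    adj = np.asarray(adj, dtype=float)
    m = adj.sum() / 2.0                      # total edge weight
    k = adj.sum(axis=1)                      # node degrees
    same = np.equal.outer(labels, labels)    # delta(c_i, c_j)
    return ((adj - np.outer(k, k) / (2.0 * m)) * same).sum() / (2.0 * m)

# Two triangles joined by a single edge: the natural two-community split.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]])
print(modularity(A, np.array([0, 0, 0, 1, 1, 1])))   # ~0.357

Optimization-based methods of the kind surveyed in the abstract search over the label vector to maximize this score (or, for Infomap-style methods, to minimize description length instead).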