Efficient Invariant Features for Sensor Variability Compensation in Speaker Recognition
In this paper, we investigate the use of invariant features for speaker recognition. Owing to their characteristics, these features are introduced to cope with the difficult and challenging problem of sensor variability, a source of performance degradation inherent in speaker recognition systems. Our experiments show: (1) the effectiveness of these features in matched conditions; (2) the benefit of combining them with mel-frequency cepstral coefficients (MFCC) to exploit their discriminative power under uncontrolled conditions (mismatched cases). Consequently, the proposed invariant features yield a performance improvement, demonstrated by a reduction in the equal error rate and the minimum decision cost function compared to GMM-UBM speaker recognition systems based on MFCC features.
The composition of Event-B models
The transition from classical B [2] to the Event-B language and method [3] has seen the removal of some forms of model structuring and composition, with the intention of reinventing them in future. This work contributes to that reinvention. Inspired by a proposed method for state-based decomposition and refinement [5] of an Event-B model, we propose a familiar parallel event composition (over disjoint state variable lists) and the less familiar event fusion (over intersecting state variable lists). A brief motivation is provided for these and other forms of model composition in terms of feature-based modelling. We show that model consistency is preserved under such compositions. More significantly, we show that model composition preserves refinement.
More "normal" than normal: scaling distributions and complex systems
One feature of many naturally occurring or engineered complex systems is tremendous variability in event sizes. To account for it, the behavior of these systems is often described using power-law relationships or scaling distributions, which tend to be viewed as "exotic" because of their unusual properties (e.g., infinite moments). An alternate view, based on mathematical, statistical, and data-analytic arguments, suggests that scaling distributions should instead be seen as "more normal than normal". In support of this latter view, which Mandelbrot has advocated for the last 40 years, we review in this paper some relevant results from probability theory and illustrate a powerful statistical approach for deciding whether the variability associated with observed event sizes is consistent with an underlying Gaussian-type (finite-variance) or scaling-type (infinite-variance) distribution. We contrast this approach with traditional model-fitting techniques and discuss its implications for the future modeling of complex systems.
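The finite- versus infinite-variance distinction this abstract describes can be illustrated with a simple max-to-sum ratio diagnostic (a standard heavy-tail test, sketched here for illustration and not necessarily the paper's exact procedure): when the variance is finite, the largest squared observation becomes negligible relative to the sum of all squared observations as the sample grows; when the variance is infinite, a single extreme event keeps contributing an order-one fraction.

```python
import random

def max_to_sum_ratio(samples, p=2):
    """Ratio of the largest |x|^p term to the sum of all |x|^p terms.

    Tends to 0 as n grows when the p-th moment is finite; stays
    bounded away from 0 when that moment is infinite (heavy tails).
    """
    powered = [abs(x) ** p for x in samples]
    return max(powered) / sum(powered)

random.seed(0)
n = 100_000
gaussian = [random.gauss(0.0, 1.0) for _ in range(n)]   # finite variance
pareto = [random.paretovariate(1.5) for _ in range(n)]  # infinite variance

print(max_to_sum_ratio(gaussian))  # tiny: no single event dominates
print(max_to_sum_ratio(pareto))    # order-one: one event dominates the sum
```

Plotting this ratio as the sample size grows gives a visual version of the same test: a curve decaying to zero is consistent with a Gaussian-type model, while a curve hovering at a nonzero level points to a scaling-type distribution.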
The hard X-ray spectrum of NGC 1365: scattered light, not black hole spin
Active Galactic Nuclei (AGN) show excess X-ray emission above 10 keV compared with extrapolation of spectra from lower energies. Risaliti et al. have recently attempted to model the hard X-ray excess in the type 1.8 AGN NGC 1365, concluding that the hard excess most likely arises from Compton-scattered reflection of X-rays from an inner accretion disk close to the black hole. Their analysis disfavored a model in which the hard excess arises from a high column density of circumnuclear gas partially covering a primary X-ray source, despite such components being required in the NGC 1365 data below 10 keV. Using a Monte Carlo radiative transfer approach, we demonstrate that this conclusion is invalidated by (i) use of slab absorption models, which have unrealistic transmission spectra for partial covering gas, (ii) neglect of the effect of Compton scattering on transmitted spectra, and (iii) inadequate modeling of the spectrum of scattered X-rays. The scattered spectrum is geometry dependent and, for high global covering factors, may dominate above 10 keV. We further show that, in models of circumnuclear gas, the suppression of the observed hard X-ray flux by reprocessing may be no larger than required by the 'light bending' model invoked for inner disk reflection, and the expected emission line strengths lie within the observed range. We conclude that the time-invariant 'red wing' in AGN X-ray spectra is probably caused by continuum transmitted through and scattered from circumnuclear gas, not by highly redshifted line emission, and that measurement of black hole spin is not possible. Comment: Revised version, accepted for publication by ApJ Letters
Towards a method for rigorous development of generic requirements patterns
We present work in progress on a method for the engineering, validation and verification of generic requirements using domain engineering and formal methods. The need to develop a generic requirement set for subsequent system instantiation is complicated by the high levels of verification demanded by safety-critical domains such as avionics. Our chosen application domain is the failure detection and management function for engine control systems: here, generic requirements drive a software product line of target systems. A pilot formal specification and design exercise is undertaken on a small (two-sensor) system element. This exercise has a number of aims: to support the domain analysis, to gain a view of appropriate design abstractions, for a B novice to gain experience in the B method and tools, and to evaluate the usability and utility of that method. We also present a prototype method for the production and verification of a generic requirement set in our UML-based formal notation, UML-B, and tooling developed in its support. The formal verification both of the structural generic requirement set and of a particular application is achieved via translation to the formal specification language B, using our U2B and ProB tools.
On Relaxing Metric Information in Linear Temporal Logic
Metric LTL formulas rely on the next operator to encode time distances, whereas qualitative LTL formulas use only the until operator. This paper shows how to transform any metric LTL formula M into a qualitative formula Q, such that Q is satisfiable if and only if M is satisfiable over words with variability bounded with respect to the largest distances used in M (i.e., occurrences of next), but the size of Q is independent of such distances. Besides the theoretical interest, this result can help simplify the verification of systems with time-granularity heterogeneity, where large distances are required to express the coarse-grain dynamics in terms of fine-grain time units. Comment: Minor change
Visual Representations: Defining Properties and Deep Approximations
Visual representations are defined in terms of minimal sufficient statistics of visual data, for a class of tasks, that are also invariant to nuisance variability. Minimal sufficiency guarantees that we can store a representation in lieu of raw data with the smallest complexity and no performance loss on the task at hand. Invariance guarantees that the statistic is constant with respect to uninformative transformations of the data. We derive analytical expressions for such representations and show they are related to feature descriptors commonly used in computer vision, as well as to convolutional neural networks. This link highlights the assumptions and approximations tacitly made by these methods and explains empirical practices such as clamping, pooling and joint normalization. Comment: UCLA CSD TR140023, Nov. 12, 2014, revised April 13, 2015, November 13, 2015, February 28, 201
Global Production Increased by Spatial Heterogeneity in a Population Dynamics Model
Spatial and temporal heterogeneity are often described as important factors with a strong impact on biodiversity. The effect of heterogeneity is in most cases analyzed through the response of biotic interactions such as competition or predation. It may also modify intrinsic population properties such as growth rate. Most studies are theoretical, since it is often difficult to manipulate spatial heterogeneity in practice. Despite the large number of studies dealing with this topic, it is still difficult to understand how heterogeneity affects population dynamics. On the basis of a very simple model, this paper aims to provide an explicit, simple mechanism which can explain why spatial heterogeneity may be a favorable factor for production. We consider a two-patch model, with logistic growth assumed on each patch. A general condition on the migration rates and the local subpopulation growth rates is provided under which the total carrying capacity is higher than the sum of the local carrying capacities, which is not intuitive. As we illustrate, this result is robust under stochastic perturbations.
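The counterintuitive effect described above can be reproduced with a minimal numerical sketch: two logistic patches coupled by symmetric migration, integrated with a simple Euler scheme. The parameter values below are illustrative choices (not taken from the paper) in which a fast-growing, high-capacity patch subsidizes a slow-growing one, so the total equilibrium abundance exceeds K1 + K2.

```python
def simulate(r1, K1, r2, K2, m, x1=0.5, x2=0.5, dt=0.001, steps=200_000):
    """Total equilibrium abundance of two logistic patches coupled by
    symmetric migration at rate m (forward-Euler integration)."""
    for _ in range(steps):
        dx1 = r1 * x1 * (1.0 - x1 / K1) + m * (x2 - x1)
        dx2 = r2 * x2 * (1.0 - x2 / K2) + m * (x1 - x2)
        x1 += dt * dx1
        x2 += dt * dx2
    return x1 + x2

# Illustrative parameters: patch 1 grows fast toward K1 = 3, patch 2
# slowly toward K2 = 1; strong migration keeps the patches well mixed.
total = simulate(r1=3.0, K1=3.0, r2=0.1, K2=1.0, m=10.0)
print(total, "vs K1 + K2 =", 3.0 + 1.0)  # total exceeds the sum of capacities
```

With the migration rate set to zero the same function settles at K1 + K2, which makes the role of dispersal in the excess explicit.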
Phenomenological Quantum Gravity
These notes summarize a set of lectures on phenomenological quantum gravity which one of us delivered and the other attended with great diligence. They cover an assortment of topics on the border between theoretical quantum gravity and observational anomalies. Specifically, we review non-linear relativity in its relation to loop quantum gravity and high-energy cosmic rays. Although we follow a pedagogic approach, we include an open section on unsolved problems, presented as exercises for the student. We also review varying-constant models: the Brans-Dicke theory, the Bekenstein varying model, and several more radical ideas. We show how they make contact with strange high-redshift data, and perhaps other cosmological puzzles. We conclude with a few remaining observational puzzles which have failed to make contact with quantum gravity, but who knows... We would like to thank Mario Novello for organizing an excellent school in Mangaratiba, in direct competition with a very fine beach indeed. Comment: Lectures given at XI BSC