Synthetic speech detection and audio steganography in VoIP scenarios
Distinguishing synthetic from human voices relies on techniques from current biometric voice recognition systems, which prevent a person's voice, whether used with good or bad intentions, from being confused with someone else's. Steganography makes it possible to hide a message inside an innocuous-looking file (usually an audio, video or image file) in such a way as to arouse no suspicion in an external observer. This article proposes two methods, applicable in a hypothetical VoIP scenario, which allow us to distinguish synthetic speech from a human voice, and to insert a text message into the Comfort Noise generated during the pauses of a voice conversation. The first method builds on existing studies of Modulation Features for the temporal analysis of speech signals, while the second proposes a technique derived from Direct Sequence Spread Spectrum, which spreads the energy of the signal to be hidden over a wider transmission band.
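The spread-spectrum embedding idea can be sketched as follows. This is an illustrative toy, not the paper's implementation: the chip length, embedding amplitude, and comfort-noise model are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dsss_embed(carrier, bits, chip_len=256, alpha=0.005, seed=42):
    """Spread each payload bit over `chip_len` pseudo-noise chips and
    add it to the carrier at the low amplitude `alpha`."""
    pn = np.sign(np.random.default_rng(seed).standard_normal(chip_len))
    stego = carrier.copy()
    for i, b in enumerate(bits):
        s = i * chip_len
        stego[s:s + chip_len] += alpha * (1 if b else -1) * pn
    return stego

def dsss_extract(stego, n_bits, chip_len=256, seed=42):
    """Despread by correlating each segment with the same PN sequence."""
    pn = np.sign(np.random.default_rng(seed).standard_normal(chip_len))
    bits = []
    for i in range(n_bits):
        s = i * chip_len
        bits.append(int(np.dot(stego[s:s + chip_len], pn) > 0))
    return bits

# Comfort-noise-like carrier: low-level Gaussian noise
carrier = 0.01 * rng.standard_normal(256 * 8)
payload = [1, 0, 1, 1, 0, 0, 1, 0]
stego = dsss_embed(carrier, payload)
recovered = dsss_extract(stego, len(payload))
```

The key property: each bit contributes `alpha * chip_len` to the correlation, while the carrier contributes only noise of order `sqrt(chip_len)`, so the hidden signal stays far below the carrier's amplitude yet is recovered reliably by anyone holding the PN seed.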
Due to space limits, this paper is only an extended abstract. The full version will contain further details on our research
Public investment, the Stability Pact and the ‘golden rule’
The fiscal rules set in the Treaty of Maastricht and in the Stability and Growth Pact have sometimes been criticised as an excessively binding constraint for appropriate counter-cyclical action. The risk that the rules may permanently reduce the public sector’s contribution to capital accumulation has also been pointed out. In this framework, the adoption of a ‘golden rule’ has been suggested. Starting from the recent debate, this paper tackles two questions: (a) the implications of the Pact for public investment and (b) the pros and cons of introducing a golden rule in EMU’s fiscal framework, given the objectives of low public debts and adequate margins for a stabilising budgetary policy. The analysis suggests that the rules set in the Treaty and in the Pact may negatively influence public investment spending. However, the golden rule, although intuitively appealing, does not seem to be an appropriate solution to the problem.
Numerical modelling of ground-tunnel support interaction using bedded-beam-spring model with fuzzy parameters
The study of ground-tunnel interaction by introducing a predetermined degree of variation (fuzziness) in some parameters of the chosen model is presented and discussed. This research stems from the consideration that tunnel model parameters and geometry are usually affected by a degree of uncertainty, mainly due to construction imprecision and the great variability of rock mass properties. The research has been developed using fuzzy set theory, assuming that three model parameters are affected by a certain amount of uncertainty (defined by so-called membership functions). The response of the numerical model is calculated by solving the fuzzy equations for different shapes of the membership functions. In order to investigate the effects of some model parameters, and to provide a simple procedure and tool for designers, a study of the effect of tunnel boundary conditions, based on a fuzzy model, has been carried out using a simple but well-known and widely used design method, the bedded-beam-spring model.
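The core mechanism, propagating triangular membership functions through a structural response by alpha-cuts, can be sketched on a deliberately reduced model. The single-spring deflection `w = p / k` and all numeric values are assumptions for illustration; the paper's bedded-beam-spring system has many more degrees of freedom.

```python
def tri_alpha_cut(a, b, c, alpha):
    """Interval of a triangular fuzzy number (a, b, c) at membership level alpha."""
    return (a + alpha * (b - a), c - alpha * (c - b))

def response_interval(k_cut, p_cut):
    """Deflection w = p / k is increasing in p and decreasing in k,
    so the interval endpoints follow directly (vertex method)."""
    k_lo, k_hi = k_cut
    p_lo, p_hi = p_cut
    return (p_lo / k_hi, p_hi / k_lo)

# Hypothetical fuzzy inputs: spring stiffness k [MN/m], ground load p [MN]
results = {}
for alpha in (0.0, 0.5, 1.0):
    k_cut = tri_alpha_cut(80.0, 100.0, 130.0, alpha)
    p_cut = tri_alpha_cut(0.8, 1.0, 1.1, alpha)
    results[alpha] = response_interval(k_cut, p_cut)
```

At `alpha = 1` the inputs collapse to their most plausible values and the response is a single number; lowering `alpha` widens the input intervals and yields nested, progressively wider response intervals, i.e. the membership function of the output.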
Inference of the Kinetic Ising Model with Heterogeneous Missing Data
We consider the problem of inferring a causality structure from multiple binary time series using the Kinetic Ising Model in datasets where a fraction of the observations is missing. Building on recent work on Mean Field methods for inferring the model with hidden spins, we develop a pseudo-Expectation-Maximization algorithm that works even under severe data sparsity. The methodology relies on the Martin-Siggia-Rose path integral method with a second-order saddle-point solution, which makes it possible to compute the log-likelihood in polynomial time and yields a maximum-likelihood estimate of the coupling matrix and of the missing observations. We also propose a recursive version of the algorithm, in which at every iteration some missing values are replaced by their maximum-likelihood estimates, showing that the method can be combined with sparsification schemes such as LASSO regularization or decimation. We test the performance of the algorithm on synthetic data and find interesting properties concerning its dependence on the heterogeneity of the observation frequency of the spins, and on violations of the hypotheses underlying the saddle-point approximation, such as the small-couplings limit and the assumption of statistical independence between couplings.
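The fully observed baseline that the pseudo-EM generalizes can be sketched directly: for the kinetic Ising model, the complete-data likelihood is an exact logistic model, so maximum-likelihood coupling inference reduces to gradient ascent. Network size, coupling scale, and learning rate below are assumptions for illustration; handling missing observations, the paper's actual contribution, is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 5, 20000

# Ground-truth coupling matrix (small couplings, as the MF methods assume)
J_true = 0.3 * rng.standard_normal((N, N))

# Simulate parallel Glauber dynamics: P(s_i(t+1) = +1) = 1 / (1 + exp(-2 h_i)),
# with local field h_i(t) = sum_j J_ij s_j(t)
S = np.empty((T, N))
S[0] = rng.choice([-1.0, 1.0], N)
for t in range(T - 1):
    h = J_true @ S[t]
    S[t + 1] = np.where(rng.random(N) < 1 / (1 + np.exp(-2 * h)), 1.0, -1.0)

# Maximum-likelihood inference by gradient ascent on the log-likelihood
# L = sum_t,i [ s_i(t+1) h_i(t) - log 2 cosh h_i(t) ]
J_hat = np.zeros((N, N))
lr = 0.1
for _ in range(200):
    h = S[:-1] @ J_hat.T
    grad = (S[1:] - np.tanh(h)).T @ S[:-1] / (T - 1)
    J_hat += lr * grad

corr = np.corrcoef(J_hat.ravel(), J_true.ravel())[0, 1]
```

With plenty of fully observed data the recovered couplings correlate almost perfectly with the ground truth; the paper's methods address the much harder regime where many entries of `S` are unobserved.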
Centrality metrics and localization in core-periphery networks
Two concepts of centrality have been defined in complex networks. The first concerns the centrality of a node, for which many different metrics have been defined (e.g. eigenvector centrality, PageRank, non-backtracking centrality, etc.). The second relates to the large-scale organization of the network, the core-periphery structure, composed of a dense core plus an outlying and loosely connected periphery. In this paper we investigate the relation between these two concepts. We consider networks generated via the Stochastic Block Model, or its degree-corrected version, with a strong core-periphery structure, and we investigate the centrality properties of the core nodes and the ability of several centrality metrics to identify them. We find that the three best-performing measures are the marginals obtained with belief propagation, PageRank, and degree centrality, while non-backtracking and eigenvector centrality (or MINRES, shown to be equivalent to the latter in the large-network limit) perform worse in the investigated networks.
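The experimental setup can be sketched in miniature: generate a two-block network with a dense core, then check whether two of the well-performing metrics named above (degree and PageRank) rank core nodes on top. Block sizes and connection probabilities are assumptions chosen for a clear separation, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
n_core, n_per = 20, 80
n = n_core + n_per

# Block connection probabilities: dense core, sparse periphery
P = np.array([[0.50, 0.10],
              [0.10, 0.02]])
labels = np.array([0] * n_core + [1] * n_per)

A = (rng.random((n, n)) < P[labels][:, labels]).astype(float)
A = np.triu(A, 1)
A = A + A.T  # undirected, no self-loops

degree = A.sum(1)

# PageRank by power iteration (damping 0.85); isolated nodes simply
# keep the teleportation mass in this simplified version
out = np.maximum(A.sum(1), 1.0)
M = A / out[:, None]
r = np.full(n, 1 / n)
for _ in range(100):
    r = 0.15 / n + 0.85 * M.T @ r

top_by_pagerank = np.argsort(-r)[:n_core]
core_recovered = np.mean(top_by_pagerank < n_core)
```

With this strong a core-periphery contrast, degree alone already separates the blocks, which is consistent with the abstract's finding that simple degree centrality is among the best core detectors.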
Entire slice regular functions
Entire functions of one complex variable are extremely relevant in several areas, ranging from the study of convolution equations to special functions. An analog of entire functions can be defined in the quaternionic setting within the framework of slice regular functions, which includes polynomials and power series of the quaternionic variable. In the first chapters of this work we introduce and discuss the algebra and the analysis of slice regular functions. In addition to offering a self-contained introduction to the theory of slice regular functions, these chapters also contain a few new results (for example, we complete the discussion of lower bounds for slice regular functions initiated with the Ehrenpreis-Malgrange lemma by adding a brand new Cartan-type theorem).

The core of the work is Chapter 5, where we study the growth of entire slice regular functions and show how such growth is related to the coefficients of their power series expansions. It should be noted that the proofs we offer are not simple reconstructions of the holomorphic case; indeed, the non-commutative setting creates a series of non-trivial problems. The counting of zeros is also not trivial, because of the presence of spherical zeros, which have infinite cardinality. We prove analogs of the Jensen and Carathéodory theorems in this setting.
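For orientation, the classical complex-variable relation between growth and coefficients, which the quaternionic results of Chapter 5 parallel, expresses the order $\rho$ of an entire function $f(z) = \sum_n a_n z^n$ in terms of its Taylor coefficients:

```latex
\rho = \limsup_{n \to \infty} \frac{n \log n}{-\log |a_n|}
```

This is the standard holomorphic benchmark, quoted here only for context; the slice regular analogs require the non-commutative adjustments discussed in the text.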
Rainy day funds: can they make a difference in Europe?
Rainy Day Funds (RDFs) have an important role in the USA. They allow States – which usually have rules requiring a balanced budget for current revenue and spending – to limit procyclical fiscal policies. This paper examines the possible role of RDFs in the European fiscal framework. The analysis suggests that RDFs would not fundamentally alter the incentive problems at the root of the difficulties in the implementation of the Stability and Growth Pact. Moreover, RDFs are not an option for countries with high deficits. However, for low-deficit countries, RDFs can lessen the rigidity of the 3 per cent threshold in bad times. RDFs could be introduced on a voluntary basis at the national level and could contribute to making the rules more country-specific. The introduction of RDFs would require a change in the definition of the “Maastricht deficit”: deposits and withdrawals should be considered respectively as budget expense and revenue. In this way, the balances held in RDFs could be spent in bad times without an increase in the deficit. To ensure that RDFs are not used opportunistically, deposits should only be made out of budget surpluses and circumstances allowing withdrawals should be specified ex ante.

Keywords: rainy day funds, fiscal rules, EMU
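The proposed accounting redefinition is simple arithmetic and can be made concrete. The figures below (per cent of GDP) are invented for illustration only.

```python
# Redefined "Maastricht deficit": RDF deposits count as budget expense,
# RDF withdrawals as budget revenue (positive result = deficit).

def maastricht_deficit(revenue, spending, rdf_deposit=0.0, rdf_withdrawal=0.0):
    return (spending + rdf_deposit) - (revenue + rdf_withdrawal)

# Good year: a 1% surplus is fully deposited into the RDF,
# so the recorded balance is exactly zero rather than a surplus
good = maastricht_deficit(revenue=46.0, spending=45.0, rdf_deposit=1.0)

# Bad year: spending exceeds revenue by 3.5%, but a 1% RDF withdrawal
# keeps the recorded deficit below the 3% threshold
bad = maastricht_deficit(revenue=44.0, spending=47.5, rdf_withdrawal=1.0)
```

This illustrates the abstract's point: balances accumulated in good times can be spent in bad times without pushing the recorded deficit over the ceiling.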
The Reliability of EMU Fiscal Indicators: Risks and Safeguards
The reliability of EMU’s fiscal indicators has been questioned by recent episodes of large upward deficit revisions. This paper discusses the causes of such revisions in order to identify ways to improve monitoring. The computation of EMU’s deficit indicator involves the assessment of accrued revenue and expenditure and the identification of transactions in financial assets. Both can open margins for opportunistic accounting. However, cross-checks between deficit and changes in gross nominal debt (the other fiscal indicator used in EMU) can reduce the scope for window dressing. Simple comparison of deficit and changes in debt can readily spotlight large inconsistencies in fiscal data. Nevertheless, consistency checks must go deeper than simple comparison, since different items in the reconciliation account between deficit and change in debt can offset each other. Econometric evidence suggests that such offsets may indeed have been used to reduce the visibility of deficit-specific window dressing. Attention to the quality of statistics has increased in recent years, also in the context of the reform of the Stability and Growth Pact. In this context, the paper argues that detailed analysis of the reconciliation account between deficit and change in debt is crucial to the effectiveness of monitoring.

Keywords: EMU, fiscal rules, fiscal indicators, stock-flow adjustment
Detectability of Macroscopic Structures in Directed Asymmetric Stochastic Block Model
We study the problem of identifying macroscopic structures in networks, characterizing the impact of introducing link directions on the detectability phase transition. To this end, building on the stochastic block model, we construct a class of hardly detectable directed networks. Using the belief propagation method, we find closed-form solutions showing how the transition line depends on the assortativity and the asymmetry of the network. Finally, we numerically identify the existence of a hard phase for detection close to the transition point.
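For context, in the undirected two-group stochastic block model with average intra- and inter-group degrees $c_{\mathrm{in}}$ and $c_{\mathrm{out}}$, the well-known detectability condition reads:

```latex
|c_{\mathrm{in}} - c_{\mathrm{out}}| > 2\sqrt{c}, \qquad c = \frac{c_{\mathrm{in}} + c_{\mathrm{out}}}{2}
```

This undirected benchmark is quoted for orientation only; how the transition line deforms with link directions and asymmetry is precisely what the paper derives.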