
    The fate of one-dollar coins in the U.S.

    Get PDF
    The United States has introduced two one-dollar coins in the past 25 years, neither of which has circulated widely. Many other countries have replaced lower-denomination notes with coins and have achieved wide circulation and cost savings. Lessons from those countries suggest that achieving widespread use of a dollar coin is much harder if the note is allowed to remain in circulation.
    Keywords: Dollar, American; Coinage

    A Reproducible Study on Remote Heart Rate Measurement

    Get PDF
    This paper studies the problem of reproducible research in remote photoplethysmography (rPPG). Most of the work published in this domain is assessed on privately owned databases, making it difficult to evaluate proposed algorithms in a standard and principled manner. To address this, we present a new, publicly available database containing a relatively large number of subjects recorded under two different lighting conditions. In addition, three state-of-the-art rPPG algorithms from the literature were selected, implemented and released as open-source free software. After a thorough, unbiased experimental evaluation in various settings, it is shown that none of the selected algorithms is precise enough to be used in a real-world scenario.
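    For readers unfamiliar with rPPG, a deliberately naive pipeline looks roughly like the sketch below: average the green channel over a face region, band-pass to plausible heart-rate frequencies, and take the spectral peak. The function name and the fixed face box are illustrative assumptions; this is not one of the algorithms evaluated in the paper.
    import numpy as np
    from scipy.signal import butter, filtfilt
    def estimate_heart_rate(frames, fps, roi):
        """Minimal illustrative rPPG estimate.
        frames: (T, H, W, 3) uint8 video array, roi: (y0, y1, x0, x1) face box."""
        y0, y1, x0, x1 = roi
        # Mean green-channel intensity per frame: the raw pulse signal.
        g = frames[:, y0:y1, x0:x1, 1].reshape(len(frames), -1).mean(axis=1)
        g = g - g.mean()
        # Band-pass 0.7-4 Hz (roughly 40-240 bpm).
        b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
        pulse = filtfilt(b, a, g)
        # Dominant frequency converted to beats per minute.
        freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
        spectrum = np.abs(np.fft.rfft(pulse))
        return 60.0 * freqs[np.argmax(spectrum)]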

    Geometric quenches in quantum integrable systems

    Full text link
    We consider the generic problem of suddenly changing the geometry of an integrable, one-dimensional many-body quantum system. We show how the physics of an initial quantum state released into a bigger system can be completely described within the framework of the Algebraic Bethe Ansatz, by providing an exact decomposition of the initial state into the eigenstate basis of the system after such a geometric quench. Our results, applicable to a large class of models including the Lieb-Liniger gas and Heisenberg spin chains, thus offer a reliable framework for the calculation of time-dependent expectation values and correlations in this nonequilibrium situation.
    Comment: 8 pages
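    As a rough sketch of the decomposition described above (generic notation with hbar = 1, not the paper's conventions): once the overlaps of the pre-quench state with the post-quench eigenbasis are known, time-dependent expectation values follow directly.
    % Illustrative only: |psi_0> is the initial state, {|n>} the eigenbasis
    % (energies E_n) of the larger post-quench system, O an observable.
    \begin{align}
      |\psi_0\rangle &= \sum_n c_n\, |n\rangle ,
      \qquad c_n = \langle n | \psi_0 \rangle , \\
      \langle \mathcal{O}(t) \rangle &= \sum_{n,m} c_n^{*} c_m\,
        e^{\, i (E_n - E_m) t}\, \langle n | \mathcal{O} | m \rangle .
    \end{align}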

    Overcoming limitations of nanomechanical resonators with simultaneous resonances

    Get PDF
    Dynamic stabilization by simultaneous primary and superharmonic resonances for high-order nonlinearity cancellation is demonstrated with an electrostatically actuated, piezoresistively transduced nanomechanical resonator. We show experimentally how the combination of third-order nonlinearity cancellation and simultaneous resonances can be used to linearly drive a nanocantilever up to very large amplitudes compared with fundamental limits such as pull-in occurrence, opening the way towards resonators with high frequency stability for high-performance sensing or time references.
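    As a rough illustration of the third-order cancellation mentioned above (a generic lumped Duffing-type model, not the authors' exact equations), the electrostatic softening contribution to the cubic stiffness can be tuned against the mechanical hardening one, leaving higher-order terms as the next limit, which the simultaneous primary/superharmonic drive is then used to stabilize.
    % Schematic lumped model; coefficients and units are illustrative.
    \begin{equation}
      \ddot{x} + \frac{\omega_0}{Q}\,\dot{x} + \omega_0^2 x
        + \alpha_3 x^3 + \alpha_5 x^5 = \frac{F(t)}{m_\mathrm{eff}},
      \qquad
      \alpha_3 = \alpha_3^{\mathrm{mech}} + \alpha_3^{\mathrm{elec}}(V_{\mathrm{dc}}) \approx 0 .
    \end{equation}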

    Subjective beliefs formation and elicitation rules : experimental evidence

    Get PDF
    As they have become increasingly used in economics, elicitation rules for subjective beliefs have come under scrutiny. In this paper, we propose an experimental design to compare the performance of such rules. Contrary to previous work, in which elicited beliefs are compared to an objective benchmark, we consider a purely subjective belief framework (confidence in one's own performance in a cognitive task and a perceptual task). The performance of each elicitation rule is assessed according to the accuracy of stated beliefs in predicting success. For the perceptual task we also compare stated beliefs to Signal Detection Theory predictions. We find consistent evidence in favor of the Lottery Rule, which provides more accurate beliefs and is not sensitive to risk aversion. Furthermore, the Free Rule, a simple rule with no incentives, elicits relevant beliefs and even outperforms the Quadratic Scoring Rule. Beyond this comparison, we propose a belief formation model in which we distinguish between two stages of beliefs: beliefs used for decision making and confidence beliefs. Our results give support to this model.
    Keywords: belief elicitation, confidence, Signal Detection Theory, methodology, incentives, experimental economics
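    For reference, a common textbook formulation of the two incentivized mechanisms named above is sketched below for a binary event; the exact parametrization used in the experiment may differ, so this is only an illustration of why the lottery-style rule is robust to risk attitudes while the quadratic score need not be.
    import random
    def quadratic_scoring_rule(reported_p, event_occurred, scale=1.0):
        """Binary QSR payoff: truth-telling maximizes expected payoff,
        but reports can be distorted by risk aversion."""
        outcome = 1.0 if event_occurred else 0.0
        return scale * (1.0 - (reported_p - outcome) ** 2)
    def lottery_rule(reported_p, event_occurred, prize=1.0, rng=random):
        """Matching-probability ('lottery') rule: draw a random cutoff q;
        above the report the subject is paid by a q-lottery, otherwise payment
        is tied to the event. Truthful reporting is optimal regardless of risk
        attitude, consistent with the insensitivity to risk aversion noted above."""
        q = rng.random()
        if q > reported_p:
            return prize if rng.random() < q else 0.0   # paid by the q-lottery
        return prize if event_occurred else 0.0          # paid by the event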

    A new multipath mitigation method for GNSS receivers based on antenna array

    Get PDF
    The potential of a small antenna array for multipath mitigation in GNSS systems is considered in this paper. To discriminate the different incoming signals (line of sight and multipaths), a new implementation of the well-known SAGE algorithm is proposed. This allows a significant complexity reduction and is fully compatible with conventional GNSS receivers. A theoretical study based on the Cramér-Rao bound derivation, together with tracking simulation results (in static and dynamic scenarios), shows that the proposed method is a very promising approach to the multipath mitigation problem in GNSS receivers.

    A new tracking approach for multipath mitigation based on antenna array

    Get PDF
    In Global Navigation Satellite Systems (GNSS), multipath (MP) is still one of the major error sources. The additional signal replicas due to reflections introduce a bias in conventional Delay Lock Loops (DLL), which ultimately causes a strong positioning error. Several techniques based on Maximum Likelihood (ML) estimation have been developed for multipath mitigation/estimation, such as the narrow correlator spacing [1] or the Multipath Estimating Delay-Lock-Loop (MEDLL) [2]. These techniques try to discriminate the MP from the Line Of Sight Signal (LOSS) in the time and frequency domains, and thus short-delay multipaths (< 0.1 chips) cannot be completely mitigated.

    An antenna array performs a spatial sampling of the wave front, which makes it possible to discriminate the sources in the space domain (azimuth and elevation). As the time-delay and space domains can be assumed independent, we can expect to mitigate/estimate very short delay MP by using an antenna array. However, we do not want to increase the size, complexity and cost of the receivers too much, so we focus our study on small arrays with a small number of antennas: typically a square 2x2 array. Consequently, conventional beamforming (spatial Fast Fourier Transform) is not directive enough to ensure the mitigation of the multipaths, and this first class of solutions was therefore rejected. In order to improve the resolution, adaptive beamformers have also been tested. However, the LOSS and the MP signal are strongly correlated, and thus classical adaptive algorithms [3] are not able to discriminate the sources. These preliminary studies have shown that multipath mitigation/estimation based on the space domain alone exhibits limited performance in the presence of close sources.

    Therefore, in order to propose robust algorithms, we decided to investigate a space-time-frequency estimation of the sources. The Space Alternating Generalized Expectation maximisation (SAGE) algorithm [4], a low-complexity generalization of the Expectation Maximisation (EM) algorithm, has been considered. The basic concept of the SAGE algorithm is the hidden data space [4]. Instead of estimating the parameters of all impinging waves in parallel in one iteration step, as done by the EM algorithm, the SAGE algorithm estimates the parameters of each signal sequentially. Moreover, the SAGE algorithm breaks down the multi-dimensional optimization problem into several smaller problems. In [5], it can be seen that the SAGE algorithm is efficient for any multipath configuration (small relative delays, close DOAs) and that the space-time-frequency approach clearly outperforms classical time-frequency approaches.

    Nevertheless, SAGE is a post-processing algorithm, so the incoming signal must be stored in the receiver in order to apply the SAGE estimation. For example, to process 10 ms of signal with a 10 MHz sampling rate, we need to store a matrix of size m x 10^5, with m the number of antennas. Under such conditions, it is clear that the SAGE algorithm can hardly be implemented in real time. The challenge is then to find a new type of algorithm that reaches the efficiency of the SAGE algorithm, but with a reduced complexity in order to enable real-time processing. Furthermore, the implementation should be compatible with conventional GNSS tracking loops (DLL and PLL). To cope with these two constraints, we propose to apply the SAGE algorithm to the post-correlation signal. Indeed, the correlation step can be seen as a compression step, and thus the size of the processed signal is strongly reduced. In this way, the SAGE algorithm is able to provide estimates of the relative delay and Doppler of the received signals with respect to the local replicas. Thus, a post-correlation implementation of SAGE can be seen as a discriminator for both the DLL and the PLL.
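    The sketch below illustrates the "correlation as compression" point with the numbers quoted above; the correlator-bank function, the toy PRN-like code and the early/prompt/late taps are illustrative assumptions, not the authors' implementation.
    import numpy as np
    def correlator_bank(raw, code_replica, delays_in_samples):
        """Compress raw samples of shape (m_antennas, n_samples) into correlator
        outputs of shape (m_antennas, n_taps) by correlating against a local code
        replica at a few candidate delays (e.g. early/prompt/late). A
        post-correlation SAGE could then estimate LOS/multipath delay and Doppler
        from this much smaller matrix and feed the DLL/PLL discriminators."""
        outputs = np.empty((raw.shape[0], len(delays_in_samples)), dtype=complex)
        for k, d in enumerate(delays_in_samples):
            replica = np.roll(code_replica, d)
            outputs[:, k] = raw @ np.conj(replica) / raw.shape[1]
        return outputs
    # Numbers from the abstract: 10 ms at 10 MHz, m = 4 antennas (2x2 array)
    # -> a 4 x 100000 buffer for pre-correlation SAGE, versus a 4 x 3 matrix
    # after compression by an early/prompt/late correlator bank.
    m, n = 4, int(10e6 * 10e-3)
    rng = np.random.default_rng(0)
    raw = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
    code = np.sign(rng.standard_normal(n)).astype(complex)  # toy PRN-like code
    print(correlator_bank(raw, code, delays_in_samples=[-5, 0, 5]).shape)  # (4, 3)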

    A Quantitative Flavour of Robust Reachability

    Full text link
    Many software analysis techniques attempt to determine whether bugs are reachable, but for security purposes this is only part of the story, as it does not indicate whether the bugs found could be easily triggered by an attacker. The recently introduced notion of robust reachability aims at filling this gap by distinguishing the inputs controlled by the attacker from those that are not. Yet this qualitative notion may be too strong in practice, leaving aside bugs that are mostly but not fully replicable. Here we propose a quantitative version of robust reachability that is more flexible and still amenable to automation. We propose quantitative robustness, a metric expressing how easily an attacker can trigger a bug while taking into account that they can only influence part of the program input, together with a dedicated quantitative symbolic execution technique (QRSE). Interestingly, QRSE relies on a variant of model counting (namely, functional E-MAJSAT) unseen so far in formal verification, but which has been studied in AI domains such as Bayesian networks, knowledge representation and probabilistic planning. Yet the existing solving methods from these fields turn out to be unsatisfactory for formal verification purposes, leading us to propose a novel parametric method. These results have been implemented and evaluated on two security-relevant case studies, demonstrating the feasibility and relevance of our ideas.
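    One natural reading of the metric, and of the "exists-then-count" structure behind functional E-MAJSAT, can be shown by brute force on a toy example; the bug condition and input domains below are invented for illustration and are unrelated to the paper's case studies.
    from itertools import product  # not needed here, but handy for tuple-valued inputs
    def bug(a, x):
        """Toy bug condition over an attacker-controlled byte a and an
        uncontrolled byte x (purely illustrative)."""
        return (a ^ x) == 0x42 or a == 0x7F
    def quantitative_robustness(controlled, uncontrolled, trigger=bug):
        """Max over attacker choices of the fraction of uncontrolled inputs that
        trigger the bug: 1.0 recovers qualitative robust reachability, while
        small values correspond to bugs that are reachable but hard to replicate."""
        best = 0.0
        for a in controlled:
            hits = sum(trigger(a, x) for x in uncontrolled)
            best = max(best, hits / len(uncontrolled))
        return best
    print(quantitative_robustness(range(256), range(256)))  # 1.0: a = 0x7F always triggers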

    Launching of a New Currency in a Simple Random Matching Model

    Get PDF
    This paper studies the launching of a new fiat currency within a search-theoretic framework. We show that legal tender laws may not be sufficient to guarantee the acceptability of the new currency, and that the withdrawal of a large fraction of the competing currency is essential to avoid the failure of such a launching. The possibility of converting the old currency into the new one can ease the transition to the new currency only if it is combined with strict legal tender laws. Finally, a network externality is identified that may generate inefficiencies in the conversion decision.
    Keywords: money; search; currency reform; legal tender laws