9 research outputs found

    Loss data analysis: Analysis of the sample dependence in density reconstruction by maxentropic methods

    The problem of determining probability densities of positive random variables from empirical data is important in many fields, in particular in insurance and risk analysis. The method of maximum entropy has proven to be a powerful tool for determining a probability density from a few values of its Laplace transform, even when the amount of data available to compute the Laplace transform numerically is small. In that case, however, the variability of the reconstruction due to sample variability in the available data can lead to quite different results. The purpose of this note is to quantify as much as possible the variability of the densities reconstructed by two maxentropic methods: the standard maximum entropy method and its extension that incorporates data with errors.
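
    A minimal sketch of the standard maximum entropy reconstruction described above, under assumptions of our own (a Gamma-distributed test sample, four arbitrary Laplace-transform points alpha_j, and a simple quadrature grid); the change of variables Y = exp(-S) turns the Laplace-transform values into fractional moments on (0, 1]:

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative data (an assumption, not the note's data): a positive
        # sample S plays the role of the empirical observations.
        rng = np.random.default_rng(0)
        sample = rng.gamma(shape=2.0, scale=1.0, size=200)

        # A few values of the Laplace transform, mu_j = E[exp(-alpha_j * S)],
        # estimated from the sample; the alpha_j grid is arbitrary here.
        alphas = np.array([0.5, 1.0, 1.5, 2.0])
        mus = np.array([np.mean(np.exp(-a * sample)) for a in alphas])

        # Work with Y = exp(-S) in (0, 1]: the constraints become
        # E[Y**alpha_j] = mu_j and the maxent density is
        # f_Y(y) = exp(-lambda_0 - sum_j lambda_j * y**alpha_j).
        y = np.linspace(1e-6, 1.0, 4000)
        dy = y[1] - y[0]

        def dual(lmbda):
            # Convex dual of the SME problem: log Z(lambda) + <lambda, mu>.
            z = np.sum(np.exp(-(y[:, None] ** alphas) @ lmbda)) * dy
            return np.log(z) + lmbda @ mus

        lmbda = minimize(dual, x0=np.zeros(alphas.size), method="Nelder-Mead").x

        # Reconstructed density of Y, mapped back to S via
        # f_S(x) = exp(-x) * f_Y(exp(-x)).
        z = np.sum(np.exp(-(y[:, None] ** alphas) @ lmbda)) * dy
        f_y = np.exp(-(y[:, None] ** alphas) @ lmbda) / z
        x = np.linspace(0.0, 10.0, 500)
        f_s = np.exp(-x) * np.interp(np.exp(-x), y, f_y)

    Re-running this sketch on resampled data gives a direct feel for the sample dependence the note quantifies: each new sample changes the mu_j slightly and hence the reconstructed density.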

    Maxentropic and quantitative methods in operational risk modeling

    In risk management, estimating the distribution of random sums (collective models) from historical data is not a trivial problem. This is due to the scarcity of the data and to asymmetries and heavy tails that make it difficult to fit the data with the most frequently used distributions and existing methods. In this work we show that the maximum entropy approach has important applications in risk management and insurance mathematics for calculating the density of aggregated risk events, and even the density of the individual losses underlying the aggregated data, when the available information consists only of an observed sample and nothing is known about the underlying process. From a few fractional moments, the maxentropic methodologies provide an efficient way to determine densities when the data is scarce, or when the data is correlated, heavy-tailed or multimodal. The input to the procedure is either the sample moments E[e^{-αS}] = μ(α) for eight values of the Laplace transform, or an interval enclosing the difference between the true value of μ(α) and the sample moments; this interval reflects the uncertainty (error) in the data, and its width may be adjusted as convenient. Through a simulation study we analyze the quality of the results, considering the differences with respect to the true density and, in some cases, the size of the gradient and the convergence time. We compare four maxentropic methodologies: the Standard Method of Maximum Entropy (SME); an extension that incorporates additional information through a reference measure, called the Method of Entropy in the Mean (MEM); and two extensions of the SME that allow for errors in the data, called SME with errors (SMEE). Although our motivating example comes from the field of operational risk analysis, the developed methodology may be applied to any branch of the applied sciences. Doctoral Programme in Business Economics and Quantitative Methods, Universidad Carlos III de Madrid. Thesis committee: Chair, Alejandro Balbás de la Corte; Secretary, Argimiro Arriata Quesada; Member, Santiago Carrillo Menénde
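
    As a concrete illustration of the inputs just described, the sketch below builds a compound-sum sample, the eight sample moments E[e^{-α_j S}], and an interval around each of them; every numerical choice (the frequency and severity laws, the α_j grid, the two-standard-error half-width) is an assumption for demonstration only, not a value taken from the thesis:

        import numpy as np

        # Illustrative aggregate-loss sample (assumptions: Poisson frequency,
        # lognormal severities).
        rng = np.random.default_rng(1)
        n_obs = 150
        freqs = rng.poisson(lam=5.0, size=n_obs)
        losses = np.array([rng.lognormal(mean=0.0, sigma=1.0, size=n).sum() for n in freqs])

        # Eight Laplace-transform points, as in the text; the spacing of the
        # alpha_j is an assumption.
        alphas = np.arange(1, 9) / 8.0
        mus = np.array([np.mean(np.exp(-a * losses)) for a in alphas])

        # Interval around each sample moment for the error-tolerant (SMEE)
        # variants: the half-width, here two standard errors, is the tunable
        # width the abstract refers to.
        half_width = 2.0 * np.array([np.std(np.exp(-a * losses), ddof=1) for a in alphas]) / np.sqrt(n_obs)
        intervals = np.column_stack([mus - half_width, mus + half_width])

    The exact-constraint methods (SME, MEM) would consume mus directly, while the SMEE variants only require the reconstruction to land inside intervals.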

    BLIND SOURCE SEPARATION USING MAXIMUM ENTROPY PDF ESTIMATION BASED ON FRACTIONAL MOMENTS

    Recovering a set of independent sources that have been linearly mixed is the main task of blind source separation. Methods such as the infomax principle, mutual information and maximum likelihood lead to simple iterative procedures such as natural gradient algorithms. These algorithms depend on a nonlinear function (known as the score or activation function) of the source distributions. Since there is no prior knowledge of the source distributions, the optimality of the algorithms rests on the choice of a suitable parametric density model. In this paper, we propose an adaptive optimal score function based on the fractional moments of the sources. To obtain a parametric model for the source distributions, we use a few sampled fractional moments to construct a maximum entropy probability density function (PDF) estimate. By applying an optimization method we obtain the optimal fractional moments that best fit the source distributions. Using fractional moments (FM) instead of integer moments makes the maximum entropy PDF estimate converge to the true PDF much faster. The simulation results show that, unlike most previously proposed models for the nonlinear score function, which are limited to particular source families such as sub-Gaussian and super-Gaussian sources or to particular distribution models such as the generalized Gaussian distribution, our new model achieves better results for every source signal without any prior assumption about its randomness behavior.
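
    A minimal sketch of the natural gradient separation rule the abstract builds on; the adaptive maximum entropy score of the paper is replaced here by a fixed tanh nonlinearity, and the Laplace test sources, step size and iteration count are assumptions:

        import numpy as np

        def natural_gradient_bss(x, n_iter=2000, eta=0.01, score=np.tanh, seed=0):
            # x: mixed observations, shape (n_sources, n_samples).
            # score: elementwise score (activation) function phi.  The paper
            # derives phi adaptively from a maximum entropy PDF built on
            # fractional moments; np.tanh is only a fixed stand-in here and,
            # as the abstract notes for such fixed choices, it suits
            # super-Gaussian sources only.
            rng = np.random.default_rng(seed)
            n, m = x.shape
            w = np.eye(n) + 0.01 * rng.standard_normal((n, n))
            for _ in range(n_iter):
                y = w @ x
                phi = score(y)
                # Natural gradient update: W <- W + eta * (I - E[phi(y) y^T]) W
                w += eta * (np.eye(n) - (phi @ y.T) / m) @ w
            return w

        # Usage with two independent super-Gaussian (Laplace) sources mixed
        # by a random matrix; recovery is up to scaling and permutation.
        rng = np.random.default_rng(3)
        s = rng.laplace(size=(2, 5000))
        a = rng.standard_normal((2, 2))
        x = a @ s
        y_est = natural_gradient_bss(x) @ x

    Swapping the fixed score for one derived from the fitted maximum entropy PDF is exactly the modification the paper proposes.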

    Enhanced coding, clock recovery and detection for a magnetic credit card

    This thesis describes the background, investigation and construction of a system for storing data on the magnetic stripe of a standard three-inch plastic credit card. Investigation shows that the information storage limit within a 3.375 in by 0.11 in rectangle of the stripe is bounded to about 20 kBytes. Practical issues limit the data storage to around 300 Bytes with a low raw error rate: a four-fold density increase over the standard. Removal of the timing jitter (probably caused by the magnetic medium particle size) would increase the limit to 1500 Bytes with no other system changes. This is enough capacity for either a small digital passport photograph or a digitized signature, making it possible to remove printed versions from the surface of the card. Achieving even these modest gains has required the development of a new variable-rate code that is more resilient to timing errors than other codes in its efficiency class. Tabulating the effects of timing errors required the construction of a new code metric and self-recovering decoders. In addition, a new method of timing recovery, based on signal 'snatches', has been invented to increase the rapidity with which a Bayesian decoder can track the changing velocity of a hand-swiped card. The timing recovery and Bayesian detector have been integrated into one computational (software) unit that is self-contained and can decode a general class of (d, k) constrained codes. Additionally, the unit has a signal truncation mechanism to alleviate some of the effects of non-linear distortion that are present when a magnetic card is read with a magneto-resistive sensor that has been driven beyond its bias magnetization. While the storage density is low and the total storage capacity is meagre in comparison with contemporary storage devices, the high-density card may still have a niche role to play in society. Nevertheless, in the face of the smart card its long-term outlook is uncertain. However, several areas of coding and detection under short-duration extreme conditions have brought new decoding methods to light. The scope of these methods is not limited to the credit card.
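
    The (d, k) constraint mentioned above can be stated compactly; the sketch below is not the thesis's decoder, only a check that a binary transition sequence respects the constraint, with MFM's (1, 3) constraint as the worked example:

        def satisfies_dk(bits, d, k):
            # Check the (d, k) run-length constraint: every pair of consecutive
            # 1s must be separated by at least d and at most k zeros.  In
            # recording codes the 1s mark flux transitions, so d controls the
            # minimum transition spacing (limiting intersymbol interference)
            # and k bounds the gap between transitions so that the clock can
            # be recovered from the data itself.
            run = None  # zeros seen since the last 1; None until the first 1
            for b in bits:
                if b == 1:
                    if run is not None and not (d <= run <= k):
                        return False
                    run = 0
                elif run is not None:
                    run += 1
                    if run > k:
                        return False
            return True

        # MFM-style sequences obey the (1, 3) constraint.
        print(satisfies_dk([1, 0, 1, 0, 0, 1, 0, 0, 0, 1], d=1, k=3))  # True
        print(satisfies_dk([1, 1, 0, 1], d=1, k=3))                    # False (adjacent 1s)

    A decoder for such codes, like the unit described in the thesis, must respect this constraint while also tolerating the timing jitter of a hand-swiped card.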

    Robust Techniques for Signal Processing: A Survey

    Coordinated Science Laboratory was formerly known as Control Systems Laboratory. U.S. Army Research Office / DAAG29-81-K-0062; U.S. Air Force Office of Scientific Research / AFOSR 82-0022; Joint Services Electronics Program / N00014-84-C-0149; National Science Foundation / ECS-82-12080; U.S. Office of Naval Research / N00014-80-K-0945 and N00014-81-K-001

    Modulation codes


    Model reconstruction for moment-based stochastic chemical kinetics

    Based on the theory of stochastic chemical kinetics, the inherent randomness and stochasticity of biochemical reaction networks can be accurately described by discrete-state continuous-time Markov chains, where each chemical reaction corresponds to a state transition of the process. However, the analysis of such processes is computationally expensive and requires sophisticated numerical methods. The main complication is the largeness of the state space, so analysis techniques based on an exploration of the state space are often not feasible, and integrating the moments of the underlying probability distribution has become a popular alternative. In this thesis we propose an analysis framework in which we integrate a number of moments of the process instead of the state probabilities. This results in a more time-efficient simulation of the time evolution of the process. In order to regain the state probabilities from the moment representation, we combine the moment-based simulation (MM) with a maximum entropy approach: the maximum entropy principle is applied to derive the distribution that best fits a given sequence of moments. We further extend this approach by incorporating the method of conditional moments (MCM), which allows us not only to reconstruct the distribution of the species present in high amounts in the system, but also to approximate the probabilities of species with low molecular counts. For the given distribution reconstruction framework, we investigate the numerical accuracy and stability using case studies from systems biology, compare the two moment approximation methods (MM and MCM), examine whether the framework can be used for the reaction rate estimation problem, and describe possible future applications.
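
    A minimal sketch of the moment-to-distribution step described above: given the first K raw moments of a molecule count truncated to {0, ..., N}, the maximum entropy distribution p(x) proportional to exp(-sum_k lambda_k x^k) is found by minimizing the convex dual. The Poisson target and the values of N and K are assumptions for illustration only, standing in for moments delivered by the moment-based simulation (MM/MCM):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import logsumexp
        from scipy.stats import poisson

        # Stand-in for the moments produced by the moment-based simulation:
        # the first K raw moments of a truncated Poisson copy-number
        # distribution (an assumption, not a case study from the thesis).
        N, K = 60, 4
        states = np.arange(N + 1)
        p_true = poisson.pmf(states, mu=12.0)
        p_true /= p_true.sum()
        moments = np.array([np.sum(states ** k * p_true) for k in range(1, K + 1)])

        # Rescale the state space to [0, 1] so the powers stay well conditioned.
        x = states / N
        m = moments / N ** np.arange(1, K + 1)
        powers = np.vstack([x ** k for k in range(1, K + 1)])   # shape (K, N+1)

        def dual(lmbda):
            # Convex dual of the maximum entropy problem: log Z(lambda) + <lambda, m>.
            return logsumexp(-(lmbda @ powers)) + lmbda @ m

        lmbda = minimize(dual, x0=np.zeros(K), method="BFGS").x

        # Reconstructed distribution p(x) = exp(-sum_k lambda_k x^k) / Z.
        logits = -(lmbda @ powers)
        p_rec = np.exp(logits - logsumexp(logits))

    At the optimum the reconstructed distribution reproduces the supplied moments up to numerical tolerance; in the thesis's setting the same reconstruction is driven by the moments integrated over time rather than by a known target distribution.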