
    Web services synchronization health care application

    With the advance of Web Services technologies and the emergence of Web services into the information space, tremendous opportunities for empowering users and organizations appear in various application domains, including electronic commerce, travel, intelligence information gathering and analysis, health care, and digital government. Web services appear to be a solution for integrating distributed, autonomous and heterogeneous information sources. However, because Web services evolve in a dynamic environment, the Internet, many changes can occur and affect them. A Web service is affected when one or more of its associated information sources changes. Changes can alter not only the contents of information sources but also their schemas, which may render Web services partially or totally undefined. In this paper, we propose a solution for integrating information sources into Web services. We then tackle the Web service synchronization problem by substituting the affected information sources. Our work is illustrated with a healthcare case study. Comment: 18 pages, 12 figures
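    As a toy illustration of the substitution idea above, the Python sketch below binds a service to information sources and swaps out a source whose schema change would leave the service undefined. All class and attribute names are hypothetical, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation): a Web service
# bound to several information sources substitutes an affected source
# with a compatible candidate after a schema change.

from dataclasses import dataclass, field

@dataclass
class InformationSource:
    name: str
    schema: frozenset  # attribute names exposed by the source

@dataclass
class WebService:
    name: str
    required: frozenset                  # attributes the service depends on
    sources: list = field(default_factory=list)

    def is_defined(self):
        """The service stays defined while every required attribute
        is covered by some bound source."""
        covered = set().union(*(s.schema for s in self.sources)) if self.sources else set()
        return self.required <= covered

    def synchronize(self, affected, candidates):
        """Replace an affected source with a candidate whose schema
        still covers the attributes the service needed from it."""
        needed = self.required & affected.schema
        for cand in candidates:
            if needed <= cand.schema:
                self.sources = [cand if s is affected else s for s in self.sources]
                return cand
        return None  # service remains partially/totally undefined

# Usage: a hospital records source changes schema; substitute it.
old = InformationSource("hospital_db_v1", frozenset({"patient_id", "diagnosis"}))
new = InformationSource("hospital_db_v2", frozenset({"patient_id", "diagnosis", "icd_code"}))
svc = WebService("patient_lookup", frozenset({"patient_id", "diagnosis"}), [old])
svc.synchronize(old, [new])
assert svc.is_defined()
```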

    Scaling Models for the Severity and Frequency of External Operational Loss Data

    According to Basel II criteria, the use of external data is absolutely indispensable to the implementation of an advanced method for calculating operational capital. This article investigates how the severity and frequency of external losses are scaled for integration with internal data. We first set up a model designed to explain loss severity. This model takes into account firm size, location, and business line, as well as risk type. It also shows how to calculate the internal loss equivalent to an external loss that might occur in a given bank. OLS estimation results show that the above variables have significant power in explaining loss amounts, and they are used to develop a normalization formula. A second model based on external data is developed to scale the frequency of losses over a given period. Two regression models are analyzed: the truncated Poisson model and the truncated negative binomial model. Variables estimating the size and geographical distribution of the banks' activities are introduced as explanatory variables. The results show that the negative binomial distribution outperforms the Poisson distribution. The scaling is done by calculating the parameters of the selected distribution from the estimated coefficients and the variables related to a given bank. Frequencies of losses of more than $1 million are then generated over a specific horizon.
    Keywords: operational risk in banks; scaling; severity distribution; frequency distribution; truncated count data regression models
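    The sketch below shows, on simulated data, how a zero-truncated negative binomial frequency model of the kind described above can be fitted by maximum likelihood. The covariates, sample, and parameterization are illustrative assumptions, not the paper's data or code.

```python
# Minimal sketch (illustrative data) of a zero-truncated negative
# binomial regression fitted by maximum likelihood, as used to scale
# external loss frequencies above a reporting threshold.

import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Toy design matrix: intercept + one size proxy (e.g., log assets).
n_obs = 500
X = np.column_stack([np.ones(n_obs), rng.normal(size=n_obs)])
true_beta, true_size = np.array([0.5, 0.8]), 2.0
mu = np.exp(X @ true_beta)
y = rng.negative_binomial(true_size, true_size / (true_size + mu))
X, y = X[y > 0], y[y > 0]               # keep the zero-truncated sample

def neg_loglik(params):
    beta, log_size = params[:-1], params[-1]
    size = np.exp(log_size)             # keep dispersion > 0 via log transform
    mu = np.exp(X @ beta)
    p = size / (size + mu)              # scipy's (n, p) parameterization
    logpmf = stats.nbinom.logpmf(y, size, p)
    log_trunc = np.log1p(-stats.nbinom.pmf(0, size, p))  # log P(Y > 0)
    return -(logpmf - log_trunc).sum()

res = optimize.minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
print("beta_hat:", res.x[:-1], "size_hat:", np.exp(res.x[-1]))
```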

    Top-Down Behavioral Modeling Methodology of a Piezoelectric Microgenerator For Integrated Power Harvesting Systems

    In this study, we developed a top-down methodology for the behavioral and structural modeling of multi-domain microsystems. We then validated this methodology through a case study: a piezoelectric microgenerator. We also demonstrated the effectiveness of the VHDL-AMS language, not only for modeling at the behavioral and structural levels but also for writing physical models that can predict experimental results. Finally, we validated these models by presenting and discussing simulation results. Comment: Submitted on behalf of EDA Publishing Association (http://irevues.inist.fr/handle/2042/16838)
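    As a rough companion to this abstract, here is a behavioral model of a piezoelectric microgenerator written as coupled ODEs in Python rather than VHDL-AMS. The lumped two-domain equations are the standard textbook form; all parameter values are invented for illustration.

```python
# Behavioral sketch of the standard lumped electromechanical model of a
# piezoelectric microgenerator (the paper itself uses VHDL-AMS).
# Parameter values are illustrative, not taken from the paper.

import numpy as np
from scipy.integrate import solve_ivp

m, d, k = 1e-6, 2e-4, 100.0        # mass (kg), damping (N.s/m), stiffness (N/m)
theta = 1e-4                       # electromechanical coupling (N/V)
Cp, R = 20e-9, 100e3               # piezo capacitance (F), load resistance (ohm)
F0, w = 1e-3, np.sqrt(k / m)       # drive amplitude (N), driven at resonance

def rhs(t, s):
    x, xdot, v = s                 # displacement, velocity, output voltage
    F = F0 * np.sin(w * t)
    xddot = (F - d * xdot - k * x - theta * v) / m   # mechanical domain
    vdot = (theta * xdot - v / R) / Cp               # electrical domain
    return [xdot, xddot, vdot]

sol = solve_ivp(rhs, (0, 0.05), [0.0, 0.0, 0.0], max_step=1e-5)
p_avg = np.mean(sol.y[2] ** 2 / R)                   # mean power into the load
print(f"average harvested power ~ {p_avg:.3e} W")
```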

    Unbounded symmetric operators in K-homology and the Baum-Connes Conjecture

    Using the unbounded picture of analytical K-homology, we associate a well-defined K-homology class to an unbounded symmetric operator satisfying certain mild technical conditions. We also establish an "addition formula" for the Dirac operator on the circle and for the Dolbeault operator on closed surfaces. Two proofs are provided: one using topology, and the other, surprisingly involved, sticking to analysis and building on the previous result. As a second application, we construct, in a purely analytical language, various homomorphisms linking the homology of a group in low degree, the K-homology of its classifying space, and the analytic K-theory of its C^*-algebra, in close connection with the Baum-Connes assembly map. For groups whose classifying space is a 2-complex, this allows us to reformulate the Baum-Connes Conjecture. Comment: 42 pages, 3 figures
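    For readers new to the unbounded picture, the standard (Baaj-Julg-type) definition that this work refines reads roughly as follows. These are the usual self-adjoint hypotheses; the paper's point is precisely to weaken self-adjointness to symmetry under mild conditions.

```latex
% Standard background, not the paper's precise hypotheses.
An unbounded Fredholm module over a $C^*$-algebra $A$ is a triple
$(A, \mathcal{H}, D)$, with $D$ self-adjoint on the Hilbert space
$\mathcal{H}$, such that $[D, a]$ extends to a bounded operator for all
$a$ in a dense subalgebra of $A$, and $a\,(1 + D^2)^{-1/2}$ is compact
for every $a \in A$. The basic odd example is the Dirac operator on the
circle,
\[
  D \;=\; -\,i\,\frac{d}{d\theta} \quad \text{on } L^2(S^1),
\]
whose class generates $K^1(C(S^1)) \cong \mathbb{Z}$.
```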

    Hurst's exponent behaviour, weak-form stock market efficiency and financial liberalization: the Tunisian case

    In this paper, we test weak-form efficiency for the Tunisian stock market (TSE). Our empirical approach is founded on the analysis of the behaviour of the Hurst exponent over time. We compute the Hurst exponent using a "rolling sample" with a time window of 4 years. The sample data cover, at daily frequency, the period from January 1997 to October 2007. Since the classical R/S analysis is strongly affected by short-range dependencies in both the mean and the conditional variance of TSE daily volatility, daily stock returns were filtered using the traditional AR-GARCH(1,1) model. Our analysis of the raw and filtered Hurst exponents shows strong evidence of long-range dependence with persistent behaviour in the TSE. However, during the last two years of the sample, the filtered Hurst exponent seems to exhibit a regime-switching behaviour, alternating between persistence and antipersistence while remaining somewhat close to 0.5. The nonparametric statistical approach reveals that some TSE reforms, including the launch of the electronic quotation system in April 1998, the fiscal regime for holdings, the security reinforcement laws, and the legal protection of minority shareholders, may play a role in understanding the behaviour of the Hurst exponent over time.
    Keywords: financial reforms; long-range dependence; weak-form efficiency; Hurst exponent; rolling sample approach
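    A numpy-only sketch of the rolling-window R/S estimate described above follows. The data are simulated, the window and step sizes are illustrative, and the paper's AR-GARCH(1,1) pre-filtering step is omitted for brevity.

```python
# Sketch of the rolling-window rescaled-range (R/S) Hurst estimate on
# simulated returns; H ~ 0.5 indicates no long-range dependence,
# H > 0.5 persistence, H < 0.5 antipersistence.

import numpy as np

def rs_hurst(x, block_sizes=(16, 32, 64, 128, 256)):
    """Classical R/S estimate: slope of log(R/S) against log(n)."""
    log_n, log_rs = [], []
    for n in block_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())
            r = dev.max() - dev.min()           # range of cumulative deviations
            s = block.std(ddof=1)
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_n, log_rs, 1)[0]      # slope = Hurst exponent

rng = np.random.default_rng(1)
returns = rng.standard_normal(2500)             # ~10 years of daily returns

# 4-year (~1000 trading days) window rolled forward one month at a time.
window, step = 1000, 21
hurst_path = [rs_hurst(returns[i:i + window])
              for i in range(0, len(returns) - window + 1, step)]
print(f"mean rolling H ~ {np.mean(hurst_path):.3f} (close to 0.5 for white noise)")
```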

    Minimum BER Precoding in 1-Bit Massive MIMO Systems

    1-bit digital-to-analog converters (DACs) and analog-to-digital converters (ADCs) are gaining interest in massive MIMO systems for their cost and computational efficiency. We present a new precoding technique to mitigate the inter-user interference (IUI) and the channel distortions in a 1-bit downlink MU-MISO system with QPSK symbols. The transmit signal vector is optimized taking the 1-bit quantization into account. We develop a mapping, based on a look-up table (LUT), between the input signal and the transmit signal; the LUT is updated for each channel realization. Simulation results show a significant gain in terms of uncoded bit error ratio (BER) compared with existing linear precoding techniques. Comment: Presented at IEEE SAM 2016, 10th-13th July 2016, Rio de Janeiro, Brazil
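    The following toy sketch conveys the LUT idea by exhaustive search: for each channel realization, every QPSK symbol vector is mapped to the 1-bit transmit vector with the best worst-case decision margin. The system size and the margin criterion are illustrative assumptions; the paper's actual optimization differs.

```python
# Toy look-up-table precoder for a 1-bit downlink MU-MISO system with
# QPSK, built by exhaustive search (illustrative, not the paper's method).

import itertools
import numpy as np

Nt, K = 4, 2                                    # transmit antennas, users
rng = np.random.default_rng(2)
H = (rng.standard_normal((K, Nt)) + 1j * rng.standard_normal((K, Nt))) / np.sqrt(2)

qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
# 1-bit DACs: each antenna can only emit one of these 4 quantized points.
candidates = np.array(list(itertools.product(qpsk, repeat=Nt)))   # (4^Nt, Nt)

def build_lut(H):
    """Map every desired QPSK symbol vector to the 1-bit transmit vector
    whose noiseless receive signal sits deepest inside the correct
    QPSK decision region for the worst user."""
    lut = {}
    rx = candidates @ H.T                        # (4^Nt, K) noiseless receives
    for s in itertools.product(qpsk, repeat=K):
        s = np.array(s)
        margin = np.minimum(rx.real * np.sign(s.real),
                            rx.imag * np.sign(s.imag)).min(axis=1)
        lut[tuple(s)] = candidates[np.argmax(margin)]
    return lut                                   # rebuilt per channel realization

lut = build_lut(H)
s = np.array([qpsk[0], qpsk[3]])                 # desired symbols for the 2 users
x = lut[tuple(s)]                                # transmit vector from the LUT
print("received quadrants match desired symbols:",
      bool(np.all(np.sign((H @ x).real) == np.sign(s.real)) and
           np.all(np.sign((H @ x).imag) == np.sign(s.imag))))
```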

    Extremal Events in a Bank Operational Losses

    Operational losses are true dangers for banks, since the maximal values they can reach, which may signal default, are difficult to predict. This risky situation is unlike default risk, whose maximum values are limited by the amount of credit granted. For example, our data from a very large US bank show that this bank could suffer, on average, more than four major losses a year. This bank had seven losses exceeding hundreds of millions of dollars over its 52 documented losses of more than $1 million during the 1994-2004 period. The tail of the loss distribution (a Pareto distribution without expectation, whose characteristic exponent satisfies 0.95 ≤ α ≤ 1) shows that this bank can fear extreme operational losses ranging from $1 billion to $11 billion, at probabilities situated respectively between 1% and 0.1%. The corresponding annual insurance premiums are evaluated to range between $350 million and close to $1 billion.
    Keywords: bank operational loss; value at risk; Pareto distribution; insurance premium; extremal event
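    The abstract's tail probabilities can be roughly reproduced with a back-of-the-envelope computation, assuming Poisson arrivals at the observed rate of 52 losses in 11 years and a Pareto survival function with the quoted exponent. This is a reconstruction under stated assumptions, not the paper's calculation.

```python
# Back-of-the-envelope check of the abstract's figures, assuming
# ~4.7 major losses per year and tail P(X > x) = (x / 1e6)^(-alpha).

lam = 52 / 11                     # major losses (> $1M) per year, 1994-2004
alpha, x_min = 0.95, 1e6          # Pareto exponent and $1 million threshold

def annual_exceedance(x):
    """P(at least one loss above x in a year) ~ lam * P(X > x),
    a rare-event approximation for Poisson arrivals with Pareto severities."""
    return lam * (x / x_min) ** (-alpha)

for x in (1e9, 11e9):
    print(f"loss > ${x / 1e9:.0f}B: annual probability ~ {annual_exceedance(x):.2%}")
# Prints roughly 0.67% and 0.07%, the same order as the 1% and 0.1% quoted;
# alpha < 1 makes the tail so heavy that the distribution has no mean.
```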
