
    Web services synchronization health care application

    With the advance of Web Services technologies and the emergence of Web Services into the information space, tremendous opportunities for empowering users and organizations appear in various application domains, including electronic commerce, travel, intelligence information gathering and analysis, health care, and digital government. Indeed, Web services appear to be a solution for integrating distributed, autonomous and heterogeneous information sources. However, because Web services evolve in a dynamic environment, the Internet, many changes can occur and affect them. A Web service is affected when one or more of its associated information sources is affected: changes can alter not only the contents of the information sources but also their schemas, which may render Web services partially or totally undefined. In this paper, we propose a solution for integrating information sources into Web services. We then tackle the Web service synchronization problem by substituting the affected information sources. Our work is illustrated with a healthcare case study.
    Comment: 18 pages, 12 figures

    Scaling Models for the Severity and Frequency of External Operational Loss Data

    According to Basel II criteria, the use of external data is absolutely indispensable to the implementation of an advanced method for calculating operational capital. This article investigates how the severity and frequency of external losses are scaled for integration with internal data. We first set up a model designed to explain loss severity. This model takes into account firm size, location, and business lines as well as risk types; it also shows how to calculate the internal loss equivalent to an external loss that might occur in a given bank. OLS estimation results show that the above variables have significant power in explaining the loss amount, and they are used to develop a normalization formula. A second model based on external data is developed to scale the frequency of losses over a given period. Two regression models are analyzed: the truncated Poisson model and the truncated negative binomial model. Variables estimating the size and geographical distribution of the banks' activities are introduced as explanatory variables. The results show that the negative binomial distribution outperforms the Poisson distribution. The scaling is done by calculating the parameters of the selected distribution from the estimated coefficients and the variables related to a given bank. Frequencies of losses of more than $1 million are then generated over a specific horizon.
    Keywords: operational risk in banks; scaling; severity distribution; frequency distribution; truncated count data regression models
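    The severity-scaling idea described above — regress log losses on bank characteristics, then swap in the internal bank's characteristics while keeping the idiosyncratic residual — can be sketched as follows. This is a minimal illustration on synthetic data with an assumed log-linear form; the variable names, coefficients, and functional form are illustrative, not the paper's exact specification or estimates.

```python
import numpy as np

# Assumed log-linear severity model (illustrative, not the paper's spec):
#   log(loss) = b0 + b1*log(firm size) + b2*business line + noise
rng = np.random.default_rng(0)
n = 200
log_size = rng.normal(10, 1, n)            # log firm size (synthetic)
biz_line = rng.integers(0, 3, n)           # business-line category (synthetic)
log_loss = 1.0 + 0.4 * log_size + 0.2 * biz_line + rng.normal(0, 0.3, n)

# OLS fit via least squares
X = np.column_stack([np.ones(n), log_size, biz_line])
beta, *_ = np.linalg.lstsq(X, log_loss, rcond=None)

def internal_equivalent(ext_log_loss, x_ext, x_int, beta):
    """Scale an external loss to an internal equivalent: keep the
    residual but re-anchor the systematic part to the internal bank."""
    resid = ext_log_loss - x_ext @ beta    # idiosyncratic component
    return x_int @ beta + resid            # normalized internal log-loss

x_ext = np.array([1.0, 11.0, 2])           # external bank's covariates
x_int = np.array([1.0, 9.5, 2])            # internal bank's covariates
print(internal_equivalent(12.0, x_ext, x_int, beta))
```

    Because the external bank is larger here, the normalized internal loss comes out smaller than the raw external loss, which is the intended direction of the scaling.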

    Top-Down Behavioral Modeling Methodology of a Piezoelectric Microgenerator For Integrated Power Harvesting Systems

    In this study, we developed a top-down methodology for behavioral and structural modeling of multi-domain microsystems. We then validated this methodology through a case study: a piezoelectric microgenerator. We also demonstrated the effectiveness of the VHDL-AMS language not only for modeling at the behavioral and structural levels but also for writing physical models that can predict experimental results. Finally, we validated these models by presenting and discussing simulation results.
    Comment: Submitted on behalf of EDA Publishing Association (http://irevues.inist.fr/handle/2042/16838)

    Unbounded symmetric operators in KK-homology and the Baum-Connes Conjecture

    Using the unbounded picture of analytical K-homology, we associate a well-defined K-homology class to an unbounded symmetric operator satisfying certain mild technical conditions. We also establish an "addition formula" for the Dirac operator on the circle and for the Dolbeault operator on closed surfaces. Two proofs are provided: one using topology, the other, surprisingly involved, sticking to analysis, on the basis of the previous result. As a second application, we construct, in a purely analytical language, various homomorphisms linking the homology of a group in low degree, the K-homology of its classifying space, and the analytic K-theory of its C^*-algebra, in close connection with the Baum-Connes assembly map. For groups classified by a 2-complex, this allows us to reformulate the Baum-Connes Conjecture.
    Comment: 42 pages, 3 figures

    Hurst's exponent behaviour, weak-form stock market efficiency and financial liberalization: the Tunisian case

    In this paper, we test weak-form stock market efficiency for the Tunisian stock market (TSE). Our empirical approach is founded on the analysis of the behaviour of the Hurst exponent over time. We computed the Hurst exponent using a "rolling sample" with a time window of 4 years. The sample covers daily data over the period January 1997 to October 2007. Since the classical R/S analysis is strongly affected by short-range dependencies in both the mean and the conditional variance of TSE daily volatility, daily stock returns were first filtered using the traditional AR-GARCH(1,1) model. Our analysis of the raw and filtered Hurst exponents shows strong evidence of long-range dependence with persistent behaviour of the TSE. However, during the last two years of the sample, the filtered Hurst exponent seems to exhibit regime-switching behaviour, alternating between persistence and antipersistence while remaining somewhat close to 0.5. The nonparametric statistical results reveal that some TSE reforms, including the launch of the electronic quotation system in April 1998, the fiscal regime for holdings, the security reinforcement laws, and the legal protection of minority shareholders, may play a role in understanding the behaviour of the Hurst exponent over time.
    Keywords: financial reforms; long-range dependence; weak-form efficiency; Hurst's exponent; rolling sample approach
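    The rolling-window Hurst estimation described above can be sketched with a classical R/S estimator. This is a minimal illustration on synthetic i.i.d. returns (true H near 0.5); the paper additionally filters TSE returns through an AR-GARCH(1,1) model before estimation, a step this sketch omits, and the window and step sizes here are illustrative.

```python
import numpy as np

def hurst_rs(x):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent.
    Minimal sketch: block sizes doubling by powers of two, log-log slope."""
    x = np.asarray(x, float)
    n = len(x)
    ns, rs = [], []
    size = 8
    while size <= n // 2:
        m = n // size
        chunks = x[:m * size].reshape(m, size)
        # cumulative deviations from each block's mean
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)   # range per block
        S = chunks.std(axis=1)                  # std dev per block
        good = S > 0
        ns.append(size)
        rs.append((R[good] / S[good]).mean())
        size *= 2
    slope, _ = np.polyfit(np.log(ns), np.log(rs), 1)
    return slope

# Rolling-window application (the paper uses a 4-year window on daily
# returns; here 1000-point windows over a synthetic white-noise series)
rng = np.random.default_rng(1)
returns = rng.normal(size=2000)
window = 1000
H = [hurst_rs(returns[i:i + window]) for i in range(0, 1000, 250)]
print(H)
```

    For i.i.d. returns the rolling estimates hover around 0.5 (with the small-sample upward bias typical of classical R/S); persistent long-range dependence would push them clearly above that level across windows.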

    Minimum BER Precoding in 1-Bit Massive MIMO Systems

    1-bit digital-to-analog converters (DACs) and analog-to-digital converters (ADCs) are gaining interest in massive MIMO systems for their economic and computational efficiency. We present a new precoding technique to mitigate the inter-user interference (IUI) and the channel distortions in a 1-bit downlink MU-MISO system with QPSK symbols. The transmit signal vector is optimized taking the 1-bit quantization into account. We develop a mapping, based on a look-up table (LUT), between the input signal and the transmit signal; the LUT is updated for each channel realization. Simulation results show a significant gain in terms of uncoded bit-error ratio (BER) compared to existing linear precoding techniques.
    Comment: Presented at IEEE SAM 2016, 10th-13th July 2016, Rio de Janeiro, Brazil
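    The LUT-based 1-bit precoding idea can be sketched as an exhaustive search over quantized transmit vectors for each possible QPSK symbol combination, with one table built per channel realization. The margin criterion below is a generic stand-in for the paper's minimum-BER objective, and the antenna/user dimensions are illustrative, so treat this as a sketch of the mechanism rather than the paper's algorithm.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
N, K = 4, 2                                    # transmit antennas, users
# Rayleigh-fading downlink channel (K x N)
H = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2)

qpsk = np.array([1+1j, -1+1j, -1-1j, 1-1j]) / np.sqrt(2)

# All 1-bit transmit vectors: each antenna emits (+/-1 +/- 1j)/sqrt(2N)
levels = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2 * N)
candidates = np.array(list(itertools.product(levels, repeat=N)))

def best_transmit(s, H, candidates):
    """Pick the quantized vector maximizing the worst-user 'safety margin':
    real/imag parts of the noiseless receive signal pushed toward the
    correct QPSK quadrant (a proxy for minimizing BER)."""
    y = candidates @ H.T                       # (num_candidates, K)
    margin = np.minimum(np.real(y) * np.sign(np.real(s)),
                        np.imag(y) * np.sign(np.imag(s))).min(axis=1)
    return candidates[np.argmax(margin)]

# Build the LUT for this channel realization: 4^K symbol combinations
lut = {}
for combo in itertools.product(qpsk, repeat=K):
    s = np.array(combo)
    lut[tuple(np.round(s, 6))] = best_transmit(s, H, candidates)

# At run time, precoding is a single table lookup per symbol vector
s = np.array([qpsk[0], qpsk[3]])
x = lut[tuple(np.round(s, 6))]
print(x)
```

    The table has 4^K entries, so per-symbol precoding reduces to a lookup; the exhaustive search (4^N candidates) runs only once per channel realization, which is what makes the LUT update step affordable at these sizes.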

    The South African Regulatory System: Past, Present, and Future

    The drive for improved regulatory systems and a more effective regulatory framework in South Africa has been evident for the past two decades, but despite political intentions and legislative revisions, success has been limited to date. Efforts to address the increasing volume of applications received have so far failed, and resources have been stretched to capacity, resulting in a significant backlog and extended timelines for product registration. The promulgation of the recently amended Medicines and Related Substances Act of 1965 triggered the establishment of the South African Health Products Regulatory Authority (SAHPRA) as a separate juristic person outside the National Department of Health, replacing the former medicines regulatory authority, the Medicines Control Council (MCC). The aim of this review is to provide the historical context supporting the new regulatory environment in South Africa and the transition from the MCC to SAHPRA. Key recommendations to SAHPRA to realize the full potential of the new regulatory environment include: establishing a quality management system to safeguard accountability, consistency and transparency and to streamline the implementation of good review practices, including quality decision-making practices and benefit-risk assessment; measuring and monitoring regulatory performance, with targets for overall approval time and key review milestones, to instill a culture of accurate metrics collection, measurement of key performance indicators and their continuous improvement; and employing a risk-based approach to the evaluation of medical products, codifying the use of facilitated regulatory pathways in policy and culture.
The application of a risk-based approach to regulatory review, commensurate with a product's risk to patients, will free up resources for pharmacovigilance activities and support reliance on and recognition of reference agencies.
    Peer reviewed