    MODELING OPERATIONAL RISK IN DATA QUALITY (Practice-oriented paper)

    Abstract: In this paper, we address how data quality (DQ) is likely linked to failed business processes that pose operational risks to the enterprise system. Operational value at risk (OpVaR), used in the finance literature to denote the most one might expect to lose provided no event in the tail of the loss probability distribution occurs, can be used to conduct enterprise software reliability and damage function analysis. This paper explores (a) how to combine distributional assumptions for event frequency and severity to derive software loss cost estimates, using the familiar example of software processing errors, and (b) how to use estimates of this distribution to compute OpVaR-based losses. The empirical results show (a) that DQ problems, such as daily mishandling event data, can be fitted to a distribution, and that maximum likelihood analysis can be used to derive a consistent set of critical event count thresholds, and (b) that the resulting OpVaR-based losses can be used by DQ managers to ascertain the real costs of mitigating DQ problems.
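
    A hedged illustration of the frequency/severity combination this abstract describes: the sketch below assumes a Poisson distribution for event counts and a lognormal distribution for per-event loss severity (both distributional choices and all parameter values are illustrative assumptions, not taken from the paper), simulates the compound loss distribution by Monte Carlo, and reads off an OpVaR quantile.

```python
# Minimal Monte Carlo sketch of a frequency/severity OpVaR estimate.
# Assumptions (illustrative only): Poisson event frequency, lognormal severity.
import numpy as np

rng = np.random.default_rng(42)

lam = 12.0             # assumed mean number of mishandling events per period
mu, sigma = 8.0, 1.2   # assumed lognormal severity parameters (log scale)
n_sims = 100_000

# Total loss per period: sum of N lognormal severities, N ~ Poisson(lam).
counts = rng.poisson(lam, size=n_sims)
total_losses = np.array([
    rng.lognormal(mu, sigma, size=n).sum() if n > 0 else 0.0
    for n in counts
])

# OpVaR at the 99.9% confidence level is the corresponding quantile
# of the simulated aggregate loss distribution.
opvar_999 = np.quantile(total_losses, 0.999)
print(f"Simulated 99.9% OpVaR: {opvar_999:,.0f}")
```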

    Implementing Loss Distribution Approach for Operational Risk

    To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modelling issues must be resolved before the approach can be used in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modelling of dependence, and the inclusion of insurance.
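
    As a hedged sketch of the Bayesian idea mentioned above (not the paper's own implementation): an expert's judgement about event frequency can be encoded as a Gamma prior on a Poisson rate, and observed loss counts then update it in closed form, so parameter uncertainty carries through to the capital estimate.

```python
# Sketch: conjugate Gamma-Poisson update combining expert judgement with data.
# All numbers are illustrative assumptions, not values from the paper.

# Expert judgement: "about 10 losses per year" -> Gamma(alpha, beta) prior
# on the Poisson rate lambda, with prior mean alpha / beta.
alpha_prior, beta_prior = 10.0, 1.0

# Observed internal data: annual loss counts over five years.
observed_counts = [7, 12, 9, 14, 8]

# Conjugacy: posterior is Gamma(alpha + sum(counts), beta + n_years).
alpha_post = alpha_prior + sum(observed_counts)
beta_post = beta_prior + len(observed_counts)

posterior_mean = alpha_post / beta_post
print(f"Posterior mean annual frequency: {posterior_mean:.2f}")
# Draws from this posterior can feed the frequency step of an LDA simulation,
# propagating parameter uncertainty into the capital charge.
```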

    Expert Elicitation for Reliable System Design

    This paper reviews the role of expert judgement in supporting reliability assessments within the systems engineering design process. Generic design processes are described to give context, and the nature of the reliability assessments required in the different systems engineering phases is discussed. It is argued that, as far as meeting reliability requirements is concerned, the whole design process is more akin to a statistical control process than to a straightforward statistical problem of assessing an unknown distribution. This leads to features of the expert judgement problem in the design context that differ substantially from those seen, for example, in risk assessment. In particular, the role of experts in problem structuring and in developing failure mitigation options is much more prominent, and there is a need to take into account the reliability potential of future mitigation measures downstream in the system life cycle. An overview is given of the stakeholders typically involved in large-scale systems engineering design projects, and this is used to argue the need for methods that expose potential judgemental biases in order to generate analyses that can be said to provide rational consensus about uncertainties. Finally, a number of key points are developed with the aim of moving toward a framework that provides a holistic method for tracking reliability assessment through the design process.
    Comment: this paper is commented on in [arXiv:0708.0285], [arXiv:0708.0287], and [arXiv:0708.0288], with a rejoinder in [arXiv:0708.0293]. Published at http://dx.doi.org/10.1214/088342306000000510 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
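
    One generic way to move elicited judgements toward the "rational consensus" the abstract calls for is an opinion pool. The sketch below is a minimal, hypothetical example (not necessarily the method this paper advocates): a weighted linear pool of three experts' assessed failure probabilities, with weights that in practice might come from calibration exercises such as Cooke's classical model.

```python
# Sketch: weighted linear opinion pool over expert-assessed failure
# probabilities. Experts, probabilities, and weights are all hypothetical.
expert_probs = {
    "expert_A": 0.02,   # assessed probability of failure on demand
    "expert_B": 0.05,
    "expert_C": 0.01,
}
weights = {"expert_A": 0.5, "expert_B": 0.3, "expert_C": 0.2}  # sum to 1

pooled = sum(weights[e] * p for e, p in expert_probs.items())
print(f"Pooled failure probability: {pooled:.3f}")
```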

    New Trends regarding the Operational Risks in Financial Sector

    Risks, and operational risks in particular, are part of corporate life; they are inherent in the activities of financial institutions. Operational risks are complex, often interlinked, and must be managed properly. Today there is growing pressure to avoid operational risk while continuing to improve corporate performance in the new environment. The operational risk management of the future has to be seen in the wider context of globalization and Internet-related technologies. These two major drivers will challenge firms in the financial sector to take on additional, and partly new, operational risks.
    Keywords: operational risk, financial sector, models, trends

    Loss Distribution Approach for Operational Risk Capital Modelling under Basel II: Combining Different Data Sources for Risk Estimation

    The management of operational risk in the banking industry has undergone significant changes over the last decade due to substantial changes in the operational risk environment. Globalization, deregulation, the use of complex financial products, and changes in information technology have resulted in exposure to new risks very different from market and credit risks. In response, the Basel Committee on Banking Supervision developed a regulatory framework, referred to as Basel II, that introduced an operational risk category and corresponding capital requirements. Over the past five years, major banks in most parts of the world have received accreditation under the Basel II Advanced Measurement Approach (AMA) by adopting the loss distribution approach (LDA), despite a number of unresolved methodological challenges in its implementation. Different approaches and methods are still hotly debated. In this paper, we review methods proposed in the literature for combining different data sources (internal data, external data, and scenario analysis), which is one of the regulatory requirements for the AMA.
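
    To make the combination problem concrete, here is a hedged sketch of one simple scheme: weighting each source's severity-parameter estimate by its effective sample size. The weighting scheme and all numbers are illustrative assumptions; the paper reviews more principled (e.g., Bayesian) approaches to the same problem.

```python
# Sketch: credibility-style weighting of severity-parameter estimates from
# internal data, external data, and scenario analysis. Illustrative only.

sources = {
    # source: (estimated lognormal mu, effective number of observations)
    "internal": (7.5, 60),
    "external": (8.2, 400),
    "scenario": (9.0, 10),  # expert scenarios treated as pseudo-observations
}

# Weight each source's estimate by its assumed effective sample size.
total_n = sum(n for _, n in sources.values())
mu_combined = sum(mu * n for mu, n in sources.values()) / total_n
print(f"Combined severity parameter mu: {mu_combined:.3f}")
```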

    Operational Risk Assessment Tools for Quality Management in Banking Services

    Among all the types of risk that can affect financial companies, operational risk can be the most devastating and the most difficult to anticipate. The management of operational risk is a key component of the financial and risk management discipline that drives net income results, capital management, and customer satisfaction. The present paper contains a statistical analysis to determine the number of operational errors, as determinants of service quality, depending on the number of transactions performed at the branch unit level. A regression model applied to a sample of 418 branches of a major Romanian bank is used to guide the bank's decisions, consistent with its priorities of minimizing risk and enlarging the customer base while ensuring high-quality services. The analysis reveals that the model can predict the quality of transactions based on the number of operational errors. Under Basel II, this could be a very helpful instrument for banks in adjusting the capital requirement to the losses due to operational errors predicted by the model.
    Keywords: quality management, operational risk, banking services, binary regression model
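
    A hedged sketch in the spirit of the binary regression described above: classify branch service quality from operational-error and transaction-volume counts. The data below is synthetic and the variables are assumptions; the paper's actual model specification may differ.

```python
# Sketch: binary (logistic) regression of branch service quality on
# operational errors and transaction volume, using synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_branches = 418  # matches the sample size reported in the abstract

transactions = rng.integers(1_000, 50_000, size=n_branches)
# Assume errors grow roughly with volume, plus Poisson noise.
errors = rng.poisson(transactions * 0.001)
error_rate = errors / transactions

# Hypothetical label: quality is "poor" (1) when the error rate is high.
poor_quality = (error_rate > np.median(error_rate)).astype(int)

X = np.column_stack([transactions, errors])
model = LogisticRegression(max_iter=1000).fit(X, poor_quality)
print("Predicted P(poor quality):", model.predict_proba(X[:3])[:, 1])
```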

    On green routing and scheduling problem

    The vehicle routing and scheduling problem has been studied with much interest over the last four decades. In this paper, some of the existing literature dealing with routing and scheduling problems with environmental issues is reviewed, and a description is provided of the problems that have been investigated and how they are treated using combinatorial optimization tools.
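
    To give the flavour of these combinatorial optimization tools, here is a minimal, hypothetical sketch (not a model from the surveyed literature): a nearest-neighbour construction heuristic for a single vehicle in which each leg's cost is a crude fuel/emissions proxy that grows with both distance and the load still on board.

```python
# Sketch: nearest-neighbour routing heuristic with an emissions-style cost.
# The cost model (distance * (empty_weight + remaining load)) is a common
# simplification in green-VRP work; all numbers are illustrative.
import math

depot = (0.0, 0.0)
customers = {"c1": (2.0, 3.0), "c2": (5.0, 1.0), "c3": (1.0, 6.0)}
demand = {"c1": 4.0, "c2": 2.0, "c3": 3.0}  # units delivered at each stop
EMPTY_WEIGHT = 10.0

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

pos, load = depot, sum(demand.values())
route, total_cost = [], 0.0
remaining = dict(customers)
while remaining:
    # Greedily pick the stop with the cheapest emissions-weighted leg.
    nxt = min(remaining, key=lambda c: dist(pos, remaining[c]) * (EMPTY_WEIGHT + load))
    total_cost += dist(pos, remaining[nxt]) * (EMPTY_WEIGHT + load)
    pos = remaining.pop(nxt)
    load -= demand[nxt]
    route.append(nxt)

total_cost += dist(pos, depot) * EMPTY_WEIGHT  # return leg, unloaded
print("Route:", route, "| emissions proxy:", round(total_cost, 2))
```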