
    Evaluation of load balancing approaches for Erlang concurrent application in cloud systems

    Cloud systems accommodate computing environments including PaaS (platform as a service), SaaS (software as a service), and IaaS (infrastructure as a service), which together enable the services of cloud systems. A cloud system allows multiple users to employ computing services through browsers, reflecting an alternative service model that shifts the local computing workload to a distant site. Virtualization is another characteristic of clouds: it delivers virtual computing services that imitate the functionality of physical computing resources, and it supports elastic load-balancing management that provides a flexible model of on-demand services. Virtualization allows organizations to achieve high levels of reliability, accessibility, and scalability by providing the capability to execute applications on multiple resources simultaneously. In this paper we use a queuing model to study flexible load balancing and evaluate performance metrics such as mean queue length, throughput, mean waiting time, utilization, and mean traversal time. The model is aware of the arrival of concurrent applications with an Erlang distribution. Simulation results regarding these performance metrics are investigated. The results point out that in cloud systems both fairness and load balancing must be considered carefully.
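The Erlang-arrival queue the abstract describes can be illustrated with a small discrete-event simulation. This is a minimal sketch, assuming a single FIFO server with Erlang-k interarrival times and exponential service; the rates, stage count, and server configuration are illustrative choices, not the paper's exact model:

```python
import random

def simulate_erlang_queue(k=3, rate=1.0, service_rate=1.2,
                          n_jobs=20000, seed=42):
    """Single-server FIFO queue with Erlang-k interarrival times
    (sum of k exponential stages, mean kept at 1/rate) and
    exponential service. Returns estimated mean waiting time,
    utilization, and throughput."""
    rng = random.Random(seed)
    t = 0.0                # arrival clock
    server_free = 0.0      # time the server next becomes idle
    busy = 0.0             # cumulative busy time
    total_wait = 0.0
    for _ in range(n_jobs):
        # Erlang-k interarrival: k exponential stages at rate k*rate
        t += sum(rng.expovariate(k * rate) for _ in range(k))
        start = max(t, server_free)
        total_wait += start - t
        s = rng.expovariate(service_rate)
        busy += s
        server_free = start + s
    return {
        "mean_wait": total_wait / n_jobs,
        "utilization": busy / server_free,
        "throughput": n_jobs / server_free,
    }

metrics = simulate_erlang_queue()
```

With these rates the station is stable (offered load about 0.83), so long-run throughput tracks the arrival rate; raising `service_rate` lowers utilization and mean wait together.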

    Call Center Experience Optimization: A Case for a Virtual Predictive Queue

    The evolution of call centers into contact centers, and the growth of their use by many companies in providing customer-facing service, has brought considerable capabilities in maintaining customer relationships, but it has also brought challenges in providing quality service when call volumes are high. Limited in their ability to provide service at all times to all customers, companies are forced to balance the costs of hiring more customer service representatives against the quality of service a smaller number can provide. A primary challenge when there are not enough customer service representatives to engage the volume of callers in a timely manner is the significant wait time experienced by many customers. Normally, callers are handled according to a first-come, first-served policy, with the exception of skill-based routing to customer service representatives with specialized skills. A proposed call center infrastructure framework called a Virtual Predictive Queue (VPQ) can allow some customers to benefit from a shorter call queue wait time. The proposed system can be implemented within a call center’s Automatic Call Distribution (ACD) device associated with computer telephony integration (CTI), and theoretically will not violate the first-come, first-served policy.
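The baseline policy the abstract describes, first-come, first-served with skill-based exceptions, can be sketched as follows. The caller list, representative names, and skill sets are hypothetical, and the VPQ/ACD/CTI machinery itself is not modeled here:

```python
def assign_calls(calls, reps):
    """Sketch of FCFS dispatch with skill-based exceptions.
    calls: list of (arrival_time, required_skill or None), sorted
    by arrival time. reps: dict rep_name -> set of special skills
    (empty set = generalist). Each rep takes one call in this toy
    version; a real ACD would loop as reps become free."""
    available = dict(reps)
    assignments = []
    for arrival, skill in calls:
        pick = None
        for name, skills in available.items():
            # skill-based routing: specialists for skilled calls,
            # generalists for ordinary calls, in FCFS order
            if (skill is None and not skills) or (skill in skills):
                pick = name
                break
        if pick is not None:
            del available[pick]
            assignments.append((arrival, pick))
    return assignments

calls = [(0, None), (1, "billing"), (2, None)]
reps = {"alice": set(), "bob": {"billing"}, "carol": set()}
result = assign_calls(calls, reps)
# FCFS overall; the billing call is routed to the billing specialist
```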

    A STUDY OF QUEUING THEORY IN LOW TO HIGH REWORK ENVIRONMENTS WITH PROCESS AVAILABILITY

    In manufacturing systems subject to machine and operator resource constraints, the effects of rework can be profound. High levels of rework burden the resources unnecessarily, and as the utilization of these resources increases, the expected queuing time of work in process increases exponentially. Queuing models can help managers understand and control the effects of rework, but this tool is often overlooked, in part because of concerns over accuracy in complex environments and/or the need for limiting assumptions. One aim of this work is to increase understanding of how system variables affect the accuracy of simple queuing models. A queuing model is proposed that combines G/G/1 modeling techniques for rework with effective processing time techniques for machine availability, and the accuracy of this model is tested under varying levels of rework, external arrival variability, and machine availability. Results show that the model performs best under exponential arrival patterns and can perform well even under high rework conditions. Generalizations are made regarding the use of this tool for allocating jobs to specific workers and/or machines based on known rework rates, with the ultimate aim of queue time minimization.
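The combination of G/G/1 queuing with rework and availability can be illustrated with the standard Kingman (VUT) approximation. This is a simplified sketch in which rework and downtime only inflate the mean effective process time; the paper's model also treats the variance effects of outages, which are omitted here:

```python
def kingman_wait(ra, ca2, t0, ce2, p_rework, availability):
    """Approximate mean queue time at a G/G/1 station via
    Kingman's formula, with rework probability p_rework and
    machine availability folded into the effective process time.
    ra: arrival rate; ca2: arrival-time SCV; t0: base process
    time; ce2: effective process-time SCV (squared coefficient
    of variation)."""
    # each job needs on average 1/(1 - p_rework) passes, and the
    # machine is only up a fraction `availability` of the time
    te = t0 / (1.0 - p_rework) / availability
    rho = ra * te                       # utilization
    assert rho < 1.0, "unstable station"
    # Kingman: Wq ~ (variability) * (utilization term) * (time)
    return ((ca2 + ce2) / 2.0) * (rho / (1.0 - rho)) * te

w_rework = kingman_wait(ra=0.8, ca2=1.0, t0=1.0, ce2=1.0,
                        p_rework=0.1, availability=0.95)
w_clean = kingman_wait(ra=0.8, ca2=1.0, t0=1.0, ce2=1.0,
                       p_rework=0.0, availability=0.95)
# even 10% rework sharply increases queue time at high utilization
```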

    A survey of the machine interference problem

    This paper surveys the research published on the machine interference problem since the 1985 review by Stecke & Aronson. After introducing the basic model, we discuss the literature along several dimensions. We then note how research has evolved since the 1985 review, including a trend towards the modelling of stochastic (rather than deterministic) systems and the corresponding use of more advanced queuing methods for analysis. We conclude with some suggestions for areas holding particular promise for future studies.

    Natural Sciences and Engineering Research Council (NSERC) Discovery Grant 238294-200
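The basic machine interference model the survey starts from is the textbook finite-source queue. A minimal sketch of the M/M/1//N ("machine repair") case, one repairman serving N machines, is below; the surveyed literature covers far richer variants:

```python
from math import factorial

def machine_interference(N, lam, mu):
    """Finite-source M/M/1//N model: N machines, each failing at
    rate lam while running, one repairman repairing at rate mu.
    Returns repairman utilization and the expected number of
    machines down (waiting or under repair)."""
    r = lam / mu
    # steady state: p_n = (N! / (N - n)!) * r**n * p_0
    terms = [factorial(N) // factorial(N - n) * r ** n
             for n in range(N + 1)]
    p0 = 1.0 / sum(terms)
    repairman_busy = 1.0 - p0
    avg_down = sum(n * t * p0 for n, t in enumerate(terms))
    return repairman_busy, avg_down

busy, down = machine_interference(N=5, lam=0.1, mu=1.0)
```

Sweeping `N` for fixed `lam/mu` shows the interference effect: as more machines share one operator, downtime per machine grows nonlinearly.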

    MODELLING OPERATIONAL RISK MEASUREMENT IN ISLAMIC BANKING: A THEORETICAL AND EMPIRICAL INVESTIGATION

    With the emergence and development of the Islamic banking industry, the need to address operational risk issues has attracted the attention of academics in recent years. Such studies commonly agree that operational risk is higher and more serious than credit risk and market risk for Islamic banks. However, no single study in the context of Islamic banking thoroughly tackles operational risk in its three main aspects: theoretical, methodological, and empirical. This may be because operational risk is a relatively new area, which requires further research to understand the complexities it carries. This is the source of motivation for this research, which aims to fill the observed gap in the literature by responding to all three aspects. The research therefore aims to develop a new measurement model of operational risk exposures in Islamic banking, with the objective of theoretically determining the underlying features of operational risk exposures and their measurement particularly for Islamic banks. In its attempt to develop a theoretical framework for the proposed model, the research provides a classification of operational risks in major Islamic financial contracts. In addition, rather than adopting existing operational risk measurement methods, this research develops a proposed measurement model termed the Delta Gamma Sensitivity Analysis - Extreme Value Theory (DGSA-EVT) model. DGSA-EVT measures high frequency-low severity (HF-LS) and low frequency-high severity (LF-HS) types of operational risk. This is the core of the research’s methodological contribution. As regards the empirical contributions, in analysing operational value at risk (opVaR), this research carefully analyses the behaviour of the data by taking into account volatility, skewness, and kurtosis of the variables.
In the modelling, volatility analysis employs two models: the constant-variance model and the exponentially weighted moving average (EWMA) model. Results of the empirical tests show that the operational risk variables in this research are non-normal; thus, non-normality involving skewness and kurtosis, as well as volatility, has to be taken into account in the estimation of VaR. To do so, this research employs the Cornish-Fisher expansion, whereby the confidence interval of operational variables is an explicit function of the skewness and kurtosis as well as the volatility. Empirical findings from a set of econometric tests reveal that for financing activities, the role of maintaining operational efficiency as part of an Islamic bank’s fiduciary responsibilities is immensely high. However, people risk is enormous and plays a dominant role in the level of operational risk exposure of Islamic banks in investment activities.
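Two of the building blocks named above, EWMA volatility and the Cornish-Fisher quantile adjustment, are standard and can be sketched briefly. This illustrates the pieces, not the paper's DGSA-EVT model itself; the decay factor and quantiles are conventional illustrative values:

```python
import math

def ewma_vol(returns, lam=0.94):
    """Exponentially weighted moving average volatility estimate
    (RiskMetrics-style recursion; lam=0.94 is the conventional
    daily decay factor)."""
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1.0 - lam) * r ** 2
    return math.sqrt(var)

def cornish_fisher_z(z, skew, excess_kurt):
    """First-order Cornish-Fisher expansion: adjusts a normal
    quantile z for skewness and excess kurtosis, so the VaR
    quantile becomes an explicit function of both."""
    return (z
            + (z ** 2 - 1.0) * skew / 6.0
            + (z ** 3 - 3.0 * z) * excess_kurt / 24.0
            - (2.0 * z ** 3 - 5.0 * z) * skew ** 2 / 36.0)

# 99% left-tail quantile: negative skew and fat tails push it
# further out than the normal value of about -2.326
z99 = cornish_fisher_z(-2.326, skew=-0.5, excess_kurt=2.0)
```

With zero skewness and zero excess kurtosis the adjustment vanishes and the normal quantile is recovered, which is the non-normality point the abstract makes.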

    Neural Network Modelling of Constrained Spatial Interaction Flows

    Fundamental to regional science is the subject of spatial interaction. GeoComputation - a new research paradigm that represents the convergence of the disciplines of computer science, geographic information science, mathematics and statistics - has brought many scholars back to spatial interaction modeling. Neural spatial interaction modeling represents a clear break with traditional methods used for explicating spatial interaction. Neural spatial interaction models are termed neural in the sense that they are based on neurocomputing. They are clearly related to conventional unconstrained spatial interaction models of the gravity type, and under commonly met conditions they can be understood as a special class of general feedforward neural network models with a single hidden layer and sigmoidal transfer functions (Fischer 1998). These models have been used to model journey-to-work flows and telecommunications traffic (Fischer and Gopal 1994, Openshaw 1993). They appear to provide superior levels of performance when compared with unconstrained conventional models. In many practical situations, however, we have - in addition to the spatial interaction data itself - some information about various accounting constraints on the predicted flows. In principle, there are two ways to incorporate accounting constraints in neural spatial interaction modeling. The required constraint properties can be built into the post-processing stage, or they can be built directly into the model structure. While the first way is relatively straightforward, it suffers from the disadvantage of being inefficient. It will also result in a model which does not inherently respect the constraints. Thus we follow the second way. In this paper we present a novel class of neural spatial interaction models that incorporate origin-specific constraints into the model structure using product units rather than summation units at the hidden layer and softmax output units at the output layer. 
Product unit neural networks are powerful because of their ability to handle higher-order combinations of inputs, but parameter estimation by standard techniques such as gradient descent may be difficult. The performance of this novel class of spatial interaction models is demonstrated using Austrian interregional traffic data, with the conventional singly constrained spatial interaction model of the gravity type as a benchmark.

References
Fischer M M (1998) Computational neural networks: A new paradigm for spatial analysis. Environment and Planning A 30(10): 1873-1891
Fischer M M, Gopal S (1994) Artificial neural networks: A new approach to modelling interregional telecommunication flows. Journal of Regional Science 34(4): 503-527
Openshaw S (1993) Modelling spatial interaction using a neural net. In Fischer M M, Nijkamp P (eds) Geographical information systems, spatial modelling, and policy evaluation, pp. 147-164. Springer, Berlin
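The architecture described, product units at the hidden layer and softmax units at the output, can be sketched in a tiny forward pass. All shapes, names, and values below are illustrative, not the paper's notation; the softmax stands in for the origin constraint by making predicted flow shares from one origin sum to one:

```python
import math

def product_unit_forward(x, exponents, w_out):
    """Forward pass: each hidden unit computes prod_j x_j**p_ij
    (a higher-order combination of the inputs), then a softmax
    output layer turns scores into flow shares summing to one.
    Inputs are assumed strictly positive so logs are defined."""
    logs = [math.log(v) for v in x]
    # product units computed stably as exp(sum_j p_ij * log x_j)
    hidden = [math.exp(sum(p * l for p, l in zip(row, logs)))
              for row in exponents]
    scores = [sum(w * h for w, h in zip(row, hidden))
              for row in w_out]
    m = max(scores)                       # numerically stable softmax
    e = [math.exp(s - m) for s in scores]
    total = sum(e)
    return [v / total for v in e]

# three destinations' predicted flow shares from a single origin
shares = product_unit_forward(
    [1.5, 2.0, 0.5],                      # origin/destination features
    [[0.2, -0.1, 0.3], [0.5, 0.0, -0.2]], # hidden-unit exponents
    [[1.0, -1.0], [0.5, 0.5], [-0.3, 0.2]])
```

Because the exponents appear inside a product rather than a sum, gradients can vary wildly in magnitude, which is one reason plain gradient descent is noted above as a difficult fit for these models.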