
    How Much to Spend on Flexibility? Determining the Value of Information System Flexibility

    In this paper, we outline several approaches to determining the value of information system (IS) flexibility, defined as the extent to which an IS can be modified and upgraded following its initial implementation. Building on an earlier conceptual model by Gebauer and Schober (2006), we calculate the value of IS flexibility for a numerical example with deterministic and stochastic model parameters. We compare the results of decision tree analysis and real option analysis and present the results of a simulation experiment. Besides practical implications, our results contribute to earlier research on IS flexibility as they highlight the need to include stochastic elements in the evaluation of IS flexibility.
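    The contrast the abstract draws, between a deterministic analysis and a stochastic, real-option-style one, can be sketched in a few lines. All figures below (upgrade cost, demand distribution, flexibility premium) are hypothetical, not taken from Gebauer and Schober (2006).

```python
import random

UPGRADE_COST = 50.0       # hypothetical cost of modifying the system later
FLEXIBLE_PREMIUM = 20.0   # hypothetical extra up-front cost of the flexible IS

def payoff(demand):
    """Benefit of upgrading once demand is known (illustrative)."""
    return max(demand - 100.0, 0.0)

# Deterministic analysis: plug in the expected demand only.
expected_demand = 110.0
deterministic_value = payoff(expected_demand) - UPGRADE_COST

# Stochastic, real-option-style analysis: the firm upgrades only when the
# realised payoff exceeds the upgrade cost, so uncertainty adds value.
random.seed(42)
trials = 100_000
option_value = sum(
    max(payoff(random.gauss(110.0, 40.0)) - UPGRADE_COST, 0.0)
    for _ in range(trials)
) / trials

print(f"deterministic value: {deterministic_value:.1f}")
print(f"stochastic option value: {option_value:.1f}")
print("flexibility worth the premium:", option_value > FLEXIBLE_PREMIUM)
```

    Under expected demand the upgrade looks like a sure loss, while the option view yields a positive value, which is exactly the abstract's point about including stochastic elements.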

    Modelling manufacturing systems flexibility.

    The flexibility to change products and processes quickly and economically represents a significant competitive advantage to manufacturing organisations. The rapid rise in global sourcing has resulted in manufacturers having to offer greater levels of customisation; a wider product range is thus essential to an organisation's competitiveness. The rate at which new products are introduced to the market has also increased, with greatly reduced development times being essential to a new product's market success. Hence there is a strong need for a flexible manufacturing system such that new products may be introduced rapidly. These drivers have made flexibility within manufacturing systems of great importance. However, there are many types of flexibility, and to ensure that organisations correctly target these types there is a need to measure flexibility, because measurement allows manufacturers to identify systems which will improve their performance. This research has therefore focused on the development of measures for two types of flexibility: mix flexibility, the ability to change between the manufacture of current products, and product flexibility, the ability to introduce new products. In order to develop effective measures for these types of flexibility, a conceptual model has been developed which represents the current and potential future product range of manufacturing systems. The methodology developed for measuring mix and product flexibility has been successfully applied in two companies representing diverse manufacturing environments: one operates in high-volume chemical manufacture and the other in low- to medium-volume furniture manufacture. Applying the methodology in these two companies has demonstrated that it is generic and can be used in a wide range of companies.
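    As a minimal sketch of what a mix-flexibility measure might look like, one could score the share of product-to-product changeovers a system can perform quickly. The data, threshold and formulation below are illustrative assumptions, not the thesis's actual measures.

```python
# changeover_time[a][b]: hours to switch production from product a to b
# (hypothetical figures for a three-product system)
changeover_time = {
    "A": {"B": 0.5, "C": 2.0},
    "B": {"A": 0.5, "C": 4.0},
    "C": {"A": 2.5, "B": 3.0},
}

THRESHOLD = 2.0  # changeovers at or under this count as "quick" (assumed)

def mix_flexibility(times, threshold):
    """Share of product-to-product changeovers achievable quickly."""
    pairs = [(a, b) for a in times for b in times[a]]
    quick = sum(1 for a, b in pairs if times[a][b] <= threshold)
    return quick / len(pairs)

print(f"mix flexibility: {mix_flexibility(changeover_time, THRESHOLD):.2f}")
```

    A measure of this shape lets the two case-study companies be compared on a common scale despite their very different processes.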

    Plotting virtuality: dimensions of eLearning space

    The term eLearning enjoys wide currency, but is loosely employed. A lack of clarity as to its nature accompanies a lack of understanding as to its applications and appropriate use. These are important issues, as political, educational and commercial policy-makers need an informed frame of reference from which to make decisions regarding the employment of eLearning alongside or in the place of existing methods of education and training. There is also a need for accurate description of eLearning products for the clients who might use them. This paper seeks to provide contextual and internal analyses of eLearning as an initial stage in the process of creating such a frame of reference. Firstly, eLearning is located within a variety of education and training contexts so as to delineate its boundaries, and an overview is made of ways in which it is employed at higher education level within private, corporate and state-funded systems. Secondly, earlier conceptual models for eLearning are examined and a model is proposed comprising four dimensions of virtual space: course utility, study flexibility, delivery technology and learning paradigm. A graphical representation of the dimensional model is used to profile the different contexts for eLearning explored earlier; this method of visualisation affords ready comparison of the variety of ways in which eLearning is employed. Thirdly, a rationale is advanced for these dimensions, which are then discussed in relation to typical learning activities. Finally, consideration is given to how the dimensional model might be applied in the areas of learner appeal, course marketing, educational systems design and course quality evaluation.

    Developing Supply Chain Agility for the High-Volume and High-Variety Industry

    Supply chains are under pressure to meet performance expectations under conditions in which access to the global network of suppliers and customers is fluid. Most studies accept the importance of agility to enhance performance, using flexibility as a key dimension. Moreover, both the literature and empirical evidence point to an agreement on the need for flexibility in manufacturing to address internal changes at the manufacturing echelon (e.g., variation in process times) and external uncertainties (e.g., availability of ingredients, delivery schedules). However, there is a lack of adoptable metrics of manufacturing flexibility that can be used to evaluate its impact on throughput (TH) and cost, both at the manufacturing echelon and across the supply chain as a system, as well as its impact on other echelons. Focusing on manufacturing flexibility as a competitive strategy is therefore a driving force for the success of supply chain performance. The purpose of this research is to present an applicable methodology for the evaluation of flexibility in a supply chain, called the Flexible Discrete Supply Chain (FDSC). The FDSC conceptual model consists of a supplier, manufacturer, distributor, and customer. Two main performance indicators, TH and cost, are used to study FDSC performance. This study utilizes four dimensions of flexibility: volume, delivery, mix, and innovation (VDMI). Quality function deployment is used to translate the dimensions of flexibility into key metrics that can be controlled in a discrete-event simulation (DES) model. The DES model is used to generate data and to configure the VDMI metrics; the data are used for further sensitivity analysis. The developed methodology is verified and validated using data from a real case study and is applicable to all supply chains within the FDSC criteria.
    This study contributes to the body of knowledge of supply chain flexibility through technical, methodical, and managerial implications. It clearly illustrates scenarios and provides guidelines for operations managers to test among the VDMI flexibility dimensions to maximize TH constrained by cost. Key directions for future research are identified.
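    The kind of DES experiment the abstract describes can be sketched as a serial line whose throughput is measured under a baseline and a "flexible" scenario. The stage structure, mean times and the specific metric change below are illustrative assumptions, not the FDSC model itself.

```python
import random

# Minimal discrete-event sketch of a serial chain
# (supplier -> manufacturer -> distributor), measuring throughput (TH).
random.seed(1)
STAGES = {"supplier": 1.0, "manufacturer": 2.0, "distributor": 1.5}  # mean hrs
N_JOBS = 500

def simulate(stage_means):
    """FIFO serial line: each job passes the stages in order; each stage
    is a single server, so a job waits until the stage is free."""
    free_at = {s: 0.0 for s in stage_means}  # when each stage is next idle
    arrival = 0.0
    last_done = 0.0
    for _ in range(N_JOBS):
        t = arrival
        for stage, mean in stage_means.items():
            start = max(t, free_at[stage])
            t = start + random.expovariate(1.0 / mean)  # stochastic time
            free_at[stage] = t
        last_done = t
        arrival += random.expovariate(1.0 / 1.8)        # job inter-arrival
    return N_JOBS / last_done                            # jobs per hour

base_th = simulate(STAGES)
# Hypothetical volume-flexibility scenario: a faster manufacturer stage.
flex_th = simulate({**STAGES, "manufacturer": 1.2})
print(f"baseline TH: {base_th:.3f}, flexible TH: {flex_th:.3f}")
```

    Running both scenarios from the same random seed is the simulation analogue of the sensitivity analysis the study performs across its VDMI metrics.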

    High-Performance Work Systems as an Initiator of Employee Proactivity and Flexible Work Processes

    We offer a conceptual framework that explicates the effect of high-performance work systems (HPWS) on the flexibility of organizational work processes. The flexibility of work processes is conceptualized as the extent to which organizational work routines can be modified by employees to better exploit existing capabilities or be adapted to explore new alternatives. We argue that HPWS directly facilitate individual proactivity, and foster a supportive social structure that further enables individuals to be proactive in modifying their work processes. The proposed model is in response to calls for researchers to consider proximal outcomes related to the use of human resource management (HRM) systems and, more specifically, the need to better understand how HRM systems can enable employees to respond to threats and opportunities. Future research issues are also considered, including recommendations for empirical assessment of how employees modify their work processes.

    Site investigation techniques for DNAPL source and plume zone characterisation

    Establishing the location of the Source Area BioREmediation (SABRE) research cell was a primary objective of the site characterisation programme. This bulletin describes the development of a two-stage site characterisation methodology that combined qualitative and quantitative data to guide and inform an assessment of dense non-aqueous phase liquid (DNAPL) distribution at the site. DNAPL site characterisation has traditionally involved multiple phases of site investigation, characterised by rigid sampling and analysis programmes, expensive mobilisations and long decision-making timeframes (Crumbling, 2001a), resulting in site investigations that are costly and long in duration. Here we follow the principles of an innovative framework, termed Triad (Crumbling, 2001a, 2001b; Crumbling et al., 2001; Crumbling et al., 2003), which describes a systematic approach for the characterisation and remediation of contaminated sites. The Triad approach to site characterisation focuses on three main components: a) systematic planning, implemented with a preliminary conceptual site model built from existing data, in which the desired outcomes are planned and decision uncertainties are evaluated; b) dynamic work strategies, which focus on the need for flexibility as site characterisation progresses so that new information can guide the investigation in real time; and c) real-time measurement technologies, which are critical in making dynamic work strategies possible. Key to this approach is the selection of suitable measurement technologies, of which there are two main categories (Crumbling et al., 2003). The first category provides qualitative, dense spatial data, often with detection limits over a preset value. These methods are generally of lower cost, produce real-time data and are primarily used to identify site areas that require further investigation.
    Examples of such "decision-quality" methods are laser-induced fluorescence (Kram et al., 2001), membrane interface probing (McAndrews et al., 2003) and cone penetrometer testing (Robertson, 1990), all of which produce data in continuous vertical profiles. Because these methods are rapid, many profiles can be generated and hence the subsurface data density is greatly improved. These qualitative results are used to guide the sampling strategy for the application of the second category of technologies, which generate quantitative, precise data that have low detection limits and are analyte-specific. These methods tend to be high cost with long turnaround times that preclude on-site decision making; applying them to quantify, rather than to produce, a conceptual model therefore yields a key cost saving. Examples include instrumental laboratory analyses such as soil solvent extractions (Parker et al., 2004) and water analyses (USEPA, 1996). Where these two categories of measurement technologies are used in tandem, a more complete and accurate dataset is achieved without additional site mobilisations. The aim of the site characterisation programme at the SABRE site was to delineate the DNAPL source zone rapidly and identify a location for the in situ research cell. The site characterisation objectives were to: a) test whether semi-quantitative measurement techniques could reliably determine geological interfaces and contaminant mass distribution and inform the initial site conceptual model; and b) quantitatively determine the DNAPL source zone distribution, guided by the qualitative site conceptual model.
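    The two-stage logic, a cheap qualitative screen steering where the expensive quantitative analysis is spent, can be sketched as a simple filter. Borehole names, signal values and the threshold are all hypothetical.

```python
# Stage 1: dense, low-cost, real-time screening data (e.g. a
# fluorescence-style response per borehole; values are illustrative).
qualitative_screen = {
    "BH01": 12.0, "BH02": 85.0, "BH03": 3.0, "BH04": 60.0,
}
SCREEN_THRESHOLD = 50.0  # preset screening level (assumed)

flagged = [loc for loc, signal in qualitative_screen.items()
           if signal >= SCREEN_THRESHOLD]

# Stage 2: costly, analyte-specific lab analysis only at flagged spots.
def lab_analysis(location):
    """Placeholder for instrumental analysis (e.g. solvent extraction);
    the result field is filled in when the lab report arrives."""
    return {"location": location, "dnapl_mg_per_kg": None}

work_orders = [lab_analysis(loc) for loc in flagged]
print("locations sent for quantitative analysis:", flagged)
```

    The cost saving comes from the second stage running on two boreholes instead of four, without an extra site mobilisation.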

    A generic approach to the evolution of interaction in ubiquitous systems

    This dissertation addresses the challenge of configuring modern (ubiquitous, context-sensitive, mobile, etc.) interactive systems where it is difficult or impossible to predict (i) the resources available for evolution, (ii) the criteria for judging the success of the evolution, and (iii) the degree to which human judgement must be involved in the evaluation process used to determine the configuration. In this thesis a conceptual model of interactive system configuration over time (known as interaction evolution) is presented which relies upon the following steps: (i) identification of opportunities for change in a system, (ii) reflection on the available configuration alternatives, (iii) decision-making, (iv) implementation, and finally iteration of the process. This conceptual model underpins the development of a dynamic evolution environment based on a notion of configuration evaluation functions (hereafter referred to as evaluation functions) that provides greater flexibility than current solutions and, when supported by appropriate tools, can provide a richer set of evaluation techniques and features that are difficult or impossible to implement in current systems. Specifically, this approach supports changes to the approach, style or mode of use adopted for configuration; these features may result in more effective systems, less effort to configure them, and a greater degree of control offered to the user.
    The contributions of this work include: (i) establishing the need for configuration evolution through a literature review and a motivating case study experiment, (ii) development of a conceptual process model supporting interaction evolution, (iii) development of a model based on the notion of evaluation functions which is shown to support a wide range of interaction configuration approaches, (iv) a characterisation of the configuration evaluation space, followed by (v) an implementation of these ideas used in (vi) a series of longitudinal technology probes and investigations into the approaches.
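    The core idea of an evaluation function, a pluggable scorer that lets the evolution process reflect on configuration alternatives and decide among them, can be sketched as follows. The function names, scoring weights and configuration fields are assumptions for illustration, not the dissertation's actual API.

```python
from typing import Callable, Dict, List

Config = Dict[str, object]

def effort_and_fit(config: Config) -> float:
    """Hypothetical evaluation function: favour configurations that fit
    the current context and need little user effort to adopt."""
    fit = 1.0 if config.get("modality") == "touch" else 0.5
    effort_penalty = 0.1 * int(config.get("steps_to_configure", 0))
    return fit - effort_penalty

def evolve(candidates: List[Config],
           evaluate: Callable[[Config], float]) -> Config:
    """One iteration of the model: reflect on alternatives, then decide."""
    return max(candidates, key=evaluate)

candidates = [
    {"name": "kiosk", "modality": "touch", "steps_to_configure": 2},
    {"name": "desktop", "modality": "mouse", "steps_to_configure": 1},
]
best = evolve(candidates, effort_and_fit)
print("chosen configuration:", best["name"])
```

    Because the evaluator is just a parameter, swapping in a different function, or one that asks the user, changes the style of configuration without touching the evolution loop, which is the flexibility the thesis argues for.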

    Managing Evolving Business Workflows through the Capture of Descriptive Information

    Business systems today need to be agile to address the needs of a changing world. In particular, the discipline of Enterprise Application Integration (EAI) requires business process management to be highly reconfigurable, with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on a model-resident, or so-called description-driven, approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system, called CRISTAL, is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models. Comment: 12 pages, 4 figures. Presented at eCOMO'2003, the 4th Int. Workshop on Conceptual Modeling Approaches for e-Business.
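    The description-driven idea, keeping the workflow definition as data so it can evolve at run time without redeploying code, can be illustrated in miniature. The step names and handlers are hypothetical; this is not CRISTAL's actual model.

```python
# The process description lives as data ("model-resident"), so
# reconfiguration means editing this list, not the code.
workflow = ["receive_order", "check_stock", "ship"]

handlers = {
    "receive_order": lambda order: {**order, "received": True},
    "check_stock":   lambda order: {**order, "in_stock": True},
    "ship":          lambda order: {**order, "shipped": True},
    "invoice":       lambda order: {**order, "invoiced": True},
}

def run(order, description):
    """Interpret the description step by step against the handlers."""
    for step in description:
        order = handlers[step](order)
    return order

print(run({"id": 1}, workflow))

# Dynamic reconfiguration: splice a step into the running description.
workflow.insert(2, "invoice")
print(run({"id": 2}, workflow))
```

    Separating the description from its interpreter is what lets a system like this adapt a live business process, the property the paper demonstrates with Agilium.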