    Design of critical infrastructures: application to electrical systems

    The recent publication of the 5th revision of the TIA-942 standard provides a benchmark framework for designing resilient power systems. The standard classifies electrical infrastructures by their capacity to tolerate failures and to allow maintenance operations to be carried out safely. This ranking is based not on technical specifications but on the system's resilience level, i.e. its capacity to withstand an unexpected destructive event, breakdown or malfunction affecting the end user. The standard, however, is intended only for design purposes. The aim of this paper is to propose an approach by which the current resilience status of a system can be evaluated in accordance with this classification. The proposed technique should make it easy to analyse the gap, in terms of infrastructure topology, components and distribution lines, between an existing system and a generic configuration with a desired resilience level, and thus to suggest the steps needed to reach the availability appropriate to the system's specific mission. A preliminary version of the technique, which still leaves some open issues, has been validated on the power infrastructure supporting one of the largest data centres in Italy, operated by a primary IT company that must guarantee 24/7 continuous operation of software applications that are mission-critical for its customers.

    Quality issues impacting production planning

    Among the various problems affecting production processes, the unpredictability of quality factors is one of the main issues concerning manufacturing enterprises. In make-to-order or perishable-goods production systems, the gap between expected and actual output quality increases product cost mainly in two ways: through the cost of extra production or rework due to non-compliant items, and through the costs arising from inefficient planning and the need for unscheduled machine changeovers. While the former are relatively easy to compute, even ex ante, the latter are much harder to estimate because they depend on several planning variables such as lot size, sequencing and delivery due dates. This paper addresses this problem in a make-to-order, multi-product, customised production system, where the enterprise diversifies each production lot because each order is based on customer-specific requirements and is unique (for example, in the packaging or textile and apparel industries). In these contexts, a rule of thumb that overestimates the input size may cause high costs, because all the excess production generates little or no revenue while adding to overall waste. Conversely, underestimating the lot size may force the launch of a new, typically very small, production order, so that a single product bears the changeover cost twice. With small markups, these extra costs can reduce profit to zero. The aim of this paper is to provide a critical analysis of the state of the art in the literature and to introduce some elements that can support the definition of lot-sizing policies that account for poor-quality costs.
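The trade-off the abstract describes, waste from oversized lots versus double changeovers from undersized ones, can be made concrete with a small Monte Carlo sketch. All figures (yield probability, unit cost, changeover cost) are illustrative assumptions, not values from the paper.

```python
import random

def expected_lot_cost(lot_size, demand, p_good, unit_cost, changeover_cost,
                      n_sims=20000, seed=42):
    """Monte Carlo estimate of the expected extra cost of a lot size when
    each unit is compliant with probability p_good. A shortfall triggers a
    second, small production order (paying the changeover cost again);
    excess compliant units are treated as waste."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        good = sum(rng.random() < p_good for _ in range(lot_size))
        if good < demand:
            # relaunch a small order for the missing units
            shortfall = demand - good
            total += changeover_cost + shortfall * unit_cost / p_good
        else:
            total += (good - demand) * unit_cost  # excess production is wasted
    return total / n_sims

# compare a tight lot, a moderately padded one, and a heavily padded one
for lot in (100, 108, 115):
    cost = expected_lot_cost(lot, demand=100, p_good=0.95,
                             unit_cost=10.0, changeover_cost=500.0)
    print(f"lot {lot}: expected extra cost ~ {cost:.1f}")
```

Even this toy model shows the U-shaped cost curve: the best lot size sits between the two failure modes, and its position depends on exactly the planning variables the abstract lists.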

    Estimating projects duration in uncertain environments: Monte Carlo simulations strike back

    PERT (Program Evaluation and Review Technique), developed in the 1950s, was the first attempt to incorporate uncertainty into project scheduling. Despite some weaknesses, it is still widely used in project management, largely thanks to the simplicity of its algorithm for operating on activity network diagrams. The increasing complexity of today's projects requires new techniques, yet the growing availability of computing power has not brought project simulation into common usage as expected. Although several reviews assert that the simulative approach has already superseded PERT in uncertain environments, the reason it has not spread is the belief that simulations require too much computing time. In this paper we show, through an algorithm and experimental results, that the computational time, historically the major drawback of Monte Carlo simulations, is now minimal thanks to the computing power available today. We present the results of an efficient program of a few lines of code that computes the completion time of an activity network diagram with 100,000 activities and about 50,000,000 precedence constraints between them.
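The kind of simulation the abstract advocates fits in a few lines: sample a duration for every activity (here from a triangular distribution, a common stand-in for PERT's three-point estimates) and propagate earliest finish times through the precedence graph in topological order. The activity data below are illustrative, not taken from the paper.

```python
import random

def simulate_makespan(activities, preds, n_runs=1000, seed=1):
    """Monte Carlo distribution of project completion time.
    activities: {name: (optimistic, most_likely, pessimistic)} durations;
    preds: {name: [predecessor names]}. Activity names are assumed to be
    listed in topological order (every predecessor before its successors)."""
    rng = random.Random(seed)
    makespans = []
    order = list(activities)
    for _ in range(n_runs):
        finish = {}
        for a in order:
            lo, mode, hi = activities[a]
            start = max((finish[p] for p in preds.get(a, [])), default=0.0)
            finish[a] = start + rng.triangular(lo, hi, mode)
        makespans.append(max(finish.values()))
    makespans.sort()
    return makespans

# tiny example network: A -> B, A -> C, {B, C} -> D
acts = {"A": (1, 2, 4), "B": (2, 3, 7), "C": (1, 5, 6), "D": (1, 1, 2)}
preds = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
ms = simulate_makespan(acts, preds, n_runs=5000)
print(f"median ~ {ms[len(ms) // 2]:.1f}, 95th percentile ~ {ms[int(0.95 * len(ms))]:.1f}")
```

Each run is a single linear pass over the activities, which is why, as the paper argues, even networks with tens of millions of precedence constraints are tractable on modern hardware.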

    The Effect of Slot-Code Optimization on Travel Times in Common Unit-Load Warehouses

    The main aim of this paper is to estimate the reduction in material handling times in a one-block unit-load warehouse organised with an optimal slot-code allocation rather than with a uniform distribution of pick/store locations, comparing single- and dual-command cycles from a travel-distance perspective. Results are obtained through multiple what-if analyses based on randomly generated scenarios with variable input/output positions and warehouse shapes. The simulations quantify the travel-time reductions, a result of great importance for manufacturing, distribution and retailing companies that aim both to design their warehouses and to determine the right type and number of transportation resources. Because of the warehouse management systems (WMS) currently in use, companies find little use for the existing literature, which relies on a uniform pick/store distribution: this paper appears to be the first to provide a precise estimate of material handling times when fast-moving items are more or less effectively placed near the warehouse entrances.
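As an illustration of the kind of what-if comparison the abstract describes, the sketch below contrasts average single-command travel under a random (uniform-style) slot assignment and a turnover-based one. The warehouse size, the demand skew and the corner I/O point are all assumptions made for the example.

```python
import random

def avg_travel(demand_weights, assignment, n_picks=20000, seed=7):
    """Average single-command travel (I/O -> slot -> I/O, rectilinear),
    with the I/O point at corner (0, 0).
    assignment: list mapping item index -> (row, col) slot."""
    rng = random.Random(seed)
    items = rng.choices(range(len(demand_weights)), weights=demand_weights, k=n_picks)
    return sum(2 * (assignment[i][0] + assignment[i][1]) for i in items) / n_picks

rows, cols = 10, 10
# item 0 is the fast mover (80% of picks); the other 99 items share the rest
weights = [80.0] + [20.0 / 99] * 99
slots = sorted(((r, c) for r in range(rows) for c in range(cols)),
               key=lambda rc: rc[0] + rc[1])   # nearest slots first

rng = random.Random(3)
uniform = rng.sample(slots, len(slots))   # items scattered at random
optimized = slots                         # fast movers nearest the I/O
print("uniform slotting  :", avg_travel(weights, uniform))
print("optimised slotting:", avg_travel(weights, optimized))
```

Repeating this over random I/O positions and warehouse shapes is essentially the multi-scenario analysis the paper performs, extended there to dual-command cycles as well.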

    Proposal of a project-risk evaluation criterion for the certification (asseverazione) procedure of project-financing initiatives

    Risk evaluation for the purpose of granting financing to enterprises is a fundamental and always delicate step, and it becomes even more critical for initiatives that use project financing. This instrument, particularly suited to supporting engineering companies in the construction of public works, provides that the granting of financing does not depend primarily on the reliability and creditworthiness of the project's promoters, nor on the value and consistency of the assets made available to the lenders, but on the verified capacity of the initiative to repay the loan granted for it: the financing is covered by the promoter's obligation to allocate part of the cash flow generated by the project to the gradual repayment of the debt, and the lenders accept the assured profitability of the initiative as a guarantee. In this sense, verifying the reliability of the results forecast by the promoter is decisive.
    To this end, in Italy a series of regulatory acts issued by the Autorità di Vigilanza sui Lavori Pubblici and the laws governing project financing require that the economic-financial plan submitted for a "construction and operation" tender for a public-utility work be certified (asseverato) by a credit institution attesting to its validity. While stressing the importance of verifying the risk profile and the economic-financial balance of the operation, the legislation unfortunately does not define with due precision the methods and indicators of the certification procedure; the certifying institution is therefore often asked for an overall judgment, which forces it to take responsibility for approving or rejecting the project proposal while also evaluating technical or context-specific aspects that go well beyond the traditional competencies of a credit institution. This paper reports the results of a study conducted in collaboration with a leading national credit institution to propose a project-risk evaluation methodology that overcomes the evident limits of scenario analysis (typically average, best and worst) by jointly using the Monte Carlo method and sensitivity analysis. The proposed methodology, specifically designed to overcome the critical points of the certification procedure, has been validated on a privately financed public-works project.
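The step from three fixed scenarios to a Monte Carlo view can be sketched as follows: instead of computing one NPV for the average, best and worst case, the uncertain inputs are sampled and a whole NPV distribution is obtained, from which a probability of loss can be read directly. All parameters below (investment, discount rate, revenue and cost distributions) are illustrative assumptions.

```python
import random
import statistics

def npv(cash_flows, rate):
    """Net present value of a sequence of cash flows, one per year."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_npv(n_years=15, invest=100.0, rate=0.06, n_sims=10000, seed=0):
    """Distribution of project NPV with uncertain yearly revenue and
    operating cost, instead of three fixed scenarios."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_sims):
        revenue = rng.gauss(15.0, 2.0)   # assumed yearly revenue (mean, sd)
        opex = rng.gauss(4.0, 0.8)       # assumed yearly operating cost
        flows = [-invest] + [revenue - opex] * n_years
        results.append(npv(flows, rate))
    return results

res = sorted(simulate_npv())
p_loss = sum(1 for v in res if v < 0) / len(res)
print(f"median NPV ~ {statistics.median(res):.1f}, P(NPV < 0) ~ {p_loss:.1%}")
```

A one-at-a-time sensitivity analysis, rerunning the simulation while perturbing each input in turn, then ranks which assumptions drive the loss probability, which is the combination the abstract proposes for the certification procedure.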

    Supply chain network design for the diffusion of a new product

    Supply Chain Network Design (SCND) deals with determining the physical configuration and infrastructure of the supply chain. Facility location is one of the most critical decisions: transportation, inventory and information-sharing decisions can readily be re-optimised in response to changes in context, whereas facility location is often fixed and difficult to change even in the medium term. Moreover, when designing a supply network to support a new product diffusion (NPD), the problem becomes both dynamic and stochastic. While the literature has approached SCND for NPD by coping with the dynamic and stochastic issues separately, we propose an integrated optimisation model that allows warehouse-positioning decisions to be made in concert with the demand dynamics during the diffusion stage of an innovative product or service. A stochastic dynamic model is presented that integrates a Stochastic Bass Model (SBM) to better describe and capture demand dynamics. A myopic policy is elaborated to solve the model and validate it on data from a real SCND case with 1,400 potential market points and 28 alternative logistics platforms.
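The demand side of the model rests on the Bass diffusion equation, in which per-period adoptions are driven by an innovation coefficient p and an imitation coefficient q. The sketch below is a plain discrete-time Bass path with optional Gaussian noise as a crude stand-in for the paper's Stochastic Bass Model; the parameter values are illustrative.

```python
import random

def bass_demand_path(m, p, q, n_periods, noise_sd=0.0, seed=0):
    """Discrete-time Bass diffusion: new adopters per period.
    m: market potential, p: innovation coeff., q: imitation coeff.
    noise_sd > 0 perturbs each period's adoptions multiplicatively."""
    rng = random.Random(seed)
    cumulative, path = 0.0, []
    for _ in range(n_periods):
        adopters = (p + q * cumulative / m) * (m - cumulative)
        adopters = max(0.0, adopters * (1 + rng.gauss(0.0, noise_sd)))
        adopters = min(adopters, m - cumulative)   # never exceed the market
        cumulative += adopters
        path.append(adopters)
    return path

path = bass_demand_path(m=100000, p=0.03, q=0.38, n_periods=20, noise_sd=0.1)
peak = max(range(len(path)), key=path.__getitem__)
print(f"peak demand in period {peak}: {path[peak]:.0f} units")
```

The characteristic rise-peak-decline of this path is exactly why warehouse positioning cannot be decided against a single static demand figure: capacity that is right at the peak is wrong both early and late in the diffusion.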

    Opportunities for using RFID in the aircraft production process

    This paper presents the results of an economic evaluation of adopting RFID technology in a European company that manufactures the fuselage of a new long-range, mid-size, wide-body jet airliner made of carbon-fibre-reinforced polymer. The peculiar constraints on the management of some raw materials, the Time And Temperature Sensitive (TATS) materials, pushed the company to consider introducing RFID tags even though the material supplier provided no support; the problems encountered in managing TATS materials were thus the most important impulse for introducing the RFID system. Further analyses are presented on the opportunity to extend the RFID application to non-TATS materials. Several scenarios are evaluated, weighing the investment in hardware and tag costs against the advantages in terms of time savings in material handling processes and cost savings from reduced inventory misalignment. In most scenarios, the RFID introduction proved profitable.
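The scenario comparison the abstract mentions boils down to a payback calculation: recurring tag costs are netted against handling-time and inventory savings, and the fixed hardware investment is divided by the result. All figures below are hypothetical and are not taken from the paper's scenarios.

```python
def rfid_payback(hardware_cost, tag_cost, tags_per_year,
                 handling_hours_saved_per_year, labour_rate,
                 inventory_writeoff_saved_per_year=0.0):
    """Years needed to pay back an RFID investment from handling-time
    and inventory-misalignment savings (all inputs are assumptions)."""
    yearly_saving = (handling_hours_saved_per_year * labour_rate
                     + inventory_writeoff_saved_per_year
                     - tags_per_year * tag_cost)
    if yearly_saving <= 0:
        return float("inf")   # the scenario never pays back
    return hardware_cost / yearly_saving

years = rfid_payback(hardware_cost=80000, tag_cost=0.5, tags_per_year=20000,
                     handling_hours_saved_per_year=1200, labour_rate=35.0,
                     inventory_writeoff_saved_per_year=15000)
print(f"payback ~ {years:.1f} years")
```

Sweeping tag cost and savings over plausible ranges turns this into the multi-scenario profitability screen described in the abstract.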

    Storage Location Assignment Problem: implementation in a warehouse design optimization tool

    This paper focuses on possible improvements to common warehouse storage management practices, taking its cue from the Storage Location Assignment Problem (SLAP) of Operations Research, with the aim of reaching an efficient and organised allocation of products to warehouse slots. The implementation of a SLAP approach in a tool able to model multiple storage policies is discussed, aiming both to reduce the overall warehouse space required to allocate produced goods efficiently and to minimise internal material handling times. We show how some limits of the modules of existing warehouse information management systems can be overcome, sketching the design of a software tool able to return an organised slot-product allocation. The results of validating a prototype on an industrial case are presented, showing the efficiency gains of the proposed approach when a dedicated-slot storage policy is adopted.
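For the dedicated storage policy the abstract singles out, the core of a SLAP solution can be sketched as a greedy pairing: the highest-turnover item gets the slot closest to the I/O point, the next item the next-closest slot, and so on (optimal for this single-criterion case by the rearrangement inequality). The item and slot data below are invented for illustration; the paper's actual tool handles multiple policies.

```python
def dedicated_slot_assignment(item_turnover, slot_distance):
    """Greedy SLAP heuristic for a dedicated storage policy.
    item_turnover: {item: picks per period}; slot_distance: {slot: distance
    from the I/O point}. Returns ({item: slot}, total expected travel)."""
    items = sorted(item_turnover, key=item_turnover.get, reverse=True)
    slots = sorted(slot_distance, key=slot_distance.get)
    assignment = dict(zip(items, slots))
    travel = sum(item_turnover[i] * slot_distance[assignment[i]]
                 for i in assignment)
    return assignment, travel

turnover = {"A": 120, "B": 45, "C": 10}
distances = {"S1": 3.0, "S2": 7.5, "S3": 12.0}
plan, travel = dedicated_slot_assignment(turnover, distances)
print(plan, travel)
```

Extending this beyond one aisle, and across shared, class-based and dedicated policies at once, is where the optimisation tool described in the paper goes beyond the greedy sketch.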