
    Increasing Sustainability of Logistic Networks by Reducing Product Losses: A Network DEA Approach

    This paper considers a multiproduct supply network in which losses (e.g., spoilage of perishable products) can occur at either the nodes or the arcs. Using observed data, a Network Data Envelopment Analysis (NDEA) approach is proposed to assess the efficiency of the product flows in different periods. Losses occur in each process whenever the observed output flows are lower than the observed input flows. The proposed NDEA model computes, within the NDEA technology, input and output targets for each process. The target operating points correspond to the minimum losses attainable with the best observed practice. The efficiency scores are computed by comparing the observed losses with the minimum feasible losses. In addition to relative efficiency scores, an overall loss factor can be determined for each product and each node and link, both for the observed data and for the computed targets. A detailed illustration and an experimental design are used to study and validate the proposed approach. The results indicate that the approach can identify and remove the inefficiencies in the observed data and that the potential spoilage reduction increases with the variability of the losses observed across periods.
    Funding: Ministerio de Ciencia DPI2017-85343-P; Fondo Europeo de Desarrollo Regional DPI2017-85343-
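
    As a point of reference for how such efficiency scores are obtained, the sketch below solves a plain input-oriented CCR DEA program with scipy. It is a minimal single-stage illustration, not the paper's network model: the node/arc structure, loss variables and targets are omitted, and the function name dea_ccr_input and the toy data are invented for the example.

```python
# Minimal input-oriented DEA (CCR) sketch with scipy.optimize.linprog.
# Single-stage DEA only; the paper's Network DEA adds node/arc structure
# and loss targets that are omitted here.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns efficiency scores."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # decision vector z = [theta, lambda_1..lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                                   # minimise theta
        A_ub = np.zeros((m + s, n + 1))
        b_ub = np.zeros(m + s)
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_ub[:m, 0] = -X[o]
        A_ub[:m, 1:] = X.T
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_ub[m:, 1:] = -Y.T
        b_ub[m:] = -Y[o]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        scores[o] = res.x[0]
    return scores

# toy example: 4 DMUs, 2 inputs, 1 output each
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [6.0, 6.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print(dea_ccr_input(X, Y))
```

    A network DEA model additionally links the processes, so that the computed targets respect flow conservation between nodes and arcs; that linking is what allows loss targets to be read off per product and per link.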

    Developing a short-term comparative optimization forecasting model for operational units’ strategic planning

    Scarce data on peer units operating in the same sector is a major factor that prevents policy makers from developing sound strategic plans for their organisations. This study introduces a hybrid model that combines a purely deterministic method, Data Envelopment Analysis (DEA), with a semi-parametric technique, Artificial Neural Networks (ANNs), to provide a strategic planning tool for efficiency optimization that remains applicable under short-term lags in data availability. For consecutive time instances t and t+1, the developed DEANN model returns optimal "regression-type" input and output levels for every sample operational unit, even a fully efficient one, that decides to alter the levels of its efficiency determinants, while respecting the t-time efficiency frontier.
    Keywords: Forecasting, Optimization, Efficiency, Data Envelopment Analysis (DEA), Artificial Neural Networks (ANN), Adaptive Techniques
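
    A rough sketch of the DEA-then-ANN idea is given below, reusing the dea_ccr_input helper from the earlier DEA sketch: period-t efficiency scores become training targets for an MLP, which can then score hypothetical t+1 operating points against the period-t frontier. This is only one interpretation of the general pipeline; the paper's DEANN model and its "regression-type" targets are more specific, and all data here are synthetic.

```python
# DEA + ANN pipeline sketch: DEA scores on period-t data train an MLP that
# scores candidate t+1 input/output mixes against the t-time frontier.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X_t = rng.uniform(1, 10, size=(50, 3))      # observed inputs, period t
Y_t = rng.uniform(1, 10, size=(50, 2))      # observed outputs, period t

theta_t = dea_ccr_input(X_t, Y_t)           # DEA scores from the dea_ccr_input sketch above

features = np.hstack([X_t, Y_t])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=5000, random_state=0))
model.fit(features, theta_t)

# score a candidate t+1 operating point against the period-t frontier
candidate = np.hstack([X_t[0] * 0.9, Y_t[0] * 1.1]).reshape(1, -1)
print(model.predict(candidate))
```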

    An agent-based dynamic information network for supply chain management

    One of the main research issues in supply chain management is improving the global efficiency of supply chains. However, improvement efforts often fail because supply chains are complex, subject to frequent changes, and collaboration and information sharing within them are often infeasible. This paper presents a practical collaboration framework for supply chain management in which multi-agent systems form dynamic information networks and coordinate their production and order planning according to synchronized estimates of market demand. In the framework, agents employ an iterative relaxation contract net protocol to find the most desirable suppliers by using data envelopment analysis. Furthermore, the chain of buyers and suppliers, from the end markets to raw material suppliers, forms dynamic information networks for synchronized planning. The paper presents this agent-based dynamic information network and discusses its pros and cons.
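
    The sketch below illustrates one round of a contract-net style award under simplifying assumptions: the buyer collects bids and awards the task to the highest-scoring supplier. The paper scores suppliers with DEA inside an iterative relaxation protocol; here a plain output-to-input ratio stands in for the DEA score, and the Bid fields are invented for the example.

```python
# Toy contract-net round: the buyer broadcasts a call for proposals,
# collects bids, and awards the task to the highest-scoring supplier.
# A simple output/input ratio stands in for the DEA efficiency score.
from dataclasses import dataclass

@dataclass
class Bid:
    supplier: str
    cost: float        # input
    lead_time: float   # input
    capacity: float    # output

def efficiency(bid: Bid) -> float:
    # stand-in for a DEA score: single output over summed inputs
    return bid.capacity / (bid.cost + bid.lead_time)

def contract_net_round(bids: list[Bid]) -> Bid:
    # award the contract to the most efficient bidder
    return max(bids, key=efficiency)

bids = [Bid("S1", cost=100, lead_time=5, capacity=80),
        Bid("S2", cost=120, lead_time=3, capacity=90),
        Bid("S3", cost=90, lead_time=7, capacity=70)]
print(contract_net_round(bids).supplier)
```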

    A Methodology for Assessing Eco-efficiency in Logistics Networks

    Recent literature on sustainable logistics networks points to two important questions: (i) How can the preferred solution(s) balancing environmental and business concerns be identified? (ii) How can the understanding of the trade-offs between these two dimensions be improved? We posit that a complete exploration of the efficient frontier and of the trade-offs between profitability and environmental impact is particularly well suited to answering these questions. To deal with the exponential number of basic efficient points on the frontier, we propose a formulation whose running time is exponential only in the number of objective functions. We illustrate our findings by designing a complex recycling logistics network in Germany.
    Keywords: Eco-efficiency; Environmental impacts; Profitability; Recycling logistics network
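
    To make the frontier idea concrete, the toy sketch below filters a set of pre-computed network designs down to the non-dominated (Pareto-efficient) profit/impact pairs. The paper works with the basic efficient points of an optimization model rather than a finite list of candidates, so this is only an illustration of the trade-off exploration, with invented numbers.

```python
# Keep only the Pareto-efficient (profit, impact) pairs: maximise profit,
# minimise environmental impact.
import numpy as np

def pareto_front(points):
    """points: (n, 2) array of (profit, impact); returns indices of non-dominated rows."""
    keep = []
    for i, (p_i, e_i) in enumerate(points):
        dominated = any(p_j >= p_i and e_j <= e_i and (p_j > p_i or e_j < e_i)
                        for j, (p_j, e_j) in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return np.array(keep)

designs = np.array([[100.0, 50.0], [120.0, 70.0], [90.0, 30.0], [110.0, 80.0]])
print(designs[pareto_front(designs)])   # the (110, 80) design is dominated and dropped
```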

    CAN FISCAL POLICY EXPLAIN TECHNICAL INEFFICIENCY OF PRIVATISED FIRMS? A PARAMETRIC AND NONPARAMETRIC APPROACH

    The economic literature's strong interest in privatisation has given notable impulse to the discussion of firm performance before and after privatisation. In almost every case, profits increase after privatisation. Does this mean that privatisation necessarily increases efficiency? A large part of the literature leaves out the complex problem that public firms are usually subject to objectives and constraints that, unlike those of private firms, can affect overall economic efficiency. Many authors also ignore the effects of taxation during the privatisation process, yet in real terms there are significant tax issues that must be considered by public and private decision makers. In this paper we concentrate on efficiency measures, with the purpose of identifying and measuring sources of successful performance that can be used in policy planning and resource allocation. Several techniques for estimating frontier functions have been used, some parametric and others non-parametric, to investigate empirically the relationship between the taxation of firms' income and efficiency before and after privatisation. We use both econometric and mathematical programming approaches to measure efficiency. The econometric tool provides maximum likelihood estimates of stochastic production and cost functions in order to distinguish noise from inefficiency. The mathematical programming approaches, by contrast, are non-stochastic and make no strict assumptions about the functional form of production or the statistical properties of the data. The results obtained from the three different tools (Stochastic Frontier, Data Envelopment Analysis and Neural Network) are consistent: privatisation enhanced efficiency in three of the four sample firms.
    Keywords: Privatization, Fiscal policy, Data Envelopment Analysis, Stochastic Frontier, Neural Network
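
    As a simplified illustration of the parametric side of this comparison, the sketch below estimates a log-linear production frontier by corrected OLS (COLS) on synthetic data and derives technical efficiency scores from it. COLS is a deterministic stand-in for the maximum likelihood stochastic frontier actually used in the paper, which separates noise from inefficiency via a composed error term; the data and coefficients here are invented.

```python
# Corrected OLS (COLS) frontier on synthetic Cobb-Douglas data: fit by OLS,
# shift the intercept so the function envelops the data, then read off
# technical efficiency as exp(observed - frontier).
import numpy as np

rng = np.random.default_rng(1)
n = 200
logK = rng.normal(2.0, 0.5, n)
logL = rng.normal(1.5, 0.4, n)
u = rng.exponential(0.2, n)                                  # inefficiency (>= 0)
logY = 0.5 + 0.4 * logK + 0.6 * logL - u + rng.normal(0, 0.05, n)

Xmat = np.column_stack([np.ones(n), logK, logL])
beta, *_ = np.linalg.lstsq(Xmat, logY, rcond=None)           # OLS estimates

resid = logY - Xmat @ beta
beta_cols = beta.copy()
beta_cols[0] += resid.max()                                  # shift intercept upward

te = np.exp(logY - Xmat @ beta_cols)                         # technical efficiency in (0, 1]
print(te.mean(), te.min(), te.max())
```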

    Sensitivity analysis of energy inputs in crop production using artificial neural networks

    Sensitivity analysis establishes priorities for research and makes it possible to identify and rank the most important factors driving improvements in output. The aim of this study is to carry out a sensitivity analysis of inputs in grape production. We propose performing the sensitivity analysis using the partial rank correlation coefficient (PRCC), regarded as one of the most reliable and efficient methods, and apply it for the first time to crop production. The research investigates energy use in vineyards of a semi-arid zone of Iran. Energy use efficiency, energy productivity, specific energy and net energy were calculated. Various artificial neural network (ANN) models were developed to predict grape yield from the input energies. The ANN models consist of a multilayer perceptron (MLP) with seven neurons in the input layer, one or two hidden layers with different numbers of neurons, and an output layer with one neuron. The input energies were labor, machinery, chemicals, farmyard manure (FYM), diesel, electricity and water for irrigation. Sensitivity analysis was performed on over 100 samples of the parameter space generated by Latin hypercube sampling, which were then fed to the ANN model to predict the yield for each sample. The PRCC between the predicted yield and each parameter value (input) was used to quantify the sensitivity of the model to that input. The results show that machinery had the greatest impact on grape yield, followed by diesel fuel and labor.
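
    A hedged sketch of this workflow is shown below: a Latin hypercube sample of the seven input energies is scored by a trained ANN, and PRCC values are computed between each input and the predicted yield. The training data, bounds and network size are placeholders, and the prcc helper is written for the example rather than taken from the paper.

```python
# LHS sampling -> ANN prediction -> partial rank correlation coefficients.
import numpy as np
from scipy.stats import qmc, rankdata
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
inputs = ["labor", "machinery", "chemicals", "FYM", "diesel", "electricity", "water"]

# stand-in training data and ANN (7 inputs -> yield)
X_train = rng.uniform(0, 1, size=(300, 7))
y_train = X_train @ np.array([0.1, 0.9, 0.2, 0.1, 0.6, 0.2, 0.3]) + rng.normal(0, 0.05, 300)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_train, y_train)

# Latin hypercube sample of the parameter space (unit cube here)
sample = qmc.LatinHypercube(d=7, seed=0).random(n=100)
y_pred = ann.predict(sample)

def prcc(X, y):
    """Partial rank correlation of each column of X with y."""
    R = np.column_stack([rankdata(col) for col in X.T])
    ry = rankdata(y)
    out = []
    for k in range(X.shape[1]):
        others = np.column_stack([np.ones(len(y)), np.delete(R, k, axis=1)])
        rx_res = R[:, k] - others @ np.linalg.lstsq(others, R[:, k], rcond=None)[0]
        ry_res = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out.append(np.corrcoef(rx_res, ry_res)[0, 1])
    return np.array(out)

for name, value in zip(inputs, prcc(sample, y_pred)):
    print(f"{name}: {value:+.2f}")
```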

    Hybrid SOM+k-Means Clustering to Improve Planning, Operation and Management in Water Distribution Systems

    With the advance of new technologies and the emergence of the smart city concept, there has been a dramatic increase in available information. Water distribution systems (WDSs), whose databases can be updated every few minutes, are no exception. Suitable techniques to evaluate the available information and produce optimized responses are necessary for planning, operation, and management; they can help identify critical characteristics such as leakage patterns and pipes to be replaced, among other features. This paper presents a clustering method based on self-organizing maps coupled with the k-means algorithm to obtain groups that can be easily labeled and used for WDS decision-making. Three case studies are presented: a classification of Brazilian cities in terms of their water utilities; district metered area creation to improve pressure control; and transient pressure signal analysis to identify burst pipes. In all three cases, the hybrid technique produces excellent results.
    © 2018 Elsevier Ltd. All rights reserved. This work is partially supported by Capes and CNPq, Brazilian research agencies. The use of English was revised by John Rawlins.
    Brentan, B. M.; Meirelles, G.; Luvizotto, E.; Izquierdo Sebastián, J. (2018). Hybrid SOM+k-Means Clustering to Improve Planning, Operation and Management in Water Distribution Systems. Environmental Modelling & Software, 106, 77-88. https://doi.org/10.1016/j.envsoft.2018.02.013
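
    The sketch below shows the core SOM + k-means idea under simplifying assumptions: a small self-organizing map is trained on synthetic feature vectors, its codebook vectors are clustered with k-means, and each sample inherits the cluster label of its best matching unit. Map size, decay schedules and data are invented placeholders; the paper's case studies use real utility indicators, network data and pressure signals.

```python
# Tiny online SOM in numpy, followed by k-means on the SOM codebook.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 6))                   # e.g. normalised WDS indicators

# --- tiny online SOM -------------------------------------------------------
grid = np.array([(i, j) for i in range(8) for j in range(8)])    # 8x8 map
codebook = rng.normal(size=(64, data.shape[1]))

n_iter, sigma0, lr0 = 3000, 3.0, 0.5
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(((codebook - x) ** 2).sum(axis=1))           # best matching unit
    sigma = sigma0 * np.exp(-t / n_iter)                         # shrinking neighbourhood
    lr = lr0 * np.exp(-t / n_iter)                               # decaying learning rate
    dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))                        # neighbourhood function
    codebook += lr * h[:, None] * (x - codebook)

# --- k-means on the codebook, labels propagated back to the samples --------
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(codebook)
bmus = np.argmin(((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2), axis=1)
labels = km.labels_[bmus]                                        # cluster label per sample
print(np.bincount(labels))
```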

    Neural Network Based Models for Efficiency Frontier Analysis: An Application to East Asian Economies' Growth Decomposition

    There is a long tradition in business and economics of using frontier analysis to assess a production unit's performance. One strand of work uses data envelopment analysis (DEA), which is based on a piecewise-linear, mathematical programming approach, while the other employs parametric methods to estimate stochastic frontier functions. Both approaches have their advantages as well as their limitations. This paper uses an alternative approach, artificial neural networks (ANNs), to measure efficiency and productivity growth for seven East Asian economies at the manufacturing level over the period 1963 to 1998. Comparisons are carried out between DEA and ANN, and between stochastic frontier analysis (SFA) and ANN, in order to test the ANNs' ability to assess the performance of production units. The results suggest that ANNs are a promising alternative to traditional approaches for approximating production functions more accurately and for measuring efficiency and productivity in non-linear contexts, with minimal assumptions.
    Keywords: total factor productivity, neural networks, stochastic frontier analysis, DEA, East Asian economies
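
    One way to picture the ANN-based frontier idea is sketched below: an MLP approximates a non-linear (log) production relationship and is then shifted upward, COLS-style, so it envelops the data, with efficiency read off as the ratio of observed to frontier output. This is an illustrative stand-in on synthetic data, not the estimation procedure used in the paper.

```python
# ANN as a flexible production-function approximator, shifted to a frontier.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 300
logX = rng.normal(0.0, 0.5, size=(n, 2))                     # log capital, log labour
u = rng.exponential(0.15, n)                                 # inefficiency
logY = (0.3 + 0.4 * logX[:, 0] + 0.6 * logX[:, 1]
        + 0.1 * logX[:, 0] * logX[:, 1] - u)                 # non-linear technology

ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=8000,
                   random_state=0).fit(logX, logY)

fitted = ann.predict(logX)
shift = (logY - fitted).max()                                # push the surface to the frontier
efficiency = np.exp(logY - (fitted + shift))                 # observed / frontier output
print(efficiency.mean(), efficiency.min())
```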