2,818 research outputs found

    Food supply chain network robustness: a literature review and research agenda

    Today’s business environment is characterized by strong global competition, pushing companies to pursue leanness and maximum responsiveness. However, lean supply chain networks (SCNs) become more vulnerable to all kinds of disruptions. Food SCNs therefore have to become robust, i.e. they should be able to continue to function in the event of a disruption as well as in the normal business environment. The current literature provides no explicit clarification of the robustness concept in the food SCN context. This paper explores the meaning of SCN robustness and highlights further research directions.

    Integrated system to perform surrogate based aerodynamic optimisation for high-lift airfoil

    This work deals with the aerodynamic optimisation of a generic two-dimensional three-element high-lift configuration. Although the high-lift system is deployed only during take-off and landing in the low-speed phase of flight, it strongly influences the cost efficiency of the airplane [1]. The ultimate goal of an aircraft high-lift system design team is to define the simplest configuration which, for prescribed constraints, will meet the take-off, climb, and landing requirements, usually expressed in terms of maximum L/D and/or maximum CL. The ability of the calculation method to accurately predict changes in objective function value when gaps, overlaps and element deflections are varied is therefore critical. Despite advances in computer capacity, the enormous computational cost of running complex engineering simulations makes it impractical to rely exclusively on simulation for design optimisation. To cut down the cost, surrogate models, also known as metamodels, are constructed from and then used in place of the actual simulation models. This work outlines the development of an integrated system to perform aerodynamic multi-objective optimisation for a three-element airfoil test case in high-lift configuration, making use of the surrogate models available in MACROS Generic Tools, which has been integrated in our design tool. Different metamodelling techniques have been compared based on multiple performance criteria. With MACROS it is possible to perform either optimisation of a model built from a predefined training sample (GSO) or Iterative Surrogate-Based Optimization (SBO). In the first case the model is built independently of the optimisation and then used as a black box in the optimisation process. In the second case the CFD code must be callable from the optimisation process; no model needs to be built beforehand, since it is constructed internally during the optimisation. Both approaches have been applied. A detailed analysis of the integrated design system, the methods as well as the…
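
    MACROS Generic Tools is a proprietary toolkit, so the snippet below is only a minimal, generic sketch of the iterative SBO loop described above: a scikit-learn Gaussian process stands in for the MACROS surrogate, and a toy analytic function stands in for the expensive CFD run. All names and parameters are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of iterative surrogate-based optimisation (SBO).
# The expensive CFD evaluation is replaced by a cheap toy function;
# scikit-learn's Gaussian process stands in for the MACROS surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def cfd_objective(x):
    """Stand-in for a CFD run returning, e.g., negative L/D (hypothetical)."""
    return np.sin(3 * x) + 0.5 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(5, 1))              # initial training sample
y = cfd_objective(X).ravel()

for it in range(10):                             # SBO iterations
    gp = GaussianProcessRegressor(kernel=RBF()).fit(X, y)
    cand = np.linspace(-2, 2, 201).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmin(mu - 1.96 * sigma)]  # lower-confidence-bound pick
    X = np.vstack([X, x_next])                   # enrich the sample and refit
    y = np.append(y, cfd_objective(x_next)[0])

print("best x:", X[np.argmin(y)], "objective:", y.min())
```

    The GSO variant described in the abstract would instead fit the surrogate once on the predefined sample and optimise it as a fixed black box, with no further calls to the simulator.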

    Solution and quality robust project scheduling: a methodological framework.

    The vast majority of research efforts in project scheduling over the past several years have concentrated on the development of exact and suboptimal procedures for the generation of a baseline schedule, assuming complete information and a deterministic environment. During execution, however, projects may be subject to considerable uncertainty, which may lead to numerous schedule disruptions. Predictive-reactive scheduling refers to the process where a baseline schedule is developed prior to the start of the project and updated if necessary during project execution. The objective of this paper is to review possible procedures for the generation of proactive (robust) schedules, which are protected as well as possible against schedule disruptions, and for the deployment of reactive scheduling procedures that may be used to revise or re-optimize the baseline schedule when unexpected events occur. We also offer a methodological framework that should allow project management to identify the proper scheduling methodology for different project scheduling environments. Finally, we survey the basics of Critical Chain scheduling and indicate in which environments it is useful.
    Keywords: Framework; Information; Management; Processes; Project management; Project scheduling; Project scheduling under uncertainty; Robust scheduling; Quality; Scheduling; Stability; Uncertainty
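
    As a minimal illustration of one proactive technique surveyed in this line of work, the sketch below pads a deterministic serial baseline with time buffers sized by each activity's duration variability. The activities, durations and the buffer rule are all hypothetical assumptions, not the paper's own procedure.

```python
# Minimal sketch of proactive (robust) scheduling via time buffers:
# a deterministic serial baseline is padded with slack proportional
# to each activity's duration variability. Activities are hypothetical.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    duration: int      # deterministic estimate
    std_dev: float     # uncertainty of the duration

def buffered_schedule(activities, k=1.0):
    """Serial baseline with a buffer of k * std_dev after each activity."""
    schedule, t = [], 0
    for a in activities:
        start = t
        t += a.duration                   # planned work
        t += round(k * a.std_dev)         # protective buffer absorbs disruptions
        schedule.append((a.name, start, start + a.duration))
    return schedule

acts = [Activity("design", 5, 2.0), Activity("build", 8, 3.0), Activity("test", 3, 1.0)]
for name, start, end in buffered_schedule(acts):
    print(f"{name}: start={start}, end={end}")
```

    A reactive procedure would then repair or re-optimize this baseline whenever a disruption consumes more than its buffer.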

    Investigating biocomplexity through the agent-based paradigm.

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment and the flexible, heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches, which assume component homogeneity to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy when simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems are a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines, or agents, to simulate macroscopic properties of a system from the bottom up. In recognizing the heterogeneity condition, they offer suitable ontologies for the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of an agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although still in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to modelling biological systems that tend to be non-linear and complex.
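
    A minimal sketch of the bottom-up dynamic described above is given below, assuming a toy population of heterogeneous "cell" agents with illustrative division and death rates; it does not reproduce any specific framework from the article, only the general paradigm.

```python
# Minimal agent-based sketch: heterogeneous "cell" agents, each an
# independent state machine; the macroscopic population trajectory
# emerges bottom-up from local divide/die rules. Rates are illustrative.
import random

class Cell:
    def __init__(self, divide_p, die_p):
        self.divide_p = divide_p    # heterogeneous per-agent division rate
        self.die_p = die_p          # per-tick death probability

    def step(self):
        """Return the agents replacing this one after one tick."""
        if random.random() < self.die_p:
            return []                                        # agent dies
        if random.random() < self.divide_p:
            return [self, Cell(self.divide_p, self.die_p)]   # division
        return [self]                                        # survives unchanged

random.seed(1)
population = [Cell(random.uniform(0.1, 0.3), 0.1) for _ in range(50)]
for tick in range(20):
    population = [child for cell in population for child in cell.step()]
    print(f"tick {tick}: {len(population)} cells")
```

    Note that the population-level growth curve is never written down as an equation; it emerges from the heterogeneous per-agent rules, which is precisely the property the abstract contrasts with continuum models.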

    Offshoring effectiveness: Measurement and improvement with optimization approach.

    This study takes a refreshing look at IT outsourcing from a vendor's perspective and discusses the best practices required to effectively manage offshore business needs and offshoring effectiveness. We conducted a detailed investigation to learn why outsourcing ventures fail, how to measure service provider capability, and how to deliver strategic value to the end customer. The extant literature does not address the vendor's issues and problems in outsourcing, and our investigation emphasized the vendor's perspective on offshoring strategy and offshore resource effectiveness as the two important differentiators in a make-or-buy decision. Measurement metrics for each of the two items were devised to estimate their effect on offshoring effectiveness. We spoke to some of the top 10 IT vendors in India, collected offshoring data from both clients and vendors, and used the data to validate our decision framework. The framework helps us to investigate current industry practices in IT outsourcing, identify issues and problems beyond the obvious advantages of outsourcing, and propose measures to assess offshoring effectiveness. The investigation gave us an opportunity to record the best IT practices as well as to suggest possible improvements in the service or product delivery cycle to enhance customer experience.

    A scalable multi-core architecture with heterogeneous memory structures for Dynamic Neuromorphic Asynchronous Processors (DYNAPs)

    Neuromorphic computing systems comprise networks of neurons that use asynchronous events for both computation and communication. This type of representation offers several advantages in terms of bandwidth and power consumption in neuromorphic electronic systems. However, managing the traffic of asynchronous events in large-scale systems is a daunting task, both in terms of circuit complexity and memory requirements. Here we present a novel routing methodology that employs both hierarchical and mesh routing strategies and combines heterogeneous memory structures to minimize both memory requirements and latency, while maximizing programming flexibility to support a wide range of event-based neural network architectures through parameter configuration. We validated the proposed scheme in a prototype multi-core neuromorphic processor chip that employs hybrid analog/digital circuits for emulating synapse and neuron dynamics, together with asynchronous digital circuits for managing the address-event traffic. We present a theoretical analysis of the proposed connectivity scheme, describe the methods and circuits used to implement such a scheme, and characterize the prototype chip. Finally, we demonstrate the use of the neuromorphic processor with a convolutional neural network for the real-time classification of visual symbols flashed to a dynamic vision sensor (DVS) at high speed.
    Comment: 17 pages, 14 figures
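
    The chip's circuits cannot be reproduced here, but the sketch below is a toy software model of the two-level idea described above: a hierarchical header selects the destination core, and a small per-core lookup table (standing in for the heterogeneous on-core memory) fans the event out to local targets. The mesh hop itself is abstracted away, and all sizes and mappings are illustrative assumptions.

```python
# Toy model of two-level address-event routing: the event header picks
# the destination core (hierarchical stage), then a per-core table fans
# the event out to local neurons (standing in for on-core memory).
from collections import defaultdict

NUM_CORES = 4

# per-core connectivity tables: source tag -> list of target neuron ids
core_tables = [defaultdict(list) for _ in range(NUM_CORES)]
core_tables[2][7] = [0, 3, 5]          # example programmed connections
core_tables[2][9] = [1]

def route(event):
    """event = (core_id, tag): hop to the core, then local fan-out."""
    core_id, tag = event
    for neuron in core_tables[core_id][tag]:
        print(f"deliver spike tag={tag} -> core {core_id}, neuron {neuron}")

route((2, 7))   # fans out to neurons 0, 3, 5 on core 2
```

    Splitting the memory this way keeps the global routing fabric small while the fine-grained fan-out state lives next to the neurons that consume it, which is the trade-off the abstract highlights.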

    Automated synthesis of delay-insensitive circuits

    Deep-learning assisted reduced order model for high-dimensional flow prediction from sparse data

    The reconstruction and prediction of full-state flows from sparse data are of great scientific and engineering significance yet remain challenging, especially in applications where data are sparse and/or subject to noise. To this end, this study proposes a deep-learning assisted non-intrusive reduced order model (named DCDMD) for high-dimensional flow prediction from sparse data. Based on compressed sensing (CS) Dynamic Mode Decomposition (DMD), the DCDMD model is distinguished by two novelties. Firstly, a sparse matrix is defined to overcome the strict random-distribution condition on sensor locations in CS, thus allowing flexible sensor deployments and requiring very few sensors. Secondly, a deep-learning-based proxy is invoked to acquire coherent flow modes from the sparse data of high-dimensional flows, thereby addressing the issue of defining sparsity and the stringent incoherence condition in conventional CS-DMD. These two advantageous features, combined with the fact that the model retains the flow physics in the online stage, lead to significant enhancements in accuracy and efficiency, as well as superior insensitivity to data noise (i.e., robustness), in both reconstruction and prediction of full-state flows. These are demonstrated by three benchmark examples, i.e., cylinder wake, weekly-mean sea surface temperature and isotropic turbulence in a periodic square area.
    Comment: 36 pages, 23 figures, 5 tables
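
    As a sketch of the plain DMD step underlying such a model (the CS sampling and deep-learning proxy stages are omitted, and the snapshot data here are synthetic), one can fit a reduced linear operator from snapshot pairs and predict forward:

```python
# Exact DMD sketch: fit a linear operator A with X2 ~ A @ X1 from
# snapshot pairs via truncated SVD, then predict the next snapshot.
# Snapshot data are synthetic; rank and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 64, 50, 5                                  # state dim, snapshots, rank
X = np.cumsum(rng.standard_normal((n, m)), axis=1)   # synthetic "flow" snapshots
X1, X2 = X[:, :-1], X[:, 1:]

U, s, Vh = np.linalg.svd(X1, full_matrices=False)
Ur, Sr, Vr = U[:, :r], np.diag(s[:r]), Vh[:r].conj().T
Atilde = Ur.conj().T @ X2 @ Vr @ np.linalg.inv(Sr)   # reduced operator

eigvals, W = np.linalg.eig(Atilde)
Phi = X2 @ Vr @ np.linalg.inv(Sr) @ W                # DMD modes

# one-step prediction of the last snapshot from the previous one
b = np.linalg.lstsq(Phi, X[:, -2], rcond=None)[0]
x_pred = (Phi @ (eigvals * b)).real
print("relative prediction error:",
      np.linalg.norm(x_pred - X[:, -1]) / np.linalg.norm(X[:, -1]))
```

    In DCDMD as described in the abstract, the full snapshots would not be available; the modes are instead recovered from sparse sensor measurements, which is where the CS formulation and the deep-learning proxy enter.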

    Self-Organising Approaches to Coordination

    A fuzzy multi-criteria decision making approach for managing performance and risk in integrated procurement-production planning

    Nowadays in Supply Chain (SC) networks, a high level of risk comes from SC partners. An effective risk management process consequently becomes mandatory, especially at the tactical planning level. The aim of this article is to present a risk-oriented integrated procurement–production approach for tactical planning in a multi-echelon SC network involving multiple suppliers, multiple parallel manufacturing plants, multiple subcontractors and several customers. An original feature of the work is to combine an analytical model that builds feasible scenarios with a multi-criteria approach for assessing these scenarios. The literature has mainly addressed the problem through cost- or profit-based optimisation and seldom considers more qualitative yet important criteria linked to risk, such as trust in the supplier, flexibility or resilience. Unlike the traditional approaches, we present a method that evaluates each possible supply scenario through performance-based and risk-based decision criteria, involving both qualitative and quantitative factors, in order to clearly separate the performance of a scenario from the risk taken if it is adopted. Since the decision-maker often cannot provide crisp values for some critical data, fuzzy sets theory is suggested in order to model vague information based on subjective expertise. The fuzzy Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) is used to determine both the performance and risk measures associated with each possible tactical plan. The applicability and tractability of the proposed approach are shown on an illustrative example, and a sensitivity analysis is performed to investigate the influence of criteria weights on the selection of the procurement–production plan.
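
    A compact sketch of fuzzy TOPSIS with triangular fuzzy numbers is given below; the scenarios, criteria, ratings and weights are hypothetical, and all criteria are treated as benefit-type, which simplifies the full procedure used in the paper.

```python
# Compact fuzzy TOPSIS sketch with triangular fuzzy numbers (l, m, u):
# rank hypothetical supply scenarios by closeness to the fuzzy ideal.
# Ratings, weights and criteria are illustrative, all benefit-type.
import numpy as np

# rows = scenarios, cols = criteria; each cell a triangular rating in [0, 10]
ratings = np.array([
    [(7, 9, 10), (3, 5, 7)],    # scenario A: performance, risk mitigation
    [(5, 7, 9),  (5, 7, 9)],    # scenario B
], dtype=float)
weights = np.array([(0.5, 0.6, 0.7), (0.3, 0.4, 0.5)], dtype=float)

norm = ratings / 10.0           # linear scale normalisation
v = norm * weights              # weighted fuzzy matrix, broadcast over (l, m, u)

fpis = v.max(axis=0)            # fuzzy positive ideal per criterion
fnis = v.min(axis=0)            # fuzzy negative ideal per criterion

def fuzzy_dist(a, b):
    """Vertex distance between triangular fuzzy numbers."""
    return np.sqrt(((a - b) ** 2).mean(axis=-1))

d_pos = fuzzy_dist(v, fpis).sum(axis=1)
d_neg = fuzzy_dist(v, fnis).sum(axis=1)
closeness = d_neg / (d_pos + d_neg)   # higher = closer to the ideal
for name, cc in zip("AB", closeness):
    print(f"scenario {name}: closeness = {cc:.3f}")
```

    In the article's setting, two such closeness scores would be computed per scenario, one over performance criteria and one over risk criteria, so that the decision-maker can trade the two off explicitly rather than collapsing them into a single objective.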