
    INNOVATIONS AND SUSTAINABLE DEVELOPMENT: NEOCLASSICAL VERSUS EVOLUTIONARY APPROACH

    In the last 20 years, the concept of ‘Sustainable Development’ (SD) has become very popular and widespread throughout the world. In particular, the environmental dimension of SD calls for new ways to achieve an enhanced quality of life with reduced environmental impact. As a consequence, innovations that contribute to sustainable pathways through improved environmental quality (the so-called ‘Sustainable Innovations’, SIs) are attracting growing interest. The present study aims to contribute to the debate about innovation and SD by analysing SIs from, respectively, the neoclassical and the evolutionary perspective. Whereas neoclassical theorists focus on the ‘double externality problem’ of SIs, on the one hand, and on the factors that influence their implementation, on the other, the evolutionary approach mainly analyses radical technological changes, thus stressing the need to consider additional aspects (in particular social and institutional ones) in the analysis of SIs.
    Keywords: Innovations, Sustainable Development, Neoclassical Theory, Evolutionary Approach

    When Should Nintendo Launch its Wii? Insights From a Bivariate Successive Generation Model

    November 2006 most likely marks the launch of Sony’s PS3, the successor to the PS2. Later, Nintendo is expected to launch the Wii, the successor to the GameCube. We answer the question in the title by analyzing the diffusion of the earlier generations of these consoles, and by using a new model that extends the successive-generations model of Norton and Bass (1987) by introducing two market players. Based on interviews with consumers and with retailers, we calibrate part of this model. The main outcome is that the optimal launch time is around June 2007, as total sales of Nintendo’s GameCube and Wii would then be maximized.
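
    As a rough illustration of the Norton–Bass successive-generations structure referenced above, the sketch below simulates a single-player, two-generation version and grid-searches the second generation's launch delay. It does not reproduce the paper's bivariate (two-player) extension or its calibrated parameters; the Bass coefficients, market potentials and horizon are illustrative assumptions only.

    import numpy as np

    def bass_cdf(t, p, q):
        """Cumulative Bass adoption fraction F(t); zero before launch (t < 0)."""
        t = np.maximum(t, 0.0)
        return (1.0 - np.exp(-(p + q) * t)) / (1.0 + (q / p) * np.exp(-(p + q) * t))

    def norton_bass_sales(t, tau2, p, q, m1, m2):
        """Norton-Bass (1987) two-generation sales; generation 2 launches at tau2."""
        F1 = bass_cdf(t, p, q)
        F2 = bass_cdf(t - tau2, p, q)
        s1 = m1 * F1 * (1.0 - F2)      # gen-1 sales eroded by gen-2 substitution
        s2 = F2 * (m2 + m1 * F1)       # gen-2 sales include switchers from gen 1
        return s1, s2

    # Illustrative parameters (not the calibrated values from the paper).
    p, q, m1, m2 = 0.03, 0.38, 20.0, 30.0     # million units
    horizon = np.linspace(0, 10, 121)         # years
    dt = horizon[1] - horizon[0]

    def total_sales(tau2):
        s1, s2 = norton_bass_sales(horizon, tau2, p, q, m1, m2)
        return float((s1 + s2).sum() * dt)

    best_tau = max(np.arange(0.5, 5.0, 0.25), key=total_sales)
    print(f"illustrative optimal launch delay: {best_tau:.2f} years")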

    Holistic, data-driven, service and supply chain optimisation: linked optimisation.

    The intensity of competition and technological advancements in the business environment has made companies collaborate and cooperate as a means of survival. This creates a chain of companies and business components with unified business objectives. However, managing the decision-making process (such as scheduling, ordering, delivering and allocating) across the various business components while maintaining a holistic objective is a major business challenge, as these operations are complex and dynamic. This is because the overall chain of business processes is widely distributed across all the supply chain participants; therefore, no individual collaborator has a complete overview of the processes. Increasingly, such decisions are automated and are strongly supported by optimisation algorithms - manufacturing optimisation, B2B ordering, financial trading, transportation scheduling and allocation. However, most of these algorithms do not incorporate the complexity associated with interacting decision-making systems like supply chains. It is well known that decisions made at one point in a supply chain can have significant consequences that ripple through linked production and transportation systems. Recently, global shocks to supply chains (COVID-19, climate change, the blockage of the Suez Canal) have demonstrated the importance of these interdependencies and the need to create supply chains that are more resilient and have a significantly reduced impact on the environment. Such interacting decision-making systems need to be considered through an optimisation process. However, the interactions between such decision-making systems are not modelled. We therefore believe that modelling such interactions is an opportunity to provide computational extensions to current optimisation paradigms.

    This research study aims to develop a general framework for formulating and solving holistic, data-driven optimisation problems in service and supply chains. The research achieved this aim and contributes to scholarship by, firstly, considering the complexities of supply chain problems from a linked-problem perspective. This leads to a formalism for characterising linked optimisation problems as a model for supply chains. Secondly, the research adopts a method for creating a linked optimisation problem benchmark by linking existing classical benchmark sets. This involves using a mix of classical optimisation problems, typically relating to supply chain decision problems, to describe different modes of linkage in linked optimisation problems. Thirdly, several techniques for linking fragmented supply chain data have been proposed in the literature to identify data relationships; this thesis explores some of these techniques and combines them in specific ways to improve the data discovery process. Lastly, many state-of-the-art algorithms have been explored in the literature and used to tackle supply chain problems. This research therefore investigates resilient, state-of-the-art optimisation algorithms presented in the literature, and then designs suitable algorithmic approaches, inspired by the existing algorithms and the nature of the problem linkages, to address different problem linkages in supply chains.

    Considering the research findings and future perspectives, the study demonstrates the suitability of the algorithms for different linked structures involving two sub-problems, which suggests further investigation of issues such as the suitability of algorithms for more complex structures, benchmark methodologies, holistic goals and evaluation, process mining, game theory and dependency analysis.
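
    The notion of a linked optimisation problem can be illustrated with a toy example in which an upstream order-acceptance (knapsack-style) decision determines the input of a downstream delivery-cost problem. The instance below is entirely hypothetical and not taken from the thesis or its benchmark sets; it only shows how optimising the upstream problem in isolation can differ from optimising the linked whole.

    from itertools import combinations

    # Toy linked problem: order acceptance feeds a delivery-cost stage (illustrative data).
    orders = {          # order: (profit, production hours, delivery region)
        "A": (30, 4, "north"), "B": (28, 3, "north"),
        "C": (25, 3, "south"), "D": (18, 2, "south"), "E": (12, 1, "west"),
    }
    capacity = 7
    truck_cost = {"north": 15, "south": 9, "west": 20}   # one truck per region used

    def upstream_profit(subset):
        return sum(orders[o][0] for o in subset)

    def downstream_cost(subset):
        return sum(truck_cost[r] for r in {orders[o][2] for o in subset})

    def feasible(subset):
        return sum(orders[o][1] for o in subset) <= capacity

    subsets = [s for n in range(len(orders) + 1)
                 for s in combinations(orders, n) if feasible(s)]

    # Sequential: optimise the upstream problem alone, then accept the resulting delivery cost.
    seq = max(subsets, key=upstream_profit)
    # Linked: optimise both stages jointly.
    joint = max(subsets, key=lambda s: upstream_profit(s) - downstream_cost(s))

    print("sequential:", seq, upstream_profit(seq) - downstream_cost(seq))
    print("linked:    ", joint, upstream_profit(joint) - downstream_cost(joint))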

    The Launch Timing of New and Dominant Multigeneration Technologies

    In this paper we introduce a model that is suitable for studying the diffusion of new and dominant multi-generation technologies. Examples are computer operating systems, mobile phone standards, and video game consoles. Our model incorporates three new features that are not included in related models. First, we add the ability of a firm to transfer users of its old technologies to the new generations, which we call the firm’s alpha. Second, we add competitive relations between market technologies. Third, the launch strategies diagnosed by our model cover the now-or-never strategies as special cases, and hence the model is suitable for studying intermediate launch strategies. We find that the appropriate timing of a new technology depends heavily both on the firms’ alphas and on the competitive positioning of their products. In addition, we argue that the strategic interaction of firms may lead to very different sales outcomes depending on the competitive positioning of their products. In the video game console case we find that the Nintendo Wii was launched at an appropriate moment, while the Sony PS3 perhaps should never have been launched.
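
    The role of the firm's alpha and of competitive positioning can be conveyed with a toy discrete-time simulation. The sketch below is not the paper's model: the attraction weights, transfer rates (alpha), adopter pool and horizon are illustrative assumptions used only to show how a launch period might be grid-searched under competition.

    import numpy as np

    def simulate(launch, alpha, attract_old, attract_new, horizon=24, adopters=100.0):
        """Toy two-firm, two-generation diffusion: each period a fixed pool of new
        adopters splits across available products in proportion to attraction, and a
        fraction alpha[i] of firm i's old installed base migrates once its new
        generation is on the market."""
        n_firms = len(launch)
        old = np.zeros(n_firms)     # installed base of the old generations
        new = np.zeros(n_firms)     # installed base of the new generations
        for t in range(horizon):
            live_new = np.array([t >= launch[i] for i in range(n_firms)], dtype=float)
            attr = np.concatenate([attract_old, attract_new * live_new])
            share = attr / attr.sum()
            old += adopters * share[:n_firms]
            new += adopters * share[n_firms:]
            moved = alpha * live_new * old      # upgrades from a firm's own old generation
            old -= moved
            new += moved
        return new                              # cumulative new-generation units per firm

    # Illustrative parameters only (firm 0 ~ "incumbent", firm 1 ~ "challenger").
    alpha       = np.array([0.20, 0.05])
    attract_old = np.array([1.0, 0.8])
    attract_new = np.array([1.6, 2.0])
    rival_launch = 6

    best = max(range(0, 18),
               key=lambda T: simulate((T, rival_launch), alpha, attract_old, attract_new)[0])
    print("illustrative best launch period for firm 0:", best)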

    A multi-objective evolutionary approach to simulation-based optimisation of real-world problems.

    This thesis presents a novel evolutionary optimisation algorithm that can improve the quality of solutions in simulation-based optimisation. Simulation-based optimisation is the process of finding optimal parameter settings without explicitly examining each possible configuration of settings. An optimisation algorithm generates potential configurations and sends these to the simulation, which acts as an evaluation function. The evaluation results are used to refine the optimisation such that it eventually returns a high-quality solution. The algorithm described in this thesis integrates multi-objective optimisation, parallelism, surrogate usage, and noise handling in a unique way to deal with simulation-based optimisation problems that exhibit these characteristics. In order to handle multiple, conflicting optimisation objectives, the algorithm uses a Pareto approach in which the set of best trade-off solutions is searched for and presented to the user. The algorithm supports a high degree of parallelism by adopting an asynchronous master-slave parallelisation model in combination with an incremental population refinement strategy. A surrogate evaluation function is adopted in the algorithm to quickly identify promising candidate solutions and filter out poor ones. A novel technique based on inheritance is used to compensate for the uncertainties associated with the approximate surrogate evaluations. Furthermore, a novel technique for multi-objective problems that effectively reduces noise by adopting a dynamic resampling procedure is used to tackle the problem of real-world unpredictability (noise). The proposed algorithm is evaluated on benchmark problems and two complex real-world manufacturing optimisation problems. The first real-world problem concerns the optimisation of a production cell at Volvo Aero, while the second concerns the optimisation of a camshaft machining line at Volvo Cars Engine. The results from the optimisations show that the algorithm finds better solutions for all the problems considered than existing, similar algorithms. The new techniques for dealing with surrogate imprecision and noise are identified as key reasons for the good performance.
    Funding: University of Skövde; Knowledge Foundation, Sweden.
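
    The general pattern of combining Pareto ranking with a surrogate pre-filter can be sketched as below. This is a minimal, generic illustration, not the thesis's algorithm: it omits the asynchronous master-slave parallelism, the inheritance-based compensation and the dynamic resampling, and the toy objective function, nearest-neighbour surrogate, mutation scheme and budgets are all assumptions.

    import random

    def simulate(x):
        """Stand-in for an expensive, noisy simulation returning two objectives
        (both to be minimised); a toy analytic function plus noise is used here."""
        f1 = x[0]
        f2 = 1.0 - x[0] ** 0.5 + sum(xi ** 2 for xi in x[1:])
        return f1 + random.gauss(0, 0.01), f2 + random.gauss(0, 0.01)

    def surrogate(x, archive):
        """Crude nearest-neighbour surrogate built from already simulated points."""
        if not archive:
            return (0.0, 0.0)
        nearest = min(archive, key=lambda rec: sum((a - b) ** 2 for a, b in zip(rec[0], x)))
        return nearest[1]

    def dominates(a, b):
        return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

    def pareto_front(records):
        return [r for r in records if not any(dominates(o[1], r[1]) for o in records)]

    dim, pop_size, generations = 5, 20, 30
    archive = []    # (solution, simulated objectives)
    population = [[random.random() for _ in range(dim)] for _ in range(pop_size)]

    for _ in range(generations):
        # Generate candidates by mutation, pre-rank them with the surrogate,
        # and spend the expensive simulation budget only on the most promising half.
        candidates = [[min(1.0, max(0.0, xi + random.gauss(0, 0.1))) for xi in p] for p in population]
        candidates.sort(key=lambda c: sum(surrogate(c, archive)))
        for c in candidates[: pop_size // 2]:
            archive.append((c, simulate(c)))
        population = [rec[0] for rec in pareto_front(archive)][:pop_size]
        while len(population) < pop_size:
            population.append([random.random() for _ in range(dim)])

    print(f"{len(pareto_front(archive))} non-dominated solutions after {len(archive)} simulations")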

    Application of Shallow Neural Networks to Retail Intermittent Demand Time Series

    Accurate sales predictions are essential for businesses in the fast-moving consumer goods (FMCG) industry. However, their demand forecasts are often unreliable, leading to imprecisions that affect downstream decisions. This dissertation proposes using an artificial neural network to improve intermittent demand forecasting in the retail sector. The research investigates the validity of using unprocessed historical information, avoiding hand-crafted features, to learn patterns in intermittent demand data. The experiment tests a selection of shallow neural network architectures that can expedite the time-to-market in comparison with conventional demand forecasting methods. The results demonstrate that organisations that still rely on manual and direct forecasting methods could improve their prediction accuracy and establish a high-performing baseline for future development. The solution also offers an end-to-end, systematic forecasting landscape enabling a lift-and-shift and an easy transition from design to deployment. A practical implementation should bring about stable and reliable forecasts, resulting in cost savings, improved customer service, and increased profitability. Lastly, the research findings contribute to the broader academic field of forecasting and ML with a seminal proposal that provides insights and opportunities for future research.
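
    The core idea of feeding unprocessed lagged demand into a shallow network can be sketched as follows. This uses scikit-learn's MLPRegressor as a stand-in for whichever shallow architectures the dissertation actually tested; the synthetic intermittent demand series, lag window and network size are illustrative assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Synthetic intermittent demand: mostly zeros with occasional positive orders.
    periods = 300
    demand = rng.poisson(4, periods) * (rng.random(periods) < 0.2)

    # Raw lagged values as inputs -- no hand-crafted features.
    lags = 12
    X = np.array([demand[t - lags:t] for t in range(lags, periods)])
    y = demand[lags:]

    split = int(0.8 * len(X))
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])

    pred = np.clip(model.predict(X[split:]), 0, None)    # demand cannot be negative
    mae = np.abs(pred - y[split:]).mean()
    print(f"hold-out MAE: {mae:.3f}  (naive all-zero MAE: {y[split:].mean():.3f})")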

    Multi-objective optimisation of aircraft flight trajectories in the ATM and avionics context

    The continuous increase in air transport demand worldwide and the push for more economically viable and environmentally sustainable aviation are driving significant evolutions in aircraft, airspace and airport systems design and operations. Although extensive research has been performed on the optimisation of aircraft trajectories and very efficient algorithms have been widely adopted for the optimisation of vertical flight profiles, it is only in the last few years that higher levels of automation have been proposed for the integrated flight planning and re-routing functionalities of innovative Communication, Navigation and Surveillance/Air Traffic Management (CNS/ATM) and Avionics (CNS+A) systems. In this context, the implementation of additional environmental targets and of multiple operational constraints introduces the need to deal efficiently with multiple objectives as part of the trajectory optimisation algorithm. This article provides a comprehensive review of Multi-Objective Trajectory Optimisation (MOTO) techniques for transport aircraft flight operations, with a special focus on the recent advances introduced in the CNS+A research context. In the first section, a brief introduction is given, together with an overview of the main international research initiatives in which this topic has been studied, and the problem statement is provided. The second section introduces the mathematical formulation and the third section reviews the numerical solution techniques, including discretisation and optimisation methods for the specific problem formulated. The fourth section summarises the strategies for articulating preferences and selecting optimal trajectories when multiple conflicting objectives are introduced. The fifth section introduces a number of models defining the optimality criteria and constraints typically adopted in MOTO studies, including fuel consumption, air pollutant and noise emissions, operational costs, condensation trails, and airspace and airport operations.
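
    One generic form of the preference-articulation step discussed in the review can be sketched as below: extract the non-dominated (Pareto-optimal) candidate trajectories and then pick one by a weighted distance to the ideal point. The candidate objective values, criteria and weights are synthetic placeholders, not results or methods from the article.

    import numpy as np

    rng = np.random.default_rng(1)

    # Candidate trajectories scored on three conflicting criteria (all minimised):
    # fuel burn [kg], flight time [min], and a noise-impact index.  Values are synthetic.
    candidates = rng.random((200, 3)) * np.array([1200.0, 90.0, 65.0])

    def non_dominated(points):
        keep = []
        for i, p in enumerate(points):
            dominated = any(np.all(q <= p) and np.any(q < p) for j, q in enumerate(points) if j != i)
            if not dominated:
                keep.append(i)
        return np.array(keep)

    front = candidates[non_dominated(candidates)]

    # A posteriori preference articulation: normalise the front and choose the
    # trajectory closest to the ideal point under given criterion weights.
    weights = np.array([0.5, 0.3, 0.2])
    norm = (front - front.min(0)) / (front.max(0) - front.min(0) + 1e-12)
    chosen = front[np.argmin((norm * weights).sum(axis=1))]
    print(f"{len(front)} Pareto-optimal trajectories; chosen (fuel, time, noise) = {chosen.round(1)}")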

    Adaptive algorithms for history matching and uncertainty quantification

    Numerical reservoir simulation models are the basis for many decisions regarding the prediction, optimisation, and improvement of the production performance of oil and gas reservoirs. History matching is required to calibrate models to the dynamic behaviour of the reservoir, owing to the uncertainty in model parameters. Finally, a set of history-matched models is used for reservoir performance prediction and for the economic and risk assessment of different development scenarios. Various algorithms are employed to search and sample the parameter space in history matching and uncertainty quantification problems. The choice of algorithm and its implementation, set through a number of control parameters, have a significant impact on the effectiveness and efficiency of the algorithm and thus on the quality of results and the speed of the process. This thesis is concerned with the investigation, development, and implementation of improved and adaptive algorithms for reservoir history matching and uncertainty quantification problems. A set of evolutionary algorithms is considered and applied to history matching. The shared characteristic of the applied algorithms is adaptation by balancing exploration and exploitation of the search space, which can lead to improved convergence and diversity. This includes the use of estimation of distribution algorithms, which implicitly adapt their search mechanism to the characteristics of the problem. Hybridising them with genetic algorithms, multi-objective sorting algorithms, and real-coded, multi-model and multivariate Gaussian-based models can help these algorithms adapt even further and improve their performance. Finally, diversity measures are used to develop an explicit, adaptive algorithm and to control the algorithm’s performance based on the structure of the problem. Uncertainty quantification in a Bayesian framework can be carried out by resampling the search space using Markov chain Monte Carlo sampling algorithms. Common critiques of these algorithms are their low efficiency and their need for control-parameter tuning. A Metropolis-Hastings sampling algorithm with an adaptive multivariate Gaussian proposal distribution and a K-nearest-neighbour approximation has been developed and applied.
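
    The adaptive Metropolis-Hastings idea mentioned at the end can be sketched in a generic form. The snippet below uses a Haario-style covariance adaptation on a toy correlated 2-D Gaussian target in place of the thesis's reservoir misfit and K-nearest-neighbour approximation; the target, step counts and adaptation settings are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(2)

    def log_posterior(theta):
        """Toy stand-in for a history-matching posterior: a correlated 2-D Gaussian
        misfit instead of a reservoir-simulator mismatch."""
        mean = np.array([1.0, -0.5])
        cov = np.array([[1.0, 0.8], [0.8, 1.5]])
        d = theta - mean
        return -0.5 * d @ np.linalg.solve(cov, d)

    dim, n_steps, adapt_start = 2, 5000, 500
    samples = np.zeros((n_steps, dim))
    theta = np.zeros(dim)
    logp = log_posterior(theta)
    prop_cov = 0.1 * np.eye(dim)
    accepted = 0

    for i in range(n_steps):
        # Haario-style adaptive Metropolis: fit the multivariate Gaussian proposal
        # covariance to the chain history accumulated so far.
        if i > adapt_start:
            prop_cov = np.cov(samples[:i].T) * 2.38**2 / dim + 1e-6 * np.eye(dim)
        proposal = rng.multivariate_normal(theta, prop_cov)
        logp_new = log_posterior(proposal)
        if np.log(rng.random()) < logp_new - logp:
            theta, logp = proposal, logp_new
            accepted += 1
        samples[i] = theta

    print(f"acceptance rate: {accepted / n_steps:.2f}")
    print("posterior mean estimate:", samples[n_steps // 2:].mean(axis=0).round(2))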

    Green Technologies for Production Processes

    This book focuses on original research on Green Technologies for Production Processes, covering both discrete and process production, from various aspects that tackle product, process, and system issues in production. The aim is to report the state of the art on relevant research topics and to highlight the barriers, challenges, and opportunities we are facing. The book includes 22 research papers and covers energy saving and waste reduction in production processes, the design and manufacturing of green products, low-carbon manufacturing and remanufacturing, management and policy for sustainable production, technologies for mitigating CO2 emissions, and other green technologies.