21,323 research outputs found

    Forecasting and Forecast Combination in Airline Revenue Management Applications

    Predicting a variable for a future point in time helps in planning for unknown future situations and is common practice in many areas, such as economics, finance, manufacturing, weather and the natural sciences. This paper investigates and compares approaches to forecasting and forecast combination that can be applied to the service industry in general and to the airline industry in particular. Furthermore, possibilities for including additionally available data, such as passenger-based information, are discussed.
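Forecast combination in this setting can be as simple as weighting each model by its historical accuracy. A minimal sketch using Bates-Granger-style inverse-MSE weights (the weighting scheme and all numbers here are illustrative, not taken from the paper):

```python
import numpy as np

def combine_forecasts(forecasts, past_errors):
    """Combine model forecasts with weights inversely proportional to
    each model's historical mean squared error (Bates-Granger style)."""
    mse = np.array([np.mean(np.square(e)) for e in past_errors])
    weights = (1.0 / mse) / np.sum(1.0 / mse)  # normalize to sum to 1
    return float(np.dot(weights, forecasts)), weights

# Hypothetical booking forecasts from three models for one future flight,
# with each model's past forecast errors (all values made up).
forecasts = np.array([120.0, 130.0, 110.0])
past_errors = [np.array([5.0, -3.0]), np.array([10.0, 12.0]), np.array([2.0, -1.0])]
combined, w = combine_forecasts(forecasts, past_errors)
```

The combined forecast leans toward the historically most accurate model; more elaborate schemes (regression-based weights, shrinkage toward the equal-weight average) follow the same pattern.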

    Semiparametric Bayesian Time-Space Analysis of Unemployment Duration

    In this paper, we analyze unemployment duration in Germany using official data from the German Federal Employment Office for the years 1980-1995. Conventional hazard rate models for leaving unemployment cannot cope with the simultaneous and flexible fitting of duration dependence, nonlinear covariate effects, trend and seasonal calendar time components, and a large number of regional effects. We apply a semiparametric hierarchical Bayesian modelling approach that is suitable for time-space analysis of unemployment duration, simultaneously including and estimating the effects of several time scales, regional variation and further covariates. Inference is fully Bayesian and uses recent Markov chain Monte Carlo techniques.
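The fully Bayesian MCMC inference mentioned above can be illustrated at toy scale. A minimal sketch assuming a constant hazard rate for exponential durations with a conjugate Gamma prior (all data and hyperparameters below are made up, and the model is far simpler than the paper's hierarchical one); conjugacy lets the random-walk Metropolis sampler be checked against the known analytic posterior:

```python
import numpy as np

rng = np.random.default_rng(2)
durations = rng.exponential(scale=2.0, size=100)  # synthetic spells, true rate 0.5
a, b = 1.0, 1.0  # Gamma(a, b) prior on the hazard rate lambda

def log_post(lam):
    """Unnormalized log posterior: exponential likelihood + Gamma prior."""
    if lam <= 0:
        return -np.inf
    loglik = len(durations) * np.log(lam) - lam * durations.sum()
    logprior = (a - 1) * np.log(lam) - b * lam
    return loglik + logprior

lam, samples = 1.0, []
for _ in range(5000):
    prop = lam + 0.1 * rng.normal()  # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop  # accept
    samples.append(lam)
post_mean = np.mean(samples[1000:])  # discard burn-in
# Conjugacy gives the exact posterior Gamma(a + n, b + sum(t_i)),
# so post_mean should be close to (a + n) / (b + durations.sum()).
```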

    Optimal modelling and experimentation for the improved sustainability of microfluidic chemical technology design

    Optimization of the dynamics and control of chemical processes holds the promise of improved sustainability for chemical technology by minimizing resource wastage. Anecdotally, chemical plants may be substantially overdesigned, say by 35-50%, because designers account for uncertainties by providing greater flexibility. Once the plant is commissioned, process systems engineers can use techniques of nonlinear dynamics analysis to recoup some of this overdesign by optimizing plant operation through tighter control. At the design stage, coupling experimentation with data assimilation into the model, whilst using the partially informed, semi-empirical model to predict from parametric sensitivity studies which experiments to run, should optimally improve the model. This approach has been demonstrated for optimal experimentation, but only for a differential algebraic model of the process. Typically, such models for online monitoring have been limited to low dimensions. Recently it has been demonstrated that inverse methods such as data assimilation can be applied to PDE systems with algebraic constraints, a substantially more complicated parameter estimation problem using finite element multiphysics modelling. Parametric sensitivity from such semi-empirical models can be used to predict the optimal placement of sensors for collecting the data that best informs the model for a microfluidic sensor system. This coupled optimal modelling and experimentation procedure is ambitious in the scale of the modelling problem, as well as in the scale of the application - a microfluidic device. In general, microfluidic devices are sufficiently easy to fabricate, control, and monitor that they form an ideal platform for developing high-dimensional spatio-temporal models for simultaneous coupling with experimentation.
As chemical microreactors already promise low raw-materials wastage through tight control of reagent contacting, improved design techniques should be able to augment optimal control systems to achieve very low resource wastage. In this paper, we discuss how the paradigm for optimal modelling and experimentation should be developed, and foreshadow the exploitation of this methodology in the development of chemical microreactors and microfluidic sensors for online monitoring of chemical processes. Improvement in both of these areas promises to improve the sustainability of chemical processes through innovative technology. (C) 2008 The Institution of Chemical Engineers. Published by Elsevier B.V. All rights reserved.
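The idea of using parametric sensitivity to place sensors can be sketched on a toy 1-D steady diffusion-decay profile (the model, its parameters, and the closed-form solution are illustrative assumptions, not taken from the paper): the sensor goes where the measured quantity is most sensitive to the unknown parameter.

```python
import numpy as np

# Toy model: steady concentration profile c(x) = exp(-sqrt(k/D) * x)
# along a 1-D channel, with diffusivity D known and rate constant k unknown.
D, k = 1.0, 4.0  # illustrative values
x = np.linspace(0.0, 2.0, 201)

def c(x, k):
    return np.exp(-np.sqrt(k / D) * x)

# Parametric sensitivity dc/dk = -x / (2 * sqrt(k * D)) * c(x, k);
# the best single sensor location maximizes its magnitude.
sens = np.abs(-x / (2.0 * np.sqrt(k * D)) * c(x, k))
best_x = x[np.argmax(sens)]
# Analytically, |dc/dk| peaks at x = sqrt(D/k) = 0.5 for these values.
```

Real applications replace the closed-form derivative with sensitivities computed from a finite element model, but the placement criterion is the same.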

    Reservoir Computing Approach to Robust Computation using Unreliable Nanoscale Networks

    As we approach the physical limits of CMOS technology, advances in materials science and nanotechnology are making available a variety of unconventional computing substrates that can potentially replace top-down-designed silicon-based computing devices. Inherent stochasticity in the fabrication process and the nanometer scale of these substrates inevitably lead to design variations, defects, faults, and noise in the resulting devices. A key challenge is how to harness such devices to perform robust computation. We propose reservoir computing as a solution. In reservoir computing, computation takes place by translating the dynamics of an excited medium, called a reservoir, into a desired output. This approach eliminates the need for external control and redundancy, and the programming is done using a closed-form regression problem on the output, which also allows concurrent programming using a single device. Using a theoretical model, we show that both regular and irregular reservoirs are intrinsically robust to structural noise as they perform computation.
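The "closed-form regression problem on the output" that programs a reservoir can be sketched with a minimal echo state network (the reservoir size, spectral radius, and memory task below are illustrative choices, not from the paper): the reservoir weights are fixed and random, standing in for an uncontrolled physical substrate, and only a linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # reservoir size

# Fixed random reservoir: never trained, mimicking an unreliable substrate.
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 for fading memory
W_in = rng.normal(size=N)

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(N)
    states = []
    for t in range(len(u)):
        x = np.tanh(W @ x + W_in * u[t])
        states.append(x.copy())
    return np.array(states)

# Task: reproduce the input delayed by one step (a simple memory task).
u = rng.uniform(-1, 1, 300)
X = run_reservoir(u)
y = np.roll(u, 1)
y[0] = 0.0
# Closed-form ridge-regression readout: this is the "programming" step.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
pred = X @ W_out
```

Because only `W_out` depends on the task, several readouts can share one reservoir, which is the concurrent-programming property the abstract mentions.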

    Big Data and Reliability Applications: The Complexity Dimension

    Big data features not only large volumes of data but also data with complicated structures. Complexity imposes unique challenges in big data analytics. Meeker and Hong (2014, Quality Engineering, pp. 102-116) provided an extensive discussion of the opportunities and challenges in big data and reliability, and described engineering systems that can generate big data usable in reliability analysis. Meeker and Hong (2014) focused on large-scale system operating and environment data (i.e., high-frequency multivariate time series data), and provided examples of how to link such data as covariates to traditional reliability responses such as time to failure, time to recurrence of events, and degradation measurements. This paper extends that discussion by focusing on how to use data with complicated structures in reliability analysis. Such data types include high-dimensional sensor data, functional curve data, and image streams. We first review recent developments in those directions, and then discuss how analytical methods can be developed to tackle the challenging aspects that arise from the complexity of big data in reliability applications. The use of modern statistical methods such as variable selection, functional data analysis, scalar-on-image regression, spatio-temporal data models, and machine learning techniques will also be discussed.
    Comment: 28 pages, 7 figures
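Of the methods listed, variable selection is the easiest to sketch: a lasso-style penalty picks out the few sensor covariates that actually predict a reliability response. A minimal proximal-gradient (ISTA) implementation on synthetic data (all dimensions and values below are made up, and a real analysis would use a mature solver):

```python
import numpy as np

def lasso_ista(X, y, lam, steps=500):
    """Lasso via proximal gradient (ISTA): gradient step on the squared
    loss, then soft-thresholding, which zeroes out weak covariates."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(steps):
        grad = X.T @ (X @ beta - y) / n
        z = beta - grad / L
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return beta

# Synthetic example: 20 sensor covariates, only the first 3 matter.
rng = np.random.default_rng(1)
n, p = 200, 20
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:3] = [2.0, -1.5, 1.0]
y = X @ true_beta + 0.1 * rng.normal(size=n)
beta_hat = lasso_ista(X, y, lam=0.1)  # nonzeros identify the active sensors
```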

    Spatial clustering and nonlinearities in the location of multinational firms

    We propose a semiparametric geoadditive negative binomial model of industrial location which allows us to simultaneously address some important methodological issues, such as spatial clustering and nonlinearities, which have been only partly addressed in previous studies. We apply this model to analyze the location determinants of inward greenfield investments that occurred over the 2003-2007 period in 249 European regions. The inclusion of a geoadditive component (a smooth spatial trend surface) allows us to control for omitted variables which induce spatial clustering, and suggests that such unobserved factors may be related to regional policies towards foreign investors. Allowing for nonlinearities reveals, in line with theoretical predictions, that the positive effect of agglomeration economies fades as the density of economic activities reaches some limit value.
    Keywords: industrial location, negative binomial models, geoadditive models, European Union
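The choice of a negative binomial rather than Poisson model for investment counts reflects overdispersion, which the standard gamma-Poisson mixture construction makes concrete (the parameter values below are illustrative, not estimates from the paper):

```python
import numpy as np

# Regions share a mean investment count mu, but unobserved heterogeneity
# (gamma-distributed region effects) inflates the variance beyond mu.
rng = np.random.default_rng(3)
mu, alpha = 5.0, 0.5  # mean and overdispersion parameter (made-up values)

# Gamma-Poisson mixture: lam ~ Gamma(1/alpha, scale=mu*alpha), count ~ Poisson(lam)
lam = rng.gamma(shape=1.0 / alpha, scale=mu * alpha, size=100_000)
counts = rng.poisson(lam)
# Theory: E[counts] = mu = 5, Var[counts] = mu + alpha * mu**2 = 17.5,
# so the variance clearly exceeds the mean, unlike a pure Poisson model.
```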