10 research outputs found

    Forecast horizon aggregation in integer autoregressive moving average (INARMA) models

    This paper addresses forecast horizon aggregation in integer autoregressive moving average (INARMA) models. Although aggregation in continuous-valued time series has been widely discussed, the same is not true for integer-valued time series. It is shown that overlapping forecast horizon aggregation of an INARMA process results in an INARMA process. The conditional expected value of the aggregated process is also derived for use in forecasting. A simulation experiment assesses the accuracy of the forecasts produced by the aggregation method and compares it to the accuracy of cumulative h-step-ahead forecasts over the forecasting horizon. The results of an empirical analysis are also provided.
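The conditional expectation used above can be illustrated for the simplest member of the INARMA family, an INAR(1) process with Poisson innovations. The sketch below is a hedged illustration, not code from the paper: the function names and parameters are assumptions, and it simulates the process by binomial thinning and computes the cumulative lead-time forecast E[Y(t+1) + ... + Y(t+h) | Y(t)].

```python
import numpy as np

def inar1_simulate(alpha, lam, n, seed=0):
    """Simulate an INAR(1) process: Y_t = alpha . Y_{t-1} + e_t,
    where '.' is binomial thinning and e_t ~ Poisson(lam)."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n, dtype=int)
    y[0] = rng.poisson(lam / (1 - alpha))  # start near the stationary mean
    for t in range(1, n):
        y[t] = rng.binomial(y[t - 1], alpha) + rng.poisson(lam)
    return y

def inar1_lead_time_forecast(y_t, alpha, lam, h):
    """Conditional expectation of Y_{t+1} + ... + Y_{t+h} given Y_t,
    using E[Y_{t+k} | Y_t] = alpha**k * Y_t + lam * (1 - alpha**k) / (1 - alpha)."""
    ks = np.arange(1, h + 1)
    step = alpha**ks * y_t + lam * (1 - alpha**ks) / (1 - alpha)
    return float(step.sum())
```

With alpha = 0 the process has no memory and the lead-time forecast reduces to h times the innovation mean, which is a quick sanity check on the formula.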

    Intermittent demand forecasting with integer autoregressive moving average models

    April 2009 This PhD thesis focuses on using time series models for counts in modelling and forecasting a special type of count series called intermittent series. An intermittent series is a series of non-negative integer values with some zero values. Such series occur in many areas including inventory control of spare parts. Various methods have been developed for intermittent demand forecasting, with Croston’s method being the most widely used. Some studies focus on finding a model underlying Croston’s method. With none of these studies succeeding in demonstrating an underlying model for which Croston’s method is optimal, the focus should now shift towards stationary models for intermittent demand forecasting. This thesis explores the application of a class of models for count data called Integer Autoregressive Moving Average (INARMA) models. INARMA models have had applications in different areas such as medical science and economics, but this is the first attempt to use such a model-based method to forecast intermittent demand. In this PhD research, we first fill some gaps in the INARMA literature by finding the unconditional variance and the autocorrelation function of the general INARMA(p,q) model. The conditional expected value of the aggregated process over lead time is also obtained to be used as a lead time forecast. The accuracy of h-step-ahead and lead time INARMA forecasts is then compared to that of the benchmark methods of Croston, the Syntetos-Boylan Approximation (SBA), and Shale-Boylan-Johnston (SBJ). The results of the simulation suggest that in the presence of high autocorrelation in the data, INARMA yields much more accurate one-step-ahead forecasts than the benchmark methods. The degree of improvement increases for longer data histories. It is shown that, instead of identifying the autoregressive and moving average orders of the INARMA model, the most general model among the candidates can be used for forecasting.
This is especially useful for series with short histories and high autocorrelation. The findings of the thesis have been tested on two real data sets: (i) Royal Air Force (RAF) demand history of 16,000 SKUs and (ii) 3,000 series of intermittent demand from the automotive industry. The results show that for sparse data with long history, there is a substantial improvement in using INARMA over the benchmarks in terms of Mean Square Error (MSE) and Mean Absolute Scaled Error (MASE) for the one-step-ahead forecasts. However, for series with short history the improvement is narrower. The improvement is greater for h-step-ahead forecasts. The results also confirm the superiority of INARMA over the benchmark methods for lead time forecasts.
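For reference, the Croston and SBA benchmarks named above can be sketched in a few lines. This is a minimal implementation of the standard recursions (smooth the non-zero demand sizes and the inter-demand intervals separately, forecast their ratio); the initialisation on the first non-zero demand is a common convention and an assumption here, as studies differ on it.

```python
def croston(demand, alpha=0.1, variant="croston"):
    """Croston's intermittent-demand forecast, with the SBA variant.
    demand: sequence of non-negative integers. Returns the per-period forecast."""
    z = None  # smoothed non-zero demand size
    p = None  # smoothed inter-demand interval
    q = 1     # periods since the last non-zero demand
    for d in demand:
        if d > 0:
            if z is None:            # initialise on the first non-zero demand
                z, p = float(d), float(q)
            else:                    # update only in periods with demand
                z += alpha * (d - z)
                p += alpha * (q - p)
            q = 1
        else:
            q += 1
    if z is None:                    # all-zero history: forecast zero
        return 0.0
    f = z / p
    if variant == "sba":             # Syntetos-Boylan Approximation:
        f *= 1 - alpha / 2           # debias Croston's ratio estimate
    return f
```

The SBA correction factor (1 - alpha/2) is what distinguishes the two benchmarks; SBJ further adjusts the correction but is omitted here for brevity.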

    Reproducibility in forecasting research

    The importance of replication has been recognised across many scientific disciplines. Reproducibility is a necessary condition for replicability, because an inability to reproduce results implies that the methods have not been specified sufficiently, thus precluding replication. This paper describes how two independent teams of researchers attempted to reproduce the empirical findings of an important paper, “Shrinkage estimators of time series seasonal factors and their effect on forecasting accuracy” (Miller & Williams, 2003). The two teams proceeded systematically, reporting results both before and after receiving clarifications from the authors of the original study. The teams were able to approximately reproduce each other’s results, but not those of Miller and Williams. These discrepancies led to differences in the conclusions as to the conditions under which seasonal damping outperforms classical decomposition. The paper specifies the forecasting methods employed using a flowchart. It is argued that this approach to method documentation is complementary to the provision of computer code, as it is accessible to a broader audience of forecasting practitioners and researchers. The significance of this research lies not only in its lessons for seasonal forecasting but also, more generally, in its approach to the reproduction of forecasting research.
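The seasonal-damping idea at issue, shrinking multiplicative seasonal indexes toward 1 (no seasonality), can be illustrated generically. The exact shrinkage estimator of Miller and Williams is what the paper's flowchart specifies, so the simplified index calculation and the fixed damping weight below are hedged illustrations only.

```python
import numpy as np

def classical_indexes(series, period):
    """Multiplicative seasonal indexes via a simplified ratio-to-mean
    classical decomposition (series length must be a multiple of period)."""
    x = np.asarray(series, dtype=float).reshape(-1, period)
    idx = x.mean(axis=0) / x.mean()
    return idx * period / idx.sum()   # normalise so the indexes average to 1

def damp(indexes, weight):
    """Shrink indexes toward 1. weight=1 keeps them unchanged;
    weight=0 removes seasonality entirely."""
    return 1.0 + weight * (np.asarray(indexes) - 1.0)
```

Damping trades a little bias for lower variance in the estimated indexes, which is why it can outperform classical decomposition when seasonal estimates are noisy.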

    Damage Formation: Equations of water block in oil and water wells

    Water block, the invasion of water into reservoir pores, forms during water-based drilling, injection, perforation, and completion-fluid operations, as well as during certain in-reservoir processes such as fingering and coning. The resulting alteration in the shape or composition of fine water-wet particles such as clay in the flow path of the second phase can reduce reservoir permeability. Solvents such as surfactants (acting as demulsifiers) are therefore consumed during flowback to lower surface tension (a phenomenon associated with intermolecular forces, known as capillary action) and to avoid emulsions and sludge, mostly in the near-wellbore zone or within the treatment and injection radius of the reservoir. In addition to surging or swabbing the wells, using solvents as wettability-changing agents together with a base fluid is a common method of removing water block from the wellbore, especially in low-permeability porous media or in reservoirs whose average pressure has declined below the bubble point. For such treatments to be profitable, the trend of solvent behaviour across different lithologies must be known in order to estimate the percentage of damage removed. Previous investigations of this subject have been largely experimental, and no mathematical formulas have been presented for water-block damage in water, oil, and gas reservoirs. The formulas derived here determine selection criteria for the applied materials and improve treatment performance. An integrated set of procedures and guidelines for one or more phases in a porous medium is necessary for a step-by-step approach at the wellhead. Erroneous decisions and difficult situations can also be addressed in injection wells or saltwater disposal wells, where water block is a type of formation damage.
Misconceptions and difficult situations resulting from this damage can increase water saturation near the borehole and reduce fluid transmissibility both near and far from the wellbore, resulting in injection-rate loss at the wellhead. Accordingly, the water-block equations presented here require a set of variables, over a particular domain, defining relationships between rock- and fluid-based parameters. First, the structural classifications of fracture and grain in the layers (d1, d2, and d3) are defined. The equations of overburden pressure (Pob) for a definite sectional area surrounding the wellbore, for any lithology (in three categories relative to porosity), are then obtained from these structural classifications and other rock and fluid characteristics. Naturally, before the overburden-pressure equations for a definite layer or sectional area around the wellbore are derived, the overburden pressure at a point in a layer is expressed in the first four equations. Second, the estimated overburden-pressure equations are applied to derive the equations of removed water block (Bk). These equations are divided into two groups, equations for oil wells and equations for saltwater disposal wells, and each group is further classified by the wettability of the reservoir rock (oil-wet or water-wet) in two ranges of porosity.
Third, after these Bk equations are described, a further new variable in the removed-water-block equations, the acid expanding ability (Ik) for a definite oil layer around the wellbore, is presented. It is extracted from (1) the full reservoir characteristics (including experimental and empirical overburden-pressure equations), (2) the production history of the well, (3) core-flooding displacement experiments in the laboratory, and (4) the properties of the acidic and alkaline solvents. Finally, the rate of water-block formation (q) is calculated from the value obtained for the removed water block, and the trend of solvent use for different rocks is determined from these sets of equations. The acceptance criteria are the nature of the rock and fluid under reservoir conditions. The equations offer a quick and cost-efficient computational method for determining how much of the blocked fluid in the reservoir layers is removed from the definite strata around the wellbore after injecting acids and solvents of various acidities into the different lithologies during acidizing operations. Moreover, these equations can calculate the removed water block (Bk) after injecting solvents of different acidic properties, for two categories of porosity covering all lithologies, and they ascertain, under the current reservoir conditions, how much solvent for a given lithology should be mixed with other base fluids.

    The application of product-group seasonal indexes to individual products

    Forecasting seasonal products can be difficult when the products are fairly new or highly variable. The Spring 2007 issue of Foresight contained a special feature on modeling seasonality in short time series. The articles addressed minimum sample size requirements and surveyed the options for applying the seasonal patterns found in aggregates (e.g. product groups) and in analogous product data to the individual product at hand. Now, Maryam, John, and Aris show specifically how to form product-group seasonal indexes and explain how to determine when group indexes will be superior to individual indexes for forecasting the individual products. They also make the important point that there may be better ways to form product groups for seasonal forecasting than a company’s standard product groupings. Copyright International Institute of Forecasters, 201
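One simple way to form a product-group index, assumed here for illustration rather than taken from the article, is to average the individual products' multiplicative seasonal indexes so that each product contributes equally regardless of its sales volume:

```python
import numpy as np

def seasonal_indexes(series, period):
    """Multiplicative seasonal indexes for one product
    (ratio of each season's mean to the overall mean)."""
    x = np.asarray(series, dtype=float).reshape(-1, period)
    return x.mean(axis=0) / x.mean()

def group_indexes(series_list, period):
    """Product-group indexes: the unweighted average of the individual
    indexes, so a high-volume product does not dominate the group pattern."""
    return np.mean([seasonal_indexes(s, period) for s in series_list], axis=0)
```

The intuition for when the group index wins: individual indexes estimated from short, noisy histories have high variance, and averaging across similarly seasonal products reduces that variance at the cost of some bias.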

    Grocery omnichannel perishable inventories: performance measures and influencing factors

    Purpose: Perishable inventory management for the grocery sector has become more challenging with extended omnichannel activities and emerging consumer expectations. This paper aims to identify and formalize key performance measures of omnichannel perishable inventory management (OCPI) and explore the influence of operational and market-related factors on these measures. Design/methodology/approach: The inductive approach of this research synthesizes three performance measures (product waste, lost sales, and freshness), and four influencing factors (channel effect, demand variability, product perishability, and shelf life visibility) for OCPI, through industry investigation, expert interviews, and a systematic literature review. Treating OCPI as a complex adaptive system and considering its transaction costs, this paper formalizes the OCPI performance measures and their influencing factors in two statements and four propositions, which are then tested through numerical analysis with simulation. Findings: Product waste, lost sales, and freshness are identified as distinctive OCPI performance measures, which are influenced by product perishability, shelf life visibility, demand variability, and channel effects. The OCPI sensitivity to those influencing factors is diverse, whereas those factors are found to moderate each other’s effects. Originality/Value: This paper provides a novel theoretical view on perishables in omnichannel systems. It specifies the OCPI performance, beyond typical inventory policies for cost minimization, while discussing its sensitivity to operations and market factors. Practical implications: To manage perishables more effectively, with less waste and lost sales for the business and fresher products for the consumer, omnichannel firms need to consider store and online channel requirements and strive to reduce demand variability, extend product shelf life, and facilitate item-level shelf life visibility. 
While flexible logistics capacity and dynamic pricing can mitigate demand variability, extending product shelf life requires modifications to product design, production, or storage conditions. OCPI executives can also increase shelf life visibility through advanced stock monitoring/tracking technologies (e.g. smart tags or more comprehensive barcodes), particularly for the online channel, which demands fresher products.
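The three OCPI measures can be made concrete with a toy base-stock simulation of a single perishable SKU. The model below (FIFO issuing, normally distributed demand, fixed shelf life, order-up-to replenishment) is a hedged illustration with assumed parameter names, not the paper's simulation design.

```python
import numpy as np

def simulate_ocpi(days, order_up_to, shelf_life, demand_mean, demand_sd, seed=0):
    """Simulate one perishable SKU and return the three OCPI measures:
    (units wasted, units of lost sales, mean freshness of units sold),
    where freshness is the fraction of shelf life remaining at sale."""
    rng = np.random.default_rng(seed)
    ages = []                        # age (in days) of each unit on the shelf
    waste = lost = 0
    freshness = []
    for _ in range(days):
        ages = [a + 1 for a in ages]
        waste += sum(a >= shelf_life for a in ages)   # discard expired units
        ages = [a for a in ages if a < shelf_life]
        ages += [0] * max(0, order_up_to - len(ages)) # replenish to base stock
        d = max(0, int(round(rng.normal(demand_mean, demand_sd))))
        ages.sort(reverse=True)      # FIFO: the oldest units are sold first
        sold, ages = ages[:d], ages[d:]
        lost += max(0, d - len(sold))
        freshness += [1 - a / shelf_life for a in sold]
    return waste, lost, float(np.mean(freshness)) if freshness else 0.0
```

Varying `demand_sd` and `shelf_life` in such a sketch shows the trade-off the paper formalizes: higher demand variability pushes up both waste and lost sales, while shorter shelf life shifts the balance toward waste.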

    Revisiting operations agility and formalizing digitalization in response to varying levels of uncertainty and customization

    This paper aims to find how digitalization supports inter-organizational purchasing/order fulfillment processes and the agility required to respond to supply and demand uncertainties. The research method comprises multiple case studies, with qualitative data collected via interviews and documentation review. Within-case and cross-case analyses lead to 14 propositions and a novel framework, which formalize and link agility and digitalization at different levels. The findings identify micro and macro types of agility on the demand and supply sides of the business, responding to different levels of uncertainty, and categorize the relevant applications of digitalization at three levels: data interchange, data integration, and predictive data analytics. Moreover, the agility-digitalization relationships are defined for different levels of customization, represented by customer order decoupling points. This paper contributes to the literature by offering an in-depth and explicit understanding of the impacts of digitalization on different types of agility for different levels of customization.
