    Forecasting With Exponential Smoothing – What's The Right Smoothing Constant?

    Get PDF
    This paper examines exponential smoothing constants that minimize summary error measures associated with a large number of forecasts. These forecasts were made on numerous time series generated through simulation on a spreadsheet. The series varied in length and in underlying nature: no trend, linear trend, and nonlinear trend. Forecasts were made using simple exponential smoothing as well as exponential smoothing with trend correction, and with different kinds of initial forecasts. We found that when initial forecasts were good and the nature of the underlying data did not change, smoothing constants were typically very small. Conversely, large smoothing constants indicated a change in the nature of the underlying data or the use of an inappropriate forecasting model. These results reduce the confusion about the role and right size of these constants and offer clear recommendations on how they should be discussed in classroom settings.
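
    As a concrete illustration, the Python sketch below implements the two forecasting models the paper compares: simple exponential smoothing and exponential smoothing with trend correction (Holt's method). The demand series, initial values, and smoothing constants are illustrative assumptions, not the paper's simulated data.

```python
# A minimal sketch (not the paper's spreadsheet simulation) of the two
# models compared: simple and trend-corrected exponential smoothing.

def ses(series, alpha, initial):
    """Simple exponential smoothing: F(t+1) = F(t) + alpha * (A(t) - F(t))."""
    forecasts = [initial]                        # initial forecast for period 1
    for actual in series:
        forecasts.append(forecasts[-1] + alpha * (actual - forecasts[-1]))
    return forecasts

def holt(series, alpha, beta, level, trend):
    """Exponential smoothing with trend correction (Holt's method)."""
    forecasts = []
    for actual in series:
        forecasts.append(level + trend)          # forecast made before observing actual
        new_level = alpha * actual + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return forecasts

demand = [102, 98, 105, 110, 108, 115, 120]      # hypothetical demand series
print(ses(demand, alpha=0.1, initial=100.0))
print(holt(demand, alpha=0.1, beta=0.1, level=100.0, trend=2.0))
```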

    Determining the Optimal Values of Exponential Smoothing Constants – Does Solver Really Work?

    Get PDF
    A key issue in exponential smoothing is the choice of the values of the smoothing constants used. One approach that is becoming increasingly popular in introductory management science and operations management textbooks is the use of Solver, an Excel-based non-linear optimizer, to identify values of the smoothing constants that minimize a measure of forecast error such as Mean Absolute Deviation (MAD) or Mean Squared Error (MSE). We point out some difficulties with this approach and suggest an easy fix. We examine the impact of initial forecasts on the smoothing constants and the idea of optimizing the initial forecast along with the smoothing constants. We make recommendations on the use of Solver in the context of the teaching of forecasting and suggest that there is a better method than Solver to identify the appropriate smoothing constants.
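
    The abstract does not reproduce the authors' preferred alternative to Solver. As a hedged sketch, the grid search below is one straightforward substitute: it minimizes MSE over the smoothing constant exhaustively, sidestepping the local-optimum and flat-error-surface difficulties a nonlinear optimizer can run into. Whether this matches the authors' suggestion is not stated in the abstract; the demand data and initial forecast are illustrative assumptions.

```python
# Hedged sketch: find the SES smoothing constant alpha that minimizes MSE
# by exhaustive grid search rather than a nonlinear optimizer like Solver.

def mse(series, alpha, initial):
    forecast, total = initial, 0.0
    for actual in series:
        total += (actual - forecast) ** 2
        forecast += alpha * (actual - forecast)  # SES update
    return total / len(series)

demand = [102, 98, 105, 110, 108, 115, 120]      # hypothetical demand series
alphas = [a / 1000 for a in range(1001)]         # 0.000, 0.001, ..., 1.000
best = min(alphas, key=lambda a: mse(demand, a, initial=100.0))
print(f"alpha = {best:.3f}, MSE = {mse(demand, best, initial=100.0):.2f}")
```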

    ABC Analysis for Inventory Management: Bridging the Gap between Research and Classroom

    Get PDF
    ABC analysis is a well-established categorization technique, based on the Pareto Principle, for determining which items should get priority in the management of a company's inventory. In discussing this topic, today's operations management and supply chain textbooks focus on dollar volume as the sole criterion for performing the categorization. The authors argue that today's businesses and supply chains operate in a world where the ability to deliver the right products rapidly to very specific markets is key to survival. With suppliers, intermediaries, and customers all over the globe, and product lives decreasing rapidly, this focus on a single criterion is misplaced. The authors summarize the large body of research on multiple-criteria ABC analysis that has accumulated since the 1980s and recommend that textbooks incorporate its key findings and methods into their discussions of this topic. Suggestions are offered on how this discussion might be structured.
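
    For reference, the sketch below implements the single-criterion, dollar-volume ABC categorization that the authors say textbooks focus on; a multiple-criteria version would replace the ranking with a weighted score over several criteria (e.g., lead time, criticality). The item data and the 80%/95% cumulative cutoffs are illustrative assumptions.

```python
# Minimal sketch of textbook single-criterion ABC analysis: rank items by
# annual dollar volume and classify A/B/C by cumulative share of the total.

items = {"SKU1": 60000, "SKU2": 25000, "SKU3": 8000,
         "SKU4": 4000, "SKU5": 2000, "SKU6": 1000}   # annual dollar volume

total = sum(items.values())
cumulative = 0.0
for sku, volume in sorted(items.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += volume / total
    category = "A" if cumulative <= 0.80 else ("B" if cumulative <= 0.95 else "C")
    print(f"{sku}: ${volume:>6,}  cumulative {cumulative:6.1%}  class {category}")
```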

    The Treatment of Six Sigma in Introductory Operations Management Textbooks: Clearing Up the Confusion

    Get PDF
    This paper critically examines the treatment of the statistical basis for Six Sigma and process capability in popular operations management textbooks. It discusses areas of confusion and suggests ways of treating the topic that make sense to instructors as well as students. Even though Six Sigma was introduced almost 30 years ago, misconceptions persist. In the textbooks we found no consistent approach to, or understanding of, the statistical underpinnings of Six Sigma (3.4 defects per million opportunities). Sometimes statements are made that are factually incorrect and cause frustration for students and instructors. Similar difficulties are encountered in discussions of the related concept of process capability. The paper suggests changes that will help resolve these issues and bring much-needed clarity to discussions of these important ideas. Students will find the material much more accessible, and instructors will find it much easier to convey the concepts underlying this important topic.
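
    The parenthetical figure can be checked directly: 3.4 defects per million opportunities follows from specification limits six standard deviations from the target combined with the conventional assumption of a 1.5-sigma drift in the process mean, leaving 4.5 sigma to the nearer limit.

```python
# Where "3.4 defects per million opportunities" (DPMO) comes from: spec
# limits at +/- 6 sigma plus an assumed 1.5-sigma drift of the process mean.
from statistics import NormalDist

z = 6.0 - 1.5                                   # effective distance after the drift
dpmo = (1 - NormalDist().cdf(z)) * 1_000_000    # far tail (z = 7.5) is negligible
print(f"P(Z > {z:.1f}) per million = {dpmo:.2f}")   # about 3.40
```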

    An Integrated Approach to the Teaching of Operations Management in a Business School

    Get PDF
    The authors discuss a curriculum integration effort that a school of business piloted recently. This effort was aimed at integrating the core functions (finance, marketing, management, and operations) so that undergraduate students would better appreciate the full impact of functional decisions on one another and on the corporation's business objectives. The authors deployed a webbed integration model in which a business case was used to highlight the impact of a functional decision on the other three functions. The focus of the article is on how this model was implemented in the context of a required introductory course in operations management. The authors also discuss the results of this effort, lessons learned, and the path forward.

    Random Error in Holistic Evaluations and Additive Decompositions of Multi-attribute Utility — An Empirical Comparison

    No full text
    This paper details the results of an empirical investigation of the random errors associated with decomposition estimates of multiattribute utility. In a riskless setting, two groups of subjects were asked to evaluate multiattribute alternatives both holistically and with the use of an additive decomposition. For one group, the alternatives were described in terms of three attributes, and for the other in terms of five. Estimates of random error associated with the various elicitations (holistic, single-attribute utility, scaling constants, or weights) were obtained using a test-retest format. It was found for both groups that the additive decomposition had significantly smaller levels of random error than the holistic evaluation. However, the number of attributes did not seem to make a significant difference to the amount of random error associated with the decomposition estimates. The levels of error found in the various elicitations were consistent with theoretical bounds that have recently been proposed in the literature. These results show that the structure imposed on the problem through decomposition yields a measurable improvement in the quality of multiattribute utility judgements, and they contribute to a greater understanding of the decomposition method in decision analysis.
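
    To make the decomposition concrete, the sketch below shows the additive form studied in the paper, U(x) = sum_i w_i * u_i(x_i): overall utility is a weighted sum of single-attribute utilities whose weights (scaling constants) sum to one. The attributes and values are illustrative assumptions.

```python
# Minimal sketch of the additive decomposition: U(x) = sum_i w_i * u_i(x_i),
# where the scaling constants w_i sum to one and each u_i maps an attribute
# level to [0, 1]. Attributes and numbers here are illustrative assumptions.

def additive_utility(weights, utilities):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "scaling constants must sum to 1"
    return sum(weights[a] * utilities[a] for a in weights)

# A three-attribute alternative, as in the paper's first subject group.
weights   = {"price": 0.5, "quality": 0.3, "delivery": 0.2}  # scaling constants
utilities = {"price": 0.8, "quality": 0.6, "delivery": 0.9}  # single-attribute utilities
print(additive_utility(weights, utilities))                  # about 0.76
```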