Improving the consumer demand forecast to generate more accurate suggested orders at the store-item level
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering; in conjunction with the Leaders for Manufacturing Program at MIT, 2008.

Includes bibliographical references (p. 57).

One of the biggest opportunities for this consumer goods company today is reducing retail stock-outs at its Direct Store Delivery (DSD) customers via pre-selling, which represents approximately 70% of the company's total sales volume. But reducing retail stock-outs is becoming steadily more challenging as the number of SKUs grows with new product introductions and packaging innovations. The main tool this consumer goods company uses to combat retail stock-outs is the pre-sell handheld, which the company provides to all field sales reps. The handheld runs proprietary software developed by this consumer goods company that creates suggested orders based on a number of factors including:

* Baseline forecast (specific to the store-item combination)
* Seasonality effects (i.e., higher demand for products during particular seasons)
* Promotional effects (i.e., lift created by sale prices)
* Presence of in-store displays (i.e., more space for product than just shelf space)
* Weekday effects (i.e., selling more on weekends, when most people shop)
* Holiday effects (i.e., higher demand for products at holidays)
* Inventory levels on the shelves and in the back room
* In-transit orders (i.e., orders that may already be on their way to the customer)

The more accurate the suggested orders, the fewer retail stock-outs occur. This project seeks to increase the accuracy of the consumer demand forecast, and ultimately the suggested orders, by improving the baseline forecast and accounting for the effect of cannibalization on demand.

by Susan D. Bankston (S.M., M.B.A.)
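The abstract does not disclose the proprietary handheld logic, but the factors it lists can be sketched as a simple multiplicative forecast netted against available inventory. Everything below — the function name, the multiplier form, and the example numbers — is an illustrative assumption, not the company's actual software:

```python
# Hypothetical sketch of a suggested-order calculation combining the
# factors listed above. All names, the multiplicative structure, and
# the example values are illustrative assumptions, not the company's
# proprietary logic.

def suggested_order(baseline, seasonality=1.0, promo_lift=1.0,
                    display_lift=1.0, weekday=1.0, holiday=1.0,
                    shelf_inventory=0, backroom_inventory=0, in_transit=0):
    """Forecast demand from the baseline and lift factors, then
    suggest ordering the shortfall after netting out inventory
    on the shelf, in the back room, and already in transit."""
    forecast = (baseline * seasonality * promo_lift
                * display_lift * weekday * holiday)
    on_hand = shelf_inventory + backroom_inventory + in_transit
    return max(0, round(forecast - on_hand))

# Example: a promoted item on a weekend, with some stock on hand
# and an order already on the truck.
print(suggested_order(baseline=24, promo_lift=1.5, weekday=1.2,
                      shelf_inventory=10, in_transit=6))  # -> 27
```

The forecast side (24 × 1.5 × 1.2 ≈ 43 units) is reduced by the 16 units already available or inbound, so the handheld would suggest ordering the difference rather than the full forecast.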
Finite-element/progressive-lattice-sampling response surface methodology and application to benchmark probability quantification problems
Optimal response surface construction is being investigated as part of Sandia discretionary (LDRD) research into Analytic Nondeterministic Methods. The goal is to achieve an adequate representation of system behavior over the relevant parameter space of a problem with a minimum of computational and user effort. This is important in global optimization and in estimation of system probabilistic response, which are both made more viable by replacing large, complex computer models with fast-running, accurate, and noiseless approximations. A Finite Element/Lattice Sampling (FE/LS) methodology for constructing progressively refined finite element response surfaces that reuse previous generations of samples is described here. Similar finite element implementations can be extended to N-dimensional problems and/or random fields and applied to other types of structured sampling paradigms, such as classical experimental design and Gauss, Lobatto, and Patterson sampling. Here the FE/LS model is applied in a "decoupled" Monte Carlo analysis of two sets of probability quantification test problems. The analytic test problems, spanning a large range of probabilities and very demanding failure region geometries, constitute a good testbed for comparing the performance of various nondeterministic analysis methods. In the results here, FE/LS decoupled Monte Carlo analysis required orders of magnitude less computer time than direct Monte Carlo analysis, with no appreciable loss of accuracy. Thus, when arriving at probabilities or distributions by Monte Carlo, it appears to be more efficient to expend computer-model function evaluations on building an FE/LS response surface than to expend them in direct Monte Carlo sampling.
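The "decoupled" idea — spend the expensive model evaluations on a structured lattice, fit a cheap response surface, then Monte Carlo sample the surface instead of the model — can be sketched in miniature. The 1-D cubic "model", the piecewise-linear interpolant standing in for the finite element surface, and the failure threshold are all made-up illustrations, not the report's actual test problems:

```python
import random

# Illustrative sketch of decoupled Monte Carlo analysis: evaluate an
# "expensive" model only on a small lattice, fit a cheap surrogate,
# then draw Monte Carlo samples against the surrogate. The cubic model,
# the piecewise-linear interpolant (standing in for the finite element
# response surface), and the failure threshold are invented examples.

def expensive_model(x):
    return x ** 3 - 2 * x        # stand-in for a costly simulation

# Lattice sampling: 41 model evaluations on a uniform grid over [-2, 2].
grid = [i / 10 for i in range(-20, 21)]
values = [expensive_model(x) for x in grid]

def surrogate(x):
    """Piecewise-linear response surface interpolating the lattice."""
    i = min(max(int((x + 2) * 10), 0), len(grid) - 2)
    t = (x - grid[i]) / (grid[i + 1] - grid[i])
    return values[i] + t * (values[i + 1] - values[i])

# Decoupled Monte Carlo: 100,000 samples hit the cheap surrogate, not
# the expensive model, to estimate P(response > 1) for uniform input.
random.seed(0)
samples = [random.uniform(-2, 2) for _ in range(100_000)]
p_fail = sum(surrogate(x) > 1.0 for x in samples) / len(samples)
print(f"estimated failure probability: {p_fail:.3f}")
```

Here 41 model calls plus 100,000 surrogate calls replace 100,000 model calls, which is the trade the abstract reports paying off by orders of magnitude; the exact analytic answer for this toy problem is about 0.191, and the surrogate-based estimate lands close to it.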