
    A Probabilistic One-Step Approach to the Optimal Product Line Design Problem Using Conjoint and Cost Data

    Designing and pricing new products is one of the most critical activities for a firm, and it is well-known that taking into account consumer preferences for design decisions is essential for products later to be successful in a competitive environment (e.g., Urban and Hauser 1993). Consequently, measuring consumer preferences among multiattribute alternatives has been a primary concern in marketing research as well, and among many methodologies developed, conjoint analysis (Green and Rao 1971) has turned out to be one of the most widely used preference-based techniques for identifying and evaluating new product concepts. Moreover, a number of conjoint-based models with special focus on mathematical programming techniques for optimal product (line) design have been proposed (e.g., Zufryden 1977, 1982, Green and Krieger 1985, 1987b, 1992, Kohli and Krishnamurti 1987, Kohli and Sukumar 1990, Dobson and Kalish 1988, 1993, Balakrishnan and Jacob 1996, Chen and Hausman 2000). These models are directed at determining optimal product concepts using consumers' idiosyncratic or segment level part-worth preference functions estimated previously within a conjoint framework. Recently, Balakrishnan and Jacob (1996) have proposed the use of Genetic Algorithms (GA) to solve the problem of identifying a share maximizing single product design using conjoint data. In this paper, we follow Balakrishnan and Jacob's idea and employ and evaluate the GA approach with regard to the problem of optimal product line design. Similar to the approaches of Kohli and Sukumar (1990) and Nair et al. (1995), product lines are constructed directly from part-worths data obtained by conjoint analysis, which can be characterized as a one-step approach to product line design. 
In contrast, a two-step approach would start by first reducing the total set of feasible product profiles to a smaller set of promising items (reference set of candidate items) from which the products that constitute a product line are selected in a second step. Two-step approaches or partial models for either the first or second stage in this context have been proposed by Green and Krieger (1985, 1987a, 1987b, 1989), McBride and Zufryden (1988), Dobson and Kalish (1988, 1993) and, more recently, by Chen and Hausman (2000). Heretofore, with the only exception of Chen and Hausman's (2000) probabilistic model, all contributors to the literature on conjoint-based product line design have employed a deterministic, first-choice model of idiosyncratic preferences. Accordingly, a consumer is assumed to choose from her/his choice set the product with maximum perceived utility with certainty. However, the first-choice rule seems too rigid an assumption for many product categories and individual choice situations, as the analyst often won't be in a position to control for all relevant variables influencing consumer behavior (e.g., situational factors). Therefore, in agreement with Chen and Hausman (2000), we incorporate a probabilistic choice rule to provide a more flexible representation of the consumer decision-making process and start from segment-specific conjoint models of the conditional multinomial logit type. Favoring the multinomial logit model doesn't imply rejection of the widespread max-utility rule, as the MNL includes the option of mimicking this first-choice rule. We further consider profit as a firm's economic criterion to evaluate decisions and introduce fixed and variable costs for each product profile. However, the proposed methodology is flexible enough to accommodate other goals like market share (as well as any other probabilistic choice rule). 
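The segment-level MNL choice rule described above, and the sense in which it includes the max-utility rule as a special case, can be sketched as follows (a minimal illustration with made-up utility values; the `scale` parameter is an assumption introduced here to show the limiting behavior, not notation from the paper):

```python
import math

def mnl_choice_probabilities(utilities, scale=1.0):
    """Multinomial logit choice probabilities for one segment.

    A large `scale` concentrates probability on the maximum-utility
    alternative (mimicking the deterministic first-choice rule); a
    small one spreads choices out more evenly.
    """
    exp_u = [math.exp(scale * u) for u in utilities]
    total = sum(exp_u)
    return [e / total for e in exp_u]

# Deterministic utilities of three product profiles for one segment
# (hypothetical part-worth sums).
utils = [2.0, 1.0, 0.5]
probs = mnl_choice_probabilities(utils)            # probabilistic rule
sharp = mnl_choice_probabilities(utils, scale=50)  # approaches first choice
```

Raising the scale parameter drives essentially all choice probability onto the maximum-utility profile, which is how the MNL can mimic the first-choice rule without abandoning the probabilistic framework.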
This model flexibility is provided by the implemented Genetic Algorithm as the underlying solver for the resulting nonlinear integer programming problem. Genetic Algorithms merely use objective function information (in the present context, on expected profits of feasible product line solutions) and are easily adjustable to different objectives without the need for major algorithmic modifications. To assess the performance of the GA methodology for the product line design problem, we employ sensitivity analysis and Monte Carlo simulation. Sensitivity analysis is carried out to study the performance of the Genetic Algorithm with respect to varying GA parameter values (population size, crossover probability, mutation rate) and to fine-tune these values in order to provide near-optimal solutions. Based on more than 1,500 sensitivity runs applied to different problem sizes ranging from 12,650 to 10,586,800 feasible product line candidate solutions, we can recommend: (a) as expected, that a larger problem size be accompanied by a larger population size, with a minimum population size of 130 for small problems and a minimum of 250 for large problems, (b) a crossover probability of at least 0.9 and (c) an unexpectedly high mutation rate of 0.05 for small/medium-sized problems and a mutation rate on the order of 0.01 for large problem sizes. Following the results of the sensitivity analysis, we evaluated the GA performance for a large set of systematically varying market scenarios and associated problem sizes. We generated problems using a 4-factorial experimental design which varied the number of attributes, the number of levels per attribute, the number of items to be introduced by a new seller and the number of competing firms other than the new seller. The results of the Monte Carlo study, with a total of 276 data sets analyzed, show that the GA works efficiently both in providing near-optimal product line solutions and in terms of CPU time. 
In particular, (a) the worst-case performance ratio of the GA observed in a single run was 96.66%, indicating that the profit of the best product line solution found by the GA was never less than 96.66% of the profit of the optimal product line, (b) the hit ratio of identifying the optimal solution was 84.78% (234 out of 276 cases) and (c) the GA took at most 30 seconds to converge. The option of rerunning a Genetic Algorithm with (slightly) changed parameter settings and/or different initial populations (an option many other heuristics do not offer) further improves the chances of finding the optimal solution.
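The overall approach — a Genetic Algorithm searching over product line candidates, with expected profit under MNL shares as the fitness function — can be sketched as a toy example. All part-worths, prices, and costs below are invented for illustration; this is not the authors' implementation:

```python
import math
import random

random.seed(1)

# Toy problem (hypothetical numbers): a line of 2 items, each described
# by 3 attributes with 3 levels.
N_ITEMS, N_ATTRS, N_LEVELS = 2, 3, 3
PARTWORTHS = [[0.2, 0.5, 0.1],   # part-worth per attribute level
              [0.4, 0.1, 0.3],
              [0.1, 0.2, 0.6]]
VAR_COST = [[1, 2, 1],           # variable cost per attribute level
            [1, 1, 2],
            [2, 1, 1]]
PRICE, FIXED_COST = 10.0, 0.5    # price per unit, fixed cost per item

def utility(item):
    return sum(PARTWORTHS[a][item[a]] for a in range(N_ATTRS))

def expected_profit(line):
    """Fitness: expected profit under MNL shares, with a no-purchase
    option of utility 0 in the choice set."""
    exps = [math.exp(utility(item)) for item in line]
    denom = sum(exps) + 1.0      # exp(0) = 1 for the no-purchase option
    margins = [PRICE - sum(VAR_COST[a][item[a]] for a in range(N_ATTRS))
               for item in line]
    return sum(e / denom * m for e, m in zip(exps, margins)) \
        - FIXED_COST * len(line)

def random_line():
    return [[random.randrange(N_LEVELS) for _ in range(N_ATTRS)]
            for _ in range(N_ITEMS)]

def crossover(p1, p2):
    """One-point crossover on the flattened attribute-level string."""
    cut = random.randrange(1, N_ITEMS * N_ATTRS)
    flat = [g for item in p1 for g in item][:cut] + \
           [g for item in p2 for g in item][cut:]
    return [flat[i * N_ATTRS:(i + 1) * N_ATTRS] for i in range(N_ITEMS)]

def mutate(line, rate=0.05):
    for item in line:
        for a in range(N_ATTRS):
            if random.random() < rate:
                item[a] = random.randrange(N_LEVELS)

pop = [random_line() for _ in range(30)]
for _ in range(60):
    pop.sort(key=expected_profit, reverse=True)
    survivors = pop[:10]         # elitist truncation selection
    children = []
    while len(children) < 20:
        child = crossover(random.choice(survivors), random.choice(survivors))
        mutate(child)
        children.append(child)
    pop = survivors + children

best = max(pop, key=expected_profit)
```

Note how swapping `expected_profit` for a market-share objective would require no change to the search machinery — the adjustability the abstract emphasizes.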

    MCDM Farm System Analysis for Public Management of Irrigated Agriculture

    In this paper we present a methodology within the multi-criteria paradigm to assist policy decision-making on water management for irrigation. In order to predict farmers' response to policy changes, a separate multi-attribute utility function is elicited for each homogeneous group, obtained by applying cluster analysis. The results of several empirical applications of this methodology suggest an improved ability to simulate farmers' decision-making process compared to other approaches. Once the utility functions are obtained, the policy maker can evaluate the differential impacts on each cluster and, by aggregation, the overall impacts in the area of study (e.g., a river basin). On the empirical side, the authors present some studies for different policy instruments, including water pricing, water markets, modernization of irrigation systems and combinations of them.
    Keywords: multi-attribute utility theory, water management, irrigation, policy analysis, Agricultural and Food Policy; JEL codes: Q25, Q15, C61
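A minimal sketch of the simulation logic described above — cluster-specific additive multi-attribute utility functions, a utility-maximising choice per cluster, and aggregation to basin-level impacts — might look like this (all clusters, weights, and crop plans are hypothetical):

```python
# Hypothetical farmer clusters (as obtained from cluster analysis), each
# with elicited attribute weights [profit, risk avoidance, labour
# avoidance] and a number of farms in the basin.
CLUSTERS = {
    "commercial": {"weights": [0.7, 0.2, 0.1], "farms": 120},
    "part_time":  {"weights": [0.3, 0.3, 0.4], "farms": 80},
}

# Candidate crop plans with attributes normalised to [0, 1] and water
# use per farm (all numbers invented for illustration).
PLANS = {
    "intensive": {"attrs": [0.9, 0.2, 0.3], "water": 6000},
    "mixed":     {"attrs": [0.6, 0.6, 0.5], "water": 4000},
    "extensive": {"attrs": [0.3, 0.9, 0.8], "water": 2000},
}

def additive_utility(weights, attrs):
    """Additive multi-attribute utility function for one cluster."""
    return sum(w * a for w, a in zip(weights, attrs))

def best_plan(weights):
    """Each cluster is predicted to adopt its utility-maximising plan."""
    return max(PLANS,
               key=lambda p: additive_utility(weights, PLANS[p]["attrs"]))

# Basin-level impact of a policy scenario: aggregate water use over
# clusters, weighted by cluster size.
total_water = sum(c["farms"] * PLANS[best_plan(c["weights"])]["water"]
                  for c in CLUSTERS.values())
```

A policy instrument (e.g., water pricing) would enter by modifying the plans' attribute values, after which the same cluster-by-cluster simulation and aggregation is rerun.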

    Profitability and Long-term Survival of Community Banks: Evidence from Texas

    This study examines the impact of distance among competing bank locations on their pricing behavior. A general spatial autoregressive model that nests both the spatial autoregressive and spatial error models is used to examine the impact of distance on the pricing behavior of 686 non-metro banks in Texas. Results show that non-metro banks exercise market power in pricing their products. An increase in spatial competition may reduce profitability and challenge the long-term survival of small community-based financial institutions.
    Keywords: Financial Economics
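The general spatial model that nests both special cases is commonly written as y = ρWy + Xβ + u with u = λWu + ε: setting λ = 0 recovers the spatial autoregressive (lag) model and ρ = 0 the spatial error model. A small simulation sketch, with a hypothetical weight matrix standing in for the distance-based one an empirical study would construct:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Hypothetical spatial weight matrix: each bank's two nearest
# neighbours along a line, row-standardised.
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)

# General nesting model: y = rho*W@y + X@beta + u,  u = lam*W@u + eps.
rho, lam = 0.4, 0.3
beta = np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = rng.normal(scale=0.1, size=n)

I_n = np.eye(n)
u = np.linalg.solve(I_n - lam * W, eps)           # spatially correlated errors
y = np.linalg.solve(I_n - rho * W, X @ beta + u)  # spatial lag in the outcome
```

Estimating ρ and λ jointly from data is what lets such a study separate spillovers in the outcome (banks reacting to neighbours' prices) from spatially correlated unobservables.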

    Should Optimal Designers Worry About Consideration?

    Consideration set formation using non-compensatory screening rules is a vital component of real purchasing decisions, with decades of experimental validation. Marketers have recently developed statistical methods that can estimate quantitative choice models that include consideration set formation via non-compensatory screening rules. But is capturing consideration within models of choice important for design? This paper reports on a simulation study of vehicle portfolio design, in which households screen over vehicle body style, built to explore the importance of capturing consideration rules for optimal designers. We generate synthetic market share data, fit a variety of discrete choice models to the data, and then optimize design decisions using the estimated models. Model predictive power, design "error", and profitability relative to ideal profits are compared as the amount of market data available increases. We find that even when estimated compensatory models provide relatively good predictive accuracy, they can lead to sub-optimal design decisions when the population uses consideration behavior; that convergence of compensatory models to non-compensatory behavior is likely to require unrealistic amounts of data; and that modeling heterogeneity in non-compensatory screening is more valuable than heterogeneity in compensatory trade-offs. This supports the claim that designers should carefully identify consideration behaviors before optimizing product portfolios. We also find that higher model predictive power does not necessarily imply better design decisions; that is, different model forms can provide "descriptive" rather than "predictive" information that is useful for design.
    Comment: 5 figures, 26 pages. In press at ASME Journal of Mechanical Design (as of 3/17/15).
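A consider-then-choose model of the kind studied here — a non-compensatory screen on body style followed by a compensatory logit over the surviving alternatives — can be sketched as follows (vehicles and coefficients are invented for illustration):

```python
import math

# Hypothetical vehicles: (body_style, price in $1000s, mpg).
VEHICLES = [
    ("suv",   35, 24),
    ("sedan", 28, 33),
    ("truck", 40, 20),
    ("sedan", 33, 30),
]

def choice_probabilities(vehicles, acceptable_styles,
                         b_price=-0.1, b_mpg=0.05):
    """Non-compensatory screen, then logit over the consideration set.

    A vehicle outside the accepted body styles gets probability zero no
    matter how good its other attributes are -- no trade-off can
    compensate for failing the screen. Assumes at least one vehicle
    passes the screen.
    """
    scores = []
    for style, price, mpg in vehicles:
        if style in acceptable_styles:
            scores.append(math.exp(b_price * price + b_mpg * mpg))
        else:
            scores.append(0.0)   # screened out of the consideration set
    total = sum(scores)
    return [s / total for s in scores]

# A household that only considers sedans.
probs = choice_probabilities(VEHICLES, {"sedan"})
```

A purely compensatory logit fit to choices generated this way would have to explain the truck's zero share with extreme attribute coefficients, which is one intuition for why such models can mislead a portfolio optimizer.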

    Scale Economics, Market Power, and Pricing Behavior: Evidence from German Newspaper and Magazine Publishing

    The anomalous inverse concentration-price relationship observed by some researchers in the newspaper market has been attributed to scale economies. In this paper we suggest that the newspaper's (or magazine's) "double-product" feature (i.e., news supplied to readers and advertising space supplied to advertisers) is the main source of this anomaly. In a simple oligopoly model it is shown how a profit-maximizing publisher takes advantage of this feature. Empirically, an inverse concentration-price relationship may arise if double-product pricing is not controlled for. Regression results for a cross-section of 222 German newspapers and magazines corroborate the theoretical implications.
    Keywords: Scale Economics, Advertising, Concentration, Pricing Behavior

    Enhancing Irrigation Efficiency but Increasing Water Use: The Jevons' Paradox

    In this paper we analyze the conditions under which increasing the technical efficiency of water use in the agricultural sector might not reduce water demand and pressures on water ecosystems. Starting from this basic problem, we discuss how policy measures aimed at enhancing water productivity in agriculture might be transformed into effective alternatives for improving the conservation of water resources and thus guaranteeing the successful implementation of the Water Framework Directive. A preference revelation model is presented in the third section of the paper, and an empirical application to an irrigation district in southern Spain is used in the fourth section to discuss the effectiveness of water-saving measures.
    Keywords: Water Framework Directive, Water Economics, Agricultural Economics, Simulation Models, Preference Revelation, Resource/Energy Economics and Policy

    Country of Origin Labeling with Horizontal Differentiation and Cost Variability

    This paper studies whether a seller achieves higher profits by providing consumers with information that allows them to distinguish between products from different countries, and how mandatory provision of such information impacts welfare. We analyze a model of multi-product monopoly with horizontal differentiation and random country-specific input costs. We find that if the variability in the input costs is sufficiently high and the share of consumers with high valuations is in some intermediate range, the seller prefers to withhold information about product origin. Mandatory labeling of products with their country of origin may reduce or increase welfare depending on the share of consumers with high valuations. We also discuss extensions of the basic model that allow for continuous distributions of valuations and input costs, and consumer learning.
    Keywords: Country of origin labeling, consumer learning, food policy, Agricultural and Food Policy, Industrial Organization, International Relations/Trade

    STRATEGIC PRODUCT DESIGN DECISIONS FOR UNCERTAIN, CONVERGING AND SERVICE ORIENTED MARKETS

    Market driven product design decisions are receiving increasing attention in the engineering design research literature. Econometric models and marketing research techniques are being integrated into engineering design in order to assist with profit-maximizing product design decisions. This stream of research is referred to as "Design for Market Systems" (DMS). The existing DMS approaches fall short when the market environment is complex. The complexity can arise from the uncertain actions and reactions of market players, which produce unexpected market responses to a new design. The complexity can originate from the emergence of a niche product which creates a new product market by integrating the features of two or more existing product categories. The complexity can also arise when the designer is challenged to handle the couplings of outsourced subsystems from suppliers and to explore the integration of the product with service providers. The objective of the thesis is to overcome such limitations and facilitate design decisions by modeling and interpreting the complex market environment. The research objective is achieved through three research thrusts. Thrust 1 examines the impact of the actions and reactions of market players on long- and short-term design decisions for single-category products using an agent-based simulation approach. Thrust 2 concerns design decisions for "convergence products". A convergence product physically integrates two or more existing product categories into a common product form. Convergence products make the consumer choice behavior and profit implications of design alternatives differ significantly from the situation where only a single product market is involved. Thrust 3 explores product design decisions while considering the connection to upstream suppliers and downstream service providers. 
The connection is achieved through a quantitative understanding of the interoperability between physical product modules, as well as between a physical product and a service provider.