
    Bayesian neural network learning for repeat purchase modelling in direct marketing.

    We focus on purchase incidence modelling for a European direct mail company. Response models based on statistical and neural network techniques are contrasted. The evidence framework of MacKay is used as an example implementation of Bayesian neural network learning, a method that is fairly robust with respect to problems typically encountered when implementing neural networks. The automatic relevance determination (ARD) method, an integrated feature of this framework, makes it possible to assess the relative importance of the inputs. The basic response models use operationalisations of the traditionally discussed Recency, Frequency and Monetary (RFM) predictor categories. In a second experiment, the RFM response framework is enriched by the inclusion of other (non-RFM) customer profiling predictors. We contribute to the literature by providing experimental evidence that: (1) Bayesian neural networks offer a viable alternative for purchase incidence modelling; (2) a combined use of all three RFM predictor categories is advocated by the ARD method; (3) the inclusion of non-RFM variables significantly augments the predictive power of the constructed RFM classifiers; (4) this rise is mainly attributable to the inclusion of customer/company interaction variables and a variable measuring whether a customer uses the credit facilities of the direct mailing company.
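    The core of ARD is that each input weight group receives its own precision hyperparameter, so irrelevant inputs are automatically shrunk away. A minimal sketch of that idea, using scikit-learn's linear ARDRegression on synthetic data (a linear Bayesian stand-in, not MacKay's full neural network framework; all data and column roles here are hypothetical):

    ```python
    import numpy as np
    from sklearn.linear_model import ARDRegression

    # Synthetic stand-in for RFM-style predictors: only the first
    # two columns actually drive the response; the rest are noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=500)

    # ARD places an individual precision (lambda) on each weight;
    # irrelevant inputs get large precisions, shrinking their weights.
    ard = ARDRegression()
    ard.fit(X, y)

    relevance = 1.0 / ard.lambda_  # larger value => more relevant input
    print(relevance.round(4))
    ```

    Inspecting the learned precisions plays the same role as ARD's relevance ranking in the abstract: the two informative columns come out with markedly larger relevance than the noise columns.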

    Wrapped feature selection for neural networks in direct marketing.

    In this paper, we try to validate existing theory on, and develop additional insight into, repeat purchasing behaviour in a direct-marketing setting by means of an illuminating case study. The case involves the detection and qualification of the most relevant RFM (Recency, Frequency and Monetary) features, using a wrapped feature selection method in a neural network context. Results indicate that eliminating redundant/irrelevant features by means of the discussed feature selection method significantly reduces model complexity without degrading generalisation ability. It is precisely this issue that allows us to infer some very interesting marketing conclusions concerning the relative importance of the RFM predictor categories. The empirical findings highlight the importance of a combined use of all three RFM variables in predicting repeat purchase behaviour. However, the study also reveals the dominant role of the frequency variable. Results indicate that a model including only frequency variables still yields satisfactory classification accuracy compared to the optimally reduced model.
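    A wrapped selection method scores candidate feature subsets by the cross-validated performance of the model itself, rather than by a filter statistic. A small sketch of that loop, using scikit-learn's SequentialFeatureSelector wrapped around a small MLP as a stand-in for the paper's method (the data, column roles, and network size are all illustrative assumptions):

    ```python
    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.neural_network import MLPClassifier

    # Hypothetical stand-ins: column 0 drives the response most,
    # columns 1-2 help a little, the last two columns are pure noise.
    rng = np.random.default_rng(1)
    n = 400
    X = rng.normal(size=(n, 5))
    y = (2.0 * X[:, 0] + 0.5 * X[:, 1] + 0.5 * X[:, 2]
         + rng.normal(scale=0.5, size=n) > 0).astype(int)

    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    # Forward wrapper search: greedily add whichever feature most improves
    # the cross-validated accuracy of the wrapped network.
    sfs = SequentialFeatureSelector(net, n_features_to_select=3, cv=3)
    sfs.fit(X, y)
    print(sfs.get_support())
    ```

    The boolean mask returned by get_support() is the reduced feature set; in the abstract's terms, a dominant variable (here column 0, playing the role of frequency) survives the reduction while noise columns are discarded.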

    Joint optimization of customer segmentation and marketing policy to maximize long-term profitability

    With the advent of one-to-one marketing media, e.g. targeted direct mail or internet marketing, the opportunities to develop targeted marketing campaigns are enhanced in such a way that it is now both organizationally and economically feasible to profitably support a substantially larger number of marketing segments. However, the problem of which segments to distinguish, and what actions to take towards the different segments, increases substantially in such an environment. A systematic analytic procedure optimizing both steps would be very welcome. In this study, we present a joint optimization approach addressing two issues: (1) the segmentation of customers into homogeneous groups of customers, (2) determining the optimal policy (i.e., what action to take from a set of available actions) towards each segment. We implement this joint optimization framework in a direct-mail setting for a charitable organization. Many previous studies in this area highlighted the importance of the following variables: R(ecency), F(requency), and M(onetary value). We use these variables to segment customers. In a second step, we determine which marketing policy is optimal using Markov decision processes, following similar previous applications. The attractiveness of this stochastic dynamic programming procedure is based on the long-run maximization of expected average profit. Our contribution lies in the combination of both steps into one optimization framework to obtain an optimal allocation of marketing expenditures. Moreover, we control segment stability and policy performance by a bootstrap procedure. Our framework is illustrated by a real-life application. The results show that the proposed model outperforms a CHAID segmentation.
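    The Markov-decision-process step can be illustrated with a toy version: segments are states, marketing actions change both the immediate profit and the segment-transition probabilities, and dynamic programming finds the profit-maximizing action per segment. A minimal sketch with entirely hypothetical transition and profit numbers (the paper maximizes long-run average profit; this sketch uses the simpler discounted criterion via value iteration):

    ```python
    import numpy as np

    # Hypothetical 3-segment example: states are RFM segments, actions are
    # "mail" (0) and "don't mail" (1). P[a][s, s'] are transition
    # probabilities, R[a][s] the expected immediate profit.
    P = [np.array([[0.7, 0.2, 0.1],
                   [0.3, 0.5, 0.2],
                   [0.1, 0.3, 0.6]]),
         np.array([[0.9, 0.1, 0.0],
                   [0.5, 0.4, 0.1],
                   [0.2, 0.4, 0.4]])]
    R = [np.array([4.0, 2.0, -1.0]),   # mailing: profit minus mailing cost
         np.array([3.0, 1.0, 0.0])]    # not mailing
    gamma = 0.95

    V = np.zeros(3)
    for _ in range(500):  # value iteration to (near) a fixed point
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(2)])
        V = Q.max(axis=0)
    policy = Q.argmax(axis=0)  # optimal action for each segment
    print(policy, V.round(2))
    ```

    The resulting policy vector is exactly the per-segment action table the abstract describes; the joint framework in the paper additionally re-optimizes the segmentation itself around this policy step.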

    What Values in Design? The Challenge of Incorporating Moral Values into Design

    Recently, increased attention has been paid to the integration of moral values into the conception, design, and development of emerging IT. The most reviewed approach for this purpose in ethics and technology so far is Value-Sensitive Design (VSD). This article considers VSD as the prime candidate for implementing normative considerations into design. Its methodology is considered from a conceptual, analytical, and normative perspective. The focus here is on the suitability of VSD for integrating moral values into the design of technologies in a way that aligns with an analytical perspective on the ethics of technology. Despite its promising character, it turns out that VSD falls short in several respects: (1) VSD does not have a clear methodology for identifying stakeholders, (2) the integration of empirical methods with conceptual research within the methodology of VSD is obscure, (3) VSD runs the risk of committing the naturalistic fallacy when using empirical knowledge for implementing values in design, (4) the concept of values, as well as their realization, is left undetermined, and (5) VSD lacks a complementary or explicit ethical theory for dealing with value trade-offs. For the normative evaluation of a technology, I claim that an explicit and justified ethical starting point or principle is required. Moreover, explicit attention should be given to the value aims and assumptions of a particular design. The criteria of adequacy for such an approach or methodology follow from the evaluation of VSD as the prime candidate for implementing moral values in design.

    Ethics and Nanopharmacy: Value Sensitive Design of New Drugs

    Although applications are being developed and have reached the market, nanopharmacy to date is generally still conceived as an emerging technology. Its concept is ill-defined. Nanopharmacy can also be construed as a converging technology, which combines features of multiple technologies, ranging from nanotechnology to medicine and ICT. It is still debated whether its features give rise to new ethical issues or whether the issues associated with nanopharmacy are merely an extension of existing issues in the underlying fields. We argue here that, regardless of the alleged newness of the ethical issues involved, developments occasioned by technological advances affect the roles played by stakeholders in the field of nanopharmacy to such an extent that this calls for a different approach to responsible innovation in this field. Specific features associated with nanopharmacy itself, and features introduced by the associated converging technologies, bring about a shift in the roles of stakeholders that calls for a different approach to responsibility. We suggest that Value Sensitive Design is a suitable framework to involve stakeholders in addressing moral issues responsibly at an early stage of development of new nanopharmaceuticals.

    Wall roughness induces asymptotic ultimate turbulence

    Turbulence is omnipresent in Nature and technology, governing the transport of heat, mass, and momentum on multiple scales. For real-world applications of wall-bounded turbulence, the underlying surfaces are virtually always rough; yet characterizing and understanding the effects of wall roughness for turbulence remains a challenge, especially for rotating and thermally driven turbulence. By combining extensive experiments and numerical simulations, here, taking as an example the paradigmatic Taylor-Couette system (the closed flow between two independently rotating coaxial cylinders), we show how wall roughness greatly enhances the overall transport properties and the corresponding scaling exponents. If only one of the walls is rough, we reveal that the bulk velocity is slaved to the rough side, due to the much stronger coupling to that wall by the detaching flow structures. If both walls are rough, the viscosity dependence is thoroughly eliminated in the boundary layers and we thus achieve asymptotic ultimate turbulence, i.e. the upper limit of transport, whose existence had been predicted by Robert Kraichnan in 1962 (Phys. Fluids 5, 1374 (1962)) and in which the scaling laws can be extrapolated to arbitrarily large Reynolds numbers.

    Major decline of hepatitis C virus incidence rate over two decades in a cohort of drug users

    Injecting drug users (DU) are at high risk for hepatitis C virus (HCV) and HIV infections. To examine the prevalence and incidence of these infections over a 20-year period (1985–2005), the authors evaluated 1276 DU from the Amsterdam Cohort Studies who had been tested prospectively for HIV infection and retrospectively for HCV infection. To compare HCV and HIV incidences, a smooth trend was assumed for both curves over calendar time. Risk factors for HCV seroconversion were determined using Poisson regression. Among ever-injecting DU, the prevalence of HCV antibodies was 84.5% at study entry, and 30.9% were co-infected with HIV. Their yearly HCV incidence dropped from 27.5/100 person years (PY) in the 1980s to 2/100 PY in recent years. In multivariate analyses, ever-injecting DU who currently injected and borrowed needles were at increased risk of HCV seroconversion (incidence rate ratio 29.9, 95% CI 12.6, 70.9) compared to ever-injecting DU who did not currently inject. The risk of HCV seroconversion decreased over calendar time. The HCV incidence in ever-injecting DU was on average 4.4 times the HIV incidence, a pattern seen over the entire study period. The simultaneous decline of both HCV and HIV incidence probably results from reduced risk behavior at the population level.
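    The incidence rate ratio reported above compares events per person-year between exposure groups. A worked sketch of the crude calculation and its Wald confidence interval, using made-up counts (not the study's data) chosen only to show the arithmetic behind a figure like "IRR 29.9 (95% CI 12.6, 70.9)":

    ```python
    import math

    # Illustrative numbers only (not the study's data): seroconversions
    # and person-years of follow-up in two exposure groups.
    events_exposed, py_exposed = 60, 220.0     # currently injecting & borrowing
    events_unexposed, py_unexposed = 9, 950.0  # not currently injecting

    rate_exposed = events_exposed / py_exposed        # per person-year
    rate_unexposed = events_unexposed / py_unexposed
    irr = rate_exposed / rate_unexposed               # incidence rate ratio

    # Wald 95% CI on the log scale: var(log IRR) ~ 1/events1 + 1/events0
    se = math.sqrt(1 / events_exposed + 1 / events_unexposed)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    print(f"IRR = {irr:.1f} (95% CI {lo:.1f}, {hi:.1f})")
    ```

    The study's multivariate version of this uses Poisson regression with log person-years as an offset, which adjusts the same rate-ratio quantity for covariates.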

    Responsibility Ascriptions in Technology Development and Engineering: Three Perspectives

    In recent decades, increasing attention has been paid to the topic of responsibility in technology development and engineering. The discussion of this topic is often guided by questions related to liability and blameworthiness. Recent discussions in engineering ethics call for a reconsideration of the traditional quest for responsibility. Rather than alleged wrongdoing and blaming, the focus should shift to more socially responsible engineering, some authors argue. The present paper aims at exploring the different approaches to responsibility in order to see which one is most appropriate to apply to engineering and technology development. Using the example of the development of a new sewage water treatment technology, the paper shows how different approaches for ascribing responsibilities have different implications for engineering practice in general, and R&D or technological design in particular. It was found that there was a tension between the demands that follow from these different approaches, most notably between efficacy and fairness. Although the consequentialist approach with its efficacy criterion turned out to be most powerful, it was also shown that the fairness of responsibility ascriptions should somehow be taken into account. It is proposed to look for alternative, more procedural ways to approach the fairness of responsibility ascriptions.

    Predicting self‐declared movie watching behavior using Facebook data and information‐fusion sensitivity analysis

    The main purpose of this paper is to evaluate the feasibility of predicting whether a Facebook user has self-reported watching a given movie genre. To this end, we apply a data analytical framework that (1) builds and evaluates several predictive models explaining self-declared movie watching behavior, and (2) provides insight into the importance of the predictors and their relationship with self-reported movie watching behavior. For the first outcome, we benchmark several algorithms (logistic regression, random forest, adaptive boosting, rotation forest, and naive Bayes) and evaluate their performance using the area under the receiver operating characteristic curve. For the second outcome, we evaluate variable importance and build partial dependence plots using information-fusion sensitivity analysis for different movie genres. To gather the data, we developed a custom native Facebook app. We resampled our dataset to make it representative of the general Facebook population with respect to age and gender. The results indicate that adaptive boosting outperforms all other algorithms. Time- and frequency-based variables related to media (movies, videos, and music) consumption constitute the list of top variables. To the best of our knowledge, this study is the first to fit predictive models of self-reported movie watching behavior and provide insights into the relationships that govern these models. Our models can be used as a decision tool for movie producers to target potential movie-watchers and market their movies more efficiently.
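    The benchmarking step described above, comparing several classifiers by cross-validated area under the ROC curve, can be sketched in a few lines of scikit-learn. The data here are synthetic stand-ins for the Facebook feature matrix, and rotation forest is omitted because scikit-learn ships no implementation of it:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB

    # Synthetic stand-in for the user-by-feature matrix.
    X, y = make_classification(n_samples=600, n_features=20,
                               n_informative=5, random_state=0)

    models = {
        "logit": LogisticRegression(max_iter=1000),
        "random forest": RandomForestClassifier(random_state=0),
        "adaboost": AdaBoostClassifier(random_state=0),
        "naive Bayes": GaussianNB(),
    }
    # 5-fold cross-validated AUC for each benchmarked algorithm.
    aucs = {name: cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean()
            for name, m in models.items()}
    for name, auc in sorted(aucs.items(), key=lambda kv: -kv[1]):
        print(f"{name:>13}: AUC = {auc:.3f}")
    ```

    Which algorithm wins depends on the data; the paper's finding that adaptive boosting came out on top is an empirical result for its Facebook dataset, not a property this sketch reproduces.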