
    The Role of the Management Sciences in Research on Personalization

    We present a review of research studies that deal with personalization. We synthesize current knowledge in this area and identify issues that we envision will be of interest to researchers working in the management sciences. We take an interdisciplinary approach that spans the areas of economics, marketing, information technology, and operations. We present an overarching framework for personalization that allows us to identify the key players in the personalization process, as well as the key stages of personalization. The framework enables us to examine the strategic role of personalization in the interactions between a firm and other key players in the firm's value system. We review the extant literature on the strategic behavior of firms and discuss opportunities for analytical and empirical research in this regard. Next, we examine how a firm can learn a customer's preferences, which is one of the key components of the personalization process. We use a utility-based approach to formalize such preference functions and to understand how these preference functions could be learnt from a customer's interactions with a firm. We identify well-established techniques in the management sciences that can be gainfully employed in future research on personalization.
    Keywords: CRM, Personalization, Marketing, e-commerce
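    The utility-based view of preference learning can be made concrete with a small sketch. This is an illustrative example of ours, not the survey's specific model: it assumes a linear utility u(x) = w·x over item attributes and fits w by logistic likelihood from observed choose-one-of-two interactions. All names (`learn_linear_utility`, `chosen`, `rejected`) are hypothetical.

    ```python
    import numpy as np

    def learn_linear_utility(chosen, rejected, lr=0.1, epochs=200):
        """Illustrative sketch: learn a linear utility u(x) = w . x from pairwise
        choices, where on interaction i the customer chose item chosen[i] over
        rejected[i]. Fit w by gradient ascent on the logistic likelihood that
        the chosen item has the higher utility."""
        chosen = np.asarray(chosen, dtype=float)
        rejected = np.asarray(rejected, dtype=float)
        w = np.zeros(chosen.shape[1])
        d = chosen - rejected                      # utility-difference features
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-(d @ w)))     # P(observed choice) under w
            w += lr * d.T @ (1.0 - p) / len(d)     # ascend the log-likelihood
        return w
    ```

    With enough observed interactions, the recovered weights rank items in the same order as the customer's true utility, which is all the personalization process needs.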

    Reduced-Order Modelling of Parametric Systems via Interpolation of Heterogeneous Surrogates


    Uncertainty Wedge Analysis: Quantifying the Impact of Sparse Sound Speed Profiling Regimes on Sounding Uncertainty

    Recent advances in real-time monitoring of uncertainty due to refraction have demonstrated the power of estimating and visualizing uncertainty over the entire potential sounding space. This representation format, referred to as an uncertainty wedge, can be used to help solve difficult survey planning problems regarding the spatio-temporal variability of the watercolumn. Though initially developed to work in-line with underway watercolumn sampling hardware (e.g. moving vessel profilers), uncertainty wedge analysis techniques are extensible to investigate problems associated with low-density watercolumn sampling in which only a few sound speed casts are gathered per day. As uncertainty wedge analysis techniques require no sounding data, the overhead of post-processing soundings is circumvented when one needs to quickly ascertain the impact of a particular sampling regime. In keeping with the spirit of the underlying real-time monitoring tools, a just-in-time analysis of sound speed casts can help the field operator assess the effects of watercolumn variability during acquisition and objectively seek a watercolumn sampling regime that balances the opposing goals of maximizing survey efficiency and maintaining reasonable sounding accuracy. In this work, we investigate the particular problem of estimating the uncertainty that would be associated with a particular low-density sound speed sampling regime. A pre-analysis technique is proposed in which a high-density set of sound speed profiles provides a baseline against which various low-density sampling regimes can be tested, the end goal being to ascertain the penalty in sounding confidence associated with a particular low-density sampling regime. In other words, by knowing too much about the watercolumn, one can objectively quantify the impact of not knowing enough. In addition to the goal-seeking field application outlined earlier, this allows for more confident attribution of uncertainty to soundings, a marked improvement over current approaches to refraction uncertainty estimation.
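    A minimal sketch of the pre-analysis idea, under assumptions of ours rather than the authors': each baseline cast is reduced to a single scalar (the depth bias it would imply for a reference sounding), the casts are decimated to emulate a low-density regime that holds each kept cast until the next, and the RMS residual against the full baseline serves as the regime's penalty. All names (`regime_penalty`, `cast_depth_bias`, `keep_every`) are hypothetical.

    ```python
    import numpy as np

    def regime_penalty(cast_times, cast_depth_bias, keep_every):
        """Estimate the extra sounding uncertainty incurred by keeping only
        every keep_every-th sound speed cast from a high-density baseline,
        holding each kept cast fixed until the next one arrives."""
        cast_times = np.asarray(cast_times, dtype=float)
        cast_depth_bias = np.asarray(cast_depth_bias, dtype=float)
        kept = np.arange(0, len(cast_times), keep_every)
        # Map every baseline cast to the most recent kept cast
        # (a piecewise-constant low-density sampling regime).
        idx = np.searchsorted(cast_times[kept], cast_times, side="right") - 1
        residual = cast_depth_bias - cast_depth_bias[kept][idx]
        return np.sqrt(np.mean(residual ** 2))  # RMS penalty vs. full sampling
    ```

    Sweeping `keep_every` over candidate regimes then lets the operator pick the sparsest regime whose penalty stays within the survey's sounding uncertainty budget.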

    Comparative performance of selected variability detection techniques in photometric time series

    Photometric measurements are prone to systematic errors, presenting a challenge to low-amplitude variability detection. In search of a general-purpose variability detection technique able to recover a broad range of variability types, including currently unknown ones, we test 18 statistical characteristics quantifying scatter and/or correlation between brightness measurements. We compare their performance in identifying variable objects in seven time series data sets obtained with telescopes ranging in size from a telephoto lens to 1m-class and probing variability on time-scales from minutes to decades. The test data sets together include lightcurves of 127539 objects, among them 1251 variable stars of various types, and represent a range of observing conditions often found in ground-based variability surveys. The real data are complemented by simulations. We propose a combination of two indices that together recover a broad range of variability types from photometric data characterized by a wide variety of sampling patterns, photometric accuracies, and percentages of outlier measurements. The first index is the interquartile range (IQR) of magnitude measurements, sensitive to variability irrespective of a time-scale and resistant to outliers. It can be complemented by the ratio of the lightcurve variance to the mean square successive difference, 1/η, which is efficient in detecting variability on time-scales longer than the typical time interval between observations. Variable objects have larger 1/η and/or IQR values than non-variable objects of similar brightness. Another approach to variability detection is to combine many variability indices using principal component analysis. We present 124 previously unknown variable stars found in the test data.
    Comment: 29 pages, 8 figures, 7 tables; accepted to MNRAS; for additional plots, see http://scan.sai.msu.ru/~kirx/var_idx_paper
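    The two proposed indices are simple to compute from a time-ordered lightcurve. The sketch below is our own, following only the definitions given in the abstract: the IQR of the magnitudes, and 1/η as the variance divided by the mean square successive difference.

    ```python
    import numpy as np

    def variability_indices(mag):
        """Compute two variability indices for a lightcurve (magnitudes in
        time order): the interquartile range (IQR), robust to outliers, and
        1/eta, the ratio of the lightcurve variance to the mean square
        successive difference, sensitive to slow correlated variability."""
        mag = np.asarray(mag, dtype=float)
        q75, q25 = np.percentile(mag, [75, 25])
        iqr = q75 - q25
        mssd = np.mean(np.diff(mag) ** 2)  # mean square successive difference
        inv_eta = np.var(mag) / mssd       # ~0.5 for white noise, larger for
                                           # variability slower than the cadence
        return iqr, inv_eta
    ```

    Objects whose IQR and/or 1/η stand out above the locus of non-variable stars of similar brightness become variability candidates.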