
    Path Data in Marketing: An Integrative Framework and Prospectus for Model Building

    Many data sets, from different and seemingly unrelated marketing domains, all involve paths—records of consumers' movements in a spatial configuration. Path data contain valuable information for marketing researchers because they describe how consumers interact with their environment and make dynamic choices. As data collection technologies improve and researchers continue to ask deeper questions about consumers' motivations and behaviors, path data sets will become more common and will play a more central role in marketing research. To guide future research in this area, we review the previous literature, propose a formal definition of a path (in a marketing context), and derive a unifying framework that allows us to classify different kinds of paths. We identify and discuss two primary dimensions (characteristics of the spatial configuration and the agent) as well as six underlying subdimensions. Based on this framework, we cover a range of important operational issues that should be taken into account as researchers begin to build formal models of path-related phenomena. We close with a brief look into the future of path-based models, and a call for researchers to address some of these emerging issues.

    Investigation into the applications of genetic algorithms to control engineering

    Bibliography: pages 117-120. This thesis presents the results of a study carried out to determine possible applications of genetic algorithms to problems in control engineering. The thesis reviewed the literature on genetics and genetic algorithms and applied the algorithms to the problems of system parameter identification and PID controller tuning. More specifically, the study had the following objectives: to investigate possible applications of genetic algorithms to the tasks of system identification and PID controller tuning; to carry out an in-depth comparison of the proposed applications with orthodox engineering approaches based on mathematical optimisation and empirical studies; and to draw conclusions and present the findings in the form of a thesis. Genetic algorithms are a class of artificial intelligence methods inspired by the Darwinian principles of natural selection and survival of the fittest. The algorithm encodes potential solutions into chromosome-like data structures that are evolved using genetic operators to determine the optimal solution of the problem. Fundamentally, the evolutionary nature of the algorithm is introduced through the operators called crossover and mutation. Crossover takes two strings, selects a crossing point at random, and swaps the segments of the strings on either side of that point to create two new individuals. Three variations of crossover were considered in this thesis: single-point crossover, two-point crossover and uniform crossover. These needed careful consideration, since the outcome of the algorithm is strongly influenced by both the choice of operators and the rate at which they are applied.
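
    The crossover and mutation operators the abstract describes can be sketched in a few lines of Python. This is an illustrative bit-string version, not the thesis's implementation:

```python
import random

def single_point_crossover(parent_a, parent_b):
    """Single-point crossover: pick a random crossing point and swap the
    tails of two equal-length chromosomes to create two offspring."""
    point = random.randrange(1, len(parent_a))
    child_1 = parent_a[:point] + parent_b[point:]
    child_2 = parent_b[:point] + parent_a[point:]
    return child_1, child_2

def mutate(chromosome, rate=0.01):
    """Bit-flip mutation: flip each gene independently with probability `rate`."""
    return [bit ^ 1 if random.random() < rate else bit for bit in chromosome]
```

    Two-point and uniform crossover differ only in how the swapped segments are chosen: two crossing points, or an independent coin flip per gene.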

    [Activity of Institute for Computer Applications in Science and Engineering]

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science.

    What broke where for distributed and parallel applications — a whodunit story

    Detection, diagnosis and mitigation of performance problems in today's large-scale distributed and parallel systems is a difficult task. These systems are composed of many complex software and hardware components, and when a performance or correctness problem occurs, developers struggle to understand its root cause and fix it in a timely manner. In my thesis, I address these three aspects of performance problems in computer systems. First, we focus on diagnosing performance problems in large-scale parallel applications running on supercomputers, and we developed techniques to localize the problem for root-cause analysis. Parallel applications, most of which are complex scientific simulations, can create up to millions of parallel tasks that run on different machines and communicate using the message-passing paradigm. We developed a highly scalable and accurate automated debugging tool called PRODOMETER, which first creates a logical progress-dependency graph of the tasks to show how the problem spread through the system and manifested as a system-wide performance issue, then uses this graph to identify the task where the problem originated, and finally pinpoints the code region corresponding to the origin of the bug. Second, we developed a tool-chain that detects performance anomalies using machine-learning techniques while achieving a very low false-positive rate. Our input-aware performance anomaly detection system consists of a scalable data-collection framework that gathers performance-related metrics from code regions at different granularities, an offline model-creation and prediction-error characterization technique, and a threshold-based anomaly-detection engine for production runs. The system requires few training runs and can handle unknown inputs and parameter combinations by dynamically calibrating the anomaly-detection threshold according to the characteristics of the input data and of the models' prediction error. Third, we developed a performance-problem mitigation scheme for erasure-coded distributed storage systems. Repairing failed blocks in such systems takes a long time in network-constrained data centers: during a repair operation, data from multiple nodes are gathered onto a single node, where a mathematical operation reconstructs the missing part, severely congesting the links toward the destination that will host the newly recreated data. We proposed a novel distributed repair technique, called Partial-Parallel-Repair (PPR), that performs this reconstruction in parallel on multiple nodes, eliminating the network bottleneck and greatly speeding up the repair process. Fourth, we study how, for a class of applications, performance can be improved (or performance problems mitigated) by selectively approximating some of the computations. For many applications, the main computation happens inside a loop that can be logically divided into a few temporal segments, which we call phases. We found that while approximating the initial phases may severely degrade the quality of the results, approximating the computation in the later phases has very little impact on the final quality. Based on this observation, we developed an optimization framework that, for a given quality-loss budget, finds the best approximation settings for each phase of the execution.
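
    The threshold-based anomaly-detection engine can be illustrated with a simple prediction-error scheme. The mean-plus-k-standard-deviations calibration below is an assumption for illustration, not the thesis's exact input-aware calibration:

```python
import statistics

def calibrate_threshold(training_errors, k=3.0):
    """Set the anomaly threshold from prediction errors observed on training
    runs: mean plus k population standard deviations (assumed, illustrative
    calibration rule)."""
    mu = statistics.mean(training_errors)
    sigma = statistics.pstdev(training_errors)
    return mu + k * sigma

def is_anomalous(observed, predicted, threshold):
    """Flag a production measurement whose prediction error exceeds the
    calibrated threshold."""
    return abs(observed - predicted) > threshold
```

    Recomputing the threshold per input class is one way to keep the false-positive rate low when inputs vary across runs.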

    A Practical Guide to Surface Kinetic Monte Carlo Simulations

    This review article is intended as a practical guide for newcomers to the field of kinetic Monte Carlo (KMC) simulations, and specifically to lattice KMC simulations as prevalently used for surface and interface applications. We provide worked-out examples using the kmos code, in which we highlight the central approximations made in implementing a KMC model as well as possible pitfalls. This includes the mapping of the problem onto a lattice and the derivation of rate-constant expressions for various elementary processes. Example KMC models are presented for the application areas of surface diffusion, crystal growth and heterogeneous catalysis, covering both transient and steady-state kinetics as well as the preparation of various initial states of the system. We highlight the sensitivity of KMC models to the elementary processes included, as well as to possible errors in the rate constants. For catalysis models in particular, a recurrent challenge is the occurrence of processes at very different timescales, e.g. fast diffusion processes and slow chemical reactions; we demonstrate how to overcome this timescale-disparity problem using recently developed acceleration algorithms. Finally, we discuss how to account for lateral interactions between the species adsorbed on the lattice, which can play an important role in all application areas covered here.
    Comment: This document is the final Author's version of a manuscript that has been peer reviewed and accepted for publication in Frontiers in Chemistry. To access the final edited and published work see https://www.frontiersin.org/articles/10.3389/fchem.2019.00202/abstrac
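
    The core of a rejection-free lattice KMC step (select an elementary process with probability proportional to its rate constant, then advance time by an exponentially distributed increment) can be sketched as follows; this is the generic textbook scheme, not kmos code:

```python
import math
import random

def kmc_step(processes, rng=random.random):
    """One rejection-free KMC step. `processes` maps a process label to its
    rate constant; a process is chosen with probability rate/total_rate, and
    the waiting time is drawn from Exp(total_rate)."""
    total_rate = sum(processes.values())
    pick = rng() * total_rate
    accumulated = 0.0
    chosen = None
    for label, rate in processes.items():
        accumulated += rate
        if pick < accumulated:
            chosen = label
            break
    dt = -math.log(1.0 - rng()) / total_rate  # 1 - rng() lies in (0, 1]
    return chosen, dt
```

    The timescale-disparity problem mentioned in the abstract is visible here: a fast diffusion rate dominates `total_rate`, so almost every step executes a diffusion event and `dt` stays tiny, which is what the acceleration algorithms address.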

    How does rumination impact cognition? A first mechanistic model.

    Rumination is a process of uncontrolled, narrowly-focused negative thinking that is often self-referential, and that is a hallmark of depression. Despite its importance, little is known about its cognitive mechanisms. Rumination can be thought of as a specific, constrained form of mind-wandering. Here, we introduce a cognitive model of rumination that we developed on the basis of our existing model of mind-wandering. The rumination model implements the hypothesis that rumination is caused by maladaptive habits of thought. These habits of thought are modelled by adjusting the number of memory chunks and their associative structure, which changes the sequence of memories that are retrieved during mind-wandering, such that during rumination the same set of negative memories is retrieved repeatedly. The implementation of habits of thought was guided by empirical data from an experience-sampling study in healthy and depressed participants. On the basis of this empirically derived memory structure, our model naturally predicts the declines in cognitive task performance that are typically observed in depressed patients. This study demonstrates how we can use cognitive models to better understand the cognitive mechanisms underlying rumination and depression.
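
    The "habits of thought" hypothesis, in which an associative memory structure keeps the same small set of negative chunks being retrieved, can be caricatured as a weighted random walk over memory chunks. This is an illustrative sketch, not the paper's actual cognitive model:

```python
import random

def retrieve_sequence(links, start, steps, rng=random.random):
    """Weighted random walk over memory chunks. `links[chunk]` maps each
    successor chunk to an associative weight; a 'ruminative' structure
    concentrates weight on a few negative chunks, so the walk keeps
    revisiting them."""
    sequence = [start]
    current = start
    for _ in range(steps):
        successors = links[current]
        pick = rng() * sum(successors.values())
        accumulated = 0.0
        for chunk, weight in successors.items():
            accumulated += weight
            if pick < accumulated:
                current = chunk
                break
        sequence.append(current)
    return sequence
```

    Strengthening the weights among a small negative subset makes the retrieved sequence collapse onto that subset, mirroring the repeated retrieval of negative memories the abstract describes.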

    Currency security and forensics: a survey

    By definition, the word currency refers to an agreed medium of exchange; a nation's currency is the formal medium enforced by its elected governing entity. Throughout history, issuers have faced one common threat: counterfeiting. Despite technological advancements, the elimination of counterfeit production remains a distant goal. Scientific determination of authenticity requires a deep understanding of the raw materials and manufacturing processes involved. This survey serves as a synthesis of the current literature to understand the technology and the mechanics involved in currency manufacture and security, whilst identifying gaps in the current literature. Ultimately, a robust currency is desired.