19,908 research outputs found
Measurement in marketing
We distinguish three senses of the concept of measurement (measurement as the selection of observable indicators of theoretical concepts, measurement as the collection of data from respondents, and measurement as the formulation of measurement models linking observable indicators to latent factors representing the theoretical concepts), and we review important issues related to measurement in each of these senses. With regard to measurement in the first sense, we distinguish the steps of construct definition and item generation, and we review scale development efforts reported in three major marketing journals since 2000 to illustrate these steps and derive practical guidelines. With regard to measurement in the second sense, we look at the survey process from the respondent's perspective and discuss the goals that may guide participants' behavior during a survey, the cognitive resources that respondents devote to answering survey questions, and the problems that may occur at the various steps of the survey process. Finally, with regard to measurement in the third sense, we cover both reflective and formative measurement models, and we explain how researchers can assess the quality of measurement in both types of measurement models and how they can ascertain the comparability of measurements across different populations of respondents or conditions of measurement. We also provide a detailed empirical example of measurement analysis for reflective measurement models.
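The quality assessment the abstract mentions for reflective measurement models is commonly operationalised with Cronbach's alpha. A minimal sketch, using hypothetical item scores rather than data from the paper:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for k items (each a list of respondent scores):
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical scores: three 5-point items answered by four respondents
items = [[4, 3, 5, 2], [5, 3, 5, 2], [4, 3, 4, 3]]
alpha = cronbach_alpha(items)  # values above ~0.7 are conventionally acceptable
```

High alpha indicates that the indicators covary strongly, which is consistent with (though not proof of) a single underlying latent factor in a reflective model.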
A load-sharing architecture for high performance optimistic simulations on multi-core machines
In Parallel Discrete Event Simulation (PDES), the simulation model is partitioned into a set of distinct Logical Processes (LPs) which are allowed to concurrently execute simulation events. In this work we present an innovative approach to load-sharing on multi-core/multiprocessor machines, targeted at the optimistic PDES paradigm, where LPs are speculatively allowed to process simulation events with no preventive verification of causal consistency, and actual consistency violations (if any) are recovered via rollback techniques. In our approach, each simulation kernel instance, in charge of hosting and executing a specific set of LPs, runs a set of worker threads, which can be dynamically activated/deactivated on the basis of a distributed algorithm. The latter relies in turn on an analytical model that provides indications on how to reassign processor/core usage across the kernels in order to handle the simulation workload as efficiently as possible. We also present a real implementation of our load-sharing architecture within the ROme OpTimistic Simulator (ROOT-Sim), an open-source, C-based simulation platform implemented according to the PDES paradigm and the optimistic synchronization approach. We also present experimental results assessing the validity of our proposal.
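The speculative execution and rollback mechanism the abstract describes can be illustrated in miniature. The following toy sketch of a single optimistic LP is our own illustration, not ROOT-Sim's implementation; anti-messages, GVT computation and fossil collection are all omitted:

```python
class OptimisticLP:
    """Toy optimistic Logical Process: events are executed speculatively in
    arrival order; an event older than the local virtual time (a 'straggler')
    triggers a rollback to the last consistent checkpoint and re-execution.
    Assumes distinct timestamps > 0."""

    def __init__(self):
        self.state = 0                  # order-sensitive toy state
        self.lvt = 0.0                  # local virtual time
        self.processed = []             # (ts, val) events executed so far
        self.checkpoints = [(0.0, 0)]   # (lvt, state) snapshots

    def _execute(self, ts, val):
        self.state = self.state * 31 + val   # result depends on execution order
        self.lvt = ts
        self.processed.append((ts, val))
        self.checkpoints.append((ts, self.state))

    def receive(self, ts, val):
        if ts >= self.lvt:
            self._execute(ts, val)       # causally safe: execute speculatively
            return
        # Straggler: roll back to the latest checkpoint older than ts ...
        while self.checkpoints[-1][0] >= ts:
            self.checkpoints.pop()
        self.lvt, self.state = self.checkpoints[-1]
        # ... then re-execute the undone events together with the straggler
        redo = sorted([e for e in self.processed if e[0] >= ts] + [(ts, val)])
        self.processed = [e for e in self.processed if e[0] < ts]
        for event in redo:
            self._execute(*event)

lp = OptimisticLP()
for ts, val in [(1, 5), (3, 7), (2, 4), (5, 1), (4, 9)]:  # (2,4), (4,9) arrive late
    lp.receive(ts, val)
```

Despite two stragglers, the final state matches what a strictly timestamp-ordered execution would produce, which is exactly the correctness guarantee rollback recovery must preserve.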
Nonlocal quantum information transfer without superluminal signalling and communication
It is a frequent assumption that any quantum theory positing hidden superluminal influences must also, via superluminal information transfers, exchange superluminal signals capable of enabling communication. However, does the presence of hidden superluminal influences automatically imply superluminal signalling and communication? The non-signalling theorem mediates the apparent conflict between quantum mechanics and the theory of special relativity. However, as a 'no-go' theorem there exist two opposing interpretations of the non-signalling constraint: foundational and operational. Concerning Bell's theorem, we argue that Bell employed both interpretations at different times. Bell finally pursued an explicitly operational position on non-signalling, which is often associated with ontological quantum theory, e.g., de Broglie-Bohm theory. We refer to this position as "effective non-signalling". By contrast, associated with orthodox quantum mechanics is the foundational position referred to here as "axiomatic non-signalling". In search of a decisive communication-theoretic criterion for differentiating between "axiomatic" and "effective" non-signalling, we employ the operational framework offered by Shannon's mathematical theory of communication. We find that an effective non-signalling theorem comprises two sub-theorems, which we call (1) the non-transfer-control (NTC) theorem and (2) the non-signification-control (NSC) theorem. Employing the NTC and NSC theorems, we report that effective, rather than axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal communication. An effective non-signalling theorem allows for nonlocal quantum information transfer yet, at the same time, effectively denies superluminal signalling and communication.
Comment: 21 pages, 5 figures; published with open access in Foundations of Physics (2016).
Cache-Aware Memory Manager for Optimistic Simulations
Parallel Discrete Event Simulation is a well-known technique for executing complex general-purpose simulations where models are described as objects whose interaction is expressed through the generation of impulsive events. In particular, Optimistic Simulation allows full exploitation of the available computational power, avoiding the need to compute safety properties for the events to be executed. Optimistic Simulation platforms internally rely on several data structures, which are meant to support operations aimed at ensuring correctness, inter-kernel communication and/or event scheduling. These housekeeping and management operations access them according to complex patterns, commonly suffering from misuse of memory caching architectures. In particular, operations like log/restore access data structures on a periodic basis, causing the replacement of in-cache buffers related to the actual working set of the application logic and producing a non-negligible performance drop.
In this work we propose generally applicable design principles for a new memory management subsystem targeted at Optimistic Simulation platforms, which addresses this issue by allocating memory buffers depending on their actual future access patterns, in order to enhance event-execution memory locality. Additionally, an application-transparent implementation within ROOT-Sim, an open-source general-purpose optimistic simulation platform, is presented along with experimental results testing our proposal.
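The design principle of placing buffers according to their expected access pattern can be caricatured as pool segregation. The sketch below is our own hypothetical illustration (names and structure are not the paper's), meant only to show the idea of keeping rarely-touched log/restore buffers away from the hot working set:

```python
class PatternAwareAllocator:
    """Toy illustration of access-pattern-aware allocation: buffers touched
    on every event ('hot') and buffers touched only periodically, e.g. for
    log/restore ('cold'), are kept in separate pools, so cold buffers do not
    interleave with, and evict cache lines of, the hot working set.
    A real allocator would manage raw memory regions; here a pool is a list."""

    def __init__(self):
        self.pools = {"hot": [], "cold": []}

    def alloc(self, size, pattern):
        if pattern not in self.pools:
            raise ValueError("pattern must be 'hot' or 'cold'")
        buf = bytearray(size)
        self.pools[pattern].append(buf)
        return buf

alloc = PatternAwareAllocator()
state = alloc.alloc(64, "hot")      # touched at every event execution
snapshot = alloc.alloc(64, "cold")  # touched only at periodic log/restore
```

In an actual C implementation the two pools would be distinct contiguous memory regions, so that periodic log/restore sweeps stay out of the cache lines backing event-execution state.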
Deliberate development of asset frontiers in innovative manufacturing businesses
Manufacturing companies need to be innovative to ensure long-term success. This requires organisations to reconcile the conflicting temporal demands of a dynamic business environment and the more gradual development of infrastructure, systems and people. This challenge is explored by examining the relationship between a firm's innovation propensity and the profile of its portfolio of manufacturing resources.
The Theory of Performance Frontiers is used to characterise the capability profile arising from a firm's suite of assets and resources. The theory contends that the distance between a firm's operating frontier (OF) and its asset frontier (AF) is related to the manufacturing unit's ability to be agile and flexible. A new measure is developed and validated that represents the gap between the frontiers: the OF-AF Gap.
The organisation's innovation propensity is shown to have a negative impact on firm performance unless it is accompanied by a correspondingly large OF-AF gap. It is therefore important that the gap is actively managed by addressing its three constituent elements.
Firstly, organisational learning should be planned along the technological trajectory of the business ahead of current needs. Secondly, product development resources should be balanced between exploitative and explorative projects, with exploration grounded in the fertile areas created by prior knowledge-acquisition activities. Thirdly, justification for investment in physical assets should not be limited to project-related benefits, but should incorporate the capability-building value new equipment brings to the organisation. The acquisition of equipment that has capability beyond immediate project-specific requirements then becomes more justifiable in a financial environment where return-on-investment is king.
The research concludes by developing a simple tool that allows an organisation's OF-AF gap to be enumerated on a normalised scale. This unlocks the potential for firms to benchmark themselves against industry norms and to numerically incorporate the capability-building value of asset investments in financial justifications.
Long-run marketing inferences from scanner data.
Good marketing decisions require managers' understanding of the nature of the market-response function relating performance measures such as sales and market share to variations in the marketing mix (product, price, distribution and communications efforts). Our paper focuses on the dynamic aspect of market-response functions, i.e. how current marketing actions affect current and future market response. While conventional econometrics has been the dominant methodology in empirical market-response analyses, time-series analysis offers unique opportunities for pushing the frontier in dynamic research. This paper examines the contributions and the future outlook of time-series analysis in market-response modeling. We conclude, first, that time-series analysis has made a relatively limited overall contribution to the discipline, and investigate reasons why that has been the case. However, major advances in data (transactions-based databases) and in modeling technology (long-term time-series modeling) create new opportunities for time-series techniques in marketing, in particular for the study of long-run marketing effectiveness. We discuss four major aspects of long-term time-series modeling, relate them to substantive marketing problems, and describe some early applications. Combining the new data with the new methods, we then present original empirical results on the long-term behavior of brand sales and category sales for four consumer products. We discuss the implications of our findings for future research in market response. Our observations lead us to identify three areas where additional research could enhance the diffusion of the identified time-series concepts in marketing.
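The long-run distinction such work builds on, stationary versus evolving sales, is commonly screened by estimating the first-order autoregressive coefficient of the sales series. A deliberately simplified sketch (no intercept, no formal unit-root test such as Dickey-Fuller, hypothetical data):

```python
def ar1_coefficient(series):
    """OLS estimate of rho in y_t = rho * y_{t-1} + e_t (no intercept).
    rho well below 1: shocks decay, so marketing effects are temporary;
    rho near (or above) 1: shocks persist, so long-run effects are possible."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(y * y for y in series[:-1])
    return num / den

# Hypothetical "sales" series: geometrically decaying vs. steadily trending
decaying = [100 * 0.5 ** t for t in range(10)]
trending = [10 * (t + 1) for t in range(10)]
```

On the decaying series the estimate recovers the persistence parameter 0.5 (a stationary, mean-reverting pattern), while on the trending series it exceeds 1, the signature of an evolving series where current actions can shift the long-run level.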
Causality and Association: The Statistical and Legal Approaches
This paper discusses different needs and approaches to establishing "causation" that are relevant in legal cases involving statistical input based on epidemiological (or more generally observational or population-based) information. We distinguish between three versions of "cause": the first involves negligence in providing or allowing exposure; the second involves "cause" as it is shown through a scientifically proved increased risk of an outcome from the exposure in a population; and the third considers "cause" as it might apply to an individual plaintiff based on the first two. The population-oriented "cause" is that commonly addressed by statisticians, and we propose a variation on the Bradford Hill approach to testing such causality in an observational framework, and discuss how such a systematic series of tests might be considered in a legal context. We review some current legal approaches to using probabilistic statements, and link these with the scientific methodology as developed here. In particular, we provide an approach both to the idea of individual outcomes being caused on a balance of probabilities, and to the idea of material contribution to such outcomes. Statistical terminology and legal usage of terms such as "proof on the balance of probabilities" or "causation" can easily become confused, largely due to similar language describing dissimilar concepts; we conclude, however, that a careful analysis can identify and separate those areas in which a legal decision alone is required and those areas in which scientific approaches are useful.
Comment: Published at http://dx.doi.org/10.1214/07-STS234 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
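The "balance of probabilities" idea for individual causation is often operationalised in legal epidemiology through the probability of causation derived from the relative risk. This is a standard textbook quantity, not a formula taken from this paper:

```python
def probability_of_causation(relative_risk):
    """PC = (RR - 1) / RR: under standard assumptions, the probability that
    an exposed individual's outcome was caused by the exposure.
    PC > 0.5 (i.e. RR > 2) is the usual 'more likely than not' threshold."""
    if relative_risk <= 1.0:
        return 0.0
    return (relative_risk - 1.0) / relative_risk

probability_of_causation(2.0)  # exactly at the balance-of-probabilities threshold
```

A relative risk of 2 is thus the point at which, on this simple model, an exposed plaintiff's outcome becomes "more likely than not" attributable to the exposure, which is why RR > 2 recurs as a benchmark in legal discussions of population-based evidence.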
Transparent support for partial rollback in software transactional memories
The Software Transactional Memory (STM) paradigm has gained momentum thanks to its ability to provide synchronization transparency in concurrent applications. With this paradigm, accesses to data structures that are shared among multiple threads are carried out within transactions, which are properly handled by the STM layer with no intervention by the application code. In this article we propose an enhancement of typical STM architectures which allows supporting partial rollback of active transactions, as opposed to the typical case where a rollback of a transaction entails squashing all the already-performed work. Our partial rollback scheme is still transparent to the application programmer and has been implemented for x86-64 architectures and for the ELF format, thus being largely usable on POSIX-compliant systems hosted on top of off-the-shelf architectures. We integrated it within the TinySTM open-source library and we present experimental results for the STAMP STM benchmark run on top of a 32-core HP ProLiant server. © 2013 Springer-Verlag
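The partial-rollback idea, undoing only part of a transaction's work rather than squashing all of it, can be sketched with an undo log and savepoints. This is a conceptual illustration in Python, not the authors' x86-64/ELF-level instrumentation:

```python
_MISSING = object()  # sentinel marking keys that did not exist before a write

class Transaction:
    """Toy transaction over a dict, with an undo log that enables rolling
    back only the writes made after a given savepoint."""

    def __init__(self, store):
        self.store = store
        self.undo = []  # (key, previous value) entries, in write order

    def write(self, key, value):
        self.undo.append((key, self.store.get(key, _MISSING)))
        self.store[key] = value

    def savepoint(self):
        return len(self.undo)  # current position in the undo log

    def rollback_to(self, sp):
        # Partial rollback: undo only writes newer than the savepoint;
        # work performed earlier in the transaction is preserved.
        while len(self.undo) > sp:
            key, old = self.undo.pop()
            if old is _MISSING:
                del self.store[key]
            else:
                self.store[key] = old

store = {}
tx = Transaction(store)
tx.write("a", 1)
sp = tx.savepoint()
tx.write("b", 2)
tx.write("a", 3)
tx.rollback_to(sp)  # undoes only the work performed after the savepoint
```

After the partial rollback the write of `a = 1` survives while the later writes are undone, which is precisely the saving over a full abort: on a conflict detected late in a long transaction, only the conflicting suffix needs to be re-executed.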
- …