Elimination of Bias in Introspection: Methodological Advances, Refinements, and Recommendations
Building on past constructive criticism, the present study provides further methodological development focused on the elimination of bias that may occur during first-person observation. First, various sources of error that may accompany introspection are distinguished on the basis of the previous critical literature. Four main errors are classified: attentional, attributional, conceptual, and expressional. Methodological recommendations for eliminating these errors were then determined through analysis and focused excerpting of the introspective scientific literature. Three groups of recommendations emerged: 1) better focusing of the subject's attention on their own mental processes, 2) providing suitable stimuli, and 3) sharing introspective experience between subjects. Finally, the potential of adjustments to introspective research designs for eliminating attentional, attributional, conceptual, and expressional error is discussed.
Does epistemology matter for educational practice?
Lankshear, Peters & Knobel (2000) suggest that 'The digital age is throwing many of our educational practices and emphases and their underlying epistemological assumptions, beliefs, concepts and substantive theories into doubt'. In particular, because of new technology, educational philosophers must reconsider 'epistemological matters in relation to educational theory and practice' as a matter of 'very high priority'. Of course, philosophers need no excuse at all to reconsider anything; but since Lankshear, Peters & Knobel argue forcefully that 'key elements of the epistemological model that has underpinned education throughout the modern-industrial era' are brought into question by the fact of a 'digital age where more and more of our time, purposes and energies are invested in activities involving new communications and information technologies', it is perhaps worth asking whether the advent of new technology can, in itself, have profound implications for epistemology, and, more fundamentally, how exactly does epistemology 'underpin' or 'underlie' educational practice?
In what follows, the main practical educational questions that I have chosen to consider with respect to issues of epistemology are:
– What should be taught?
– How should it be taught?
– How can one tell what has been learned?
This paper is in four parts. The first part outlines the case made by Lankshear, Peters & Knobel that traditional versions of epistemology must be replaced by a post-modern social epistemology because of changed social practices brought about by new technology, and that educational practice must consequently be reconsidered. The second part of the paper considers some of the claims made about the influences of technology on contemporary knowledge practices. The third part of the paper suggests that the argument offered by Lankshear, Peters & Knobel works as a whole if 'epistemology' is identified with 'accounts of knowledge practices'. The final part considers whether there may be more to epistemology than just social epistemology.
Approximate Decoding Approaches for Network Coded Correlated Data
This paper considers a framework where data from correlated sources are transmitted with the help of network coding in ad-hoc network topologies. The correlated data are encoded independently at the sensors, and network coding is employed at the intermediate nodes in order to improve data delivery performance. In such settings, we focus on the problem of reconstructing the sources at the decoder when perfect decoding is not possible due to losses or bandwidth bottlenecks. We first show that the similarity of the source data can be exploited at the decoder through a novel and simple approximate decoding scheme. We analyze the influence of the network coding parameters, and in particular the size of the finite coding fields, on the decoding performance. We further determine the optimal field size that maximizes the expected decoding performance as a trade-off between the information loss incurred by limiting the resolution of the source data and the error probability in the reconstructed data. Moreover, we show that the performance of approximate decoding improves when the accuracy of the source model increases, even with simple approximate decoding techniques. We provide illustrative examples of possible applications of our algorithms in sensor networks and distributed imaging. In both cases, the experimental results confirm the validity of our analysis and demonstrate the benefits of our low-complexity solution for the delivery of correlated data sources.
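The role of source similarity in this abstract can be illustrated in a toy setting. The snippet below is a sketch under assumed details (three sources, a prime field q = 17, hand-picked coefficient vectors), not the authors' algorithm: with only two coded packets for three unknowns, the decoder enumerates the free variable over the field and breaks the resulting ambiguity with a similarity prior, keeping the candidate with the smallest spread around its mean.

```python
# Illustrative sketch, not the paper's scheme: approximate decoding of
# network-coded correlated sources over GF(q), with q = 17 (assumed prime).
q = 17
x = (5, 6, 7)          # correlated source values (close together; unknown to decoder)

# Intermediate nodes deliver two linear combinations; a third was lost.
y1 = sum(x) % q                          # coefficient vector (1, 1, 1)
y2 = (x[0] + 2 * x[1] + 3 * x[2]) % q    # coefficient vector (1, 2, 3)

def approx_decode(y1, y2, q):
    best = None
    for t in range(q):                   # guess the third source value
        # Solve the remaining 2x2 system over GF(q) for that guess.
        x2 = (y2 - y1 - 2 * t) % q
        x1 = (y1 - t - x2) % q
        cand = (x1, x2, t)
        mean = sum(cand) / 3
        spread = sum((v - mean) ** 2 for v in cand)   # similarity prior
        if best is None or spread < best[0]:
            best = (spread, cand)
    return best[1]

print(approx_decode(y1, y2, q))   # → (5, 6, 7)
```

A smaller field makes the enumeration cheaper but quantises the sources more coarsely, while a larger field preserves resolution at the cost of more aliased candidates for the prior to reject, which mirrors the trade-off the paper optimises.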
Contextual advantage for state discrimination
Finding quantitative aspects of quantum phenomena which cannot be explained
by any classical model has foundational importance for understanding the
boundary between classical and quantum theory. It also has practical
significance for identifying information processing tasks for which those
phenomena provide a quantum advantage. Using the framework of generalized
noncontextuality as our notion of classicality, we find one such nonclassical
feature within the phenomenology of quantum minimum error state discrimination.
Namely, we identify quantitative limits on the success probability for minimum
error state discrimination in any experiment described by a noncontextual
ontological model. These constraints constitute noncontextuality inequalities
that are violated by quantum theory, and this violation implies a quantum
advantage for state discrimination relative to noncontextual models.
Furthermore, our noncontextuality inequalities are robust to noise and are
operationally formulated, so that any experimental violation of the
inequalities is a witness of contextuality, independently of the validity of
quantum theory. Along the way, we introduce new methods for analyzing
noncontextuality scenarios, and demonstrate a tight connection between our
minimum error state discrimination scenario and a Bell scenario.Comment: 18 pages, 9 figure
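For context on the quantum side of this comparison: the optimal quantum success probability for discriminating two equiprobable pure states is the textbook Helstrom bound, p = (1 + sqrt(1 - |⟨ψ|φ⟩|²)) / 2, and the paper's inequalities constrain how close noncontextual models can come to such quantum values. A minimal computation of the standard bound (a well-known result, not the paper's derivation):

```python
import math

# Textbook Helstrom bound for two equiprobable pure states with a given
# overlap |<psi|phi>| (standard result, not this paper's inequalities).
def helstrom(overlap):
    return 0.5 * (1 + math.sqrt(1 - overlap ** 2))

# e.g. two states with overlap cos(pi/4)
print(round(helstrom(math.cos(math.pi / 4)), 4))   # → 0.8536
```

Orthogonal states (overlap 0) are discriminated perfectly, while identical states (overlap 1) reduce to a fair guess.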
Functional Bandits
We introduce the functional bandit problem, where the objective is to find an
arm that optimises a known functional of the unknown arm-reward distributions.
These problems arise in many settings such as maximum entropy methods in
natural language processing, and risk-averse decision-making, but current
best-arm identification techniques fail in these domains. We propose a new
approach, that combines functional estimation and arm elimination, to tackle
this problem. This method achieves provably efficient performance guarantees.
In addition, we illustrate this method on a number of important functionals in
risk management and information theory, and refine our generic theoretical
results in those cases
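The combination of plug-in functional estimation with arm elimination can be sketched as follows. This is an illustrative sketch under assumed details (Gaussian arms, a mean-minus-standard-deviation risk score standing in for the functional, and an ad hoc confidence radius), not the authors' algorithm or its guarantees.

```python
import math
import random
import statistics

random.seed(1)

# Illustrative sketch, not the authors' algorithm: successive arm
# elimination driven by a plug-in estimate of a functional of each
# arm's reward distribution.
arms = {0: (1.0, 0.2), 1: (1.5, 2.0), 2: (2.0, 0.3)}  # (mu, sigma) per arm

def pull(a):
    mu, sigma = arms[a]
    return random.gauss(mu, sigma)

def score(samples):
    # Plug-in functional estimate: empirical mean minus empirical std
    # (a simple risk-averse score, assumed for illustration).
    return statistics.mean(samples) - statistics.pstdev(samples)

samples = {a: [] for a in arms}
active = set(arms)
for rnd in range(1, 200):
    for a in active:
        samples[a].append(pull(a))
    width = 3.0 / math.sqrt(rnd)               # assumed confidence radius
    ests = {a: score(samples[a]) for a in active}
    best_lcb = max(ests[a] - width for a in active)
    # Drop arms whose optimistic score falls below the best pessimistic score.
    active = {a for a in active if ests[a] + width >= best_lcb}
    if len(active) == 1:
        break

print(active)
```

The elimination schedule and confidence radius here are placeholders for the paper's provably efficient choices; the point is only that estimation of a general functional, rather than of the mean, drives the elimination.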
Hierarchical elimination-by-aspects and nested logit models of stated preferences for alternative fuel vehicles
1. INTRODUCTION
Since the late 1960s, transport demand analysis has been the context for significant developments in model forms for the representation of discrete choice behaviour. Such developments have adhered almost exclusively to the behavioural paradigm of Random Utility Maximisation (RUM), first proposed by Marschak (1960) and Block and Marschak (1960). A common argument for the allegiance to RUM is that it ensures consistency with the fundamental axioms of microeconomic consumer theory and, it follows, permits interface between the demand model and the concepts of welfare economics (e.g. Koppelman and Wen, 2001).
The desire to better represent observed choice, which has driven developments in RUM models, has been somewhat at odds, however, with the frequent assault on the utility maximisation paradigm, and by implication RUM, from a range of literatures. This critique has challenged the empirical validity of the fundamental axioms (e.g. Kahneman and Tversky, 2000; McIntosh and Ryan, 2002; Sælensminde, 1999) and, more generally, the realism of the notion of instrumental rationality inherent in utility maximisation (e.g. Hargreaves-Heap, 1992; McFadden, 1999; Camerer, 1998). Emanating from these literatures has been an alternative family of so-called non-RUM models, which seek to offer greater realism in the representation of how individuals actually process choice tasks. The workshop on Methodological Developments at the 2000 Conference of the International Association for Travel Behaviour Research concluded: 'Non-RUM models deserve to be evaluated side-by-side with RUM models to determine their practicality, ability to describe behaviour, and usefulness for transportation policy. The research agenda should include tests of these models' (Bolduc and McFadden, 2001, p. 326).
The present paper, together with a companion paper, Batley and Daly (2003), offers a timely contribution to this research priority. Batley and Daly (2003) present a detailed account of the theoretical derivation of RUM, and consider the relationships of two specific RUM forms, nested logit [NL] (Ben-Akiva, 1974; Williams, 1977; Daly and Zachary, 1976; McFadden, 1978) and recursive nested extreme value [RNEV] (Daly, 2001; Bierlaire, 2002; Daly and Bierlaire, 2003), to two specific non-RUM forms, elimination-by-aspects [EBA] (Tversky, 1972a, 1972b) and hierarchical EBA [HEBA] (Tversky and Sattath, 1979). In particular, Batley and Daly (2003) establish conditions under which NL and RNEV derive choice probabilities equivalent to those of HEBA and EBA, respectively. These findings would seem to ameliorate the concern that the application of RUM models to data generated by non-RUM choice processes could introduce significant biases. That aside, substantive issues remain as to how non-RUM models can best be specified so as to yield useful and robust information in both estimation and forecasting contexts, and how their empirical performance compares with that of RUM models. Such issues are the focus of the present paper, which applies non-RUM models to a real empirical context.
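For readers unfamiliar with EBA, the textbook form of Tversky's model can be computed recursively: an aspect is sampled with probability proportional to its weight, alternatives lacking that aspect are eliminated, and the process repeats until one alternative remains. The sketch below uses hypothetical alternatives, aspects and weights, not the specification estimated in this paper.

```python
# Illustrative sketch of Tversky's elimination-by-aspects (EBA) choice
# probabilities in textbook form. Alternatives are sets of aspects and
# u assigns each aspect a positive weight; both are hypothetical here.
u = {"fast": 2.0, "cheap": 1.0, "green": 1.5}
alts = {"petrol": {"fast", "cheap"},
        "hybrid": {"fast", "green"},
        "electric": {"green"}}

def eba_prob(x, choice_set):
    if len(choice_set) == 1:
        return 1.0
    # Aspects shared by every remaining alternative never discriminate.
    shared = set.intersection(*(alts[a] for a in choice_set))
    relevant = {asp for a in choice_set for asp in alts[a]} - shared
    if not relevant:                     # identical alternatives: pick uniformly
        return 1.0 / len(choice_set)
    total = sum(u[asp] for asp in relevant)
    p = 0.0
    for asp in alts[x] & relevant:
        # Sampling this aspect eliminates all alternatives lacking it.
        surviving = frozenset(a for a in choice_set if asp in alts[a])
        p += (u[asp] / total) * eba_prob(x, surviving)
    return p

probs = {a: eba_prob(a, frozenset(alts)) for a in alts}
print(probs)   # petrol ≈ 0.4, hybrid ≈ 0.6, electric = 0
```

Note that 'electric' receives probability zero: once the non-green alternatives are eliminated, the only discriminating aspect left favours 'hybrid'.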
A heuristic model of bounded route choice in urban areas
There is substantial evidence to indicate that route choice in urban areas is a complex cognitive process, conducted under uncertainty and based on partial perspectives. Yet conventional route choice models continue to make simplistic assumptions about the nature of human cognitive ability, memory and preference. In this paper, a novel framework for route choice in urban areas is introduced, aiming to more accurately reflect the uncertain, bounded nature of route choice decision making. Two main advances are introduced. The first involves the definition of a hierarchical model of space representing the relationship between urban features and human cognition, combining findings from the extensive previous literature on spatial cognition with a large route choice dataset. The second advance involves the development of heuristic rules for route choice decisions, building upon the hierarchical model of urban space. The heuristics describe the process by which quick, 'good enough' decisions are made when individuals are faced with uncertainty. This element of the model is once more constructed and parameterised according to findings from prior research and the trends identified within a large routing dataset. The paper outlines the implementation of the framework within a real-world context, validating the results against observed behaviours. Conclusions are offered as to the extension and improvement of this approach, outlining its potential as an alternative to other route choice modelling frameworks.
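The satisficing character of such 'good enough' heuristics can be illustrated with a toy example. The sketch below is hypothetical, not the paper's parameterised model: candidate routes are ordered on an assumed cue drawn from a spatial hierarchy (the share of the route on main roads), and the first route meeting an aspiration level on travel time is accepted, rather than the fastest route overall.

```python
# Hypothetical sketch of a satisficing route choice heuristic; the cue
# (main-road share) and the aspiration level are assumptions, not the
# paper's estimated parameters.
routes = [
    {"name": "A", "main_road_share": 0.9, "time_min": 24},
    {"name": "B", "main_road_share": 0.5, "time_min": 20},
    {"name": "C", "main_road_share": 0.8, "time_min": 22},
]

ASPIRATION_TIME = 25    # accept any route at most this long (assumed)

def choose(routes):
    # Cue 1: prefer routes anchored on main roads (coarse spatial hierarchy).
    ordered = sorted(routes, key=lambda r: -r["main_road_share"])
    for r in ordered:   # Cue 2: satisfice on travel time, do not optimise.
        if r["time_min"] <= ASPIRATION_TIME:
            return r["name"]
    return min(routes, key=lambda r: r["time_min"])["name"]  # fallback

print(choose(routes))   # → A (first acceptable on the cue hierarchy, not the fastest)
```

The chosen route is not the quickest; it is simply the first acceptable one encountered under the cue ordering, which is the bounded-rationality behaviour the framework aims to capture.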
Factors Dictating Carbene Formation at (PNP)Ir
The mechanistic subtleties involved in the interaction of an amido/bis(phosphine)-supported (PNP)Ir fragment with a series of linear and cyclic ethers have been investigated using density functional theory. Our analysis has revealed the factors dictating the reaction direction toward either an iridium-supported carbene or a vinyl ether adduct. The (PNP)Ir structure will allow carbene formation only from accessible carbons α to the ethereal oxygen, such that d-electron back-donation from the metal to the carbene ligand is possible. Should these conditions be unavailable, the main competing pathway to form the vinyl ether can occur, but only if the (PNP)Ir framework does not sterically interfere with the reacting ether. In situations where steric hindrance prevents unimpeded access to both pathways, the reaction may progress to the initial C−H activation but no further. Our mechanistic analysis is independent of the choice of density functional and is, wherever possible, confirmed by experimentally trapping the intermediate species. We have also highlighted an interesting systematic error present in the DFT analysis of reactions in which the steric environment changes considerably over the course of the reaction.