Steady-state negative Wigner functions of nonlinear nanomechanical oscillators
We propose a scheme to prepare nanomechanical oscillators in nonclassical steady states, characterized by a pronounced negative Wigner function. In our optomechanical approach, the mechanical oscillator couples to multiple laser-driven resonances of an optical cavity. By lowering the resonance frequency of the oscillator via an inhomogeneous electrostatic field, we significantly enhance its intrinsic geometric nonlinearity per phonon. This causes the motional sidebands to split into separate spectral lines for each phonon number, so that transitions between individual phonon Fock states can be selectively addressed. We show that this enables the preparation of the nanomechanical oscillator in a single phonon Fock state. Our scheme can, for example, be implemented with a carbon nanotube dispersively coupled to the evanescent field of a state-of-the-art whispering-gallery-mode microcavity.
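To make the sideband-splitting mechanism concrete, here is a minimal sketch with generic symbols (the mechanical frequency $\omega_m$, Kerr-type anharmonicity $K$, and phonon operator $a$ are placeholders, not the paper's notation). Modelling the enhanced geometric nonlinearity as

$$ H/\hbar = \omega_m\, a^\dagger a + \tfrac{K}{2}\,(a^\dagger a)^2 $$

gives the $|n\rangle \to |n+1\rangle$ transition frequency $\omega_m + \tfrac{K}{2}(2n+1)$: each Fock transition is shifted by a different amount, so once the per-phonon shift $K$ exceeds the relevant linewidth, the motional sidebands resolve into separate spectral lines and a laser drive can address a single transition selectively.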
Enhancing non-classicality in mechanical systems
We study the effects of post-selection measurements on both the non-classicality of the state of a mechanical oscillator and the entanglement between two mechanical systems that are part of a distributed optomechanical network. We address both Gaussian and non-Gaussian measurements, identifying when simple photon counting and Geiger-like measurements are effective in distilling a strongly non-classical mechanical state and enhancing the purely mechanical entanglement between two elements of the network.
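As a minimal illustration of why photon counting can distil non-classicality (a textbook sketch under simplifying assumptions, not the distributed-network model of the paper), suppose the optical mode $o$ and the mechanical mode $m$ share a two-mode squeezed vacuum,

$$ |\psi\rangle = \sqrt{1-\lambda^2}\,\sum_{n=0}^{\infty} \lambda^n\, |n\rangle_o |n\rangle_m , \qquad 0 < \lambda < 1 . $$

Projecting the optical mode onto $|1\rangle_o$ (ideal photon counting) collapses the mechanics into the Fock state $|1\rangle_m$, whose Wigner function is negative at the origin, while a Geiger-like click, described by $\Pi = \mathbb{1} - |0\rangle\langle 0|_o$, merely strips the vacuum component and leaves a weaker but still non-Gaussian mechanical state.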
The intersection between Descriptivism and Meliorism in reasoning research: further proposals in support of 'soft normativism'
The rationality paradox centres on the observation that people are highly intelligent, yet show evidence of errors and biases in their thinking when measured against normative standards. Elqayam and Evans (e.g., 2011) reject normative standards in the psychological study of thinking, reasoning and deciding in favour of a ‘value-free’ descriptive approach to studying high-level cognition. In reviewing Elqayam and Evans’ position, we defend an alternative to descriptivism in the form of ‘soft normativism’, which allows for normative evaluations alongside the pursuit of descriptive research goals. We propose that normative theories have considerable value provided that researchers: (1) are alert to the philosophical quagmire of strong relativism; (2) are mindful of the biases that can arise from utilising normative benchmarks; and (3) engage in a focused analysis of the processing approach adopted by individual reasoners. We address the controversial ‘is–ought’ inference in this context and appeal to a ‘bridging solution’ to this contested inference that is based on the concept of ‘informal reflective equilibrium’. Furthermore, we draw on Elqayam and Evans’ recognition of a role for normative benchmarks in research programmes that are devised to enhance reasoning performance, and we argue that such Meliorist research programmes have a valuable reciprocal relationship with descriptivist accounts of reasoning. In sum, we believe that descriptions of reasoning processes are fundamentally enriched by evaluations of reasoning quality, and argue that if such standards are discarded altogether, then our explanations and descriptions of reasoning processes are severely undermined.
Rationality and the experimental study of reasoning
A survey of the results obtained during the past three decades in some of the most widely used tasks and paradigms in the experimental study of reasoning is presented. It is shown that, at first sight, human performance suffers from serious shortcomings. However, after the problems of communication between experimenter and subject are taken into account, which helps clarify the subject's representation of the tasks, one observes better performance, although still far from perfect. Current theories of reasoning, of which the two most prominent are very briefly outlined, agree in identifying working-memory load as the main source of limitation in performance. Finally, a recent view of human rationality prompted by the foregoing results is described.
The logic-bias effect: The role of effortful processing in the resolution of belief-logic conflict
According to the default interventionist dual-process account of reasoning, belief-based responses to reasoning tasks are based on Type 1 processes generated by default, which must be inhibited in order to produce an effortful, Type 2 output based on the validity of an argument. However, recent research has indicated that reasoning on the basis of beliefs may not be as fast and automatic as this account claims. In three experiments, we presented participants with a reasoning task that was to be completed while they were engaged in random number generation (RNG). We used the novel methodology introduced by Handley, Newstead & Trippas (Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 28-43, 2011), which required participants to make judgments based upon either the validity of a conditional argument or the believability of its conclusion. The results showed that belief-based judgments produced lower rates of accuracy overall and were influenced to a greater extent than validity judgments by the presence of a conflict between belief and logic, for both simple and complex arguments. These findings were replicated in Experiment 3, in which we controlled for switching demands in a blocked design. Across all three experiments, we found a main effect of RNG, implying that both instructional sets require some effortful processing. However, in the blocked design RNG had its greatest impact on logic judgments, suggesting that distinct executive resources may be required for each type of judgment. We discuss the implications of our findings for the default interventionist account and offer a parallel competitive model as an alternative interpretation of our findings.
Learning to represent exact numbers
This article focuses on how young children acquire concepts for exact, cardinal numbers (e.g., three, seven, two hundred, etc.). I believe that exact numbers are a conceptual structure that was invented by people, and that most children acquire gradually, over a period of months or years during early childhood. This article reviews studies that explore children’s number knowledge at various points during this acquisition process. Most of these studies were done in my own lab, and assume the theoretical framework proposed by Carey (2009). In this framework, the counting list (‘one,’ ‘two,’ ‘three,’ etc.) and the counting routine (i.e., reciting the list and pointing to objects, one at a time) form a placeholder structure. Over time, the placeholder structure is gradually filled in with meaning to become a conceptual structure that allows the child to represent exact numbers (e.g., There are 24 children in my class, so I need to bring 24 cupcakes for the party.) A number system is a socially shared, structured set of symbols that poses a learning challenge for children. But once children have acquired a number system, it allows them to represent information (i.e., large, exact cardinal values) that they had no way of representing before.