On the Gauge Invariance of the Decay Rate of False Vacuum
We study the gauge invariance of the decay rate of the false vacuum for a
model in which the scalar field responsible for the false vacuum decay
carries a gauge quantum number. In order to calculate the decay rate, one should
integrate out the field fluctuations around the classical path connecting the
false and true vacua (i.e., the so-called bounce). Concentrating on the case where
the gauge symmetry is broken in the false vacuum, we show a systematic way to
perform such an integration and present a manifestly gauge-invariant formula for
the decay rate of the false vacuum.
Comment: 17 pages, published version
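As background (not stated in this abstract), the decay rate per unit volume that such a fluctuation integral produces takes the standard Callan–Coleman semiclassical form:

```latex
\frac{\Gamma}{V} \;=\; A\, e^{-S_E[\bar{\phi}]}\,\bigl[1 + \mathcal{O}(\hbar)\bigr],
```

where $S_E[\bar{\phi}]$ is the Euclidean action of the bounce $\bar{\phi}$ and the prefactor $A$ arises from the Gaussian integral over fluctuations around it; the gauge-invariance question the abstract raises concerns how that fluctuation integral is organized in a gauge theory.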
Data-analytical stability in second-level fMRI inference
We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional Magnetic Resonance Imaging (fMRI) on (1) the balance between false positives and false negatives and on (2) the data-analytical stability (Qiu et al., 2006; Roels et al., 2015), both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects (Beckmann et al., 2003). We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via permutation-based inference or via inference based on parametric assumptions (Holmes et al., 1996). Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, false discovery rate correction, and a two-step procedure with a minimal cluster size (Lieberman and Cunningham, 2009; Bennett et al., 2009). Based on a simulation study and on real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by the familywise error rate correction. The false discovery rate yields the most variable results, for both permutation-based inference and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference.
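Permutation-based inference at the second level (the second phase above) is commonly implemented as a sign-flipping test on subject-level contrast estimates. A minimal sketch for a single voxel, assuming errors that are symmetric about zero under the null; the function name and defaults are illustrative, not taken from the paper:

```python
import numpy as np

def sign_flip_pvalue(contrasts, n_perm=10000, seed=0):
    """One-sample second-level test via sign-flipping permutations.

    contrasts: (n_subjects,) subject-level contrast estimates at one voxel.
    Under H0 (zero mean, symmetric errors) each subject's sign is
    exchangeable, so the observed mean is compared with the distribution
    of means under random sign flips.
    """
    rng = np.random.default_rng(seed)
    contrasts = np.asarray(contrasts, dtype=float)
    observed = contrasts.mean()
    flips = rng.choice([-1.0, 1.0], size=(n_perm, contrasts.size))
    null_means = (flips * contrasts).mean(axis=1)
    # Two-sided p-value, counting the observed statistic in the null set.
    return (1 + np.sum(np.abs(null_means) >= abs(observed))) / (n_perm + 1)
```

With only 8 subjects there are 2^8 distinct sign patterns, so the smallest attainable two-sided p-value is about 1/128; in practice one permutes (or enumerates) accordingly.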
Optimality in multiple comparison procedures
When many (m) null hypotheses are tested with a single dataset, the control
of the number of false rejections is often the principal consideration. Two
popular controlling rates are the probability of making at least one false
discovery (FWER) and the expected fraction of false discoveries among all
rejections (FDR). Scaled multiple comparison error rates form a new family that
bridges the gap between these two extremes. For example, the Scaled Expected
Value (SEV) limits the number of false positives relative to an arbitrary
increasing function of the number of rejections, that is, E(FP/s(R)). We
discuss the problem of how to choose in practice which procedure to use, with
elements of an optimality theory, by considering the number of false rejections
FP separately from the number of correct rejections TP. Using this framework we
will show how to choose an element in the new family mentioned above.
Comment: arXiv admin note: text overlap with arXiv:1112.451
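The quantities involved can be made concrete with a small Monte Carlo sketch: simulate p-values, apply a simple rejection rule (Bonferroni-style here, purely as an illustration, not the paper's procedure), and estimate FWER, FDR, and a scaled rate E(FP/s(R)) with the illustrative choice s(R) = sqrt(R). All parameter values are hypothetical:

```python
import numpy as np

def error_rates(m_null=90, m_alt=10, alpha=0.05, n_sim=2000, seed=1):
    """Monte Carlo estimates of FWER = P(FP >= 1), FDR = E(FP/R), and a
    scaled rate E(FP/s(R)) with s(R) = sqrt(R), for a Bonferroni-style
    rule that rejects H_i when p_i <= alpha/m (illustrative choices)."""
    rng = np.random.default_rng(seed)
    m = m_null + m_alt
    fwer = fdr = sev = 0.0
    for _ in range(n_sim):
        p_null = rng.uniform(size=m_null)       # p-values of true nulls
        p_alt = rng.beta(0.1, 1.0, size=m_alt)  # false nulls, skewed to 0
        fp = int(np.sum(p_null <= alpha / m))   # false rejections
        r = fp + int(np.sum(p_alt <= alpha / m))
        fwer += float(fp >= 1)
        fdr += fp / r if r > 0 else 0.0         # FDP, defined as 0 if R = 0
        sev += fp / np.sqrt(r) if r > 0 else 0.0  # FP / s(R)
    return fwer / n_sim, fdr / n_sim, sev / n_sim
```

Since FP/R <= 1{FP >= 1} and sqrt(R) <= R for R >= 1, the estimates obey FDR <= E(FP/sqrt(R)) and FDR <= FWER, illustrating how the scaled rate sits between the two classical extremes.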
Generalizations of the Familywise Error Rate
Consider the problem of simultaneously testing null hypotheses H_1,...,H_s.
The usual approach to dealing with the multiplicity problem is to restrict
attention to procedures that control the familywise error rate (FWER), the
probability of even one false rejection. In many applications, particularly if
s is large, one might be willing to tolerate more than one false rejection
provided the number of such cases is controlled, thereby increasing the ability
of the procedure to detect false null hypotheses. This suggests replacing
control of the FWER by controlling the probability of k or more false
rejections, which we call the k-FWER. We derive both single-step and stepdown
procedures that control the k-FWER, without making any assumptions concerning
the dependence structure of the p-values of the individual tests. In
particular, we derive a stepdown procedure that is quite simple to apply, and
prove that it cannot be improved without violation of control of the k-FWER. We
also consider the false discovery proportion (FDP) defined by the number of
false rejections divided by the total number of rejections (defined to be 0 if
there are no rejections). The false discovery rate proposed by Benjamini and
Hochberg [J. Roy. Statist. Soc. Ser. B 57 (1995) 289-300] controls E(FDP).
Here, we construct methods such that, for any \gamma and \alpha,
P{FDP>\gamma}\le\alpha. Two stepdown methods are proposed. The first holds
under mild conditions on the dependence structure of p-values, while the second
is more conservative but holds without any dependence assumptions.
Comment: Published at http://dx.doi.org/10.1214/009053605000000084 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
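The stepdown k-FWER procedure described above can be sketched as follows. This is an illustrative implementation, not the authors' code, using the critical values k*alpha/s for i <= k and k*alpha/(s + k - i) for i > k applied to the sorted p-values:

```python
import numpy as np

def kfwer_stepdown(pvals, k=2, alpha=0.05):
    """Stepdown control of the probability of k or more false rejections.

    Sorted p-values p_(1) <= ... <= p_(s) are compared with
    alpha_i = k*alpha/s for i <= k and alpha_i = k*alpha/(s + k - i)
    for i > k; the j smallest are rejected, where j is the largest index
    such that p_(i) <= alpha_i for all i <= j. For k = 1 this reduces to
    Holm's FWER-controlling procedure.
    Returns a boolean rejection mask in the original order.
    """
    pvals = np.asarray(pvals, dtype=float)
    s = pvals.size
    order = np.argsort(pvals)
    i = np.arange(1, s + 1)
    crit = k * alpha / np.minimum(s + k - i, s)   # = k*alpha/s for i <= k
    below = pvals[order] <= crit
    j = s if below.all() else int(np.argmin(below))  # first failure index
    rejected = np.zeros(s, dtype=bool)
    rejected[order[:j]] = True
    return rejected
```

Note the stepdown structure: a single sorted p-value exceeding its critical value stops all further rejections, which is what makes the procedure valid without dependence assumptions.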
The neural representation of mental beliefs held by two agents
Neuroimaging research has demonstrated that mentalizing about false beliefs held by other people recruits the temporo-parietal junction (TPJ). However, earlier work was limited to a single agent that held a false belief. We investigated the effect of two agents that held similar or mixed false and/or true beliefs. Participants saw animated stories with two smurfs holding true or false beliefs (Story phase). At the end of each trial, they were requested to take the perspective of the self or of one of the smurfs (Question phase). We predicted that an increasing number of smurfs holding a false belief would increase activation in the TPJ when participants have to report the belief of a smurf, because the incongruent belief should have a stronger influence if it is held by two agents rather than one. This prediction was confirmed: activation in the TPJ during the Story and Question phases increased when more smurfs held a false belief. Taking the perspective of the self led to the strongest activation of the TPJ in the two conditions that involved a true belief and the weakest activation in the condition of two false beliefs. These data suggest that activation in the TPJ depends on the perspective participants take, and that the number of agents holding a false belief influences activation in the TPJ only when taking the agent's perspective.
Processing of false belief passages during natural story comprehension: An fMRI study
The neural correlates of theory of mind (ToM) are typically studied using paradigms which require participants to draw explicit, task-related inferences (e.g., in the false belief task). In a natural setup, such as listening to stories, false belief mentalizing occurs incidentally as part of narrative processing. In our experiment, participants listened to auditorily presented stories with false belief passages (implicit false belief processing) and immediately after each story answered comprehension questions (explicit false belief processing), while neural responses were measured with functional magnetic resonance imaging (fMRI). All stories included (among other situations) one false belief condition and one closely matched control condition. For the implicit ToM processing, we modeled the hemodynamic response during the false belief passages in the story and compared it to the hemodynamic response during the closely matched control passages. For implicit mentalizing, we found activation in typical ToM processing regions, that is, the angular gyrus (AG), superior medial frontal gyrus (SmFG), precuneus (PCUN), and middle temporal gyrus (MTG), as well as in the inferior frontal gyrus (IFG) bilaterally. For explicit ToM, we only found AG activation. The conjunction analysis highlighted the left AG and MTG as well as the bilateral IFG as overlapping ToM processing regions for both implicit and explicit modes. Implicit ToM processing during listening to false belief passages recruits the left SmFG and bilateral PCUN in addition to the "mentalizing network" known from explicit processing tasks.
The Effect of Limited Attention and Delay on Negative Arousing False Memories
Previous research has shown that, in comparison to neutral stimuli, false memories for high-arousing negative stimuli are greater after very fast presentation and limited attention at study. However, full compared to limited attention conditions still produce comparably more false memories for all stimulus types. Research has also shown that emotional stimuli benefit from a period of consolidation. What effect would such consolidation have on false memory formation even when attention is limited at study? The aim of the present study was to investigate the effect of fast presentation on false memory production for negatively arousing and neutral items over time using the DRM paradigm. Sixty-eight participants studied negative and neutral DRM lists under fast or slow presentation conditions. Half completed a recognition test immediately and half completed a recognition test after one week. Results revealed that, for fast presentation, negative critical lures increased after one week and were comparable to negative critical lures in the slow presentation encoding conditions. Neutral critical lures in the fast presentation condition did not change and remained lower compared to the slow presentation condition. These findings are the first demonstration that arousing negative false memories can increase over time when attention at encoding is limited.
Evaluation of second-level inference in fMRI analysis
We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by the familywise error rate correction. The FDR yields the most variable results, for both permutation-based inference and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference.
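The two-step procedure in the third phase can be sketched in one dimension: apply an uncorrected voxelwise threshold, then drop suprathreshold clusters (here, runs of consecutive voxels) smaller than a minimal extent. The 1-D simplification and the parameter values are illustrative, not the cited papers' settings:

```python
import numpy as np

def cluster_extent_threshold_1d(pmap, p_thresh=0.001, min_cluster=3):
    """Two-step thresholding on a 1-D p-value map: voxelwise uncorrected
    threshold, then discard runs of suprathreshold voxels shorter than
    min_cluster (illustrative defaults)."""
    supra = np.asarray(pmap) < p_thresh
    keep = np.zeros_like(supra)
    start = None
    # Append a trailing False so a run ending at the array edge is closed.
    for i, on in enumerate(np.append(supra, False)):
        if on and start is None:
            start = i                      # a run begins
        elif not on and start is not None:
            if i - start >= min_cluster:   # run is long enough: keep it
                keep[start:i] = True
            start = None
    return keep
```

In real analyses the same idea is applied to 3-D volumes with a connected-component labeling step in place of the run-length scan.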