2,754 research outputs found

    Neural activity with spatial and temporal correlations as a basis to simulate fMRI data

    In the development of data analysis techniques, simulation studies are steadily gaining interest. The largest challenge in setting up a simulation study is to create realistic data. This is especially true for generating fMRI data, since there is no consensus about the biological and physical relationships underlying the BOLD signal. Most existing simulation studies start from empirically acquired resting data to obtain realistic noise and add known activity (e.g., Bianciardi et al., 2004). However, since one has no control over the noise, such data are hard to use in simulation studies. Others use the Bloch equations to simulate fMRI data (e.g., Drobnjak et al., 2006). Although this yields realistic data, the process is very slow, involving many calculations that may be unnecessary in a simulation study. We propose a new basis for generating fMRI data that starts from a neural activation map in which the neural activity is correlated between different locations, both spatially and temporally. A biologically inspired model can then be used to simulate the BOLD response.
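    The proposed basis can be sketched as follows: a minimal Python illustration (not the authors' implementation) that draws spatially correlated innovations, propagates them through an AR(1) recursion for temporal correlation, and convolves the result with a canonical double-gamma HRF. All parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

n_vox, n_time = 50, 200  # voxels on a 1-D strip, time points (assumed sizes)
rho_t = 0.8              # temporal AR(1) coefficient (assumed)
len_s = 5.0              # spatial correlation length in voxels (assumed)

# Spatially correlated innovations: Gaussian-kernel covariance over voxel positions.
pos = np.arange(n_vox)
cov_s = np.exp(-(pos[:, None] - pos[None, :]) ** 2 / (2 * len_s ** 2))
L = np.linalg.cholesky(cov_s + 1e-6 * np.eye(n_vox))  # jitter for stability

# Temporally correlated neural activity via an AR(1) recursion on the spatial field.
activity = np.zeros((n_vox, n_time))
for t in range(1, n_time):
    innov = L @ rng.standard_normal(n_vox)
    activity[:, t] = rho_t * activity[:, t - 1] + np.sqrt(1 - rho_t ** 2) * innov

# Map neural activity to a BOLD-like signal with a canonical double-gamma HRF.
tt = np.arange(0.0, 30.0, 1.0)
hrf = gamma.pdf(tt, 6) - 0.35 * gamma.pdf(tt, 16)
hrf /= hrf.sum()
bold = np.apply_along_axis(lambda x: np.convolve(x, hrf)[:n_time], 1, activity)
```

    A biologically inspired forward model (e.g., the balloon model) could replace the linear convolution in the last step; the correlated activity map is the part that distinguishes this approach from adding independent noise per voxel.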

    neuRosim: an R package for simulation of fMRI magnitude data with realistic noise

    Statistical analysis techniques for highly complex structured data such as fMRI data should be thoroughly validated. In this process, knowing the ground truth is essential. Unfortunately, establishing the ground truth of fMRI data is only possible with highly invasive procedures (i.e. intracranial EEG). Therefore, generating the data artificially is often the only viable solution. However, there is currently no consensus among researchers on how to simulate fMRI data. Research groups develop their own methods and use only in-house software routines. A general validation of these methods is lacking, probably due to the nonexistence of well-documented and freely available software.
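    As a minimal illustration of ground-truth validation (sketched in Python rather than R, with a hypothetical block design and effect size), one can plant a known activation in synthetic noise and verify that a simple per-voxel detector recovers it:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_time = 120
design = np.tile(np.repeat([0.0, 1.0], 10), 6)  # on/off block design (assumed)

# Ground truth: voxels 0-4 are active, voxels 5-19 contain only noise.
n_vox = 20
data = rng.standard_normal((n_vox, n_time))
data[:5] += 1.5 * design                        # known effect size (assumed)

# Validate a simple detector (two-sample t-test) against the known truth.
on, off = data[:, design == 1], data[:, design == 0]
t, p = stats.ttest_ind(on, off, axis=1)
detected = p < 0.001
```

    Because the active voxels are known by construction, sensitivity and specificity of any analysis method can be computed exactly, which is precisely what is impossible with empirically acquired data.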

    Three-Dimensional Extended Bargmann Supergravity

    We show that three-dimensional General Relativity, augmented with two vector fields, allows for a non-relativistic limit, different from the standard limit leading to Newtonian gravity, that results in a well-defined action of the Chern-Simons type. We show that this three-dimensional `Extended Bargmann Gravity', after coupling to matter, leads to equations of motion allowing a wider class of background geometries than the ones encountered in Newtonian gravity. We give the supersymmetric generalization of these results and point out an important application in the context of calculating partition functions of non-relativistic field theories using localization techniques. Comment: 6 pages, v2: typos corrected, reference updated, accepted for publication in Phys. Rev. Lett.

    Secondary generalisation in categorisation: an exemplar-based account

    The parallel rule activation and rule synthesis (PRAS) model is a computational model for generalisation in category learning, proposed by Vandierendonck (1995). An important concept underlying the PRAS model is the distinction between primary and secondary generalisation. In Vandierendonck (1995), an empirical study is reported that provides support for the concept of secondary generalisation. In this paper, we re-analyse the data reported by Vandierendonck (1995) by fitting three different variants of the Generalised Context Model (GCM), which do not rely on secondary generalisation. Although some of the GCM variants outperformed the PRAS model in terms of global fit, they all have difficulty providing a qualitatively good fit to a specific critical pattern.
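    The GCM's core computation, exponentially decaying similarity to stored exemplars combined through a choice rule, can be sketched as follows. The stimuli and parameter values below are hypothetical, not those fitted in the re-analysis:

```python
import numpy as np

def gcm_prob(x, exemplars, labels, c=1.0):
    """P(category A | stimulus x) under a basic Generalised Context Model.

    Similarity decays exponentially with city-block distance to each stored
    exemplar; choice follows the ratio of summed similarities (Luce rule).
    """
    d = np.abs(exemplars - x).sum(axis=1)   # city-block distance per exemplar
    s = np.exp(-c * d)                      # exponential similarity gradient
    s_a = s[labels == 0].sum()
    s_b = s[labels == 1].sum()
    return s_a / (s_a + s_b)

# Two categories of 2-D exemplars (hypothetical stimulus coordinates).
ex = np.array([[1.0, 1.0], [1.0, 2.0], [4.0, 4.0], [4.0, 5.0]])
lab = np.array([0, 0, 1, 1])
p = gcm_prob(np.array([1.5, 1.5]), ex, lab, c=1.0)  # stimulus near category A
```

    In exemplar models of this kind, generalisation to a novel stimulus follows entirely from its similarity to stored exemplars, which is why no separate secondary-generalisation mechanism is required.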

    A review of R-packages for random-intercept probit regression in small clusters

    Generalized Linear Mixed Models (GLMMs) are widely used to model clustered categorical outcomes. To tackle the intractable integration over the random effects distributions, several approximation approaches have been developed for likelihood-based inference. As these seldom yield satisfactory results when analyzing binary outcomes from small clusters, estimation within the Structural Equation Modeling (SEM) framework is proposed as an alternative. We compare the performance of R-packages for random-intercept probit regression relying on: the Laplace approximation, adaptive Gaussian quadrature (AGQ), Penalized Quasi-Likelihood (PQL), an MCMC implementation, and integrated nested Laplace approximation within the GLMM framework, and a robust diagonally weighted least squares estimation within the SEM framework. In terms of bias for the fixed and random effect estimators, SEM usually performs best for cluster size two, while AGQ prevails in terms of precision (mainly because of SEM's robust standard errors). As the cluster size increases, however, AGQ becomes the best choice for both bias and precision.
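    The data-generating model under comparison, a random-intercept probit for binary outcomes in clusters of size two, can be sketched as follows (Python rather than R; all parameter values are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

n_clusters, cluster_size = 500, 2   # many small clusters, as in the review
beta0, beta1 = -0.5, 1.0            # fixed intercept and slope (assumed)
sigma_u = 1.0                       # random-intercept SD (assumed)

u = rng.normal(0.0, sigma_u, n_clusters)            # cluster random intercepts
x = rng.standard_normal((n_clusters, cluster_size)) # observation-level covariate
eta = beta0 + beta1 * x + u[:, None]                # linear predictor
y = (rng.random((n_clusters, cluster_size)) < norm.cdf(eta)).astype(int)
```

    The shared intercept u induces within-cluster dependence; marginalizing it out of the likelihood is the intractable integral that the compared estimators (Laplace, AGQ, PQL, MCMC, INLA, SEM) approximate in different ways.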

    The influence of problem features and individual differences on strategic performance in simple arithmetic

    The present study examined the influence of features differing across problems (problem size and operation) and differing across individuals (daily arithmetic practice, the amount of calculator use, arithmetic skill, and gender) on simple-arithmetic performance. Regression analyses were used to investigate the role of these variables in both strategy selection and strategy efficiency. Results showed that more-skilled and highly practiced students used memory retrieval more often and executed their strategies more efficiently than less-skilled and less practiced students. Furthermore, calculator use was correlated with retrieval efficiency and procedural efficiency but not with strategy selection. Only very small associations with gender were observed, with boys retrieving slightly faster than girls. Implications of the present findings for views on models of mental arithmetic are discussed.

    Newton-Cartan supergravity with torsion and Schrödinger supergravity

    We derive a torsionful version of three-dimensional N=2 Newton-Cartan supergravity using a non-relativistic notion of the superconformal tensor calculus. The "superconformal" theory that we start with is Schrödinger supergravity, which we obtain by gauging the Schrödinger superalgebra. We present two non-relativistic N=2 matter multiplets that can be used as compensators in the superconformal calculus. They lead to two different off-shell formulations which, in analogy with the relativistic case, we call "old minimal" and "new minimal" Newton-Cartan supergravity. We find similarities but also point out some differences with respect to the relativistic case. Comment: 30 pages

    Newton-Cartan (super)gravity as a non-relativistic limit

    We define a procedure that, starting from a relativistic theory of supergravity, leads to a consistent, non-relativistic version thereof. As a first application we use this limiting procedure to show how the Newton-Cartan formulation of non-relativistic gravity can be obtained from general relativity. Then we apply it in a supersymmetric case and derive a novel, non-relativistic, off-shell formulation of three-dimensional Newton-Cartan supergravity. Comment: 29 pages; v2: added comment about different NR gravities and more refs; v3: more refs, matches published version.