Representation, Rightness, and the Fringe
So the central question here is phenomenological: What is the nature of the aesthetic zap? For it is this experience, or its promise, that gives art such a deep hold on human life. But the issue of representation, while secondary, is still pregnant with cognitive implications: Why is representation, of all the devices available to an artist, more likely to shift the odds in favour of eliciting and/or intensifying aesthetic experience? Assuming a Darwinian view of our species, it is likely that the answer to both questions will come from understanding how our capacity to enjoy art grows out of normal cognition.
The Fringe: A Case Study in Explanatory Phenomenology
William James's greatest achievement is, arguably, his analysis of the fringe — or, as he sometimes called it, transitive experience. In trying to understand this vague, elusive, often peripheral aspect of consciousness, James broke new ground. But in so doing he also began to lay down the first stratum of a radically new methodology, one that intersects first- and third-person findings in such a way that each is able to interrogate the other, and so further our understanding of both.
But I think it is important to see that explanatory phenomenology can be completely scientific without necessarily having to (1) consider the neural substrate, (2) employ reductive arguments, or (3) operate at the third-person level. If I am right, explanatory phenomenology can be a remarkably plastic member of the set of first-person methodologies for the study of consciousness.
Critical-point finite-size scaling in the microcanonical ensemble
We develop a scaling theory for the finite-size critical behavior of the microcanonical entropy (density of states) of a system with a critically divergent heat capacity. The link between the microcanonical entropy and the canonical energy distribution is exploited to establish the former, and corroborate its predicted scaling form, in the case of the 3d Ising universality class. We show that the scaling behavior emerges clearly when one accounts for the effects of the negative background constant contribution to the canonical critical specific heat. We show that this same constant plays a significant role in determining the observed differences between the canonical and microcanonical specific heats of systems of finite size in the critical region.
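The link between the canonical energy distribution and the microcanonical entropy can be made concrete: since P_beta(E) is proportional to Omega(E) exp(-beta E), the entropy S(E) = ln Omega(E) follows, up to an additive constant, from a histogram of canonically sampled energies. The sketch below illustrates that relation on synthetic data; the function name and the toy density of states are illustrative assumptions, not the paper's actual finite-size scaling analysis.

```python
import numpy as np

def microcanonical_entropy(energies, beta, bins=60):
    """Estimate the microcanonical entropy S(E) = ln Omega(E), up to an
    additive constant, from energy samples drawn in the canonical ensemble
    at inverse temperature beta, using P_beta(E) ~ Omega(E) * exp(-beta*E)."""
    counts, edges = np.histogram(energies, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = counts > 0                              # skip empty bins before taking logs
    return centers[mask], np.log(counts[mask]) + beta * centers[mask]

# Toy check: sample energies whose density of states is Omega(E) ~ E**2,
# so the recovered entropy should track 2*ln(E) up to an additive constant.
rng = np.random.default_rng(0)
beta = 1.0
energies = rng.gamma(shape=3.0, scale=1.0 / beta, size=200_000)  # P(E) ~ E^2 exp(-E)
E, S = microcanonical_entropy(energies, beta)
```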
Quantifying the Statistical Impact of GRAPPA in fcMRI Data with a Real-Valued Isomorphism
The interpolation of missing spatial frequencies through the generalized auto-calibrating partially parallel acquisitions (GRAPPA) parallel magnetic resonance imaging (MRI) model implies that a correlation is induced between the acquired and reconstructed frequency measurements. As the parallel image reconstruction algorithms in many medical MRI scanners are based on the GRAPPA model, this study aims to quantify the statistical implications that the GRAPPA model has in functional connectivity studies. The linear mathematical framework derived in the work of Rowe, 2007, is adapted to represent the complex-valued GRAPPA image reconstruction operation in terms of a real-valued isomorphism, and a statistical analysis is performed on the effects that the GRAPPA operation has on reconstructed voxel means and correlations. The interpolation of missing spatial frequencies with the GRAPPA model is shown to result in an artificial correlation induced between voxels in the reconstructed images, and these artificial correlations are shown to reside in the low temporal frequency spectrum commonly associated with functional connectivity. Through a real-valued isomorphism, such as the one outlined in this manuscript, the exact artificial correlations induced by the GRAPPA model are not simply estimated, as they would be with simulations, but are precisely quantified. If these correlations are unaccounted for, they can incur an increase in false positives in functional connectivity studies.
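The statistical point that a complex-valued linear reconstruction, expressed through a real-valued isomorphism, induces exactly computable correlations in its output can be illustrated with a small numerical sketch. The operator below is a generic stand-in for a GRAPPA-like linear interpolation, not the actual reconstruction kernel, and the function names are illustrative assumptions.

```python
import numpy as np

def real_isomorphism(A):
    """Real-valued isomorphism of a complex matrix A = Ar + i*Ai: the block
    matrix [[Ar, -Ai], [Ai, Ar]] acts on stacked (real, imaginary) vectors."""
    return np.block([[A.real, -A.imag], [A.imag, A.real]])

def induced_correlation(A, sigma2=1.0):
    """Exact correlation matrix of the reconstructed data when the acquired
    data carry i.i.d. noise of variance sigma2: Cov = sigma2 * M @ M.T."""
    M = real_isomorphism(A)
    cov = sigma2 * (M @ M.T)
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)

# Generic stand-in for a linear k-space interpolation operator (not the
# actual GRAPPA kernel): each output mixes several acquired complex samples.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
corr = induced_correlation(A)
# Nonzero off-diagonal entries of `corr` arise purely from the linear
# reconstruction, even though the acquired noise was uncorrelated.
```

Because the mapping is linear, the propagated covariance is exact rather than simulated, which mirrors the abstract's point that the induced correlations can be quantified precisely.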
National bank notes and silver certificates
From 1883 to 1892, the circulation of national bank notes in the United States fell nearly 50 percent. Previous studies have attributed this to supply-side factors that led to a decline in the profitability of note issue during this period. This paper provides an alternative explanation. The decline in note issue was, in large part, demand-driven. The presence of a competing currency with superior features caused the public to substitute away from national bank notes.
Keywords: Paper money; National bank notes; Silver
The role of data in health care disparities in medicaid managed care
BACKGROUND: The Affordable Care Act includes provisions to standardize the collection of data on health care quality that can be used to measure disparities. We conducted a qualitative study among leaders of Medicaid managed care plans that currently have access to standardized quality data stratified by race and ethnicity, to learn how they use these data to address disparities. METHODS: We conducted semi-structured interviews with 21 health plan leaders across 9 Medicaid managed care plans in California. We used purposive sampling to maximize heterogeneity in geography and plan type (e.g., non-profit, commercial). We performed a thematic analysis based on iterative coding by two investigators. RESULTS: We identified four major themes. Improving overall quality was tightly linked to a focus on standardized metrics that are integral to meeting regulatory or financial incentives. However, reducing disparities was not driven by standardized data but by a mix of factors. Data were frequently examined by race and ethnicity only when overall performance was low. Disparities were attributed to either individual choices or cultural and linguistic factors, with plans focusing interventions on recently immigrated groups. CONCLUSIONS: While plans' efforts to address overall quality were often informed by standardized data, actions to reduce disparities were not, at least partly because there were few regulatory or financial incentives driving meaningful use of data on disparities. Standardized data, as envisaged by the Affordable Care Act, could become more useful for addressing disparities if they are combined with policies and regulations that promote health care equity.
Incorporating Relaxivities to More Accurately Reconstruct MR Images
Purpose
To develop a mathematical model that incorporates the magnetic resonance relaxivities into the image reconstruction process in a single step.
Materials and methods
In magnetic resonance imaging, the complex-valued measurements of the acquired signal at each point in frequency space are expressed as a Fourier transformation of the proton spin density weighted by Fourier encoding anomalies: T2⁎, T1, and a phase determined by magnetic field inhomogeneity (∆B) according to the MR signal equation. Such anomalies alter the expected symmetry and the signal strength of the k-space observations, resulting in images distorted by image warping, blurring, and loss in image intensity. Although the T1 tissue relaxation time provides valuable quantitative information on tissue characteristics, the T1 recovery term is typically neglected by assuming a long repetition time. In this study, the linear framework presented in the work of Rowe et al., 2007, and of Nencka et al., 2009, is extended to develop a Fourier reconstruction operation in terms of a real-valued isomorphism that incorporates the effects of T2⁎, ∆B, and T1. This framework provides a way to precisely quantify the statistical properties of the corrected image-space data by offering a linear relationship between the observed frequency space measurements and the reconstructed, corrected image-space measurements. The model is illustrated both on theoretical data generated by considering T2⁎, T1, and/or ∆B effects, and on experimentally acquired fMRI data by focusing on the incorporation of T1. A comparison is also made between the activation statistics computed from the reconstructed data with and without the incorporation of T1 effects.
Results
Accounting for T1 effects in image reconstruction is shown to recover image contrast that exists prior to T1 equilibrium. The incorporation of T1 is also shown to induce negligible correlation in reconstructed images and preserve functional activations.
Conclusion
With the use of the proposed method, the effects of T2⁎ and ∆B can be corrected, and T1 can be incorporated into the time-series image-space data during image reconstruction in a single step. Incorporation of T1 provides improved tissue segmentation over the course of the time series and can therefore improve the precision of motion correction and image registration.
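As a rough illustration of the kind of relaxivity-weighted forward model described above, the sketch below builds a simplified 1-D Fourier encoding matrix whose columns are weighted by T2⁎ decay, T1 recovery, and an off-resonance phase, then inverts it in a single least-squares step. The specific weighting terms, parameter values, and function names are assumptions for illustration only; the authors' actual formulation works through a real-valued isomorphism and is not reproduced here.

```python
import numpy as np

def weighted_encoding(n, t2star, t1, delta_b, TE, TR, gamma=2.675e8):
    """Simplified 1-D Fourier encoding matrix whose columns are weighted by
    T2* decay at echo time TE, T1 recovery over repetition time TR, and an
    off-resonance phase from the field offset delta_b (in Tesla)."""
    x = np.arange(n)
    k = np.arange(n)
    F = np.exp(-2j * np.pi * np.outer(k, x) / n)      # ideal Fourier encoding
    w = (np.exp(-TE / t2star)                         # T2* signal decay
         * (1.0 - np.exp(-TR / t1))                   # T1 recovery weighting
         * np.exp(-1j * gamma * delta_b * TE))        # off-resonance phase
    return F * w[np.newaxis, :]                       # weight each voxel column

def reconstruct(A, s):
    """Single-step least-squares inversion of the weighted encoding,
    recovering a relaxation-corrected image from the k-space samples s."""
    return np.linalg.lstsq(A, s, rcond=None)[0]

# Toy example with illustrative (not measured) parameter values.
n = 64
rho = np.zeros(n)
rho[20:40] = 1.0                                      # simple box phantom
t2star = np.full(n, 0.04)                             # seconds
t1 = np.full(n, 1.0)                                  # seconds
delta_b = np.linspace(-1e-7, 1e-7, n)                 # Tesla
A = weighted_encoding(n, t2star, t1, delta_b, TE=0.03, TR=2.0)
s = A @ rho                                           # simulated k-space data
rho_hat = reconstruct(A, s)                           # corrected image estimate
```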
