
    Temporal isolation of neural processes underlying face preference decisions

    Decisions about whether we like someone are often made so rapidly from first impressions that it is difficult to examine the engagement of neural structures at specific points in time. Here, we used a temporally extended decision-making paradigm to examine brain activation with functional MRI (fMRI) at sequential stages of the decision-making process. Activity in reward-related brain structures—the nucleus accumbens (NAC) and orbitofrontal cortex (OFC)—was found to occur at temporally dissociable phases while subjects decided which of two unfamiliar faces they preferred. Increases in activation in the OFC occurred late in the trial, consistent with a role for this area in computing the decision of which face to choose. Signal increases in the NAC occurred early in the trial, consistent with a role for this area in initial preference formation. Moreover, early signal increases in the NAC also occurred while subjects performed a control task (judging face roundness) when these data were analyzed on the basis of which of those faces were subsequently chosen as preferred in a later task. The findings support a model in which rapid, automatic engagement of the NAC conveys a preference signal to the OFC, which in turn is used to guide choice.

    Distractibility in daily life is reflected in the structure and function of human parietal cortex

    We all appreciate that some of our friends and colleagues are more distractible than others. This variability can be captured by pencil-and-paper questionnaires in which individuals report such cognitive failures in their everyday life. Surprisingly, these self-report measures have high heritability, leading to the hypothesis that distractibility might have a basis in brain structure. In a large sample of healthy adults, we demonstrated that a simple self-report measure of everyday distractibility accurately predicted gray matter volume in a remarkably focal region of left superior parietal cortex. This region must play a causal role in reducing distractibility, because we found that disrupting its function with transcranial magnetic stimulation increased susceptibility to distraction. Finally, we showed that the self-report measure of distractibility reliably predicted our laboratory-based measure of attentional capture. Our findings identify a critical mechanism in the human brain causally involved in avoiding distractibility, which, importantly, bridges self-report judgments of cognitive failures in everyday life and a commonly used laboratory measure of distractibility to the structure of the human brain.

    A method for viewing and interacting with medical volumes in virtual reality

    The medical field has long benefited from advancements in diagnostic imaging technology. Medical images created through methods such as Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) are used by medical professionals to non-intrusively peer into the body to make decisions about surgeries. Over time, the viewing medium of medical images has evolved from X-ray film negatives to stereoscopic 3D displays, with each new development enhancing the viewer’s ability to discern detail or decreasing the time needed to produce and render a body scan. Though doctors and surgeons are trained to view medical images in 2D, some are choosing to view body scans in 3D through volume rendering. While traditional 2D displays can be used to display 3D data, a viewing method that incorporates depth would convey more information to the viewer. One device that has shown promise in medical image viewing applications is the Virtual Reality Head Mounted Display (VR HMD). VR HMDs have recently increased in popularity, with several commodity devices being released within the last few years. The Oculus Rift, HTC Vive, and Windows Mixed Reality HMDs like the Samsung Odyssey offer higher resolution screens, more accurate motion tracking, and lower prices than earlier HMDs. They also include motion-tracked handheld controllers meant for navigation and interaction in video games. Because of their popularity and low cost, medical volume viewing software that is compatible with these headsets would be accessible to a wide audience. However, the introduction of VR to medical volume rendering presents difficulties in implementing consistent user interactions and ensuring performance. Though all three headsets require unique driver software, they are compatible with OpenVR, a middleware that standardizes communication between the HMD, the HMD’s controllers, and VR software. However, the controllers included with the HMDs each have a slightly different control layout.
    Furthermore, buttons, triggers, touchpads, and joysticks that share the same hand position between devices do not report values to OpenVR in the same way. Implementing volume rendering functions like clipping and tissue density windowing on VR controllers could improve the user’s experience over mouse-and-keyboard schemes through the use of tracked hand and finger movements. To create a control scheme compatible with multiple HMDs, a method of mapping controls differently depending on the device was developed. Additionally, volume rendering is a computationally intensive process, and even more so when rendering for an HMD. By using techniques like GPU raytracing with modern GPUs, real-time framerates are achievable on desktop computers with traditional displays. However, the importance of achieving high framerates is even greater when viewing with a VR HMD due to its higher level of immersion. Because the 3D scene occupies most of the user’s field of view, low or choppy framerates contribute to feelings of motion sickness. This was mitigated by decreasing volume rendering quality when the framerate dropped below acceptable levels. The volume rendering and VR interaction methods described in this thesis were demonstrated in an application developed for immersive viewing of medical volumes. This application places the user and a medical volume in a 3D VR environment, allowing the user to manually place clipping planes, adjust the tissue density window, and move the volume to achieve different viewing angles with handheld motion-tracked controllers. The result shows that GPU-raytraced medical volumes can be viewed and interacted with in VR using commodity hardware, and that a control scheme can be mapped to allow the same functions on different HMD controllers despite differences in layout.
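    The two techniques this abstract describes can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the thesis's actual code, and all names such as CONTROL_MAPS and adjust_quality are invented for this sketch): an abstract-action-to-physical-control lookup that differs per device, and a quality controller that coarsens the ray-marching step size when framerate falls below an acceptable threshold.

```python
# Hypothetical sketch of device-dependent control mapping and
# framerate-based quality scaling; names are illustrative only.

# Abstract rendering actions mapped to the physical control that
# reports suitable values on each headset's controllers.
CONTROL_MAPS = {
    "vive":   {"clip_plane": "trigger", "density_window": "touchpad"},
    "oculus": {"clip_plane": "trigger", "density_window": "joystick"},
    "wmr":    {"clip_plane": "trigger", "density_window": "touchpad"},
}

def resolve_control(device: str, action: str) -> str:
    """Return the physical control bound to an abstract action on a device."""
    return CONTROL_MAPS[device][action]

def adjust_quality(step_size: float, fps: float,
                   min_fps: float = 90.0, max_step: float = 4.0) -> float:
    """Coarsen the ray-marching step when framerate drops; refine when it recovers."""
    if fps < min_fps:
        return min(step_size * 1.5, max_step)  # fewer samples per ray
    return max(step_size / 1.5, 1.0)           # restore quality toward baseline

print(resolve_control("oculus", "density_window"))  # joystick
print(adjust_quality(1.0, 45.0))                    # 1.5
```

In a real OpenVR application the mapping layer would translate controller input events into these abstract actions each frame, so the rendering code never needs to know which headset is attached.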

    Generating Stimuli for Neuroscience Using PsychoPy

    PsychoPy is a software library written in Python, using OpenGL to generate very precise visual stimuli on standard personal computers. It is designed to allow the construction of as wide a variety of neuroscience experiments as possible, with the least effort. By writing scripts in standard Python syntax users can generate an enormous variety of visual and auditory stimuli and can interact with a wide range of external hardware (enabling its use in fMRI, EEG, MEG etc.). The structure of scripts is simple and intuitive. As a result, new experiments can be written very quickly, and trying to understand a previously written script is easy, even with minimal code comments. PsychoPy can also generate movies and image sequences to be used in demos or simulated neuroscience experiments. This paper describes the range of tools and stimuli that it provides and the environment in which experiments are conducted.

    A field portable ultrasound system for bovine tissue characterization

    Efforts by beef producers and beef packers to reduce carcass losses and to move the industry to a value-based marketing system have renewed their interest in the development of new electronic grading techniques for assessing beef carcass composition. A custom ultrasonic data acquisition system has been developed for the purpose of investigating the feasibility of beef tissue characterization. The unit developed is a compact, hand-held, six-channel, battery-operated data logger capable of sampling A-mode ultrasound signals at a rate of 15 megasamples per second. A custom epoxy-encapsulated ultrasound transducer array was developed to fit the inner curvature of the thoracic cavity on top of the intercostal muscle between the 12th and 13th ribs. Data collected by the ultrasound unit are temporarily stored in a handheld data terminal/computer for retrieval and analysis on a personal computer at a later time. Ultrasound samples from 39 carcasses were analyzed for backscatter energy content. The ultrasound records were partitioned into two groups based on the contact characteristics between the probe and the tissue. Preliminary fat estimation in the longissimus dorsi muscle has resulted in correlation coefficients of 0.81 and 0.82 on the two partitioned data sets. Preliminary marbling estimation in the longissimus dorsi muscle has resulted in correlation coefficients of 0.59 and 0.70 on the two partitioned groups.
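    The reported correlation coefficients (e.g., r = 0.81 between backscatter-derived features and fat content) are Pearson product-moment correlations. As a minimal sketch of that analysis step, the following computes Pearson's r from scratch; the data arrays below are synthetic stand-ins, not the thesis's measurements from the 39 carcasses.

```python
# Hedged illustration of the correlation analysis described in the abstract.
# pearson_r is a standard textbook formula; the sample data are synthetic.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

backscatter = [0.8, 1.1, 1.5, 2.0, 2.6]  # synthetic backscatter energy values
fat_pct     = [2.1, 2.9, 3.6, 4.8, 5.9]  # synthetic intramuscular fat percentages
print(round(pearson_r(backscatter, fat_pct), 3))
```

In practice the same computation is available as numpy.corrcoef; partitioning the records by probe-tissue contact quality, as the abstract describes, would simply mean computing r separately on each subset.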

    Attention-dependent modulation of neural activity in primary sensorimotor cortex

    Although motor tasks usually do not require much attention, there are findings that attention can alter neuronal activity not only in higher motor areas but also within the primary sensorimotor cortex. However, these findings are equivocal: attention effects were investigated in either the dominant or the nondominant hand; attention was operationalized either as concentration (i.e., attention directed to the motor task) or as distraction (i.e., attention directed away from the motor task); the complexity of motor tasks varied; and almost no left-handers were studied. Therefore, in this study, both right- and left-handers were investigated with an externally paced button press task in which subjects typed with the index finger of the dominant, nondominant, or both hands. We introduced four different attention levels: attention-modulation-free, distraction (counting backward), concentration on the moving finger, and divided concentration during bimanual movement. We found that distraction reduced neuronal activity in both contra- and ipsilateral primary sensorimotor cortex when the nondominant hand was tapping, in both handedness groups. At the same time, distraction activated the dorsal frontoparietal attention network and deactivated the ventral default network. We conclude that the difficulty and training status of both the motor and the cognitive task, as well as use of the dominant versus the nondominant hand, are crucial for the presence and magnitude of attention effects on sensorimotor cortex activity. In the case of a very simple button press task, attention modulation is seen for the nondominant hand under distraction, in both handedness groups.

    Interaction of numerosity and time in prefrontal and parietal cortex

    It has been proposed that numerical and temporal information are processed by partially overlapping magnitude systems. Interactions across different magnitude domains could occur both at the level of perception and decision-making. However, their neural correlates have been elusive. Here, using functional magnetic resonance imaging in humans, we show that the right intraparietal cortex (IPC) and inferior frontal gyrus (IFG) are jointly activated by duration and numerosity discrimination tasks, with a congruency effect in the right IFG. To determine whether the IPC and the IFG are involved in response conflict (or facilitation) or modulation of subjective passage of time by numerical information, we examined their functional roles using transcranial magnetic stimulation (TMS) and two different numerosity-time interaction tasks: duration discrimination and time reproduction tasks. Our results show that TMS of the right IFG impairs categorical duration discrimination, whereas that of the right IPC modulates the degree of influence of numerosity on time perception and impairs precise time estimation. These results indicate that the right IFG is specifically involved at the categorical decision stage, whereas bleeding of numerosity information on perception of time occurs within the IPC. Together, our findings suggest a two-stage model of numerosity-time interactions whereby the interaction at the perceptual level occurs within the parietal region and the interaction at categorical decisions takes place in the prefrontal cortex.