
    Flexibility of a Conditioned Response: Exploring the Limits of Attentional Capture By Fear

    Recent work from the attentional capture literature suggests that attention may be captured by stimuli with learned aversive value, even when these fear-conditioned stimuli (CSs) are task-irrelevant and not physically salient. However, relatively little work in the human fear-conditioning literature has investigated whether conditioned fear responses can flexibly transfer to a neutral associate of a CS. We examined, for the first time, whether fear-conditioned capture effects transfer to the associate of a CS. Twenty-seven participants encoded novel scene-object pairs. Following encoding, scenes were presented alone during a conditioning phase. Scenes co-terminated with shock 100% (CS100), 50% (CS50), or 0% (CS0) of the time, depending on the object with which they had been paired during encoding, while participants made shock-expectancy ratings. After conditioning, participants performed a visual search task in which the search display occasionally contained one of the encoded objects as a distractor. Eye movements were recorded. Results indicated that, during search, significantly more erroneous overt eye movements were made to the object associate of a CS than to baseline distractors, and target-directed saccades on trials containing a CS associate were slower than target-directed saccades on baseline trials. However, there were no differences in capture effects across the three CS conditions (which varied in threat-learning history), suggesting that fear-conditioned capture effects to a CS may not transfer to novel associates encountered for the first time in the episodic context of an experiment.

    Behavioral and Eye-Movement Correlates of Item-Specific and Relational Memory in Autism

    Recent work has challenged past findings that documented relational memory impairments in autism. Previous studies have often relied solely on explicit behavioral responses to assess relational memory integrity, but successful performance on behavioral tasks may rely on other cognitive abilities (e.g., executive functioning) that are impaired in some autistic individuals. Eye-tracking tasks do not require explicit behavioral responses, and, further, eye movements provide an indirect measure of memory. The current study examined whether memory-specific viewing patterns toward scenes differ between autistic and non-autistic individuals. Using a long-term memory paradigm that equated for complexity between item and relational memory tasks, participants studied a series of scenes. Following the initial study phase, scenes were re-presented, accompanied by an orienting question that directed participants to attend to either features of an item (i.e., in the item condition) or spatial relationships between items (i.e., in the relational condition) that might be subsequently modified during test. At test, participants viewed scenes that were unchanged (i.e., repeated from study), scenes that underwent an “item” modification (an exemplar switch) or a “relational” modification (a location switch), and scenes that had never been seen before. Eye movements were recorded throughout. There were no significant group differences in explicit recognition accuracy or in the expression of eye-movement-based memory effects when scenes were intact, modified, or new. However, differences in subjective memory confidence, in the associations between study- and test-related memory indices, and in the impact of external sample characteristics on retrieval-related eye movements suggest subtle dissociations in the quality of memory representations and/or in the relationships between subcomponents of memory in autism.

    Relational memory weakness in autism despite the use of a controlled encoding task

    Introduction: Recent work has challenged past findings that documented relational memory impairments in autism. Previous studies often relied solely on explicit behavioral responses to assess relational memory integrity, but successful performance on behavioral tasks may rely on other cognitive abilities (e.g., executive functioning) that are impacted in some autistic individuals. Eye-tracking tasks do not require explicit behavioral responses, and, further, eye movements provide an indirect measure of memory. The current study examined whether memory-specific viewing patterns toward scenes differ between autistic and non-autistic individuals.
    Methods: Using a long-term memory paradigm that equated for complexity between item and relational memory tasks, participants studied a series of scenes. Following the initial study phase, scenes were re-presented, accompanied by an orienting question that directed participants to attend to either features of an item (i.e., in the item condition) or spatial relationships between items (i.e., in the relational condition) that might be subsequently modified during test. At test, participants viewed scenes that were unchanged (i.e., repeated from study), scenes that underwent an “item” modification (an exemplar switch) or a “relational” modification (a location switch), and scenes that had not been presented before. Eye movements were recorded throughout.
    Results: During study, there were no significant group differences in viewing directed to regions of scenes that might be manipulated at test, suggesting comparable processing of scene details during encoding. However, there was a group difference in explicit recognition accuracy for scenes that underwent a relational change. Marginal group differences in the expression of memory-based viewing effects during test for relational scenes were consistent with this behavioral outcome, particularly when analyses were limited to scenes recognized correctly with high confidence. Group differences were also evident in correlational analyses that examined the association between study-phase viewing and recognition accuracy and between performance on the Picture Sequence Memory Test and recognition accuracy.
    Discussion: Together, our findings suggest differences in the integrity of relational memory representations and/or in the relationships between subcomponents of memory in autism.

    Neural correlates of memory encoding and recognition for own-race and other-race faces in an associative-memory task

    The ability to recognize the faces of family members, friends, and acquaintances plays an important role in our daily interactions. The other-race effect is the reduced ability to recognize other-race faces compared to own-race faces. Previous studies showed different patterns of event-related potentials (ERPs) associated with recollection and familiarity during memory encoding (i.e., Dm) and recognition (i.e., the parietal old/new effect) for own-race and other-race faces in a subjective-recollection task (remember-know judgments). The present study investigated the same neural correlates of the other-race effect in an associative-memory task, in which Caucasian and East Asian participants learned and recognized own-race and other-race faces along with background colors. Participants made more false alarms for other-race faces, indicating lower memory performance. During the study phase, subsequently recognized other-race faces (with and without correct background information) elicited more positive mean amplitudes than own-race faces, suggesting increased neural activation during encoding of other-race faces. During the test phase, recollection-related old/new effects dissociated between own-race and other-race faces: old/new effects were significant only for own-race but not for other-race faces, indicating that recognition of own-race faces alone was supported by recollection and led to more detailed memory retrieval. Most of these results replicated previous studies that used a subjective-recollection task. Our study also showed that the increased demand on memory encoding during an associative-memory task led to Dm patterns that indicated similarly deep memory encoding for own-race and other-race faces.

    Control of Memory Retrieval Alters Memory-Based Eye Movements

    Past work has shown that eye movements are affected by long-term memory across different tasks and instructional manipulations. In the current study, we tested whether these memory-based eye movements persist when memory retrieval is under intentional control. Participants encoded multiple scenes, each with six objects (three faces, three tools). Next, they completed a memory-regulation and visual search task while undergoing eye tracking. Here, scene cues were presented and participants either retrieved the encoded associate, suppressed it, or substituted it with a specific object from the other encoded category. Following a delay, a search display consisting of six dots intermixed with the six encoded objects was presented. Participants’ task was to fixate the one remaining dot after five had disappeared. Incidental viewing of the objects was of interest. Results revealed that performance in a final recognition phase was impaired for suppressed pairs, but only when the associate was a tool. During the search task, incidental viewing of the associate was lower when participants attempted to control retrieval, whereas one object from the non-associate category was viewed the most in the substitute condition. Additionally, viewing patterns in the search phase were related to final recognition performance, but the direction of this association differed between conditions. Overall, these results suggest that eye movements are attracted to information retrieved from long-term memory and held active (the associate in the retrieve condition, or an object from the other category in the substitute condition). Furthermore, the level of viewing may index the strength of the representation of retrieved information.