55 research outputs found

    Green and simple: Effective eco-labelling for busy consumers. ESRI Research Bulletin 2020, 23 September 2020.

    An experiment shows that standardised colour-coded labels indicating the environmental pros and cons of products are likely to be effective in influencing the choices of busy consumers buying day-to-day groceries.

    Semantic and motor processes in infant perception of object-directed and tool-mediated action

    Actions are the translation of internal states such as intentions into overt gestures and goals. Actions are communicative, because by observing another’s overt behaviour we can infer that person’s internal states. Infants’ abilities to execute actions are limited by developing motor processes. Their capacity to make inferences from others’ behaviour is hindered by their inability to engage in perspective-taking and other advanced social cognitive processes. Nonetheless, extensive evidence shows that infants perceive actions as goal-directed sequences that are meaningful, and that they respond to observed actions with motor resonance. The aims of this thesis were to determine how semantic and motor processing of observed action develop in infancy, whether these processes develop separately or in conjunction with one another, and how infants’ abilities to execute and plan actions affect their ability to detect semantic and motor differences between actions. These aims were achieved by studying how infants processed grasping actions that varied on different dimensions. In Chapter 1, the literature on infant action perception from social, motor and semantic perspectives is reviewed and the objectives of the thesis are described. In Chapter 2, the ability of 16-month-olds to discriminate between the uses of a novel tool when motor simulation processes are uninformative was investigated. In Chapter 3, the attentional and semantic neural correlates of processing of observed grasps were measured in 9-month-olds, 11.5-month-olds, and adults. In Chapter 4, motor activation in 10-month-old infants in response to motorically similar but semantically distinct grasping actions was related to infants’ action planning skills. The results of these experiments show that there is a complex interplay between motor and semantic constituents of the action processing system, and that this interplay is developmentally dynamic. The implications of the results for understanding action processing in development are considered in Chapter 5.

    Dissociating associative and motor aspects of action understanding: processing of dual-ended tools by 16-month-old infants.

    When learning about the functions of novel tools, it is possible that infants may use associative and motoric processes. This study investigated the ability of 16-month-olds to associate the orientation in which an actor held a dual-function tool with the actor's prior demonstrated interest in one of two target objects, and their use of the tool on that target. The actors' hand posture did not differ between conditions. The infants were shown stimuli in which two actors acted upon novel objects with a novel tool, each actor employing a different function of the tool. Using an eye-tracker, infants' looking time at images depicting the actors holding the tool in an orientation congruent or incongruent with the actor's goal was measured. Infants preferred to look at the specific part of the tool that was incongruent with the actor's goal. Results show that the association formed involves the specific part of the tool, the actor, and the object the actor acted upon, but not the orientation of the tool. The capacity to form such associations is demonstrated in this study in the absence of motor information that would allow 16-month-olds to generate a specific representation of how the tool should be held for each action via mirroring processes.

    Event-related potentials discriminate familiar and unusual goal outcomes in 5-month-olds and adults

    Previous event-related potential (ERP) work has indicated that the neural processing of action sequences develops with age. While adults and 9-month-olds use a semantic processing system, perceiving actions activates attentional processes in 7-month-olds. However, presenting a sequence of action context, action execution and action conclusion could challenge infants' developing working memory capacities. A shortened stimulus presentation of a highly familiar action, presenting only the action conclusion of an eating action, may therefore enable semantic processing in even younger infants. The present study examined neural correlates of the processing of expected and unexpected action conclusions in adults and infants at 5 months of age. We analyzed ERP components reflecting semantic processing (N400), attentional processes (negative central in infants; P1, N2 in adults) and the infant positive slow wave (PSW), a marker of familiarity. In infants, the PSW was enhanced on left frontal channels in response to unexpected as compared to expected outcomes. We did not find differences between conditions in ERP waves reflecting semantic processing or overt attentional mechanisms. In adults, in addition to differences in attentional processes on the P1 and the N2, an N400 occurred only in response to the unexpected action outcome, suggesting that semantic processing takes place even without a complete action sequence being present. Results indicate that infants are already sensitive to differences in action outcomes, although the underlying mechanism, which is based on familiarity, is relatively rudimentary when contrasted with adults. This finding points toward different cognitive mechanisms being involved in action processing during development.

    Emergence of the cortical encoding of phonetic features in the first year of life.

    Even prior to producing their first words, infants are developing a sophisticated speech processing system, with robust word recognition present by 4-6 months of age. These emergent linguistic skills, observed with behavioural investigations, are likely to rely on increasingly sophisticated neural underpinnings. The infant brain is known to robustly track the speech envelope; however, previous cortical tracking studies were unable to demonstrate the presence of phonetic feature encoding. Here we utilise temporal response functions computed from electrophysiological responses to nursery rhymes to investigate the cortical encoding of phonetic features in a longitudinal cohort of infants when aged 4, 7 and 11 months, as well as adults. The analyses reveal an increasingly detailed and acoustically invariant phonetic encoding emerging over the first year of life, providing neurophysiological evidence that the pre-verbal human cortex learns phonetic categories. By contrast, we found no credible evidence for age-related increases in cortical tracking of the acoustic spectrogram.

    My Hand or Yours? Markedly Different Sensitivity to Egocentric and Allocentric Views in the Hand Laterality Task

    In the hand laterality task participants judge the handedness of visually presented stimuli (images of hands shown in a variety of postures and views) and indicate whether they perceive a right or left hand. The task engages kinaesthetic and sensorimotor processes and is considered a standard example of motor imagery. However, in this study we find that while motor imagery holds across egocentric views of the stimuli (where the hands are likely to be one's own), it does not appear to hold across allocentric views (where the hands are likely to be another person's). First, we find that psychophysical sensitivity, d', is clearly demarcated between egocentric and allocentric views, being high for the former and low for the latter. Secondly, using mixed effects methods to analyse the chronometric data, we find high positive correlation between response times across egocentric views, suggesting a common use of motor imagery across these views. Correlations are, however, considerably lower between egocentric and allocentric views, suggesting a switch from motor imagery across these perspectives. We relate these findings to research showing that the extrastriate body area discriminates egocentric (‘self’) and allocentric (‘other’) views of the human body and of body parts, including hands.

    Decoding speech information from EEG data with 4-, 7- and 11-month-old infants: Using convolutional neural network, mutual information-based and backward linear models.

    Background: Computational models that successfully decode neural activity into speech are increasing in the adult literature, with convolutional neural networks (CNNs), backward linear models, and mutual information (MI) models all being applied to neural data in relation to speech input. This is not the case in the infant literature. New method: Three different computational models, two novel for infants, were applied to decode low-frequency speech envelope information. Previously-employed backward linear models were compared to novel CNN and MI-based models. Fifty infants provided EEG recordings when aged 4, 7, and 11 months, while listening passively to natural speech (sung or chanted nursery rhymes) presented by video with a female singer. Results: Each model computed speech information for these nursery rhymes in two different low-frequency bands, delta and theta, thought to provide different types of linguistic information. All three models demonstrated significant levels of performance for delta-band neural activity from 4 months of age, with two of three models also showing significant performance for theta-band activity. All models also demonstrated higher accuracy for the delta-band neural responses. None of the models showed developmental (age-related) effects. Comparisons with existing methods: The data demonstrate that the choice of algorithm used to decode speech envelope information from neural activity in the infant brain determines the developmental conclusions that can be drawn. Conclusions: The modelling shows that better understanding of the strengths and weaknesses of each modelling approach is fundamental to improving our understanding of how the human brain builds a language system.
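As a rough illustration of what a backward linear model does in this kind of analysis, the sketch below maps time-lagged multichannel EEG back onto a speech envelope via closed-form ridge regression. This is a generic sketch of the technique, not the study's code: the function names, data shapes, lag range and regularisation value are all invented for the example.

```python
import numpy as np

def lagged_design(eeg, lags):
    """Stack time-lagged copies of each EEG channel into one design matrix.

    eeg  : array of shape (n_times, n_channels)
    lags : non-negative sample delays by which the neural response is
           assumed to follow the stimulus; each lag shifts the EEG
           earlier so that eeg[t + lag] is aligned with envelope[t].
    """
    n_times, n_ch = eeg.shape
    X = np.zeros((n_times, n_ch * len(lags)))
    for i, lag in enumerate(lags):
        shifted = np.roll(eeg, -lag, axis=0)
        if lag > 0:
            shifted[-lag:] = 0.0  # zero-pad instead of wrapping around
        X[:, i * n_ch:(i + 1) * n_ch] = shifted
    return X

def fit_backward_model(eeg, envelope, lags, alpha=1.0):
    """Ridge regression mapping multichannel EEG back to the envelope."""
    X = lagged_design(eeg, lags)
    # Closed-form ridge solution: w = (X'X + alpha*I)^-1 X'y
    XtX = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ envelope)

def decode_envelope(eeg, weights, lags):
    """Reconstruct the speech envelope from EEG with fitted weights."""
    return lagged_design(eeg, weights_shape_check := lags) @ weights if False else lagged_design(eeg, lags) @ weights
```

In practice such models are fitted on training data and scored on held-out data, typically by correlating the reconstructed envelope with the true one; a CNN or MI-based decoder replaces the linear map with a learned nonlinear one or a direct information estimate, respectively.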

    Appraisal of space words and allocation of emotion words in bodily space

    The body-specificity hypothesis (BSH) predicts that right-handers and left-handers allocate positive and negative concepts differently on the horizontal plane: while left-handers allocate negative concepts to the right-hand side of their bodily space, right-handers allocate such concepts to the left-hand side. Similar research shows that people in general tend to allocate positive and negative concepts to upper and lower areas, respectively, on the vertical plane. Further research shows a higher salience of the vertical plane over the horizontal plane in the performance of sensorimotor tasks. The aim of this paper is to examine whether the vertical plane dominates the horizontal plane not only at a sensorimotor level but also at a conceptual level. In Experiment 1, participants from diverse linguistic backgrounds were asked to rate the words “up”, “down”, “left”, and “right”. In Experiment 2, right-handed participants from two linguistic backgrounds were asked to allocate emotion words within a square grid divided into four boxes of equal area. Results suggest that the vertical plane is more salient than the horizontal plane in the allocation of emotion words: positively valenced words were placed in upper locations whereas negatively valenced words were placed in lower locations. Together, the results lend support to the BSH while also suggesting a higher saliency of the vertical plane over the horizontal plane in the allocation of valenced words.

    Fernando Marmolejo-Ramos, María Rosa Elosúa, Yuki Yamada, Nicholas Francis Hamm and Kimihiro Noguchi

    Scripts for data processing


    Neural representations of grasp congruence
