15 research outputs found

    Neuromatch Academy: Teaching Computational Neuroscience with Global Accessibility

    Neuromatch Academy (NMA) designed and ran a fully online, 3-week Computational Neuroscience Summer School for 1757 students with 191 teaching assistants (TAs) working in virtual inverted (or flipped) classrooms and on small group projects. Fourteen languages, active community management, and low cost allowed for an unprecedented level of inclusivity and universal accessibility.

    Neuromatch Academy: a 3-week, online summer school in computational neuroscience


    Apparent speed increases at low luminance


    Effect of perceived interpersonal closeness on the joint Simon effect in adolescents and adults

    Abstract: Here, we explored the role of perceived interpersonal closeness in joint action using the joint Simon task in adolescents and adults. In a two-choice reaction time task, spatially assigned responses to non-spatial stimulus features are faster when the stimulus and response are in congruent locations than when they are not. This phenomenon, called the Simon effect, is absent or strongly attenuated when a participant responds to only one of the stimuli. However, the effect reappears when two participants carry out the same go/no-go tasks cooperatively. This re-emergence of the Simon effect in joint action is called the joint Simon effect (JSE). In this study, we first replicated the standard and joint Simon effects in adolescents (n = 43) and adults (n = 39), with similar magnitudes of the effects in the two age groups. The magnitude of the JSE was positively correlated with the level of closeness as measured by the Inclusion of Other in the Self scale. This correlation was not significantly different in adolescents (n = 73) compared to adults (n = 71). Our findings show that joint action is sensitive to social factors such as interpersonal closeness, and that the underlying mechanisms are already mature by adolescence.

    THINGS-data: A multimodal collection of large-scale datasets for investigating object representations in brain and behavior

    Understanding object representations requires a broad, comprehensive sampling of the objects in our visual world with dense measurements of brain activity and behavior. Here we present THINGS-data, a multimodal collection of large-scale datasets comprising functional MRI, magnetoencephalographic recordings, and 4.70 million similarity judgments in response to thousands of photographic images for up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing for testing countless hypotheses at scale while assessing the reproducibility of previous findings. Beyond the unique insights promised by each individual dataset, the multimodality of THINGS-data allows combining datasets for a much broader view into object processing than previously possible. Our analyses demonstrate the high quality of the datasets and provide five examples of hypothesis-driven and data-driven applications. THINGS-data constitutes the core release of the THINGS initiative (https://things-initiative.org) for bridging the gap between disciplines and advancing cognitive neuroscience.