
    Data and metadata acquisition workflows as a basis for collaborative science

    Globalization and digitization offer new possibilities for scientific collaborations and data sharing. To work together fruitfully in a distributed, international collaboration, it is important to have a common, detailed understanding of the shared data. This includes information describing the details of the experimental setup, the conditions under which the experiment was performed, and potential post-processing and evaluation steps. This information is of essential interest for all collaboration partners, as it forms the basis for any analysis and interpretation the various partners perform on the data. The implementation of such a collaborative workflow must consider various aspects, e.g. a common terminology and a generic data representation.

    Here, we present a solution implementing an exemplary data management workflow for a complex electrophysiology experiment involving two laboratories and a collaborative team of more than 10 scientists working on the data. The workflow generates complete, extensively annotated, self-contained, sharable and ready-to-use datasets. For the raw data this implies the aggregation of data from multiple original recording files into a single, generic format suited to capture the content of all data files as well as metadata (NIX, [1]). In addition, quality checks are performed for each dataset and their results are associated with the original data. The datasets come with an extensive amount of metadata, which is mostly tracked in a series of automatically generated, structured CSV files. Using the odmltables tool [2], these files are transformed and enriched to create a hierarchically structured metadata collection in the odML format [3], which is then incorporated into the NIX file.

    The workflow is designed in a modular fashion. This supports the flexible adjustment and extension of the workflow and ensures that individual modules can be reused in the context of other experiments. The workflow is implemented using Snakemake [4], a workflow management system for reproducible and scalable data analyses. Using such a workflow management system has the advantage that most of the pre-processing and restructuring of the data can be performed in a parallelized and automated manner, triggered by a change in the underlying source files, e.g. the generation or update of the underlying experimental dataset, the metadata, or the post-processing. The output files of the workflow can be adjusted to the needs of different interest groups within the collaboration, while the consistency of the data is guaranteed by the automation of the generating process. As an outlook, we propose how to combine our workflow with a version control system capable of handling large data (e.g. GIN [5]).

    The benefit of using a standardized workflow for the preparation of the experimental data can be exploited when subsequent steps use standardized, publicly available tools for data representation and analysis, e.g. Neo [6] and the Electrophysiology Analysis Toolkit (Elephant, [7]), respectively. For detailed information about data quality checks, pre-processing steps, and the analysis of behaviour, see the abstracts by Essink et al. and De Haan et al.

    References:
    [1] NIX, https://github.com/G-Node/nix
    [2] odmltables, https://pypi.python.org/pypi/odmltables/ & https://github.com/INM-6/python-odmltables
    [3] Grewe, J., Wachtler, T., & Benda, J. (2011). A Bottom-up Approach to Data Annotation in Neurophysiology. Frontiers in Neuroinformatics, 5, 16.
    [4] Snakemake, https://snakemake.readthedocs.io/en/stable/
    [5] GIN, https://web.gin.g-node.org/
    [6] Garcia S., et al. (2014). Neo: an object model for handling electrophysiology data in multiple formats. Frontiers in Neuroinformatics, 8:10. doi:10.3389/fninf.2014.00010
    [7] Elephant, http://neuralensemble.org/elephant
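    The abstract describes, but does not include, the workflow code. The following is a minimal, hypothetical sketch of how a single Snakemake rule in such a pipeline could look; the file paths, the wildcard name {session}, and the conversion script convert_to_nix.py are illustrative assumptions, not taken from the actual workflow.

        # Hypothetical Snakemake rule: aggregate one session's raw recordings and its
        # automatically generated CSV metadata tables into a single NIX file.
        rule convert_to_nix:
            input:
                raw="raw/{session}.dat",                  # original recording file (illustrative name/format)
                metadata="metadata/{session}_tables.csv"  # CSV metadata to be converted to odML
            output:
                "nix/{session}.nix"                       # self-contained, shareable dataset
            script:
                "scripts/convert_to_nix.py"               # reads snakemake.input / snakemake.output

    Because Snakemake tracks the dependencies between inputs and outputs, re-running the pipeline after a change in the raw data or the metadata tables re-executes only the affected rules, which is how the consistency of the generated files can be guaranteed.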

    A Behavioral Receptive Field for Ocular Following in Monkeys: Spatial Summation and Its Spatial Frequency Tuning

    In human and nonhuman primates, reflexive tracking eye movements can be initiated at very short latency in response to a rapid shift of the image. Previous studies in humans have shown that only a part of the central visual field is optimal for driving ocular following responses. Herein, we have investigated spatial summation of motion information across a wide range of spatial frequencies and speeds of drifting gratings by recording short-latency ocular following responses in macaque monkeys. We show that the optimal stimulus size for driving ocular responses covers a small (diameter, ~20°) central part of the visual field that shrinks with higher spatial frequency. This signature of linear motion integration remains invariant with speed and temporal frequency. For low and medium spatial frequencies, we found a strong suppressive influence from surround motion, evidenced by a decrease of response amplitude for stimulus sizes larger than optimal. Such suppression disappears with gratings at high spatial frequencies. The contribution of peripheral motion was investigated by presenting grating annuli of increasing eccentricity. We observed an exponential decay of response amplitude with grating eccentricity, the decrease being faster for higher spatial frequencies. Weaker surround suppression can thus be explained by sparser eccentric inputs at high frequencies. A difference-of-Gaussians model best renders the antagonistic contributions of peripheral and central motions. Its best-fit parameters coincide with several well-known spatial properties of area MT neuronal populations. These results describe the mechanism by which central motion information is automatically integrated in a context-dependent manner to drive ocular responses.
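    The abstract names but does not give the model equation. As a hedged illustration only, one common parameterization of a difference-of-Gaussians area-summation model (one of several used in the literature; the study's exact form and fitted parameters may differ) integrates a narrow excitatory and a broader suppressive 2D Gaussian over a stimulus disc of diameter d, giving a response

        R(d) = K_c \left(1 - e^{-(d/2)^2/\sigma_c^2}\right) - K_s \left(1 - e^{-(d/2)^2/\sigma_s^2}\right),

    where K_c and \sigma_c are the gain and spatial extent of the central (driving) mechanism and K_s and \sigma_s those of the suppressive surround. With \sigma_s > \sigma_c, the response first grows with stimulus size, peaks at an optimal diameter, and then declines as the more extended surround term keeps accumulating, reproducing the surround suppression described above.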