
    Online detection of error-related potentials boosts the performance of mental typewriters

    Background: Increasing the communication speed of brain-computer interfaces (BCIs) is a major aim of current BCI research. The idea of automatically detecting error-related potentials (ErrPs) in order to veto erroneous decisions of a BCI has existed for more than a decade, but the approach has so far received little investigation in online mode. Methods: In our study with eleven participants, an ErrP detection mechanism was implemented in an electroencephalography (EEG) based gaze-independent visual speller. Results: Single-trial ErrPs were detected with a mean accuracy of 89.1% (AUC 0.90). The spelling speed was increased on average by 49.0% using ErrP detection. The improvement in spelling speed due to error detection was largest for participants with low spelling accuracy. Conclusion: The performance of BCIs can be increased by using an automatic error detection mechanism. The benefit for patients with motor disorders is potentially high, since they often have rather low spelling accuracies compared to healthy people.
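
    A hedged toy calculation (not the paper's own utility analysis) illustrates why an ErrP veto helps most at low accuracy: assume each undetected error must be undone with one backspace selection, backspaces always succeed, and a vetoed selection is simply retried. All parameter names below are illustrative.

        def selections_per_symbol(p, sensitivity=0.0, false_alarms=0.0):
            # Expected selections to finalize one correct symbol.
            #   p            : probability a selection is correct
            #   sensitivity  : fraction of erroneous selections vetoed
            #   false_alarms : fraction of correct selections wrongly vetoed
            # Toy model: a vetoed selection is retried; an undetected
            # error costs one extra backspace selection before retrying.
            return (1 + (1 - p) * (1 - sensitivity)) / (p * (1 - false_alarms))

        # without vs. with an ErrP detector (illustrative numbers)
        print(selections_per_symbol(0.8))             # ~1.50
        print(selections_per_symbol(0.8, 0.9, 0.1))   # ~1.42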

    Shortness coefficient of cyclically 4-edge-connected cubic graphs

    Grünbaum and Malkevitch proved that the shortness coefficient of cyclically 4-edge-connected cubic planar graphs is at most 76/77. Recently, this was improved to 359/366 (< 52/53) and the question was raised whether this can be strengthened to 41/42, a natural bound inferred from one of the Faulkner-Younger graphs. We prove that the shortness coefficient of cyclically 4-edge-connected cubic planar graphs is at most 37/38 and that we also get the same value for cyclically 4-edge-connected cubic graphs of genus g for any prescribed genus g ≥ 0. We also show that 45/46 is an upper bound for the shortness coefficient of cyclically 4-edge-connected cubic graphs of genus g with face lengths bounded above by some constant larger than 22 for any prescribed g ≥ 0.
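
    For readers unfamiliar with the notion, the shortness coefficient of a graph class measures how far longest cycles can fall short of Hamiltonicity; a sketch of the standard definition:

        % shortness coefficient of a class \mathcal{G} of graphs,
        % where h(G) is the circumference (length of a longest cycle)
        % and v(G) the number of vertices of G
        \rho(\mathcal{G}) \;=\; \liminf_{G \in \mathcal{G}} \frac{h(G)}{v(G)}

    An upper bound of 37/38 thus asserts the existence of an infinite family in the class whose longest cycles asymptotically miss at least a 1/38 fraction of the vertices.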

    Brain-computer interfacing using modulations of alpha activity induced by covert shifts of attention

    Background: Visual brain-computer interfaces (BCIs) often yield high performance only when targets are fixated with the eyes. Furthermore, many paradigms use intense visual stimulation, which can be irritating, especially in long BCI sessions. However, BCIs can more directly tap the neural processes underlying visual attention. Covert shifts of visual attention induce changes in oscillatory alpha activity in posterior cortex, even in the absence of visual stimulation. The aim was to investigate whether different pairs of directions of attention shifts can be reliably differentiated based on the electroencephalogram. To this end, healthy participants (N = 8) had to strictly fixate a central dot and covertly shift visual attention to one out of six cued directions. Results: Covert attention shifts induced a prolonged alpha synchronization over posterior electrode sites (PO and O electrodes). Spectral changes had specific topographies, so that different pairs of directions could be differentiated. There was substantial variation across participants with respect to the direction pairs that could be reliably classified. Mean accuracy for the best-classifiable pair amounted to 74.6%. Furthermore, an alpha power index obtained during a relaxation measurement proved to be predictive of peak BCI performance (r = .66). Conclusions: Results confirm posterior alpha power modulations as a viable input modality for gaze-independent EEG-based BCIs. The pair of directions yielding optimal performance varies across participants. Consequently, participants with low control for standard directions such as left-right might resort to other pairs of directions, including top and bottom. Additionally, a simple alpha index was shown to predict prospective BCI performance.
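
    A minimal sketch of the kind of pipeline such a BCI implies (not the authors' implementation; data shapes, channel count, and sampling rate are assumptions): extract log alpha-band power per channel and classify a pair of attention directions with a linear classifier.

        import numpy as np
        from scipy.signal import welch
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        def alpha_log_power(epochs, fs, band=(8.0, 12.0)):
            # log alpha-band power per channel via Welch's method;
            # epochs has shape (n_trials, n_channels, n_samples)
            freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return np.log(psd[..., mask].mean(axis=-1))

        # hypothetical data: 100 trials, 8 posterior channels, 2 s at 250 Hz
        fs = 250
        X_epochs = np.random.randn(100, 8, 2 * fs)
        y = np.random.randint(0, 2, 100)  # two covert attention directions
        X = alpha_log_power(X_epochs, fs)
        print(cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean())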

    Survey on Unsupervised Domain Adaptation for Semantic Segmentation for Visual Perception in Automated Driving

    Deep neural networks (DNNs) have proven their capabilities in the past years and play a significant role in environment perception for the challenging application of automated driving. They are employed for tasks such as detection, semantic segmentation, and sensor fusion. Despite tremendous research efforts, several issues still need to be addressed that limit the applicability of DNNs in automated driving. The poor generalization of DNNs to unseen domains is a major problem on the way to safe, large-scale application, because manual annotation of new domains is costly, particularly for semantic segmentation. For this reason, methods are required to adapt DNNs to new domains without labeling effort. This task is termed unsupervised domain adaptation (UDA). While several different domain shifts challenge DNNs, the shift between synthetic and real data is of particular importance for automated driving, as it allows the use of simulation environments for DNN training. We present an overview of the current state of the art in this research field. We categorize and explain the different approaches for UDA. The number of considered publications is larger than in any other survey on this topic. We also go far beyond a description of the UDA state of the art, as we present a quantitative comparison of approaches and point out the latest trends in this field. We conduct a critical analysis of the state of the art and highlight promising future research directions. With this survey, we aim to facilitate UDA research further and encourage scientists to explore novel research directions.
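
    As a flavor of the methods such surveys cover, one simple UDA ingredient is entropy minimization on unlabeled target images; a hedged PyTorch sketch (the model, the loss weight lam, and the tensors are illustrative, and this is only one technique among many):

        import torch
        import torch.nn.functional as F

        def entropy_loss(logits):
            # mean per-pixel prediction entropy over a batch of
            # segmentation logits with shape (N, classes, H, W)
            p = F.softmax(logits, dim=1)
            return -(p * torch.log(p + 1e-8)).sum(dim=1).mean()

        def uda_step(model, src_img, src_lbl, tgt_img, lam=0.01):
            # supervised loss on labeled source data plus entropy
            # minimization on unlabeled target data
            loss_src = F.cross_entropy(model(src_img), src_lbl)
            loss_tgt = lam * entropy_loss(model(tgt_img))
            return loss_src + loss_tgt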

    Estimation of interdomain flexibility of N-terminus of factor H using residual dipolar couplings

    Characterization of segmental flexibility is needed to understand the biological mechanisms of the very large category of functionally diverse proteins, exemplified by the regulators of complement activation, that consist of numerous compact modules or domains linked by short, potentially flexible, sequences of amino acid residues. The use of NMR-derived residual dipolar couplings (RDCs), in magnetically aligned media, to evaluate interdomain motion is established, but only for two-domain proteins. We focused on the three N-terminal domains (called CCPs or SCRs) of the important complement regulator, human factor H (i.e. FH1-3). These domains cooperate to facilitate cleavage of the key complement activation-specific protein fragment, C3b, forming iC3b that no longer participates in the complement cascade. We refined a three-dimensional solution structure of recombinant FH1-3 based on nuclear Overhauser effects and RDCs. We then employed a rudimentary series of RDC datasets, collected in media containing magnetically aligned bicelles (disk-like particles formed from phospholipids) under three different conditions, to estimate interdomain motions. This circumvents a requirement of previous approaches for the technically difficult collection of five independent RDC datasets. More than 80% of conformers of this predominantly extended three-domain molecule exhibit flexions of less than 40°. Such segmental flexibility (together with the local dynamics of the hypervariable loop within domain 3) could facilitate recognition of C3b via initial anchoring and eventual reorganization of modules to the conformation captured in the previously solved crystal structure of a C3b:FH1-4 complex.
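
    For context, the measured quantity has the standard angular form (the general RDC equation, not an expression specific to this study):

        % residual dipolar coupling of an internuclear vector with polar
        % angles (\theta, \phi) in the alignment frame; D_a is the magnitude
        % of the alignment tensor and R its rhombicity
        D(\theta, \phi) = D_a \left[ (3\cos^2\theta - 1)
                          + \tfrac{3}{2}\, R \sin^2\theta \cos 2\phi \right]

    Interdomain motion averages these couplings differently in each domain, which is what makes RDCs sensitive to segmental flexibility.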

    The TeaComposition Initiative: Unleashing the power of international collaboration to understand litter decomposition

    Collected harmonized data on global litter decomposition are of great relevance for scientists, policymakers, and the education of the next generation of researchers and environmental managers. Here we describe the TeaComposition initiative, a global and open research collaborative network to study organic matter decomposition in a standardized way, allowing comparison of decomposition rate and carbon turnover across global and regional gradients of ecosystems, climate, soils, etc. The TeaComposition initiative today involves 570 terrestrial and 300 aquatic ecosystems from nine biomes worldwide. Further, we describe how to get involved in the TeaComposition initiative by (a) implementing the standard protocol within your study site, (b) joining task forces in data analyses, syntheses, and modelling efforts, (c) using collected data and samples for further analyses through joint projects, (d) using collected data for graduate seminars, and (e) strengthening synergies between biogeochemical research and a wide range of stakeholders. These collaborative efforts within and emerging from the TeaComposition initiative will thereby leverage our understanding of litter decomposition at the global scale and strengthen the global collaborations essential for addressing grand scientific challenges in a rapidly changing world.

    This work was performed within the TeaComposition and TeaComposition H2O initiatives, carried out by 290 institutions worldwide. We thank UNILEVER for sponsoring the Lipton tea bags. The initiative is supported by the following grants: ILTER Initiative Grants, ClimMani Short-Term Scientific Missions Grants, INTERACT Remote Transnational Access, and an Alfred Deakin Postdoctoral Research Fellowship. Nico Eisenhauer gratefully acknowledges the support of iDiv, funded by the German Research Foundation (DFG FZT 118, 202548816). ST-T was supported by the ARC DE210101029 and Deakin University's ADPR Fellowship. Fernando T. Maestre acknowledges support from the European Research Council (ERC Grant agreement 647038 [BIODESERT]) and Generalitat Valenciana (CIDEGENT/2018/041).
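
    As an illustration of the kind of comparison the initiative enables, here is a minimal sketch of a first-order decomposition-rate estimate from litter mass loss; the masses and incubation time are hypothetical, and the initiative's own protocols define the actual calculation.

        import numpy as np

        def decomposition_rate(m0, mt, t_days):
            # first-order decomposition constant k (per day) from initial
            # and remaining litter mass, assuming m(t) = m0 * exp(-k * t)
            return -np.log(mt / m0) / t_days

        # hypothetical tea-bag litter: 2.0 g buried, 1.4 g left after 90 days
        print(decomposition_rate(2.0, 1.4, 90))  # ~0.004 per day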

    Calculation of partial isotope incorporation into peptides measured by mass spectrometry

    Background: The stable isotope probing (SIP) technique was developed to link the function, structure, and activity of microbial cultures metabolizing carbon- and nitrogen-containing substrates to synthesize their biomass. Currently available methods are restricted solely to the estimation of fully saturated heavy stable isotope incorporation, and convenient methods with sufficient accuracy are still missing. However, in order to track carbon fluxes in microbial communities, new methods are required that allow the calculation of partial incorporation into biomolecules. Results: In this study, we use the characteristics of the so-called 'half decimal place rule' (HDPR) in order to accurately calculate the partial ¹³C incorporation in peptides from enzymatically digested proteins. Due to the clade-crossing universality of proteins within bacteria, any available high-resolution mass spectrometry generated dataset consisting of tryptically digested peptides can be used as reference. We used a freely available peptide mass dataset from Mycobacterium tuberculosis consisting of 315,579 entries. From this, the error of estimated versus known heavy stable isotope incorporation from an increasing number of randomly drawn peptide sub-samples (100 times each; no repetition) was calculated. To acquire an estimated incorporation error of less than 5 atom %, about 100 peptide masses were needed. Finally, to test the general applicability of our method, peptide masses of tryptically digested proteins from Pseudomonas putida ML2 grown on labeled substrate of various known concentrations were used, and ¹³C isotopic incorporation was successfully predicted. An easy-to-use script [1] was further developed to guide users through the calculation procedure for their own data series. Conclusion: Our method is valuable for estimating ¹³C incorporation into peptides/proteins accurately and with high sensitivity. Generally, our method holds promise for wider applications in qualitative and especially quantitative proteomics.
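
    The underlying mass arithmetic can be sketched as follows; note this is only the basic relation between mass shift and ¹³C content, not the HDPR algorithm itself, and the example masses are hypothetical.

        DELTA_13C = 1.003355  # mass difference between 13C and 12C, in Da

        def incorporation_atom_percent(mass_labeled, mass_unlabeled, n_carbons):
            # atom % 13C from the monoisotopic mass shift of a peptide
            # with n_carbons carbon atoms (toy relation, not the HDPR method)
            extra_c13 = round((mass_labeled - mass_unlabeled) / DELTA_13C)
            return 100.0 * extra_c13 / n_carbons

        # hypothetical peptide with 50 carbons, ~20 of them labeled
        print(incorporation_atom_percent(1120.63, 1100.56, 50))  # ~40.0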

    On the reproducibility of extrusion-based bioprinting: round robin study on standardization in the field

    The outcome of three-dimensional (3D) bioprinting depends heavily, among other factors, on the interaction between the developed bioink, the printing process, and the printing equipment. However, if this interplay is ensured, bioprinting promises unmatched possibilities in the health care area. To pave the way for comparing newly developed biomaterials, clinical studies, and medical applications (i.e. printed organs, patient-specific tissues), there is a great need for standardization of manufacturing methods in order to enable technology transfers. Despite the importance of such standardization, there is currently a tremendous lack of empirical data examining the reproducibility and robustness of production in more than one location at a time. In this work, we present data derived from a round robin test of extrusion-based 3D printing performance comprising 12 different academic laboratories throughout Germany and analyze the respective prints using automated image analysis (IA) in three independent academic groups. The fabrication of objects from polymer solutions was standardized as much as currently possible to allow studying the comparability of results from different laboratories. This study has led to the conclusion that current standardization conditions still leave room for the intervention of operators due to missing automation of the equipment. This significantly affects the reproducibility and comparability of bioprinting experiments in multiple laboratories. Nevertheless, automated IA proved to be a suitable methodology for quality assurance, as three independently developed workflows achieved similar results. Moreover, the extracted data describing geometric features showed how the function of printers affects the quality of the printed object. A significant step toward standardization of the process was made, as an infrastructure for the distribution of material and methods, as well as for data transfer and storage, was successfully established.
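
    A rough sketch of what one automated IA feature-extraction step might look like (an OpenCV-based illustration, not one of the three workflows developed in the study; the file path and threshold are illustrative):

        import cv2

        def strand_width_px(image_path, threshold=128):
            # binarize a grayscale photo of a printed strand, find the
            # largest contour, and report its bounding-box width in pixels,
            # one simple geometric feature a round-robin IA could compare
            img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            _, binary = cv2.threshold(img, threshold, 255, cv2.THRESH_BINARY)
            contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            largest = max(contours, key=cv2.contourArea)
            x, y, w, h = cv2.boundingRect(largest)
            return w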
