22 research outputs found

    Path classification by stochastic linear recurrent neural networks

    We investigate the functioning of a classifying biological neural network from the perspective of statistical learning theory, modelled, in a simplified setting, as a continuous-time stochastic recurrent neural network (RNN) with the identity activation function. In the purely stochastic (robust) regime, we give a generalisation error bound that holds with high probability, thus showing that the empirical risk minimiser is the best-in-class hypothesis. We show that RNNs retain a partial signature of the paths they are fed as the unique information exploited for training and classification tasks. We argue that these RNNs are easy to train and robust, and we support these observations with numerical experiments on both synthetic and real data. We also show a trade-off phenomenon between accuracy and robustness.
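The model described above (a continuous-time linear stochastic RNN driven by an input path) can be sketched with a simple Euler discretization. This is a minimal illustration, not the authors' implementation; all dimensions, weight scales, and the noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: hidden dimension, input path dimension, steps, time step.
N, d, T, dt = 32, 2, 100, 0.01

W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))  # recurrent weights
U = rng.normal(size=(N, d))                          # input weights
sigma = 0.1                                          # noise intensity

def run_rnn(path):
    """Euler scheme for dh = W h dt + U dx + sigma dB with identity
    activation, fed the increments dx of an input path."""
    h = np.zeros(N)
    for dx in np.diff(path, axis=0):                 # path increments
        noise = sigma * np.sqrt(dt) * rng.normal(size=N)
        h = h + dt * (W @ h) + U @ dx + noise
    return h  # final state; a linear readout on h would do classification

# Example: drive the RNN with a random-walk path of shape (T, d).
path = np.cumsum(rng.normal(size=(T, d)), axis=0)
features = run_rnn(path)
```

In this linear setting the final state is a (noisy) linear functional of the path's iterated integrals, which is one way to read the "partial signature" statement in the abstract.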

    Investigation of a carbon fibre-reinforced plastic grinding wheel for high-speed plunge-cut centreless grinding application

    High-speed plunge-cut centreless grinding opens up enormous potential for machining difficult-to-machine materials and for improving surface quality while reducing grinding forces. For this investigation, a new grinding wheel base body of carbon fibre-reinforced plastic (CFRP) was developed to achieve grinding wheel speeds of up to 150 m/s in plunge-cut centreless grinding of hardened shafts. To evaluate the performance characteristics, the grinding forces and the surface quality achieved with different grinding tools were measured. These experiments were conducted using a newly developed measuring system to analyse the grinding forces in the workrest blade. The experimental results are described and discussed in this article.

    Unfolding recurrence by Green's functions for optimized reservoir computing

    Cortical networks are strongly recurrent, and neurons have intrinsic temporal dynamics. This sets them apart from deep feed-forward networks. Despite the tremendous progress in the application of feed-forward networks and their theoretical understanding, it remains unclear how the interplay of recurrence and non-linearities in recurrent cortical networks contributes to their function. The purpose of this work is to present a solvable recurrent network model that links to feed-forward networks. By perturbative methods we transform the time-continuous, recurrent dynamics into an effective feed-forward structure of linear and non-linear temporal kernels. The resulting analytical expressions allow us to build optimal time-series classifiers from random reservoir networks. Firstly, this allows us to optimize not only the readout vectors but also the input projection, demonstrating a strong potential performance gain. Secondly, the analysis exposes how second-order stimulus statistics are a crucial element that interacts with the non-linearity of the dynamics and boosts performance.
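The reservoir-computing setup the abstract optimizes can be sketched as follows: a fixed random non-linear reservoir is driven by input series, and a linear readout on the reservoir states is fitted in closed form. This is a generic echo-state-network sketch, not the authors' Green's-function method; all sizes, the toy task, and the ridge parameter are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes for a small random reservoir.
N, d, T = 50, 1, 200

W_in = rng.normal(size=(N, d))                        # input projection (fixed here)
W = rng.normal(scale=0.9 / np.sqrt(N), size=(N, N))   # random recurrent weights

def reservoir_states(u):
    """Drive the non-linear reservoir with an input series u of shape (T, d)."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x)
    return np.array(states)

# Two toy input classes: phase-randomized sines vs. white noise.
def make_batch(n, cls):
    t = np.linspace(0, 4 * np.pi, T)
    if cls == 0:
        return [np.sin(t + rng.uniform(0, 2 * np.pi))[:, None]
                + 0.1 * rng.normal(size=(T, 1)) for _ in range(n)]
    return [rng.normal(size=(T, 1)) for _ in range(n)]

# Features: final reservoir state of each series.
X = np.array([reservoir_states(u)[-1] for c in (0, 1) for u in make_batch(20, c)])
y = np.array([c for c in (0, 1) for _ in range(20)])

# Optimal linear readout via ridge regression (closed form, labels in {-1, +1}).
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ (2 * y - 1))
pred = (X @ w > 0).astype(int)
accuracy = (pred == y).mean()
```

The abstract's point is that, with an analytical kernel expansion of the dynamics, not only `w` but also `W_in` can be optimized, rather than left random as in this sketch.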

    Probing the Boundaries between Lewis-Basic and Redox Behavior of a Parent Borylene

    The parent borylene (CAAC)(Me₃P)BH, 1 (CAAC = cyclic alkyl(amino)carbene), acts both as a Lewis base and as a one-electron reducing agent towards group 13 trichlorides (ECl₃; E = B, Al, Ga, In), yielding the adducts 1-ECl₃ and increasing proportions of the radical cation [1]•⁺ for the heavier group 13 analogues. With boron trihalides (BX₃; X = F, Cl, Br, I), 1 undergoes sequential adduct formation and halide abstraction reactions to yield borylboronium cations and shows an increasing tendency towards redox processes for the heavier halides. Calculations confirm that 1 acts as a strong Lewis base towards EX₃ and show a marked increase in the B−E bond dissociation energies down both group 13 and the halide group.