Synthetic biology routes to bio-artificial intelligence
The design of synthetic gene networks (SGNs) has advanced to the extent that novel genetic circuits are now being tested for their ability to recapitulate archetypal learning behaviours first defined in the fields of machine and animal learning. Here, we discuss the biological implementation of a perceptron algorithm for linear classification of input data. An expansion of this biological design that encompasses cellular 'teachers' and 'students' is also examined. We also discuss the implementation of Pavlovian associative learning using SGNs and present an example of such a scheme together with an in silico simulation of its performance. In addition to designed SGNs, we consider the option of establishing conditions in which a population of SGNs can evolve diversity in order to better contend with complex input data. Finally, we compare recent ethical concerns in the field of artificial intelligence (AI) with the future challenges raised by bio-artificial intelligence (BI).
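The perceptron referred to here is the classic linear-classification algorithm from machine learning. As a point of reference, a minimal software version of that algorithm (not the genetic-circuit implementation the paper describes) might look like this; the data and hyperparameters are illustrative assumptions:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Classic perceptron update rule on labelled data with y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified: nudge the boundary
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Two linearly separable clusters standing in for two input "stimuli"
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array([-1, -1, 1, 1])
w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)  # all four points end up correctly classified
```

A biological realisation replaces the weighted sum with, e.g., promoter activities summed at a common output gene, but the decision rule is the same thresholded linear combination.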
Effect of noise in intelligent cellular decision making
Similar to the intelligent multicellular neural networks that control human brains, surprisingly, even single cells are able to make intelligent decisions to classify several external stimuli or to associate them. This is because gene regulatory networks can perform as perceptrons, simple intelligent schemes known from studies on Artificial Intelligence. We study the role of genetic noise in intelligent decision making at the genetic level and show that noise can play a constructive role, helping cells to make a proper decision. We show this using the example of a simple genetic classifier able to classify two external stimuli.
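As an illustration of how noise can be constructive, the sketch below (an assumption for exposition, not the paper's actual model) uses a hard-threshold unit, a crude stand-in for a gene switch: two sub-threshold stimuli are indistinguishable deterministically, but once noise makes threshold crossings probabilistic, their firing frequencies differ and the stimuli become classifiable:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.0               # expression threshold of the hypothetical gene switch
s_low, s_high = 0.4, 0.8  # two sub-threshold stimuli: neither crosses theta alone

def response(stimulus, sigma, trials=5000):
    """Fraction of trials in which the noisy input crosses the threshold."""
    noisy = stimulus + sigma * rng.standard_normal(trials)
    return np.mean(noisy > theta)

# Deterministic unit (no noise): both stimuli give output 0, so they
# cannot be told apart.
det_low, det_high = response(s_low, 0.0), response(s_high, 0.0)

# With noise, the crossing probabilities differ markedly
# (roughly 2% vs. 25% here), so averaging separates the stimuli.
noisy_low, noisy_high = response(s_low, 0.3), response(s_high, 0.3)
```

This is the stochastic-resonance flavour of the argument; the paper's genetic classifier is a concrete gene-network version of the same effect.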
Mammalian Brain As a Network of Networks
Acknowledgements: AZ, SG and AL acknowledge support from the Russian Science Foundation (16-12-00077). The authors thank T. Kuznetsova for Fig. 6.
A training algorithm for networks of high-variability reservoirs
Physical reservoir computing approaches have gained increased attention in recent years due to their potential for low-energy, high-performance computing. Despite recent successes, there are bounds to what one can achieve simply by making physical reservoirs larger. Therefore, we argue that a switch from single-reservoir computing to multi-reservoir and even deep physical reservoir computing is desirable. Given that error backpropagation cannot be used directly to train a large class of multi-reservoir systems, we propose an alternative framework that combines the power of backpropagation with the speed and simplicity of classic training algorithms. In this work we report findings from an experiment conducted to evaluate the general feasibility of our approach. We train a network of three Echo State Networks to perform the well-known NARMA-10 task, using intermediate targets derived through backpropagation. Our results indicate that our proposed method is well suited to training multi-reservoir systems efficiently.
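For context, the NARMA-10 benchmark and the Echo State Network building block can be sketched as follows. This trains a single reservoir with a ridge-regression readout, not the three-reservoir network with backpropagated intermediate targets that the paper describes, and the hyperparameters (reservoir size, spectral radius, ridge penalty) are illustrative choices rather than the authors':

```python
import numpy as np

rng = np.random.default_rng(42)

# --- NARMA-10 series (standard definition, input u ~ U[0, 0.5]) ---
T = 1200
u = rng.uniform(0, 0.5, T)
y = np.zeros(T)
for t in range(9, T - 1):
    y[t + 1] = (0.3 * y[t] + 0.05 * y[t] * y[t - 9:t + 1].sum()
                + 1.5 * u[t - 9] * u[t] + 0.1)

# --- Single Echo State Network: fixed random recurrent weights ---
N = 200
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W_in * u[t] + W @ x)  # leak-free tanh reservoir update
    states[t] = x

# --- Only the linear readout is trained (ridge regression, with bias) ---
washout = 100
X = np.hstack([states[washout:-1], np.ones((T - washout - 1, 1))])
target = y[washout + 1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N + 1), X.T @ target)

pred = X @ W_out
nrmse = np.sqrt(np.mean((pred - target) ** 2) / np.var(target))
```

The paper's contribution is how to obtain `target`-like intermediate signals for the inner reservoirs of a multi-reservoir stack, where no direct teacher signal exists.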
Few-molecule reservoir computing experimentally demonstrated with surface enhanced Raman scattering and ion-gating stimulation
Reservoir computing (RC) is a promising solution for achieving low-power neuromorphic computing, although the large volume of the physical reservoirs reported to date has been a serious drawback in their practical application. Here, we report the development of a few-molecule RC that employs the molecular vibration dynamics of para-mercaptobenzoic acid (pMBA) detected by surface enhanced Raman scattering (SERS) with tungsten oxide nanorod/silver nanoparticles (WOx@Ag-NPs). The Raman signals of the pMBA molecules, adsorbed at the SERS-active sites of the WOx@Ag-NPs, were reversibly perturbed by voltage-induced local pH changes in the vicinity of the molecules, and then used to perform RC pattern-recognition and prediction tasks. In spite of the small number of molecules employed, our system achieved good performance, including 95.1% to 97.7% accuracy in various nonlinear waveform transformations and 94.3% accuracy in solving a second-order nonlinear dynamic equation task. Our work provides a new concept of molecular computing with practical computation capabilities.
Reservoir Computing: computation with dynamical systems
In the research field of Machine Learning, systems are studied that can learn from examples. Within this field, recurrent neural networks form an important subgroup. These networks are abstract models of the operation of parts of the brain. They are able to solve very complex temporal problems, but are in general very difficult to train. Recently, a number of related methods have been proposed that eliminate this training problem. These methods go by the name Reservoir Computing. Reservoir Computing combines the impressive computational power of recurrent neural networks with a simple training method. Moreover, it turns out that these training methods are not limited to neural networks, but can be applied to generic dynamical systems. Why these systems work well, and which properties determine their performance, is however not yet clear.
For this dissertation, research was conducted into the dynamical properties of generic Reservoir Computing systems. It was demonstrated experimentally that the idea of Reservoir Computing is also applicable to non-neural networks of dynamical nodes. Furthermore, a measure was proposed that can be used to quantify the dynamic regime of a reservoir. Finally, an adaptation rule was introduced that, for a broad range of reservoir types, can tune the dynamics of the reservoir to the desired dynamic regime. The techniques described in this dissertation are demonstrated on several academic and engineering applications.
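The dissertation's specific measure of dynamic regime is not given in this summary; a common perturbation-based proxy is the largest Lyapunov exponent of the reservoir map, sketched below for a tanh reservoir. The weight scaling, network size, and gain values are illustrative assumptions, not the thesis's settings:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))  # spectral radius ~ 1

def lyapunov_estimate(W, gain, steps=500, eps=1e-8):
    """Average log expansion rate of a tiny perturbation under x -> tanh(g W x).

    Negative values indicate a contracting (ordered) regime, positive
    values an expanding (chaotic) one.
    """
    Wg = gain * W
    x = rng.uniform(-1, 1, N)
    x2 = x + eps * rng.normal(size=N) / np.sqrt(N)
    total = 0.0
    for _ in range(steps):
        x = np.tanh(Wg @ x)
        x2 = np.tanh(Wg @ x2)
        d = np.linalg.norm(x2 - x)
        total += np.log(d / eps)
        x2 = x + eps * (x2 - x) / d  # renormalise the perturbation
    return total / steps

stable = lyapunov_estimate(W, 0.5)   # negative: ordered regime
chaotic = lyapunov_estimate(W, 2.5)  # positive: chaotic regime
```

An adaptation rule in the spirit of the thesis would then adjust reservoir parameters (e.g. the gain) until such a measure reaches a target value near the edge between the two regimes.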