
    Towards the implementation of distributed systems in synthetic biology

    The design and construction of engineered biological systems has made great strides over the last few decades, and a growing part of this progress is the application of mathematical and computational techniques to problems in synthetic biology. The use of distributed systems, in which an overall function is divided across multiple populations of cells, has the potential to increase the complexity of the systems we can build and to overcome metabolic limitations. However, constructing biological distributed systems comes with its own set of challenges. In this thesis I present new tools for the design and control of distributed systems in synthetic biology. The first part of this thesis focuses on biological computers. I develop novel design algorithms for distributed digital and analogue computers composed of spatial patterns of communicating bacterial colonies. I prove mathematically that we can program arbitrary digital functions and develop an algorithm for the automated design of optimal spatial circuits. Furthermore, I show that bacterial neural networks can be built using our system and develop efficient design tools to do so. I verify these results using computational simulations. This work shows that we can build distributed biological computers using communicating bacterial colonies and that different design tools can be used to program digital and analogue functions. The second part of this thesis utilises a technique from artificial intelligence, reinforcement learning, first in the control and then in the understanding of biological systems. First, I show the potential utility of reinforcement learning for controlling and optimising interacting communities of microbes that produce a biomolecule. Second, I apply reinforcement learning to the design of optimal characterisation experiments within synthetic biology. This work shows that methods utilising reinforcement learning hold promise for complex distributed bioprocessing in industry and for the design of optimal experiments throughout biology.
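To make the reinforcement-learning idea concrete, the following is a minimal sketch of tabular Q-learning on a hypothetical toy bioreactor, not the thesis's actual environment: the state is a discretised biomass level, the action is the feed rate, and the reward is the product formed, which is assumed to peak at an intermediate biomass.

```python
import random

# Toy bioreactor (hypothetical): state = discretised biomass level (0..4),
# action = feed rate (0 = low, 1 = high), reward = product formed this step.
def step(state, action):
    # Assumed dynamics: high feed grows biomass, low feed lets it decay.
    next_state = min(4, state + 1) if action == 1 else max(0, state - 1)
    reward = [0.0, 0.5, 1.0, 0.5, 0.1][next_state]  # yield peaks mid-range
    return next_state, reward

def train(episodes=500, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(5)]  # Q[state][action]
    for _ in range(episodes):
        s = rng.randrange(5)
        for _ in range(20):
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: q[s][x])
            s2, r = step(s, a)
            # standard Q-learning update
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
# The greedy policy feeds when biomass is low and backs off past the yield peak.
policy = [max((0, 1), key=lambda a: q[s][a]) for s in range(5)]
```

The learned policy steers the system towards the high-yield region without an explicit model of the dynamics, which is the appeal of reinforcement learning for bioprocess control where accurate mechanistic models are hard to obtain.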

    Advanced photonic and electronic systems - WILGA 2017

    The WILGA annual symposium on advanced photonic and electronic systems has been organized by young scientists for young scientists for two decades. It traditionally gathers more than 350 young researchers and their tutors. Ph.D. students and graduates present their recent achievements during well-attended oral sessions. Wilga is a very good digest of Ph.D. work carried out at technical universities in electronics and photonics, as well as information sciences, throughout Poland and some neighboring countries. Publishing patronage over Wilga is held by the Elektronika technical journal (SEP), IJET (PAN) and Proceedings of SPIE. The latter worldwide editorial series publishes more than 200 papers from Wilga annually. Wilga 2017 was the XL (40th) edition of this meeting. The following topical tracks were distinguished: photonics, electronics, information technologies and system research. This article is a digest of some chosen works presented during the Wilga 2017 symposium. WILGA 2017 works were published in Proc. SPIE vol. 10445.

    Evolvable hardware platform for fault-tolerant reconfigurable sensor electronics


    Symbolic tolerance and sensitivity analysis of large scale electronic circuits

    Available from the British Library Document Supply Centre (DSC:DXN029693), SIGLE, United Kingdom.

    Digital CMOS ISFET architectures and algorithmic methods for point-of-care diagnostics

    Over the past decade, the surge of infectious disease outbreaks across the globe has been redefining how healthcare is provided and delivered to patients, with a clear trend towards distributed diagnosis at the Point-of-Care (PoC). In this context, Ion-Sensitive Field Effect Transistors (ISFETs) fabricated in standard CMOS technology have emerged as a promising solution to achieve a precise, deliverable and inexpensive platform that could be deployed worldwide to provide rapid diagnosis of infectious diseases. This thesis presents advancements for the future of ISFET-based PoC diagnostic platforms, proposing and implementing a set of hardware and software methodologies to overcome their main challenges and enhance their sensing capabilities. The first part of this thesis focuses on novel hardware architectures that enable direct integration with computational capabilities while providing the pixel programmability and adaptability required to overcome pressing challenges on ISFET-based PoC platforms. This section explores oscillator-based ISFET architectures, a set of sensing front-ends that encode the chemical information on the duty cycle of a PWM signal. Two initial architectures are proposed and fabricated in AMS 0.35 um, confirming multiple degrees of programmability and potential for multi-sensing. One of these architectures is optimised to create a dual-sensing pixel capable of sensing both temperature and chemical information at the same spatial point while modulating this information simultaneously on a single waveform. This dual-sensing capability, verified in silico using the TSMC 0.18 um process, is vital for DNA-based diagnosis, where protocols such as LAMP or PCR require precise thermal control. The COVID-19 pandemic highlighted the need for deliverable diagnostics that perform nucleic acid amplification tests at the PoC, requiring minimal footprint by integrating sensing and computational capabilities. In response to this challenge, a paradigm shift is proposed, advocating for integrating all elements of the portable diagnostic platform on a single piece of silicon, realising a "Diagnosis-on-a-Chip". This approach is enabled by a novel Digital ISFET Pixel that integrates both ADC and memory with sensing elements in each pixel, enhancing its parallelism. Furthermore, this architecture removes the need for external instrumentation or memories and facilitates its integration with on-chip computational capabilities, such as the proposed ARM Cortex M3 system. These computational capabilities need to be complemented with software methods that enable sensing enhancement and new applications using ISFET arrays. The second part of this thesis is devoted to these methods. Leveraging the programmability available in oscillator-based architectures, various digital signal processing algorithms are implemented to overcome the most urgent ISFET non-idealities, such as trapped charge, drift and chemical noise. These methods enable fast trapped-charge cancellation and enhanced dynamic range through real-time drift compensation, achieving over 36 hours of continuous monitoring without pixel saturation. Furthermore, the recent development of data-driven models and software methods opens up a wide range of opportunities for ISFET sensing and beyond. In the last section of this thesis, two examples of these opportunities are explored: the optimisation of image compression algorithms on chemical images generated by an ultra-high frame-rate ISFET array, and a proposed paradigm shift for surface electromyography (sEMG) signals, moving from data harvesting to information-focused sensing. These examples represent an initial step forward on a journey towards a new generation of miniaturised, precise and efficient sensors for PoC diagnostics.
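The duty-cycle encoding described above can be illustrated numerically: a chemical reading is mapped onto the duty cycle of a PWM waveform and recovered by time-averaging. The linear pH-to-duty mapping and its constants are hypothetical stand-ins, not the thesis's circuit transfer function.

```python
import numpy as np

# Illustrative only: encode a pH reading on the duty cycle of a PWM waveform,
# then recover it by time-averaging, mimicking how an oscillator-based ISFET
# front-end conveys chemical information. Mapping constants are assumed.
PH_MIN, PH_MAX = 4.0, 10.0

def ph_to_duty(ph):
    return (ph - PH_MIN) / (PH_MAX - PH_MIN)  # duty cycle in 0..1

def pwm_wave(duty, period=100, cycles=10):
    high = int(round(duty * period))
    cycle = np.concatenate([np.ones(high), np.zeros(period - high)])
    return np.tile(cycle, cycles)

def decode_ph(wave):
    duty = wave.mean()  # the mean of a 0/1 PWM waveform equals its duty cycle
    return PH_MIN + duty * (PH_MAX - PH_MIN)

wave = pwm_wave(ph_to_duty(7.0))
print(round(decode_ph(wave), 2))  # → 7.0
```

Because the information sits in the timing rather than the amplitude of the signal, this representation is naturally robust to amplitude noise and easy to digitise with a counter, which is part of the appeal of oscillator-based front-ends.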

    Digital Filters and Signal Processing

    Digital filters, together with signal processing, are employed in new technologies and information systems and are implemented in different areas and applications. Digital filters and signal processing can be realised at low cost and adapted to different cases with great flexibility and reliability. This book presents advanced developments in digital filters and signal processing methods covering different case studies. The chapters present the essence of the subject, with the principal approaches to the most recent mathematical models being employed worldwide.
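As a minimal example of the kind of filter the book covers, the sketch below applies a 5-tap moving-average FIR low-pass to a noisy sine; the signal parameters are illustrative.

```python
import numpy as np

# A 5-tap moving-average FIR filter smoothing a noisy 5 Hz sine (illustrative).
fs = 1000                              # sample rate (Hz), assumed
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * np.random.default_rng(0).standard_normal(t.size)

taps = np.ones(5) / 5                  # FIR coefficients: simple moving average
filtered = np.convolve(noisy, taps, mode="same")

# The filter attenuates broadband noise far more than the low-frequency tone.
err_before = np.mean((noisy - clean) ** 2)
err_after = np.mean((filtered - clean) ** 2)
```

Averaging N samples reduces the variance of uncorrelated noise by roughly a factor of N while barely distorting a tone whose period is much longer than the filter, which is why the mean-squared error drops after filtering.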

    Improving the quality of combined EEG-TMS neural recordings: artifact removal and time analysis

    The combination of TMS (transcranial magnetic stimulation) and EEG (electroencephalography) allows a functional assessment of cortical regions in a controlled, non-invasive way and without requiring the subjects under study to perform a task. Using this combination, the cerebral signals of patients with schizophrenia (n = 16) and healthy controls (n = 15) are characterised through TMS-evoked potentials (TEPs). Two protocols are evaluated: SICI (short-interval intracortical inhibition) and LICI (long-interval intracortical inhibition), which activate different inhibitory receptors (GABA-A and GABA-B, respectively). In both protocols, there are signals with a single TMS pulse (SP) or with paired pulses (PP). Thus, besides the characterisation, the results obtained are compared between types of pulses, protocols and study groups. The methodology followed is typical for these signals: removal of the TMS pulse(s); application of ICA (independent component analysis) to delete the artefact-contaminated or noisy components; signal reconstruction from the good components; and rejection of bad channels and bad trials. Once the pre-processing is finished and the signal is clean, the TMS-evoked potentials are obtained. To find the differences between PP and SP, the former signal is subtracted from the latter and a modulation ratio is computed. The TEPs found are P25, N30, P50, N100, P140, N160 and P200. Regarding the differences between types of pulses, we observed that the TEP amplitudes for LICI PP are lower than those for LICI SP. However, this difference is not as clear in the SICI protocol. As to the differences between subjects, it appears that the controls have greater inhibition than the patients. In conclusion, the results are remarkably similar to those expected.
Due to the small sample size and the fact that this is still a poorly studied field, some results do not match the literature completely, but they follow the same tendency.
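The PP-vs-SP comparison described above can be sketched on synthetic data: the PP response is subtracted from the SP response and a modulation ratio is computed. The waveforms are mock stand-ins for averaged TEP epochs, and the ratio convention (PP amplitude relative to SP, with values below 1 indicating inhibition) is one common choice, not necessarily the exact definition used in this work.

```python
import numpy as np

# Mock averaged TEP epochs (assumed shapes, not real data).
rng = np.random.default_rng(1)
t = np.linspace(0, 0.3, 300)                          # 300 ms post-pulse
sp = np.exp(-t / 0.05) * np.sin(2 * np.pi * 10 * t)   # single-pulse TEP
pp = 0.6 * sp + 0.01 * rng.standard_normal(t.size)    # inhibited paired-pulse TEP

difference = sp - pp                                  # PP subtracted from SP
ratio = np.abs(pp).mean() / np.abs(sp).mean()         # modulation ratio (< 1: inhibition)
```

With the assumed 0.6 scaling, the ratio comes out well below 1, mirroring the inhibition reported for the LICI protocol.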

    Modelling methods for testability analysis of analog integrated circuits based on pole-zero analysis

    Testability analysis for analog circuits provides valuable information for designers and test engineers. Such information includes the number of testable and non-testable elements of a circuit, the ambiguity groups, and the nodes to be tested. This information is useful for solving the fault diagnosis problem. In order to verify the functionality of analog circuits, a large number of specifications have to be checked. However, checking all circuit specifications can result in prohibitive testing times on expensive automated test equipment. Therefore, the test engineer has to select a finite subset of specifications to be measured. This subset must reduce the test time while guaranteeing that no faulty chips are shipped. This research develops a novel methodology for testability analysis of linear analog circuits based on pole-zero analysis and pole-zero sensitivity analysis. Based on this methodology, a new interpretation of ambiguity groups is provided, grounded in circuit theory. The testability analysis methodology can be employed as a guideline for constructing fault diagnosis equations and for selecting the test nodes. An algorithm is also proposed for selecting the specifications that need to be measured. The element-testability concept is introduced; it quantifies the degree of difficulty in testing circuit elements, and its value can easily be obtained from the pole sensitivities. The specifications that need to be measured can then be selected based on this concept. Consequently, the selected measurements can be utilized to reduce the test time without sacrificing fault coverage, while maximizing the information for fault diagnosis.
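The link between pole sensitivities and ambiguity groups can be seen even on a first-order example. The sketch below computes normalised pole sensitivities of a hypothetical RC low-pass by finite differences; the single-pole circuit and component values are illustrative, not taken from the thesis.

```python
# Normalised pole sensitivities of a first-order RC low-pass (illustrative).
def pole(R, C):
    return -1.0 / (R * C)      # the circuit's only pole (rad/s)

def norm_sensitivity(f, x, other, h=1e-6):
    # S = (x / p) * dp/dx, evaluated by a forward finite difference
    p = f(x, other)
    dp = (f(x * (1 + h), other) - p) / (x * h)
    return x / p * dp

R, C = 1e3, 1e-6               # assumed component values
sR = norm_sensitivity(lambda R_, C_: pole(R_, C_), R, C)
sC = norm_sensitivity(lambda C_, R_: pole(R_, C_), C, R)
# Both sensitivities equal -1: a shift in the pole cannot distinguish a
# faulty R from a faulty C, i.e. {R, C} form an ambiguity group.
```

This is the circuit-theoretic reading of an ambiguity group that the methodology formalises: elements whose pole sensitivities are linearly dependent cannot be separated by measurements of the poles alone.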

    Tools and Technologies for Enabling Characterisation in Synthetic Biology

    Synthetic Biology represents a movement to utilise biological organisms for novel applications through the use of rigorous engineering principles. These principles rely on a solid, well-developed understanding of the underlying biological components and functions relevant to the application. In order to achieve this understanding, reliable behavioural and contextual information is required (more commonly known as characterisation data). Focussing on lowering the barrier of entry for current research facilities to regularly and easily perform characterisation assays will directly improve the communal knowledge base for Synthetic Biology and enable the further application of rational engineering principles. Whilst characterisation remains a fundamental principle of Synthetic Biology research, the high time costs, subjective measurement protocols, and ambiguous data analysis specifications deter regular performance of characterisation assays. Vitally, this prevents the valid application of many of the key Synthetic Biology processes that have been derived to improve research yield (with regard to solving application problems) and directly prevents the intended goal of addressing the ad hoc nature of modern research from being realised. Designing new technologies and tools to facilitate rapid 'hands-off' characterisation assays for research facilities will improve the uptake of characterisation within the research pipeline. To achieve this, two core problem areas were identified that limit current characterisation attempts in conventional research, and it was the primary aim of this investigation to overcome these two core problems to promote regular characterisation. The first issue identified as preventing the regular use of characterisation assays was the user-intensive methodologies and technologies available to researchers.
There is currently no standardised characterisation equipment for assaying samples, and the methodologies are heavily dependent on the researcher and their application for successful and complete characterisation. This study proposed a novel high-throughput solution to the characterisation problem that was capable of low-cost, concurrent, and rapid characterisation of simple biological DNA elements. By combining in vitro transcription-translation with microfluidics, and exploiting the fine control that microfluidic technologies afford, a prototype platform for high-throughput characterisation was developed. The second issue identified was the lack of flexible, versatile software designed specifically for the data-handling needs that are quickly arising within the characterisation speciality. The lack of general solutions in this area is problematic because of the increasing amount of data that is both required and generated for the characterisation output to be considered rigorous and valuable. To alleviate this issue, a novel framework for laboratory data handling was developed that employs a plugin strategy for data submission and analysis. Employing a plugin strategy improves the shelf life of data-handling software by allowing it to grow with the needs of the speciality. Another advantage of this strategy is that well-documented processing and analysis standards can emerge and be shared by all researchers. Finally, the software provided a powerful and flexible data storage schema that allowed all currently conceivable characterisation data types to be stored in a well-documented manner. The two solutions identified within this study increase the number of enabling tools and technologies available to researchers within Synthetic Biology, which in turn will increase the uptake of regular characterisation.
Consequently, this will potentially improve the lateral transfer of knowledge between research projects and reduce the need to perform ad hoc experiments to investigate facets of the fundamental biological components being utilised.
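The plugin strategy described above can be sketched as a small registry: analysis routines register themselves by name, so new characterisation data types can be handled without modifying the core data-handling software. The plugin names and routines here are hypothetical examples, not the thesis's actual framework.

```python
# Minimal plugin registry sketch (hypothetical names and routines).
PLUGINS = {}

def plugin(name):
    """Decorator that registers an analysis routine under the given name."""
    def register(fn):
        PLUGINS[name] = fn
        return fn
    return register

@plugin("mean_fluorescence")
def mean_fluorescence(samples):
    return sum(samples) / len(samples)

@plugin("fold_change")
def fold_change(samples, baseline=1.0):
    return max(samples) / baseline

def analyse(kind, samples, **kw):
    # Dispatch to whichever plugin was registered for this data type.
    return PLUGINS[kind](samples, **kw)

result = analyse("mean_fluorescence", [2.0, 4.0, 6.0])  # → 4.0
```

Because the core `analyse` dispatcher never changes, adding support for a new assay type is just a matter of registering another function, which is the shelf-life argument the abstract makes for the plugin design.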