468 research outputs found

    Optimization of Cell-Aware Test

    Get PDF

    Development of statistical and computational methods to estimate functional connectivity and topology in large-scale neuronal assemblies

    Get PDF
    One of the most fundamental features of a neural circuit is its connectivity, since the activity of a single neuron is due not only to its intrinsic properties but especially to the direct or indirect influence of other neurons [1]. It is fundamental to elaborate research strategies aimed at a comprehensive structural description of neuronal interconnections as well as of the network elements forming the human connectome. The connectome will significantly increase our understanding of how functional brain states emerge from their underlying structural substrate, and will provide new mechanistic insights into how brain function is affected if this structural substrate is disrupted. The connectome is characterized by three different types of connectivity: structural, functional and effective connectivity. The final goal of a connectivity analysis is the reconstruction of the human connectome, and thus the application of statistical measures to the in vivo model in both physiological and pathological states. Since the system under study (i.e. brain areas, cell assemblies) is highly complex, it is useful to adopt a reductionist approach to achieve this purpose. During my PhD work, I focused on a reduced and simplified model, represented by neural networks chronically coupled to Micro Electrode Arrays (MEAs). Large networks of cortical neurons developing in vitro and chronically coupled to MEAs [2] represent a well-established experimental model for studying neuronal dynamics at the network level [3], and for understanding the basic principles of information coding [4], learning and memory [5]. Thus, during my PhD work, I developed and optimized statistical methods to infer functional connectivity from spike train data. In particular, I worked on correlation-based methods (cross-correlation and partial correlation) and information-theory-based methods (Transfer Entropy, TE, and Joint Entropy, JE). More specifically, my PhD's aim has been to apply functional connectivity methods to neural networks coupled to high-density resolution systems, like the 3Brain active pixel sensor array with 4096 electrodes [6]. To fulfill such an aim, I re-adapted the computational logic operations of the aforementioned connectivity methods. Moreover, I worked on a new method based on the cross-correlogram, able to detect both inhibitory and excitatory links. I called this algorithm Filtered Normalized Cross-Correlation Histogram (FNCCH). The FNCCH shows a very high precision in detecting both inhibitory and excitatory functional links when applied to our in silico model. I also worked on temporal and pattern extensions of the TE algorithm, developing a Delayed TE (DTE) and a Delayed High Order TE (DHOTE) version. These two extensions of TE are able to consider different temporal bins at different temporal delays for pattern recognition, with respect to the basic TE. I also worked on an algorithm for the JE computation: starting from the mathematical definition in [7], I developed a customized version of JE capable of detecting the delay associated with a functional link, together with a dedicated shuffling-based thresholding approach. Finally, I embedded all of these connectivity methods into a user-friendly open source software named SPICODYN [8]. SPICODYN allows the user to perform a complete analysis on data acquired from any acquisition system.
    I used a standard format for the input data, providing the user with the possibility to perform a complete set of operations on the input data, including: raw data viewing, spike and burst detection and analysis, functional connectivity analysis, graph theory and topological analysis. SPICODYN inherits its backbone structure from TOOLCONNECT, a previously published software that allowed the user to perform functional connectivity analysis on spike train data.
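
    The correlation-based methods described above share a common computational core: build a cross-correlogram between a reference and a target spike train, locate its dominant peak (or trough, for inhibition), and accept the link only if it exceeds a surrogate-based threshold. The sketch below illustrates this idea in Python; it is a minimal illustration under assumed parameters (1 ms bins, ±50 ms window, spike-jitter surrogates), not the actual FNCCH or SPICODYN implementation.

```python
# Minimal sketch (not the SPICODYN/FNCCH code): estimate a directed functional
# link between two spike trains from the peak of their cross-correlogram,
# thresholded against surrogates obtained by jittering the target spikes.
import numpy as np

def cross_correlogram(ref, target, bin_ms=1.0, window_ms=50.0):
    """Histogram of target spike times relative to each reference spike."""
    edges = np.arange(-window_ms, window_ms + bin_ms, bin_ms)
    counts = np.zeros(len(edges) - 1)
    for t in ref:
        d = target - t
        counts += np.histogram(d[(d >= -window_ms) & (d <= window_ms)], bins=edges)[0]
    return edges[:-1] + bin_ms / 2, counts / max(len(ref), 1)

def peak_link(ref, target, n_surrogates=100, jitter_ms=20.0, n_sd=3.0, seed=0):
    """Return (lag_ms, strength) if the correlogram peak exceeds a surrogate threshold."""
    lags, cc = cross_correlogram(ref, target)
    rng = np.random.default_rng(seed)
    surrogate_peaks = []
    for _ in range(n_surrogates):
        jittered = target + rng.uniform(-jitter_ms, jitter_ms, size=len(target))
        surrogate_peaks.append(cross_correlogram(ref, jittered)[1].max())
    threshold = np.mean(surrogate_peaks) + n_sd * np.std(surrogate_peaks)
    i = int(np.argmax(cc))
    return (lags[i], cc[i]) if cc[i] > threshold else None
```

    Detecting inhibitory links, as FNCCH does, would additionally require testing for significant troughs of the (filtered and normalized) correlogram rather than only its peak.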

    Exploring the Mysteries of System-Level Test

    Full text link
    System-level test, or SLT, is an increasingly important process step in today's integrated circuit testing flows. Broadly speaking, SLT aims at executing functional workloads in operational modes. In this paper, we consolidate available knowledge about what SLT is precisely and why it is used despite its considerable costs and complexities. We discuss the types of failures covered by SLT, and outline approaches to quality assessment, test generation and root-cause diagnosis in the context of SLT. Observing that the theoretical understanding of all these questions has not yet reached the level of maturity of the more conventional structural and functional test methods, we outline new and promising directions for methodical developments leveraging recent findings from software engineering. Comment: 7 pages, 2 figures

    Intermittent fault diagnosis and health monitoring for electronic interconnects

    Get PDF
    A literature survey and correspondence with the industrial sector show that No-Fault-Found (NFF) is a major concern in through-life engineering services, especially for the defence, aerospace, and other transport industries. There are various occurrences and root causes that result in NFF events, but intermittent interconnections are the most frustrating, because the fault disappears during testing and is missed by diagnostic equipment. This thesis describes the challenging and most important area of intermittent fault detection and health monitoring, focusing on the NFF situation in electronic interconnections. After an introduction, the thesis starts with a literature survey and describes the financial impact on the aerospace and other transport industries. It highlights NFF technologies and discusses different factors and their impact on NFF. It then goes into an experimental study of how intermittent faults can be replicated repeatedly. It describes a novel fault replicator that can repeatedly generate intermittent faults (IFs) for further experimental study of diagnosis techniques/algorithms. The novel IF replicator provides for single- and multi-point intermittent connections. The experimental work focuses on mechanically induced intermittent conditions in connectors. This work illustrates a test regime that can be used to repeatedly reproduce intermittency in electronic connectors whilst subjected to vibration ... [cont.]
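
    As an illustration of what intermittent fault detection means in practice, the sketch below flags brief, self-clearing excursions of a monitored contact resistance during a vibration test. It is a simplified, assumption-based example (threshold and event-duration values are hypothetical), not the detection algorithm developed in the thesis.

```python
# Illustrative sketch (not the thesis' algorithm): flag intermittent opens in a
# connector by detecting short-lived excursions of the monitored contact
# resistance above a threshold while the assembly is under vibration.
import numpy as np

def detect_intermittent_events(resistance_ohm, fs_hz,
                               open_threshold_ohm=5.0, max_event_s=0.01):
    """Return a list of (start_time_s, duration_s) for brief high-resistance events."""
    above = resistance_ohm > open_threshold_ohm
    transitions = np.diff(above.astype(int))
    starts = np.where(transitions == 1)[0] + 1
    ends = np.where(transitions == -1)[0] + 1
    events = []
    for s in starts:
        later_ends = ends[ends > s]
        if len(later_ends) == 0:
            break                      # excursion did not clear within the record
        duration = (later_ends[0] - s) / fs_hz
        if duration <= max_event_s:    # brief, self-clearing: treat as intermittent
            events.append((s / fs_hz, duration))
    return events
```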

    Risk Analysis for Smart Cities Urban Planners: Safety and Security in Public Spaces

    Get PDF
    Christopher Alexander, in his famous writings "The Timeless Way of Building" and "A Pattern Language", defined a formal language for the description of a city. Alexander developed a generative grammar able to formally describe complex and articulated concepts of architecture and urban planning, in order to define a common language that would facilitate both the participation of ordinary citizens and the collaboration between professionals in architectural and urban planning. In this research, a similar approach has been applied to let two domains communicate even though they are very far apart in terms of lexicon, methodologies and objectives. These domains are urban planning, urban design and architecture, seen as the first domain both in terms of time and in terms of completeness of vision, and the world of engineering, made up of innumerable disciplines. In practice, there is a domain that defines the requirements and the overall vision (the first) and a domain (the second) which implements them with real infrastructures and systems. To put these two worlds seamlessly into communication, allowing the concepts of the first world to be translated into those of the second, Christopher Alexander's idea has been followed by defining a common language. By applying Essence, the formal descriptive theory of software engineering, with its customization rules, to the concept of a Smart City, a common language to completely trace the requirements at all levels has been defined. Since the focus was on risk analysis for safety and security in public spaces, existing risk models have been considered, revealing a further gap within the engineering world itself. Depending on the area being considered, risk management models take different and siloed approaches which ignore the interactions of one type of risk with the others. To allow effective communication between the two domains and within the engineering domain, a unified risk analysis framework has been developed. Then a framework (an ontology) capable of describing all the elements of a Smart City has been developed and combined with the common language to trace the requirements. Following the philosophy of the Vienna Circle, a creative process called Aufbau has then been defined to allow the generation of a detailed description of the Smart City, at any level, using the common language and the ontology defined above. The risk analysis methodology has then been applied to the city model produced by Aufbau. The research developed tools to apply these results to the entire life cycle of the Smart City. With these tools, it is possible to understand how far a given architectural, urban planning or urban design requirement is operational at a given moment. In this way, the narration can accurately describe how far the initial requirements set by architects, planners and urban designers and, above all, the values required by stakeholders, are satisfied at any time. The impact of this research on urban planning is the ability to create a single model between the two worlds, leaving everyone free to express creativity and expertise in the appropriate forms but, at the same time, allowing both to fill the communication gap that exists today. This new way of planning requires adequate IT tools and takes the form, from the engineering side, of harmonization of techniques already in use and greater clarity of objectives.
On the side of architecture, urban planning and urban design, it is instead a powerful decision support tool, both in the planning and operational phases. This decision support tool for Urban Planning, based on the research results, is the starting point for the development of a meta-heuristic process using an evolutionary approach. Consequently, risk management, from Architecture/Urban Planning/Urban Design up to Engineering, in any phase of the Smart City's life cycle, is seen as an "organism" that evolves.

    Modelling offshore wind farm operation and maintenance with view to estimating the benefits of condition monitoring

    Get PDF
    Offshore wind energy is progressing rapidly and playing an increasingly important role in electricity generation. Since the Kyoto Protocol came into force in February 2005, Europe has been substantially increasing its installed wind capacity. Compared to onshore wind, offshore wind allows the installation of larger turbines and more extensive sites, and encounters higher wind speeds with lower turbulence. On the other hand, harsh marine conditions and the limited access to the turbines are expected to increase the cost of operation and maintenance (O&M costs presently make up approximately 20-25% of the levelised total lifetime cost of a wind turbine). Efficient condition monitoring has the potential to reduce O&M costs. In the analysis of the cost effectiveness of condition monitoring, cost and operational data are crucial. Regrettably, wind farm operational data are generally kept confidential by manufacturers and wind farm operators, especially for offshore farms. To facilitate progress, this thesis has investigated accessible SCADA and failure data from a large onshore wind farm and created a series of indirect analysis methods to overcome the data shortage, including an onshore/offshore failure rate translator and a series of methods to distinguish yawing errors from wind turbine nacelle direction sensor errors. Wind turbine component reliability has been investigated using this innovative component failure rate translation from onshore to offshore, and the translation technique has been applied to Failure Mode and Effect Analysis for offshore wind. An existing O&M cost model has been further developed and then compared to other available cost models. It is demonstrated that the improvements made to the model (including the data translation approach) have improved its applicability and reliability. The extended cost model (called StraPCost+) has been used to establish a relationship between the effectiveness of reactive and condition-based maintenance strategies. The benchmarked cost model has then been applied to assess the O&M cost effectiveness of three offshore wind farms at different operational phases. Apart from the innovative methodologies developed, this thesis also provides detailed background and understanding of the state of the art in offshore wind and condition monitoring technology. The cost-model methodology developed in this thesis is presented in detail and compared with other cost models in both commercial and research domains.
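
    The onshore/offshore failure rate translation mentioned above can be pictured, in its simplest form, as scaling each component's onshore failure rate by an environment-dependent factor. The sketch below is only a hedged illustration of that idea with made-up numbers; the thesis' actual translator and the StraPCost+ model are considerably more detailed.

```python
# Illustrative sketch (hypothetical factors and rates, not values from the thesis):
# translate onshore component failure rates to offshore estimates by applying
# per-component environmental scaling factors.
onshore_failures_per_turbine_year = {   # illustrative onshore rates
    "gearbox": 0.10,
    "generator": 0.12,
    "pitch_system": 0.25,
}
offshore_scaling_factor = {             # hypothetical marine-environment factors
    "gearbox": 1.5,
    "generator": 1.4,
    "pitch_system": 1.2,
}

def translate_to_offshore(onshore_rates, scaling):
    """Scale each onshore failure rate by its environmental factor (default 1.0)."""
    return {c: r * scaling.get(c, 1.0) for c, r in onshore_rates.items()}

offshore_rates = translate_to_offshore(onshore_failures_per_turbine_year,
                                       offshore_scaling_factor)
for component, rate in offshore_rates.items():
    print(f"{component}: {rate:.2f} failures per turbine-year (offshore estimate)")
```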

    Wireless Sensors and Actuators for Structural Health Monitoring of Fiber Composite Materials

    Get PDF
    This work evaluates and investigates the wireless generation and detection of Lamb waves on fiber-reinforced materials using surface-applied or embedded piezo elements. The general target is to achieve wireless systems or sensor networks for Structural Health Monitoring (SHM), a type of Non-Destructive Evaluation (NDE). In this sense, a fully wireless measurement system that achieves power transmission using inductive coils is reported. This system allows a reduction of total system weight as well as better integration in the structure. A great concern is the characteristics of the material in which the system is integrated, because these properties can have a direct impact on the strength of the magnetic field. Carbon-Fiber-Reinforced Polymer (CFRP) is known to behave as an electrical conductor, shielding radio waves with increasingly severe effects at higher frequencies. Due to the need for high power and voltage, there is interest in evaluating the operation of piezo elements as actuators in the lower frequency ranges. To this end, actuation occurs in the Industrial, Scientific and Medical (ISM) band at 125 kHz, i.e. the low-frequency (LF) range. The feasibility of such a system is evaluated extensively in this work. Direct excitation is done by combining the actuator, bonded to the surface or embedded in the material, with an inductive LF coil and setting the circuit in resonance. A more controlled possibility, also explored, is the use of electronics to generate a Hanning-windowed sine to excite the PWAS in a narrow spectrum. In this case, only wireless power is transmitted to the actuator node, which then uses a piezo driver to independently excite Lamb waves. Sensing and data transfer, on the other hand, are done using the high-frequency (HF) band at 13.56 MHz. The HF range covers the requirements of a faster sampling rate and lower energy content. A re-tuning of the antenna coils is performed to obtain better transmission qualities when the system is implemented in CFRP. Several quasi-isotropic (QI) CFRP plates with sensor and actuator nodes were made to measure the quality of transmission and the energy necessary to stimulate the actuator-sensor system. In order to produce baselines, measurements are taken from a healthy plate under specific temperature and humidity conditions. The signals are evaluated to verify the functionality in the presence of defects. The measurements demonstrate that it is possible to wirelessly generate Lamb waves, while early results show the feasibility of determining the presence of structural failure. For instance, progress has been achieved in detecting the presence of a failure in the form of drilled holes introduced into the structure. This work shows a complete set of experimental results for different sensor/actuator nodes.
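
    The Hanning-windowed sine mentioned above is a standard narrow-band excitation for Lamb-wave generation with a PWAS. The sketch below generates such a tone burst; the cycle count and sampling rate are illustrative assumptions rather than values taken from this work, while the 125 kHz carrier matches the LF band discussed in the abstract.

```python
# Short sketch: a Hanning-windowed sine burst (tone burst) of the kind commonly
# used to drive a PWAS for Lamb-wave excitation. Parameter values are assumptions.
import numpy as np

def hanning_tone_burst(f_carrier_hz=125e3, n_cycles=5, fs_hz=10e6):
    """Return (t, s): n_cycles of a sine at f_carrier_hz shaped by a Hanning envelope."""
    duration = n_cycles / f_carrier_hz
    t = np.arange(0.0, duration, 1.0 / fs_hz)
    envelope = 0.5 * (1.0 - np.cos(2.0 * np.pi * t / duration))  # Hanning window
    return t, envelope * np.sin(2.0 * np.pi * f_carrier_hz * t)

t, burst = hanning_tone_burst()   # 5-cycle burst at 125 kHz, sampled at 10 MS/s
```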

    Nanoantennas for visible and infrared radiation

    Full text link
    Nanoantennas for visible and infrared radiation can strongly enhance the interaction of light with nanoscale matter by their ability to efficiently link propagating and spatially localized optical fields. This ability unlocks an enormous potential for applications ranging from nanoscale optical microscopy and spectroscopy over solar energy conversion, integrated optical nanocircuitry, opto-electronics and density-of-states engineering to ultra-sensing as well as enhancement of optical nonlinearities. Here we review the current understanding of optical antennas based on the background of both well-developed radiowave antenna engineering and the emerging field of plasmonics. In particular, we address the plasmonic behavior that emerges due to the very high optical frequencies involved and the limitations in the choice of antenna materials and geometrical parameters imposed by nanofabrication. Finally, we give a brief account of the current status of the field and the major established and emerging lines of investigation in this vivid area of research. Comment: Review article with 76 pages, 21 figures