
    On the Development of Information Theoretic Model Selection Criteria for the Analysis of Experimental Data (Complexity: Frontiers in Data-Driven Methods for Understanding, Prediction, and Control of Complex Systems, 2022)

    It can be argued that the identification of sound mathematical models is the ultimate goal of any scientific endeavour. On the other hand, particularly in the investigation of complex systems and nonlinear phenomena, discriminating between alternative models can be a very challenging task. Quite sophisticated model selection criteria are available, but their deployment in practice can be problematic. In this work, the Akaike Information Criterion is reformulated with the help of purely information-theoretic quantities, namely the Gibbs-Shannon entropy and the Mutual Information. Systematic numerical tests have demonstrated the improved performance of the proposed upgrades, including increased robustness against noise and the presence of outliers. The same modifications can also be implemented to rewrite Bayesian statistical criteria, such as the Schwarz indicator, in terms of information-theoretic quantities, proving the generality of the approach and the validity of the underlying assumptions.
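
    As a point of reference for the criteria being reformulated, the sketch below computes the standard AIC and the Schwarz (Bayesian) criterion for two candidate regression models on synthetic data. It is a minimal illustration of the textbook definitions only, not of the information-theoretic upgrades proposed in the paper; all data and model choices in the snippet are illustrative.

```python
import numpy as np

def aic(log_likelihood: float, n_params: int) -> float:
    """Standard Akaike Information Criterion: AIC = 2k - 2 ln L_hat."""
    return 2 * n_params - 2 * log_likelihood

def bic(log_likelihood: float, n_params: int, n_samples: int) -> float:
    """Schwarz (Bayesian) Information Criterion: BIC = k ln n - 2 ln L_hat."""
    return n_params * np.log(n_samples) - 2 * log_likelihood

# Toy comparison of two polynomial models fitted to synthetic noisy data.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)

for degree in (1, 3):                       # candidate model complexities
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    sigma2 = np.mean(residuals ** 2)        # maximum-likelihood noise variance
    log_l = -0.5 * x.size * (np.log(2 * np.pi * sigma2) + 1)
    k = degree + 2                          # polynomial coefficients + noise variance
    print(f"degree {degree}: AIC = {aic(log_l, k):.1f}, BIC = {bic(log_l, k, x.size):.1f}")
```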

    Image-based methods to investigate synchronization between time series relevant for plasma fusion diagnostics

    Advanced time series analysis and causality detection techniques have been successfully applied to the assessment of synchronization experiments in tokamaks, such as Edge Localized Mode (ELM) and sawtooth pacing. Lag synchronization is a typical strategy for fusion plasma instability control by pace-making techniques. The major difficulty in evaluating the efficiency of the pacing methods is the coexistence of the causal effects with the periodic or quasi-periodic nature of the plasma instabilities. In the present work, a set of methods based on the image representation of time series is investigated as a tool for evaluating the efficiency of the pace-making techniques. The main options rely on the Gramian Angular Field (GAF) and the Markov Transition Field (MTF), previously used for time series classification, and on the Chaos Game Representation (CGR), employed for the visualization of large collections of long time series. The paper proposes an original variation of the Markov Transition Matrix, defined for a pair of time series. Additionally, a recently proposed method, based on the mapping of time series as cross-visibility networks and their representation as images, is included in this study. The performance of the methods is evaluated on synthetic data, and the techniques are then applied to JET measurements.
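
    The Gramian Angular Field used here is a standard construction from the time-series-imaging literature; the sketch below shows a minimal version of it (series rescaled to [-1, 1], angular encoding, summation field). The coupled Markov Transition Matrix variant proposed in the paper is not reproduced, and the example signal is purely illustrative.

```python
import numpy as np

def gramian_angular_field(x: np.ndarray) -> np.ndarray:
    """Gramian Angular Summation Field of a one-dimensional time series."""
    # Rescale the series to [-1, 1] so that the arccos encoding is well defined.
    x_scaled = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x_scaled, -1.0, 1.0))     # polar-coordinate angle
    # GASF[i, j] = cos(phi_i + phi_j): the series becomes a square image.
    return np.cos(phi[:, None] + phi[None, :])

# Example: image representation of a noisy periodic signal.
t = np.linspace(0, 4 * np.pi, 256)
series = np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
image = gramian_angular_field(series)       # 256 x 256 matrix, plottable with imshow
```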

    Considerations on Stellarator's Optimization from the Perspective of the Energy Confinement Time Scaling Laws

    The Stellarator is a magnetic configuration considered a realistic candidate for a future commercial thermonuclear fusion reactor. The most widely accepted scaling law of the energy confinement time for the Stellarator is ISS04, which employs a renormalisation factor, f_ren, specific to each device and to each level of optimisation of the individual machines. The f_ren coefficient is believed to account for higher-order effects not ascribable to variations in the 0D quantities, the only ones included in the database used to derive ISS04, the International Stellarator Confinement database. This hypothesis is put to the test with symbolic regression, which allows relaxing the assumption that the scaling laws must be in power-law monomial form. Specific and more general scaling laws for the different magnetic configurations have been identified; they perform better than ISS04, even without relying on any renormalisation factor. The proposed new scalings typically present a coefficient of determination R^2 around 0.9, indicating that they exploit essentially all the information included in the database. More importantly, the different optimisation levels are correctly reproduced and can be traced back to variations in the 0D quantities. These results indicate that f_ren is not indispensable to interpret the data, because the different levels of optimisation leave clear signatures in the 0D quantities. Moreover, the main mechanism dominating transport in reasonably optimised configurations is expected to be turbulence, as confirmed by a comparative analysis of the Tokamak in L mode, which shows very similar values of the energy confinement time. Since they do not resort to any renormalisation factor, the new scaling laws can also be extrapolated to the parameter regions of the most important reactor designs available.
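
    Scaling laws in power-law monomial form, such as ISS04, can be fitted by ordinary least squares in log-log space; symbolic regression relaxes exactly this restriction on the functional form. The sketch below shows only the baseline log-log fit on synthetic 0D data, with illustrative variable names and exponents that are not those of ISS04 or of the new scalings.

```python
import numpy as np

# Synthetic 0D database: tau_E ~ C * B^a1 * n^a2 * P^a3 (purely illustrative values).
rng = np.random.default_rng(2)
n_shots = 500
B = rng.uniform(1.0, 3.0, n_shots)          # magnetic field [T]
n_e = rng.uniform(1.0, 10.0, n_shots)       # line-averaged density [1e19 m^-3]
P = rng.uniform(0.5, 5.0, n_shots)          # heating power [MW]
tau = 0.1 * B**0.8 * n_e**0.5 * P**-0.6 * rng.lognormal(sigma=0.05, size=n_shots)

# Power-law monomial fit: log tau = log C + a1 log B + a2 log n_e + a3 log P.
X = np.column_stack([np.ones(n_shots), np.log(B), np.log(n_e), np.log(P)])
coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
pred = X @ coef
ss_res = np.sum((np.log(tau) - pred) ** 2)
ss_tot = np.sum((np.log(tau) - np.log(tau).mean()) ** 2)
print("fitted exponents:", coef[1:], " R^2 =", 1 - ss_res / ss_tot)
```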

    Alternative Detection of n = 1 Modes Slowing Down on ASDEX Upgrade

    Disruptions in tokamaks are very often associated with the slowing down of magneto-hydrodynamic (MHD) instabilities and their subsequent locking to the wall. To improve the understanding of the chain of events ending with a disruption, a statistically robust and physically based criterion has been devised to track the slowing down of modes with toroidal mode number n = 1 and mostly poloidal mode number m = 2, providing an alternative and earlier detection tool compared to simple threshold-based indicators. A database of 370 discharges of the axially symmetric divertor experiment upgrade (AUG) has been studied and the results compared with other indicators used in real time. The estimator is based on a weighted average value of the fast Fourier transform of the perturbed radial n = 1 magnetic field caused by the rotation of the modes. The use of a carrier sinusoidal wave helps to alleviate the spurious influence of non-sinusoidal magnetic perturbations induced by other instabilities such as Edge Localized Modes (ELMs). The indicator constitutes a good candidate for further studies, including machine learning approaches for mitigation and avoidance, since, by deploying it systematically to evaluate the time instant of the expected locking, multi-machine databases can be populated. Furthermore, it can be thought of as a contribution to a wider approach to dynamically tracking the chain of events leading to disruptions.
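
    The estimator is a weighted average over the fast Fourier transform of the n = 1 magnetic signal; the exact weighting, carrier-wave treatment and thresholds used on AUG are not given here. The sketch below only illustrates the generic idea of tracking an amplitude-weighted mean frequency on a synthetic slowing-down signal, with an illustrative locking threshold.

```python
import numpy as np

def spectral_centroid(segment: np.ndarray, fs: float) -> float:
    """Amplitude-weighted mean frequency of a signal segment (FFT-based)."""
    spec = np.abs(np.fft.rfft(segment * np.hanning(segment.size)))
    freqs = np.fft.rfftfreq(segment.size, d=1.0 / fs)
    return float(np.sum(freqs * spec) / np.sum(spec))

# Synthetic "slowing-down mode": rotation frequency decaying from 5 kHz towards zero.
fs = 100_000.0
t = np.arange(0, 0.5, 1.0 / fs)
f_mode = 5_000.0 * np.exp(-t / 0.2)
signal = np.sin(2 * np.pi * np.cumsum(f_mode) / fs)

# Track the centroid in sliding windows; a sustained drop flags the slowing down.
window = 2048
for start in range(0, signal.size - window, window):
    f_hat = spectral_centroid(signal[start:start + window], fs)
    if f_hat < 500.0:                        # illustrative locking threshold [Hz]
        print(f"possible locking onset near t = {start / fs:.3f} s")
        break
```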

    Effects of environmental conditions on COVID-19 morbidity as an example of multicausality: a multi-city case study in Italy

    The coronavirus disease 2019 (COVID-19), caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), broke out in December 2019 in Wuhan city, in the Hubei province of China. Since then, it has spread practically all over the world, disrupting many human activities. In temperate climates, overwhelming evidence indicates that its incidence increases significantly during the cold season. Italy was one of the first nations in which COVID-19 reached epidemic proportions, already at the beginning of 2020. There are therefore enough data to perform a systematic investigation of the correlation between the spread of the virus and the environmental conditions. The objective of this study is the investigation of the relationship between the virus diffusion and the weather, including temperature, wind, humidity and air quality, before the rollout of any vaccine and including rapid variations of the pollutants (not only their long-term effects, as reported in the literature). Regarding the methodology, given the complexity of the problem and the sparse data, robust statistical tools based on ranking (Spearman and Kendall correlation coefficients) and innovative dynamical system analysis techniques (recurrence plots) have been deployed to disentangle the different influences. In terms of results, the evidence indicates that, even if temperature plays a fundamental role, the morbidity of COVID-19 depends also on other factors. At the aggregate level of major cities, air pollution and the environmental quantities affecting it, particularly the wind intensity, have a non-negligible effect. This evidence should motivate a rethinking of the public policies related to the containment of this type of airborne infectious disease, particularly information gathering and traffic management.
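
    The ranking-based tools mentioned above are the Spearman and Kendall coefficients; a minimal sketch of their use on synthetic daily series is given below. The variable names, the synthetic relationship between temperature and cases, and the effect size are illustrative only.

```python
import numpy as np
from scipy.stats import spearmanr, kendalltau

# Illustrative synthetic daily series: morbidity loosely anti-correlated with temperature.
rng = np.random.default_rng(3)
days = np.arange(300)
temperature = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, days.size)
daily_cases = np.maximum(0, 500 - 20 * temperature + rng.normal(0, 50, days.size))

rho, p_rho = spearmanr(temperature, daily_cases)    # rank correlation, robust to outliers
tau, p_tau = kendalltau(temperature, daily_cases)   # concordant/discordant pair statistic
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.1e}), Kendall tau = {tau:.2f} (p = {p_tau:.1e})")
```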

    An Unsupervised Spectrogram Cross-Correlation Method to Assess ELM Triggering Efficiency by Pellets

    The high confinement mode (H-mode) is considered the optimal regime for the production of energy through nuclear fusion for industrial purposes, since it increases the energy confinement time of the plasma roughly by a factor of two. Consequently, it has currently been selected as the standard scenario for the next generation of devices, such as ITER. However, pressure-driven edge instabilities, known as edge localized modes (ELMs), are a distinct feature of this plasma regime. Their extrapolated thermal and particle peak loads on the plasma-facing components (PFCs) of the next generation of devices are expected to be so high as to damage such structures, compromising the normal operation of the reactors themselves. Consequently, the induced loads have to be controlled, which can be achieved by mitigating ELMs. One possibility lies in increasing the ELM frequency to lower the loads on the PFCs. As already demonstrated at JET, pellet pacing of ELMs is considered one of the most promising techniques for this purpose, and its optimization is therefore of great interest for present and future operations of nuclear fusion facilities. In this work, we propose a method to extract the primary pieces of information needed to perform statistics and to assess and characterize the pacing efficiency. The method, tested on JET data, is based on the clustering (k-means) of convolved signals, using so-called spectrogram cross-correlation between the measured pellet and ELM time traces. Results have also been obtained by taking advantage of a new type of diagnostic for measuring the ELM dynamics, based on synthetic diamond sensors, faster than the standard spectroscopic cameras used at JET.
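
    The sketch below assembles the generic building blocks named in the abstract (spectrograms of the two time traces, a cross-correlation feature per pellet, k-means clustering) on synthetic pellet and ELM traces. It is an illustration of the idea under stated assumptions, not the authors' actual pipeline, and all signal shapes, window lengths and feature choices are illustrative.

```python
import numpy as np
from scipy.signal import spectrogram, correlate
from sklearn.cluster import KMeans

# Illustrative synthetic traces: pellet ablation spikes and ELM bursts, where only
# every other pellet is followed by an ELM after a short delay.
fs = 10_000.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(4)
pellet_times = np.arange(0.1, 2.0, 0.1)
elm_times = pellet_times[::2] + 2e-3
pellets = sum(np.exp(-((t - tp) / 1e-3) ** 2) for tp in pellet_times)
elms = sum(np.exp(-((t - te) / 1e-3) ** 2) for te in elm_times)
elms = elms + 0.01 * rng.normal(size=t.size)

# Spectrograms of the two traces on a common time grid.
_, tt, S_p = spectrogram(pellets, fs=fs, nperseg=256, noverlap=192)
_, _, S_e = spectrogram(elms, fs=fs, nperseg=256, noverlap=192)
env_p, env_e = S_p.sum(axis=0), S_e.sum(axis=0)     # spectral power envelopes

# One feature vector per pellet: peak value and lag of the cross-correlation of the
# two envelopes in a short window following the pellet.
features = []
for tp in pellet_times:
    idx = np.searchsorted(tt, tp)
    xc = correlate(env_e[idx:idx + 10], env_p[idx:idx + 10], mode="full")
    features.append([xc.max(), int(np.argmax(xc))])

# Two clusters: pellets that triggered an ELM versus pellets that did not.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(np.array(features))
print(labels)
```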

    Dust tracking techniques applied to the STARDUST facility: First results

    An important issue related to future nuclear fusion reactors fuelled with deuterium and tritium is the creation of large amounts of dust due to several mechanisms (disruptions, ELMs and VDEs). The dust size expected in nuclear fusion experiments (such as ITER) is of the order of microns (between 0.1 and 1000 μm). Almost all of this dust remains in the vacuum vessel (VV). This radiological dust can re-suspend in the case of a LOVA (loss of vacuum accident), and these phenomena can cause explosions and serious damage to the health of the operators and to the integrity of the device. The authors have developed a facility, STARDUST, in order to reproduce thermofluid-dynamic conditions comparable to those expected inside the VV of the next generation of experiments, such as ITER, in case of a LOVA. The dust used inside the STARDUST facility presents particle sizes and physical characteristics comparable with those created inside the VV of nuclear fusion experiments. In this facility an experimental campaign has been conducted with the purpose of tracking the dust re-suspended at low pressurization rates (comparable to those expected in case of a LOVA in ITER and suggested by the General Safety and Security Report ITER-GSSR) using a fast camera with a frame rate from 1000 to 10,000 images per second. The velocity fields of the mobilized dust are derived from the imaging of a two-dimensional slice of the flow illuminated by an optically adapted laser beam. The aim of this work is to demonstrate the possibility of dust tracking by means of image processing, with the objective of determining the velocity field values of the dust re-suspended during a LOVA.
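
    Deriving velocity fields from pairs of fast-camera frames is commonly done with a cross-correlation step, as in particle image velocimetry; the sketch below shows that step for a single interrogation window on a synthetic frame pair. It is a generic illustration, not the actual STARDUST processing chain, and the frame interval and pixel size are placeholder values.

```python
import numpy as np
from scipy.signal import fftconvolve

def window_displacement(win_a: np.ndarray, win_b: np.ndarray):
    """Pixel displacement between two interrogation windows via cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="full")   # 2D cross-correlation
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Shift measured relative to the zero-displacement position in the correlation map.
    return dy - (win_a.shape[0] - 1), dx - (win_a.shape[1] - 1)

# Illustrative frame pair: a bright "dust" blob displaced by (3, 5) pixels.
rng = np.random.default_rng(5)
frame1 = 0.1 * rng.random((64, 64))
frame1[20:30, 20:30] += 1.0
frame2 = np.roll(np.roll(frame1, 3, axis=0), 5, axis=1)

dy, dx = window_displacement(frame1, frame2)
dt, pixel_size = 1e-3, 1e-4      # frame interval [s] and pixel size [m], placeholder values
print(f"displacement = ({dy}, {dx}) px, velocity = ({dx * pixel_size / dt:.2f}, "
      f"{dy * pixel_size / dt:.2f}) m/s")
```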

    Design, construction and first field tests of an Nd:YAG Lidar system for the early warning of forest fires

    Forest fire is a serious and difficult problem that unfortunately affects a considerable part of our planet: in the last decade alone, and in Italy alone, about 500 thousand hectares of forest were lost. Fires cause damage to the environment and to people, both directly and indirectly. The direct damage is easy to assess, even visually, and consists in the loss of woody biomass. The indirect damage, harder to estimate, is connected to the "priceless" functions of the forest, namely hydrogeological protection, oxygen production and the preservation of the shape and vegetation characteristics of the territory. The importance of these factors highlights the need for methods and technologies to successfully counter forest fires, which necessarily means complementing the systems already deployed around the world with new apparatus offering better promptness and reliability. One of the critical points of the systems currently in operation is the delay between the ignition of the fire and the moment when the competent operations centres obtain sufficient information to deploy the appropriate countermeasures. If it were possible to intervene in the early stages of a fire, the damage would be considerably reduced and the resources needed to extinguish it significantly spared. It is easy to see how remote sensing can make a valuable contribution to solving this problem. In particular, active remote sensing based on Lidar/Dial techniques is one of the tools best suited to regularly monitoring a forest fire, thanks to the possibility of operating from a fixed point over a wide area, without requiring the operator to physically access the site under examination. For these reasons, a mobile Lidar/Dial system for the early detection of forest fires and for the reduction of false alarms has been designed, built and commissioned. The Lidar system presented in this work was entirely designed and assembled at the Quantum Electronics and Plasmas research laboratory of the University of Rome "Tor Vergata", in collaboration with the laboratories of Crati s.c.r.l of Lamezia Terme, within the SAI (Sistema Allerta Incendi) industrial research contract funded by Miur (n. 7979/DSPAR/2002-DM 593/2000 art. 5). The system is based mainly on two structural components: a Lidar for the field measurements and software, developed ad hoc, for the minimization of false alarms. The first component is the heart of the system, since it emits a laser beam which, by interacting with the combustion products released into the atmosphere during a forest fire, detects their presence. The second component operates to minimize the false alarms due to smoke that does not originate from forest fires (industrial fumes, atmospheric particulate raised by strong wind, smoke from "safe" agricultural practices, fog, etc.) and to reduce the time the system needs to process the Lidar signals. The construction of the final demonstrator was possible only after a careful study of the optical properties of the products emitted during the combustion of vegetable material and of how they interact with the laser beam. For this reason, two laser sources were initially tested, one operating in the IR and the other in the UV, in order to establish which was the better one for our purposes. Measurements of the smoke backscattering coefficient were therefore carried out in a cell, using a CO2 laser source and an Nd:YAG laser operating at the three wavelengths 1064 nm, 532 nm and 355 nm. The results showed that the solid-state laser is the more suitable for forest fire detection, since the values obtained for the smoke backscattering coefficients are higher (by about two orders of magnitude), thus ensuring a greater sensitivity of the system. Once the laser source had been chosen, the work focused on giving the system the features essential for its deployment, such as electrical autonomy, remote management and transportability. The system was therefore installed on a mobile support, easily transportable by any car, and equipped with all the instrumentation necessary for its power supply. As regards remote control, software was developed that allows all the instrumentation making up the forest fire detection system (including the support systems) to be managed from a control room. Finally, the program for peak recognition and false alarm minimization was implemented. Several measurement campaigns were carried out to test the system after each critical design phase; this approach to commissioning allowed the system to be optimized step by step.

    The first chapter gives a brief introduction to the theoretical aspects of the Lidar/Dial technique and illustrates the optical properties of the products emitted during the combustion of vegetable material. The following chapter first presents the mathematical model developed to study the behaviour of the combustion products in a confined environment (cell), and then shows the preliminary measurements of the significant optical parameters of some vegetable materials typical of the areas and forests of Calabria, performed in a purpose-built cell; the experimental results are then compared with those obtained from the simulations. The third chapter is entirely devoted to the design and construction of the SAI demonstrator and presents the results of the first field tests, highlighting the problems encountered. These were solved by making some structural modifications to the system, which are discussed in the fourth chapter together with the results obtained in the second measurement campaign. The fifth chapter shows the remote operation of the system, tested by performing measurements of the planetary boundary layer (PBL) over twenty-four hours. Finally, the sixth chapter describes in detail the operation of the software for recognizing smoke peaks and minimizing false alarms, and presents the last measurement campaign carried out.

    Forest fires can cause serious environmental and economic damage. For this reason, considerable effort has been directed toward forest protection and fire fighting. Lidar and Dial techniques in laser remote sensing are two methodologies that allow the atmosphere to be probed. They are often used to acquire the information necessary to create and/or validate models relevant to different topics of atmospheric physics. Furthermore, they can also be employed in environmental monitoring of particulates and gases. In recent years, experimental and theoretical investigations have shown that lidar is a powerful tool for detecting the tenuous smoke plumes produced by forest fires at an early stage, but the problem of false alarms has not yet been solved. In the present work, a technique has been developed to minimize false alarms in lidar forest fire detection, based on the measurement of the secondary components emitted in a combustion process. Usually, to detect a fire, a rapid increase in the aerosol amount is measured: if the backscattering signal shows a peak, the presence of a forest fire is probable. Our idea is to confirm this hypothesis by measuring the secondary components emitted in a forest fire, with the aim of minimizing false alarms. As is well known, the combustion of vegetable fuel releases a large amount of water vapour into the atmosphere; if an increase in the humidity concentration level is measured, a second fire warning is obtained. To develop our system, measurements of the smoke backscattering coefficients for several kinds of vegetable fuel were carried out using a CO2 laser source and a Q-switched Nd:YAG laser. Thanks to this work, it was possible to decide which laser best suited our aim. The experimental results showed that the smoke backscattering coefficients measured with the Nd:YAG are larger than those obtained with the CO2 laser, guaranteeing a better efficiency of the system. Another reason for choosing the Nd:YAG is that a solid-state laser is more compact than a gas laser and therefore more easily transportable. The next step of our work focused on making the system easily transportable and self-powered: all the instrumentation was installed on a mobile system equipped with a power generator and an uninterruptible power supply (UPS). Moreover, software was developed to allow remote control of the lidar system, so that whether the equipment is working correctly can be checked from a control room not necessarily located near the surveillance zone. The designed system was tested during an experimental campaign carried out in the Parco Nazionale dell'Abruzzo, in the locality of "Valle di Canneto". These measurements were very important for understanding the problems shown by the system during operation and how to solve them. After this campaign, we improved the system with hardware and software modifications; in particular, the program for recognizing the smoke plume due to the combustion of vegetable matter was developed. New measurement campaigns were carried out in the same place in order to optimize our lidar system. The results obtained during these campaigns showed that the system is able to localize a smoke plume, recognizing whether a peak in the backscattered lidar signal is due to the combustion of vegetable fuel or to a fixed target. Moreover, with our system it was possible to evaluate the increase in atmospheric water vapour concentration connected to the forest fire, using the Raman technique. This thesis presents the experimental set-up of the lidar system and the results obtained during the several measurement campaigns carried out to assess its capability of detecting forest fire events while minimizing the occurrence of false alarms.
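
    The first stage of the false-alarm-minimization software described above is the recognition of smoke peaks in the backscattered signal; the sketch below illustrates that stage only (peak search on a range-corrected synthetic return), under the assumption of a simple smoothed-baseline subtraction. It is not the SAI code, and the plume position, noise level and prominence threshold are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks

# Illustrative synthetic elastic-backscatter return: clear-air decay with range plus a
# localized smoke-plume enhancement around 1.5 km, both attenuated by the 1/r^2 factor.
rng = np.random.default_rng(6)
r = np.linspace(100.0, 5000.0, 2000)                   # range [m]
clear_air = np.exp(-r / 2000.0) / r**2
plume = 0.5 * np.exp(-((r - 1500.0) / 50.0) ** 2) / r**2
signal = (clear_air + plume) * (1 + 0.02 * rng.normal(size=r.size))

# Range-corrected signal: remove the geometric 1/r^2 factor, subtract a smooth baseline,
# then look for localized enhancements.
rcs = signal * r**2
baseline = np.convolve(rcs, np.ones(101) / 101, mode="same")
peaks, props = find_peaks(rcs - baseline, prominence=0.2)

for idx, p in enumerate(peaks):
    print(f"candidate smoke plume at {r[p]:.0f} m (prominence {props['prominences'][idx]:.2f})")
# In the full system a candidate would raise an alarm only if the Raman channel also
# shows an increase in water-vapour concentration at the same range.
```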

    Improving Entropy Estimates of Complex Network Topology for the Characterization of Coupling in Dynamical Systems

    A new measure for the characterization of the coupling of interconnected dynamical systems is proposed. The method is based on the representation of time series as weighted cross-visibility networks. The weights are introduced as the metric distance between connected nodes. The structure of the networks, which depends on the coupling strength, is quantified via the entropy of the weighted adjacency matrix. The method has been tested on several coupled model systems with different individual properties. The results show that the proposed measure is able to distinguish the degree of coupling of the studied dynamical systems. The original use of the geodesic distance on Gaussian manifolds as a metric distance, which is able to take into account the noise inherently superimposed on the experimental data, provides significantly better results in the calculation of the entropy, improving the reliability of the coupling estimates. The application to the interaction between the El Niño Southern Oscillation (ENSO) and the Indian Ocean Dipole, and to the influence of ENSO on influenza pandemic occurrence, illustrates the potential of the method for real-life problems.
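
    A minimal sketch of the underlying ingredients is given below: the natural visibility graph of a single series, edge weights taken as the plain Euclidean distance between connected samples, and the Shannon entropy of the normalized weights. The paper's cross-visibility construction for pairs of series and its geodesic distance on Gaussian manifolds are not reproduced, and the example signals are illustrative.

```python
import numpy as np

def visibility_edges(y: np.ndarray):
    """Natural visibility graph: node i is linked to node j if the straight line between
    (i, y[i]) and (j, y[j]) stays above every intermediate sample."""
    edges = []
    for i in range(y.size - 1):
        for j in range(i + 1, y.size):
            if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i) for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

def weighted_adjacency_entropy(y: np.ndarray) -> float:
    """Shannon entropy of the edge weights (Euclidean distance between connected samples)."""
    edges = visibility_edges(y)
    w = np.array([np.hypot(j - i, y[j] - y[i]) for i, j in edges])
    p = w / w.sum()                          # treat normalized weights as a distribution
    return float(-np.sum(p * np.log(p)))

# Example: a strongly structured signal versus pure noise gives different entropies.
t = np.linspace(0, 10, 300)
rng = np.random.default_rng(7)
print(weighted_adjacency_entropy(np.sin(t) + 0.1 * rng.normal(size=t.size)))
print(weighted_adjacency_entropy(rng.normal(size=t.size)))
```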

    Development of robust indicators for the identification of electron temperature profile anomalies and application to JET

    Recent experience with metallic devices operating in ITER-relevant regions of the operational space has shown that the disruptivity of these plasmas is unacceptably high. The main causes of the disruptions are linked to impurity accumulation in the core and edge cooling, resulting in unstable current profiles. Avoidance and prevention of the consequent instabilities require the early detection of anomalous electron temperature profiles. A series of indicators has been developed and their performances compared, to find the most suitable inputs for disruption predictors. Their properties are assessed on the basis of information content, reliability and real-time availability. The best performing ones provide much better results than those reported in the literature, as shown both by numerical tests with synthetic data and by the analysis of experimental signals from JET with the ITER-like wall. They provide better accuracy, fewer false alarms and earlier detection. The improved discriminatory capability of the developed indicators is expected to significantly improve the performance of the most advanced predictors recently reported in the literature.
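
    The indicators themselves are not specified in this summary; the sketch below is only a generic illustration of how one such indicator might be built and calibrated (a core-versus-edge temperature ratio scored against a reference distribution of healthy profiles). All profile shapes, regions and thresholds are hypothetical.

```python
import numpy as np

def hollowness_indicator(rho: np.ndarray, te: np.ndarray) -> float:
    """Illustrative anomaly indicator: mean edge-region Te over mean core-region Te.
    A radiatively cooled, hollow core pushes the ratio upwards."""
    core = te[rho < 0.3].mean()
    edge = te[(rho > 0.6) & (rho < 0.9)].mean()
    return edge / core

rho = np.linspace(0, 1, 50)
rng = np.random.default_rng(8)

# Reference distribution of the indicator from synthetic "healthy" peaked profiles.
healthy = [hollowness_indicator(rho, 5.0 * (1 - rho**2) ** p)
           for p in rng.uniform(1.0, 2.0, 200)]
mu, sigma = np.mean(healthy), np.std(healthy)

# An anomalous, centrally hollow profile scores far outside the reference spread.
te_anomalous = 5.0 * (1 - rho**2) ** 1.5 * (0.6 + 0.8 * rho**2)
z_score = (hollowness_indicator(rho, te_anomalous) - mu) / sigma
print(f"anomaly z-score = {z_score:.1f}")    # flag profiles above some chosen threshold
```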