20 research outputs found

    The Numerical Sausage

    Full text link
The renormalization group equation describing the evolution of the metric of non-linear sigma models poses some nice mathematical problems involving functional analysis, differential geometry and numerical analysis. We describe the techniques which allow a numerical study of the solutions in the case of a two-dimensional target space (deformation of the O(3) σ-model). Our analysis shows that the so-called sausages define an attracting manifold in the U(1)-symmetric case, at the one-loop level. The paper describes i) the known analytical solutions, ii) the spectral method which realizes the numerical integrator and allows us to estimate the spectrum of zero-modes, iii) the solution of the variational equations around the solutions, and finally iv) the algorithms which reconstruct the surface as embedded in R^3.
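At one loop, the flow of the target-space metric is, up to normalization, a Ricci flow, and for a conformally flat parametrization it reduces to a single scalar equation that a spectral integrator of the kind described in ii) can evolve. The sketch below is a minimal illustration under assumed conventions (doubly periodic domain, explicit Euler stepping); it is not the paper's integrator.

```python
import numpy as np

# At one loop, the RG flow of the 2D target-space metric is, up to
# normalization, a Ricci flow.  For a conformally flat metric
# g = exp(2*u) * (dx^2 + dy^2) on a doubly periodic domain (an assumption
# made here purely for simplicity), the flow reduces to
#     du/dt = exp(-2*u) * Laplacian(u).
# Minimal pseudo-spectral sketch; not the paper's integrator.

N, L = 32, 2 * np.pi
x = np.linspace(0.0, L, N, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)     # angular wavenumbers
KX, KY = np.meshgrid(k, k, indexing="ij")
minus_k2 = -(KX**2 + KY**2)                    # Fourier symbol of the Laplacian

def laplacian(u):
    """Spectral Laplacian on the periodic grid."""
    return np.real(np.fft.ifft2(minus_k2 * np.fft.fft2(u)))

u = 0.3 * np.cos(X) * np.cos(Y)                # smooth initial conformal factor
dt, n_steps = 5e-4, 2000
for _ in range(n_steps):                       # explicit Euler time stepping
    u = u + dt * np.exp(-2.0 * u) * laplacian(u)

# On a torus the flow drives the metric toward a flat one (u -> constant).
print("conformal factor range after flow:", float(u.min()), float(u.max()))
```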

    Deformation and stress in hydrothermal regions: The case of a disk-shaped inclusion in a half-space

    No full text
Hydrothermal regions are affected by a wide variety of phenomena, including ground inflation and deflation episodes. Among them, calderas offer the opportunity to study the complex interactions between magmatic processes at depth and permeable rocks saturated with fluids in the upper sedimentary layers. One such region is the Campi Flegrei caldera in southern Italy, where several source models have been applied over the years to reproduce the ground displacement and seismicity observed during the most recent phase of major unrest (1982–1984). The present work introduces a new source model consisting of a thermo-poro-elastic inclusion embedded in a homogeneous poroelastic half-space. The inclusion represents a permeable rock layer stressed and strained by hot, pressurized volatiles released upward by an underlying magmatic reservoir, and is modeled as a thin horizontal disk inside which a sudden change of temperature and pore pressure occurs. We provide semi-analytical solutions for the displacement and stress fields both within and outside the source and check them against those obtained through a fully numerical approach. Results from our model are compared with two other deformation source models often used to describe volcanic environments in terms of pressurized cavities: a spherical magma chamber (Mogi source) and a sill-like magma intrusion (Fialko source). For the Campi Flegrei 1982–84 unrest, our model provides a better reproduction of the ground deformation data and manages to explain the widespread presence of compressive focal mechanisms, since the stress field promoted both inside and outside the thermo-poro-elastic inclusion is very different from that of pressurized cavities
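The Mogi point source cited as one of the comparison models has a well-known closed-form surface solution; a minimal sketch follows. The thermo-poro-elastic inclusion solution itself is semi-analytical and is not reproduced here; the parameter values used below are illustrative assumptions.

```python
import numpy as np

def mogi_surface_displacement(r, depth, radius, dP, G=1e9, nu=0.25):
    """Surface displacements of a Mogi point pressure source in an elastic
    half-space (one of the comparison models mentioned above).

    r      : radial distance(s) from the source axis [m]
    depth  : source depth [m]
    radius : source radius [m] (point-source approximation, radius << depth)
    dP     : pressure change [Pa]
    G      : shear modulus [Pa] (illustrative value)
    nu     : Poisson's ratio
    Returns (u_r, u_z) in metres.
    """
    r = np.asarray(r, dtype=float)
    c = (1.0 - nu) * dP * radius**3 / G          # source strength
    R3 = (r**2 + depth**2) ** 1.5
    return c * r / R3, c * depth / R3            # radial and vertical (uplift)

# Example: uplift profile for a shallow pressurized cavity (illustrative values)
r = np.linspace(0.0, 10e3, 201)                  # 0-10 km from the source axis
u_r, u_z = mogi_surface_displacement(r, depth=3e3, radius=500.0, dP=10e6)
print(f"maximum uplift ~ {u_z.max():.3f} m, directly above the source")
```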

    Investigating the Impact of Signal-to-Noise Ratio on EEG Resting-State Source Reconstruction

    No full text
    Introduction: Identifying the sources of electroencephalography (EEG) activity is a complex problem that requires models of the head and tissues [1,2]. The effect of the Signal-to-Noise Ratio (SNR) on source localization accuracy is often evaluated considering task-evoked cortical activity [3]. However, elucidating spontaneous activation of the brain, i.e., in the absence of a stimulus or task, is not immediate, as the signal is of low amplitude and the underlying neural sources are challenging to examine [4]. For the EEG resting-state signal, the effect of SNR is therefore critical to determine as prior information. Moreover, many studies have used spherical head models to investigate the localization errors of dipoles [5]. Here, we present a simulation study to investigate the effect of different SNR values on the performance of source estimation (SNR_LOC) using Minimum Norm Estimation (MNE) [6] and a realistic head model.
    Methods: We simulated synthetic resting-state EEG signals with different known SNRs [7]. The signal was 1 min long and sampled at 256 Hz. It was generated from synthetic source time courses, using two non-linearly coupled dipolar sources located in the primary motor cortex and fifty uncorrelated noise sources randomly distributed over the whole cortex. The two non-linearly coupled sources, with quadratic nonlinearity, presented a time delay of 15 ms [8]. Using a BEM volume conductor model based on the New York Head model [9] and imposing the EEG electrode locations, the lead-field matrix for the simulated sources was computed according to [10]. The source space consisted of a cortical layer of 10016 distributed points registered to a common template. The SNR of the simulated EEG signal was set equal to [1, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]. The simulated EEG data were analyzed using Independent Component Analysis (ICA) to remove artifacts and retain ICs of brain origin. The inverse solution was obtained using MNE on the brain ICs, with different regularization parameters (defined as λ proportional to 1/SNR^2) used to balance the accuracy and smoothness of the solution. The variation of the SNR_LOC values [0.1, 1:10, 50, 100] influences the numerical solution of the inverse problem in terms of the spread and position of the source reconstruction. The performance was evaluated using three different metrics: the localization error, the source extension, and the source fragmentation. The localization error was defined as the distance between the inverse-solution peak and the true location of the generating source. The source extension was measured as the Euclidean distance between the point of the source with the highest intensity and all the points of a Region of Interest, i.e., where the inverse-problem solution is higher than 80% of the solution range. To evaluate the source fragmentation, we applied K-means clustering, with the Calinski-Harabasz criterion, to the outliers of the distances from the peak of highest intensity.
    Results: Fig. 1A shows the localization error: it decreases as SNR_LOC increases, with a 1/SNR trend. Fig. 2B shows the distribution of the median distances between the peak of the inverse solution and the true location of the generated source. As SNR_LOC increases, the sources become narrower. A repeated-measures ANOVA, with a four-level within-subject factor, indicates statistically significant differences (p<0.05) between the distributions, and a post-hoc test was carried out with Bonferroni correction. Regarding the source fragmentation (Figure 2), the number of clusters for SNR_LOC equal to 100 was significantly higher than for SNR_LOC equal to 0.1, 3, and 10, which present no statistically significant differences.
    Conclusions: The choice of SNR_LOC strongly influences the spatial resolution of the source-level analysis: an SNR_LOC value of 10 appears to be a good trade-off between the three metrics, as it provides a focused source reconstruction and ensures a low localization error
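The inverse step described in the Methods is a minimum-norm estimate whose regularization scales as 1/SNR^2. The sketch below shows a generic Tikhonov-regularized MNE under that scaling; the lead-field, the data and the exact normalization of λ are assumptions, not the study's pipeline.

```python
import numpy as np

def mne_inverse(L, x, snr):
    """Tikhonov-regularized minimum norm estimate, with regularization
    strength scaling as 1/SNR^2 (as stated in the abstract).

    L   : lead-field matrix, shape (n_channels, n_sources)
    x   : EEG (or brain-IC) data, shape (n_channels, n_samples)
    snr : assumed signal-to-noise ratio of the inverse step (SNR_LOC)
    Returns estimated source time courses, shape (n_sources, n_samples).
    """
    n_ch = L.shape[0]
    gram = L @ L.T                                    # (n_channels, n_channels)
    # Scale the identity so lambda is comparable to the Gram matrix; this
    # particular normalization is an illustrative choice, not the study's.
    lam = np.trace(gram) / n_ch / snr**2
    return L.T @ np.linalg.solve(gram + lam * np.eye(n_ch), x)

# Toy usage with random data (shapes only; not the study's head model)
rng = np.random.default_rng(0)
L = rng.standard_normal((64, 10016))   # 64 electrodes, 10016 cortical points
x = rng.standard_normal((64, 256))     # 1 s of data sampled at 256 Hz
s_hat = mne_inverse(L, x, snr=10.0)    # SNR_LOC = 10, the suggested trade-off
```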

    Analysis and optimization of clinical pathway of a cancer patient in a University Hospital

    No full text
    A clinical pathway can be defined as a macro process that covers the complete management of a health problem. It can be considered the equivalent of a diagnostic and therapeutic pathway, but the word "clinical" also includes assistance to the person in self-care and psychological and social support. Managing and organizing a clinical pathway so as to exploit all available resources as efficiently as possible should be the aim of every hospital. Simulation is a modern approach that allows the entire process to be understood, in a logical sequence, in order to identify, analyze and highlight the characteristics, advantages and problems of the specific context. Nevertheless, simulation is not effective if the process analysis and the mathematical model are overlooked. The purpose of this work is to characterize clinical pathways and to understand and optimize their weak points. The various phases of the work allowed a precise, clear and detailed analysis to be conducted, in order to develop a more efficient process. Finally, the simulation model is able to take into account the variables that could modify the efficiency of the process, and it also confirms that the proposed reorganization could be effective and sustainable before a real implementation
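The abstract does not specify the simulation tooling; purely as an illustration, a clinical pathway can be modeled as a discrete-event simulation in which patients queue for a sequence of staged resources. The stage names, service times and arrival rates below are placeholders.

```python
import random
import simpy

# Hypothetical pathway stages and mean service times in minutes; the paper
# does not specify its simulation tooling or parameters.
STAGES = [("triage", 15.0), ("imaging", 40.0), ("consultation", 30.0)]

def patient(env, resources, completed):
    """One patient flows through the pathway stages in sequence."""
    for stage, mean_t in STAGES:
        with resources[stage].request() as req:
            yield req                                    # queue for the resource
            yield env.timeout(random.expovariate(1.0 / mean_t))
    completed.append(env.now)                            # record exit time

def arrivals(env, resources, completed, mean_interarrival=20.0):
    while True:
        yield env.timeout(random.expovariate(1.0 / mean_interarrival))
        env.process(patient(env, resources, completed))

env = simpy.Environment()
resources = {name: simpy.Resource(env, capacity=1) for name, _ in STAGES}
completed = []
env.process(arrivals(env, resources, completed))
env.run(until=8 * 60)                                    # one 8-hour day
print(f"{len(completed)} patients completed the pathway")
```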

    Impact assessment and risk analysis in the redevelopment of a healthcare structure

    No full text
    Any upgrading process aims to improve and optimize the operational conditions of a work environment. It generally includes the development of a change focused on organizational, technological and structural advancements in the expected conditions of the specific fields of interest. The upgrading process must always be well planned and organized so that it is a source of growth for the structure concerned and does not become an obstacle to the daily routine of the structure itself. This aspect is even more relevant in the case of health facilities, since everyday activities must constantly be kept at a standard level and under accurate control. To meet the needs of technological and legislative progress, the change following the upgrading process must be a long-term one. However, it will represent a significant variable both for the employees' working conditions and for the quality of the health care given to the patients. In this work, an objective and complete procedure has been developed to quantify, in an impartial and univocal way, the impact that an upgrading process can have on health activities. In order to prevent and neutralize all possible risks for patients, employees and health workers, it is of the utmost importance to evaluate objectively the effects and hazards that these processes involve

    Calculation of Vascular Age: a new and concrete therapeutic target

    No full text
    Aim: Current guidelines for the management of patients with cardiovascular risk factors require the measurement of cardiovascular risk as the starting point for establishing therapeutic targets and treatment strategies. The estimate of a subject's Vascular Age, an index of how much his or her arteries have been "aged" by the presence and magnitude of the main cardiovascular risk factors, could represent a new therapeutic target for physicians dealing with cardiovascular risk. The aim of this work is to promote the concept of Vascular Age among physicians dealing with their patients' cardiovascular risk, through the development of a medical app, based on the European SCORE database, to be used to calculate both cardiovascular risk and Vascular Age in different populations of subjects.
    Methods: Software based on an algorithm for the calculation of Vascular Age was initially developed; the software was then implemented for mobile devices. Vascular Age was calculated according to the definition given by D'Agostino et al. (Circulation, 2008): the age a person would have with the same calculated cardiovascular risk but with all risk factors within normal limits. The risk factors considered were age, sex, smoking, total cholesterol and systolic blood pressure. The calculation used the equations of the SCORE study for European countries at low cardiovascular risk, such as Italy (Conroy et al., Eur Heart J, 2003; Cuende et al., Eur Heart J, 2010), which relate the presence of risk factors to the 10-year risk of fatal cardiovascular events. Following the development of the software, the interface of an app specifically aimed at physicians was created for computers, tablets and mobile phones, linked to a website providing information about the app and about Vascular Age. In parallel with the software development, data were collected from different populations of subjects with a wide geographic distribution across Italy. The inclusion criterion was an age between 40 and 65 years, because it is precisely in this age range that it is possible to intervene effectively on one's modifiable risk factors. In total, 904 subjects were evaluated (77% men and 23% women; 38% smokers), in whom both cardiovascular risk and Vascular Age were assessed using the software.
    Results: Software was developed for the calculation of the 10-year risk of fatal cardiovascular events and of Vascular Age according to the equations of the SCORE study. The mean chronological age was 54±7 years, whereas the Vascular Age was higher (60±10 years), with a mean difference between Vascular Age and chronological age of 12±10% (normalized to chronological age). These values corresponded to a mean cardiovascular risk of 2.3±2.2%.
    Conclusions: The data analyzed with the software for calculating Vascular Age show that this figure can be very different from chronological age. These data will subsequently be correlated with non-invasive instrumental measurements such as carotid intima-media thickness and arterial stiffness, which are considered early markers of vascular damage, in order to further validate the concept of Vascular Age. The dissemination of a model of this kind, no longer founded solely on the traditional risk calculation yet starting from exactly the same epidemiological and statistical basis, could lead to a substantial improvement in the treatment of cardiovascular risk factors, better motivating patients to modify their lifestyle and to adhere more consciously to therapeutic indications
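A minimal sketch of the vascular-age inversion defined above (D'Agostino et al., Circulation, 2008): the vascular age is the age at which the risk curve with all factors at normal values equals the subject's actual calculated risk. The 10-year risk function used here is a toy placeholder; the SCORE equations and coefficients are not reproduced.

```python
from typing import Callable

def vascular_age(risk_actual: float, risk_fn: Callable[[float], float],
                 lo: float = 20.0, hi: float = 100.0) -> float:
    """Vascular age per D'Agostino et al. (Circulation, 2008): the age at
    which a subject with all risk factors at normal values would have the
    same 10-year risk as the subject's actual calculated risk.

    risk_actual : 10-year fatal CV risk from the subject's real risk factors
                  (e.g., via the SCORE low-risk equations, not shown here).
    risk_fn     : 10-year risk as a function of age alone, with all other
                  risk factors set to normal values (placeholder below).
    """
    # Risk increases monotonically with age, so plain bisection suffices.
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if risk_fn(mid) < risk_actual:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy, made-up monotone risk curve (NOT the SCORE model), for illustration:
toy_risk = lambda age: 1.4e-5 * (age - 20.0) ** 2
print(vascular_age(risk_actual=0.023, risk_fn=toy_risk))   # ~60 years
```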

    Looking for a Simple Assessment Tool for a Complex Task: Short-Term Evaluation of Changes in Fisheries Management Measures in the Pomo/Jabuka Pits Area (Central Adriatic Sea)

    No full text
    A Before–Intermediate–After Multiple Sites (BIAMS) analysis, a modified version of the Before–After–Control–Impact (BACI) approach, was used to evaluate the possible effects of the fishery management measures implemented in the Pomo/Jabuka Pits area, a historically highly exploited ground for Italian and Croatian fisheries, whose impact may have contributed over the years to the modification of the ecosystem. Since 2015 the area has been subject to fishing regulations whose type and spatial extent changed over time, until the definitive establishment of a Fishery Restricted Area in 2018. These changes in the regulatory regime produce complex signals to interpret. The analysis was carried out on abundance indices (i.e., kg/km2 and N/km2) of five commercially or ecologically relevant species, obtained in the period 2012–2019 from two annual trawl surveys. BIAMS was based on a Closure factor with three levels (i.e., BEFORE/INTERMEDIATE/AFTER) accounting for the regulation changes over time, and on three adjacent strata (i.e., “A”, “B”, and “ext ITA”) determined a posteriori according to the latest regulations. BIAMS allowed us to identify early effects (i.e., changes in abundances), overcoming the unavailability of a proper independent control site; furthermore, the selection of adjacent strata allowed the inference of possible interactions among them
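The abstract does not spell out the statistical machinery behind BIAMS; purely as an illustration of the factor layout (Closure level crossed with stratum) applied to the abundance indices, a generic two-factor analysis could look like the following. Column names and the synthetic data are assumptions, not the paper's method.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey abundance indices; column names, values
# and the simple linear model are illustrative assumptions.
rng = np.random.default_rng(1)
closure_levels = ["BEFORE", "INTERMEDIATE", "AFTER"]
strata = ["A", "B", "ext ITA"]
df = pd.DataFrame(
    [(c, s, rng.lognormal(mean=3.0, sigma=0.5))          # kg/km2, synthetic
     for c in closure_levels for s in strata for _ in range(10)],
    columns=["closure", "stratum", "biomass_kg_km2"],
)

# Two-factor layout: Closure level (BEFORE/INTERMEDIATE/AFTER) crossed with
# the three adjacent strata, on the biomass index.
model = smf.ols("biomass_kg_km2 ~ C(closure) * C(stratum)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```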
