218 research outputs found
A new algorithm for estimating pedestrian flows during massive touristic events, optimized for an existing camera setup
In this thesis I present a new video-analysis algorithm that computes the flow of people crossing a passage even under unfavourable camera conditions. The work focused on a series of video sequences previously extracted from a security camera pointed at the Ponte della Costituzione in Venice, with the goal of estimating the pedestrian flow across the bridge. The poor quality of the videos, due to low resolution, and the suboptimal placement of the camera, which produces numerous occlusions, cause many existing computer-vision techniques to fail, so a new solution had to be developed. The algorithm was also validated by means of a program that implements it, analysing both synthetic and real data
Intraoperative computerized keratometry in corneal transplantation
The parameter that most affects visual quality and the time to functional recovery in patients undergoing corneal transplantation is still postoperative astigmatism. Hence the importance of being able to reliably predict the changes in corneal curvature induced by the various surgical manoeuvres, and of verifying whether intraoperative measurements correspond to the final outcome. The difficulty of obtaining reliable intraoperative keratometric measurements without inconveniencing the surgeon or slowing down surgery explains why the recent literature lacks quantitative studies of the changes in corneal architecture during the operation. Thanks to a collaboration between the C.N.R. and the Ophthalmology Section of the Department of Neuroscience of the University of Pisa, a prototype intraoperative Troutman keratometer was built and tested that monitors the variations in corneal astigmatism in real time during surgery, with respect to both ΔK and the axis, meeting criteria of precision and reliability together with requirements of simplicity and operational automation.
The Troutman keratometer consists of a ring-shaped metal support, which can be mechanically fixed around the optics of the operating microscope and carries 12 point light sources arranged at regular 30° intervals along the lower surface of the ring, on a circle 78 mm in diameter centred on the optical axis of the microscope.
A camera, which acquires images through the optics of the operating microscope coaxially with the keratometer, captures digitized images of the keratometer mires reflected by the corneal surface; the keratometric data are then acquired through a frame-grabber board connected to the processing system and elaborated.
After developing the software that drives the instrument and testing the reliability of its measurements on pre-calibrated spheres, the reliability of the keratometric results was studied in depth on 100 healthy volunteers, comparing them with the results obtained with the instruments (the Javal keratometer and the corneal topographer) that constitute the gold standard for measuring corneal parameters.
The mean difference between the astigmatism values obtained with the intraoperative keratometer and the Javal ophthalmometer was 0.35 diopters (range 0.00–0.95 diopters, standard deviation ±0.21 diopters); that between the intraoperative keratometer and the corneal topographer was 0.33 diopters (range 0.00–1.00 diopters, standard deviation ±0.27 diopters). Considering that an astigmatism of 0.25–0.50 diopters is clinically regarded as physiological, such measurement errors are entirely negligible.
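The agreement figures above are simple paired-difference statistics. As a sketch, they can be reproduced in Python on hypothetical paired readings (illustrative values, not the study data):

```python
import statistics

def compare_instruments(a, b):
    """Paired-difference statistics between two instruments (diopters):
    mean absolute difference, range and standard deviation."""
    diffs = [abs(x - y) for x, y in zip(a, b)]
    return {
        "mean": statistics.mean(diffs),
        "min": min(diffs),
        "max": max(diffs),
        "sd": statistics.stdev(diffs),
    }

# Hypothetical astigmatism readings (diopters) from two instruments
troutman = [1.00, 2.25, 0.75, 1.50, 3.00]
javal = [1.20, 2.10, 0.80, 1.75, 2.85]
stats = compare_instruments(troutman, javal)
```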
Having verified that an acceptable measurement standard had been reached, the clinical trial proceeded with the intraoperative analysis of subjects who had previously undergone corneal transplantation and needed an adjustment of the continuous corneal suture because of high astigmatism.
Fifty patients were examined at a mean of 1 month after the transplant (range 20–45 days). Mean pre-adjustment astigmatism was 7.01 diopters (range 5.00–9.25 diopters), possibly associated with spherical defects such as myopia and hyperopia; two days after the suture tension was adjusted, mean astigmatism had decreased to 1.125 diopters (range 0.50–2.00 diopters). This astigmatism showed a tendency to grow slightly in the following days, settling at 1.14 diopters 28 days after the procedure, probably owing to a natural redistribution of scar tension between the transplanted flap and the recipient bed, but it always remained well below the initial pre-adjustment measurements. It can therefore be stated that the intraoperative use of the Troutman keratometer yields keratometric measurements that are predictive of the postoperative outcome at 28 days, providing valid support to the surgical manoeuvres aimed at reducing corneal astigmatism in patients undergoing corneal transplantation
Tracing two decades of carbon emissions using a network approach
Carbon emissions are currently attributed to producers although a
consumption-aware accounting is advocated. After constructing the Carbon Trade
Network, we trace the flow of emissions over the past two decades. Our analysis
reveals an unexpected positive feedback: although individual
exchanges have become less carbon-intensive, the increase in trading activity
has ultimately raised the amount of emissions directed from `net exporters'
towards `net importers'. Adopting a consumption-aware accounting would
redistribute responsibility between the two groups, possibly reducing
disparities.
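The bookkeeping behind terms like `net exporters' and `net importers' can be sketched as a weighted directed graph whose edges carry embodied emissions. The following toy Python example uses hypothetical flow values, not the paper's data or method:

```python
# Hypothetical embodied-carbon flows (MtCO2): edge (u, v, w) means w MtCO2
# embodied in goods exported from country u to country v. Toy values only.
flows = [
    ("CHN", "USA", 500.0),
    ("CHN", "DEU", 200.0),
    ("DEU", "USA", 80.0),
    ("USA", "CHN", 120.0),
]

def net_positions(edges):
    """Net embodied-carbon balance per node (exports minus imports).
    Positive => `net exporter' of emissions, negative => `net importer'."""
    balance = {}
    for u, v, w in edges:
        balance[u] = balance.get(u, 0.0) + w  # emissions leaving u
        balance[v] = balance.get(v, 0.0) - w  # emissions arriving at v
    return balance

positions = net_positions(flows)
```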
Current status of gravitational-wave observations
The first generation of gravitational wave interferometric detectors has
taken data at, or close to, their design sensitivity. This data has been
searched for a broad range of gravitational wave signatures. An overview of
gravitational wave search methods and results is presented. Searches for
gravitational waves from unmodelled burst sources, compact binary coalescences,
continuous wave sources and stochastic backgrounds are discussed.
Comment: 21 pages, LaTeX, uses svjour3.cls, 1 figure, for GRG special issue on the Einstein Telescope
J-SPACE: a Julia package for the simulation of spatial models of cancer evolution and of sequencing experiments
Background: The combined effects of biological variability and measurement-related errors on cancer sequencing data remain largely unexplored. However, the spatio-temporal simulation of multi-cellular systems provides a powerful instrument to address this issue. In particular, efficient algorithmic frameworks are needed to overcome the harsh trade-off between scalability and expressivity, so as to allow one to simulate both realistic cancer evolution scenarios and the related sequencing experiments, which can then be used to benchmark downstream bioinformatics methods.
Results: We introduce a Julia package for SPAtial Cancer Evolution (J-SPACE), which allows one to model and simulate a broad set of experimental scenarios, phenomenological rules and sequencing settings. Specifically, J-SPACE simulates the spatial dynamics of cells as a continuous-time multi-type birth-death stochastic process on an arbitrary graph, employing different rules of interaction and an optimised Gillespie algorithm. The evolutionary dynamics of genomic alterations (single-nucleotide variants and indels) is simulated either under the Infinite Sites Assumption or under several different substitution models, including one based on mutational signatures. After mimicking the spatial sampling of tumour cells, J-SPACE returns the related phylogenetic model and allows one to generate synthetic reads from several Next-Generation Sequencing (NGS) platforms, via the ART read simulator. The results are finally returned in standard FASTA, FASTQ, SAM, ALN and Newick file formats.
Conclusion: J-SPACE is designed to efficiently simulate the heterogeneous behaviour of a large number of cancer cells and produces a rich set of outputs. Our framework is useful to investigate the emergent spatial dynamics of cancer subpopulations, as well as to assess the impact of incomplete sampling and of experiment-specific errors. Importantly, the output of J-SPACE is designed to allow the performance assessment of downstream bioinformatics pipelines processing NGS data. J-SPACE is freely available at: https://github.com/BIMIB-DISCo/J-Space.jl
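J-SPACE couples a multi-type birth-death process on arbitrary graphs with an optimised Gillespie algorithm. As an illustration of the underlying technique only, here is a minimal well-mixed (non-spatial) Gillespie birth-death sketch in Python; the function name and parameters are illustrative and are not J-SPACE's API:

```python
import random

def gillespie_birth_death(n0, b, d, t_max, seed=1):
    """Minimal well-mixed Gillespie simulation of a birth-death process:
    each of n cells divides at rate b and dies at rate d, so the total
    event rate is n*(b+d) and waiting times are exponential."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    history = [(t, n)]
    while n > 0:
        t += rng.expovariate(n * (b + d))  # time to next event
        if t > t_max:
            break
        if rng.random() < b / (b + d):     # birth with probability b/(b+d)
            n += 1
        else:                              # otherwise a death
            n -= 1
        history.append((t, n))
    return history

traj = gillespie_birth_death(n0=10, b=1.0, d=0.5, t_max=5.0)
```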
A chi-squared time-frequency discriminator for gravitational wave detection
Searches for known waveforms in gravitational wave detector data are often
done using matched filtering. When used on real instrumental data, matched
filtering often does not perform as well as might be expected, because
non-stationary and non-Gaussian detector noise produces large spurious filter
outputs (events). This paper describes a chi-squared time-frequency test which
is one way to discriminate such spurious events from the events that would be
produced by genuine signals. The method works well only for broad-band signals.
The case where the filter template does not exactly match the signal waveform
is also considered, and upper bounds are found for the expected value of
chi-squared.
Comment: 18 pages, five figures, RevTeX
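The band-splitting idea can be sketched as follows: divide the template into p frequency bands of equal expected signal power; a genuine broad-band signal deposits its SNR evenly across the bands, whereas a noise glitch concentrates it in a few. This schematic Python illustration uses a simplified normalisation, not the paper's exact statistic:

```python
def chisq_discriminator(band_snrs):
    """Schematic time-frequency chi-squared: given per-band matched-filter
    outputs, measure how far the SNR distribution deviates from the even
    split z/p expected for a true signal."""
    p = len(band_snrs)
    z = sum(band_snrs)
    return p * sum((zi - z / p) ** 2 for zi in band_snrs)

# An evenly spread signal and a single-band glitch with the same total SNR:
signal = [2.0, 2.0, 2.0, 2.0]
glitch = [8.0, 0.0, 0.0, 0.0]
```

The glitch scores far higher than the signal, so thresholding on this statistic vetoes spurious matched-filter events.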
Eco-Friendly Engineered Nanomaterials Coupled with Filtering Fine-Mesh Net as a Promising Tool to Remediate Contaminated Freshwater Sludges: An Ecotoxicity Investigation
The use of eco-friendly engineered nanomaterials represents a recent solution for an effective and safe treatment of contaminated dredging sludge. In this study, an eco-designed engineered material based on cross-linked nanocellulose (CNS) was applied for the first time to decontaminate a real matrix from heavy metals (namely Zn, Ni, Cu and Fe) and other undesired elements (mainly Ba and As) in a lab-scale study, with the aim of designing a safe solution for the remediation of contaminated matrices. Contaminated freshwater sludge was treated with CNS coupled with a filtering fine-mesh net, and the obtained waters were tested for acute and sublethal toxicity. In order to check the safety of the proposed treatment system, toxicity tests were conducted by exposing the bacterium Aliivibrio fischeri and the crustacean Heterocypris incongruens, while subtoxicity biomarkers such as lysosomal membrane stability, genetic and chromosomal damage were assessed in the freshwater bivalve Dreissena polymorpha. Dredging sludge was found to be genotoxic, and such genotoxicity was mitigated by the combined use of CNS and a filtering fine-mesh net. Chemical analyses confirmed the results by highlighting the abatement of target contaminants, indicating that the present model is a promising tool in freshwater sludge nanoremediation
Noise parametric identification and whitening for LIGO 40-meter interferometer data
We report the analysis we performed on data taken by the Caltech 40-meter
prototype interferometer to identify the noise power spectral density and to
whiten the noise sequence. We concentrate our study on data taken in November
1994; in particular, we analyzed two frames of data: the 18nov94.2.frame and
the 19nov94.2.frame.
We show that it is possible to whiten these data, to a good degree of
whiteness, using a high-order whitening filter. Moreover, we can choose to
whiten only a restricted band of frequencies around the region we are
interested in, obtaining a higher level of whiteness.
Comment: 11 pages, 15 figures, accepted for publication by Physical Review
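As a minimal illustration of parametric identification followed by whitening (a toy AR(1) example, nothing like the high-order filters needed for real interferometer noise), one can estimate the autoregressive coefficient from the sample autocorrelation and pass the data through the corresponding prediction-error filter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic coloured noise: an AR(1) process x[n] = 0.9*x[n-1] + w[n]
w = rng.standard_normal(100_000)
x = np.empty_like(w)
x[0] = w[0]
for n in range(1, len(w)):
    x[n] = 0.9 * x[n - 1] + w[n]

# Parametric identification: Yule-Walker estimate of the AR(1) coefficient
r0 = np.dot(x, x) / len(x)
r1 = np.dot(x[:-1], x[1:]) / len(x)
a1 = r1 / r0  # estimated coefficient, close to the true value 0.9

# Whitening: pass x through the prediction-error filter [1, -a1]
y = x[1:] - a1 * x[:-1]
rho = np.dot(y[:-1], y[1:]) / np.dot(y, y)  # lag-1 autocorrelation, near 0
```

The residual y is nearly white: its lag-1 autocorrelation collapses from about 0.9 to roughly zero.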
Cellular Responses Induced by Zinc in Zebra Mussel Haemocytes. Loss of DNA Integrity as a Cellular Mechanism to Evaluate the Suitability of Nanocellulose-Based Materials in Nanoremediation
Zinc environmental levels are increasing due to human activities, posing a
threat to ecosystems and human health. Therefore, new tools able to remediate
Zn contamination in freshwater are highly needed. Specimens of Dreissena
polymorpha (zebra mussel) were exposed for 48 h and 7 days to a wide range of
ZnCl2 nominal concentrations (1, 10, 50 and 100 mg/L), including
environmentally relevant ones. Cellulose-based nanosponges (CNS) were also
tested to assess their safety and suitability for Zn removal from freshwater.
Zebra mussels were exposed to 50 mg/L ZnCl2 alone or incubated with 1.25 g/L
of CNS (2 h) and then removed by filtration. The Zn decontamination effect
induced by CNS was verified by the acute toxicity bioassay Microtox®. DNA
primary damage was investigated by the Comet assay; micronuclei frequency and
nuclear morphological alterations were assessed by the Cytome assay in
mussels' haemocytes. The results confirmed the genotoxic effect of ZnCl2 in
zebra mussel haemocytes at 48 h and 7 days of exposure. Zinc concentrations
were measured in CNS, suggesting that cellulose-based nanosponges were able to
remove Zn(II) by reducing its levels in the exposure waters and in the soft
tissues of D. polymorpha, in agreement with the observed restoration of the
genetic damage exerted by zinc exposure alone
Next-generation ultra-compact calorimeters based on oriented crystals
Calorimeters based on oriented crystals provide unparalleled compactness and resolution in measuring the energy of electromagnetic particles. Recent experiments performed at CERN and DESY beamlines by the AXIAL/ELIOT experiments demonstrated a significant reduction of the radiation length inside tungsten and PbWO4 (the latter being the scintillator used for the CMS ECAL), observed when the incident particle trajectory is aligned with a lattice axis to within ∼1°. This remarkable effect, observed over a wide energy range from a few GeV to 1 TeV and above, paves the way for the development of innovative calorimeters based on oriented crystals, featuring a design significantly more compact than currently achievable while rivaling the current state of the art in energy resolution in the range of interest for present and future forward detectors (such as the KLEVER Small Angle Calorimeter at the CERN SPS) and source-pointing space-borne γ-ray telescopes