
    3D Modelling from Real Data

    The genesis of a 3D model can follow two fundamentally different paths. The first is CAD-generated modelling, where the shape is defined by a user's drawing actions, either operating with mathematical “bricks” such as B-Splines, NURBS or subdivision surfaces (mathematical CAD modelling), or directly drawing small planar polygonal facets in space to approximate complex free-form shapes (polygonal CAD modelling). This approach can be used both for ideal elements (a project, a fantasy shape in the mind of a designer, a 3D cartoon, etc.) and for real objects. In the latter case the object first has to be surveyed in order to generate a drawing coherent with the real artifact. If the surveying process is more than a rough acquisition of simple distances followed by a substantial amount of manual drawing, a scene can be modelled in 3D by capturing many points of its geometrical features with a digital instrument and connecting them with polygons; the result is similar to a polygonal CAD model, with the difference that the generated shape is in this case an accurate 3D acquisition of a real object (reality-based polygonal modelling). Considering only devices operating on the ground, 3D capturing techniques for the generation of reality-based 3D models span passive sensors and image data (Remondino and El-Hakim, 2006), optical active sensors and range data (Blais, 2004; Shan & Toth, 2008; Vosselman and Maas, 2010), classical surveying (e.g. total stations or Global Navigation Satellite Systems - GNSS), 2D maps (Yin et al., 2009), or an integration of the aforementioned methods (Stumpfel et al., 2003; Guidi et al., 2003; Beraldin, 2004; Stamos et al., 2008; Guidi et al., 2009a; Remondino et al., 2009; Callieri et al., 2011). The choice depends on the required resolution and accuracy, the object dimensions, location constraints, the instrument's portability and usability, the surface characteristics, the working team's experience, the project's budget, the final goal, etc. Despite the potential of the image-based approach and its recent developments in automated and dense image matching, for non-experts the easy usability and reliability of optical active sensors in acquiring 3D data is generally a good motivation to decline image-based approaches. Moreover, the great advantage of active sensors is that they immediately deliver dense and detailed 3D point clouds whose coordinates are metrically defined, whereas image data require some processing and a mathematical formulation to transform the two-dimensional image measurements into metric three-dimensional coordinates. Image-based modelling techniques (mainly photogrammetry and computer vision) are generally preferred for monuments or architectures with regular geometric shapes, low-budget projects, experienced working teams, or time and location constraints on data acquisition and processing. This chapter is intended as an updated review of reality-based 3D modelling in terrestrial applications, covering the different categories of 3D sensing devices and the related data processing pipelines.
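
    The "mathematical formulation" mentioned above is, in the photogrammetric case, the pair of collinearity equations. As a textbook-form reminder (not reproduced from the chapter itself), with principal distance c, principal point (x_0, y_0), projection centre (X_c, Y_c, Z_c) and rotation matrix R = (r_ij) from object space to camera space, an image point (x, y) and its object point (X, Y, Z) are related by:

    \begin{aligned}
    x - x_0 &= -c\,\frac{r_{11}(X - X_c) + r_{12}(Y - Y_c) + r_{13}(Z - Z_c)}{r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)},\\[4pt]
    y - y_0 &= -c\,\frac{r_{21}(X - X_c) + r_{22}(Y - Y_c) + r_{23}(Z - Z_c)}{r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)}.
    \end{aligned}

    Inverting these equations for points observed in two or more oriented images is what turns 2D image measurements into metric 3D coordinates.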

    Image pre-processing for optimizing automated photogrammetry performances

    The purpose of this paper is to analyze how optical pre-processing with polarizing filters and digital pre-processing with HDR imaging may improve the automated 3D modeling pipeline based on SfM and Image Matching, with special emphasis on optically non-cooperative surfaces of shiny or dark materials. Because homologous points are detected automatically, the presence of highlights due to shiny materials, or of nearly uniform dark patches produced by low-reflectance materials, may produce erroneous matches involving wrong 3D point estimates, and consequently holes and topological errors in the mesh generated from the associated dense 3D cloud. This is due to the limited dynamic range of the 8-bit digital images that are matched with each other to generate 3D data. The same 256 levels can be employed more usefully if the actual dynamic range is compressed, avoiding luminance clipping in the darker and lighter image areas. Such an approach is considered here using both optical filtering and HDR processing with tone mapping, with experimental evaluation on different Cultural Heritage objects characterized by non-cooperative optical behavior. Three test images of each object were captured from different positions, changing the shooting conditions (filter/no-filter) and the image processing (no processing/HDR processing), so as to have the same three camera orientations with different optical and digital pre-processing, and applying the same automated process to each photo set.
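
    As an illustration of the digital pre-processing path described above, a minimal sketch of HDR merging and tone mapping with OpenCV follows. The file names, exposure times and the choice of the Debevec merge and Drago tone mapper are assumptions for the sketch, not the paper's exact pipeline.

    import cv2
    import numpy as np

    # Bracketed exposures of the same object (hypothetical file names and times).
    times = np.array([1/60, 1/15, 1/4], dtype=np.float32)
    images = [cv2.imread(name) for name in ("under.jpg", "mid.jpg", "over.jpg")]

    # Merge the bracket into a 32-bit radiance map (Debevec method).
    hdr = cv2.createMergeDebevec().process(images, times=times)

    # Tone-map back to 8 bits, compressing the dynamic range so that
    # shadows and highlights keep texture usable for feature matching.
    ldr = cv2.createTonemapDrago(gamma=1.0).process(hdr)
    cv2.imwrite("tonemapped.jpg", np.clip(ldr * 255, 0, 255).astype("uint8"))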

    Image-based 3D capture of cultural heritage artifacts: an experimental study about 3D data quality

    The paper presents an analysis of the quality of 3D data generated from small and medium-sized objects by well-known automatic photogrammetry packages based on Structure from Motion (SfM) and Image Matching (IM). The work aims at comparing different shooting configurations and levels of image redundancy, using as high-quality reference the 3D data acquired by triangulation-based laser scanners characterized by low measurement uncertainty. Two sets of tests are presented: i) a laboratory 3D measurement made with both the active and the passive approach, where the image-based 3D acquisition uses different camera orientations leading to different image redundancy; ii) a 3D digitization in the field with an industrial laser scanner and two sets of images taken with different overlap levels. The results in the field confirm the relationship between measurement uncertainty and image overlap that emerged in the lab tests.
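
    The abstract does not state how the photogrammetric clouds were compared with the laser-scanner reference; a common approach, sketched below under that assumption, is to take nearest-neighbour distances against the reference cloud and summarize them statistically. File names are hypothetical.

    import numpy as np
    from scipy.spatial import cKDTree

    def cloud_to_reference(cloud, reference):
        """Nearest-neighbour distance from every point of the photogrammetric
        cloud to the laser-scanner reference (both given as Nx3 arrays)."""
        d, _ = cKDTree(reference).query(cloud, k=1)
        return d

    d = cloud_to_reference(np.loadtxt("sfm_cloud.xyz"), np.loadtxt("scan_cloud.xyz"))
    print(f"mean {d.mean():.4f}  std {d.std():.4f}  rms {np.sqrt((d**2).mean()):.4f}")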

    Predictive methods for Adaptive Radiation Therapy: effects of organ motion, deformable registration algorithms and dose accumulation

    The research was funded by the Ministry of Health - Young Researchers Call 2010 MoH (GR-2010-2318757), "Dose warping methods for IGRT and Adaptive RT: dose accumulation based on organ motion and anatomical variations of the patients during radiation therapy treatments". The research developed predictive methods for Adaptive Radiation Therapy. The patient is subject to macro- and micro-scale anatomical and functional variations, both intra- and inter-fraction, during the preparation of the treatment plan and during the radiation therapy sessions; these are affected by factors such as organ motion and morphological variation, which may influence the therapeutic program. Advanced systems for dose calculation and accumulation allow the recording of the delivered doses, taking local and global variations into account. The endless technological and human resources needed to check, moment by moment, the dose delivered to an individual patient would be unthinkable in clinical practice. Predictive models based on neural networks or epidemiological methods contribute to the monitoring of the patient by physical and statistical means. Organ motion, anatomical variations, deformable image registration and dose accumulation were evaluated as the principal elements and effects in the use of predictive methods. These factors require validation and development, using Bayesian analysis and experimental measurements that confirm the quality and accuracy of the algorithms. The development and use of dose accumulation methods and the verification of delivered doses, considering movement and deformation, led to the development of robotic prototypes for in-vivo dosimetry and motion evaluation, built with LEGO®. The development of neural networks, epidemiological methods and Support Vector Machines made it possible, by means of a data-mining project, to extend the methodologies to nationwide centers. The research shows the critical points of predictive methods, demonstrating the efficiency of neural networks and epidemiological models in advanced treatments of head and neck, prostate, pancreas and lung cancer.
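
    As a loose sketch of the class of predictive models named above (a small neural network, here via scikit-learn's MLPClassifier) trained and scored on placeholder data; the feature names, labels and settings are illustrative assumptions, not the study's actual pipeline.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))        # stand-ins for motion/dose features
    y = rng.integers(0, 2, size=200)     # stand-in outcome label

    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000))
    scores = cross_val_score(model, X, y, cv=10)
    print(f"10-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")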

    Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionalities in Information Technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detecting the presence or absence of Heart Failure using nothing more than the electrocardiographic signal, in particular through the analysis of Heart Rate Variability. The results obtained are comparable with those found in the literature, in terms of accuracy and predictive power. Advantages and drawbacks of cloud versus static approaches are discussed in the last sections.
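
    A minimal sketch of the kind of time-domain Heart Rate Variability features such an analysis typically relies on; the specific feature set used in the Watson Analytics use case is not detailed in the abstract, so the choice of SDNN, RMSSD and pNN50 here is an assumption.

    import numpy as np

    def hrv_time_domain(rr_ms):
        """Standard time-domain HRV features from RR intervals in milliseconds."""
        rr = np.asarray(rr_ms, dtype=float)
        diff = np.diff(rr)
        return {
            "mean_rr": rr.mean(),
            "sdnn": rr.std(ddof=1),                     # overall variability
            "rmssd": np.sqrt((diff ** 2).mean()),       # short-term variability
            "pnn50": (np.abs(diff) > 50).mean() * 100,  # % successive diffs > 50 ms
        }

    print(hrv_time_domain([812, 798, 830, 845, 790, 805]))  # hypothetical RR series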

    3D Modeling of Boccaccio's Hometown through a Multisensor Survey


    Condition-Based Maintenance of HVAC on a High-Speed Train for Fault Detection

    Reliability-centered maintenance (RCM) is a well-established method for preventive maintenance planning. This paper focuses on the optimization of a maintenance plan for an HVAC (heating, ventilation and air conditioning) system located on high-speed trains. The first steps of the RCM procedure help identify the most critical items of the system in terms of safety and availability by means of a failure modes and effects analysis. Then, RCM proposes the optimal maintenance tasks for each item making up the system. However, the decision-making diagram that leads to the maintenance choice is extremely generic, with a consequently high subjectivity in the task selection. This paper proposes a new fuzzy-based decision-making diagram to minimize the subjectivity of the task choice and preserve the cost-efficiency of the procedure. It uses a case from the railway industry to illustrate the suggested approach, but the procedure could easily be applied to different industrial and technological fields. The results of the proposed fuzzy approach highlight the importance of an accurate diagnostics strategy (with an overall 86% of the tasks being diagnostic-based maintenance) and of condition monitoring (covering 54% of the tasks) to optimize the maintenance plan and maximize the system availability. The findings show that the framework strongly mitigates the issues related to the classical RCM procedure, notably the high subjectivity of experts, and it lays the groundwork for a general fuzzy-based reliability-centered maintenance method. This research received no external funding.
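
    A minimal sketch of a fuzzy decision step of the kind the paper proposes: simple membership functions score a failure mode's detectability and criticality, and a small rule base suggests a maintenance strategy. The membership shapes, scales and rules here are illustrative assumptions, not the paper's actual diagram.

    def ramp_up(x, a, b):
        """Membership rising linearly from 0 at a to 1 at b."""
        return min(max((x - a) / (b - a), 0.0), 1.0)

    def recommend(detectability, criticality):
        """Pick the maintenance task whose fuzzy rule fires most strongly.
        Inputs are on a 0-10 scale."""
        high_d = ramp_up(detectability, 3, 7)
        high_c = ramp_up(criticality, 3, 7)
        low_d, low_c = 1.0 - high_d, 1.0 - high_c
        rules = {
            "condition monitoring": min(high_d, high_c),  # detectable and critical
            "scheduled overhaul":   min(low_d, high_c),   # critical, hard to detect
            "diagnostic check":     min(high_d, low_c),   # detectable, low risk
            "run to failure":       min(low_d, low_c),    # neither
        }
        return max(rules, key=rules.get)

    print(recommend(detectability=8, criticality=9))  # -> condition monitoring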

    A multi-layer monitoring system for clinical management of Congestive Heart Failure

    BACKGROUND: Congestive Heart Failure (CHF) is a serious cardiac condition that brings high risks of urgent hospitalization and death. Remote monitoring systems are well suited to managing patients suffering from CHF and can reduce deaths and re-hospitalizations, as shown by the literature, including multiple systematic reviews. METHODS: The monitoring system proposed in this paper aims at helping CHF stakeholders make appropriate decisions in managing the disease and preventing cardiac events, such as decompensation, which can lead to hospitalization or death. Monitoring activities are stratified into three layers: scheduled visits to a hospital following up on a cardiac event, home monitoring visits by nurses, and the patient's self-monitoring performed at home using specialized equipment. Appropriate hardware and desktop and mobile software applications were developed to enable a patient's monitoring by all stakeholders. For the first two layers, we designed and implemented a Decision Support System (DSS) using machine learning (the Random Forest algorithm) to predict the number of decompensations per year and to assess heart failure severity based on a variety of clinical data. For the third layer, custom-designed sensors (the Blue Scale system) for electrocardiogram (EKG), pulse transit times, bio-impedance and weight allowed frequent collection of CHF-related data in the comfort of the patient's home. We also performed a short-term Heart Rate Variability (HRV) analysis on electrocardiograms self-acquired by 15 healthy volunteers and compared the obtained parameters with those of 15 CHF patients from PhysioNet's PhysioBank archives. RESULTS: We report the numerical performance of the DSS, calculated as multiclass accuracy, sensitivity and specificity in a 10-fold cross-validation. The average accuracies obtained are 71.9% in predicting the number of decompensations and 81.3% in severity assessment. The most serious class in severity assessment is detected with good sensitivity and specificity (0.87 / 0.95), while, in predicting decompensation, high specificity combined with good sensitivity prevents false alarms. The HRV parameters extracted from the self-measured EKG using the Blue Scale system of sensors are comparable with those reported in the literature for healthy people. CONCLUSIONS: The performance of the DSS trained with new patients confirmed the results of previous work and emphasizes the strong correlation of some CHF markers, such as brain natriuretic peptide (BNP) and ejection fraction (EF), with the outputs of interest. Comparing HRV parameters from healthy volunteers with those obtained from the PhysioBank archives, we confirm the literature that considers HRV a promising method for distinguishing healthy subjects from CHF patients.
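
    A minimal sketch of the DSS core named above, a Random Forest evaluated in 10-fold cross-validation with scikit-learn; the clinical inputs (e.g. BNP, EF) are replaced by synthetic placeholders, and the hyperparameters are assumptions rather than the paper's settings.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(300, 6))        # stand-ins for clinical features
    y = rng.integers(0, 3, size=300)     # three severity classes

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=10)  # multiclass accuracy per fold
    print(f"mean 10-fold accuracy: {scores.mean():.3f}")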

    Editorial

    No abstract for the editorial

    FMECA Assessment for Railway Safety-Critical Systems Investigating a New Risk Threshold Method

    This paper develops a Failure Mode, Effects and Criticality Analysis (FMECA) for a heating, ventilation and air conditioning (HVAC) system in a railway application. The HVAC is a safety-critical system which must ensure emergency ventilation in case of fire and in case of loss of the primary ventilation functions. A study of the HVAC's critical areas is mandatory to optimize its reliability and availability, and consequently to guarantee a low operation and maintenance cost. The first part of the paper describes the FMECA, which is performed and reported to highlight the main criticalities of the HVAC system under analysis. Secondly, the paper deals with the problem of evaluating a risk threshold value that can distinguish negligible from critical failure modes. The literature barely considers the problem of an objective risk threshold estimation. Therefore, a new analytical method based on finite differences is introduced to find a univocal risk threshold value. The method is then tested on two Risk Priority Number datasets related to the same HVAC. The threshold obtained in both cases is a good trade-off between the risk mitigation and the cost investment for the corrective actions required to mitigate the risk level. Finally, the threshold obtained with the proposed method is compared with the methods available in the literature. The comparison shows that the proposed finite difference method is a well-structured technique with a low computational cost. Furthermore, the proposed approach provides results in line with the literature, but it completely removes the problem of subjectivity.
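
    One plausible reading of an "analytical method based on finite differences" for a Risk Priority Number (RPN) list, sketched below as an assumption rather than the paper's exact formulation: sort the RPNs, take first-order differences, and place the threshold inside the largest gap between negligible and critical values.

    import numpy as np

    def rpn_threshold(rpns):
        """Threshold at the largest jump in the sorted RPN sequence."""
        s = np.sort(np.asarray(rpns))
        jumps = np.diff(s)                 # first-order finite differences
        k = int(np.argmax(jumps))          # position of the largest gap
        return (s[k] + s[k + 1]) / 2       # threshold inside the gap

    rpn_values = [24, 30, 36, 40, 48, 120, 144, 168]  # hypothetical FMECA RPNs
    print(rpn_threshold(rpn_values))       # -> 84.0, splitting the two clusters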