1,872 research outputs found

    Airborne Visible/Infrared Imaging Spectrometer AVIS: Design, characterization and calibration

    The Airborne Visible/Infrared Imaging Spectrometer AVIS is a hyperspectral imager designed for environmental monitoring purposes. The sensor, which was constructed entirely from commercially available components, has been successfully deployed during several experiments between 1999 and 2007. We describe the instrument design and present the results of laboratory characterization and calibration of the system's second generation, AVIS-2, which is currently being operated. The processing of the data is described and examples of remote sensing reflectance data are presented.

    Development of a Surgical Assistance System for Guiding Transcatheter Aortic Valve Implantation

    The development of image-guided interventional systems has grown rapidly in recent years. These systems have become an essential part of modern minimally invasive surgical procedures, especially in cardiac surgery. Transcatheter aortic valve implantation (TAVI) is a recently developed surgical technique to treat severe aortic valve stenosis in elderly and high-risk patients. The placement of the stented aortic valve prosthesis is crucial and is typically performed under live 2D fluoroscopy guidance. To assist the placement of the prosthesis during the surgical procedure, a new fluoroscopy-based TAVI assistance system has been developed. The assistance system integrates a 3D geometrical aortic mesh model and anatomical valve landmarks with live 2D fluoroscopic images. The 3D aortic mesh model and landmarks are reconstructed from an interventional angiographic and fluoroscopic C-arm CT system, and a target area of valve implantation is automatically estimated using these aortic mesh models. Based on a template-based tracking approach, the overlay of the visualized 3D aortic mesh model, landmarks and target area of implantation onto the fluoroscopic images is updated by approximating the aortic root motion from the motion of a pigtail catheter without contrast agent. A rigid intensity-based registration method is also used to continuously track the aortic root motion in the presence of contrast agent. Moreover, the aortic valve prosthesis is tracked in the fluoroscopic images to guide the surgeon in placing the prosthesis within the estimated target area of implantation. An interactive graphical user interface for the surgeon has been developed to initialize the system algorithms, control the visualization view of the guidance results, and manually correct overlay errors if needed. Retrospective experiments were carried out on several patient datasets from the clinical TAVI routine in a hybrid operating room. The maximum displacement errors were small for both the dynamic overlay of the aortic mesh models and the tracking of the prosthesis, and lay within clinically accepted ranges. High success rates of the developed assistance system were obtained for all tested patient datasets. The results show that the developed surgical assistance system provides a helpful tool for the surgeon by automatically defining the desired placement position of the prosthesis during the TAVI procedure.

    The development of image-guided interventional systems has grown rapidly in recent years. These new systems are increasingly becoming an essential part of the technical equipment of modern minimally invasive surgical procedures; this applies in particular to cardiac surgery. Transcatheter aortic valve implantation (TAVI) is a newly developed surgical technique for treating severe aortic valve stenosis in elderly and high-risk patients. The placement of the aortic valve prosthesis is crucial and is usually performed under live 2D fluoroscopic imaging. To support the placement of the prosthesis during the surgical procedure, a new fluoroscopy-based TAVI assistance system was developed in this work. The developed assistance system overlays a 3D geometric aortic mesh model and anatomical landmarks onto live 2D fluoroscopic images. The 3D aortic mesh model and the landmarks are reconstructed from interventional angiography and fluoroscopy acquired with a C-arm CT system. Using these aortic mesh models, the target area of the valve implantation is estimated automatically. With the help of a tracking approach based on template matching, the visualized 3D aortic mesh model, the computed landmarks and the target area of implantation are correctly overlaid onto the fluoroscopic images. The aortic root motion is compensated by tracking the motion of a pigtail catheter in image sequences without contrast agent. A rigid intensity-based registration method was used to continuously detect the aortic root motion in image sequences with contrast agent. The aortic valve prosthesis is displayed in the fluoroscopic images and serves as a guide for the surgeon for the correct placement of the real prosthesis. An interactive user interface for the surgeon was developed to initialize the system algorithms, control the visualization and manually correct possible overlay errors. Retrospective experiments were carried out on several patient datasets from the clinical TAVI routine in a hybrid operating room. High success rates of the developed assistance system were obtained for all tested patient datasets. The results show that the developed surgical assistance system provides a helpful tool for the surgeon for positioning the prosthesis during the TAVI procedure.
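
A minimal sketch of the template-based tracking idea mentioned above, using normalized cross-correlation as implemented in OpenCV. This is an illustrative stand-in, not the assistance system's actual tracking code; the frame and patch handling is assumed for the example.

```python
# Illustrative sketch only: track a patch (e.g. cut around the pigtail catheter)
# from frame to frame with normalized cross-correlation, then reuse the measured
# displacement to shift an overlay. Not the assistance system's implementation.
import cv2

def track_displacement(template, prev_pos, next_frame):
    """template: small grayscale patch cut around the tracked structure.
    prev_pos: (x, y) top-left corner of the patch in the previous frame.
    next_frame: next grayscale fluoroscopic frame as a NumPy array."""
    scores = cv2.matchTemplate(next_frame, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_pos = cv2.minMaxLoc(scores)   # best_pos = (x, y)
    dx, dy = best_pos[0] - prev_pos[0], best_pos[1] - prev_pos[1]
    return (dx, dy), best_score

# The overlaid aortic mesh model, landmarks and target area would then be
# translated by (dx, dy) to follow the approximated aortic root motion.
```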

    The quantification of pressure and saturation changes in clastic reservoirs using 4D seismic data

    The problem of quantifying pressure and saturation changes from 4D seismic data is an area of active research that faces many challenges concerning the non-uniqueness of seismic data inversion, non-repeatability noise in the data, the formulation of the inverse problem, and the use of appropriate constraints. The majority of inversion methods rely on empirical rock-physics model calibrations linking elastic properties to expected pressure and saturation changes. Model-driven techniques provide a theoretical framework for the practical interpretation of the 4D seismic response, but pressure and saturation separation based on this approach is inconsistent with the observed 4D seismic response and with insights from reservoir engineering. The outcome is a bias in the estimated pressure and saturation changes and, in some cases, leakage between the two. Others have addressed some of this bias by using the causality between induced production and the observed 4D seismic response to formulate a direct, fast and less compute-intensive inversion, characterised as data-driven techniques. Challenges still remain, however, regarding the accuracy of the causality link, as defined by the reservoir's sensitivity to production effects, and in defining appropriate constraints to tackle the non-uniqueness of the seismic inversion and the uncertainties in the 4D seismic data. The main contributions of this thesis are the enhancement of the data-driven inversion approach by using multiple monitor 4D seismic data to quantify the reservoir's sensitivity to pressure and saturation changes, together with the introduction of engineering-consistent constraints provided by multiple history-matched fluid-flow simulation models. A study using observed 4D seismic data (amplitudes and time shifts) acquired at different monitor times on four producing North Sea clastic fields demonstrates the reliability of the seismic-based method to decouple the reservoir's sensitivity specific to each field's geological characteristics. A natural extension is to combine multiple monitor 4D seismic data in an inversion scheme that solves for the reservoir's sensitivity to pressure and saturation changes, the pressure and saturation changes themselves, and the uncertainties in the inversion solution. At least two monitor 4D seismic datasets are required to solve for the reservoir's sensitivity, and offset stacks (near, mid and far) are required to decouple pressure, water and gas saturation changes. The generation and use of multiple geologically and production-constrained simulation models provides spatial constraints on the solution space, making the inversion scheme robust. Within the inversion, the fit to spatial historical data, i.e. 4D seismic data acquired at different monitor times, is analysed. The added benefit of using multiple monitor data is that it allows for a soft "close-the-loop" between the engineering and the 4D seismic domains. One step in the inversion scheme is repeated for as many history-matched simulation models as are generated. Each model provides pressure and saturation input to the inversion to obtain maps of the reservoir's sensitivity. By computing the norm of residuals for each inversion based on each model input, the best model (having the lowest norm of residuals) can be identified, in addition to the use of a history-matching objective.
The inversion scheme thus marks the first step towards a seismic-assisted history-matching procedure, suggesting that pressure and saturation inversion is best done within the history-matching process. In addition, the uncertainties in quantitative 4D seismic data interpretation are analysed by developing a seismic modelling method that links the shot timings of a real-field towed-streamer and a permanent reservoir monitoring (PRM) acquisition to the reservoir under production. It is found that pressure and saturation fluctuations that occur during the shooting of monitor acquisitions create a complicated spatio-temporal imprint on the pre-stack data and introduce errors if the 4D seismic data are analysed in the post-stack domain. Pressure and saturation changes as imaged across the offset stacks (near, mid and far offset) are not the same, adding to the problems of separating pressure and saturation changes using offset stacks of 4D seismic data. The approximate modelling reveals that the NRMS errors between offset stacks (up to 7.5%) caused by the intra-survey effects are likely at the limit of 4D seismic measurements using towed-streamer technology, but are potentially observable, particularly with PRM technology. Intra-survey effects should thus be considered during 4D survey planning as well as during data processing and analysis. It is recommended that the shot timestamps of the acquisition are used to sort the seismic data immediately after pre-stack migration and before any stacking. The seismic data should also be shot quickly in a consistent pattern to optimise time and fold coverage. It is common to relate the simulation model output to a specific time within the acquisition (start, middle or end of the survey), but this study reveals that it is best to take an average of the simulation model predictions output at fine time intervals over the entire length of the acquisition, as this provides a better temporal comparison to the acquired post-stack 4D seismic data.
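
As a concrete illustration of the kind of separation described above, the sketch below solves, at each map location, a small linearised system relating near/mid/far amplitude changes to pressure and saturation changes, and includes the standard NRMS repeatability metric. The sensitivity matrix, array shapes and function names are assumptions for the example, not the thesis's actual inversion scheme.

```python
# Illustrative sketch: least-squares separation of dP, dSw, dSg from near/mid/far
# 4D amplitude changes under an assumed linearised sensitivity matrix, plus the
# NRMS metric used to quantify repeatability. Not the thesis's inversion scheme.
import numpy as np

def separate_changes(dA, S):
    """dA: (3, n) amplitude changes for the near, mid and far stacks at n locations.
    S:  (3, 3) sensitivities of each stack to (dP, dSw, dSg).
    Returns a (3, n) array of estimated (dP, dSw, dSg) per location."""
    x, *_ = np.linalg.lstsq(S, dA, rcond=None)
    return x

def nrms(a, b):
    """NRMS difference between two maps or traces, in percent."""
    rms = lambda v: np.sqrt(np.mean(np.square(v)))
    return 200.0 * rms(a - b) / (rms(a) + rms(b))
```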

    Defect and thickness inspection system for cast thin films using machine vision and full-field transmission densitometry

    Rapid mass production of homogeneous thin film material is required in the paper, plastic, fabric, and thin film industries. Due to the high feed rates and small thicknesses, machine vision and other nondestructive evaluation techniques are used to ensure consistent, defect-free material by continuously assessing post-production quality. One of the fastest growing inspection areas is 0.5-500 micrometer thick thin films, which are used for semiconductor wafers, amorphous photovoltaics, optical films, plastics, and organic and inorganic membranes. As a demonstration application, a prototype roll-feed imaging system has been designed to inspect high-temperature polymer electrolyte membrane (PEM), used for fuel cells, after it is die cast onto a moving transparent substrate. The inspection system continuously detects thin film defects and classifies them with a neural network into categories of holes, bubbles, thinning, and gels, with a 1.2% false alarm rate, a 7.1% escape rate, and a classification accuracy of 96.1%. In slot die casting processes, defect types are indicative of an imbalance between the mass flow rate and the web speed; based on the classified defects, the inspection system therefore informs the operator of corrective adjustments to these manufacturing parameters. Thickness uniformity is also critical to membrane functionality, so a real-time, full-field transmission densitometer has been created to measure the bi-directional thickness profile of the semi-transparent PEM between 25 and 400 micrometers. The local thickness of the 75 mm x 100 mm imaged area is determined by converting the optical density of the sample to thickness with the Beer-Lambert law. The PEM extinction coefficient is determined to be 1.4 D/mm and the average thickness error is found to be 4.7%. Finally, the defect inspection and thickness profilometry systems are compiled into a specially designed graphical user interface for intuitive real-time operation and visualization.
    M.S. Committee Chair: Tequila Harris; Committee Member: Levent Degertekin; Committee Member: Wayne Dale
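
A minimal sketch of the Beer-Lambert conversion described above, turning a transmission image into a thickness map using the reported extinction coefficient of 1.4 D/mm. The image handling details (flat-fielding, choice of reference image) are assumptions for the example, not the thesis's densitometer implementation.

```python
# Illustrative sketch of the Beer-Lambert step of transmission densitometry:
# optical density of the film divided by the PEM extinction coefficient gives
# the local thickness. Reference handling is assumed for the example.
import numpy as np

EXTINCTION_D_PER_MM = 1.4  # reported PEM extinction coefficient (D/mm)

def thickness_map_um(sample_img, reference_img, extinction=EXTINCTION_D_PER_MM):
    """sample_img: transmitted intensity through the cast membrane (per pixel).
    reference_img: intensity through the bare transparent substrate.
    Returns the local thickness in micrometres."""
    sample = np.asarray(sample_img, dtype=float)
    reference = np.asarray(reference_img, dtype=float)
    optical_density = -np.log10(sample / reference)   # Beer-Lambert law
    return optical_density / extinction * 1000.0      # mm -> micrometres
```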

    Research and Development of the Passive Optoelectronic Rangefinder


    Optimization strategies for respiratory motion management in stereotactic body radiation therapy

    Various challenges arise during the treatment of lung tumors with stereotactic body radiation therapy (SBRT), a form of hypofractionated, high-precision conformal radiation therapy delivered to small targets. The dose is applied in only a few fractions, and respiratory organ and tumor motion is a source of uncertainty in addition to interfractional set-up errors. Respiratory organ and tumor motion is highly patient-specific and affects the whole radiotherapy treatment chain. In this thesis, motion management techniques for SBRT are evaluated and improved in a clinical setting. A clinical need for improvement was present at the LMU University Hospital for each issue addressed in this thesis. Initially, the use of respiratory-correlated computed tomography (4DCT), which is vital for SBRT treatment, was in its existing form seen as impractical and prone to uncertainties in the data reconstruction. The 4DCT reconstruction workflow was therefore improved to minimize these potential error sources. Secondly, treatment planning for tumors affected by respiratory motion was evaluated and subsequently improved. Finally, the treatment technique of respiratory gating was implemented at the clinic, which led to the need to evaluate the respiratory gating characteristics of the novel system configuration. First, the 4DCT reconstruction workflow used in clinical practice was investigated, as in the presence of respiratory motion the knowledge of the tumor position over time is essential in SBRT treatments. Using 4DCT, the full motion range of the individual tumor can be determined. However, certain 4DCT reconstruction methods can under- or overestimate tumor motion due to limitations in the data acquisition scheme and due to the incorrect sorting of certain X-ray computed tomography (CT) image slices into different respiratory phases. As the regular clinical workflow of cycle-based sorting (CBS) without maximum inspiration detection (and therefore without a clear starting point for the individual breathing cycles) appeared to be affected by these potential errors, both CBS with correct maximum detection and another sorting algorithm for the respiration states, so-called local amplitude-based sorting (LAS), were implemented to reduce image artifacts and improve 4DCT quality (a simplified sketch of such phase binning follows this abstract). The three phase-binning algorithms were investigated in a phantom study (using 10 different breathing waveforms) and in a patient study (with 10 different patients). The misrepresentation of the tumor volume was reduced by both implemented sorting algorithms compared with the previously used CBS approach (without correct maximum detection) in both the phantom and the patient study. The clinical recommendation was the use of CBS with improved maximum detection, as too many manual interventions would be needed for the LAS workflow. Secondly, a combination of the actual patient breathing trace during treatment, the log files generated by the linear accelerator (LINAC), and Monte Carlo (MC) four-dimensional (4D) dose calculations for each individual fraction was implemented as a 4D dose evaluation tool. This workflow was tested in a clinical environment for SBRT treatment planning on multiple CT datasets: a native free-breathing 3DCT, an average intensity projection (AIP) as well as a maximum intensity projection (MIP), both obtained from the patient's 4DCT, and density overrides (DOs) in a 3DCT.
This study was carried out for 5 SBRT patients with three-dimensional conformal radiation therapy (3D-CRT) and volumetric modulated arc therapy (VMAT) treatment plans. The dose was recalculated on each 4DCT breathing phase according to the patient's breathing waveform and accumulated to the gross tumor volume (GTV) at the end-of-exhale (EOE) breathing phase using deformable image registration. Even though the smallest differences between planned and recalculated dose were found for AIP and MIP treatment planning, the results indicate a strong dependency on individual tumor motion, due to the variability of breathing motion in general, and on tumor size. The combination of the patient's individual breathing trace during each SBRT fraction with 4D MC dose calculation based on the LINAC log file information leads to a good approximation of the actual dose delivery. Finally, in order to ensure precise and accurate treatment with respiratory gating techniques, the technical characteristics of the LINAC in combination with a breathing motion monitoring system as a surrogate for tumor motion have to be identified. The dose delivery accuracy and the latency of a surface imaging system in connection with a modern medical LINAC were investigated using a dynamic breathing motion phantom. The dosimetric evaluation was carried out using a static 2D diode array. The dose difference between gated and ungated radiation delivery was found to be below 1% (for clinically relevant gating levels of about 30%). The beam-on latency, or time delay, determined using radiographic films was found to be up to 851 ms ± 100 ms. With these known parameters, an adjustment of the pre-selected gating level or of the internal target volume (ITV) margins can be made. Given the highly patient-specific character of respiratory motion, lung SBRT faces many additional challenges besides the specific issues addressed in this thesis. However, the findings of this thesis have improved clinical workflows at the Department of Radiation Oncology of the LMU University Hospital. Looking ahead, a workflow that evaluates the actual 4D dose in combination with accurate 4DCT image acquisition and specialized treatment delivery (such as respiratory gating) has the potential to safely reduce treatment margins further and increase the sparing of organs-at-risk (OARs) in SBRT without compromising tumor dose targeting accuracy.
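
The sketch below illustrates the general idea of cycle-based phase binning of a respiratory signal, assuming the inspiration maxima have already been detected; CT slices would then be assigned to the phase bin of the signal at their acquisition time. It is illustrative only and not the clinical 4DCT reconstruction workflow; all function names and parameters are assumptions.

```python
# Illustrative sketch of cycle-based sorting (CBS): assign a breathing phase in
# [0, 1) to each time point relative to the surrounding inspiration maxima, then
# bin CT slice acquisition times into a fixed number of phase bins.
import numpy as np

def phase_of(t, peak_times):
    """Phase of time point t, measured from the preceding inspiration maximum
    to the next one (both must exist)."""
    peaks = np.asarray(peak_times)
    i = np.searchsorted(peaks, t, side="right") - 1
    if i < 0 or i >= len(peaks) - 1:
        raise ValueError("t lies outside the detected breathing cycles")
    return (t - peaks[i]) / (peaks[i + 1] - peaks[i])

def bin_slices(slice_times, peak_times, n_bins=10):
    """Assign each CT slice acquisition time to one of n_bins phase bins."""
    return [int(phase_of(t, peak_times) * n_bins) for t in slice_times]
```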

    Statistical Feed-Forward Control for Assembly in Manufacturing Processes with High Variation

    This thesis takes on the challenge of developing an assembly technique to deal with the problem of producing assemblies that have low dimensional variation by mating components that have high dimensional variation. The main objectives of the proposed model are the reduction of the resulting variation, the reduction of the scrap levels and the improvement of the process capability indices. The innovative approach proposes the dynamic management of specifications (target and tolerance) using the Statistical Dynamic Specifications Method (SDSM) and the assembly of complementing groups using the Statistical Feed-Forward Control Model (SFFCM). SDSM can be seen as a statistical tool that helps determine the right specification adjustments using reduced samples taken from groups of items produced consecutively during a short time interval, while SFFCM can be seen as a monitoring tool that helps counter the effect of a possible "detectable" long-term component present in the variation. With the help of proprietary software, the production of lots of one thousand assemblies, each made of two components coming from processes characterized by high dimensional variation, was simulated. For the analyzed conditions, in comparison to fully randomized assembly, the simulation results revealed an average reduction of the mean shift by 89%, an average reduction of the standard deviation by 14%, an average improvement of the actual capability index of the assembly process by 16%, an average improvement of the potential capability index of the assembly process by 101%, and an average reduction of the assemblies out of tolerance by 100%. In conclusion, the proposed SFFCM-based assembly technique effectively helped achieve the major objectives of this thesis: reduce the process variation, reduce the scrap level and improve the process capability indices.

    The contributions of this work to engineering knowledge are the Statistical Feed-Forward Control Model (SFFCM) presented here, as the core of an innovative assembly technique, and the Statistical Dynamic Specifications Method (SDSM) for managing dynamic specifications and tolerances. This work takes on the challenge of developing a new assembly technique in order to achieve the production of assemblies with low variation by mating components with high variation. The most important results of the proposed model are the reduction of the resulting variation, the reduction of the scrap rate and the improvement of the process capability indices. With the help of the specially developed Dynamic Assembling Simulation Software (DASS), a large series of experiments was designed to simulate the production of many lots of 1,000 assemblies made of two components with high variation, so that the individual and combined influences of various production-related factors could be evaluated. Compared with fully randomized assembly, the simulation results showed an average reduction of the mean shift by 89%, an average reduction of the standard deviation by 14%, an average improvement of the process capability index Cp of the assembly process by 16%, an average improvement of the process capability index Cpk of the assembly process by 101%, and an average reduction of the items out of tolerance by 100%. In conclusion, the proposed SFFCM-based assembly technique effectively contributes to achieving the main objectives of this work: reducing the process variation, reducing the scrap rate and improving the process capability indices. In summary, it is possible to obtain assemblies with low dimensional deviation from components with high scatter.

    Reviewing Traffic Classification (Data Traffic Monitoring and Analysis)

    Traffic classification has received increasing attention in recent years. It aims at offering the ability to automatically recognize the application that has generated a given stream of packets from the direct and passive observation of the individual packets, or streams of packets, flowing in the network. This ability is instrumental to a number of activities that are of extreme interest to carriers, Internet service providers and network administrators in general. Indeed, traffic classification is the basic building block required to enable any traffic management operation, from differentiated traffic pricing and treatment (e.g., policing, shaping, etc.) to security operations (e.g., firewalling, filtering, anomaly detection, etc.). Up to a few years ago, almost every Internet application used well-known transport-layer protocol ports that allowed its easy identification. More recently, the number of applications using random or non-standard ports has dramatically increased (e.g., Skype, BitTorrent, VPNs, etc.). Moreover, network applications are often configured to use well-known protocol ports assigned to other applications (e.g., TCP port 80, originally reserved for Web traffic) in an attempt to disguise their presence. For these reasons, and because of the importance of correctly classifying traffic flows, novel approaches based on packet inspection, statistical and machine learning techniques, and behavioral methods have been investigated and are becoming standard practice. In this chapter, we discuss the main trends in the field of traffic classification and we describe some of the main proposals of the research community. We complete this chapter by developing two examples of behavioral classifiers: both use supervised machine learning algorithms for classification, but each is based on different features to describe the traffic. After presenting them, we compare their performance using a large dataset, showing the benefits and drawbacks of each approach.
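
A minimal sketch of a behavioral flow classifier in the spirit described above: per-flow statistical features fed to a supervised learner. The features, labels and synthetic flows are invented for the example and are not the chapter's classifiers.

```python
# Illustrative sketch: summarise each flow with simple packet-size and
# inter-arrival statistics and train a supervised classifier on the features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def flow_features(pkt_sizes, inter_arrivals):
    """Summarise one flow by simple statistics of its packets."""
    return [np.mean(pkt_sizes), np.std(pkt_sizes), np.max(pkt_sizes),
            np.mean(inter_arrivals), np.std(inter_arrivals), len(pkt_sizes)]

def synthetic_flow(label):
    """Toy stand-in for real traffic: 'web'-like flows have larger packets."""
    n = rng.integers(20, 200)
    if label == "web":
        sizes, gaps = rng.normal(900, 300, n), rng.exponential(0.05, n)
    else:  # 'p2p'-like flows
        sizes, gaps = rng.normal(400, 200, n), rng.exponential(0.5, n)
    return flow_features(np.abs(sizes), gaps)

labels = ["web", "p2p"] * 500
X = np.array([synthetic_flow(l) for l in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100).fit(X_tr, y_tr)
print("accuracy on synthetic flows:", clf.score(X_te, y_te))
```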

    Online Analysis of Dynamic Streaming Data

    The work on the topic "Online Analysis of Dynamic Streaming Data" is concerned with measuring distances between dynamic, semi-structured data in continuous data streams, in order to enable analyses on these data structures already at runtime. To this end, a formalization for computing distances between static and dynamic trees is introduced and complemented by an explicit treatment of the dynamics of the attributes of individual tree nodes. The real-time analysis based on this distance measurement is combined with density-based clustering in order to demonstrate applications in clustering, classification, and anomaly detection. The results of this work are based on a theoretical analysis of the introduced formalization of distance measures for dynamic trees. These analyses are supported by empirical measurements on monitoring data of batch jobs from the batch system of the GridKa data and computing center. The evaluation of the proposed formalization and of the real-time analysis methods built on it demonstrates the efficiency and scalability of the approach. It is also shown that considering attributes and attribute statistics is of particular importance for the quality of the results when analyzing dynamic, semi-structured data. Furthermore, the evaluation shows that the quality of the results can be further improved by an independent combination of several distances. In particular, the results of this work enable the analysis of data that change over time.
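
The sketch below illustrates the two building blocks described above: an attribute-aware distance between trees (represented here simply as mappings from node path to attribute value) and density-based clustering on the pairwise distances, with outliers usable as anomalies. The concrete tree distance of the thesis is more elaborate; the representation, parameters and example data here are assumptions.

```python
# Illustrative sketch: structural-plus-attribute distance between trees and
# density-based clustering of the resulting distance matrix.
import numpy as np
from sklearn.cluster import DBSCAN

def tree_distance(t1, t2):
    """Structural part: symmetric difference of node paths.
    Attribute part: absolute difference of attributes on shared nodes."""
    nodes1, nodes2 = set(t1), set(t2)
    structural = len(nodes1 ^ nodes2)
    attributes = sum(abs(t1[n] - t2[n]) for n in nodes1 & nodes2)
    return structural + attributes

trees = [
    {"/job": 1.0, "/job/cpu": 0.8, "/job/io": 0.1},
    {"/job": 1.0, "/job/cpu": 0.7, "/job/io": 0.2},
    {"/job": 1.0, "/job/net": 5.0},                  # structurally different job
]
d = np.array([[tree_distance(a, b) for b in trees] for a in trees])
labels = DBSCAN(eps=0.5, min_samples=2, metric="precomputed").fit_predict(d)
print(labels)   # flows with label -1 are density outliers, i.e. anomaly candidates
```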

    Normal tissue complication probability modelling

    • 
