2D and 3D reconstructions in acousto-electric tomography
We propose and test stable algorithms for the reconstruction of the internal
conductivity of a biological object using acousto-electric measurements.
Namely, the conventional impedance tomography scheme is supplemented by
scanning the object with acoustic waves that slightly perturb the conductivity
and cause a change in the electric potential measured on the boundary of the
object. These perturbations of the potential are then used as the data for the
reconstruction of the conductivity. The present method does not rely on
"perfectly focused" acoustic beams. Instead, more realistic propagating
spherical fronts are utilized, and then the measurements that would correspond
to perfect focusing are synthesized. In other words, we use \emph{synthetic
focusing}. Numerical experiments with simulated data show that our techniques
produce high quality images, both in 2D and 3D, and that they remain accurate
in the presence of high-level noise in the data. Local uniqueness and stability
for the problem are also shown to hold.
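As a rough illustration of the synthetic-focusing idea, the sketch below sets up a toy linearized 2D model (the grid size, transducer positions, and thin-annulus forward map are all assumptions for illustration, not the authors' setup): each circular front samples the unknown perturbation on a thin annulus, and point-focused data are then synthesized from those measurements by regularized least squares.

    import numpy as np

    # Toy 2D grid of internal points.
    n = 32
    xs = np.linspace(0.0, 1.0, n)
    X, Y = np.meshgrid(xs, xs)

    # Hypothetical transducer centers outside the object, and front radii.
    centers = [(0.5, -0.1), (-0.1, 0.5), (1.1, 0.5), (0.5, 1.1)]
    radii = np.linspace(0.1, 1.2, 60)

    # Linearized model: a spherical (circular, in 2D) front integrates the
    # perturbation over a thin annulus around each radius.
    rows = []
    for cx, cy in centers:
        dist = np.hypot(X - cx, Y - cy)
        for r in radii:
            rows.append((np.abs(dist - r) < 0.02).ravel().astype(float))
    A = np.array(rows)                      # (measurements, pixels)

    # Simulated measurements for a hidden conductivity perturbation f.
    f = np.zeros((n, n))
    f[10:14, 18:22] = 1.0
    g = A @ f.ravel()

    # Synthetic focusing in this toy model: recover the point-focused
    # response by (implicitly regularized) least squares.
    f_hat, *_ = np.linalg.lstsq(A, g, rcond=1e-3)
    print("relative error:",
          np.linalg.norm(f_hat - f.ravel()) / np.linalg.norm(f.ravel()))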
Class Hierarchy Complementation: Soundly Completing a Partial Type Graph
We present the problem of class hierarchy complementation: given a partially known hierarchy of classes together with subtyping constraints ("A has to be a transitive subtype of B"), complete the hierarchy so that it satisfies all constraints. The problem has immediate practical application to the analysis of partial programs, e.g., it arises in the process of providing a sound handling of "phantom classes" in the Soot program analysis framework. We provide algorithms to solve the hierarchy complementation problem in the single inheritance and multiple inheritance settings. We also show that the problem in a language such as Java, with single inheritance but multiple subtyping and distinguished class vs. interface types, can be decomposed into separate single- and multiple-subtyping instances. We implement our algorithms in a tool, JPhantom, which complements partial Java bytecode programs so that the result is guaranteed to satisfy the Java verifier requirements. JPhantom is highly scalable and runs in mere seconds even for large input applications and complex constraints (with a maximum of 14s for a 19MB binary).
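To make the single-inheritance case concrete, here is a minimal sketch (a simplification for illustration, not JPhantom's actual algorithm; the class names and constraints are hypothetical): order all types topologically so that supertypes precede subtypes, give each class the most specific of its constrained supertypes as its direct superclass, then verify the transitive constraints.

    from graphlib import TopologicalSorter

    # "sub must be a transitive subtype of sup" constraints; P1, P2 are
    # phantom (unknown) classes -- a hypothetical example.
    constraints = [("A", "P1"), ("P1", "P2"), ("B", "P2"), ("P2", "Object")]

    # Treat each constrained supertype as a predecessor and order the
    # types so that supertypes come first.
    graph = {}
    for sub, sup in constraints:
        graph.setdefault(sub, set()).add(sup)
    order = list(TopologicalSorter(graph).static_order())
    pos = {c: i for i, c in enumerate(order)}

    # Chain completion: the direct superclass is the most specific (latest
    # in topological order) constrained supertype. This suffices here only
    # because each class's constraints are linearly ordered; the general
    # case needs the paper's full algorithm.
    completed = {c: (max(graph[c], key=pos.get) if c in graph else None)
                 for c in order}

    def supers(c):
        """All transitive supertypes of c in the completed hierarchy."""
        out = []
        while completed.get(c):
            c = completed[c]
            out.append(c)
        return out

    # Every constraint must now hold transitively.
    assert all(sup in supers(sub) for sub, sup in constraints)
    print(completed)  # e.g. {'Object': None, 'P2': 'Object', 'P1': 'P2', ...}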
Monte-Carlo Simulations of Radiation-Induced Activation in a Fast-Neutron and Gamma-Based Cargo Inspection System
An air cargo inspection system combining two nuclear reaction based
techniques, namely Fast-Neutron Resonance Radiography and Dual-Discrete-Energy
Gamma Radiography is currently being developed. This system is expected to
allow detection of standard and improvised explosives as well as special
nuclear materials. An important aspect for the applicability of nuclear
techniques in an airport inspection facility is the inventory and lifetimes of
radioactive isotopes produced by the neutron and gamma radiation inside the
cargo, as well as the dose delivered by these isotopes to people in contact
with the cargo during and following the interrogation procedure. Using MCNPX
and CINDER90 we have calculated the activation levels for several typical
inspection scenarios. One example is the activation of various metal samples
embedded in a cotton-filled container. To validate the simulation results, a
benchmark experiment was performed, in which metal samples were activated by
fast-neutrons in a water-filled glass jar. The induced activity was determined
by analyzing the gamma spectra. Based on the calculated radioactive inventory
in the container, the dose levels due to the induced gamma radiation were
calculated at several distances from the container and in relevant time windows
after the irradiation, in order to evaluate the radiation exposure of the cargo
handling staff, air crew and passengers during flight. The possibility of
remanent long-lived radioactive inventory after cargo is delivered to the
client is also of concern and was evaluated.
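As a back-of-the-envelope illustration of the dose-evaluation step (not the MCNPX/CINDER90 workflow; the nuclides, activities, and dose-rate constants below are illustrative assumptions), one can decay a hypothetical activation inventory and estimate the gamma dose rate at a given distance with a point-source, inverse-square approximation:

    import math

    # (isotope, activity at end of irradiation [Bq], half-life [s],
    #  dose-rate constant [uSv * m^2 / (h * GBq)]) -- illustrative values.
    inventory = [
        ("Na-24", 5.0e4, 14.96 * 3600, 450.0),
        ("Mn-56", 2.0e5, 2.58 * 3600, 230.0),
    ]

    def dose_rate(t_s, dist_m):
        """Total gamma dose rate [uSv/h] at time t_s after irradiation."""
        total = 0.0
        for _name, a0, t_half, gamma in inventory:
            a = a0 * math.exp(-math.log(2.0) * t_s / t_half)  # decay law
            total += gamma * (a / 1e9) / dist_m ** 2          # point source
        return total

    for hours in (0, 1, 12, 48):
        print(f"t = {hours:2d} h: "
              f"{dose_rate(hours * 3600, 1.0):.4f} uSv/h at 1 m")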
Development of quality standards for multi-center, longitudinal magnetic resonance imaging studies in clinical neuroscience
Magnetic resonance imaging (MRI) data are generated by a complex procedure, and many possible sources of error can degrade the signal. For example, hidden defective components of an MRI scanner, changes in the static magnetic field caused simply by a person moving in the MRI scanner room, and changes in the measurement sequences can all lower the signal-to-noise ratio (SNR). A comprehensive, reproducible quality assurance (QA) procedure is therefore necessary to ensure reproducible results from both the MRI equipment and its human operators. There are two ways to examine the quality of MRI data. On the one hand, water- or gel-filled objects, so-called "phantoms", are measured regularly; based on this signal, which ideally should always be stable, the general performance of the MRI scanner can be tested. On the other hand, the data of actual interest, mostly human data, are checked directly for certain signal parameters (e.g., SNR, motion parameters).
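For illustration, here is a minimal sketch of one such signal parameter, assuming the common two-region SNR estimate on a phantom image (mean of a central signal region of interest divided by the standard deviation of a background region); the ROI sizes and the synthetic phantom are assumptions, and this is not the FOR2107 pipeline itself.

    import numpy as np

    def snr(image):
        """SNR = mean(central signal ROI) / std(background air ROI)."""
        h, w = image.shape
        signal = image[h//2 - 8:h//2 + 8, w//2 - 8:w//2 + 8]  # center ROI
        background = image[:16, :16]                          # corner (air)
        return float(signal.mean() / background.std())

    # Synthetic phantom slice: bright disk plus Gaussian noise.
    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[:128, :128]
    img = (100.0 * ((xx - 64)**2 + (yy - 64)**2 < 40**2)
           + rng.normal(0.0, 5.0, (128, 128)))
    print(f"SNR ~ {snr(img):.1f}")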
This thesis consists of two parts. In the first part, a study-specific QA protocol was developed for a large multi-center MRI study, FOR2107. The aim of FOR2107 is to investigate the causes and course of affective disorders, unipolar depression and bipolar disorders, taking clinical and neurobiological effects into account. The central element of FOR2107 is the MRI measurement of more than 2000 subjects in a longitudinal design (currently repeated measurements after 2 years, with further measurements planned after 5 years). To bring MRI data and disease history together, the MRI data must provide stable results over the course of the study, and ensuring this stability is the subject of this part of the work. An extensive QA procedure, based on phantom measurements, human-data analysis, protocol-compliance testing, etc., was set up. In addition to the development of parameters for characterizing MRI data, the QA protocols in use were improved during the study. The differences between sites and the impact of these differences on human-data analysis were analyzed. The comprehensive quality assurance for the FOR2107 study revealed significant differences in the MRI signal (for both human and phantom data) between the centers. Problems that arose could be recognized in time and corrected, and the site differences must be taken into account in current and future analyses of the human data.
In the second part of this thesis, a QA protocol (and the freely available associated software "LAB-QA2GO") was developed and tested; it can be used for individual studies or to monitor the quality of an MRI scanner. This routine was developed because at many sites and in many studies no explicit QA is performed, even though suitable, freely available QA software for MRI measurements exists. With LAB-QA2GO, it is possible to set up a QA protocol for an MRI scanner or a study without much effort or IT knowledge.
Both parts of the thesis deal with the implementation of QA procedures. High-quality data and study results can be achieved only through the use of appropriate QA procedures, as presented in this work. QA measures should therefore be implemented at all levels of a project and integrated permanently into project and evaluation routines.
A Data-Driven Edge-Preserving D-bar Method for Electrical Impedance Tomography
In Electrical Impedance Tomography (EIT), the internal conductivity of a body
is recovered via current and voltage measurements taken at its surface. The
reconstruction task is a highly ill-posed nonlinear inverse problem, which is
very sensitive to noise, and requires the use of regularized solution methods,
of which D-bar is the only proven method. The resulting EIT images have low
spatial resolution due to the smoothing caused by the low-pass filtering inherent in the regularization.
In many applications, such as medical imaging, it is known \emph{a priori} that
the target contains sharp features such as organ boundaries, as well as
approximate ranges for realistic conductivity values. In this paper, we use
this information in a new edge-preserving EIT algorithm, based on the original
D-bar method coupled with a deblurring flow stopped at a minimal data
discrepancy. The method makes heavy use of a novel data fidelity term based on
the so-called {\em CGO sinogram}. This nonlinear data step provides superior
robustness over traditional EIT data formats such as current-to-voltage
matrices or Dirichlet-to-Neumann operators, for commonly used current patterns.
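The following toy sketch illustrates only the stopping rule, not the paper's method: a crude smoothing operator stands in for both the D-bar low-pass behavior and the forward map, an unsharp-masking step stands in for the deblurring flow, and the iteration halts at the first increase of the data discrepancy. All operators and parameters here are assumptions for illustration; the actual method uses the CGO sinogram as its fidelity term.

    import numpy as np

    def blur(u, k):
        """Crude periodic low-pass filter (neighbor averaging, k passes)."""
        for _ in range(k):
            u = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                        + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        return u

    rng = np.random.default_rng(1)
    truth = np.zeros((64, 64))
    truth[20:40, 25:45] = 1.0                                 # sharp inclusion
    data = blur(truth, 8) + rng.normal(0.0, 0.01, truth.shape)  # noisy data

    u = data.copy()                      # smooth, D-bar-style starting image
    best_err = np.inf
    for it in range(50):
        u = u + 0.5 * (u - blur(u, 1))             # sharpening (flow) step
        err = np.linalg.norm(blur(u, 8) - data)    # data discrepancy
        if err >= best_err:
            break                                   # discrepancy rose: stop
        best, best_err = u.copy(), err              # keep the best iterate
    print(f"stopped after {it} steps, discrepancy {best_err:.3f}")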