Evaluation of Intra- and Interscanner Reliability of MRI Protocols for Spinal Cord Gray Matter and Total Cross-Sectional Area Measurements.
Background: In vivo quantification of spinal cord atrophy in neurological diseases using MRI has attracted increasing attention.
Purpose: To compare across different platforms the most promising imaging techniques for assessing human spinal cord atrophy.
Study type: Test/retest multiscanner study.
Subjects: Twelve healthy volunteers.
Field strength/sequence: Three different 3T scanner platforms (Siemens, Philips, and GE); optimized phase-sensitive inversion recovery (PSIR), T1-weighted (T1-w), and T2*-weighted (T2*-w) protocols.
Assessment: On all images acquired, two operators assessed the contrast-to-noise ratio (CNR) between gray matter (GM) and white matter (WM), and between WM and cerebrospinal fluid (CSF); one experienced operator measured total cross-sectional area (TCA) and GM area using JIM and the Spinal Cord Toolbox (SCT).
Statistical tests: Coefficient of variation (COV); intraclass correlation coefficient (ICC); mixed-effect models; analysis of variance (t-tests).
Results: For all scanners, GM/WM CNR was higher for PSIR than for T2*-w (P < 0.0001), and WM/CSF CNR was highest for T1-w (P < 0.0001). For TCA, median COVs using JIM were smaller than 1.5% with ICC > 0.95, while median COVs using SCT were in the range 2.2-2.75% with ICC 0.79-0.95. For GM, despite some failures of the automatic segmentation, median COVs using SCT on T2*-w were smaller than those using manual JIM segmentations on PSIR. In the mixed-effect models, the subject was always the main contributor to the variance of area measurements, and the scanner often contributed to TCA variance (P < 0.05). Using JIM, TCA measurements on T2*-w differed from those on PSIR (P = 0.0021) and T1-w (P = 0.0018), while using SCT no notable differences were found between T1-w and T2*-w (P = 0.18). JIM- and SCT-derived TCA did not differ on T1-w (P = 0.66) but did differ on T2*-w (P < 0.0001). GM areas derived using SCT/T2*-w versus JIM/PSIR were different (P < 0.0001).
Data conclusion: The present work sets reference values for the magnitude of the contribution of different effects to the intra- and interscanner variability of cord area measurements.
Level of evidence: 1. Technical efficacy: Stage 4. J. Magn. Reson. Imaging 2019;49:1078-1090
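As a concrete illustration of the COV statistic used in this study, the sketch below computes the coefficient of variation for repeated area measurements. The area values are made-up numbers for illustration only, not data from the paper.

```python
# Illustrative sketch: coefficient of variation (COV) across repeated scans,
# the reliability statistic used in the study above. Values are hypothetical.

def cov_percent(values):
    """COV = sample standard deviation / mean, expressed in percent."""
    n = len(values)
    mean = sum(values) / n
    # sample standard deviation (n - 1 denominator)
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# Hypothetical total cross-sectional areas (mm^2) for one subject,
# scan and rescan on one scanner:
tca_scans = [72.1, 71.4, 72.6]
print(round(cov_percent(tca_scans), 2))  # 0.84
```

A sub-1% COV of this kind is what the study reports for JIM-based TCA; a larger spread across repeat scans would push the COV toward the 2-3% range reported for SCT.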
Detection of Magnetic Fields Using Fibre Optic Interferometric Sensors.
The principal aim of the work described in this thesis is to determine a suitable optical detection system for d.c. and low-frequency magnetic fields of the kind likely to be encountered in practical magnetometer applications. To construct a sensitive magnetometer, one arm of an optical fibre Mach-Zehnder interferometer has been magnetically sensitised using a magnetostrictive material. Since the signal frequency range of interest was in the region of 0.01 to 10 Hz, the signal lay in the same frequency band as the environmental noise associated with ambient temperature and pressure variations. Initially, a technique was developed to measure the magnetic field from the shift of the total fringe pattern generated by a modified Mach-Zehnder interferometer, and a minimum detectable magnetic field of 10⁻⁷ tesla·m was obtained. This minimum detectable magnetic field has been improved by a number of modifications. A technique has been developed which utilises an a.c. bias field to put the magnetic signal on a carrier, so that it can be measured at a frequency where the amplitude of the interferometer's 1/f noise is much reduced. To maintain maximum interferometric sensitivity to this signal, active homodyne demodulation techniques have been developed to hold the interferometer at quadrature by compensating for the environmental noise. A minimum detectable magnetic field of 5×10⁻¹⁰ tesla·m has been achieved with this system. As an alternative to the Mach-Zehnder interferometer, a Fabry-Perot interferometer, which utilises multiple-beam interference, has been considered. This type of interferometer consists of a single fibre with high-reflectivity coatings on its ends. Such an interferometer has been used both as a sensor and as an external cavity in a laser frequency stabilisation scheme
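The a.c.-bias/carrier idea in this abstract can be sketched numerically: a slow magnetic signal is mixed onto a carrier by the bias field, then recovered by homodyne (lock-in) demodulation, i.e. multiplication by the reference followed by low-pass filtering. The frequencies and amplitudes below are illustrative assumptions, not values from the thesis.

```python
import math

# Sketch of carrier-based detection: a slow signal amplitude-modulates a
# carrier (the a.c. bias field), escaping the 1/f noise band; homodyne
# demodulation (multiply by reference, low-pass) recovers it.
# All numbers are illustrative, not from the thesis.

fs = 2000.0          # sample rate (Hz)
f_carrier = 100.0    # a.c. bias (carrier) frequency (Hz)
f_signal = 1.0       # slow magnetic signal of interest (Hz)
n = 4000             # 2 s of data

t = [i / fs for i in range(n)]
signal = [0.5 * math.sin(2 * math.pi * f_signal * ti) for ti in t]
# amplitude-modulated carrier, as produced by the magnetostrictive mixing
measured = [s * math.sin(2 * math.pi * f_carrier * ti)
            for s, ti in zip(signal, t)]

# demodulate: multiply by the reference, then a crude moving-average low-pass
mixed = [m * math.sin(2 * math.pi * f_carrier * ti)
         for m, ti in zip(measured, t)]
win = int(fs / f_carrier)     # average over exactly one carrier period
recovered = [2 * sum(mixed[i:i + win]) / win for i in range(n - win)]

# the recovered waveform tracks the original slow signal closely
err = max(abs(r - s) for r, s in zip(recovered, signal))
print(err < 0.05)  # True
```

Multiplying by the reference turns the modulated carrier into the signal times sin², whose average over one carrier period is half the signal; the factor of 2 restores the amplitude.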
School building survey and program for Portsmouth and Middletown, RI /
Thesis (M.A.)--Boston University, 1931. This item was digitized by the Internet Archive
Anorogenic plateau formation: The importance of density changes in the lithosphere
Away from active plate boundaries, the relationships between spatiotemporal variations in density and the geothermal gradient are important for understanding the evolution of topography in continental interiors. In this context, the classic concept of the continental lithosphere as comprising three static layers of different densities (upper crust, lower crust, and upper mantle) is not adequate for assessing long-term changes in topography and relief in regions associated with pronounced thermal anomalies in the mantle. We have therefore developed a one-dimensional model, based on thermodynamic equilibrium assemblage computations, which deliberately excludes the effects of melting processes such as intrusion or extrusion. Our model calculates the "metamorphic density" of rocks as a function of pressure, temperature, and chemical composition. It not only provides a useful tool for quantifying the influence of petrologic characteristics on density, but also allows the modelled "metamorphic" density to be adjusted to variable geothermal gradients and applied to different geodynamic environments. We have used this model to simulate a scenario in which the lithosphere-asthenosphere boundary is subjected to continuous heating over a long period of time (130 Ma), and demonstrate how an anorogenic plateau with an elevation of 1400 m can be formed solely as a result of heat transfer within the continental lithosphere. Our results show that, besides dynamic topography (of asthenospheric origin), density changes within the lithosphere have an important impact on the evolution of anorogenic plateaus
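The order of magnitude of the uplift described above can be checked with a back-of-the-envelope isostasy calculation: under local (Airy-type) compensation, lowering the mean density of a lithospheric column raises its surface. This is a toy estimate with illustrative numbers, not the authors' model, which derives "metamorphic density" from full equilibrium assemblage computations.

```python
# Toy Airy-isostasy check of the uplift mechanism: a column floating on the
# asthenosphere rises when a layer within it becomes less dense.
# Layer thickness, density drop, and asthenosphere density are assumptions.

def isostatic_uplift(layer_thickness_m, density_drop, asthenosphere_density=3200.0):
    """Surface uplift (m) when a layer's density drops by density_drop (kg/m^3)."""
    return layer_thickness_m * density_drop / asthenosphere_density

# e.g. a 100 km thick mantle lithosphere whose density falls by 45 kg/m^3
# on heating yields roughly 1.4 km of uplift, the order of the plateau
# elevation reported above
print(isostatic_uplift(100e3, 45.0))  # 1406.25
```

The point of the comparison is only that density changes of a few tens of kg/m³ over lithospheric thicknesses are sufficient, in principle, to build kilometre-scale anorogenic topography.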
An Open-Source Tool for Anisotropic Radiation Therapy Planning in Neuro-oncology Using DW-MRI Tractography.
There is evidence from histopathological studies that glioma tumor cells migrate preferentially along large white matter bundles. If the peritumoral white matter structures can be used to predict the likely trajectory of migrating tumor cells outside of the surgical margin, then this information could be used to inform the delineation of radiation therapy (RT) targets. In theory, an anisotropic expansion that takes large white matter bundle anatomy into account may maximize the chances of treating migrating cancer cells and minimize the amount of brain tissue exposed to high doses of ionizing radiation. Diffusion-weighted MRI (DW-MRI) can be used in combination with fiber tracking algorithms to model the trajectory of large white matter pathways using the direction and magnitude of water movement in tissue. The method presented here is a tool for translating a DW-MRI fiber tracking (tractography) dataset into a white matter path length (WMPL) map that assigns each voxel the shortest distance along a streamline back to a specified region of interest (ROI). We present an open-source WMPL tool, implemented in the package Diffusion Imaging in Python (DIPY), and code to convert the resulting WMPL map to anisotropic contours for RT in a commercial treatment planning system. This proof-of-concept lays the groundwork for future studies to evaluate the clinical value of incorporating tractography modeling into treatment planning
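The core WMPL computation described above can be sketched in a few lines: each voxel traversed by a streamline is assigned the shortest along-streamline distance back to the ROI. The actual tool is built on DIPY; this standalone sketch uses toy streamlines, a voxel size of 1, and a hypothetical `wmpl_map` helper of my own naming.

```python
import math

# Minimal sketch of a white matter path length (WMPL) map: for every voxel a
# streamline passes through, keep the shortest arc-length distance along any
# streamline back to a region of interest (ROI). Toy data, unit voxels.

def wmpl_map(streamlines, roi_voxels):
    """Map voxel -> shortest along-streamline distance to any ROI voxel."""
    wmpl = {}
    for sl in streamlines:                        # sl: list of (x, y, z) points
        # cumulative arc length along the streamline
        cum = [0.0]
        for a, b in zip(sl, sl[1:]):
            cum.append(cum[-1] + math.dist(a, b))
        # arc-length positions where the streamline lies inside the ROI
        hits = [cum[i] for i, p in enumerate(sl)
                if tuple(map(int, p)) in roi_voxels]
        if not hits:
            continue                              # streamline never reaches ROI
        for i, p in enumerate(sl):
            d = min(abs(cum[i] - h) for h in hits)
            v = tuple(map(int, p))
            wmpl[v] = min(wmpl.get(v, d), d)
    return wmpl

# toy example: one straight streamline leaving an ROI at the origin
sl = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
m = wmpl_map([sl], {(0, 0, 0)})
print(m[(3, 0, 0)])  # 3.0
```

Thresholding such a map at a chosen path length would then yield the anisotropic contour that follows white matter anatomy rather than a uniform geometric margin.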
Orthogonal polynomials and the moment problem
The classical moment problem concerns distribution functions on the real line. Its central feature is the connection between distribution functions and the moment sequences which they generate via a Stieltjes integral. The solution of the classical moment problem leads to the well-known theorem of Favard, which connects orthogonal polynomial sequences with distribution functions on the real line. Orthogonal polynomials in their turn arise in the computation of measures via continued fractions and the Nevanlinna parametrisation. In this dissertation classical orthogonal polynomials are investigated first and their connection with hypergeometric series is exhibited. Results from the moment problem allow the study of a more general class of orthogonal polynomials. q-Hypergeometric series are presented in analogy with the ordinary hypergeometric series, and some results on q-Laguerre polynomials are given. Finally, recent research is discussed
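The recurrence/orthogonality link at the heart of Favard's theorem can be illustrated numerically: polynomials generated by a suitable three-term recurrence are orthogonal with respect to some distribution. The sketch below builds the Legendre polynomials, which are orthogonal for the uniform weight on [-1, 1], and checks orthogonality with a simple midpoint quadrature (a textbook example, not taken from the dissertation).

```python
# Legendre polynomials from their three-term recurrence
#   (k + 1) P_{k+1}(x) = (2k + 1) x P_k(x) - k P_{k-1}(x),
# followed by a numerical check of orthogonality on [-1, 1].

def legendre(n, x):
    """Evaluate P_n(x) via the three-term recurrence."""
    p_prev, p = 1.0, x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

def inner(m, n, steps=20000):
    """Midpoint approximation of the integral of P_m * P_n over [-1, 1]."""
    h = 2.0 / steps
    return sum(legendre(m, -1 + (i + 0.5) * h) * legendre(n, -1 + (i + 0.5) * h)
               for i in range(steps)) * h

print(abs(inner(2, 3)) < 1e-6)          # True: distinct degrees are orthogonal
print(abs(inner(2, 2) - 2 / 5) < 1e-4)  # True: squared norm is 2 / (2n + 1)
```

Favard's theorem runs this construction in reverse: given any recurrence of this shape with admissible coefficients, some distribution function makes the generated polynomials orthogonal.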
Time response of oxygen optodes on profiling platforms and its dependence on flow speed and temperature
The time response behavior of Aanderaa optode models 3830, 4330, and 4330F, as well as of a Sea-Bird SBE63 optode and a JFE Alec Co. Rinko dissolved oxygen sensor, was analyzed both in the laboratory and in the field. The main factor governing the time response is the dynamic regime, i.e., the water flow around the sensor, which influences the dynamics of the boundary layer. Response times can be drastically reduced if the sensors are pumped. Laboratory experiments under different dynamic conditions showed a close-to-linear relation between response time and temperature. Application of a diffusion model including a stagnant boundary layer revealed that molecular diffusion determines the temperature behavior and that the boundary layer thickness is temperature independent. Moreover, field experiments matched the laboratory findings, with the profiling speed and the mode of attachment being of prime importance. The time response was characterized for typical deployments on shipboard CTDs, gliders, and floats, and tools are presented to predict the response time as well as to quantify its effect on the data for a given water-mass profile. Finally, the problem of inverse filtering optode data to recover some of the information lost through the time response is addressed
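The inverse-filtering idea mentioned at the end of this abstract can be sketched with the usual first-order sensor model: an optode with time constant tau acts as a low-pass filter on the true oxygen signal, and that filter can be inverted sample by sample. The time constant, sampling interval, and oxygen values below are illustrative assumptions; the paper's actual tools also account for flow speed and temperature.

```python
import math

# First-order lag model of a slow sensor and its inverse filter.
# Discrete form: O_k = a * O_{k-1} + (1 - a) * C_k, with a = exp(-dt / tau),
# where C is the true signal and O the measured one. Numbers are illustrative.

def lag_filter(truth, tau, dt):
    """Simulate a first-order sensor response with time constant tau."""
    a = math.exp(-dt / tau)
    out = [truth[0]]
    for c in truth[1:]:
        out.append(a * out[-1] + (1 - a) * c)
    return out

def inverse_filter(measured, tau, dt):
    """Invert the first-order lag to estimate the true signal."""
    a = math.exp(-dt / tau)
    est = [measured[0]]
    for prev, cur in zip(measured, measured[1:]):
        est.append((cur - a * prev) / (1 - a))
    return est

dt, tau = 1.0, 25.0                      # e.g. 1 s sampling, 25 s optode lag
truth = [200.0] * 50 + [250.0] * 50      # oxygen step (umol/kg), illustrative
measured = lag_filter(truth, tau, dt)
recovered = inverse_filter(measured, tau, dt)
print(max(abs(r - c) for r, c in zip(recovered, truth)) < 1e-9)  # True
```

With real, noisy data the inversion is only partial, since the inverse filter amplifies high-frequency noise; that trade-off is exactly why the paper speaks of recovering "some" of the lost information.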