5,109 research outputs found

    Radial Velocity Prospects Current and Future: A White Paper Report prepared by the Study Analysis Group 8 for the Exoplanet Program Analysis Group (ExoPAG)

    [Abridged] The Study Analysis Group 8 of the NASA Exoplanet Program Analysis Group was convened to assess the current capabilities and the future potential of the precise radial velocity (PRV) method to advance the NASA goal to "search for planetary bodies and Earth-like planets in orbit around other stars" (U.S. National Space Policy, June 28, 2010). PRVs complement other exoplanet detection methods, for example offering a direct path to obtaining the bulk density, and thus the structure and composition, of transiting exoplanets. Our analysis builds upon previous community input, including the ExoPlanet Community Report chapter on radial velocities in 2008, the 2010 Decadal Survey of Astronomy, the Penn State Precise Radial Velocities Workshop response to the Decadal Survey in 2010, and the NSF Portfolio Review in 2012. The radial-velocity detection of exoplanets is strongly endorsed by both the Astro 2010 Decadal Survey "New Worlds, New Horizons" and the NSF Portfolio Review, and the community has recommended robust investment in PRVs. The demands on telescope time for the above mission support, especially for systems of small planets, will exceed the number of nights available using instruments now in operation by a factor of at least several for TESS alone. Pushing down towards true Earth twins will require more photons (i.e. larger telescopes), more stable spectrographs than are currently available, better calibration, and better correction for stellar jitter. We outline four hypothetical situations for PRV work necessary to meet NASA mission exoplanet science objectives. Comment: ExoPAG SAG 8 final report, 112 pages, fixed author name only

    Proton tracking in a high-granularity Digital Tracking Calorimeter for proton CT purposes

    Radiation therapy with protons, as practiced today, utilizes information from x-ray CT in order to estimate the proton stopping power of the traversed tissue in a patient. The conversion from x-ray attenuation to proton stopping power in tissue introduces range uncertainties of the order of 2-3% of the range, uncertainties that contribute to an increase of the necessary planning margins added to the target volume in a patient. Imaging methods and modalities, such as Dual Energy CT and proton CT, have come into consideration in the pursuit of obtaining as good an estimate of the proton stopping power as possible. In this study, a Digital Tracking Calorimeter is benchmarked as a proof of concept for proton CT purposes. The Digital Tracking Calorimeter is applied for reconstruction of the tracks and energies of individual high-energy protons. The presented prototype forms the basis for a proton CT system using a single technology for both tracking and calorimetry. This advantage simplifies the setup and reduces the cost of a proton CT system assembly, and it is a unique feature of the Digital Tracking Calorimeter. Data from the AGORFIRM beamline at KVI-CART in Groningen in the Netherlands and Monte Carlo simulation results are used to develop a tracking algorithm for the estimation of the residual ranges of a high number of concurrent proton tracks. The range of the individual protons can at present be estimated with a resolution of 4%. The readout system for this prototype is able to handle an effective proton frequency of 1 MHz by using 500 concurrent proton tracks in each readout frame, which is at the high end of present similar prototypes. A further optimized future prototype will enable a faster and more accurate determination of the ranges of individual protons in a therapeutic beam. Comment: 21 pages, 8 figures
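    The abstract does not detail the range-energy relation behind the residual-range estimates; as a rough illustration, the relation between a proton's energy and its range in water can be sketched with the textbook Bragg-Kleeman rule R(E) = αE^p, where α and p below are approximate literature values for water (the prototype's own calibration will differ):

    ```python
    # Sketch of a residual-range estimate via the Bragg-Kleeman rule.
    # ALPHA and P are approximate textbook values for protons in water,
    # not the calibration used by the Digital Tracking Calorimeter.

    ALPHA = 0.0022  # cm / MeV^P
    P = 1.77        # dimensionless exponent

    def range_in_water(energy_mev: float) -> float:
        """Continuous-slowing-down range (cm) of a proton of the given energy."""
        return ALPHA * energy_mev ** P

    def residual_range(initial_energy: float, deposited_energy: float) -> float:
        """Residual range after the proton has lost `deposited_energy` upstream."""
        remaining = max(initial_energy - deposited_energy, 0.0)
        return range_in_water(remaining)

    if __name__ == "__main__":
        print(f"full range at 190 MeV: {range_in_water(190.0):.1f} cm")
        print(f"residual after 60 MeV lost: {residual_range(190.0, 60.0):.1f} cm")
    ```

    A real proton CT reconstruction would fit per-proton energy depositions across the calorimeter layers rather than apply a closed-form rule, but the power-law captures why small energy errors translate into percent-level range errors.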

    Information-theoretic assessment of on-board near-lossless compression of hyperspectral data

    A rate-distortion model is proposed to measure the impact of near-lossless compression of raw data, that is, compression with a user-defined maximum absolute error, on the information available once the compressed data have been received and decompressed. Such a model requires the original uncompressed raw data and their measured noise variances. Advanced near-lossless methods are exploited only to measure the entropy of the datasets and are not required for on-board compression. In substance, the acquired raw data are regarded as a noisy realization of a noise-free spectral information source. The useful spectral information at the decoder is the mutual information between the unknown ideal source and the decoded source, which is affected by both instrument noise and compression-induced distortion. Experiments on simulated noisy images, in which the noise-free source and the noise realization are exactly known, show the trend of spectral information versus compression distortion, which in turn is related to the coded bit rate, or equivalently to the compression ratio (CR), through the rate-distortion characteristic of the encoder used on the satellite. Preliminary experiments on airborne visible infrared imaging spectrometer (AVIRIS) 2006 Yellowstone sequences match the trends of the simulations. The main conclusion that can be drawn is that the noisier the dataset, the lower the CR that can be tolerated in order to preserve a prescribed amount of spectral information. © The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI. (DOI: 10.1117/1.JRS.
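    A minimal sketch of the trade-off described above, under a Gaussian approximation (an assumption for illustration; the paper estimates entropy and mutual information from the data rather than from a closed form): the quantizer of a near-lossless coder with maximum absolute error δ is modelled as additive uniform noise of variance (2δ)²/12 = δ²/3, and the useful spectral information becomes a signal-to-(noise-plus-distortion) capacity.

    ```python
    import math

    def mutual_info_gaussian(sig_src: float, sig_noise: float, max_err: float) -> float:
        """Useful spectral information (bits/sample) between the ideal
        noise-free source and the decoded data, Gaussian approximation.
        The near-lossless quantizer with maximum absolute error `max_err`
        is modelled as uniform noise of variance max_err**2 / 3 -- an
        assumption, not the paper's data-driven entropy estimate."""
        distortion = sig_noise ** 2 + max_err ** 2 / 3.0
        return 0.5 * math.log2(1.0 + sig_src ** 2 / distortion)

    # Trend from the paper: the noisier the sensor, the less compression
    # distortion can be tolerated for a given information budget.
    clean = [mutual_info_gaussian(50.0, 2.0, d) for d in (0.0, 2.0, 8.0)]
    noisy = [mutual_info_gaussian(50.0, 10.0, d) for d in (0.0, 2.0, 8.0)]
    print("low-noise sensor: ", [round(i, 2) for i in clean])
    print("high-noise sensor:", [round(i, 2) for i in noisy])
    ```

    In this toy model the information of the noisy sensor both starts lower and has less headroom before distortion dominates, matching the paper's qualitative conclusion.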

    Computational and experimental methods for imaging and dosimetry in 177Lu radionuclide therapy : Classical and novel gamma cameras

    Radionuclide therapy (RNT) is a form of radiotherapy that uses unsealed radioactive sources for the delivery of ionising radiation within a patient's body. Radiation dosimetry is not used routinely in all centres, and the RNT field can benefit from more data on pharmacokinetics and absorbed doses (ADs). Consequently, there is value in developing and investigating methods that facilitate the acquisition of pharmacokinetic data and AD calculation. Papers I and IV focus on tumour dosimetry in peptide receptor radionuclide therapy (PRRT) with [177Lu]Lu-DOTA-TATE. In Paper I, a method for tumour dosimetry was developed, with the intention to be applicable to image sets consisting of a combination of planar and single photon emission computed tomography (SPECT) images. Semi-automatic segmentation methods were developed and employed for robustness and to alleviate the operator workload. Evaluation showed that the dosimetry method worked well provided that tumour selection criteria were applied. In Paper IV, this method was applied across all treatment cycles in a larger set of patients, producing a large collection of ADs and pharmacokinetic data for tumours. Analysis showed how the ADs evolved over treatment cycles and how this could be explained by changes in the pharmacokinetic parameters, findings which in the long run could help in the design of new treatment and imaging protocols.
    Papers II, III and V focus on a cadmium zinc telluride (CZT)-based hand-held gamma camera and lay the groundwork for its application within RNT. In Paper II, the camera was characterised and the feasibility of using it for 177Lu imaging was investigated. We found that it was capable of producing useful images and identified appropriate collimators and energy windows. In Paper III, we sought to improve the understanding of how the energy tailing associated with the CZT crystal affected 177Lu imaging, with the help of Monte Carlo simulations. The wide range of energies of interest for 177Lu meant that a new model of the camera system had to be developed and tuned to reproduce the camera's behaviour. Through the model, we were able to gain a better understanding of the camera and estimate the interference of higher-energy photons on lower energy windows. In Paper V, we aimed to develop a method with which the camera could be used for activity quantification. This was done by adapting a dual-photopeak method to 177Lu, a method in which measurements over multiple photopeaks are employed to infer the depth of a source, allowing for activity quantification with attenuation correction.
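    The dual-photopeak idea can be sketched as follows for the two main 177Lu imaging photopeaks (~113 keV and ~208 keV): because the lower-energy peak is attenuated more strongly, the ratio of counts in the two peaks decays exponentially with source depth, and the depth in turn enables an attenuation correction. The attenuation coefficients below are illustrative textbook values for water, not the calibration from Paper V.

    ```python
    import math

    # Approximate linear attenuation coefficients of water at the two main
    # 177Lu imaging photopeaks -- illustrative values, not the paper's.
    MU_113 = 0.170  # cm^-1 near 113 keV
    MU_208 = 0.137  # cm^-1 near 208 keV

    def depth_from_peak_ratio(ratio_at_depth: float, ratio_at_surface: float) -> float:
        """Infer source depth (cm): the 113/208 keV count ratio falls off as
        exp(-(MU_113 - MU_208) * depth), so depth follows from its logarithm."""
        return math.log(ratio_at_surface / ratio_at_depth) / (MU_113 - MU_208)

    def attenuation_corrected_counts(counts_208: float, depth_cm: float) -> float:
        """Undo water attenuation of the 208 keV peak once depth is known."""
        return counts_208 * math.exp(MU_208 * depth_cm)

    # Forward-simulate a source 5 cm deep, then recover the depth.
    surface_ratio = 1.5
    measured_ratio = surface_ratio * math.exp(-(MU_113 - MU_208) * 5.0)
    print(f"recovered depth: {depth_from_peak_ratio(measured_ratio, surface_ratio):.2f} cm")
    ```

    In practice the surface ratio, scatter and the CZT energy tailing discussed in Paper III all complicate this picture; the sketch only shows why two photopeaks suffice to separate depth from activity.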

    OCM 2021 - Optical Characterization of Materials : Conference Proceedings

    The state of the art in the optical characterization of materials is advancing rapidly. New insights have been gained into the theoretical foundations of this research and exciting developments have been made in practice, driven by new applications and innovative sensor technologies that are constantly evolving. The great success of past conferences proves the necessity of a platform for presentation, discussion and evaluation of the latest research results in this interdisciplinary field


    Noise Estimation of Hyperspectral Remote Sensing Image Based on Multiple Linear Regression and Wavelet Transform

    Noise estimation of hyperspectral remote sensing images is important for their post-processing and application. In this paper, not only the removal of spectral correlation is considered, but also the removal of spatial correlation by wavelet transform. Therefore, a new method based on multiple linear regression (MLR) and wavelet transform is proposed to estimate the noise of hyperspectral remote sensing images. A numerical simulation on AVIRIS data is carried out, and real Hyperion data are also used to validate the proposed algorithm. Experimental results show that the method is more adaptive and accurate than the general MLR and other classical methods.
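    The abstract does not give the paper's exact MLR formulation; the following is a minimal single-predictor simplification of regression-based noise estimation, in which a band is regressed on a spectral neighbour and the noise level is read off the residual scatter. The wavelet step for spatial decorrelation is omitted, and the √2 correction assumes equal, independent noise in the two bands:

    ```python
    import math
    import random
    import statistics

    def estimate_band_noise(band, neighbor):
        """Estimate a band's noise standard deviation from the residuals of a
        simple linear regression on a spectral neighbour.

        Single-predictor simplification of the MLR approach; dividing by
        sqrt(2) assumes both bands carry equal, independent noise."""
        n = len(band)
        mx = sum(neighbor) / n
        my = sum(band) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(neighbor, band))
        sxx = sum((x - mx) ** 2 for x in neighbor)
        slope = sxy / sxx
        residuals = [y - (my + slope * (x - mx)) for x, y in zip(neighbor, band)]
        return statistics.pstdev(residuals) / math.sqrt(2)

    # Synthetic check: two strongly correlated bands with known noise (std = 2.0).
    random.seed(0)
    signal = [random.uniform(100.0, 200.0) for _ in range(5000)]
    band_a = [s + random.gauss(0.0, 2.0) for s in signal]
    band_b = [s + random.gauss(0.0, 2.0) for s in signal]
    print(f"estimated noise std: {estimate_band_noise(band_a, band_b):.2f}")
    ```

    The regression removes the shared spectral signal, so what remains in the residuals is (mostly) noise; the paper's contribution is to also remove residual spatial correlation with a wavelet transform before measuring the scatter.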