ICAP: An Interactive Cluster Analysis Procedure for analyzing remotely sensed data
An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates the results and may elect to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically; and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.
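The abstract describes the control flow but not the code. The following is a minimal Python sketch, not the original APL implementation, of how such an alternating algorithm/analyst loop could be organized; the command names ('cluster', 'delete', 'lump', 'add', 'summary') and the get_command callback are illustrative assumptions.

```python
import numpy as np

def assign(data, centroids):
    """Label each pixel vector with the index of its nearest centroid."""
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def recompute(data, labels, centroids):
    """Move each centroid to the mean of its assigned pixels; leave empty clusters alone."""
    new = centroids.copy()
    for i in range(len(centroids)):
        members = data[labels == i]
        if len(members):
            new[i] = members.mean(axis=0)
    return new

def interactive_clustering(data, centroids, get_command):
    """Alternate control between the clustering algorithm and the analyst.

    get_command(centroids) stands in for the analyst's terminal input and
    should return tuples such as ('cluster',), ('delete', i), ('lump', i, j),
    ('add', vector), ('summary',) or ('done',).
    """
    while True:
        cmd = get_command(centroids)
        if cmd[0] == 'done':
            return centroids, assign(data, centroids)
        if cmd[0] == 'cluster':                    # one automatic clustering pass
            labels = assign(data, centroids)
            centroids = recompute(data, labels, centroids)
        elif cmd[0] == 'delete':                   # analyst drops a cluster
            centroids = np.delete(centroids, cmd[1], axis=0)
        elif cmd[0] == 'lump':                     # analyst merges a pair of clusters
            merged = centroids[[cmd[1], cmd[2]]].mean(axis=0)
            centroids = np.delete(centroids, [cmd[1], cmd[2]], axis=0)
            centroids = np.vstack([centroids, merged])
        elif cmd[0] == 'add':                      # analyst supplies a new centroid
            centroids = np.vstack([centroids, np.asarray(cmd[1])])
        elif cmd[0] == 'summary':                  # per-cluster statistics on request
            labels = assign(data, centroids)
            for i in range(len(centroids)):
                members = data[labels == i]
                print(f"cluster {i}: {len(members)} pixels, "
                      f"mean={members.mean(axis=0) if len(members) else 'n/a'}")
```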
Using CMOS Sensors in a Cellphone for Gamma Detection and Classification
The CMOS camera found in many cellphones is sensitive to ionized electrons. Gamma rays penetrate into the phone and produce ionized electrons that are then detected by the camera. Thermal noise and other noise need to be removed on the phone, which requires an algorithm with relatively low memory and computational requirements. The continuous high-delta algorithm described here fits those requirements. Only a small fraction of the energy of even the electron is deposited in the camera sensor, so direct methods of measuring the energy cannot be used. However, the fraction of groups of lit-up pixels that are lines is correlated with the energy of the gamma rays, and under certain conditions this correlation allows a limited, low-resolution estimate of the gamma-ray energy.
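The paper's continuous high-delta algorithm is not reproduced in the abstract; the Python sketch below only illustrates the general idea under stated assumptions: track a low-memory noise baseline, flag pixels that jump above it, and classify connected groups of lit-up pixels as line-like or blob-like from their elongation. The threshold values, the exponential baseline, and the elongation test are illustrative choices, not the paper's method.

```python
import numpy as np
from scipy import ndimage

def update_baseline(baseline, frame, alpha=0.05):
    """Low-memory exponential average standing in for dark/thermal-noise tracking."""
    return (1 - alpha) * baseline + alpha * frame

def lit_pixel_mask(frame, baseline, delta=8.0):
    """Flag pixels that rise more than `delta` counts above the running baseline."""
    return frame > baseline + delta

def line_fraction(mask, min_pixels=3, elongation=4.0):
    """Fraction of lit-pixel groups whose shape is line-like (highly elongated)."""
    labels, n = ndimage.label(mask)
    lines = groups = 0
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(xs) < min_pixels:
            continue                      # ignore isolated noise hits
        groups += 1
        cov = np.cov(np.vstack([xs, ys])) # spatial covariance of the pixel group
        ev = np.sort(np.linalg.eigvalsh(cov))
        if ev[0] == 0 or ev[1] / max(ev[0], 1e-9) > elongation:
            lines += 1                    # strongly elongated groups count as tracks
    return lines / groups if groups else 0.0
```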
LAPR: An experimental aircraft pushbroom scanner
A three-band Linear Array Pushbroom Radiometer (LAPR) was built and flown on an experimental basis by NASA at the Goddard Space Flight Center. The functional characteristics of the instrument and the methods used to preprocess the data, including radiometric correction, are described. The radiometric sensitivity of the instrument was tested and compared to that of the Thematic Mapper and the Multispectral Scanner. The radiometric correction procedure was evaluated quantitatively, using laboratory testing, and qualitatively, via visual examination of the LAPR test flight imagery. Although effective radiometric correction could not yet be demonstrated via laboratory testing, radiometric distortion did not preclude the visual interpretation or parallelepiped classification of the test imagery.
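The abstract does not give the correction equations. Below is a minimal Python sketch of a generic per-detector gain/offset normalization of the kind used to destripe linear-array imagery, assuming dark-level and uniform-source (flat-field) measurements are available for each detector; the array names and shapes are assumptions, not the LAPR preprocessing code.

```python
import numpy as np

def radiometric_correction(raw, dark, flat):
    """Per-detector gain/offset correction for a pushbroom image.

    raw  : (lines, detectors, bands) digital counts from the linear array
    dark : (detectors, bands) dark-level offsets measured per detector
    flat : (detectors, bands) response to a uniform calibration source

    Each cross-track detector gets its own gain and offset, which is what
    removes the along-track striping typical of linear-array imagery.
    """
    response = np.maximum(flat - dark, 1e-6)             # per-detector response
    gain = flat.mean(axis=0, keepdims=True) / response   # normalize to array mean
    return (raw - dark[None, :, :]) * gain[None, :, :]
```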
The Universe is not a Computer
When we want to predict the future, we compute it from what we know about the
present. Specifically, we take a mathematical representation of observed
reality, plug it into some dynamical equations, and then map the time-evolved
result back to real-world predictions. But while this computational process can
tell us what we want to know, we have taken this procedure too literally,
implicitly assuming that the universe must compute itself in the same manner.
Physical theories that do not follow this computational framework are deemed
illogical, right from the start. But this anthropocentric assumption has
steered our physical models into an impossible corner, primarily because of
quantum phenomena. Meanwhile, we have not been exploring other models in which
the universe is not so limited. In fact, some of these alternate models already
have a well-established importance, but are thought to be mathematical tricks
without physical significance. This essay argues that only by dropping our
assumption that the universe is a computer can we fully develop such models,
explain quantum phenomena, and understand the workings of our universe. (This
essay was awarded third prize in the 2012 FQXi essay contest; a new afterword
compares and contrasts this essay with Robert Spekkens' first prize entry.) Comment: 10 pages with new afterword; matches published version.
Determination of minimum concentrations of environmental water capable of supporting life Semiannual report, 1 Nov. 1968 - 30 Apr. 1969
Systematic analysis of the exchange of tritiated water between the mite and the surrounding vapor.
Application of digital terrain data to quantify and reduce the topographic effect on LANDSAT data
Integration of LANDSAT multispectral scanner (MSS) data with 30 m U.S. Geological Survey (USGS) digital terrain data was undertaken to quantify and reduce the topographic effect on imagery of a forested mountain ridge test site in central Pennsylvania. High Sun angle imagery revealed variation of as much as 21 pixel values in data for slopes of different angles and aspects with uniform surface cover. Large topographic effects were apparent in MSS 4 and 5, due to a combination of high absorption by the forest cover and the MSS quantization. Four methods for reducing the topographic effect were compared. Band ratioing of MSS 6/5 and MSS 7/5 did not eliminate the topographic effect because of the lack of variation in MSS 4 and 5 radiances. The three radiance models examined to reduce the topographic effect required integration of the digital terrain data. Two Lambertian models increased the variation in the LANDSAT radiances. The non-Lambertian model considerably reduced the topographic effect in the LANDSAT data (by 86 percent). The study demonstrates that high quality digital terrain data, as provided by the USGS digital elevation model data, can be used to enhance the utility of multispectral satellite data.
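The abstract does not state the exact form of the radiance models. As an illustration, the Python sketch below gives the standard cosine (Lambertian) correction and a Minnaert-style non-Lambertian correction, which is one common form of such a model; the angle conventions, the Minnaert constant k, and the numerical guards are assumptions rather than the study's implementation.

```python
import numpy as np

def incidence_cosine(slope, aspect, sun_zenith, sun_azimuth):
    """cos(i): angle between the sun direction and the surface normal (radians)."""
    return (np.cos(sun_zenith) * np.cos(slope)
            + np.sin(sun_zenith) * np.sin(slope) * np.cos(sun_azimuth - aspect))

def lambertian_correction(radiance, cos_i, sun_zenith):
    """Cosine (Lambertian) correction: assumes reflectance equal in all directions."""
    return radiance * np.cos(sun_zenith) / np.maximum(cos_i, 1e-3)

def minnaert_correction(radiance, cos_i, slope, k):
    """Minnaert-style non-Lambertian correction with an empirically fitted constant k."""
    cos_e = np.cos(slope)                       # exitance angle for a nadir view
    return radiance * cos_e / np.maximum((cos_i * cos_e) ** k, 1e-3)
```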
Effects of white matter microstructure on phase and susceptibility maps
Purpose: To investigate the effects on quantitative susceptibility mapping (QSM) and susceptibility tensor imaging (STI) of the frequency variation produced by the microstructure of white matter (WM).
Methods: The frequency offsets in a WM tissue sample that are not explained by the effect of bulk isotropic or anisotropic magnetic susceptibility, but rather result from the local microstructure, were characterized for the first time. QSM and STI were then applied to simulated frequency maps that were calculated using a digitized whole-brain WM model formed from anatomical and diffusion tensor imaging data acquired from a volunteer. In this model, the magnitudes of the frequency contributions due to anisotropy and microstructure were derived from the results of the tissue experiments.
Results: The simulations suggest that the frequency contribution of microstructure is much larger than that due to bulk effects of anisotropic magnetic susceptibility. In QSM, the microstructure contribution introduced artificial WM heterogeneity. For the STI processing, the microstructure contribution caused the susceptibility anisotropy to be significantly overestimated.
Conclusion: Microstructure-related phase offsets in WM yield artifacts in the calculated susceptibility maps. If susceptibility mapping is to become a robust MRI technique, further research should be carried out to reduce the confounding effects of microstructure-related frequency contributions.
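The simulations described above rely on a forward model that maps susceptibility to frequency. As a point of reference, the sketch below implements only the standard bulk isotropic dipole-convolution forward model commonly used in QSM work; the anisotropy and microstructure contributions that are the subject of the paper are not included, B0 is assumed to lie along array axis 0, and chi is assumed to be given in ppm.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0)):
    """k-space dipole kernel D(k) = 1/3 - kz^2/|k|^2 for B0 along axis 0 (z)."""
    kz, ky, kx = [np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)]
    KZ, KY, KX = np.meshgrid(kz, ky, kx, indexing='ij')
    k2 = KX**2 + KY**2 + KZ**2
    return 1.0 / 3.0 - np.divide(KZ**2, k2, out=np.zeros_like(k2), where=k2 > 0)

def susceptibility_to_frequency(chi, f_larmor, voxel_size=(1.0, 1.0, 1.0)):
    """Forward model: frequency map (Hz) from an isotropic susceptibility map (ppm)."""
    D = dipole_kernel(chi.shape, voxel_size)
    return np.real(np.fft.ifftn(D * np.fft.fftn(chi))) * f_larmor * 1e-6
```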