Rank-based camera spectral sensitivity estimation
In order to accurately predict a digital camera's response to spectral stimuli, the spectral sensitivity functions of its sensor need to be known. These functions can be determined by direct measurement in the lab (a difficult and lengthy procedure) or through simple statistical inference. Statistical inference methods are based on the observation that when a camera responds linearly to spectral stimuli, the device spectral sensitivities are linearly related to the camera RGB response values, and so can be found through regression. However, for rendered images, such as the JPEG images taken by a mobile phone, this assumption of linearity is violated. Even small departures from linearity can negatively impact the accuracy of the recovered spectral sensitivities when a regression method is used. In our work, we develop a novel camera spectral sensitivity estimation technique that can recover the linear device spectral sensitivities from linear images and the effective linear sensitivities from rendered images. According to our method, the rank order of a pair of responses imposes a constraint on the shape of the underlying spectral sensitivity curve of the sensor. Technically, each ranked pair splits the space in which the underlying sensor might lie into two parts (a feasible region and an infeasible region). By intersecting the feasible regions from all ranked pairs, we can find the feasible region of sensor space. Experiments demonstrate that, on linear data, using rank orders delivers estimation accuracy equal to the prior art. However, the rank-based method delivers a step change in estimation performance when the data are not linear and, for the first time, allows estimation of the effective sensitivities of devices that may not even have a raw mode. Experiments validate our method.
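The core geometric idea in the abstract can be sketched in a few lines. Under a linear camera model, a response is the inner product of the sensor sensitivity s with a spectral stimulus c, so an observed rank order response_i > response_j implies s . (c_i - c_j) > 0, a half-space constraint on sensor space. The sketch below, with illustrative names and synthetic data (not the authors' code), checks whether a candidate sensitivity curve lies inside the intersection of those half-spaces:

```python
import numpy as np

def satisfies_rank_constraints(sensor, stimuli, responses):
    """Check whether a candidate sensitivity curve lies in the feasible
    region defined by the observed rank order of the responses."""
    n = len(responses)
    for i in range(n):
        for j in range(n):
            if responses[i] > responses[j]:
                # Half-space constraint from the ranked pair (i, j):
                # the sensor must satisfy s . (c_i - c_j) > 0.
                if sensor @ (stimuli[i] - stimuli[j]) <= 0:
                    return False
    return True

# Synthetic example (assumed setup, for illustration only).
rng = np.random.default_rng(0)
true_sensor = np.abs(rng.normal(size=31))    # 31 spectral bands
stimuli = np.abs(rng.normal(size=(20, 31)))  # 20 spectral stimuli
responses = stimuli @ true_sensor            # linear camera model

print(satisfies_rank_constraints(true_sensor, stimuli, responses))  # True
```

Because rank order is preserved under any monotonically increasing rendering nonlinearity, the same constraints hold for rendered (e.g. JPEG) responses, which is why the approach extends to devices without a raw mode.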
DistancePPG: Robust non-contact vital signs monitoring using a camera
Vital signs such as pulse rate and breathing rate are currently measured using contact probes. However, non-contact methods for measuring vital signs are desirable both in hospital settings (e.g. in the NICU) and for ubiquitous in-situ health tracking (e.g. on mobile phones and computers with webcams). Recently, camera-based non-contact vital sign monitoring has been shown to be feasible. However, camera-based vital sign monitoring is challenging for people with darker skin tones, under low lighting conditions, and/or during movement of an individual in front of the camera. In this paper, we propose distancePPG, a new camera-based vital sign estimation algorithm which addresses these challenges. DistancePPG proposes a new method of combining skin-color change signals from different tracked regions of the face using a weighted average, where the weights depend on the blood perfusion and incident light intensity in each region, to improve the signal-to-noise ratio (SNR) of the camera-based estimate. One of our key contributions is a new automatic method for determining the weights based only on the video recording of the subject. The gains in SNR of camera-based PPG estimated using distancePPG translate into a reduction of the error in vital sign estimation, and thus expand the scope of camera-based vital sign monitoring to potentially challenging scenarios. Further, a dataset will be released, comprising synchronized video recordings of the face and pulse-oximeter-based ground truth recordings from the earlobe for people with different skin tones, under different lighting conditions, and for various motion scenarios.
Comment: 24 pages, 11 figures
Quantitative estimation of plant characteristics using spectral measurement: A survey of the literature
There are no author-identified significant results in this report
Plant health sensing
If plants are to be used as a food source for long term space missions, they must be grown in a stable environment where the health of the crops is continuously monitored. The sensor(s) to be used should detect any diseases or health problems before irreversible damage occurs. The method of analysis must be nondestructive and provide instantaneous information on the condition of the crop. In addition, the sensor(s) must be able to function in microgravity. This first semester, the plant health and disease sensing group concentrated on researching and consulting experts in many fields in an attempt to find reliable plant health indicators. Once several indicators were found, technologies that could detect them were investigated. Eventually, the three methods chosen to be implemented next semester were stimulus response monitoring, video image processing, and chlorophyll level detection. Most of the other technologies investigated this semester are discussed here. They were rejected for various reasons but are included in the report because NASA may wish to consider pursuing them in the future.
Multispectral imaging of Mars from a lander
Experimental evaluation of atmospheric effects on radiometric measurements using the EREP of Skylab
There are no author-identified significant results in this report
Digital Color Imaging
This paper surveys current technology and research in the area of digital color imaging. In order to establish the background and lay down terminology, fundamental concepts of color perception and measurement are first presented using vector-space notation and terminology. Present-day color recording and reproduction systems are reviewed along with the common mathematical models used for representing these devices. Algorithms for processing color images for display and communication are surveyed, and a forecast of research trends is attempted. An extensive bibliography is provided.
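The vector-space view the survey builds on can be shown in a minimal sketch: sampled at n wavelength bands, a linear recording device's responses are inner products of its sensitivity curves with the incident spectrum. The band count and the synthetic spectra below are assumptions for illustration, not data from the paper:

```python
import numpy as np

n_bands = 31                               # e.g. 400-700 nm in 10 nm steps
rng = np.random.default_rng(2)

S = np.abs(rng.normal(size=(n_bands, 3)))  # columns: R, G, B sensitivities
E = np.abs(rng.normal(size=n_bands))       # radiance spectrum at the sensor

rgb = S.T @ E                              # 3-vector of device responses
print(rgb.shape)                           # (3,)
```

In this formulation the device is a linear map from the n-dimensional spectral space to a 3-dimensional response space, which is what makes linear-algebraic tools (projections, metamerism analysis, least-squares characterization) directly applicable.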
Study and simulation results for video landmark acquisition and tracking technology (Vilat-2)
The results of several investigations and hardware developments which supported new technology for Earth feature recognition and classification are described. Data analysis techniques and procedures were developed for processing the Feature Identification and Location Experiment (FILE) data. This experiment was flown in November 1981, on the second Shuttle flight and a second instrument, designed for aircraft flights, was flown over the United States in 1981. Ground tests were performed to provide the basis for designing a more advanced version (four spectral bands) of the FILE which would be capable of classifying clouds and snow (and possibly ice) as distinct features, in addition to the features classified in the Shuttle experiment (two spectral bands). The Shuttle instrument classifies water, bare land, vegetation, and clouds/snow/ice (grouped)