Autonomous learning for face recognition in the wild via ambient wireless cues
Facial recognition is a key enabling component for emerging Internet of Things (IoT) services such as smart homes or responsive offices. Through the use of deep neural networks, facial recognition has achieved excellent performance. However, this is only possible when trained with hundreds of images of each user in different viewing and lighting conditions. Clearly, this level of effort in enrolment and labelling is impossible for widespread deployment and adoption. Inspired by the fact that most people carry smart wireless devices with them, e.g. smartphones, we propose to use this wireless identifier as a supervisory label. This allows us to curate a dataset of facial images that are unique to a certain domain, e.g. a set of people in a particular office. This custom corpus can then be used to fine-tune existing pre-trained models, e.g. FaceNet. However, due to the vagaries of wireless propagation in buildings, the supervisory labels are noisy and weak. We propose a novel technique, AutoTune, which learns and refines the association between a face and a wireless identifier over time by increasing the inter-cluster separation and minimizing the intra-cluster distance. Through extensive experiments with multiple users across two sites, we demonstrate the ability of AutoTune to design an environment-specific, continually evolving facial recognition system with no user effort.
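A minimal sketch of the kind of label-refinement loop the abstract describes, assuming face embeddings (e.g., from FaceNet) and noisy wireless-ID labels are available as NumPy arrays. The function name and the simple alternate-update scheme are illustrative, not AutoTune's actual objective:

```python
import numpy as np

def refine_labels(embeddings, noisy_labels, n_iters=10):
    """Iteratively re-associate face embeddings with wireless IDs.

    Illustrative sketch only: alternates nearest-centroid reassignment
    (which shrinks intra-cluster distance and pushes mislabelled faces
    toward better-matching identifiers) with centroid re-estimation.
    """
    ids = np.unique(noisy_labels)
    # Seed one centroid per wireless identifier from the noisy labels.
    centroids = {i: embeddings[noisy_labels == i].mean(axis=0) for i in ids}
    labels = noisy_labels.copy()
    for _ in range(n_iters):
        # Reassign each face to the nearest identifier's centroid.
        cmat = np.stack([centroids[i] for i in ids])
        dists = np.linalg.norm(embeddings[:, None, :] - cmat[None, :, :], axis=2)
        labels = ids[np.argmin(dists, axis=1)]
        # Recompute centroids from the refined assignments.
        for i in ids:
            members = embeddings[labels == i]
            if len(members):
                centroids[i] = members.mean(axis=0)
    return labels
```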
Landsat Data Continuity Mission - Launch Fever
The year 2013 will be an exciting period for those who study the Earth's land surface from space, particularly those who observe and characterize land cover, land use, and the change of cover and use over time. Two new satellite observatories will be launched that year, enhancing capabilities for observing the global land surface. The United States plans to launch the Landsat Data Continuity Mission (LDCM) in January. That event will be followed later in the year by the European Space Agency (ESA) launch of the first Sentinel-2 satellite. Considered together, the two satellites will increase the frequency of opportunities for viewing the land surface at a scale where human impact and influence can be differentiated from natural change. Data from the two satellites will provide images in similar spectral bands and at comparable spatial resolutions, with rigorous attention to calibration that will facilitate cross-comparisons. This presentation will provide an overview of the LDCM satellite system and report its readiness for the January launch.
Landsat-8 Thermal Infrared Sensor (TIRS) Vicarious Radiometric Calibration
Launched in February 2013, Landsat-8 carries the Thermal Infrared Sensor (TIRS), a two-band thermal pushbroom imager, to maintain the thermal imaging capability of the Landsat program. The TIRS bands are centered at roughly 10.9 and 12 micrometers (Bands 10 and 11, respectively). They have 100 m spatial resolution and image coincidently with the Operational Land Imager (OLI), also on board Landsat-8. The TIRS instrument has an internal calibration system consisting of a variable-temperature blackbody and a special viewport through which it can see deep space; a two-point calibration can be performed twice per orbit. Immediately after launch, a rigorous vicarious calibration program was started to validate the absolute calibration of the system. The two vicarious calibration teams, NASA/Jet Propulsion Laboratory (JPL) and the Rochester Institute of Technology (RIT), both make use of buoys deployed on large water bodies as the primary monitoring technique. RIT also took advantage of a cross-calibration opportunity soon after launch, when Landsat-8 and Landsat-7 were imaging the same targets within a few minutes of each other, to validate the absolute calibration. Terra MODIS is also being used for regular monitoring of the TIRS absolute calibration. The initial buoy results showed a large error in both bands, 0.29 and 0.51 W/(m^2 sr µm), or -2.1 K and -4.4 K at 300 K, in Bands 10 and 11 respectively, with the TIRS data too hot. A calibration update was recommended for both bands to correct for the bias error and was implemented on 3 February 2014 in the USGS/EROS processing system, but the residual variability is still larger than desired for both bands (0.12 and 0.2 W/(m^2 sr µm), or 0.87 and 1.67 K at 300 K). Additional work has uncovered the source of the calibration error: out-of-field stray light. While analysis continues to characterize the stray light contribution, the vicarious calibration work proceeds. The additional data have not changed the statistical assessment but indicate that the correction (particularly in Band 11) is probably valid only for a subset of data. While the stray light effect is small enough in Band 10 to keep the data useful across a wide array of applications, the effect in Band 11 is larger, and the vicarious results suggest that Band 11 data should not be used where absolute calibration is required.
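The quoted radiance-to-temperature equivalences can be reproduced with a single-wavelength Planck inversion. A hedged sketch follows: the real band-effective calculation integrates over each band's spectral response, so treating the band centers (10.9 and 12 µm) as single wavelengths only approximates it, but it recovers the scale of the quoted numbers:

```python
import numpy as np

H, C, K = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def planck_radiance(wl_um, temp_k):
    """Blackbody spectral radiance in W/(m^2 sr um) at wavelength wl_um."""
    wl = wl_um * 1e-6
    return (2 * H * C**2 / wl**5) / (np.exp(H * C / (wl * K * temp_k)) - 1) / 1e6

def brightness_temp(wl_um, rad):
    """Invert Planck's law: the temperature whose radiance equals rad."""
    wl = wl_um * 1e-6
    return (H * C / (wl * K)) / np.log(1 + 2 * H * C**2 / (wl**5 * rad * 1e6))

l300 = planck_radiance(10.9, 300.0)                 # ~9.6 W/(m^2 sr um)
print(round(brightness_temp(10.9, l300 - 0.29) - 300.0, 1))  # ~ -2.1 K (Band 10)
l300 = planck_radiance(12.0, 300.0)                 # ~8.9 W/(m^2 sr um)
print(round(brightness_temp(12.0, l300 - 0.51) - 300.0, 1))  # ~ -4.4 K (Band 11)
```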
Twenty-Five Years of Landsat Thermal Band Calibration
Landsat-7 Enhanced Thematic Mapper Plus (ETM+), launched in April 1999, and Landsat-5 Thematic Mapper (TM), launched in 1984, each have a single thermal band. Both instruments' thermal band calibrations have been updated previously: ETM+ in 2001 for a pre-launch calibration error, and TM in 2007 for data acquired since the current era of vicarious calibration began (1999). Vicarious calibration teams at the Rochester Institute of Technology (RIT) and NASA/Jet Propulsion Laboratory (JPL) have been working to validate the instrument calibration since 1999. Recent developments in their techniques and sites have expanded the temperature and temporal range of the validation. The new data indicate that the calibration of both instruments had errors: the ETM+ calibration contained a gain error of 5.8% since launch, and the TM calibration contained a gain error of 5% plus an additional offset error between 1997 and 1999. Both instruments required adjustments to their thermal calibration coefficients to correct for the errors. The new coefficients were calculated and added to the Landsat operational processing system in early 2010. With the corrections, both instruments are calibrated to within ±0.7 K.
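As an illustration of what such a gain-error update means for radiance products, a hedged sketch: assuming the reported radiance was scaled high by the quoted fractional gain error, the correction divides it back out. The actual USGS coefficient update is applied inside the operational processing system, and its exact form is not given in the abstract:

```python
def correct_radiance(l_reported, gain_error, offset_error=0.0):
    """Undo a fractional gain error and additive offset in reported
    at-sensor radiance, e.g. gain_error=0.058 for the quoted ETM+ case.
    Illustrative form only -- the published correction may differ."""
    return (l_reported - offset_error) / (1.0 + gain_error)
```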
Thermal Infrared Radiometric Calibration of the Entire Landsat 4, 5, and 7 Archive (1982-2010)
Landsat's continuing record of the thermal state of the Earth's surface represents the only long-term (1982 to the present) global record at spatial scales appropriate for human-scale studies (i.e., tens of meters). Temperature drives many of the physical and biological processes that impact the global and local environment. As our knowledge of, and interest in, the role of temperature in these processes have grown, the value of Landsat data for monitoring trends and processes has also grown. The value of the Landsat thermal data archive will continue to grow as we develop more effective ways to study the long-term processes and trends affecting the planet. However, in order to take proper advantage of the thermal data, we need to be able to convert the data to surface temperatures. A critical step in this process is to have the entire archive completely and consistently calibrated into absolute radiance so that it can be atmospherically compensated to surface-leaving radiance and then converted to surface radiometric temperature. This paper addresses the methods and procedures that have been used to perform the radiometric calibration of the earliest sizable thermal data set in the archive (Landsat 4 data). The completion of this effort, along with the updated calibration of the earlier (1985-1999) Landsat 5 data, also reported here, concludes a comprehensive calibration of the Landsat thermal archive from 1982 to the present.
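The processing chain the abstract describes (absolute radiance, then atmospheric compensation, then surface temperature) can be sketched step by step. The linear calibration form and two-constant temperature inversion are standard for Landsat thermal bands, but the gains, biases, atmospheric terms, and band constants k1/k2 below are placeholders that in practice come from scene metadata and a radiative transfer model, and unit emissivity is assumed:

```python
import math

def dn_to_radiance(dn, gain, bias):
    # Linear radiometric calibration to at-sensor spectral radiance.
    return gain * dn + bias

def surface_leaving_radiance(l_sensor, transmission, upwelling):
    # Atmospheric compensation; transmission and upwelling path radiance
    # would come from a radiative transfer model for the scene.
    return (l_sensor - upwelling) / transmission

def radiance_to_temperature(l_surface, k1, k2):
    # Band-specific thermal constants k1, k2 come from the metadata;
    # emissivity is taken as 1 in this sketch.
    return k2 / math.log(k1 / l_surface + 1.0)
```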
The forward market for foreign exchange
The findings of this study support others which examine financial markets. The time series distributions are not normal; they are stable Paretian. The forward market for foreign exchange is not random, but neither is it inefficient under the weak-form model.

The study also shows that there were patterns in the time series of rates of return on forward contracts. These were low-level autocorrelations and were not stable through time; the patterns were short-lived and changed throughout the period. Using several testing techniques, including several nonparametric tests and Box-Jenkins time series analysis, the hypothesis that the forward market is inefficient was rejected. These tests show that the time series of rates of return on forward contracts, while not a random walk, are not inefficient at the weak-form level.

Intertemporal speculation, the act of speculating between forward maturities, was defined in the paper. Profits from this technique were found to be significantly different from zero, but less than the return on short-term U.S. government securities. Therefore, even though the technique is available to forward market participants, the return is not commensurate with the level of risk incurred.

The study employed daily spot and forward exchange rates from March 1973 to June 1976 for the following currencies: U.S. to U.K.; U.S. to Swiss Franc; U.S. to German Mark; U.S. to Canadian Dollar; U.K. to German Mark; and U.K. to Canadian Dollar. The rate of return on forward contracts was defined as the difference between the forward rate at time t and the spot rate at maturation of that forward contract, expressed as a percentage of the spot rate. The distribution of these rates of return was shown to approximate the stable Paretian distribution more closely than the normal distribution. This was true for the spot rate and forward rate distributions as well. Since stable Paretian distributions have no defined variance, an alternative measure of dispersion should be established to replace the sample standard deviation or variance.

This study is devoted to an examination of the efficiency and characteristics of the forward market for foreign exchange. Here, efficiency implies that current market prices or rates incorporate any information embodied in the pattern of past prices or rates. The characteristics of the forward market examined include the distribution of rates of return, as well as the relationship of forward exchange rates of different maturities.

The percentage premia of different forward rates, relative to the spot rate, were examined in the paper. Relative premia were found to decline as time to maturity increases; furthermore, the premia decrease at a decreasing rate, so the slope between the 30- and 60-day premia is steeper than the slope between the 60- and 90-day premia. For firms using the forward market to cover exchange rate risk, the implication is that the cost of forward cover decreases as forward maturity increases. Thus a reward exists for good forward planning.
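A small sketch of the study's return definition, assuming F_t is the forward rate agreed at time t and S_T is the spot rate observed when that contract matures (the names and example values are ours, not the study's data):

```python
def forward_return(f_t, s_maturity):
    """Rate of return on a forward contract as defined in the study:
    (F_t - S_T) / S_T, the forward rate minus the spot rate at maturity,
    expressed as a fraction of that spot rate."""
    return (f_t - s_maturity) / s_maturity

# e.g., a forward struck at 1.70 against a spot of 1.65 at maturity:
print(forward_return(1.70, 1.65))  # ~0.0303, i.e., a 3.03% return
```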
Landsat-7 ETM+ Radiometric Calibration Status
Now in its 17th year of operation, the Enhanced Thematic Mapper Plus (ETM+), on board the Landsat-7 satellite, continues to systematically acquire imagery of the Earth, adding to the 40+ year archive of Landsat data. Characterization of the ETM+ on-orbit radiometric performance has been ongoing since its launch in 1999. The radiometric calibration of the reflective bands is still monitored using on-board calibration devices, though the Pseudo-Invariant Calibration Sites (PICS) method has proven to be an effective tool as well. The calibration gains were updated in April 2013, based primarily on PICS results, which corrected for degradation of as much as -0.2%/year in the worst-case bands. A new comparison with the SADE database of PICS results indicates no additional degradation in the updated calibration. PICS data are still being tracked, though the recent trends are not well understood. The thermal band calibration was last updated in October 2013, based on a continuing calibration effort by NASA/Jet Propulsion Laboratory and the Rochester Institute of Technology. The update accounted for a 0.31 W/(m^2 sr µm) bias error. The updated lifetime trend is now stable to within ±0.4 K.
Analysis of Fcγ receptor haplotypes in rheumatoid arthritis: FCGR3A remains a major susceptibility gene at this locus, with an additional contribution from FCGR3B
The Fcγ receptors play important roles in the initiation and regulation of many immunological and inflammatory processes, and genetic variants (FCGR) have been associated with numerous autoimmune and infectious diseases. The data in rheumatoid arthritis (RA) are conflicting, and we previously demonstrated an association between FCGR3A and RA. In view of the close molecular proximity of FCGR2A, FCGR2B and FCGR3B, additional polymorphisms within these genes and FCGR haplotypes were examined to refine the extent of association with RA. Biallelic polymorphisms in FCGR2A, FCGR2B and FCGR3B were examined for association with RA in two well-characterized UK Caucasian and North Indian/Pakistani cohorts, in which FCGR3A genotyping had previously been undertaken. Haplotype frequencies and linkage disequilibrium were estimated across the FCGR locus, and a model-free analysis was performed to determine association with RA. This was followed by regression analysis, allowing for phase uncertainty, to identify the particular haplotype(s) that influence disease risk. Our results reveal that FCGR2A, FCGR2B and FCGR3B were not associated with RA. The haplotype with the strongest association with RA susceptibility was the FCGR3A–FCGR3B 158V-NA2 haplotype (odds ratio 3.18, 95% confidence interval 1.13–8.92 [P = 0.03] for homozygotes compared with all genotypes). The association was stronger in the presence of nodules (odds ratio 5.03, 95% confidence interval 1.44–17.56; P = 0.01). This haplotype was also more common in North Indian/Pakistani RA patients than in control individuals, but not significantly so. Logistic regression analyses suggested that FCGR3A remained the most significant gene at this locus. The increased association with an FCGR3A–FCGR3B haplotype suggests that other polymorphic variants within FCGR3A or FCGR3B, or in linkage disequilibrium with this haplotype, may additionally contribute to disease pathogenesis.
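For readers unfamiliar with the statistics quoted above, a generic sketch of how an odds ratio and its 95% confidence interval (Woolf's method) are computed from a 2x2 table. The function and its counts are placeholders, not the study's data or its exact estimation procedure, which used logistic regression allowing for phase uncertainty:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```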
Methodological Standardization for the Pre-Clinical Evaluation of Renal Sympathetic Denervation
Transcatheter ablation of the renal autonomic nerves is a viable option for the treatment of resistant arterial hypertension; however, structured pre-clinical evaluation with standardization of analytical procedures remains a clear gap in this field. Here we discuss the topics relevant to the pre-clinical model for the evaluation of renal denervation (RDN) devices and report methodologies and criteria toward standardization of the safety and efficacy assessment, including histopathological evaluations of the renal artery, periarterial nerves, and associated periadventitial tissues. The pre-clinical swine renal artery model can be used effectively to assess both the safety and efficacy of RDN technologies. Assessment of the efficacy of RDN modalities focuses primarily on determining the depth of penetration of treatment-related injury (e.g., necrosis) into the periarterial tissues, its relationship (i.e., location and distance) to and effect on the associated renal nerves, and the correlation of these findings with proxy biomarkers, including renal norepinephrine concentrations and nerve-specific immunohistochemical stains (e.g., tyrosine hydroxylase). The safety evaluation of RDN technologies involves assessing adverse effects both on tissues local to the site of treatment (i.e., the arterial wall) and on tissues at a distance (e.g., soft tissue, veins, arterial branches, skeletal muscle, adrenal gland, ureters). Increasing experience will help to create a standardized means of examining all arterial beds subjected to ablative energy and, in doing so, enable optimized development and assessment of these emerging technologies.
Bryophyte and lichen biomass and nitrogen fixation in a high elevation cloud forest in Cerro de La Muerte, Costa Rica
Cloud forests have been found to lose more nitrogen in stream discharge than they gain from atmospheric deposition. They also support a large diversity and biomass of tree epiphytes, predominantly composed of cryptogams. Since cryptogam epiphytes harbor nitrogen-fixing cyanobacteria, they may help make up for the nitrogen lost from these ecosystems. We assessed cryptogam biomass on the ground, boles, and branches in Quercus costaricensis-dominated stands near the tree line in the Cordillera de Talamanca, Costa Rica. Nitrogen fixation was assayed using 15N2 uptake. Total cryptogam biomass was 2,977 kg ha⁻¹, with 67% found on the lower branches. Bryophytes and chlorolichens made up 53% and 44%, respectively, of the biomass. Half of the bryophyte mass was composed of the liverwort Plagiochila heterophylla, and 66% of the chlorolichen mass of Lobariella pallida. There were no significant differences in nitrogen fixation rates between the cryptogam species, with a mean rate of 5.04 µg N g⁻¹ day⁻¹ during the predominantly wet conditions in the forest. The overall nitrogen input from fixation was 6.1 kg N ha⁻¹ year⁻¹, of which 78% came from bryophytes, 18% from chlorolichens, and 4% from cyanolichens. Only 2.0% of the fixation occurred in cryptogams on the ground, whereas 67%, 24%, and 7% occurred on the lower branches, boles, and upper branches, respectively. These results show that tree epiphytes constitute a significant source of nitrogen for these forests, owing to the trees' large surface area, and can make up for the nitrogen lost from these ecosystems.
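A back-of-envelope check on the scale of the stand-level estimate, multiplying the reported mean fixation rate by the total biomass. The study's 6.1 kg figure weights each cryptogam group by its own rate, so this crude product comes out somewhat lower:

```python
biomass_g_per_ha = 2977 * 1000       # 2,977 kg ha^-1 of cryptogams, in grams
rate_ug_n_per_g_day = 5.04           # mean fixation rate, ug N g^-1 day^-1
annual_kg_n_per_ha = biomass_g_per_ha * rate_ug_n_per_g_day * 365 / 1e9
print(round(annual_kg_n_per_ha, 1))  # ~5.5 kg N ha^-1 yr^-1 vs. reported 6.1
```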
