
    Biochar for gas sensors devices

    In recent years, biochar has found applications in many fields [1]. It has been studied as a substitute for more expensive carbon materials such as carbon nanotubes and graphene. The evident advantage of biochar is its low production cost, being an environmentally friendly source of material with a very high carbon content. On the other hand, the main application of this material today is as a soil amendment in agriculture [2]. Starting from the peculiarities of biochar, it is possible to modify its features; for instance, after high-temperature treatments its surface area can increase sharply.

    Waste Coffee Ground Biochar: A Material for Humidity Sensors

    Worldwide consumption of coffee exceeds 11 billion tons/year. Used coffee grounds generally end up in landfill. However, the unique structural properties of their porous surface make coffee grounds well suited to the adsorption of gaseous molecules. In the present work, we demonstrate the use of coffee grounds as a potential and cheap source of biochar carbon. The produced coffee ground biochar (CGB) was investigated as a sensing material for developing humidity sensors. CGB was fully characterized using laser granulometry, X-ray diffraction (XRD), Raman spectroscopy, field emission scanning electron microscopy (FESEM), X-ray photoelectron spectroscopy (XPS), thermogravimetric analysis (TGA) and the Brunauer-Emmett-Teller (BET) technique in order to acquire a complete understanding of its structural and surface properties and composition. Subsequently, humidity sensors were screen printed using an ink containing CGB, with polyvinyl butyral (PVB) acting as a temporary binder and ethylene glycol monobutyl ether (Emflow) as an organic vehicle, so that the proper rheological characteristics were achieved. The screen-printed films were then heated at 300 °C in air. Humidity tests were performed under a flow of 1.7 L/min over the 0–100% relative humidity (RH) range at room temperature. The initial impedance of the film was 25.2 MΩ, which changed to 12.3 MΩ under 98% RH exposure. A sensor response was observed above 20% RH. Both the response and recovery times were reasonably fast (less than 2 min).
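    The abstract reports impedance values of 25.2 MΩ (dry) and 12.3 MΩ (98% RH) but does not state which response metric was used; a minimal sketch, assuming the common relative-impedance-change convention S = (Z0 − Z)/Z0 for impedimetric humidity sensors:

    ```python
    def sensor_response(z_dry_ohm, z_humid_ohm):
        """Relative impedance change, one common figure of merit for
        impedimetric humidity sensors (an assumed convention, not
        necessarily the one used in the paper)."""
        return (z_dry_ohm - z_humid_ohm) / z_dry_ohm

    # Values reported in the abstract: 25.2 MOhm dry, 12.3 MOhm at 98% RH.
    response = sensor_response(25.2e6, 12.3e6)
    print(f"Response at 98% RH: {response:.1%}")  # roughly a 51% impedance drop
    ```

    Under this convention the reported figures correspond to an impedance drop of about half the dry-film value.
    
    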

    3D Gaze Tracking and Automatic Gaze Coding from RGB-D Cameras

    Gaze is recognised as one of the most important cues for analysing a person's cognitive behaviors, such as the attention displayed towards objects or people, their interactions, and functionality and causality patterns. In this short paper, we present our investigations towards the development of 3D gaze sensing solutions based on consumer RGB-D sensors, including their use for the inference of visual attention in natural dyadic interactions, and the resources we have made or will make available to the community.

    Geometric Generative Gaze Estimation (G3E) for Remote RGB-D Cameras

    We propose a head-pose-invariant gaze estimation model for distant RGB-D cameras. It relies on a geometric understanding of the 3D gaze action and on the generation of eye images. By introducing a semantic segmentation of the eye region within a generative process, the model (i) avoids the critical feature tracking of geometrical approaches, which requires high-resolution images; and (ii) decouples the person-dependent geometry from the ambient conditions, allowing adaptation to different conditions without retraining. Priors in the generative framework are adequate for training from few samples. In addition, the model is capable of gaze extrapolation, allowing for less restrictive training schemes. Comparisons with state-of-the-art methods validate these properties, which make our method highly valuable for addressing many diverse tasks in sociology, HRI and HCI.

    Person Independent 3D Gaze Estimation From Remote RGB-D Cameras

    We address the problem of person-independent 3D gaze estimation using a remote, low-resolution RGB-D camera. The approach relies on a sparse technique to reconstruct normalized eye test images from a gaze appearance model (a set of eye image/gaze pairs) and infer their gaze accordingly. In this context, the paper makes three contributions: (i) unlike most previous approaches, we exploit the coupling (and constraints) between both eyes to infer their gaze jointly; (ii) we show that a generic gaze appearance model built from the aggregation of person-specific models can be used to handle unseen users and compensate for appearance variations across people, since a test user's eye appearance will be reconstructed from similar users within the generic model; and (iii) we propose an automatic model selection method that achieves comparable performance with a reduced computational load.

    Gaze Estimation From Multimodal Kinect Data

    This paper addresses the problem of free gaze estimation under unrestricted head motion. More precisely, unlike previous approaches that mainly focus on estimating gaze towards a small planar screen, we propose a method to estimate the gaze direction in 3D space. In this context, the paper makes the following contributions: (i) leveraging the Kinect device, we propose a multimodal method that relies on depth sensing to obtain robust and accurate head pose tracking, even under large head poses, and on the visual data to obtain the remaining eye-in-head gaze directional information from the eye image; (ii) a rectification scheme for the image that exploits the 3D mesh tracking, allowing a head-pose-free estimation of the eye-in-head gaze direction; (iii) a simple way of collecting ground truth data thanks to the Kinect device. Results on three users demonstrate the great potential of our approach.
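    The decomposition described above, combining a tracked head pose with an eye-in-head gaze direction, amounts to rotating the eye-frame gaze vector by the head rotation. A minimal sketch of that composition (with hypothetical frame conventions, not the paper's actual code):

    ```python
    import math

    def yaw_matrix(deg):
        """3x3 rotation about the vertical (y) axis, standing in for a
        tracked head pose (assumed convention: right-handed frame,
        positive yaw turns the head to the viewer's left)."""
        c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
        return [[c, 0.0, s],
                [0.0, 1.0, 0.0],
                [-s, 0.0, c]]

    def rotate(R, v):
        """Apply a 3x3 rotation matrix to a 3-vector."""
        return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

    # Eye-in-head gaze: looking straight ahead along the head's -z axis.
    eye_in_head = [0.0, 0.0, -1.0]

    # With the head turned 30 degrees, the world-frame gaze direction is
    # the head rotation applied to the eye-in-head direction.
    gaze_world = rotate(yaw_matrix(30.0), eye_in_head)
    print(gaze_world)
    ```

    In practice the full head pose is a rotation plus a translation estimated from the depth data, but the core idea is this composition of the two rotations.
    
    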

    SIOUX project: a simultaneous multiband camera for exoplanet atmospheres studies

    The exoplanet revolution is well underway. The last decade has seen order-of-magnitude increases in the number of known planets beyond the Solar System. Detailed characterization of exoplanetary atmospheres provides the best means of distinguishing the makeup of their outer layers, and the only hope of understanding the interplay between initial composition chemistry, temperature-pressure atmospheric profiles, dynamics and circulation. While pioneering work on the observational side has produced the first important detections of atmospheric molecules for the class of transiting exoplanets, important limitations remain due to the lack of systematic, repeated measurements with optimized instrumentation at both visible (VIS) and near-infrared (NIR) wavelengths. It is thus of fundamental importance to explore quantitatively possible avenues for improvement. In this paper we report initial results of a feasibility study for the prototype of a versatile multi-band imaging system for very high-precision differential photometry that exploits a choice of specifically selected narrow-band filters and novel ideas for the execution of simultaneous VIS and NIR measurements. Starting from the fundamental system requirements driven by the science case at hand, we describe a set of three opto-mechanical solutions for the instrument prototype: 1) a radial distribution of the optical flux using dichroic filters for the wavelength separation, with narrow-band filters or liquid crystal filters for the observations; 2) a tree distribution of the optical flux (implying two separate foci), with the same technique used for beam separation and filtering; 3) an exotic solution consisting of the study of a complete optical system (i.e. a brand new telescope) that exploits the chromatic errors of a reflecting surface to direct the different wavelengths to different foci.