
    CMOS Vision Sensors: Embedding Computer Vision at Imaging Front-Ends

    CMOS Image Sensors (CIS) are key for imaging technologies. These chips are conceived for capturing optical scenes focused on their surface, and for delivering electrical images, commonly in digital format. CISs may incorporate intelligence; however, their smartness basically concerns calibration, error correction and other similar tasks. The term CVISs (CMOS VIsion Sensors) defines another class of sensor front-ends which are aimed at performing vision tasks right at the focal plane. They have been running under names such as computational image sensors, vision sensors and silicon retinas, among others. CVISs and CISs are similar regarding physical implementation. However, while the inputs of both CISs and CVISs are images captured by photo-sensors placed at the focal plane, the primary outputs of CVISs may not be images but either image features or even decisions based on the spatial-temporal analysis of the scenes. We may hence state that CVISs are more "intelligent" than CISs, as they focus on information instead of on raw data. Actually, CVIS architectures capable of extracting and interpreting the information contained in images, and of prompting reaction commands thereof, have been explored for years in academia, and industrial applications are recently ramping up. One of the challenges for CVIS architects is incorporating computer vision concepts into the design flow. The endeavor is ambitious because the imaging and computer vision communities are rather disjoint groups talking different languages. The Cellular Nonlinear Network Universal Machine (CNNUM) paradigm, proposed by Profs. Chua and Roska, defined an adequate framework for such conciliation as it is particularly well suited for hardware-software co-design [1]-[4]. This paper overviews CVIS chips that were conceived and prototyped at the IMSE Vision Lab over the past twenty years. Some of them fit the CNNUM paradigm while others are tangential to it. All of them employ per-pixel mixed-signal processing circuitry to achieve sensor-processing concurrency in the quest for fast operation with a reduced energy budget.
    Junta de Andalucía TIC 2012-2338. Ministerio de Economía y Competitividad TEC 2015-66878-C3-1-R and TEC 2015-66878-C3-3-
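    To illustrate the kind of per-pixel, neighborhood-coupled processing the CNNUM paradigm describes, the following minimal Python sketch iterates the standard CNN state equation with an edge-extraction-style template pair. The template values, step size, and input image are illustrative assumptions and do not correspond to any of the IMSE Vision Lab chips.

```python
import numpy as np
from scipy.signal import convolve2d

def cnn_step(x, u, A, B, z, dt=0.1):
    """One Euler step of the standard CNN state equation
    x' = -x + A*y + B*u + z, with output y = 0.5*(|x+1| - |x-1|)."""
    y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))          # piecewise-linear output
    dx = -x + convolve2d(y, A, mode="same") \
             + convolve2d(u, B, mode="same") + z
    return x + dt * dx

# Illustrative edge-extraction-style templates (assumed values, not chip parameters)
A = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float)     # feedback template
B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], float)  # control template
z = -1.0                                                   # bias

u = np.random.rand(64, 64)     # input image normalized to [0, 1]
x = np.zeros_like(u)           # initial state
for _ in range(50):            # iterate until the network settles
    x = cnn_step(x, u, A, B, z)
edges = 0.5 * (np.abs(x + 1) - np.abs(x - 1))   # near-binary edge map
```

    The point of the sketch is only the structure: every pixel updates concurrently from its own state and a small neighborhood, which is what the per-pixel mixed-signal circuitry implements in hardware.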

    Modular Autonomous Biosampler (MAB) - A prototype system for distinct biological size-class sampling and preservation

    Presently, there is a community-wide deficiency in our ability to collect and preserve multiple size-class biological samples across a broad spectrum of oceanographic platforms (e.g., AUVs, ROVs, and Ocean Observing System Nodes). This is particularly surprising in comparison to the level of instrumentation that now exists for acquiring physical and geophysical data (e.g., side-scan sonar, current profiles, etc.) from these same platforms. We present our effort to develop a low-cost, high-sample-capacity, modular, autonomous biological sampling device (MAB). The unit is designed for filtering and preserving three distinct biological size-classes (including bacteria), and is deployable in any aquatic setting from a variety of platform modalities (AUV, ROV, or mooring).

    A multisensing setup for the intelligent tire monitoring

    The present paper offers the chance to experimentally measure, for the first time, internal tire strain with optical fiber sensors while the tire rolls under real operating conditions. The phenomena that take place during tire rolling are in fact far from completely understood: despite the several models available in the technical literature, there is no correspondingly large set of experimental observations. The paper includes a detailed description of the new multi-sensing technology for on-vehicle measurement that the research group has developed in the context of the OPTYRE project. The experimental apparatus is mainly based on optical fibers with embedded Fiber Bragg Grating sensors for the acquisition of circumferential tire strain. Other sensors are also installed on the tire, such as a phonic wheel, a uniaxial accelerometer, and a dynamic temperature sensor. The acquired information is used as input to dedicated algorithms that identify key parameters such as the dynamic contact patch, instantaneous dissipation, and instantaneous grip. The OPTYRE project contributes to the field of experimental grip monitoring of wheeled vehicles, with implications for both the passive and active safety characteristics of cars and motorbikes.
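    As a rough illustration of how circumferential strain can be recovered from a Fiber Bragg Grating reading, the sketch below applies the standard linear Bragg-grating model with temperature compensation. The photo-elastic and thermal coefficients are typical textbook values for silica fiber, not OPTYRE calibration data, and the function name is hypothetical.

```python
# Typical silica-fiber FBG constants (illustrative values, not OPTYRE calibration data)
P_E = 0.22        # effective photo-elastic coefficient
K_T = 6.7e-6      # combined thermo-optic + thermal-expansion sensitivity [1/K]

def fbg_strain(wavelength_nm, ref_wavelength_nm, d_temp_k=0.0):
    """Convert a measured Bragg wavelength into mechanical strain using the
    linear FBG model:  dlambda/lambda0 = (1 - p_e)*eps + k_T*dT."""
    rel_shift = (wavelength_nm - ref_wavelength_nm) / ref_wavelength_nm
    return (rel_shift - K_T * d_temp_k) / (1.0 - P_E)

# Example: a 1550 nm grating stretched to 1550.8 nm at +5 K above reference
eps = fbg_strain(1550.8, 1550.0, d_temp_k=5.0)
print(f"circumferential strain ~ {eps * 1e6:.0f} microstrain")
```

    The dynamic temperature sensor mentioned in the abstract would supply the temperature term so that thermal wavelength shifts are not misread as strain.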

    The Hierarchic treatment of marine ecological information from spatial networks of benthic platforms

    Measuring biodiversity simultaneously in different locations, at different temporal scales, and over wide spatial scales is of strategic importance for the improvement of our understanding of the functioning of marine ecosystems and for the conservation of their biodiversity. Monitoring networks of cabled observatories, along with other docked autonomous systems (e.g., Remotely Operated Vehicles [ROVs], Autonomous Underwater Vehicles [AUVs], and crawlers), are being conceived and established at a spatial scale capable of tracking energy fluxes across benthic and pelagic compartments, as well as across geographic ecotones. At the same time, optoacoustic imaging is sustaining an unprecedented expansion in marine ecological monitoring, enabling the acquisition of new biological and environmental data at an appropriate spatiotemporal scale. At this stage, one of the main problems for an effective application of these technologies is the processing, storage, and treatment of the acquired complex ecological information. Here, we provide a conceptual overview of the technological developments in the multiparametric generation, storage, and automated hierarchic treatment of biological and environmental information required to capture the spatiotemporal complexity of a marine ecosystem. In doing so, we present a pipeline of ecological data acquisition and processing organized in different steps that are amenable to automation. We also give an example of computing population biomass, community richness, and biodiversity data (as indicators of ecosystem functionality) with an Internet Operated Vehicle (a mobile crawler). Finally, we discuss the software requirements for that automated data processing at the level of cyber-infrastructures with sensor calibration and control, data banking, and ingestion into large data portals.
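    As a minimal example of the indicator computation mentioned above, the sketch below derives species richness and the Shannon diversity index from per-taxon counts, such as those produced by automated classification of crawler imagery. The function names and the sample counts are illustrative assumptions, not the pipeline's actual code.

```python
import numpy as np

def richness(counts):
    """Species richness: number of taxa with at least one individual observed."""
    return int(np.count_nonzero(counts))

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over the observed taxa."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Hypothetical per-transect counts (individuals per taxon) from classified images
counts = [42, 7, 0, 13, 1, 5]
print(richness(counts), round(shannon_diversity(counts), 3))
```

    In an automated pipeline, steps like this would sit downstream of image classification and sensor calibration, feeding the resulting indicators into the data portals mentioned above.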

    Trends in Smart City Development

    This report examines the meanings and practices associated with the term 'smart cities.' Smart city initiatives involve three components: information and communication technologies (ICTs) that generate and aggregate data; analytical tools that convert that data into usable information; and organizational structures that encourage collaboration, innovation, and the application of that information to solve public problems.

    Autonomous Vehicle Coordination with Wireless Sensor and Actuator Networks

    A coordinated team of mobile wireless sensor and actuator nodes can bring numerous benefits to applications in cooperative surveillance, mapping of unknown areas, disaster management, automated highways, and space exploration. This article explores the idea of mobile nodes built on wheeled vehicles augmented with wireless, sensing, and control capabilities. One of the vehicles acts as a leader and is remotely driven by the user, while the others act as followers. Each vehicle carries a low-power wireless sensor node featuring a 3D accelerometer and a magnetic compass. Speed and orientation are computed in real time using inertial navigation techniques. The leader periodically transmits these measures to the followers, which implement a lightweight fuzzy logic controller to imitate the leader's movement pattern. We report in detail on all development phases, covering design, simulation, controller tuning, inertial sensor evaluation, calibration, scheduling, fixed-point computation, debugging, benchmarking, field experiments, and lessons learned.
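    A minimal sketch of a follower-side steering rule in the spirit of the lightweight fuzzy logic controller described above is given below. The membership functions, rule set, and output singletons are illustrative assumptions, not the tuned controller reported in the article.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def steer_correction(heading_err_deg):
    """Map the heading error (leader heading minus follower heading, degrees)
    to a steering command in [-1, 1] with three fuzzy rules and a weighted
    average of singleton consequents as defuzzification."""
    mu_left  = tri(heading_err_deg, -90, -45,  0)   # error negative -> steer left
    mu_zero  = tri(heading_err_deg, -45,   0, 45)   # error small    -> keep course
    mu_right = tri(heading_err_deg,   0,  45, 90)   # error positive -> steer right
    weights  = np.array([mu_left, mu_zero, mu_right])
    outputs  = np.array([-1.0, 0.0, 1.0])           # assumed rule consequents
    return float((weights * outputs).sum() / max(weights.sum(), 1e-9))

# Example: leader reports 30 deg, follower compass reads 10 deg -> steer right
print(steer_correction(30 - 10))
```

    In the article's setup, the error inputs would come from the leader's periodic broadcasts and the follower's own compass and accelerometer-derived speed; an analogous rule set would handle the speed channel.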