FireFly Mosaic: A Vision-Enabled Wireless Sensor Networking System

Abstract

With the advent of CMOS cameras, it is now possible to build compact, cheap, and low-power image sensors capable of on-board image processing. These embedded vision sensors provide a rich new sensing modality, enabling new classes of wireless sensor networking applications. To build these applications, system designers must overcome challenges associated with limited bandwidth, limited power, group coordination, and the fusion of multiple camera views with various other sensory inputs. Real-time properties must be upheld if multiple vision sensors are to process data, communicate with each other, and make a group decision before the measured environmental feature changes. In this paper, we present FireFly Mosaic, a wireless sensor network image processing framework with operating system, networking, and image processing primitives that assist in the development of distributed vision-sensing tasks. Each FireFly Mosaic wireless camera consists of a FireFly [1] node coupled with a CMUcam3 [2] embedded vision processor. The FireFly nodes run the Nano-RK [3] real-time operating system and communicate using the RT-Link [4] collision-free TDMA link protocol. Using FireFly Mosaic, we demonstrate an assisted-living application capable of fusing multiple cameras with overlapping views to discover and monitor daily activities in a home. With this application, we show how an integrated platform with support for time synchronization, a collision-free TDMA link layer, an underlying RTOS, and an interface to an embedded vision sensor provides a stable framework for distributed real-time vision processing. To the best of our knowledge, this is the first wireless sensor networking system to integrate multiple coordinating cameras performing local processing.
