    Configuration-Adaptive Edge-Assisted Industrial Wireless Cameras with Deep Reinforcement Learning

    Embodiments of the present disclosure provide a method and system for designing and implementing an industrial smart camera that uses low-power wireless communications and edge computing to achieve cordless and energy-efficient visual sensing. The smart camera applies deep reinforcement learning to adapt the camera configuration so that the desired visual sensing performance is maintained with the minimum energy consumption under dynamic variations in application requirements and wireless channel conditions.
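    A minimal sketch of the underlying idea, using tabular Q-learning as a simplified stand-in for the paper's deep RL agent: the agent repeatedly picks a camera configuration (resolution, frame rate) and is rewarded for meeting an accuracy requirement at low energy. The configuration set, energy figures, and reward shape are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only: a tabular Q-learning agent that selects a camera
# configuration to balance sensing accuracy against energy. All numbers are
# hypothetical; the paper uses deep reinforcement learning, not this table.
import random
from collections import defaultdict

CONFIGS = [            # (resolution, assumed per-frame energy in mJ)
    ((320, 240), 5),
    ((640, 480), 10),
    ((1280, 720), 15),
]

class ConfigAgent:
    def __init__(self, epsilon=0.1, alpha=0.2, gamma=0.9):
        self.q = defaultdict(float)   # Q[(state, action)] -> value
        self.epsilon, self.alpha, self.gamma = epsilon, alpha, gamma

    def act(self, state):
        if random.random() < self.epsilon:        # explore occasionally
            return random.randrange(len(CONFIGS))
        # exploit: pick the configuration with the highest learned value
        return max(range(len(CONFIGS)), key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, a)] for a in range(len(CONFIGS)))
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])

def reward(accuracy, energy_mj, required_accuracy=0.8, beta=0.01):
    # Penalize energy use; heavily penalize missing the accuracy requirement.
    penalty = 0.0 if accuracy >= required_accuracy else -1.0
    return penalty - beta * energy_mj
```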

    Sensorcam: An Energy-Efficient Smart Wireless Camera for Environmental Monitoring

    Reducing energy cost is crucial for energy-constrained smart wireless cameras. Existing platforms pose two main challenges. First, most commercial smartphones are closed platforms, which makes it impossible to manage low-level circuits; since the sampling frequency is moderate in an environmental monitoring context, any improper power management during idle periods incurs significant energy leakage. Second, low-end cameras tailored for wireless sensor networks usually have limited processing power or communication range, and thus are not capable of outdoor monitoring tasks at low data rates. To tackle these issues, we develop Sensorcam, a long-range, smart wireless camera running a Linux-based open system. Through better power management in idle periods and the "intelligence" of the camera itself, we demonstrate an energy-efficient wireless monitoring system in a real deployment.
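    A minimal duty-cycling sketch of the idle-period power management idea described above. The camera, radio, and transmit hooks are hypothetical placeholders for platform-specific drivers, and the sampling interval is an assumption; this is not Sensorcam's actual code.

```python
# Illustrative duty-cycling sketch: power the camera and radio down between
# samples so the node does not leak energy while idle. power_up()/power_down()
# and capture() stand in for hypothetical platform-specific driver calls.
import time

SAMPLE_PERIOD_S = 600          # assumed 10-minute sampling interval

def monitoring_loop(camera, radio, transmit):
    while True:
        camera.power_up()
        frame = camera.capture()
        camera.power_down()    # avoid idle leakage between samples

        radio.power_up()
        transmit(frame)
        radio.power_down()

        time.sleep(SAMPLE_PERIOD_S)   # node idles in its lowest power state
```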

    Efficient Surveillance System

    At the University of Limerick, we designed a proof-of-concept device for a low-power, low-cost wireless surveillance system, partially powered by solar energy. We designed a portion of the system - the camera node and the base station's user interface. The major functionality included capturing pictures based on motion detection and remote notification of this activity, which could also be viewed online. We designed this prototype to fill a gap in the market for low-cost, highly scalable surveillance systems.
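    A minimal sketch of motion-triggered capture with notification, using simple frame differencing in OpenCV. The threshold, file name, and notify() hook are illustrative assumptions, not the design described in the abstract.

```python
# Illustrative motion-triggered capture loop using frame differencing.
# Thresholds and the notify() callback are assumed, not the original design.
import cv2

MOTION_THRESHOLD = 5000   # assumed: changed-pixel count that counts as motion

def run(camera_index=0, notify=print):
    cap = cv2.VideoCapture(camera_index)
    ok, previous = cap.read()
    previous = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, previous)
        mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)[1]
        if cv2.countNonZero(mask) > MOTION_THRESHOLD:
            cv2.imwrite("event.jpg", frame)   # keep the triggering picture
            notify("motion detected")         # e.g. alert the base station
        previous = gray
    cap.release()
```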

    CITRIC: A low-bandwidth wireless camera network platform

    In this paper, we propose and demonstrate a novel wireless camera network system, called CITRIC. The core component of this system is a new hardware platform that integrates a camera, a frequency-scalable (up to 624 MHz) CPU, 16 MB FLASH, and 64 MB RAM onto a single device. The device then connects with a standard sensor network mote to form a camera mote. The design enables in-network processing of images to reduce communication requirements, which have traditionally been high in existing camera networks with centralized processing. We also propose a back-end client/server architecture to provide a user interface to the system and support further centralized processing for higher-level applications. Our camera mote enables a wider variety of distributed pattern recognition applications than traditional platforms because it provides more computing power and tighter integration of physical components while still consuming relatively little power. Furthermore, the mote easily integrates with existing low-bandwidth sensor networks because it can communicate over the IEEE 802.15.4 protocol with other sensor network platforms. We demonstrate our system on three applications: image compression, target tracking, and camera localization.
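    A minimal sketch of the in-network processing idea: compress a frame locally, then split it into payloads small enough for IEEE 802.15.4 frames before transmission. The 80-byte payload size and the use of zlib are assumptions for illustration, not CITRIC's actual image coding or packet format.

```python
# Illustrative sketch of processing images in-network before radio transmission:
# compress locally, then fragment into small 802.15.4-sized payloads.
import zlib

MAX_PAYLOAD = 80   # assumed application payload bytes per 802.15.4 frame

def compress_and_fragment(raw_image: bytes):
    compressed = zlib.compress(raw_image, level=6)   # stand-in for real image coding
    return [
        compressed[i:i + MAX_PAYLOAD]                # one radio packet each
        for i in range(0, len(compressed), MAX_PAYLOAD)
    ]

# Example: a 64x64 grayscale frame shrinks before it ever leaves the mote.
frame = bytes(64 * 64)
print(len(frame), "->", sum(len(f) for f in compress_and_fragment(frame)), "bytes")
```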

    A prototype node for wireless vision sensor network applications development

    This paper presents a prototype vision-enabled sensor node based on a commercial vision system of reduced size and power consumption. The wireless infrastructure for the deployment of a distributed smart camera network based on these nodes is provided by commercial motes. The smart camera, based on a low-power bio-inspired processing scheme, enables in-node image processing and vision tools. This makes it possible to build a lighter representation of the scene that keeps the relevant information in terms of detected elements, features and events, alleviating data transmission through the network. By passing only the relevant information to the neighboring sensor nodes, distributed and collaborative vision is therefore possible with the limited data rates available in commercial wireless sensor networks. Communication between the different components of the system is supported by the available UARTs and GPIOs. Several examples of in-node image processing and feature detection have been tested on the prototype, and information at different abstraction levels has been broadcast to the network. Junta de Andalucía 2006-TIC-2352; Ministerio de Ciencia e Innovación TEC2009-1181.
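    A minimal sketch of the "send only the relevant information" idea: instead of pixels, the node packs detected features into a compact binary message handed to the attached mote, for example over a UART. The message layout and field names are assumptions, not the prototype's actual protocol.

```python
# Illustrative sketch: pack detected features into a compact event message
# rather than transmitting the raw frame. The byte layout is an assumption.
import struct
import time

def pack_event(node_id: int, event_type: int, features):
    """Pack an event as: node id, event type, timestamp, feature count, (x, y) pairs."""
    header = struct.pack("<BBIH", node_id, event_type, int(time.time()),
                         len(features))
    body = b"".join(struct.pack("<HH", x, y) for x, y in features)
    return header + body

# A detection with three corner features fits in a few tens of bytes,
# versus tens of kilobytes for the raw frame it summarizes.
message = pack_event(node_id=7, event_type=1, features=[(12, 34), (56, 78), (90, 11)])
print(len(message), "bytes")   # handed to the mote over the UART
```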

    FireFly Mosaic: A Vision-Enabled Wireless Sensor Networking System

    With the advent of CMOS cameras, it is now possible to make compact, cheap and low-power image sensors capable of on-board image processing. These embedded vision sensors provide a rich new sensing modality enabling new classes of wireless sensor networking applications. In order to build these applications, system designers need to overcome challenges associated with limited bandwidth, limited power, group coordination and fusing of multiple camera views with various other sensory inputs. Real-time properties must be upheld if multiple vision sensors are to process data, communicate with each other and make a group decision before the measured environmental feature changes. In this paper, we present FireFly Mosaic, a wireless sensor network image processing framework with operating system, networking and image processing primitives that assist in the development of distributed vision-sensing tasks. Each FireFly Mosaic wireless camera consists of a FireFly [1] node coupled with a CMUcam3 [2] embedded vision processor. The FireFly nodes run the Nano-RK [3] real-time operating system and communicate using the RT-Link [4] collision-free TDMA link protocol. Using FireFly Mosaic, we demonstrate an assisted living application capable of fusing multiple cameras with overlapping views to discover and monitor daily activities in a home. Using this application, we show how an integrated platform with support for time synchronization, a collision-free TDMA link layer, an underlying RTOS and an interface to an embedded vision sensor provides a stable framework for distributed real-time vision processing. To the best of our knowledge, this is the first wireless sensor networking system to integrate multiple coordinating cameras performing local processing.
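    A minimal sketch of the collision-free TDMA idea behind a link layer like RT-Link: all nodes share a synchronized clock and each transmits only during its own slot of a repeating frame, so two nodes with different slots can never collide. Slot and frame sizes below are assumptions, not RT-Link's actual parameters.

```python
# Illustrative TDMA slotting sketch: a node may transmit only when the
# synchronized network time falls inside its assigned slot. Durations are
# assumed values, not RT-Link's real configuration.
SLOT_MS = 10            # assumed slot length
SLOTS_PER_FRAME = 32    # assumed slots per TDMA frame

def in_my_slot(node_slot: int, synced_time_ms: int) -> bool:
    """Return True when the synchronized clock is inside this node's slot."""
    frame_ms = SLOT_MS * SLOTS_PER_FRAME
    position_in_frame = synced_time_ms % frame_ms
    return position_in_frame // SLOT_MS == node_slot

# Node 5 may transmit during [50 ms, 60 ms) of every 320 ms frame.
assert in_my_slot(5, 50) and in_my_slot(5, 59) and not in_my_slot(5, 60)
```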

    Technology of swallowable capsule for medical applications

    Medical technology has undergone major breakthroughs in recent years, especially in the area of examination tools for diagnostic purposes. This paper reviews swallowable capsule technology for examining the gastrointestinal system for various diseases. The wireless camera pill offers a more advanced approach than many traditional examination methods for diagnosing gastrointestinal diseases, such as gastroscopy performed with an endoscope. After years of innovation, commercial swallowable pills have been produced and applied in clinical practice. These smart pills can cover the examination of the gastrointestinal system and not only provide physicians with far more useful data than is available from traditional methods, but also eliminate the need for the painful endoscopy procedure. In this paper, the key state-of-the-art technologies in existing Wireless Capsule Endoscopy (WCE) systems are fully reported and recent research progress related to these technologies is reviewed. The paper ends with a further discussion of the current technical bottlenecks and future research in this area.