A New Light Sensing Module for Mica Motes
We present the Ping-Pong mote, a new light sensing module for the Mica mote platform. The Ping-Pong mote achieves performance comparable to a commercial light intensity meter while conforming to the size and energy constraints imposed by its application in wireless sensor networks. The Ping-Pong mote was developed to replace the Mica sensor board (MTS310), whose slow response time and narrow dynamic range in light intensity capture are unsuitable for many applications, including media production. The Ping-Pong mote features significantly improved SNR due to its adoption of high-end photo sensors, amplification and conversion circuits coupled with active noise suppression, application-tuned filter networks, and a noise-attentive manual layout.
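For context on the SNR claim, signal-to-noise ratio for a sensing channel is commonly reported in decibels as the ratio of signal amplitude to noise amplitude. The sketch below is illustrative only and is not taken from the Ping-Pong design: it estimates SNR from repeated ADC readings of a steady light source, treating the mean reading as signal and the standard deviation as noise; the function name and the 10-bit ADC values are our own assumptions.

```python
import math

def estimate_snr_db(samples):
    """Estimate SNR (in dB) from repeated ADC readings of a steady light source.

    The mean reading is treated as the signal level and the standard
    deviation of the readings as the noise level, so
    SNR_dB = 20 * log10(mean / std).
    """
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    std = math.sqrt(variance)
    if std == 0:
        return float("inf")  # no measurable noise in this window
    return 20 * math.log10(mean / std)

# Hypothetical 10-bit ADC readings under constant illumination.
readings = [512, 514, 511, 513, 512, 515, 510, 512]
print(f"Estimated SNR: {estimate_snr_db(readings):.1f} dB")
```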
The Westwood Experience: Connecting story to locations via Mixed Reality
Figure 1: Some of the components of the experience. From left to right: the mayor character, the game played in the theater, the panorama effect in front of Yamato, details of the set at the brewery, Marilyn Monroe's grave.

The Westwood Experience is a location-based narrative that uses Mixed Reality effects to connect participants to unique and evocative real locations, bridging the gap between the real and story worlds. This paper describes the experience and a detailed evaluation of it. The experience centers on a narrative told by the "mayor" of Westwood. He tells a love story from his youth, when he first came to Westwood, and intermixes the story with historical information. Most of this story is told on a mobile computer, using Mixed Reality and video for illustration. We evaluate the experience both quantitatively and qualitatively to find lessons learned about the experience itself and general guidelines for this type of experience. The analysis and guidelines from our evaluation are grouped into three categories: narration in mobile environments, social dynamics, and Mixed Reality effects.
Improving Personal and Environmental Health Decision Making with Mobile Personal Sensing
CENS is focusing on three types of health applications: personalized medicine (AndWellness, AndAmbulation), epidemiological data collection (Project Surya), and personal decision making and awareness (PEIR). Each of these applications uses a similar systems architecture: time, location (GPS), and motion (accelerometer) trace collection on the mobile phone with a user interface; scientific model-based analytics used to draw inferences from the data; and graphical map- or calendar-based feedback to users. The specifics of each component depend on the type of data collected, the target populations, and the goals of the project. The UI for AndWellness includes an ecological momentary assessment, a set of questions a user completes about their feelings at that moment, and control over the time, location, and frequency of the reminders that prompt users to complete the assessments. The AndWellness UI aims to make the assessment easy to understand and quick to complete. The UI for Project Surya is designed for rural villagers in India who will likely not know how to read; it is therefore primarily graphical, with little or no text. The analytics used for each project differ based on its goals. All four applications use activity classification algorithms to infer a user's activity from the GPS and/or accelerometer traces, but the similarity ends there. Project Surya uses image analysis algorithms to infer soot levels from images of specialized filters and calibrated color charts. AndWellness uses simple statistical calculations to compute base rates for a small set of behaviors measured with the ecological momentary assessments. PEIR uses models from the Air Resources Board and other GIS streams to compute users' carbon impact, particulate exposure, and fast food exposure from a location trace. The feedback for each project is presented using a map- and/or calendar-based interface, depending on the data and goals of the project. Because AndWellness users are interested in identifying patterns in space and time across weeks or months, AndWellness presents data in both a calendar and a map-based interface and makes it easy to cross-reference any event across either mode. PEIR uses a map to highlight routes and pollution exposure, and bar graphs to show aggregates for each of the three metrics computed by the analytics. AndAmbulation uses only a calendar interface because its users are most interested in trends over time.
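As a rough illustration of the shared analytics step, the sketch below classifies coarse activity (still, walking, driving) from a GPS-derived speed and a window of accelerometer readings using simple thresholds. The thresholds, function name, and input format are our own assumptions for illustration and are not taken from the CENS implementations, which the abstract does not detail.

```python
import math

def classify_activity(speed_mps, accel_samples_g):
    """Coarse activity classification for one window of trace data.

    speed_mps        -- average speed over the window, derived from GPS (m/s)
    accel_samples_g  -- list of (x, y, z) accelerometer readings in g

    Returns "still", "walking", or "driving". Thresholds are illustrative
    placeholders, not tuned values.
    """
    # Variance of the acceleration magnitude separates movement from rest.
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples_g]
    mean_mag = sum(magnitudes) / len(magnitudes)
    variance = sum((m - mean_mag) ** 2 for m in magnitudes) / len(magnitudes)

    if variance < 0.01 and speed_mps < 0.5:
        return "still"
    if speed_mps < 2.5:   # roughly walking pace
        return "walking"
    return "driving"      # fast movement dominated by vehicle travel

# Hypothetical window: modest body motion at walking speed.
window = [(0.02, 0.01, 0.99), (0.03, -0.01, 1.01), (0.01, 0.02, 0.98)]
print(classify_activity(speed_mps=1.4, accel_samples_g=window))
```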