22 research outputs found
BRIX - An Easy-to-Use Modular Sensor and Actuator Prototyping Toolkit
Zehe S, Großhauser T, Hermann T. BRIX - An Easy-to-Use Modular Sensor and Actuator Prototyping Toolkit. In: Tenth Annual IEEE International Conference on Pervasive Computing and Communications, Workshop Proceedings. Lugano, Switzerland: IEEE; 2012: 817-822.

In this paper we present BRIX, a novel modular hardware prototyping platform for applications in mobile, wearable and stationary sensing, data streaming and feedback. The system consists of three different types of compact stackable modules, which can adapt to various applications and scenarios. The core of BRIX is a base module that contains basic motion sensors, a processor and a wireless interface. A battery module provides power for the system and makes it a mobile device. Different types of extension modules can be stacked onto the base module to extend its scope of functions with sensors, actuators and interactive elements. BRIX allows very intuitive, inexpensive and expeditious prototyping that does not require knowledge of electronics or hardware design. In an example application, we demonstrate how BRIX can be used to track human body movements.
BRIX₂ - A Versatile Toolkit for Rapid Prototyping and Education in Ubiquitous Computing
Zehe S. BRIX₂ - A Versatile Toolkit for Rapid Prototyping and Education in Ubiquitous Computing. Bielefeld: Universität Bielefeld; 2018
Weather to Go - A Blended Sonification Application
Presented at the 20th International Conference on Auditory Display (ICAD2014), June 22-25, 2014, New York, NY.

People often stay in touch with the weather forecast for various reasons. We depend on knowing the upcoming weather conditions in order to plan activities outside, or even just to decide what to wear on our way to work. With Weather to Go we present an auditory weather report which informs the user about the future or current weather situation when leaving home in the morning or the office in the evening. The sonification is designed to be calm, coherent and predictable, so that it blends well into the user's familiar environment. In this work the auditory display is activated when somebody leaves through the door. The activity is sensed by a multi-purpose sensor unit mounted at the door. When the door is opened, Weather to Go renders and plays sounds that characterize the weather forecast for the region where the system is located. In that way, the system raises the user's awareness of suitable clothes, transportation or the route to take to the destination at the right moment.
An Adaptive Acknowledgement On-demand Protocol for Wireless Sensor Networks
Lian Sang C, Hesse M, Zehe S, Adams M, Hörmann T, Rückert U. An Adaptive Acknowledgement On-demand Protocol for Wireless Sensor Networks. In: Proceedings of the 6th International Conference on Sensor Networks. Vol 1. 2017: 174-181.

The concept of packet acknowledgement in wireless communication networks is crucial for reliable data transmission. However, reliability comes at the cost of an increased duty cycle of the network, due to the additional acknowledgement time for every single data packet sent. Therefore, the energy consumption and latency of all sensor nodes increase while the overall throughput in the network decreases. This paper contributes an adaptive acknowledgement on-demand protocol for wireless sensor networks with a star network topology. The goal is to tackle the trade-off between energy efficiency and reliable data transmission. The proposed protocol is able to detect network congestion in real time by constantly monitoring the overall packet delivery ratio for each sensor node. In case the packet delivery ratio of any sensor node in the network drops significantly (e.g. due to environmental changes), the protocol automatically switches to a more reliable data transmission mode that uses acknowledgements for the affected sensor nodes. Our proposed method is tested and evaluated on a specific hardware implementation, and the corresponding results are discussed in this paper.
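The on-demand switching described in the abstract can be sketched in a few lines: track each node's packet delivery ratio (PDR) over a sliding window and enable acknowledgements only while the ratio is degraded. This is a minimal illustration of the idea, not the authors' implementation; the window size and thresholds (including the hysteresis band) are assumptions chosen for the example.

```python
from collections import deque

class AdaptiveAckMonitor:
    """Tracks per-node packet delivery ratio and toggles ACK mode on demand."""

    def __init__(self, window=20, low=0.8, high=0.95):
        self.window = window   # number of recent packets considered per node
        self.low = low         # PDR below this enables acknowledgements
        self.high = high       # PDR above this disables them again
        self.history = {}      # node id -> deque of 1 (delivered) / 0 (lost)
        self.ack_enabled = {}  # node id -> current transmission mode

    def record(self, node, delivered):
        """Record one transmission result; return whether ACK mode is active."""
        h = self.history.setdefault(node, deque(maxlen=self.window))
        h.append(1 if delivered else 0)
        pdr = sum(h) / len(h)
        # Hysteresis (low < high) avoids oscillating between modes
        # when the PDR hovers near a single threshold.
        if pdr < self.low:
            self.ack_enabled[node] = True
        elif pdr > self.high:
            self.ack_enabled[node] = False
        return self.ack_enabled.get(node, False)
```

A star topology makes this simple because the coordinator sees every node's traffic and can apply the policy per node without distributed state.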
Sonified Aerobics - Interactive Sonification of Coordinated Body Movements
Presented at the 17th International Conference on Auditory Display (ICAD2011), 20-23 June, 2011 in Budapest, Hungary.

This paper introduces a new hardware/software system for the interactive sonification of sports movements involving arm and leg movements. Two different sonifications are designed to convey rhythmical patterns that become auditory gestalts, so that listeners can identify features of the underlying coordinated movement. The sonification is designed to enable visually impaired users to participate in aerobics exercises, and also to enhance the perception of movements for sighted participants, which is useful, for instance, if the scene is occluded or the head posture is incompatible with observing the instructor or fitness professional who demonstrates the exercises in parallel. Furthermore, the system allows monitoring of fine couplings in arm/leg coordination while jogging, as auditory feedback may help stabilize the movement pattern. We present the sensing system, two sonification designs, and interaction examples that lead to coordination-specific sound gestalts. Finally, some qualitative observations are reported from the first uses of the prototype.
Supplementary Material for "Sonified Aerobics - Interactive Sonification of coordinated body movements"
Hermann T, Zehe S. Supplementary Material for "Sonified Aerobics - Interactive Sonification of coordinated body movements". Bielefeld University; 2011.

<img src="https://pub.uni-bielefeld.de/download/2696642/2702776" width="200" style="float:right;" >

This paper introduces a new hardware/software system for the interactive sonification of sports movements involving arm and leg movements. Two different sonifications are designed to convey rhythmical patterns that become auditory gestalts, so that listeners can identify features of the underlying coordinated movement. The sonification is designed to enable visually impaired users to participate in aerobics exercises, and also to enhance the perception of movements for sighted participants, which is useful, for instance, if the scene is occluded or the head posture is incompatible with observing the instructor or fitness professional who demonstrates the exercises in parallel. Furthermore, the system allows monitoring of fine couplings in arm/leg coordination while jogging, as auditory feedback may help stabilize the movement pattern. We present the sensing system, two sonification designs, and interaction examples that lead to coordination-specific sound gestalts. Finally, some qualitative observations are reported from the first uses of the prototype.
#### Example S1: (mapping sonification)
<video controls="controls" width="60%" height="60%">
  <source src="https://pub.uni-bielefeld.de/download/2696642/2696646" type="video/mp4" />
</video>

#### Example S2: (event-based sonification)
<video controls="controls" width="60%" height="60%">
  <source src="https://pub.uni-bielefeld.de/download/2696642/2696647" type="video/mp4" />
</video>
Characterization of binding properties of ephedrine derivatives to human alpha-1-acid glycoprotein
Most drugs, especially those with acidic or neutral moieties, are bound to the plasma protein albumin, whereas basic drugs are preferentially bound to human alpha-1-acid glycoprotein (AGP). The protein binding of the long-established drugs ephedrine and pseudoephedrine, which are used in the treatment of hypotension and colds, has so far only been studied with albumin. Since a previous study observed stereoselective binding of ephedrine and pseudoephedrine to serum but not to albumin, the aim of this study was to check whether the enantioselective binding behavior of ephedrine and pseudoephedrine, as well as of the derivatives methylephedrine and norephedrine, is due to AGP, and to investigate the influence of their different substituents and steric arrangement. Discontinuous ultrafiltration was used for the determination of protein binding. Ligand-protein interactions of the drugs were characterized by saturation transfer difference nuclear magnetic resonance spectroscopy. Docking experiments were performed to analyze possible ligand-protein interactions. The more basic the ephedrine derivative is, the higher its affinity to AGP. There was no significant difference in the binding properties between the individual enantiomers and the diastereomers of ephedrine and pseudoephedrine.
Supplementary Material for "Head gesture sonification for supporting social interaction"
Hermann T, Neumann A, Zehe S. Supplementary Material for "Head gesture sonification for supporting social interaction". Bielefeld University; 2012.

<img src="https://pub.uni-bielefeld.de/download/2695833/2702649" width="300" style="float:right;">

In this paper we introduce two new methods for real-time sonification of head movements and head gestures. Head gestures such as nodding or shaking the head are important non-verbal back-channelling signals which facilitate coordination and alignment between communicating interaction partners. Visually impaired persons cannot interpret such non-verbal signals, nor can people in mediated communication (e.g. on the phone), or cooperating users whose visual attention is focussed elsewhere. We introduce the idea, the approach, our sensing setup and two different sonification methods. A first preliminary study on the recognition of signals shows that subjects understand the gesture type even without prior explanation, and can estimate gesture intensity and frequency with no or little training.
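A continuous head-gesture sonification of the kind described above can be illustrated with a simple parameter mapping: the dominant gyroscope axis selects the tone (so nodding and head-shaking sound different), and the angular velocity drives the amplitude (so intensity is audible). This sketch is an assumed mapping for illustration only; the axis names, base frequencies and scaling are not taken from the paper.

```python
# Hypothetical parameter mapping: tone identity encodes gesture type
# (nod -> pitch axis, shake -> yaw axis), amplitude encodes intensity.
BASE_FREQ = {"pitch": 330.0, "yaw": 440.0, "roll": 550.0}  # Hz, illustrative

def map_gyro_to_sound(gyro):
    """gyro: dict of axis -> angular velocity in deg/s.
    Returns (frequency_hz, amplitude) for a continuous sonification."""
    axis = max(gyro, key=lambda a: abs(gyro[a]))      # dominant rotation axis
    magnitude = abs(gyro[axis])
    amplitude = min(1.0, magnitude / 200.0)           # full scale at 200 deg/s
    freq = BASE_FREQ[axis] * (1.0 + 0.2 * amplitude)  # slight rise with intensity
    return freq, amplitude
```

In a real system this mapping would be evaluated per sensor frame and fed to a synthesizer; the event-based variant would instead trigger discrete sounds when the velocity crosses a threshold.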
#### Sound example S1
<audio controls="controls" width="80%" height="80%">
  <source src="https://pub.uni-bielefeld.de/download/2695833/2695834" type="audio/mp3" />
</audio>

Continuous excitatory sonification of head gestures of different type, intensity and velocity.

#### Sound example S2
<audio controls="controls" width="80%" height="80%">
  <source src="https://pub.uni-bielefeld.de/download/2695833/2695835" type="audio/mp3" />
</audio>

Event-based sonification of head gestures of different type, intensity and velocity.

#### Video example S3
<video controls="controls" width="80%" height="80%">
  <source src="https://pub.uni-bielefeld.de/download/2695833/2695836" type="video/mp4" />
</video>
Sonified Aerobics - Interactive Sonification of coordinated body movements
Hermann T, Zehe S. Sonified Aerobics - Interactive Sonification of coordinated body movements. In: Worrall D, Wersényi G, eds. The 17th Annual Conference on Auditory Display, Budapest, Hungary 20-24 June, 2011, Proceedings. Budapest, Hungary: OPAKFI; 2011.

This paper introduces a new hardware/software system for the interactive sonification of sports movements involving arm and leg movements. Two different sonifications are designed to convey rhythmical patterns that become auditory gestalts, so that listeners can identify features of the underlying coordinated movement. The sonification is designed to enable visually impaired users to participate in aerobics exercises, and also to enhance the perception of movements for sighted participants, which is useful, for instance, if the scene is occluded or the head posture is incompatible with observing the instructor or fitness professional who demonstrates the exercises in parallel. Furthermore, the system allows monitoring of fine couplings in arm/leg coordination while jogging, as auditory feedback may help stabilize the movement pattern. We present the sensing system, two sonification designs, and interaction examples that lead to coordination-specific sound gestalts. Finally, some qualitative observations are reported from the first uses of the prototype.
Effects of Stimulated Body Part and Side of Deviant Presentation on Somatosensory Oddball-Responses: An Event-Related-Potential Study
Lindenbaum L, Kißler J, Anlauff J, Zehe S. Effects of Stimulated Body Part and Side of Deviant Presentation on Somatosensory Oddball-Responses: An Event-Related-Potential Study. In: Psychophysiology. Vol 56. Hoboken: Wiley; 2019: S65.