Overcoming Bandwidth Limitations in Wireless Sensor Networks by Exploitation of Cyclic Signal Patterns: An Event-triggered Learning Approach
Wireless sensor networks are used in a wide range of applications, many of which require real-time transmission of the measurements. Bandwidth limitations restrict the sampling frequency and the number of sensors. This problem can be addressed by reducing the communication load via data compression and event-based communication approaches. The present paper focuses on the class of applications in which the signals exhibit unknown and potentially time-varying cyclic patterns. We review recently proposed event-triggered learning (ETL) methods that identify and exploit these cyclic patterns, show how these methods can be applied to the nonlinear multivariable dynamics of three-dimensional orientation data, and propose a novel approach that uses Gaussian process models. In contrast to other approaches, all three ETL methods work in real time and assure a small upper bound on the reconstruction error. The proposed methods are compared to several conventional approaches on experimental data from human subjects walking with a wearable inertial sensor network. They are found to reduce the communication load by 60–70%, which implies that two to three times as many sensor nodes could be used at the same bandwidth.
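The core event-based idea described above can be sketched as follows: sender and receiver share a predictor, and a sample is transmitted only when the receiver-side prediction would deviate from the true value by more than a chosen bound. This toy uses a zero-order-hold predictor rather than the paper's learned cyclic models or Gaussian processes; the threshold `delta` and all names are illustrative assumptions, not from the paper.

```python
# Event-triggered transmission sketch with a shared zero-order-hold
# predictor: transmit only when the held value would be off by more
# than delta, so the reconstruction error is guaranteed <= delta.

def event_triggered_stream(samples, delta):
    """Yield (reconstructed_value, was_sent) for each input sample."""
    last_sent = None
    for x in samples:
        if last_sent is None or abs(x - last_sent) > delta:
            last_sent = x           # event: transmit the new sample
            yield x, True
        else:
            yield last_sent, False  # receiver holds the last transmitted value

signal = [0.0, 0.05, 0.12, 0.9, 0.95, 1.0, 0.2]
recon = list(event_triggered_stream(signal, delta=0.1))
sent = sum(1 for _, s in recon if s)
# every reconstructed value stays within delta of the true sample
assert all(abs(r - x) <= 0.1 for (r, _), x in zip(recon, signal))
```

With a richer shared predictor (e.g. a learned cyclic template), far fewer events are triggered for periodic signals, which is where the reported 60–70% reduction comes from.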
Statistical modeling of ground motion relations for seismic hazard analysis
We introduce a new approach for ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), informed by the extreme value theory of mathematical statistics. We understand a GMR as a random function and derive mathematically the principle of area equivalence: two alternative GMRs have an equivalent influence on the hazard if they have equivalent area functions. This includes local biases. Interpreting the difference between two such GMRs (an actual and a modeled one) as a random component leads to a general overestimation of residual variance and hazard. In addition, we discuss important aspects of classical approaches and uncover discrepancies with the state of the art in stochastics and statistics (model selection and significance, testing of distribution assumptions, extreme value statistics). In particular, we criticize the assumption of log-normally distributed residuals for maxima such as the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(epsilon_0) of Joyner and Boore 1993) is the generalized extreme value distribution. We show by numerical investigations that the actual distribution can be hidden and that a wrong distribution assumption can affect the PSHA as negatively as neglecting area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimating event-specific GMRs. The majority of the best models are based on an anisotropic point-source approach. The residual variance of the logarithmized PGA is significantly smaller than in previous models. We validate the estimates for the event with the largest sample by empirical area functions.
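For context, the generalized extreme value (GEV) family invoked for the individual random component has, in its standard textbook parametrization (not the paper's notation), the cumulative distribution function

```latex
F(x;\mu,\sigma,\xi) = \exp\!\left\{-\left[1 + \xi\,\frac{x-\mu}{\sigma}\right]^{-1/\xi}\right\},
\qquad 1 + \xi\,\frac{x-\mu}{\sigma} > 0,
```

with location $\mu$, scale $\sigma > 0$, and shape $\xi$; the Gumbel case $F(x) = \exp\{-e^{-(x-\mu)/\sigma}\}$ is recovered in the limit $\xi \to 0$. A log-normal assumption for a maximum such as PGA is not a member of this family, which is the basis of the criticism above.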
Statistical framework for video decoding complexity modeling and prediction
Video decoding complexity modeling and prediction is an increasingly important issue for efficient resource utilization in a variety of applications, including task scheduling, receiver-driven complexity shaping, and adaptive dynamic voltage scaling. In this paper we present a novel view of this problem from a statistical framework perspective. We explore the statistical structure (clustering) of the execution time required by each video decoder module (entropy decoding, motion compensation, etc.) in conjunction with complexity features that are easily extractable at encoding time (representing the properties of each module's input source data). For this purpose, we employ Gaussian mixture models (GMMs) and an expectation-maximization algorithm to estimate the joint execution-time/feature probability density function (PDF). A training set of typical video sequences is used in an offline estimation process. The obtained GMM representation is used in conjunction with the complexity features of new video sequences to predict the execution time required for the decoding of these sequences. Several prediction approaches are discussed and compared. The potential mismatch between the training set and new video content is addressed by adaptive online joint-PDF re-estimation. An experimental comparison is performed to evaluate the different approaches and compare the proposed prediction scheme with related resource prediction schemes from the literature. The usefulness of the proposed complexity-prediction approaches is demonstrated in an application of rate-distortion-complexity optimized decoding.
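The GMM/EM machinery described above can be illustrated in miniature. The sketch below fits a one-dimensional, two-component Gaussian mixture by expectation-maximization on synthetic data; the paper's setting is multivariate (execution time jointly with complexity features) with more components, so every name and constant here is an illustrative assumption, not the authors' implementation.

```python
# Toy 1-D Gaussian mixture fit via expectation-maximization (EM),
# illustrating the joint-PDF estimation step of the framework.
import math
import random

def gauss_pdf(x, mu, var):
    """Density of a normal distribution with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, iters=50):
    """Fit a two-component 1-D GMM; returns (weights, means, variances)."""
    # crude initialization: anchor the components at the data extremes
    mus = [min(data), max(data)]
    vars_ = [1.0, 1.0]
    weights = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component per point
        resp = []
        for x in data:
            p = [w * gauss_pdf(x, m, v)
                 for w, m, v in zip(weights, mus, vars_)]
            s = sum(p)
            resp.append([pi / s for pi in p])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weights[k] = nk / len(data)
            mus[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            vars_[k] = max(1e-6, sum(r[k] * (x - mus[k]) ** 2
                                     for r, x in zip(resp, data)) / nk)
    return weights, mus, vars_

random.seed(0)
data = ([random.gauss(0.0, 0.5) for _ in range(200)]
        + [random.gauss(5.0, 0.5) for _ in range(200)])
weights, mus, vars_ = em_gmm_1d(data)
# the recovered means should sit near the true cluster centers 0 and 5
```

Once such a joint PDF over (features, execution time) is fitted offline, prediction for a new sequence amounts to evaluating the conditional expectation of execution time given the observed features.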