153 research outputs found
A Novel Framework for Online Amnesic Trajectory Compression in Resource-constrained Environments
State-of-the-art trajectory compression methods usually involve high
space-time complexity or yield unsatisfactory compression rates, leading to
rapid exhaustion of memory, computation, storage and energy resources. Their
ability is commonly limited when operating in a resource-constrained
environment, especially when the data volume (even when compressed) far exceeds
the storage limit. Hence we propose a novel online framework for error-bounded
trajectory compression and ageing called the Amnesic Bounded Quadrant System
(ABQS), whose core is the Bounded Quadrant System (BQS) algorithm family that
includes a normal version (BQS), a fast version (FBQS), and a progressive version
(PBQS). ABQS intelligently manages a given storage and compresses the
trajectories with different error tolerances subject to their ages. In the
experiments, we conduct comprehensive evaluations for the BQS algorithm family
and the ABQS framework. Using empirical GPS traces from flying foxes and cars,
and synthetic data from simulation, we demonstrate the effectiveness of the
standalone BQS algorithms in significantly reducing the time and space
complexity of trajectory compression, while improving the compression rates of
state-of-the-art algorithms by up to 45%. We also show that the operational
time of the target resource-constrained hardware platform can be prolonged by
up to 41%. We then verify that, given data volumes far greater than the
storage space, ABQS achieves 15 to 400 times smaller errors than the
baselines. We also show that the algorithm is robust to extreme trajectory
shapes.
Comment: arXiv admin note: substantial text overlap with arXiv:1412.032
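The abstract does not reproduce the BQS algorithms themselves. For orientation, here is a minimal sketch of the opening-window style of online error-bounded compression that such methods are typically benchmarked against; the function names and the `eps` tolerance are illustrative, not the authors' API:

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def compress(points, eps):
    """Opening-window compression: buffer points after an anchor and emit
    the previous point as soon as extending the window would let some
    buffered point deviate more than eps from the simplified segment."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    anchor, window = points[0], []
    for p in points[1:]:
        if any(point_segment_dist(q, anchor, p) > eps for q in window):
            kept.append(window[-1])          # close the window at its last safe point
            anchor, window = window[-1], [p]
        else:
            window.append(p)
    kept.append(points[-1])
    return kept
```

A straight run collapses to its endpoints, while a sharp corner survives because dropping it would violate the deviation tolerance.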
An Optimal Linear Time Algorithm for Quasi-Monotonic Segmentation
Monotonicity is a simple yet significant qualitative characteristic. We
consider the problem of segmenting a sequence in up to K segments. We want
segments to be as monotonic as possible and to alternate signs. We propose a
quality metric for this problem using the l_inf norm, and we present an optimal
linear time algorithm based on a novel formalism. Moreover, given an
O(n log n) precomputation that labels all extrema, we can compute any optimal
segmentation in constant time. We experimentally compare its performance to
two piecewise linear segmentation heuristics (top-down and bottom-up) and show
that our algorithm is faster and more accurate. Applications include pattern
recognition and qualitative modeling.
Comment: This is the extended version of our ICDM'05 paper (arXiv:cs/0702142
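The optimal algorithm itself is not given in the abstract. One building block it implies, the l_inf distance from a segment to the nearest monotone sequence, can be computed in linear time with a running extremum; a hedged sketch in our own formulation, not the paper's code:

```python
def linf_monotone_cost(x, increasing=True):
    """Minimal l_inf distance from x to any monotone sequence: for a
    nondecreasing fit it is half the largest drop below the running
    maximum (symmetrically, the largest rise for a nonincreasing fit)."""
    worst = 0.0
    extreme = x[0]
    for v in x[1:]:
        if increasing:
            extreme = max(extreme, v)        # running maximum so far
            worst = max(worst, extreme - v)  # how far v falls below it
        else:
            extreme = min(extreme, v)
            worst = max(worst, v - extreme)
    return worst / 2.0
```

For example, fitting [1, 3, 2, 4] with a nondecreasing sequence costs 0.5, since the worst drop (from 3 to 2) can be split evenly between the two offending values.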
Monotone Pieces Analysis for Qualitative Modeling
Building qualitative models of industrial applications is a crucial task for model-based diagnosis. A model-abstraction procedure is designed to automatically transform a quantitative model into a qualitative one. If the data is monotone, the behavior can easily be abstracted using the corners of the bounding rectangle; hence many existing model-abstraction approaches rely on monotonicity. But it is not a trivial problem to robustly detect monotone pieces in scattered data obtained from numerical simulation or experiments. This paper introduces an approach based on scale-dependent monotonicity: the notion that monotonicity can be defined relative to a scale. Real-valued functions defined on a finite set of reals, e.g. simulation results, can be partitioned into quasi-monotone segments. The endpoints of the monotone segments are used as the initial set of landmarks for qualitative model abstraction, which works as an iterative refinement process starting from these initial landmarks. The monotonicity analysis presented here can be used in constructing many other kinds of qualitative models; it is robust and computationally efficient.
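A minimal sketch of scale-dependent monotone-piece detection in the spirit of the abstract (the `delta` scale parameter and the breakpoint convention are our assumptions, not the paper's algorithm): a rising piece ends at its running maximum once the series falls more than `delta` below it, and symmetrically for falling pieces.

```python
def monotone_pieces(x, delta):
    """Partition x into quasi-monotone pieces at scale delta and return
    the breakpoint indices: fluctuations smaller than delta are ignored,
    larger reversals end the current piece at its extremum."""
    breaks = [0]
    direction = 0              # +1 rising, -1 falling, 0 undecided
    hi = lo = x[0]
    hi_i = lo_i = 0
    for i, v in enumerate(x):
        if v > hi:
            hi, hi_i = v, i
        if v < lo:
            lo, lo_i = v, i
        if direction >= 0 and hi - v > delta:
            breaks.append(hi_i)            # rising piece ended at its maximum
            direction = -1
            hi = lo = v
            hi_i = lo_i = i
        elif direction <= 0 and v - lo > delta:
            breaks.append(lo_i)            # falling piece ended at its minimum
            direction = 1
            hi = lo = v
            hi_i = lo_i = i
    breaks.append(len(x) - 1)
    return sorted(set(breaks))
```

At scale 0.5 the dip from 3 to 2.9 in [0, 1, 2, 3, 2.9, 3.1, 0, 1] is ignored, while the fall from 3.1 to 0 starts a new piece.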
Application of Nonlinear Analysis Methods for Building a Stock Market Monitoring System
The aim of this work is to solve the problem of smoothing the laminarity measure of recurrence quantification analysis, in order to build a stock market monitoring system.
A Better Alternative to Piecewise Linear Time Series Segmentation
Time series are difficult to monitor, summarize and predict. Segmentation
organizes a time series into a few intervals with uniform characteristics
(flatness, linearity, modality, monotonicity and so on). For scalability, we
require fast linear time algorithms. The popular piecewise linear model can
determine where the data goes up or down and at what rate. Unfortunately, when
the data does not follow a linear model, the computation of the local slope
creates overfitting. We propose an adaptive time series model where the
polynomial degree of each interval varies (constant, linear, and so on). Given a
number of regressors, the cost of each interval is its polynomial degree:
constant intervals cost 1 regressor, linear intervals cost 2 regressors, and so
on. Our goal is to minimize the Euclidean (l_2) error for a given model
complexity. Experimentally, we investigate the model where intervals can be
either constant or linear. Over synthetic random walks, historical stock market
prices, and electrocardiograms, the adaptive model provides a more accurate
segmentation than the piecewise linear model without increasing the
cross-validation error or the running time, while providing a richer vocabulary
to applications. Implementation issues, such as numerical stability and
real-world performance, are discussed.
Comment: to appear in SIAM Data Mining 200
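The paper's linear-time algorithm is not spelled out in the abstract, but the model itself can be illustrated with a small (cubic-time, for clarity) dynamic program that spends a regressor budget on constant intervals (cost 1) or linear intervals (cost 2); all names are illustrative:

```python
def sq_err_const(x, i, j):
    """Squared error of the best constant fit on x[i:j]."""
    seg = x[i:j]
    m = sum(seg) / len(seg)
    return sum((v - m) ** 2 for v in seg)

def sq_err_linear(x, i, j):
    """Squared error of the best least-squares line on x[i:j]."""
    seg = x[i:j]
    n = len(seg)
    if n < 3:
        return 0.0  # a line fits one or two points exactly
    tm, ym = (n - 1) / 2.0, sum(seg) / n
    stt = sum((t - tm) ** 2 for t in range(n))
    sty = sum((t - tm) * (y - ym) for t, y in enumerate(seg))
    b = sty / stt
    return sum((y - (ym + b * (t - tm))) ** 2 for t, y in enumerate(seg))

def adaptive_segmentation(x, budget):
    """Minimize total l2 error over piecewise constant/linear intervals,
    where a constant interval costs 1 regressor and a linear one costs 2."""
    n, INF = len(x), float("inf")
    best = [[INF] * (budget + 1) for _ in range(n + 1)]
    back = [[None] * (budget + 1) for _ in range(n + 1)]
    for k in range(budget + 1):
        best[0][k] = 0.0
    for i in range(1, n + 1):
        for k in range(1, budget + 1):
            for j in range(i):
                c = best[j][k - 1] + sq_err_const(x, j, i)
                if c < best[i][k]:
                    best[i][k], back[i][k] = c, (j, k - 1, "const")
                if k >= 2:
                    c = best[j][k - 2] + sq_err_linear(x, j, i)
                    if c < best[i][k]:
                        best[i][k], back[i][k] = c, (j, k - 2, "linear")
    segments, i, k = [], n, budget
    while i > 0:
        j, k, model = back[i][k]
        segments.append((j, i, model))
        i = j
    return best[n][budget], segments[::-1]
```

On a flat run followed by a ramp, e.g. [0, 0, 0, 0, 1, 2, 3, 4] with a budget of 3 regressors, a constant piece plus a linear piece suffice to reach zero error, which a purely piecewise linear model of the same budget cannot do.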
Forecasting Essentially Non-stationary Multifactor Time Series: the Case of Investment Abroad by Russian Non-bank Corporations
This paper attempts to develop an algorithm for forecasting future values of macroeconomic indicators that takes the non-stationarity of the underlying processes into account through changes in the model structure, using the volume of investment abroad by Russian non-bank corporations as an example.
Adaptive Processing of Non-stationary Time Series Based on a Fuzzy Approach
By combining a fuzzy batch method for processing and segmenting time series with recurrent procedures for processing incoming values, an online segmentation method for multivariate time series is proposed; it is suitable for detecting homogeneous segments in real time from streaming data.
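The fuzzy method itself is not detailed in the abstract. As a generic illustration of online homogeneous-segment detection on a stream, here simple Welford running statistics stand in for the fuzzy machinery; the class name and thresholds are our own:

```python
class OnlineSegmenter:
    """Streaming homogeneous-segment detector: a new segment starts when
    an incoming value deviates from the current segment's mean by more
    than `threshold` standard deviations (Welford running statistics)."""

    def __init__(self, threshold=3.0, min_len=5):
        self.threshold, self.min_len = threshold, min_len
        self._reset()

    def _reset(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def _absorb(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def update(self, x):
        """Feed one value; return True when a segment boundary is found."""
        if self.n >= self.min_len:
            std = (self.m2 / (self.n - 1)) ** 0.5
            if abs(x - self.mean) > self.threshold * max(std, 1e-12):
                self._reset()          # x opens a fresh segment
                self._absorb(x)
                return True
        self._absorb(x)
        return False
```

Each update is O(1) in time and memory, which is what makes real-time operation on streaming data feasible.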
Development of a Categorical Model of the Life Cycle of the Activity-Selection Process
Methods for constructing categorical models of complex objects are considered. In a category-and-functor representation, performance indicators, business processes, and activities are formalized, which makes it possible to formalize the generation of a list of activities for the operational management of business processes.
Smoothening and Segmentation of ECG Signals Using Total Variation Denoising – Majorization-Minimization and Bottom-Up Approach
An ECG signal records the electrical activity of the heart. It includes information on the heart's rhythm and is useful for diagnosing heart-related diseases. The signal picks up various artifacts during acquisition and transmission, and these unwanted components obscure the clinical information it carries. This paper reduces them with a majorization-minimization approach that minimizes the total variation of the signal. The denoised signal is then segmented using a bottom-up approach. The results show a significant improvement in signal-to-noise ratio and successful segmentation of the sections of the ECG signal.
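The majorization-minimization total-variation step is not reproducible from the abstract alone, but the bottom-up segmentation stage it mentions is a standard technique; a minimal sketch with a piecewise-constant cost (our assumption; any per-segment model would do):

```python
def bottom_up_segment(x, max_err):
    """Bottom-up segmentation: start from the finest partition (one
    sample per segment) and repeatedly merge the adjacent pair whose
    merged squared error is smallest, until every remaining merge
    would exceed max_err."""
    segs = [[v] for v in x]

    def cost(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)

    while len(segs) > 1:
        costs = [cost(segs[i] + segs[i + 1]) for i in range(len(segs) - 1)]
        i = min(range(len(costs)), key=costs.__getitem__)
        if costs[i] > max_err:
            break
        segs[i:i + 2] = [segs[i] + segs[i + 1]]   # merge the cheapest pair
    return segs
```

Recomputing every pair cost per merge is quadratic; real implementations cache the pair costs, but the sketch keeps the logic visible.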