On construction, performance, and diversification for structured queries on the semantic desktop
[no abstract]
Graph Summarization
The continuous and rapid growth of highly interconnected datasets, which are
both voluminous and complex, calls for the development of adequate processing
and analytical techniques. One method for condensing and simplifying such
datasets is graph summarization. It denotes a series of application-specific
algorithms designed to transform graphs into more compact representations while
preserving structural patterns, query answers, or specific property
distributions. As this problem is common to several areas studying graph
topologies, different approaches, such as clustering, compression, sampling, or
influence detection, have been proposed, primarily based on statistical and
optimization methods. The focus of our chapter is to pinpoint the main graph
summarization methods, but especially to focus on the most recent approaches
and novel research trends on this topic, not yet covered by previous surveys.

Comment: To appear in the Encyclopedia of Big Data Technologies
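To make the idea of summarization concrete, here is a minimal sketch of one structural approach mentioned implicitly above: merging nodes that share identical neighbourhoods into supernodes, yielding a smaller quotient graph that preserves the connection pattern. The function name and grouping criterion are illustrative assumptions, not the chapter's specific algorithm.

```python
from collections import defaultdict

def summarize(edges):
    """Illustrative quotient summary: undirected nodes with identical
    neighbour sets are merged into one supernode (block)."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    # Group nodes by their (frozen) neighbour set.
    groups = defaultdict(list)
    for node, nbrs in adj.items():
        groups[frozenset(nbrs)].append(node)
    blocks = [sorted(members) for members in groups.values()]
    block_of = {n: i for i, b in enumerate(blocks) for n in b}
    # A superedge links two blocks whenever an original edge does.
    superedges = {tuple(sorted((block_of[u], block_of[v]))) for u, v in edges}
    return blocks, sorted(superedges)

# A small bipartite-like graph: a and b have the same neighbours, as do x and y,
# so four nodes and four edges compress to two supernodes and one superedge.
edges = [("a", "x"), ("b", "x"), ("a", "y"), ("b", "y")]
blocks, superedges = summarize(edges)
```

Real summarization methods refine this idea with lossy grouping, correction sets, or optimization objectives, but the compress-while-preserving-structure principle is the same.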
Symbolic Computing with Incremental Mindmaps to Manage and Mine Data Streams - Some Applications
In our understanding, a mind-map is an adaptive engine that works incrementally on the foundation of existing transactional streams. Generally, mind-maps consist of symbolic cells that are connected with each other and that become either stronger or weaker depending on the transactional stream. Following the underlying biological principle, these symbolic cells and their connections may adaptively survive or die, forming cell agglomerates of arbitrary size. In this work, we intend to demonstrate the suitability of mind-maps in diverse application scenarios, for example as an underlying management system to represent normal and abnormal traffic behaviour in computer networks, as support for detecting user behaviour within search engines, or as a hidden communication layer for natural language interaction.

Comment: 4 pages; 4 figures
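The strengthen-or-decay dynamics described above can be sketched as follows. All names, the weight-update rule, and the pruning threshold are assumptions for illustration; the paper's actual cell model may differ.

```python
class MindMap:
    """Minimal sketch of an incremental mind-map (assumed semantics):
    symbols that co-occur in a transaction strengthen their connection,
    every connection decays a little each step, and connections whose
    strength falls below a death threshold are pruned away."""

    def __init__(self, reinforce=1.0, decay=0.1, death=0.05):
        self.weights = {}  # frozenset({a, b}) -> connection strength
        self.reinforce = reinforce
        self.decay = decay
        self.death = death

    def observe(self, transaction):
        # Strengthen connections between co-occurring symbols.
        items = list(transaction)
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                key = frozenset((items[i], items[j]))
                self.weights[key] = self.weights.get(key, 0.0) + self.reinforce
        # Everything decays; weak connections die (are pruned).
        self.weights = {k: w - self.decay
                        for k, w in self.weights.items()
                        if w - self.decay > self.death}

    def cells(self):
        # Surviving symbolic cells are those with at least one live connection.
        return sorted({c for k in self.weights for c in k})

mm = MindMap()
mm.observe(["login", "search"])  # connection appears: 1.0, decays to 0.9
mm.observe(["login", "search"])  # reinforced to 1.9, decays to 1.8
```

Feeding many such transactions would let frequently co-occurring cells form strong agglomerates while rarely reinforced ones fade out, which is the adaptive survive-or-die behaviour the abstract describes.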
- …