When Things Matter: A Data-Centric View of the Internet of Things
With the recent advances in radio-frequency identification (RFID), low-cost
wireless sensor devices, and Web technologies, the Internet of Things (IoT)
approach has gained momentum in connecting everyday objects to the Internet and
facilitating machine-to-human and machine-to-machine communication with the
physical world. While IoT offers the capability to connect and integrate both
digital and physical entities, enabling a whole new class of applications and
services, several significant challenges need to be addressed before these
applications and services can be fully realized. A fundamental challenge
centers around managing IoT data, typically produced in dynamic and volatile
environments, which is not only extremely large in scale and volume, but also
noisy and continuous. This article surveys the main techniques and
state-of-the-art research efforts in IoT from data-centric perspectives,
including data stream processing, data storage models, complex event
processing, and searching in IoT. Open research issues for IoT data management
are also discussed.
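One of the abstract's core themes is processing IoT sensor streams that are continuous and noisy. As a purely illustrative sketch (not taken from the survey itself), a sliding-window average is a minimal data-stream-processing technique for smoothing such readings; the data values and window size below are invented for the example:

```python
from collections import deque

def sliding_average(stream, window=5):
    """Yield the mean of the most recent `window` readings for each new reading.

    A bounded deque keeps memory constant no matter how long the stream runs,
    which is the key constraint in stream processing.
    """
    buf = deque(maxlen=window)
    for reading in stream:
        buf.append(reading)
        yield sum(buf) / len(buf)

# A hypothetical noisy temperature stream with one spurious spike at index 2.
noisy = [10.0, 10.2, 30.0, 10.1, 9.9, 10.3]
smoothed = list(sliding_average(noisy, window=3))
```

Real IoT pipelines would layer windowing like this under complex event processing and persistent storage, which are the other data-centric concerns the survey covers.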
Integrating information and knowledge for enterprise innovation
It has been widely accepted that enterprise integration can be a source of socio-technical and cultural problems within organisations wishing to provide a focussed end-to-end business service. This can lead to the "straitjacketing" of business process architectures, thus suppressing responsive business re-engineering and competitive advantage for some companies. Accordingly, the current typology and emergent forms of Enterprise Resource Planning (ERP) and Enterprise Application Integration (EAI) technologies are set in the context of understanding information and knowledge integration philosophies. As such, key influences and trends in emerging IS integration choices, for end-to-end, cost-effective and flexible knowledge integration, are examined. As touch points across and outside organisations proliferate, via workflow- and relationship-management-driven value innovation, aspects of knowledge refinement and knowledge integration pose challenges to maximising the potential of innovation and sustainable success within enterprises, in terms of the increasing propensity for data fragmentation and the lack of effective information management in the light of information overload. Furthermore, the nature of IS mediation inherent within decision making and workflow-based business processes provides the basis for evaluating the effects of information and knowledge integration. Hence, the authors propose a conceptual, holistic evaluation framework which encompasses these ideas. It is thus argued that such trends, and their implications regarding enterprise IS integration to engender sustainable competitive advantage, require fundamental re-thinking.
Towards an aspect weaving BPEL engine
This position paper proposes the use of dynamic aspects and
the visitor design pattern to obtain a highly configurable and
extensible BPEL engine. Using these two techniques, the
core of this infrastructural software can be customised to
meet new requirements and add features such as debugging,
execution monitoring, or changing to another Web Service
selection policy. Additionally, it can easily be extended to
cope with customer-specific BPEL extensions. We propose
the use of dynamic aspects not only on the engine itself
but also on the workflow in order to tackle the problems of
Web Service hot deployment and hot fixes to long running
processes. In this way, composing a Web Service "on-the-fly"
means weaving its choreography interface into the workflow.
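The abstract names two concrete techniques: the visitor design pattern for walking the workflow, and aspects layered around the core engine. The sketch below is a hedged, language-agnostic illustration of how those two ideas compose (the activity names and the tracing "advice" are invented; a real BPEL engine would of course be far richer):

```python
class Invoke:
    """A leaf workflow activity: call one Web Service."""
    def __init__(self, service):
        self.service = service
    def accept(self, visitor):
        return visitor.visit_invoke(self)

class Sequence:
    """A composite activity: run children in order."""
    def __init__(self, *children):
        self.children = children
    def accept(self, visitor):
        return visitor.visit_sequence(self)

class Executor:
    """Core engine as a visitor over the workflow tree."""
    def visit_invoke(self, node):
        return [f"invoke {node.service}"]
    def visit_sequence(self, node):
        out = []
        for child in node.children:
            out.extend(child.accept(self))
        return out

class TracingExecutor(Executor):
    """An 'aspect' in miniature: monitoring advice wrapped around every
    invocation, added without modifying the core Executor."""
    def visit_invoke(self, node):
        return ([f"before {node.service}"]
                + super().visit_invoke(node)
                + [f"after {node.service}"])

# Hypothetical two-step workflow.
wf = Sequence(Invoke("Quote"), Invoke("Ship"))
trace = wf.accept(TracingExecutor())
```

Because the engine is a visitor, features like debugging, execution monitoring, or an alternative service-selection policy become new visitor subclasses rather than changes to the engine core, which is the extensibility argument the paper makes.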
A planning approach to the automated synthesis of template-based process models
The design-time specification of flexible processes can be time-consuming and error-prone, due to the high number of tasks involved and their context-dependent nature. Such processes frequently suffer from potential interference among their constituents, since resources are usually shared by the process participants and it is difficult to foresee all the potential task interactions in advance. Concurrent tasks may not be independent from each other (e.g., they could operate on the same data at the same time), resulting in incorrect outcomes. To tackle these issues, we propose an approach for the automated synthesis of a library of template-based process models that achieve goals in dynamic and partially specified environments. The approach is based on a declarative problem definition and partial-order planning algorithms for template generation. The resulting templates guarantee sound concurrency in the execution of their activities and are reusable in a variety of partially specified contextual environments. As a running example, a disaster response scenario is given. The approach is backed by a formal model and has been tested in experiments.
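The interference the abstract describes, concurrent tasks touching the same data, is classically detected by comparing read and write sets. The sketch below shows that standard check only as an illustration of the idea; the paper's partial-order planner is more elaborate, and the disaster-response task names here are invented:

```python
def independent(t1, t2):
    """Two tasks may run concurrently only if neither writes data the
    other reads or writes (a standard interference check; the paper's
    actual soundness analysis is richer than this)."""
    return (t1["writes"].isdisjoint(t2["reads"] | t2["writes"])
            and t2["writes"].isdisjoint(t1["reads"]))

# Hypothetical disaster-response tasks with their data footprints.
assess     = {"reads": {"map"}, "writes": {"damage_report"}}
rescue     = {"reads": {"map"}, "writes": {"victim_log"}}
update_map = {"reads": set(),   "writes": {"map"}}
```

Here `assess` and `rescue` only share read access to `map`, so a planner could order them concurrently, whereas `update_map` writes `map` while `assess` reads it, forcing an ordering constraint between them.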
Causal Consistency: Beyond Memory
In distributed systems, where strong consistency is costly if not
impossible, causal consistency provides a valuable abstraction to represent
program executions as partial orders. In addition to the sequential program
order of each computing entity, causal order also contains the semantic links
between the events that affect the shared objects -- message emission and
reception in a communication channel, reads and writes on a shared register.
Usual approaches based on semantic links are very difficult to adapt to other
data types such as queues or counters because they require a specific analysis
of causal dependencies for each data type. This paper presents a new approach
to define causal consistency for any abstract data type based on sequential
specifications. It explores, formalizes and studies the differences between
three variations of causal consistency and highlights them in the light of
PRAM, eventual consistency and sequential consistency: weak causal consistency,
that captures the notion of causality preservation when focusing on
convergence; causal convergence, that mixes weak causal consistency and
convergence; and causal consistency, that coincides with causal memory when
applied to shared memory.
Comment: 21st ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, Mar 2016, Barcelona, Spain.
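The causal order the abstract builds on (program order plus the semantic links between events) is commonly tracked at runtime with vector clocks. The sketch below is a standard textbook illustration of that mechanism, not the paper's own formalism, which works at the level of abstract data type specifications:

```python
def new_clock(n):
    """Vector clock for a system of n processes: one counter per process."""
    return [0] * n

def local_event(clock, i):
    """Process i performs a local event (e.g., a write): tick entry i."""
    clock = clock[:]
    clock[i] += 1
    return clock

def on_receive(clock, received, i):
    """Process i receives a message: pointwise max of the two clocks
    (absorbing the sender's causal past), then tick entry i."""
    clock = [max(a, b) for a, b in zip(clock, received)]
    clock[i] += 1
    return clock

def happened_before(a, b):
    """Causal (partial) order: a precedes b iff a <= b componentwise, a != b."""
    return all(x <= y for x, y in zip(a, b)) and a != b

# Process 0 performs an event and sends its clock; process 1 receives it.
c0 = local_event(new_clock(2), 0)
c1 = on_receive(new_clock(2), c0, 1)
```

Two events whose clocks are incomparable under `happened_before` are concurrent, which is exactly the partial-order view of executions that causal consistency exposes.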
JuxtaLearn D3.2 Performance Framework
This deliverable, D3.2, for Work Package 3 incorporates the pedagogy from WP2 and the orchestration factors mapped in D3.1, and reviews aspects of performance in the context of participative video making. It reviews the literature on curiosity and on the engagement characteristics of interaction mechanisms for public displays, and anticipates requirements for social network analysis of relevant public videos from WP6 task 6.3. To support JuxtaLearn performance, it proposes a reflective performance framework that encompasses the material environment and objects required, the participants, and the knowledge needed.