IDMoB: IoT Data Marketplace on Blockchain
Today, Internet of Things (IoT) devices are the powerhouse of data generation
with their ever-increasing numbers and widespread penetration. Similarly,
artificial intelligence (AI) and machine learning (ML) solutions are being
integrated into all kinds of services, making products significantly
"smarter". The centerpiece of these technologies is "data". IoT device vendors
should be able to keep up with the increased throughput and come up with new
business models. On the other hand, AI/ML solutions will produce better results
if training data is diverse and plentiful.
In this paper, we propose a blockchain-based, decentralized and trustless
data marketplace where IoT device vendors and AI/ML solution providers may
interact and collaborate. By facilitating a transparent data exchange platform,
access to consented data will be democratized and the variety of services
targeting end-users will increase. The proposed data marketplace is implemented
as a smart contract on the Ethereum blockchain, and Swarm is used as the
distributed storage platform.
Comment: Presented at Crypto Valley Conference on Blockchain Technology (CVCBT 2018), 20-22 June 2018 - published version may differ
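The listing-and-purchase flow the abstract describes can be sketched as follows. This is a hypothetical, heavily simplified illustration in Python: the actual system is an Ethereum smart contract with Swarm as storage, and all names, fields, and the `swarm://` reference format here are illustrative assumptions, not the paper's API.

```python
# Hypothetical sketch of a data-marketplace flow: vendors list datasets
# (referenced by a content hash, as distributed storage such as Swarm would
# return) and AI/ML providers purchase access. Illustrative only; the real
# marketplace is a smart contract, not Python.

class DataMarketplace:
    def __init__(self):
        self.listings = {}   # listing_id -> {vendor, data_hash, price}
        self.balances = {}   # vendor -> accumulated proceeds
        self.access = set()  # (buyer, listing_id) pairs granted access
        self._next_id = 0

    def list_dataset(self, vendor, data_hash, price):
        """Vendor publishes a dataset reference (e.g. a content hash)."""
        listing_id = self._next_id
        self._next_id += 1
        self.listings[listing_id] = {
            "vendor": vendor, "data_hash": data_hash, "price": price}
        return listing_id

    def purchase(self, buyer, listing_id, payment):
        """Buyer pays the asking price; vendor is credited, access granted."""
        listing = self.listings[listing_id]
        if payment < listing["price"]:
            raise ValueError("insufficient payment")
        vendor = listing["vendor"]
        self.balances[vendor] = self.balances.get(vendor, 0) + payment
        self.access.add((buyer, listing_id))
        return listing["data_hash"]  # pointer into distributed storage

market = DataMarketplace()
lid = market.list_dataset("vendor-A", "swarm://0xabc", price=10)
data_ref = market.purchase("ml-provider-B", lid, payment=10)
```

In the on-chain version, the payment and balance bookkeeping would be enforced by the contract itself, while only the content hash (not the data) lives on the blockchain.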
Context-Awareness Enhances 5G Multi-Access Edge Computing Reliability
The fifth generation (5G) mobile telecommunication network is expected to
support Multi-Access Edge Computing (MEC), which intends to distribute
computation tasks and services from the central cloud to the edge clouds.
Towards ultra-responsive, ultra-reliable and ultra-low-latency MEC services,
the current mobile network security architecture should enable a more
decentralized approach for authentication and authorization processes. This
paper proposes a novel decentralized authentication architecture that supports
flexible and low-cost local authentication with the awareness of context
information of network elements such as user equipment and virtual network
functions. Based on a Markov model for backhaul link quality, as well as a
random walk mobility model with mixed mobility classes and traffic scenarios,
numerical simulations have demonstrated that the proposed approach is able to
achieve a flexible balance between the network operating cost and the MEC
reliability.
Comment: Accepted by IEEE Access on Feb. 02, 201
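The backhaul-link Markov model mentioned in the abstract can be illustrated with a minimal two-state (GOOD/BAD) chain. The transition probabilities below are assumptions chosen for demonstration, not values from the paper.

```python
import random

# Illustrative two-state Markov model of backhaul link quality.
# Probabilities are demonstration values, not taken from the paper.
P_GOOD_TO_BAD = 0.1  # probability a good link degrades in one step
P_BAD_TO_GOOD = 0.4  # probability a bad link recovers in one step

def simulate_link(steps, seed=0):
    """Return the fraction of steps the backhaul link spends in GOOD."""
    rng = random.Random(seed)
    good = True
    good_steps = 0
    for _ in range(steps):
        if good:
            good_steps += 1
            if rng.random() < P_GOOD_TO_BAD:
                good = False
        elif rng.random() < P_BAD_TO_GOOD:
            good = True
    return good_steps / steps

# Stationary availability of the two-state chain:
#   pi_good = p_recover / (p_fail + p_recover)
stationary_good = P_BAD_TO_GOOD / (P_GOOD_TO_BAD + P_BAD_TO_GOOD)
```

With these numbers the link is available about 80% of the time in steady state, which is the kind of quantity such a model lets the authentication scheme reason about.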
Microservice Transition and its Granularity Problem: A Systematic Mapping Study
Microservices have gained wide recognition and acceptance in software
industries as an emerging architectural style for autonomic, scalable, and more
reliable computing. The transition to microservices has been highly motivated
by the need for better alignment of technical design decisions with improving
value potentials of architectures. Despite microservices' popularity, research
still lacks a disciplined understanding of the transition and consensus on the
principles and activities underlying "micro-ing" architectures. In this paper,
we report on a systematic mapping study that consolidates various views,
approaches and activities that commonly assist in the transition to
microservices. The study aims to provide a better understanding of the
transition; it also contributes a working definition of the transition and
technical activities underlying it. We term the transition and technical
activities leading to microservice architectures as microservitization. We then
shed light on a fundamental problem of microservitization: microservice
granularity and reasoning about its adaptation as first-class entities. This
study reviews state-of-the-art and -practice related to reasoning about
microservice granularity; it reviews modelling approaches, aspects considered,
guidelines and processes used to reason about microservice granularity. This
study identifies opportunities for future research and development related to
reasoning about microservice granularity.
Comment: 36 pages including references, 6 figures, and 3 tables
Toward a framework for data quality in cloud-based health information system
Cloud computing is a promising platform for health information systems in order to reduce costs and improve accessibility. Cloud computing represents a shift away from computing being purchased as a product to being a service delivered over the Internet to customers. The cloud computing paradigm is becoming one of the popular IT infrastructures for facilitating Electronic Health Record (EHR) integration and sharing. An EHR is defined as a repository of patient data in digital form. This record is stored and exchanged securely and is accessible to different levels of authorized users. Its key purpose is to support the continuity of care and to allow the exchange and integration of medical information for a patient. However, this would not be achieved without ensuring the quality of the data populated in healthcare clouds, as data quality can have a great impact on the overall effectiveness of any system. Assuring the quality of data used in healthcare systems is a pressing need for supporting the continuity and quality of care. Identifying data quality dimensions in healthcare clouds is a challenging issue, as the data quality of cloud-based health information systems raises issues such as appropriateness of use and provenance. Some research has proposed frameworks of data quality dimensions without taking into consideration the nature of cloud-based healthcare systems. In this paper, we propose an initial framework that fits these data quality attributes. The framework reflects the main elements of cloud-based healthcare systems and the functionality of EHRs.
A look at cloud architecture interoperability through standards
Enabling cloud infrastructures to evolve into a transparent platform while preserving integrity raises interoperability issues. How components are connected needs to be addressed. Interoperability requires standard data models and communication encoding technologies compatible with the existing Internet infrastructure. To reduce vendor lock-in, cloud computing must implement universal strategies regarding standards, interoperability and portability. Open standards are of critical importance and need to be embedded into interoperability solutions. Interoperability is determined at the data level as well as the service level. Corresponding modelling standards and integration solutions shall be analysed.