102 research outputs found
Bringing pervasive embedded networks to the service cloud: a lightweight middleware approach
The emergence of novel pervasive networks consisting of tiny embedded nodes has reduced the gap between the real and virtual worlds. This paradigm has opened the Service Cloud to a variety of wireless devices, especially those with sensing and actuating capabilities. These pervasive networks help build new context-aware applications that interpret the state of the physical world in real time. However, traditional Service-Oriented Architectures (SOA), widely used on the current Internet, are too heavyweight for such resource-constrained devices. In this paper, an internetworking approach is proposed to address this issue. The core of our proposal is the Knowledge-Aware and Service-Oriented (KASO) Middleware, designed for pervasive embedded networks. KASO Middleware implements a range of mechanisms, services and protocols that enable developers and business-process designers to deploy, expose, discover, compose, and orchestrate real-world services (i.e. services running on sensor/actuator devices). Moreover, KASO Middleware implements endpoints that offer those services to the Cloud in a RESTful manner. Our internetworking approach has been validated through a real healthcare telemonitoring system deployed in a sanatorium. The validation tests show that KASO Middleware successfully brings pervasive embedded networks to the Service Cloud.
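The expose/discover roles the abstract attributes to KASO Middleware can be pictured with a minimal sketch. Everything here is an illustrative assumption, not the actual KASO API: the registry class, the REST-style paths, and the sample sensor handlers are all invented.

```python
# Hypothetical sketch of a middleware registry that exposes "real-world
# services" (sensor/actuator endpoints) in a REST-like manner.
class ServiceRegistry:
    def __init__(self):
        self._services = {}          # path -> callable returning a reading

    def expose(self, path, handler):
        """Register a real-world service under a REST-style path."""
        self._services[path] = handler

    def discover(self, prefix="/"):
        """Return the paths of all services under a prefix."""
        return sorted(p for p in self._services if p.startswith(prefix))

    def get(self, path):
        """Invoke the service, mimicking an HTTP GET on the resource."""
        if path not in self._services:
            return 404, None
        return 200, self._services[path]()

registry = ServiceRegistry()
registry.expose("/ward1/room3/temperature", lambda: {"celsius": 36.8})
registry.expose("/ward1/room3/heart-rate", lambda: {"bpm": 72})

print(registry.discover("/ward1"))
status, body = registry.get("/ward1/room3/temperature")
print(status, body)
```

A real deployment would put an HTTP/CoAP server in front of such a registry; the point here is only the uniform resource-style addressing of device services.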
Using HDF5 as best practices for a business intelligence initiative
Internship report presented as a partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence. The goal of this report is to describe the internship carried out by the student Tiago Pombeiro at the company Sysvalue. The internship is a partial requirement for obtaining the Master's degree in Information Management with a specialization in Knowledge Management and Business Intelligence.
The work consisted of developing a component of the product Lighthouse, a continuous-monitoring-as-a-service platform developed and commercialized by Sysvalue. More specifically, the component in question is a system for capturing, storing, providing and presenting the performance data produced by the services and infrastructures monitored by the platform, demonstrating the advantages of the Hierarchical Data Format 5 (HDF5) data format. The component was developed as a business intelligence initiative so that the good practices of that field could be applied.
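The kind of performance-data storage described above could look roughly like the following h5py sketch. The group layout, dataset names, and metric values are invented for illustration and are not Sysvalue's actual schema; the sketch only shows HDF5's hierarchical, self-describing, appendable organisation.

```python
# Illustrative sketch: per-service performance samples stored hierarchically
# in HDF5, one group per monitored service, one resizable dataset per metric.
import h5py
import numpy as np

with h5py.File("perf.h5", "w") as f:
    grp = f.create_group("services/web-frontend")
    # Unlimited first dimension so new samples can be appended over time.
    ds = grp.create_dataset("latency_ms", shape=(0,), maxshape=(None,),
                            dtype="f4", chunks=True)
    samples = np.array([12.5, 9.8, 14.1], dtype="f4")
    ds.resize((len(samples),))
    ds[:] = samples
    ds.attrs["unit"] = "milliseconds"   # self-describing metadata

with h5py.File("perf.h5", "r") as f:
    ds = f["services/web-frontend/latency_ms"]
    print(ds.attrs["unit"], list(ds[:]))
```

Chunked, resizable datasets are what make HDF5 attractive for continuously captured monitoring data: new samples append cheaply, while attributes keep units and context next to the numbers.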
This report starts with an introduction to the context and goals surrounding the internship, followed by a detailed description of the background, context and objectives of the internship itself.
Next, the literature review is presented, focused on the development of business intelligence projects as well as on HDF5 concepts and purposes, all subjects relevant to the practical work of the internship.
The methodology is then described, from which the tasks and schedule were defined. Subsequently, the results of the tasks are presented, followed by a critical assessment of them. Finally, possible future work that could follow this project is presented, along with a pragmatic reflection on the internship.
Using Blockchain to support Data & Service Monetization
Two required features of a data monetization platform are query and retrieval of the metadata of the resources to be monetized. Centralized platforms rely on the maturity of traditional NoSQL database systems to support these features: databases such as MongoDB allow very efficient query and retrieval of the data they store. However, centralized platforms come with a host of security and privacy concerns, making them far from ideal for a data monetization platform. On the other hand, most existing decentralized platforms are only partially decentralized. In this research, I developed Cowry, a platform for publishing metadata describing available resources (data or services) and for discovering published metadata, including fast search and filtering. My main contribution is a fully decentralized architecture that combines a blockchain with a traditional distributed database to gain additional features such as efficient query and retrieval of metadata stored on the blockchain.
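The combination the abstract describes, immutable records plus a fast local index, can be sketched in a few lines. This is a rough stand-in with invented names, not Cowry's API: content-addressing via SHA-256 plays the role of the blockchain record id, and a keyword index provides the search and filtering a chain alone cannot.

```python
# Toy metadata store: content-addressed records (blockchain stand-in) plus
# an inverted index for fast tag-based search and filtering.
import hashlib
import json

class MetadataStore:
    def __init__(self):
        self.chain = {}     # record hash -> immutable metadata record
        self.index = {}     # tag -> set of record hashes

    def publish(self, record):
        blob = json.dumps(record, sort_keys=True).encode()
        rid = hashlib.sha256(blob).hexdigest()
        self.chain[rid] = record
        for tag in record.get("tags", []):
            self.index.setdefault(tag, set()).add(rid)
        return rid

    def search(self, *tags):
        """Return records carrying all the given tags."""
        hits = None
        for t in tags:
            ids = self.index.get(t, set())
            hits = ids if hits is None else hits & ids
        return [self.chain[r] for r in sorted(hits or set())]

store = MetadataStore()
store.publish({"name": "air-quality-feed", "tags": ["sensor", "air"]})
store.publish({"name": "traffic-api", "tags": ["service", "traffic"]})
print(store.search("sensor"))
```

In a fully decentralized design each node would maintain such an index locally over the replicated records, which is what makes query latency independent of chain length.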
Process-aware web programming with Jolie
We extend the Jolie programming language to capture the native modelling of
process-aware web information systems, i.e., web information systems based upon
the execution of business processes. Our main contribution is to offer a
unifying approach for the programming of distributed architectures on the web,
which can capture web servers, stateful process execution, and the composition
of services via mediation. We discuss applications of this approach through a
series of examples that cover, e.g., static content serving, multiparty
sessions, and the evolution of web systems. Finally, we present a performance
evaluation that includes a comparison of Jolie-based web systems to other
frameworks and a measurement of its scalability.
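Jolie correlates incoming messages to running business-process instances via correlation sets. The following is a loose Python analogue of that idea, with invented names throughout: each order id correlates requests to one stateful process instance, which is roughly what "stateful process execution" on the web means here.

```python
# Sketch of correlation-based stateful process execution (Jolie analogue).
class OrderProcess:
    """One business-process instance: created -> paid -> shipped."""
    def __init__(self):
        self.state = "created"

    def advance(self, event):
        allowed = {"created": "paid", "paid": "shipped"}
        if allowed.get(self.state) == event:
            self.state = event
        return self.state

class ProcessEngine:
    def __init__(self):
        self.sessions = {}   # correlation key -> process instance

    def handle(self, order_id, event):
        # The order id acts as the correlation set: it routes each message
        # to its own long-running process instance.
        proc = self.sessions.setdefault(order_id, OrderProcess())
        return proc.advance(event)

engine = ProcessEngine()
engine.handle("A-1", "paid")        # starts and advances instance A-1
engine.handle("B-7", "shipped")     # rejected: B-7 is still "created"
print(engine.handle("A-1", "shipped"))
```

In Jolie itself this routing is declarative (a `cset` declaration) rather than hand-coded, which is precisely the language-level support the paper argues for.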
BabeLO : an extensible converter of programming exercises formats
In the last two decades there has been a proliferation of programming exercise formats, which hinders interoperability in automatic assessment. In the absence of a widely accepted standard, a pragmatic solution is to convert content among the existing formats. BabeLO is a programming exercise converter providing services to a network of heterogeneous e-learning systems such as contest management systems, programming exercise authoring tools, evaluation engines and repositories of learning objects. Its main feature is the use of a pivotal format to achieve greater extensibility: supporting a new format only requires conversion to and from the pivotal format. This paper starts with an analysis of programming exercise formats representative of the existing diversity. This analysis sets the context for the proposed approach to exercise conversion and for the description of the pivotal data format. The abstract service definition is the basis for the design of BabeLO, its components and its web service interface. The paper includes a report on the use of BabeLO in two concrete scenarios: relocating exercises to a different repository, and using an evaluation engine in a network of heterogeneous systems.
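The pivotal-format idea can be made concrete with a tiny sketch. The format names and encodings below are invented, not BabeLO's actual formats; the point is the shape of the design: each format implements only `to_pivot`/`from_pivot`, so supporting n formats needs 2n converters instead of n·(n-1) pairwise ones.

```python
# Pivotal-format conversion sketch: every format converts to/from one
# shared intermediate representation (here, a plain dict).
import json

class Format:
    def to_pivot(self, text): raise NotImplementedError
    def from_pivot(self, pivot): raise NotImplementedError

class CsvLike(Format):
    """Invented 'key=value,key=value' exercise encoding."""
    def to_pivot(self, text):
        return dict(item.split("=") for item in text.split(","))
    def from_pivot(self, pivot):
        return ",".join(f"{k}={v}" for k, v in sorted(pivot.items()))

class JsonLike(Format):
    """Invented JSON exercise encoding."""
    def to_pivot(self, text):
        return json.loads(text)
    def from_pivot(self, pivot):
        return json.dumps(pivot, sort_keys=True)

def convert(text, src, dst):
    # Any-to-any conversion always passes through the pivotal format.
    return dst.from_pivot(src.to_pivot(text))

print(convert("title=Loops,points=10", CsvLike(), JsonLike()))
```

Adding a new exercise format to this scheme touches exactly one class, which is the extensibility claim the paper makes for BabeLO.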
Virtual Resources & Internet of Things
Internet of Things (IoT) systems mostly follow a Cloud-centric approach, benefiting from the extensive computational capabilities and flexibility of the Cloud. Although Cloud-centric systems support virtualization of components to interact with IoT networks, many of them introduce high latency and restrict direct access to IoT devices. Fog computing has been presented as an alternative that reduces latency when engaging IoT networks; however, new forms of virtualization are required to access physical devices directly.
This research introduces a definition of Virtual Resources to enable direct access to IoT networks and to allow richer interactions between applications and IoT components. Additionally, this work proposes Virtual Resources as a mechanism to handle the multi-tenancy challenge that emerges when more than one tenant tries to access and manipulate an IoT component simultaneously. Virtual Resources are implemented in the Go language using the CoAP protocol, and this work proposes a permissioned blockchain to provision Virtual Resources directly on IoT devices. Seven experiments were conducted using Raspberry Pi computers and Edison Arduino boards to test the definition of Virtual Resources presented in this work. The results demonstrate that Virtual Resources can be deployed across different IoT platforms, and that Virtual Resources together with blockchain can support multi-tenancy in the IoT space. IBM Bluemix Blockchain as a Service and the Multichain blockchain were evaluated for provisioning Virtual Resources in the IoT network; the results show that a permissioned blockchain can store the configurations of Virtual Resources and provision them in the IoT network.
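The multi-tenancy problem the abstract describes, two tenants manipulating one physical component at once, can be illustrated with a small sketch. This is Python rather than the work's actual Go/CoAP implementation, and the class, device id and tenant names are invented; it shows only the serialisation role a virtual resource plays in front of a device.

```python
# Illustrative virtual resource: wraps one physical actuator and serialises
# conflicting tenant commands so only one tenant manipulates it at a time.
import threading

class VirtualResource:
    def __init__(self, device_id):
        self.device_id = device_id
        self._lock = threading.Lock()
        self.log = []               # audit trail of applied commands
        self.state = "off"

    def actuate(self, tenant, command):
        with self._lock:            # one tenant at a time on the device
            self.state = command
            self.log.append((tenant, command))
            return self.state

vr = VirtualResource("rpi-heater-01")
threads = [threading.Thread(target=vr.actuate, args=(t, cmd))
           for t, cmd in [("clinic-a", "on"), ("clinic-b", "off")]]
for th in threads:
    th.start()
for th in threads:
    th.join()
print(len(vr.log), vr.state)
```

In the thesis the provisioning of such wrappers (who may create one, on which device) is what the permissioned blockchain records, rather than the per-command locking shown here.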
Model Driven Development of Sensor Web Networks Architecture
The use of Internet protocols in resource-constrained devices is contributing to radical changes in the Internet and to the emergence of an entirely new concept called the Internet of Things (IoT), with the Sensor Web (SW) node as one of its basic building blocks. The SW node is the elementary "resource" in the SW network, which by its nature can be seen as an unstructured collection of building blocks that can be dynamically orchestrated into virtual clusters, i.e., into an architecture. The aim of this thesis is to improve the process of developing system architectures based on SW networks, relying on dynamic generation of the service layer in order to increase productivity and sustainability and to reduce development costs.
The improvement of the architecture development process includes the analysis, integration and adaptation of existing systems and sensor network architecture design approaches, as well as of systems based on IoT concepts. For this purpose, the architecture of the SW network is defined, and a domain-specific language, an interactive graphical editor and a tool for automatic transformation of models into implementation classes have been created. As part of the dissertation, experimental verification of the proposed model and development environment was carried out, demonstrating their practical applicability.
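The model-to-implementation transformation step can be pictured with a deliberately tiny example. The metamodel, node name and attributes below are invented; the thesis's DSL tooling does this at much larger scale, but the core move, turning a declarative model into an implementation class, is the same.

```python
# Minimal model-driven code generation: a declarative SW-node model is
# transformed into a Python class definition and compiled.
MODEL = {
    "node": "TemperatureNode",
    "attributes": [("sample_rate_hz", 1), ("unit", "'celsius'")],
}

def generate(model):
    lines = [f"class {model['node']}:", "    def __init__(self):"]
    for name, default in model["attributes"]:
        lines.append(f"        self.{name} = {default}")
    return "\n".join(lines)

code = generate(MODEL)
namespace = {}
exec(code, namespace)                 # compile the generated class
node = namespace["TemperatureNode"]()
print(node.sample_rate_hz, node.unit)
```

Generating the service layer from models this way is what buys the productivity and maintainability gains the thesis targets: the model, not the handwritten class, is the source of truth.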
Security Management Framework for the Internet of Things
The increase in the design and development of wireless communication technologies
offers multiple opportunities for the management and control of cyber-physical systems
with connections between smart and autonomous devices, which deliver simplified data
through the use of cloud computing. From this relationship with the Internet of Things
(IoT) emerged the concept of pervasive computing, which allows any object to communicate
with services, sensors, people, and objects without human intervention. However, the
rapid growth of connectivity with smart applications through autonomous systems
connected to the Internet has exposed numerous vulnerabilities in IoT systems to
malicious users.
This dissertation developed a novel ontology-based cybersecurity framework to
improve security in IoT systems, using ontological analysis to adapt appropriate
security services to the identified threats. The proposal explores two approaches:
(1) design time, which offers a dynamic method to build security services through
the application of a model-driven methodology that takes existing business processes
into account; and (2) execution time, which involves monitoring the IoT environment,
classifying vulnerabilities and threats, and acting on the environment to ensure the
correct adaptation of the existing services.
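The execution-time half, classify an observed event, then adapt a security service, can be sketched as a lookup chain. The real framework is ontology-driven; the flat rule tables, event names and mitigations below are invented placeholders for that reasoning.

```python
# Toy stand-in for run-time adaptation: observed events are classified into
# threat classes, each mapped to the security service adaptation to apply.
THREAT_RULES = {
    "auth_failure_burst": "brute_force",
    "unsigned_firmware":  "tampering",
    "plaintext_channel":  "eavesdropping",
}
MITIGATIONS = {
    "brute_force":   "enable rate limiting and account lockout",
    "tampering":     "enforce signed firmware updates",
    "eavesdropping": "switch channel to DTLS",
}

def adapt(event):
    threat = THREAT_RULES.get(event)
    if threat is None:
        # Unknown events are not dropped; they feed back into monitoring.
        return ("unclassified", "forward to monitoring for analysis")
    return (threat, MITIGATIONS[threat])

print(adapt("plaintext_channel"))
```

An ontology replaces these hard-coded tables with inferable class hierarchies, so a new threat subclass automatically inherits applicable mitigations, which is what makes the adaptation "dynamic" in the dissertation's sense.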
Two validation approaches were used to demonstrate the feasibility of implementing
the proposed cybersecurity framework: a qualitative evaluation of the ontology based
on the analysis of several criteria, and a proof of concept implemented and tested in
specific industrial scenarios. The dissertation was validated by adopting a methodology
accepted in the research community, through technical validation of the concept in an
industrial setting.
On-premise containerized, light-weight software solutions for Biomedicine
Bioinformatics software systems are critical tools for analysing large-scale biological
data, but their design and implementation can be challenging due to the need for reliability, scalability, and performance. This thesis investigates the impact of several
software approaches on the design and implementation of bioinformatics software
systems. These approaches include software patterns, microservices, distributed
computing, containerisation and container orchestration. The research focuses on
understanding how these techniques affect bioinformatics software systems’ reliability, scalability, performance, and efficiency. Furthermore, this research highlights
the challenges and considerations involved in their implementation. This study also
examines potential solutions for implementing container orchestration in bioinformatics research teams with limited resources and the challenges of using container
orchestration. Additionally, the thesis considers microservices and distributed computing and how these can be optimised in the design and implementation process to
enhance the productivity and performance of bioinformatics software systems. The
research was conducted using a combination of software development, experimentation, and evaluation. The results show that implementing software patterns can
significantly improve the code accessibility and structure of bioinformatics software
systems. Microservices and containerisation likewise enhanced system reliability, scalability, and performance. Additionally, the study indicates that adopting
advanced software engineering practices, such as model-driven design and container
orchestration, can facilitate efficient and productive deployment and management of
bioinformatics software systems, even for researchers with limited resources. Overall, we developed a software system integrating all our findings; the proposed system
demonstrated the ability to address challenges in bioinformatics. The thesis makes
several key contributions in addressing the research questions surrounding the design,
implementation, and optimisation of bioinformatics software systems using software
patterns, microservices, containerisation, and advanced software engineering principles and practices. Our findings suggest that incorporating these technologies can
significantly improve the reliability, scalability, performance, efficiency, and productivity of bioinformatics software systems.
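One of the software patterns the study credits with improving code structure can be shown concretely: the pipeline pattern, where each analysis step is a small, testable, composable stage. The stages below (sequence parsing, length filtering, GC content) are generic bioinformatics placeholders, not the thesis's actual system.

```python
# Pipeline pattern sketch for a bioinformatics analysis: each stage is a
# pure function; pipeline() composes them left to right.
from functools import reduce

def read_sequences(raw):
    """Parse raw text into a list of normalised sequences."""
    return [s.strip().upper() for s in raw.splitlines() if s.strip()]

def filter_min_length(n):
    """Stage factory: keep only sequences of at least n bases."""
    return lambda seqs: [s for s in seqs if len(s) >= n]

def gc_content(seqs):
    """Fraction of G/C bases per sequence, rounded to 2 decimals."""
    return [round((s.count("G") + s.count("C")) / len(s), 2) for s in seqs]

def pipeline(*stages):
    return lambda data: reduce(lambda d, stage: stage(d), stages, data)

analyse = pipeline(read_sequences, filter_min_length(4), gc_content)
print(analyse("acgt\ngg\nGGCC\n"))
```

Because each stage is independent, stages map naturally onto the microservices and containers the thesis evaluates: a stage can be scaled or replaced without touching the rest of the pipeline.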
A Semantic Interoperability Model Based on the IEEE 1451 Family of Standards Applied to the Industry 4.0
The Internet of Things (IoT) has been growing recently. It is a concept for connecting
billions of smart devices through the Internet in different scenarios. One area being
developed inside the IoT is industrial automation, which covers Machine-to-Machine
(M2M) and industrial communications with automated processes, giving rise to the
Industrial Internet of Things (IIoT) concept. Within the IIoT, the concept of
Industry 4.0 (I4.0) is developing. It represents the fourth industrial revolution and
addresses the use of Internet technologies to improve the production efficiency of
intelligent services in smart factories. I4.0 combines objects from the physical and
digital worlds to offer dedicated functionality and flexibility inside and outside an
I4.0 network.
I4.0 is composed mainly of Cyber-Physical Systems (CPS). A CPS is the integration
of the physical world with its digital counterpart, the Digital Twin (DT). It is
responsible for realising intelligent cross-linked applications, which operate in a
self-organised and decentralised manner, and is used by smart factories for value
creation. Applying CPS to manufacturing production gives rise to the Cyber-Physical
Production System (CPPS) concept. A CPPS is the implementation of Industry 4.0 and
CPS in manufacturing and production, crossing all levels of production between
autonomous and cooperative elements and sub-systems. It connects the virtual space
with the physical world, allowing smart factories to become more intelligent, resulting
in better and smarter production conditions and increased productivity, production
efficiency, and product quality. The big issue is connecting smart devices that use
different standards and protocols: about 40% of the benefits of the IoT cannot be
achieved without interoperability. This thesis focuses on promoting the interoperability
of smart devices (sensors and actuators) inside the IIoT under the I4.0 context.
IEEE 1451 is a family of standards developed to manage transducers. It reaches the
syntactic level of interoperability inside Industry 4.0; however, Industry 4.0 requires
a semantic level of communication so that data are not exchanged ambiguously. This
thesis proposes a new semantic layer that allows the IEEE 1451 family of standards to
serve as a complete framework for communication inside Industry 4.0, providing an
interoperable network interface through which users and applications can collect and
share data from the industrial field.
This thesis was developed at the Measurement and Instrumentation Laboratory (IML)
at the University of Beira Interior and supported by the Portuguese project INDTECH
4.0 – Novas tecnologias para fabricação (new technologies for manufacturing), whose
general objective is the design and development of innovative technologies in the
context of Industry 4.0/Factories of the Future (FoF), under number
POCI-01-0247-FEDER-026653.
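The role of a semantic layer on top of IEEE 1451 can be sketched as follows. The vocabulary, predicate names and URIs below are invented placeholders, not the thesis's actual ontology: the sketch only shows how a raw reading plus TEDS-style metadata (Transducer Electronic Data Sheets, the self-description mechanism of IEEE 1451) becomes unambiguous subject-predicate-object statements.

```python
# Hypothetical semantic-layer sketch: a transducer reading annotated with
# TEDS-like metadata is lifted into explicit semantic triples, so consumers
# need no out-of-band agreement on what the number means.
TEDS = {
    "transducer_id": "urn:dev:temp-042",
    "quantity": "Temperature",
    "unit": "degree Celsius",
}

def to_triples(teds, value):
    subject = teds["transducer_id"]
    return [
        (subject, "observesProperty", teds["quantity"]),
        (subject, "hasUnit", teds["unit"]),
        (subject, "hasValue", value),
    ]

for triple in to_triples(TEDS, 21.5):
    print(triple)
```

At the syntactic level two systems agree only on message structure; triples like these are what the semantic level adds, so that "21.5" is never mistaken for a pressure or a Fahrenheit reading.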