Designing a Thrifty Approach for SME Business Continuity: Practices for Transparency of the Design Process
Business continuity (BC) management is an organizational approach to preparing information systems (IS) for incidents, but such approaches are uncommon among small and medium-sized enterprises (SMEs). Past research has indicated a gap in approaches designed for SMEs, since BC management approaches tend to originate from larger organizations and SMEs lack the resources to implement them. To fill this gap, and to respond to a practical need of an IT consultancy company, we employed design science research (DSR) to develop a BC approach for SMEs, coined the thrifty BC management approach. Jointly with the company’s practitioners, we developed a set of meta-requirements for BC approaches for SMEs anchored in prior BC literature, practitioners’ practical expertise, and the theories of collective mindfulness and sociotechnical systems. We evaluated our thrifty BC management approach with multiple SMEs. These evaluations suggest that the designed approach mostly meets the defined meta-requirements. Moreover, the evaluations offered ample opportunities for learning. The design process, unfolding in a real-world setting, was precarious, rife with contingencies and ad hoc decisions. To render the design process transparent, we adapted four writing conventions from the confessional research genre, familiar to ethnographic research but novel to DSR. We offer a threefold contribution. First, we contribute to SMEs’ BC with meta-requirements and their instantiation in a new BC approach (artifact); second, we contribute four practices of confessional writing for transparency of DSR research; and third, we contribute reflections on our theoretical learning throughout the design process.
Data science: a game changer for science and innovation
This paper shows data science's potential for disruptive innovation in science, industry, policy, and people's lives. We discuss how data science will impact science and society at large in the coming years, including ethical problems in managing human behavior data, and consider the quantitative expectations of data science's economic impact. We introduce concepts such as open science and e-infrastructure as useful tools for supporting ethical data science and for training new generations of data scientists. Finally, this work outlines the SoBigData Research Infrastructure as an easy-to-access platform for executing complex data science processes. The services proposed by SoBigData are aimed at using data science to understand the complexity of our contemporary, globally interconnected society.
A Contribution to Stimulating the Use of Cloud Computing Solutions: Design of a Cloud Service Intermediary to Foster Trustworthy, Interoperable, and Legally Compliant Distributed Digital Ecosystems. Application in Multi-cloud Environments.
The goal of the research presented in this thesis is to help developers and operators of applications deployed across multiple clouds discover and manage the different cloud services, supporting their reuse and combination, in order to generate a network of interoperable services that comply with the law and whose service-level agreements can be evaluated continuously. One contribution of this thesis is the design and development of a cloud service broker called ACSmI (Advanced Cloud Services meta-Intermediator). ACSmI makes it possible to evaluate compliance with service-level agreements, including legislation. ACSmI also provides an intermediate abstraction layer for cloud services, through which developers can easily access a catalogue of accredited services compatible with the established non-functional requirements. In addition, this research proposes a characterization of multi-cloud native applications and the concept of "extended DevOps", conceived specifically for this type of application. The "extended DevOps" concept aims to solve some of the current problems in the design, development, deployment, and adaptation of multi-cloud applications by providing a novel, extended DevOps approach that adapts current DevOps practices to the multi-cloud paradigm.
Model-Driven Methodology for Rapid Deployment of Smart Spaces based on Resource-Oriented Architectures
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices, with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims to enable the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. The feasibility of ROOD is demonstrated by building an adaptive health monitoring service for a Smart Gym.
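The core idea of generating device code from a model can be sketched briefly. The following is not the ROOD toolchain itself but a minimal, hypothetical illustration of model-driven generation: a declarative device model is transformed into a web-resource skeleton, in the spirit of deriving code from a smart-space definition. All names (the model fields, the generated routes) are invented for illustration.

```python
# Hypothetical device model, as a model-driven methodology might capture it.
DEVICE_MODEL = {
    "name": "heart_rate_sensor",
    "kind": "sensor",
    "unit": "bpm",
    "operations": ["read"],
}

def generate_resource(model: dict) -> str:
    """Emit a web-resource skeleton (as source text) from a device model."""
    lines = [f"# Auto-generated resource for {model['name']}"]
    for op in model["operations"]:
        route = f"/{model['name']}/{op}"
        lines.append(f"def {op}_{model['name']}():  # GET {route}")
        lines.append(f"    return {{'unit': '{model['unit']}'}}")
    return "\n".join(lines)

skeleton = generate_resource(DEVICE_MODEL)
print(skeleton)
```

A real MDA pipeline would generate platform-specific code from richer (often ontology-backed) models, but the transformation step has this same shape: model in, source artifact out.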
Big Data Analytics in Project Management: A Key to Success
This review delves into the influence of big data analytics on project management effectiveness and project success rates. By examining applications, accomplishments, hindrances, and emerging developments in the context of big data analytics and project management, this review provides insights into its transformative potential. Results indicate that big data analytics fosters improved project performance, more robust risk management, and heightened adaptability. However, challenges related to data quality, privacy, and project manager training remain to be addressed. This review underscores the value of data-driven decision-making for both practitioners and researchers in the project management field
Designing Monitoring Systems for Continuous Certification of Cloud Services: Deriving Meta-requirements and Design Guidelines
Continuous service certification (CSC) involves consistently gathering and assessing certification-relevant information about cloud service operations to validate whether they continue to adhere to certification criteria. Previous research has proposed test-based CSC methodologies that directly assess the components of cloud service infrastructures. However, test-based certification requires that certification authorities can access the cloud infrastructure, and various issues may limit such access. To address these challenges, cloud service providers need to conduct monitoring-based CSC; that is, monitor their cloud service infrastructure to gather certification-relevant data themselves and then provide these data to certification authorities. Nevertheless, we need to better understand how to design monitoring systems that enable cloud service providers to perform such monitoring. Taking a design science perspective, we derive universal meta-requirements and design guidelines for CSC monitoring systems based on findings from five expert focus group interviews with 33 cloud experts and 10 one-to-one interviews with cloud customers. With this study, we expand the current knowledge base regarding CSC and monitoring-based CSC. Our derived design guidelines contribute to the development of CSC monitoring systems and enable monitoring-based CSC that overcomes the issues of prior test-based approaches.
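The provider-side pattern described above can be sketched as follows. This is a hypothetical illustration, not a design from the paper: a monitor evaluates collected metrics against certification criteria and produces per-criterion evidence that could be handed to a certification authority. The criterion names and thresholds are invented.

```python
# Invented certification criteria: each maps a metric name to a pass check.
CRITERIA = {
    "availability_pct": lambda v: v >= 99.9,
    "encryption_at_rest": lambda v: v is True,
    "max_patch_age_days": lambda v: v <= 30,
}

def evaluate(metrics: dict) -> dict:
    """Return per-criterion evidence (observed value + pass/fail)
    that a provider could forward to the certification authority."""
    report = {}
    for name, check in CRITERIA.items():
        value = metrics.get(name)
        report[name] = {
            "value": value,
            "passed": value is not None and bool(check(value)),
        }
    return report

# One monitoring cycle: collected metrics in, evidence report out.
report = evaluate({
    "availability_pct": 99.95,
    "encryption_at_rest": True,
    "max_patch_age_days": 45,
})
```

In monitoring-based CSC the key shift is exactly this: the provider, not the authority, runs the checks and ships only the resulting evidence, sidestepping the infrastructure-access problem of test-based approaches.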
A Big Data perspective on Cyber-Physical Systems for Industry 4.0: modernizing and scaling complex event processing
Doctoral program in Advanced Engineering Systems for Industry.

Nowadays, the whole industry strives to find the most productive ways of working, and it is already understood that using the data produced inside and outside the factories is a way to improve business performance. A set of modern technologies combined with sensor-based communication creates the possibility to act according to business needs precisely at the moment when the data is being produced and processed. Considering the diversity of processes existing in a factory, all of them producing data, Complex Event Processing (CEP) with the capability to process that amount of data is needed in the daily work of a factory, to process different types of events and find patterns between them. Although the integration of the Big Data and Complex Event Processing topics is already present in the literature, open challenges in this area were identified, hence the contribution presented in this thesis. This doctoral thesis therefore proposes a system architecture that integrates the CEP concept with a rule-based approach in the Big Data context: the Intelligent Event Broker (IEB). The architecture proposes the use of adequate Big Data technologies in its several components. At the same time, some of the gaps identified in this area were filled, complementing event processing with the possibility to use machine learning models integrated into the rules' verification, and also proposing an innovative monitoring system with an immersive visualization component to monitor the IEB and prevent its uncontrolled growth, since there are always several processes inside a factory that can be integrated into the system. The proposed architecture was validated with a demonstration case using, as an example, Bosch's Active Lot Release system. This demonstration case revealed that it is feasible to implement the proposed architecture and proved the adequate functioning of the IEB system in processing the data of Bosch's business processes and in monitoring its components and the events flowing through them.

This work has been supported by FCT – Fundação para a Ciência e Tecnologia within the R&D Units Project Scope: UIDB/00319/2020, the Doctoral scholarship PD/BDE/135101/2017, and by European Structural and Investment Funds in the FEDER component, through the Operational Competitiveness and Internationalization Programme (COMPETE 2020) [Project nº 039479; Funding Reference: POCI-01-0247-FEDER-039479].
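The rule-based CEP idea at the heart of the abstract above can be sketched in a few lines. This is not the IEB implementation, only a minimal illustration: rules pair an event-pattern predicate with an action, the broker runs every incoming event through every rule, and part of a rule's verification can be delegated to a predictive model, as the thesis proposes. The event fields, the rule, and the stand-in model are all invented.

```python
def anomaly_model(temperature: float) -> bool:
    """Stand-in for a trained ML model plugged into rule verification."""
    return temperature > 80.0

# A rule combines an event-pattern condition with an action to trigger.
RULES = [
    {
        "name": "overheating_lot",
        "condition": lambda e: e["type"] == "sensor_reading"
                               and anomaly_model(e["temperature"]),
        "action": lambda e: f"alert:{e['lot_id']}",
    },
]

def process(events):
    """Check each incoming event against every rule; collect actions."""
    alerts = []
    for event in events:
        for rule in RULES:
            if rule["condition"](event):
                alerts.append(rule["action"](event))
    return alerts

stream = [
    {"type": "sensor_reading", "temperature": 75.0, "lot_id": "L1"},
    {"type": "sensor_reading", "temperature": 92.5, "lot_id": "L2"},
]
alerts = process(stream)
```

A production broker like the IEB would additionally distribute this matching over Big Data technologies and monitor the flow of events through its components; the rule/condition/action structure, however, is the same.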