
    A Framework to Build a Big Data Ecosystem Oriented to the Collaborative Networked Organization

    A Collaborative Networked Organization (CNO) is a set of entities that operate in heterogeneous contexts and aim to collaborate to take advantage of a business opportunity or to solve a problem. Big data allows CNOs to be more competitive by improving their strategy, management, and business processes. To support the development of big data ecosystems in CNOs, several frameworks have been reported in the literature. However, these frameworks limit their application to a specific CNO manifestation and cannot conduct intelligent processing of big data to support decision making in the CNO. This paper makes two main contributions: (1) a metaframework to analyze existing and future frameworks for the development of big data ecosystems in CNOs and (2) the Collaborative Networked Organizations–big data (CNO-BD) framework, which includes guidelines, tools, techniques, conceptual solutions, and good practices for building a big data ecosystem in different kinds of Collaborative Networked Organizations, overcoming the weaknesses of previous frameworks. The CNO-BD framework consists of seven dimensions: levels, approaches, data fusion, interoperability, data sources, big data assurance and programmable modules. The framework was validated through expert assessment and a case study.
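    As a rough illustration of how the seven dimensions could be used to audit an ecosystem design, the following sketch models them as a coverage checklist. The class name, field encoding, and scoring scheme are assumptions for illustration, not taken from the paper.

```python
from dataclasses import dataclass, fields

# Hypothetical sketch: the seven CNO-BD dimensions as a coverage checklist.
# The dimension names follow the abstract; the boolean encoding and the
# coverage score are illustrative assumptions.
@dataclass
class CnoBdAssessment:
    levels: bool = False
    approaches: bool = False
    data_fusion: bool = False
    interoperability: bool = False
    data_sources: bool = False
    big_data_assurance: bool = False
    programmable_modules: bool = False

    def coverage(self) -> float:
        """Fraction of the seven dimensions addressed by an ecosystem design."""
        dims = [getattr(self, f.name) for f in fields(self)]
        return sum(dims) / len(dims)

design = CnoBdAssessment(levels=True, data_fusion=True, interoperability=True)
print(f"{design.coverage():.2f}")  # 3 of 7 dimensions covered, prints 0.43
```

    A structure like this makes gaps explicit: any dimension left at its default flags a part of the ecosystem the framework's guidelines have not yet covered.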

    Systems Interoperability Types: A Tertiary Study

    Interoperability has been a focus of attention for at least four decades, with the emergence of several interoperability types (or levels) and diverse models, frameworks, and solutions, partly as a result of a continuous effort from different domains. The current heterogeneity in technologies such as blockchain and IoT, and new application domains such as Industry 4.0, brings not only new interaction possibilities but also challenges for interoperability. Moreover, confusion and ambiguity exist in the current understanding of interoperability types, hampering stakeholders' communication and decision making. This work presents an updated panorama of software-intensive systems interoperability, with particular attention to its types. For this, we conducted a tertiary study that scrutinized 37 secondary studies published from 2012 to 2023, from which we found 36 interoperability types associated with 117 different definitions, besides 13 interoperability models and six frameworks in various domains. This panorama reveals that the concern with interoperability has migrated from technical to socio-technical issues, going beyond the software systems' boundary, and that many open issues remain to be solved. We also address urgent actions and potential research opportunities to leverage interoperability as a multidisciplinary research field and to achieve low-coupled, cost-effective, and interoperable systems.

    A systematic methodology to analyse the performance and design configurations of business interoperability in cooperative industrial networks

    This thesis proposes a methodology for modelling business interoperability in a context of cooperative industrial networks. The purpose is to develop a methodology that enables the design of cooperative industrial network platforms that are able to deliver business interoperability and the analysis of its impact on the performance of these platforms. To achieve the proposed objective, two modelling tools have been employed: the Axiomatic Design Theory for the design of interoperable platforms; and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence of the application of the two modelling tools depends on the scenario under analysis, i.e. whether the cooperative industrial network platform exists or not. If the cooperative industrial network platform does not exist, the methodology suggests first the application of the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then the use of Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and based on the achieved results, decide whether it is necessary to redesign it or not. If the redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how those two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and/or to validate the proposed theoretical models, a case study regarding a Portuguese Reverse Logistics cooperative network (Valorpneu network) and a case study regarding a Portuguese construction project (Dam Baixo Sabor network) are presented. 
The findings of the application of the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach is proposed to design interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, and in a practical and systematic way, in the design of configurations of interoperable cooperative industrial network platforms and/or in the analysis of the impact of business interoperability on the performance of their companies and the networks in which they operate.
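    To make the Agent-Based Simulation idea concrete, the sketch below models interoperability as the probability that a message exchange between two network partners succeeds, and measures the resulting completion rate as a performance proxy. All parameters, names, and the success model are illustrative assumptions, not the thesis's actual simulation model.

```python
import random

# Minimal agent-based sketch (all parameters are illustrative assumptions):
# in each round, a randomly chosen agent sends an order to a partner, and
# the exchange succeeds with a probability given by the pair's
# business-interoperability level.
def simulate(n_agents=10, n_rounds=1000, interoperability=0.8, seed=42):
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    completed = 0
    for _ in range(n_rounds):
        sender, receiver = rng.sample(range(n_agents), 2)
        if rng.random() < interoperability:  # exchange understood by receiver
            completed += 1
    return completed / n_rounds  # completion rate as a performance proxy

# Higher interoperability should yield a higher completion rate.
print(simulate(interoperability=0.9), simulate(interoperability=0.5))
```

    Even a toy model like this lets one compare platform configurations before building them, which is the role the thesis assigns to simulation in the design/redesign loop.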

    Methodological approaches and techniques for designing ontologies in information systems requirements engineering

    Doctoral Programme in Information Systems and Technology. The way we interact with the world around us is changing as new challenges arise, embracing innovative business models, rethinking organizations and processes to maximize results, and evolving change management. Currently, and considering the projects executed, the methodologies used do not fully respond to companies' needs. On the one hand, organizations are not familiar with the languages used in Information Systems; on the other hand, they are often unable to validate requirements or business models. These are some of the difficulties encountered that led us to formulate a new approach. Thus, the state of the art presented in this work includes a study of the models involved in the software development process, where traditional methods and the rivalry of agile methods are present. In addition, a survey is made of Ontologies and the methods that exist to conceive, transform, and represent them. After analyzing some of the various possibilities currently available, we began the process of evolving a method and developing an approach that would allow us to design ontologies. The method we evolved and adapted allows terminologies to be derived from a specific domain and aggregated in order to facilitate the construction of a catalog of terminologies. Next, the definition of an approach to designing ontologies allows the construction of a domain-specific ontology. This approach makes it possible, in the first instance, to integrate and store data from the different information systems of a given organization. In a second instance, the rules for mapping and building the ontology database are defined. Finally, a technological architecture is also proposed that allows an ontology to be mapped through the construction of complex networks, relating terminologies to one another.
This doctoral work encompasses numerous Research & Development (R&D) projects belonging to different domains such as the Software Industry, Textile Industry, Robotic Industry, and Smart Cities. Finally, a critical and descriptive analysis of the work done is performed, and we also point out perspectives for possible future work.
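    The terminology-catalog step the abstract describes, aggregating terms exported by different information systems before any ontology mapping, can be sketched as follows. The data, function name, and normalization rule are assumptions for illustration only.

```python
from collections import defaultdict

# Illustrative sketch (names and data assumed): aggregate domain terminologies
# exported by different information systems into one catalog that records, for
# each normalized term, which systems use it.
def build_catalog(system_terms: dict[str, list[str]]) -> dict[str, set[str]]:
    catalog: dict[str, set[str]] = defaultdict(set)
    for system, terms in system_terms.items():
        for term in terms:
            catalog[term.strip().lower()].add(system)  # normalize spelling
    return dict(catalog)

catalog = build_catalog({
    "ERP": ["Customer", "Order", "Invoice"],
    "CRM": ["customer", "Lead"],
})
print(catalog["customer"])  # term shared by both systems
```

    Terms that appear under several systems are natural candidates for shared ontology concepts; system-specific terms point to mapping rules that still need to be defined.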

    Architecture of an IT platform to support supply chain processes: the PRODUTECH4S&C demonstration case

    Integrated Master's dissertation in Engineering and Management of Information Systems. Industry 4.0 has revolutionized the way industry operates, introducing technological innovations, such as the Internet of Things (IoT) and Cyber-Physical Systems (CPS), that have emerged with the evolution of computing over the years. This innovation has led to a paradigm shift and introduced the challenges of digitizing industrial processes, as well as the upstream and downstream processes of the value chain, adding a large flow of information available to the entire chain and, therefore, intra- and inter-organizational communication needs. The digitization of industrial processes, particularly in the supply chain, may require different companies to connect all parties involved in the process, whether internal or external to the organization. Over time, the software applications that support industrial processes have evolved independently, without communication with other applications and without requirements for linking with external companies, resulting in a heterogeneous set of environments. This "individualistic" evolution hinders communication (interoperability between applications) between different entities, so it is necessary to define a set of definitions and a protocol (syntax and semantics) that allows communication between applications, so that information can be exchanged efficiently.
    Thus, this dissertation project seeks to design an architecture for an IT platform that supports supply chain processes, based on existing reference models in the literature, such as SCOR, aligned to meet the needs of supply chain processes in Industry 4.0. The aim is to develop the architecture by applying the Four-Step-Rule-Set (4SRS) method; the architecture should ensure the incorporation of specific needs related to technological development (such as cloud computing, microservice architectures, brokers, etc.), allow greater visibility for supply chain players, and focus on a more sustainable economy.
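    The broker-based integration style the abstract mentions can be sketched with a minimal publish/subscribe broker: supply-chain services publish events to topics and other services subscribe to them, so participants stay decoupled. Class, topic, and event names are hypothetical, not part of the PRODUTECH4S&C architecture.

```python
from collections import defaultdict
from typing import Callable

# Hypothetical sketch of broker-mediated integration between supply-chain
# microservices: publishers and subscribers only share topic names, never
# direct references to each other.
class Broker:
    def __init__(self) -> None:
        self.subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self.subscribers[topic]:  # fan out to every subscriber
            handler(event)

broker = Broker()
received: list[dict] = []
broker.subscribe("orders.created", received.append)  # e.g. a logistics service
broker.publish("orders.created", {"order_id": 1, "supplier": "A"})
print(received)
```

    Because services agree only on topic names and event schemas (the "syntax and semantics" the abstract calls for), new supply-chain players can be added without modifying existing ones.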

    Design of a Framework to Measure the Degree of Live Virtual Constructive (LVC) Simulation Interoperability

    Accomplishing Live, Virtual and Constructive (LVC) simulation interoperability has been a major goal and challenge in the Modeling and Simulation (M&S) community. There have been efforts to interoperate individual Live, Virtual and Constructive simulations within a common synthetic environment through suitable technologies such as interface specifications, protocols, and standard middleware architectures. However, achieving LVC simulation interoperability is technologically complex, since it is affected by multiple factors whose characteristics are not yet satisfactorily defined and studied, and a proper method to measure the potential interoperability degree of LVC simulation is lacking. Therefore, an appropriate systematic approach is needed to measure potential LVC simulation interoperability across the technical, conceptual, and organizational domains. This research aims to design a preliminary systematic approach to measure the potential interoperability degree of an individual Live, Virtual and Constructive simulation and of a relevant organization that plans to use the simulation system for simulation interoperability. Specifically, a framework is proposed that contains components such as (a) LVC simulation interoperability domains, (b) interoperability domain factors, (c) interoperability maturity levels, and (d) an interoperability determination method. To accomplish this goal, a set of factors that determine the interoperability degree in an LVC simulation environment is identified, and these factors are used to build the key elements of the framework. The proposed methodology for the framework design is based on systematic literature reviews and a survey involving a number of relevant domain experts. A case study is presented to demonstrate the validity and effectiveness of the developed framework, illustrating how the interoperability levels of a simulation system and a relevant organization are effectively measured.
This research potentially contributes by providing an understanding of the factors that determine the interoperability degree of LVC simulation, improving the LVC simulation interoperability measurement process, and, consequently, enabling more effective LVC simulation interoperability.
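    One common way such a determination method works is to combine per-domain factor scores into a weighted degree and map it to a maturity level. The sketch below follows that pattern; the weights, level names, and cut-offs are invented for illustration and are not the framework's actual values.

```python
# Illustrative scoring sketch (weights, level names, and cut-offs are
# assumptions, not the framework's actual values): combine factor scores
# from the three domains the abstract lists into an overall maturity level.
DOMAIN_WEIGHTS = {"technical": 0.4, "conceptual": 0.3, "organizational": 0.3}
LEVELS = [(0.8, "Integrated"), (0.6, "Structured"), (0.4, "Connected"), (0.0, "Ad hoc")]

def interoperability_degree(scores: dict[str, float]) -> tuple[float, str]:
    """Weighted sum of per-domain scores (each in [0, 1]) and its level."""
    degree = sum(DOMAIN_WEIGHTS[d] * scores[d] for d in DOMAIN_WEIGHTS)
    label = next(name for cutoff, name in LEVELS if degree >= cutoff)
    return degree, label

degree, level = interoperability_degree(
    {"technical": 0.9, "conceptual": 0.7, "organizational": 0.5}
)
print(round(degree, 2), level)  # prints: 0.72 Structured
```

    A scheme like this makes the measurement repeatable: two assessors who agree on the factor scores will always derive the same degree and level.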

    An investigation of interoperability issues between authorisation systems within web services

    The existing authorisation systems within the context of Web Services mainly apply two access control approaches: Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC). The RBAC approach links an authenticated Web Service Requester to its specific access control permissions through roles, but RBAC is not flexible enough to cater for cases where extra attribute information is needed in addition to the identity. By contrast, the ABAC approach has more flexibility, as it allows a Web Service Requester to submit the necessary credentials containing extra attribute information that can fulfil the policies declared by a Web Service Provider, which aims to protect sensitive resources/services. RBAC and ABAC can only help to establish a unilateral trust relationship between two Web Services, enabling a Web Service Provider to make an access control decision. Unfortunately, the nature of Web Services presents a high probability that two Web Services may not know each other, so authorisation may fail if the Web Service Requester does not trust the Web Service Provider. Trust Negotiation (TN) is also an access control approach, one that can provide a bilateral trust relationship between two unknown entities, so it can sometimes enable authorisation to succeed in situations where success is not possible through the RBAC or ABAC approaches. However, interoperability issues arise between authorisation systems within Web Services where a bilateral trust-based authorisation solution is applied, and a unified approach that can address these interoperability issues is still lacking. This research aims first to explore the possible factors causing this lack of interoperability, and then to explore an approach that can address the interoperability issues.
The main contributions of this research are an improved interoperability model illustrating interoperability issues at different layers of abstraction, and a novel interoperability-solution design, along with an improved TN protocol as an example of utilising this design to provide interoperability between authorisation systems within Web Services.
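    The bilateral, iterative shape of Trust Negotiation can be sketched as follows: each party holds credentials guarded by policies over the other party's already-disclosed credentials, and disclosure proceeds in rounds until the provider's resource policy is satisfied or no progress is possible. The policies, credential names, and round bound are made up for illustration; real TN protocols add negotiation strategies and protection of sensitive policies, which this sketch omits.

```python
# Simplified Trust Negotiation sketch (credentials and policies are
# hypothetical). Each credential is guarded by a set of the *other* party's
# credentials that must already be disclosed before it may be released.
def negotiate(resource_policy, requester_creds, provider_creds, max_rounds=10):
    """resource_policy: requester credentials the provider requires."""
    disclosed = {"requester": set(), "provider": set()}
    for _ in range(max_rounds):
        progress = False
        for side, creds, other in (("requester", requester_creds, "provider"),
                                   ("provider", provider_creds, "requester")):
            for cred, guard in creds.items():
                if cred not in disclosed[side] and guard <= disclosed[other]:
                    disclosed[side].add(cred)  # guard satisfied: release it
                    progress = True
        if resource_policy <= disclosed["requester"]:
            return True  # bilateral trust established, access granted
        if not progress:
            return False  # negotiation is stuck
    return False

# The requester releases "employee_id" only after seeing the provider's
# "server_cert"; the provider releases "server_cert" unconditionally.
granted = negotiate(
    resource_policy={"employee_id"},
    requester_creds={"employee_id": {"server_cert"}},
    provider_creds={"server_cert": set()},
)
print(granted)  # prints: True
```

    Note the contrast with RBAC/ABAC: here the requester also imposes conditions on the provider, which is what makes the trust relationship bilateral.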

    Interoperability framework to enhance the DLT based systems integration with enterprise IT systems.

    Distributed ledger technology (DLT) has generated tremendous interest due to its popular application in Bitcoin and other cryptocurrencies. Despite its enormous potential business benefits and even greater hype, DLT has not attracted significant investment, and its widespread implementation has failed to occur. One of the most recognised reasons is the lack of a framework for integrating DLT-based systems with centralised or non-DLT information technology (IT) systems. This research endeavours to fill this gap by designing a DLT interoperability framework (DIF). This framework is based on interoperability principles derived from integrated DLT-based solutions and from modern organisations' integration needs and practices. The DIF enables organisations to design interoperability architectures and integrated solutions for enterprise implementation. Based on the DIF, this research also developed and instantiated a Hyperledger Fabric DLT solution prototype (HDSP) on Amazon Web Services (AWS) for the manuka honey supply chain (MHSC) use case. The research utilised the design science research (DSR) methodology to develop the DIF and the HDSP. Iterative artefact evaluations were undertaken using formative (ex-ante) and summative (ex-post) evaluation, the maturity model for enterprise interoperability (MMEI), IT professional evaluation, and the artefact instantiation and demonstration techniques suggested in DSR. The DIF, the HDSP, and their evaluation provide a pathway for organisations to design and implement integrated DLT-based solutions. The knowledge generated and utilised in this research provides a robust theoretical foundation for building and implementing such integrated solutions.

    Aligning Social Media, Mobile, Analytics, and Cloud Computing Technologies and Disaster Response

    After nearly two decades of advances in information and communications technologies (ICT), including social media, mobile, analytics, and cloud computing, disaster response agencies in the United States have not been able to improve the alignment between ICT-based information and disaster response actions. This grounded theory study explored emergency response ICT managers' understanding of how social media, mobile, analytics, and cloud computing (SMAC) technologies are related to and can inform disaster response strategies. Sociotechnical theory served as the conceptual framework to ground the study. Data were collected from document reviews and semistructured interviews with nine ICT managers from emergency management agencies in the state of Hawaii who had experience in responding to major disasters. The data were analyzed using open, axial, and selective coding. Three elements of a theory emerged from the findings: (a) the ICT managers were hesitant about SMAC technologies replacing first responders' radios for interoperation between emergency response agencies during major disasters, (b) the ICT managers were receptive to converging conventional ICT with SMAC technologies, and (c) the ICT managers were receptive to joining legacy information sharing strategies with new information sharing strategies based on SMAC technologies. The emergent theory offers a framework for aligning SMAC technologies and disaster response strategies. The implications for positive social change include reduced interoperability failures between disaster agencies during major catastrophes, which may lower the risk of casualties and deaths among emergency responders and disaster victims, benefiting them and their communities.