6,761 research outputs found

    An Analysis of Service Ontologies

    Services are increasingly shaping the world’s economic activity. Service provision and consumption have profited from advances in ICT, but the decentralization and heterogeneity of the involved service entities still pose engineering challenges. One of these challenges is to achieve semantic interoperability among these autonomous entities. Semantic web technology aims at addressing this challenge on a large scale and has matured over recent years. This is evident from the various efforts reported in the literature in which service knowledge is represented in terms of ontologies developed either in individual research projects or in standardization bodies. This paper analyzes the most relevant service ontologies available today for their suitability to cope with the service semantic interoperability challenge. We take the vision of the Internet of Services (IoS) as our motivation to identify the requirements for service ontologies. We adopt a formal approach to ontology design and evaluation in our analysis. We start by defining informal competency questions derived from a motivating scenario, and we identify relevant concepts and properties in service ontologies that match the formal ontological representation of these questions. We analyze the service ontologies with our concepts and questions, so that each ontology is positioned and evaluated according to its utility. The gaps we identify as a result of our analysis provide an indication of open challenges and future work.
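
    As a rough, hypothetical illustration of the kind of competency question such an analysis works with (e.g. "which providers offer a given service, and at what cost?"), the sketch below poses one as a SPARQL query over a toy service ontology. The namespace and the terms Service, providedBy and hasCost are invented placeholders, not concepts from any of the ontologies analyzed in the paper.

```python
# Minimal sketch: a competency question expressed as a SPARQL query over a toy
# service ontology. All names (the ex: namespace, Service, providedBy, hasCost)
# are hypothetical placeholders, not taken from any specific service ontology.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/service#")

# Build a tiny example graph: one service, its provider, and its cost.
g = Graph()
g.add((EX.Translation, RDF.type, EX.Service))
g.add((EX.Translation, EX.providedBy, EX.AcmeLanguages))
g.add((EX.Translation, EX.hasCost, Literal(30)))

# Competency question: which providers offer which services, and at what cost?
COMPETENCY_QUESTION = """
PREFIX ex: <http://example.org/service#>
SELECT ?service ?provider ?cost WHERE {
    ?service a ex:Service ;
             ex:providedBy ?provider ;
             ex:hasCost ?cost .
}
"""

for service, provider, cost in g.query(COMPETENCY_QUESTION):
    print(service, provider, cost)
```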

    Sustainable manufacturing in the fourth industrial revolution: a big data application proposal in the textile industry

    Purpose: To design an industrial production model focused on Industry 4.0 (Big Data) and decision-making analysis for small and medium-sized enterprises (SMEs) in the clothing sector that improves procedures, jobs and related costs within the study organization, and to develop a sustainable manufacturing proposal for the industrial textile sector centered on Big Data (data entry, transformation, loading and analysis) in organizational decision making, seeking to optimize time and cost and to mitigate the related environmental impact.
    Design/methodology/approach: This applied research presents a value proposition for the planning, design and structuring of a Big Data-oriented industrial model, specifically for structured and automated decision making in the apparel manufacturing sector. The methodological approach is: 1) formulation of Big Data-oriented production strategies for the textile sector; 2) definition of the production model and configuration of the operational system; 3) data science and industrial analysis; 4) formulation of the production model (Power BI); and 5) model validation. The methodological design of the investigation is: 1) presentation of the case study, covering the situational analysis of the company, the formulation of the problem and a proposed solution for the data set analyzed; 2) presentation of a Big Data-focused solution proposal, identifying the industrial ecosystem and its integration with the company's information systems, together with an approach to real-time data science; 3) presentation of the proposed model for structured SQL databases, covering the extraction, transformation and loading of the information relevant to this study; 4) information processing, editing the data in the M language of the Power BI software and building the model; 5) presentation of the related databases, integrated through the foreign key linking the master table and the transactional tables; and 6) data analysis and presentation of the dashboard, covering the design, construction and analysis of the study variables as well as solution scenarios to support sound organizational decision making.
    Findings: The results show an improvement in operational efficiency derived from the value-added proposal.
    Research limitations/implications: Studies applying Big Data technology to organizational decision making in the textile and manufacturing sector are currently limited. At the local level there are few cases of Big Data implementation in the textile sector, owing to the lack of projects and of financing for value propositions. Another limitation is the absence of highly relevant digital information for study and analysis, which lengthens data entry and its placement in real-time information systems. Finally, there is no organizational data culture with processes and/or procedures for data registration and its transformation into clean data.
    Originality/value: This research integrates the proposed Big Data model with sound organizational decision making. To verify originality, a project search and a systematic literature review were carried out in the main online search engines; in addition, similarity percentages were checked with online review tools such as Turnitin and plag.es to ensure the transparency of this study. (Peer reviewed)
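
    As a purely illustrative sketch of the load-transform-join step the proposal describes, the Python/pandas code below (standing in for Power BI's M language) relates a transactional table to a master table through a foreign key and aggregates the result into dashboard variables. The database file and the table and column names (textile.db, products, orders, product_id, and so on) are hypothetical.

```python
# Purely illustrative sketch of the load-transform-join step, written in
# Python/pandas instead of Power BI's M language. The database file and the
# table/column names (textile.db, products, orders, product_id, ...) are
# hypothetical placeholders.
import sqlite3

import pandas as pd

conn = sqlite3.connect("textile.db")  # hypothetical SQL database

# Load: master table and a transactional table.
master = pd.read_sql_query(
    "SELECT product_id, product_name, unit_cost FROM products", conn)
orders = pd.read_sql_query(
    "SELECT order_id, product_id, quantity, order_date FROM orders", conn)

# Transform: relate the transactional table to the master table through the
# foreign key product_id, then derive a cost measure.
fact = orders.merge(master, on="product_id", how="left")
fact["total_cost"] = fact["quantity"] * fact["unit_cost"]

# Aggregate into dashboard variables, e.g. monthly cost per product.
fact["month"] = pd.to_datetime(fact["order_date"]).dt.to_period("M")
dashboard = (
    fact.groupby(["month", "product_name"])["total_cost"].sum().reset_index()
)
print(dashboard.head())
```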

    Intelligent Management and Efficient Operation of Big Data

    This chapter details how Big Data can be used and implemented in networking and computing infrastructures. Specifically, it addresses three main aspects: the timely extraction of relevant knowledge from heterogeneous and often unstructured large data sources; the enhancement of the performance of the processing and networking (cloud) infrastructures that are the most important foundational pillars of Big Data applications and services; and novel ways to efficiently manage network infrastructures with high-level composed policies to support the transmission of large amounts of data with distinct requirements (video vs. non-video). A case study involving an intelligent management solution to route data traffic with diverse requirements in a wide-area Internet Exchange Point is presented, discussed in the context of Big Data, and evaluated. Comment: In the book Handbook of Research on Trends and Future Directions in Big Data and Web Intelligence, IGI Global, 201
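
    As a rough, hypothetical illustration (not the chapter's actual solution), the sketch below shows how a high-level composed policy might map traffic classes with distinct requirements, such as video versus non-video, to different paths through an exchange point; the class names, path labels and threshold are invented.

```python
# Hypothetical sketch of a high-level composed policy that routes video and
# non-video traffic over different paths, in the spirit of the management
# solution described above. Classes, path labels and the threshold are
# illustrative only.
from dataclasses import dataclass


@dataclass
class Flow:
    src_as: str
    dst_as: str
    traffic_class: str  # e.g. "video" or "non-video"
    volume_mbps: float


# High-level policy: traffic class -> preferred path through the exchange.
POLICY = {
    "video": "low-latency-path",
    "non-video": "best-effort-path",
}


def select_path(flow: Flow) -> str:
    """Return the path a flow should take according to the policy."""
    path = POLICY.get(flow.traffic_class, "best-effort-path")
    # Very large non-video transfers are shifted to a bulk-transfer path.
    if flow.traffic_class == "non-video" and flow.volume_mbps > 500:
        path = "bulk-transfer-path"
    return path


if __name__ == "__main__":
    flows = [
        Flow("AS100", "AS200", "video", 25.0),
        Flow("AS300", "AS400", "non-video", 800.0),
    ]
    for f in flows:
        print(f.src_as, "->", f.dst_as, ":", select_path(f))
```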

    Towards Information Polycentricity Theory: Investigation of a Hospital Revenue Cycle

    This research takes steps towards developing a new theory of organizational information management based on the ideas that, first, information creates ordering effects in transactions and, second, there are multiple centers of authority in organizations. The rationale for developing this theory is the empirical observation that hospitals have great difficulty managing information relating to transactions with patients. The research illustrates the detailed workings of an initial conceptual framework based on an action research project into the revenue cycle of a hospital. The framework facilitates a deeper understanding of how information technology can help to transform information management practices in complex organizations such as hospitals. At the same time, this research adds to the literature on Polycentricity Theory by linking its two core concepts (multiple nested centers of decision making and context-dependent governance) with Transaction Cost Theory and information management theories to establish a new foundation for understanding the role of information technology in organizational contexts.

    Trusted Computing and Secure Virtualization in Cloud Computing

    Large-scale deployment and use of cloud computing in industry is accompanied, and at the same time hampered, by concerns regarding the protection of data handled by cloud computing providers. One consequence of moving data processing and storage off company premises is that organizations have less control over their infrastructure. As a result, cloud service (CS) clients must trust that the CS provider is able to protect their data and infrastructure from both external and internal attacks. Currently, however, such trust can only rely on organizational processes declared by the CS provider and cannot be remotely verified and validated by an external party. Enabling the CS client to verify the integrity of the host where the virtual machine instance will run, and to ensure that the virtual machine image has not been tampered with, are steps towards building trust in the CS provider. Having the tools to perform such verifications prior to the launch of the VM instance allows CS clients to decide at runtime whether certain data should be stored on, or certain calculations performed on, the VM instance offered by the CS provider.
    This thesis combines three components -- trusted computing, virtualization technology and cloud computing platforms -- to address issues of trust and security in public cloud computing environments. Of the three components, virtualization technology has had the longest evolution and is a cornerstone for the realization of cloud computing. Trusted computing is a recent industry initiative that aims to implement the root of trust in a hardware component, the trusted platform module. The initiative has been formalized in a set of specifications and is currently at version 1.2. Cloud computing platforms pool virtualized computing, storage and network resources in order to serve a large number of customers, using a multi-tenant multiplexing model to offer on-demand self-service over broad network access. Open source cloud computing platforms are, similar to trusted computing, a fairly recent technology in active development.
    The issue of trust in public cloud environments is addressed by examining the state of the art in cloud computing security and subsequently addressing the issue of establishing trust in the launch of a generic virtual machine in a public cloud environment. As a result, the thesis proposes a trusted launch protocol that allows CS clients to verify and ensure the integrity of the VM instance at launch time, as well as the integrity of the host where the VM instance is launched. The protocol relies on a Trusted Platform Module (TPM) for key generation and data protection; the TPM also plays an essential part in the integrity attestation of the VM instance host. Along with a theoretical, platform-agnostic protocol, the thesis describes a detailed implementation design of the protocol using the OpenStack cloud computing platform. In order to verify the implementability of the proposed protocol, a prototype implementation has been built using a distributed deployment of OpenStack. While the protocol covers only the trusted launch procedure for generic virtual machine images, it represents a step towards the creation of a secure and trusted public cloud computing environment.
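
    As a greatly simplified, hypothetical sketch of the pre-launch checks such a trusted launch protocol is built around, the code below compares a host's reported platform measurement and the VM image digest against expected values before approving a launch. A real implementation would rely on TPM-signed attestation quotes and the OpenStack integration described in the thesis rather than plain digest comparisons, and all values shown are placeholders.

```python
# Greatly simplified, hypothetical sketch of pre-launch trust checks: the CS
# client compares the host's reported platform measurement and the VM image
# digest against expected ("golden") values before approving the launch. A real
# protocol would verify TPM-signed attestation quotes instead of plain digests.
import hashlib
import hmac

# Expected values the CS client trusts (hypothetical hex digests).
EXPECTED_HOST_MEASUREMENT = "ab" * 32
EXPECTED_IMAGE_DIGEST = hashlib.sha256(b"trusted-vm-image-contents").hexdigest()


def verify_host(reported_measurement: str) -> bool:
    """Check the host's reported measurement against the expected value."""
    return hmac.compare_digest(reported_measurement, EXPECTED_HOST_MEASUREMENT)


def verify_image(image_bytes: bytes) -> bool:
    """Check that the VM image has not been tampered with."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return hmac.compare_digest(digest, EXPECTED_IMAGE_DIGEST)


def approve_launch(reported_measurement: str, image_bytes: bytes) -> bool:
    """Approve the VM launch only if both the host and the image verify."""
    return verify_host(reported_measurement) and verify_image(image_bytes)


if __name__ == "__main__":
    ok = approve_launch("ab" * 32, b"trusted-vm-image-contents")
    print("launch approved" if ok else "launch rejected")
```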