14,810 research outputs found

    Technological, organisational, and environmental factors affecting the adoption of blockchain-based distributed identity management in organisations

    Background: Blockchain is a disruptive technology with the potential to innovate businesses. Ignoring or resisting it might result in a competitive disadvantage for organisations. Apart from its original financial application of cryptocurrency, other applications are emerging, the most common being supply chain management and e-voting systems. However, there is less focus on information and cybersecurity applications, especially from the enterprise perspective. This research addresses this knowledge gap, focussing on the application of blockchain to distributed identity management in organisations. Objectives: The main objective is to investigate the technological, organisational, and environmental (TOE) factors affecting the adoption of blockchain-based distributed identity management (BDIDM) in organisations, to determine the most critical factors. Secondary objectives include determining whether the blockchain type affects BDIDM adoption and whether the TOE-BDIDM model measuring the phenomenon is effective and appropriate. However, given the relative newness of blockchain, the initial goal is to explore the topic intensively to understand the practicality of adopting BDIDM in organisations and to establish whether the claims made around it are factual rather than merely products of the blockchain hype. Methodology: The study uses meta-synthesis to explore the topic, summarising 69 papers selected qualitatively from reputed academic sources. The study then surveys 111 information and cybersecurity practitioners selected randomly from South African organisations to investigate the TOE factors affecting BDIDM adoption. To do so, it utilises an online questionnaire rooted in an adapted TOE model called TOE-BDIDM as the data collection instrument. The analysis of this primary data is purely quantitative and includes (i) structural equation modelling (SEM) of the measurement model, i.e. confirmatory factor analysis (CFA); (ii) binary logistic regression analysis; and (iii) Chi-square tests. Results: The meta-synthesis revealed the theoretical grounds underlying claims made around the topic while spotting diverging views about the practicality of BDIDM for the enterprise context. It also identified the TOE theory as more suitable for explaining the phenomenon. Binary logistic regression modelling reveals that TOE factors do affect BDIDM adoption in organisations, either positively or negatively. The factors predict BDIDM adopters and non-adopters, with Technology Characteristics being the most critical factor and the strongest predictor of BDIDM non-adopters. Organisation Readiness was the second most critical factor and the strongest predictor of BDIDM adopters. Overall, TOE-BDIDM correctly predicted 92.5% of adopters and 45.2% of non-adopters. CFA indicates that the appropriateness of TOE-BDIDM for investigating the phenomenon is relatively fair. The Chi-square tests reveal a significant association between Blockchain Type and BDIDM adoption. Implications: The discussion highlights various implications of these findings, including doubts about the impartiality of typical privacy-preserving BDIDM models such as self-sovereign identity: the majority of respondents preferred private permissioned blockchains, which tend to be centralised, more intermediated, and less privacy-preserving. The remaining implications relate to the disruptive nature of BDIDM and to BDIDM adoption being driven more by technological than by organisational or environmental factors. The study ends by reflecting on the research process and providing key limitations and recommendations for future research.
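As an illustration of the kind of analysis this abstract describes, the minimal sketch below fits a binary logistic regression that predicts adoption from TOE factor scores and reports classification rates for adopters and non-adopters. The factor names, the synthetic data, and the 0.5 cut-off are assumptions for illustration only; this is not the study's instrument, dataset, or model.

```python
# Illustrative sketch (not the study's actual analysis or data): a binary
# logistic regression predicting BDIDM adoption from hypothetical TOE factor
# scores, with a simple classification table for adopters vs non-adopters.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 111  # sample size matching the survey described above

# Hypothetical composite scores for three TOE factors (invented).
X = rng.normal(loc=3.5, scale=0.8, size=(n, 3))
factor_names = ["technology_characteristics", "organisation_readiness",
                "environmental_pressure"]

# Synthetic adoption outcome (1 = adopter, 0 = non-adopter).
logits = 0.9 * X[:, 0] + 0.6 * X[:, 1] - 3.0
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(model.summary(xname=["const"] + factor_names))

# Classification rates at the conventional 0.5 cut-off.
pred = (model.predict(sm.add_constant(X)) >= 0.5).astype(int)
print(f"adopters correctly classified:     {(pred[y == 1] == 1).mean():.1%}")
print(f"non-adopters correctly classified: {(pred[y == 0] == 0).mean():.1%}")
```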

    A Taxonomy of Quality Metrics for Cloud Services

    [EN] A large number of metrics with which to assess the quality of cloud services have been proposed in recent years. However, this knowledge is still dispersed, and stakeholders have little or no guidance when choosing metrics that will be suitable for evaluating their cloud services. The objective of this paper is, therefore, to systematically identify, taxonomically classify, and compare existing quality of service (QoS) metrics in the cloud computing domain. We conducted a systematic literature review of 84 studies selected from a set of 4333 studies published from 2006 to November 2018. We specifically identified 470 metric operationalizations, which were then classified using a taxonomy that is also introduced in this paper. The data extracted from the metrics were subsequently analyzed using thematic analysis. The findings indicated that most metrics evaluate quality attributes related to performance efficiency (64%) and that there is a need for metrics that evaluate other characteristics, such as security and compatibility. The majority of the metrics are used during the Operation phase of the cloud services and are applied to the running service. Our results also revealed that metrics for cloud services are still in the early stages of maturity: only 10% of the metrics had been empirically validated. The proposed taxonomy can be used by practitioners as a guideline when specifying service level objectives or deciding which metric is best suited to the evaluation of their cloud services, and by researchers as a comprehensive quality framework in which to evaluate their approaches. This work was supported by the Spanish Ministry of Science, Innovation and Universities through the Adapt@Cloud Project under Grant TIN2017-84550-R. The work of Ximena Guerron was supported in part by the Universidad Central del Ecuador (UCE), and in part by the Banco Central del Ecuador. Guerron, X.; Abrahao Gonzales, SM.; Insfran, E.; Fernández-Diego, M.; González-Ladrón-De-Guevara, F. (2020). A Taxonomy of Quality Metrics for Cloud Services. IEEE Access, 8, 131461-131498. https://doi.org/10.1109/ACCESS.2020.3009079
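As a concrete, hedged example of what a metric operationalization can look like (not a metric taken from the surveyed studies), the sketch below computes two standard QoS measures, availability and mean response time, from hypothetical monitoring probes of a running service; the sample data and field names are invented.

```python
# Illustrative operationalization of two common cloud QoS metrics
# (availability and mean response time) over hypothetical monitoring probes.
# The formulas are standard definitions, not metrics from the review itself.
from dataclasses import dataclass

@dataclass
class MonitoringSample:
    timestamp: float         # seconds since epoch
    responded: bool          # did the service answer the probe?
    response_time_ms: float  # probe latency, if it responded

def availability(samples: list[MonitoringSample]) -> float:
    """Fraction of probes the running service answered."""
    if not samples:
        return 0.0
    return sum(s.responded for s in samples) / len(samples)

def mean_response_time(samples: list[MonitoringSample]) -> float:
    """Mean latency over successful probes, in milliseconds."""
    ok = [s.response_time_ms for s in samples if s.responded]
    return sum(ok) / len(ok) if ok else float("inf")

# Synthetic probes: one failure every 50th probe, mild latency jitter.
probes = [MonitoringSample(t, t % 50 != 0, 120.0 + (t % 7)) for t in range(1, 301)]
print(f"availability:       {availability(probes):.3f}")
print(f"mean response time: {mean_response_time(probes):.1f} ms")
```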

    UNDERSTANDING USER PERCEPTIONS AND PREFERENCES FOR MASS-MARKET INFORMATION SYSTEMS – LEVERAGING MARKET RESEARCH TECHNIQUES AND EXAMPLES IN PRIVACY-AWARE DESIGN

    With cloud and mobile computing, a new category of software products emerges as mass-market information systems (IS) that address distributed and heterogeneous end-users. Understanding user requirements and the factors that drive user adoption is crucial for the successful design of such systems. IS research has suggested several theories and models to explain user adoption and intention to use, among them the IS Success Model and the Technology Acceptance Model (TAM). Although these approaches contribute to the theoretical understanding of the adoption and use of IS in mass markets, they are criticized for not being able to drive actionable insights on IS design, as they consider the IT artifact as a black box (i.e., they do not sufficiently address the system's internal characteristics). We argue that IS research needs to embrace market research techniques to understand and empirically assess user preferences and perceptions in order to integrate the "voice of the customer" in a mass-market scenario. More specifically, conjoint analysis (CA), from market research, can add user preference measurements for designing high-utility IS. CA has gained popularity in IS research; however, little guidance is provided for its application in the domain. We aim to support the design of mass-market IS by establishing a reliable understanding of consumers' preferences for multiple factors combining functional, non-functional, and economic aspects. The results include a "Framework for Conjoint Analysis Studies in IS" and methodological guidance for applying CA. We apply our findings to the privacy-aware design of mass-market IS and evaluate their implications on user adoption. We contribute to both academia and practice. For academia, we contribute a more nuanced conceptualization of the IT artifact (i.e., system) through a feature-oriented lens and a preference-based approach. We provide methodological guidelines that support researchers in studying user perceptions and preferences for design variations and extending that to adoption. Moreover, the empirical studies on privacy-aware design contribute to a better understanding of domain-specific applications of CA for IS design and evaluation, with a nuanced assessment of user preferences for privacy-preserving features. For practice, we propose guidelines for integrating the voice of the customer for successful IS design.
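To make the conjoint analysis approach concrete, the sketch below estimates part-worth utilities and relative attribute importance for a hypothetical mass-market IS with functional, privacy-related, and economic attributes, using the standard ratings-based, dummy-coded OLS formulation of CA. The attributes, levels, and ratings are invented for illustration and are not taken from the dissertation's studies.

```python
# Minimal ratings-based conjoint analysis sketch: part-worth utilities via
# dummy-coded OLS over a full-factorial design. All attributes, levels, and
# ratings are hypothetical.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

attributes = {
    "storage": ["5GB", "50GB"],           # functional
    "privacy": ["basic", "end_to_end"],   # non-functional (privacy-preserving)
    "price":   ["free", "5_eur_month"],   # economic
}

# Full-factorial design: every combination of attribute levels (a profile).
profiles = pd.DataFrame(
    list(itertools.product(*attributes.values())), columns=list(attributes.keys())
)

# Hypothetical preference ratings (roughly 1-10) from one respondent.
rng = np.random.default_rng(1)
profiles["rating"] = (
    4.0
    + 1.5 * (profiles["storage"] == "50GB")
    + 2.5 * (profiles["privacy"] == "end_to_end")
    + 2.0 * (profiles["price"] == "free")
    + rng.normal(0, 0.3, len(profiles))
)

# Part-worth utilities are the OLS coefficients of the dummy-coded levels.
fit = smf.ols("rating ~ C(storage) + C(privacy) + C(price)", data=profiles).fit()
print(fit.params)

# Relative attribute importance: range of part-worths, normalised across attributes.
ranges = {a: float(abs(fit.params.filter(like=a)).max()) for a in attributes}
total = sum(ranges.values())
for a, r in ranges.items():
    print(f"importance of {a}: {r / total:.1%}")
```

In this toy setup the privacy attribute dominates importance, which is the kind of preference signal the framework is meant to surface for privacy-aware design decisions.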

    Evaluation Theory for Characteristics of Cloud Identity Trust Framework

    Trust management is a prominent area of security in cloud computing because insufficient trust management hinders cloud growth. Trust management systems can help cloud users make the best decisions regarding security, privacy, Quality of Protection (QoP), and Quality of Service (QoS). A trust model acts as a security-strength evaluator and ranking service for cloud and cloud identity applications and services. It can be used as a benchmark to set up cloud identity service security and to identify inadequacies and potential enhancements in cloud infrastructure. This chapter addresses the concerns of evaluating cloud trust management systems, data gathering, and the synthesis of theory and data. The conclusion is that the relationship between cloud identity providers and cloud identity users can greatly benefit from the evaluation and critical review of current trust models.
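As a minimal, purely illustrative sketch of a trust model acting as a ranking service (not the evaluation theory developed in the chapter), the snippet below scores hypothetical cloud identity providers with a weighted sum over security, privacy, QoP, and QoS attributes; the provider names, weights, and scores are invented.

```python
# Toy weighted-sum trust scoring and ranking of hypothetical cloud identity
# providers over the dimensions mentioned above. Weights and scores are invented.
from typing import Dict

WEIGHTS: Dict[str, float] = {"security": 0.35, "privacy": 0.25,
                             "qop": 0.20, "qos": 0.20}

def trust_score(attribute_scores: Dict[str, float]) -> float:
    """Weighted sum of normalised (0-1) attribute scores."""
    return sum(WEIGHTS[k] * attribute_scores[k] for k in WEIGHTS)

providers = {
    "idp_a": {"security": 0.9, "privacy": 0.7, "qop": 0.8, "qos": 0.6},
    "idp_b": {"security": 0.6, "privacy": 0.9, "qop": 0.7, "qos": 0.9},
}

ranking = sorted(((n, trust_score(s)) for n, s in providers.items()),
                 key=lambda x: x[1], reverse=True)
for name, score in ranking:
    print(f"{name}: trust score {score:.2f}")
```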

    Trusted resource allocation in volunteer edge-cloud computing for scientific applications

    Data-intensive science applications in fields such as bioinformatics, health sciences, and materials discovery are becoming increasingly dynamic and demanding in their resource requirements. Researchers using these applications, which are based on advanced scientific workflows, frequently require a diverse set of resources that are often not available within private servers or a single Cloud Service Provider (CSP). For example, a user working with Precision Medicine applications would prefer only those CSPs who follow guidelines from HIPAA (Health Insurance Portability and Accountability Act) for implementing their data services and might want services from other CSPs for economic viability. As more and more data are generated, these workflows often require deployment and dynamic scaling of multi-cloud resources in an efficient and high-performance manner (e.g., quick setup, reduced computation time, and increased application throughput). At the same time, users seek to minimize the costs of configuring the related multi-cloud resources. While performance and cost are among the key factors in deciding upon CSP resource selection, scientific workflows often process proprietary/confidential data that introduces additional constraints on security postures. Thus, users have to make an informed decision on the selection of resources that are most suited for their applications while trading off between the key factors of resource selection, which are performance, agility, cost, and security (PACS). Furthermore, even with the most efficient resource allocation across multi-cloud, the cost to solution might not be economical for all users, which has led to the development of new computing paradigms such as volunteer computing, in which users utilize volunteered cyber resources to meet their computing requirements. For economical and readily available resources, it is essential that such volunteered resources integrate well with cloud resources to provide the most efficient computing infrastructure for users. This dissertation tackles the individual stages in the lifecycle of resource brokering for users, such as user requirement collection, users' resource preferences, resource brokering, and task scheduling. For collecting user requirements, a novel approach based on an iterative design interface is proposed. In addition, a fuzzy inference-based approach is proposed to capture users' biases and expertise to guide resource selection for their applications. The results showed an improvement in performance (i.e., time to execute) in 98 percent of the studied applications. The data collected on users' requirements and preferences are later used by the optimizer engine and machine learning algorithms for resource brokering. For resource brokering, a new integer-linear-programming-based solution (OnTimeURB) is proposed, which creates multi-cloud template solutions for resource allocation while also optimizing performance, agility, cost, and security. The solution was further improved by adding a machine learning model based on a naive Bayes classifier, which captures the true QoS of cloud resources to guide template solution creation. The proposed solution was able to improve the time to execute for as much as 96 percent of the largest applications. As discussed above, to meet the need for economical computing resources, a new computing paradigm, Volunteer Edge Computing (VEC), is proposed, which reduces cost and improves performance and security by creating edge clusters comprising volunteered computing resources close to users. Initial results have shown improved execution time for application workflows compared with state-of-the-art solutions while utilizing only the most secure VEC resources. Consequently, we have utilized reinforcement-learning-based solutions to characterize volunteered resources in terms of their availability and their flexibility towards implementing security policies. This characterization of volunteered resources facilitates efficient allocation of resources and scheduling of workflow tasks, which improves the performance and throughput of workflow executions. The VEC architecture is further validated with state-of-the-art bioinformatics and manufacturing workflows.
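The sketch below is a hedged illustration of the kind of integer-linear-programming brokering described above, not the actual OnTimeURB formulation: it chooses one cloud offering per workflow task to minimise cost, subject to a deadline and a minimum security score (e.g., admitting only HIPAA-aligned offerings for sensitive data). The task names, offerings, runtimes, prices, and security scores are invented.

```python
# Toy ILP resource broker in the spirit of the PACS trade-off (performance,
# agility, cost, security). Uses PuLP's bundled CBC solver. All data invented.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, LpStatus

tasks = ["align", "variant_call", "annotate"]
offerings = {            # (runtime hours, cost $/run, security score 0-1)
    "csp1_small":  (4.0, 2.0, 0.9),
    "csp1_large":  (1.5, 6.0, 0.9),
    "csp2_medium": (2.5, 3.5, 0.6),
}
DEADLINE_H = 8.0
MIN_SECURITY = 0.7   # e.g. only offerings meeting the required compliance level

prob = LpProblem("resource_brokering", LpMinimize)
x = {(t, o): LpVariable(f"x_{t}_{o}", cat=LpBinary)
     for t in tasks for o in offerings}

# Objective: total monetary cost of the selected offerings.
prob += lpSum(offerings[o][1] * x[t, o] for t in tasks for o in offerings)

for t in tasks:
    # Exactly one offering per task.
    prob += lpSum(x[t, o] for o in offerings) == 1
    # Security constraint: disallow offerings below the required score.
    for o in offerings:
        if offerings[o][2] < MIN_SECURITY:
            prob += x[t, o] == 0

# Deadline on the (sequential) workflow makespan.
prob += lpSum(offerings[o][0] * x[t, o]
              for t in tasks for o in offerings) <= DEADLINE_H

prob.solve()
print(LpStatus[prob.status])
for t in tasks:
    chosen = next(o for o in offerings if x[t, o].value() > 0.5)
    print(f"{t}: {chosen}")
```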

    Heating controls: International evidence base and policy experiences

    This report presents a synthesis, in the form of narrative summaries, of the international evidence base and policy experiences on heating controls in the domestic sector. The research builds on the (systematic) scoping review of the UK evidence on heating controls commissioned by the former Department of Energy and Climate Change (DECC) and published in 2016 (Lomas et al., 2016), and on the Rapid Evidence Assessment of smarter heating controls published in 2014 (Munton et al., 2014). The report consists of two parts. Part 1 involves a (systematic) scoping review of the international evidence base on the energy savings, cost-effectiveness, and usability of heating controls in the domestic sector. Part 2 contains the findings from an analysis of the policy experiences of other countries.

    Accountability Requirements in the Cloud Provider Chain

    In order to be responsible stewards of other people’s data, cloud providers must be accountable for their data handling practices. The potentially long provider chains in cloud computing introduce additional accountability challenges, with many stakeholders involved. Symmetry is very important in any requirements elicitation activity, since input from diverse stakeholders needs to be balanced. This article ventures to answer the question “How can one create an accountable cloud service?” by examining the requirements that must be fulfilled to achieve an accountability-based approach, based on interaction with over 300 stakeholders.

    Internet Predictions

    More than a dozen leading experts give their opinions on where the Internet is headed and where it will be in the next decade in terms of technology, policy, and applications. They cover topics ranging from the Internet of Things to climate change to the digital storage of the future. A summary of the articles is available in the Web extras section.