
    A Case Study of Edge Computing Implementations: Multi-access Edge Computing, Fog Computing and Cloudlet

    With the explosive growth of intelligent and mobile devices, the centralized cloud computing paradigm is facing difficult challenges. Since the primary requirements have shifted towards real-time response, context awareness, and mobility support, there is an urgent need to bring the resources and functions of centralized clouds to the edge of networks, which has led to the emergence of the edge computing paradigm. Edge computing increases the responsibilities of network edges by hosting computation and services, thereby enhancing performance and improving quality of experience (QoE). Fog computing, multi-access edge computing (MEC), and cloudlet are three typical and promising implementations of edge computing. Fog computing aims to build a system that enables cloud-to-thing service connectivity and works in concert with clouds; MEC is seen as a key technology of the fifth-generation (5G) system; and a cloudlet is a micro data center deployed in close proximity to its users. In terms of deployment scenarios, fog computing focuses on the Internet of Things (IoT), MEC mainly provides mobile RAN application solutions for 5G systems, and cloudlets offload computing power at the network edge. In this paper, we present a comprehensive case study of these three edge computing implementations, covering their architectures, differences, and respective application scenarios in IoT, 5G wireless systems, and the smart edge. We discuss the requirements, benefits, and mechanisms of typical co-deployment cases for each paradigm and identify challenges and future directions in edge computing.
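    The core latency argument behind edge offloading can be sketched numerically. The sketch below is illustrative only: the round-trip times and compute speeds are hypothetical, and a task is simply placed on whichever tier minimizes total response time (network round trip plus compute time).

    ```python
    # Hedged sketch: all numbers are hypothetical, not from the paper.
    # A task is offloaded to whichever tier (edge cloudlet/MEC vs. central
    # cloud) minimizes response time = network RTT + compute time.

    def response_time(rtt_ms, cycles, cps):
        """Total latency in ms: round trip plus cycles / (cycles per second)."""
        return rtt_ms + cycles / cps * 1000

    tiers = {
        # Edge: very low RTT, modest compute capacity.
        "edge (cloudlet/MEC)": response_time(rtt_ms=5, cycles=2e9, cps=2e10),
        # Cloud: higher RTT, more compute capacity.
        "central cloud": response_time(rtt_ms=80, cycles=2e9, cps=4e10),
    }
    best = min(tiers, key=tiers.get)
    print(best, round(tiers[best], 1))  # edge (cloudlet/MEC) 105.0
    ```

    With these assumed numbers the edge tier wins (105 ms vs. 130 ms) despite its slower processors, because proximity removes most of the network delay; for heavier tasks the balance can tip back to the cloud.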

    Analyzing Dynamic Capabilities in the Context of Cloud Platform Ecosystems - A Case Study Approach

    Dynamic capabilities (DCs) refer to a firm’s ability to continuously adapt its resource base in response to changes in its external environment. The capability to change dynamically is crucial in business ecosystems composed of a variety of actors. Amazon Web Services (AWS), the leader in the cloud platform industry, is a promising cloud platform provider (CPP) for demonstrating a high degree of dynamic capability fulfillment within its highly fluctuating ecosystem. To date, the full scope of dynamic capabilities in cloud platform ecosystems (CPEs) has not been fully understood. Previous work has failed to deliver a combined perspective on explicit dynamic capabilities in cloud platform ecosystems applied to an in-depth practical case. With our mixed-methods case study of the AWS ecosystem, we deliver a thorough understanding of its sensing, seizing, and transforming capabilities. We generate a set of strategy management frameworks that support our expectations, lead to unexpected insights, and answer the questions of what, how, why, and with whom AWS uses DCs. In detail, we provide an understanding of DC chronological change, DC network patterns, and DC logical explanations. Our research is based on a self-compiled case study database containing 16k+ pages of secondary data from interviews, blogs, announcements, case studies, job vacancies, etc., which we analyze qualitatively and quantitatively. We find that AWS develops and holds a large set of interacting dynamic capabilities incorporating a variety of ecosystem actors in order to sustain tremendous customer value and satisfaction. The thesis yields significant theoretical and practical implications for all CPE actors, such as partners, customers, investors, and researchers in the field of IT strategy management. Managers of all CPE actors are encouraged to critically evaluate their own maturity level and complement a CPP’s DC explications in order to boost business by implementing sensing, seizing, transforming, and innovating capabilities. Keywords: Dynamic Capabilities, Cloud Platform Ecosystems, Innovation Capabilities, Mixed-Methods Case Study, Amazon Web Services

    Deep learning for internet of underwater things and ocean data analytics

    The Internet of Underwater Things (IoUT) is an emerging technological ecosystem for connecting objects in maritime and underwater environments. IoUT technologies are powered by a vast number of deployed sensors and actuators. In this thesis, multiple IoUT sensory data streams are augmented with machine intelligence for forecasting purposes.
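    The common first step in forecasting over sensory streams like these is to frame a univariate series as a supervised learning problem via a sliding window. The sketch below shows only that framing, with hypothetical readings and a trivial window-mean baseline in place of the deep models the thesis uses:

    ```python
    # Hedged sketch: hypothetical sensor readings and window size; the
    # "model" is a window-mean baseline, not the thesis's deep learner.
    # The sliding-window framing, however, is the standard preprocessing
    # step for any learned forecaster.

    def make_windows(series, lag):
        """Turn a 1-D series into (features, target) pairs of length `lag`."""
        X, y = [], []
        for i in range(len(series) - lag):
            X.append(series[i:i + lag])
            y.append(series[i + lag])
        return X, y

    def window_mean_forecaster():
        """Toy baseline: predict the mean of the most recent window."""
        return lambda window: sum(window) / len(window)

    series = [20.0, 20.5, 21.0, 21.4, 21.9, 22.3, 22.8]  # e.g. water temperature
    X, y = make_windows(series, lag=3)   # 4 training pairs
    model = window_mean_forecaster()
    print(round(model(series[-3:]), 2))  # one-step-ahead forecast: 22.33
    ```

    Swapping the baseline for an LSTM or CNN leaves `make_windows` unchanged, which is why the windowing step is worth isolating.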

    Stochastic Energy Efficient Cloud Service Provisioning Deploying Renewable Energy Sources


    Dynamic Capabilities in Cybersecurity Intelligence: A Meta-Synthesis to Enhance Protection Against Cyber Threats

    Advanced cybersecurity threats with automated capabilities are on the rise in industries such as finance, healthcare, technology, retail, telecoms, and transportation, as well as government. Analyzing cybersecurity-related resources and capabilities is necessary to build cybersecurity intelligence (CI). The purpose of this paper is to propose a dynamic capabilities in cybersecurity intelligence (DCCI) model, grounded in the existing literature, that helps firms reduce the risk of cyber violations and advances the development of systems across the firm life cycle. Through a meta-synthesis combining abductive and inductive approaches across eight methodological steps, we analyzed forty-seven case studies for the presence of the cybersecurity capabilities used to build CI. Using theoretical and practical information security maturity models as a foundation, we show how capability building improves the predictability of cyber incidents. The results evidence four second-order dimensions for building CI, named doing, enabling, improving, and managing cybersecurity, and eight first-order outcomes that represent the DCCI model. This research makes an unprecedented contribution to international and national scenarios, as it will allow firms to innovate their resource management processes and abilities, enable better cybersecurity projects, and reduce the impact of potential cyberattacks, with the prospect of eradicating vulnerabilities.

    Big Data Testing Techniques: Taxonomy, Challenges and Future Trends

    Big Data is reforming many industrial domains by providing decision support through the analysis of large data volumes. Big Data testing aims to ensure that Big Data systems run smoothly and error-free while maintaining performance and data quality. However, because of the diversity and complexity of the data, testing Big Data is challenging. Though numerous research efforts deal with Big Data testing, a comprehensive review addressing the testing techniques and challenges of Big Data has not been available until now. We have therefore systematically reviewed the evidence on Big Data testing techniques published in the period 2010-2021. This paper discusses the testing of data processing by highlighting the techniques used in every processing phase. Furthermore, we discuss the challenges and future directions. Our findings show that diverse functional, non-functional, and combined (functional and non-functional) testing techniques have been used to solve specific problems related to Big Data. At the same time, most of the testing challenges have been faced during the MapReduce validation phase. In addition, combinatorial testing is one of the most frequently applied techniques, often combined with others (i.e., random testing, mutation testing, input space partitioning, and equivalence testing), to find various functional faults through Big Data testing.
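    The combinatorial (pairwise) testing technique the review highlights can be illustrated with a toy example. The parameter names and values below are hypothetical Big Data job settings, not from the paper, and the small pairwise suite is hand-picked rather than produced by a real covering-array generator; the point is only that 2-way coverage needs far fewer tests than the exhaustive cross product:

    ```python
    # Hedged sketch of 2-way (pairwise) combinatorial testing.
    # Parameters/values are made-up job settings; a real tool would
    # generate the suite, here we just verify pair coverage.
    from itertools import combinations, product

    params = {
        "format":   ["csv", "parquet"],
        "engine":   ["mapreduce", "spark"],
        "compress": ["none", "gzip"],
    }

    def required_pairs(params):
        """Every value pair across every pair of parameters (the 2-way goal)."""
        req = set()
        for (p1, v1s), (p2, v2s) in combinations(params.items(), 2):
            for v1, v2 in product(v1s, v2s):
                req.add(((p1, v1), (p2, v2)))
        return req

    def covered_pairs(suite, params):
        """All value pairs exercised by a list of test configurations."""
        cov = set()
        for test in suite:
            for p1, p2 in combinations(params, 2):
                cov.add(((p1, test[p1]), (p2, test[p2])))
        return cov

    # Exhaustive suite: 2 * 2 * 2 = 8 configurations.
    exhaustive = [dict(zip(params, vs)) for vs in product(*params.values())]

    # Hand-picked pairwise suite: 4 tests still cover all 12 required pairs.
    pairwise_suite = [
        {"format": "csv",     "engine": "mapreduce", "compress": "none"},
        {"format": "csv",     "engine": "spark",     "compress": "gzip"},
        {"format": "parquet", "engine": "mapreduce", "compress": "gzip"},
        {"format": "parquet", "engine": "spark",     "compress": "none"},
    ]
    assert required_pairs(params) <= covered_pairs(pairwise_suite, params)
    print(len(exhaustive), len(pairwise_suite))  # 8 4
    ```

    The savings grow quickly with parameter count, which is why pairwise testing pairs well with the other techniques the review lists (random and mutation testing) for large Big Data configuration spaces.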