
    Next-Generation SDN and Fog Computing: A New Paradigm for SDN-Based Edge Computing

    In recent years, terms such as Mobile Edge Computing, Cloudlets, and Fog Computing have gained popularity as ways of expressing computing pushed towards the network Edge. Shifting some processing tasks from the Cloud to the Edge raises challenges that may not have been considered before in next-generation Software-Defined Networking (SDN). Efficient routing mechanisms, Edge Computing, and SDN applications are challenging to deploy because controllers are expected to be distributed in different ways. In particular, the advances of SDN and the P4 language create new opportunities and challenges for next-generation SDN in Fog computing. The development of new pipelines, along with progress on control-to-data plane programming protocols, can also promote data and control plane function offloading. We propose a new mechanism for deploying SDN control planes both locally and remotely to address different challenges, and we encourage researchers to develop new ways of functionally deploying Fog and Cloud control planes that let cross-layer planes interact through specific control and data plane applications. With our proposal, the control and data plane distribution can provide a lower response time for locally deployed applications (local control plane), while remaining beneficial for a centralized, remotely placed control plane for applications such as path computation within the same network and between separate networks (remote control plane).
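
    As an illustration of the proposed split, the following minimal Python sketch dispatches latency-sensitive events to a locally deployed control plane and delegates inter-network path computation to a remote, centralized one. All controller names, event types, and latencies below are assumptions for illustration, not the authors' implementation.

        import time

        class Controller:
            def __init__(self, name, rtt_ms):
                self.name = name
                self.rtt_ms = rtt_ms  # assumed control-channel round-trip time

            def handle(self, event):
                time.sleep(self.rtt_ms / 1000.0)  # simulate control-channel delay
                return f"{self.name} handled {event}"

        # Events that benefit from a low response time stay at the Edge.
        LOCAL_EVENTS = {"mac_learning", "arp_reply", "local_reroute"}

        def dispatch(event, local_cp, remote_cp):
            """Send an event to the local or the remote control plane."""
            if event in LOCAL_EVENTS:
                return local_cp.handle(event)   # local control plane: low latency
            return remote_cp.handle(event)      # remote control plane: global view

        local_cp = Controller("fog-controller", rtt_ms=1)
        remote_cp = Controller("cloud-controller", rtt_ms=40)
        print(dispatch("local_reroute", local_cp, remote_cp))
        print(dispatch("inter_domain_path", local_cp, remote_cp))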

    Gravitational Wave Data Analysis: Computing Challenges in the 3G Era

    Cyberinfrastructure will be a critical consideration in the development of next-generation gravitational-wave detectors. The demand for data analysis computing in the 3G era will be driven by the high number of detections as well as the expanded search parameter space for compact astrophysical objects and the subsequent parameter estimation follow-up required to extract the nature of the sources. Additionally, there will be an increased need to develop appropriate and scalable computing cyberinfrastructure, including data access and transfer protocols, and storage and management of software tools, that have sustainable development, support, and management processes. This report identifies the major challenges and opportunities facing 3G gravitational-wave observatories and presents recommendations for addressing them. It is the fourth in a six-part series of reports by the GWIC 3G Subcommittee: i) Expanding the Reach of Gravitational Wave Observatories to the Edge of the Universe, ii) The Next Generation Global Gravitational Wave Observatory: The Science Book, iii) 3G R&D: R&D for the Next Generation of Ground-based Gravitational Wave Detectors, iv) Gravitational Wave Data Analysis: Computing Challenges in the 3G Era (this report), v) Future Ground-based Gravitational-wave Observatories: Synergies with Other Scientific Communities, and vi) An Exploration of Possible Governance Models for the Future Global Gravitational-Wave Observatory Network.

    Extending Ambient Intelligence to the Internet of Things: New Challenges for QoC Management

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing solutions. At a time when the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms bring together new opportunities for more complex context computation, the next generation of Multiscale Distributed Context Managers (MDCM) is facing new challenges concerning QoC management. This paper presents how our QoCIM framework can help application developers manage the whole QoC life-cycle by providing genericity, openness, and uniformity. Its usage is illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.
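
    To make the QoC idea concrete, here is a small Python sketch of attaching QoC indicators (e.g. accuracy and freshness) to a context observation. QoCIM's real API is not described in the abstract, so every class, field, and value below is a hypothetical stand-in.

        import time
        from dataclasses import dataclass, field

        @dataclass
        class QoCIndicator:
            name: str      # e.g. "accuracy" or "freshness"
            value: float
            unit: str

        @dataclass
        class ContextObservation:
            source: str
            quantity: str
            value: float
            timestamp: float = field(default_factory=time.time)
            qoc: list = field(default_factory=list)

        obs = ContextObservation(source="air-sensor-42", quantity="NO2_ugm3", value=61.0)
        obs.qoc.append(QoCIndicator("accuracy", 0.9, "probability"))
        obs.qoc.append(QoCIndicator("freshness", time.time() - obs.timestamp, "s"))
        # A QoC-aware consumer can filter out observations whose quality is too low.
        usable = all(i.value >= 0.8 for i in obs.qoc if i.name == "accuracy")
        print(usable)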

    Accelerator Virtualization in Fog Computing: Moving from the Cloud to the Edge

    Hardware accelerators are available on the cloud for enhanced analytics. Next-generation clouds aim to bring enhanced analytics using accelerators closer to user devices at the edge of the network, improving quality of service (QoS) by minimizing end-to-end latencies and response times. The collective computing model that utilizes resources across the cloud-edge continuum in a multi-tier hierarchy comprising the cloud, edge, and user devices is referred to as fog computing. This article identifies challenges and opportunities in making accelerators accessible at the edge. A holistic view of the fog architecture is key to pursuing meaningful research in this area.
    Varghese, B.; Reaño González, C.; Silla Jiménez, F. (2018). Accelerator Virtualization in Fog Computing: Moving from the Cloud to the Edge. IEEE Cloud Computing, 5(6), 28-37. https://doi.org/10.1109/MCC.2018.064181118
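
    The placement question the article raises (edge accelerator vs. cloud accelerator) can be sketched as a simple latency model; the tiers, round-trip times, and execution times below are illustrative assumptions, not measurements from the paper.

        # site: (network_rtt_ms, exec_ms on that tier's accelerator) -- assumed values
        ACCELERATORS = {
            "edge":  (5.0, 30.0),   # nearby, but typically a less powerful accelerator
            "cloud": (60.0, 8.0),   # distant, but faster hardware
        }

        def place(task_scale=1.0):
            """Pick the tier minimizing estimated end-to-end latency (RTT + execution)."""
            def total(site):
                rtt_ms, exec_ms = ACCELERATORS[site]
                return rtt_ms + exec_ms * task_scale
            best = min(ACCELERATORS, key=total)
            return best, {site: total(site) for site in ACCELERATORS}

        print(place(task_scale=1.0))    # small task: the edge accelerator wins
        print(place(task_scale=10.0))   # heavy task: the faster cloud accelerator wins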

    The Next Generation Cloud technologies: A Review On Distributed Cloud, Fog And Edge Computing and Their Opportunities and Challenges

    Cloud computing is a defining technology of the 21st century, with applications in nearly every industry. As a relatively young technology, it still has shortcomings, and there are ongoing efforts to address them; next-generation cloud technologies are expected to overcome them. This research examines three such technologies: distributed cloud, fog computing, and edge computing. The distributed cloud improves worldwide service communications while also allowing for more responsive communications within individual regions, and cloud providers use the distributed approach to achieve lower latency and greater efficiency for cloud services. We find that the distributed cloud offers opportunities such as improved security, IoT implementations, faster content delivery, and cost efficiency. However, it poses challenges such as exposing data to attackers when it is transferred over public networks. Fog computing, according to the findings, reduces latency, lowers operational costs, and increases security; one of its most difficult aspects, however, is its substantial reliance on data transit. Edge computing offers opportunities in areas including network optimization, healthcare, and transportation. The popularity of these next-generation cloud technologies has been strongly driven by the growth of the Internet of Things and the unanticipated surge in data created by IoT-connected devices. Obstacles can be gradually overcome because the benefits of next-generation cloud technologies enable solutions that meet a wide range of contemporary business requirements; adoption might nevertheless take some time as businesses weigh the benefits and drawbacks, and the transition may be slow.

    A manifesto for future generation cloud computing: research directions for the next decade

    The Cloud computing paradigm has revolutionised the computer science horizon during the past decade and has enabled the emergence of computing as the fifth utility. It has captured significant attention from academia, industry, and government bodies, and has emerged as the backbone of the modern economy by offering subscription-based services anytime, anywhere, following a pay-as-you-go model. This has instigated (1) shorter establishment times for start-ups, (2) creation of scalable global enterprise applications, (3) better cost-to-value associativity for scientific and high performance computing applications, and (4) different invocation/execution models for pervasive and ubiquitous applications. Recent technological developments and paradigms such as serverless computing, software-defined networking, the Internet of Things, and processing at the network edge are creating new opportunities for Cloud computing. However, they are also posing several new challenges and creating the need for new approaches and research strategies, as well as the re-evaluation of the models that were developed to address issues such as scalability, elasticity, reliability, security, sustainability, and application models. The proposed manifesto addresses them by identifying the major open challenges in Cloud computing, emerging trends, and impact areas. It then offers research directions for the next decade, thus helping in the realisation of Future Generation Cloud Computing.

    Big Data Proteogenomics and High Performance Computing: Challenges and Opportunities

    Proteogenomics is an emerging field of systems biology research at the intersection of proteomics and genomics. Two high-throughput technologies, Mass Spectrometry (MS) for proteomics and Next Generation Sequencing (NGS) machines for genomics, are required to conduct proteogenomics studies. Independently, both MS and NGS technologies are afflicted by a data deluge that creates problems of storage, transfer, analysis, and visualization. Integrating these big data sets (NGS+MS) for proteogenomics studies compounds all of the associated computational problems. Existing sequential algorithms for analyzing these proteogenomics datasets are inadequate for big data, and high-performance computing (HPC) solutions are almost non-existent. The purpose of this paper is to introduce the big data problem of proteogenomics and the associated challenges in analyzing, storing, and transferring these data sets. Further, opportunities for the high-performance computing research community are identified and possible future directions are discussed.
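
    As a hedged illustration of the data-parallel decomposition such HPC solutions would need, the Python sketch below scores spectra against a peptide database across worker processes instead of sequentially. The database, spectra, and scoring function are trivial placeholders, not a real peptide-spectrum match algorithm.

        from multiprocessing import Pool

        PEPTIDES = ["PEPTIDE", "PROTEIN", "SEQUENCE", "GENOMICS"]  # toy "database"

        def score_spectrum(spectrum):
            """Return the best-matching peptide and a placeholder match score."""
            best = max(PEPTIDES, key=lambda p: sum(frag in p for frag in spectrum))
            return best, sum(frag in best for frag in spectrum)

        if __name__ == "__main__":
            spectra = [["PE", "TI"], ["GEN", "ICS"], ["SEQ", "CE"]]  # toy spectra
            with Pool() as pool:                  # one worker process per core
                results = pool.map(score_spectrum, spectra)
            print(results)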
