
    Investigating the Impacts of Cloud Computing on Firm Profitability

    The advent of cloud computing has been a technical revolution, transforming how organizations access, store, and process information. This research proposes that cloud deployment can affect profitability in multiple ways. We argue that one of the most significant is cost reduction: eliminating the need for businesses to invest in and maintain their own IT infrastructure, making it easier to scale resources up or down as needed, improving agility, and providing advanced security features and tools. Cloud deployment can also increase profitability through greater scalability, improved collaboration, access to new technologies such as machine learning and big data, and an improved customer experience through faster, more reliable service. By adopting the cloud, businesses can also increase revenue and improve overall operational efficiency and productivity. Using a dataset of 115 firms, this research investigated the impact of various cloud-use metrics on firm profitability. The results indicate that firms' gross profit margins increase when services delivered via the cloud, cloud spending, cloud governance, and the number of cloud-based applications increase in a more concentrated market with less competition. To amplify the positive impact of cloud computing on a business organization, it is important to develop a clear and comprehensive cloud strategy, establish robust security and compliance policies, invest in the resources and expertise needed for a successful cloud migration, and continuously monitor and measure the performance and effectiveness of cloud solutions. This helps organizations make informed decisions, align their cloud investments with overall business goals and objectives, mitigate security and compliance risks, ensure a successful cloud migration, and continuously optimize their cloud solutions for maximum value. By taking this holistic approach, businesses can ensure that they get the most value out of their cloud investments and achieve optimal results.
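The moderated relationship the abstract reports, cloud-use metrics raising gross margin more strongly in concentrated markets, can be sketched as a least-squares regression with an interaction term. Everything below is illustrative: the variable names, the synthetic data, and the coefficient values are assumptions for demonstration, not the paper's actual dataset or model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 115  # the paper's sample size

# Hypothetical firm-level variables (synthetic; the paper's real metrics differ)
cloud_spend = rng.uniform(0, 10, n)      # cloud spending (e.g. in $M)
n_apps = rng.integers(1, 50, n)          # number of cloud-based applications
hhi = rng.uniform(0.1, 0.9, n)           # market concentration (higher = less competition)
noise = rng.normal(0, 1, n)

# Simulated outcome: cloud use helps more in concentrated markets
gross_margin = 5 + 0.3 * cloud_spend + 0.05 * n_apps + 2.0 * cloud_spend * hhi + noise

# Ordinary least squares with an interaction term
X = np.column_stack([np.ones(n), cloud_spend, n_apps, hhi, cloud_spend * hhi])
beta, *_ = np.linalg.lstsq(X, gross_margin, rcond=None)

# A positive interaction coefficient means cloud spending's effect on margin
# grows with market concentration, mirroring the reported finding.
print(dict(zip(["const", "spend", "apps", "hhi", "spend_x_hhi"], beta.round(2))))
```

With enough firms, the fitted interaction coefficient recovers the simulated moderation effect; in the paper's setting the analogous coefficient is what links cloud investment to profitability under low competition.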

    Big Data and the Internet of Things

    Advances in sensing and computing capabilities are making it possible to embed increasing computing power in small devices. This has enabled sensing devices not just to passively capture data at very high resolution but also to take sophisticated actions in response. Combined with advances in communication, this is resulting in an ecosystem of highly interconnected devices referred to as the Internet of Things (IoT). In conjunction, advances in machine learning have allowed models to be built on these ever-increasing amounts of data. Consequently, devices all the way from heavy assets such as aircraft engines to wearables such as health monitors can now not only generate massive amounts of data but also draw on aggregate analytics to "improve" their performance over time. Big data analytics has been identified as a key enabler for the IoT. In this chapter, we discuss various avenues of the IoT where big data analytics either is already making a significant impact or is on the cusp of doing so. We also discuss social implications and areas of concern. Comment: 33 pages; draft of an upcoming book chapter in Japkowicz and Stefanowski (eds.), Big Data Analysis: New Algorithms for a New Society, Springer Series on Studies in Big Data, to appear.
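The feedback loop the abstract describes, where devices both generate data and draw on fleet-wide aggregate analytics to improve over time, can be sketched roughly as follows. The class and method names are hypothetical; a real IoT deployment would add networking, persistence, and far richer analytics than a mean-and-deviation recalibration.

```python
import statistics

class SensorDevice:
    """Toy edge device: senses locally, acts on a threshold, accepts updates."""

    def __init__(self, device_id, threshold=50.0):
        self.device_id = device_id
        self.threshold = threshold  # local alerting threshold
        self.readings = []

    def sense(self, value):
        # Capture data and take a "sophisticated action" (here, alerting)
        self.readings.append(value)
        return value > self.threshold

    def apply_update(self, fleet_mean, fleet_stdev):
        # Draw on aggregate analytics: recalibrate against fleet statistics
        self.threshold = fleet_mean + 2 * fleet_stdev

def aggregate(devices):
    """Backend step: pool readings across the fleet and summarise them."""
    all_readings = [r for d in devices for r in d.readings]
    return statistics.mean(all_readings), statistics.stdev(all_readings)

# Three devices capture slightly offset readings...
devices = [SensorDevice(i) for i in range(3)]
for d in devices:
    for v in (40 + d.device_id, 42 + d.device_id, 44 + d.device_id):
        d.sense(v)

# ...and each one recalibrates from the fleet-wide aggregate.
mean, stdev = aggregate(devices)
for d in devices:
    d.apply_update(mean, stdev)
```

The essential pattern is the round trip: high-resolution local capture, centralised aggregation, and an analytics-driven update pushed back to every device.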

    The ESCAPE project : Energy-efficient Scalable Algorithms for Weather Prediction at Exascale

    In the simulation of complex multi-scale flows arising in weather and climate modelling, one of the biggest challenges is to satisfy strict service requirements in terms of time to solution and to satisfy budgetary constraints in terms of energy to solution, without compromising the accuracy and stability of the application. These simulations require algorithms that minimise the energy footprint along with the time required to produce a solution, maintain the physically required level of accuracy, are numerically stable, and are resilient in case of hardware failure. The European Centre for Medium-Range Weather Forecasts (ECMWF) led the ESCAPE (Energy-efficient Scalable Algorithms for Weather Prediction at Exascale) project, funded by Horizon 2020 (H2020) under the FET-HPC (Future and Emerging Technologies in High Performance Computing) initiative. The goal of ESCAPE was to develop a sustainable strategy to evolve weather and climate prediction models to next-generation computing technologies. The project partners combine the expertise of leading European regional forecasting consortia, university research, experienced high-performance computing centres, and hardware vendors. This paper presents an overview of the ESCAPE strategy: (i) identify domain-specific key algorithmic motifs in weather prediction and climate models (which we term Weather & Climate Dwarfs), (ii) categorise them in terms of computational and communication patterns, (iii) adapt them to different hardware architectures with alternative programming models, (iv) analyse the challenges in optimising them, and (v) find alternative algorithms for the same scheme.
The participating weather prediction models are the following: IFS (Integrated Forecasting System); ALARO, a combination of AROME (Application de la Recherche a l'Operationnel a Meso-Echelle) and ALADIN (Aire Limitee Adaptation Dynamique Developpement International); and COSMO-EULAG, a combination of COSMO (Consortium for Small-scale Modeling) and EULAG (Eulerian and semi-Lagrangian fluid solver). For many of the weather and climate dwarfs, ESCAPE provides prototype implementations on different hardware architectures (mainly Intel Skylake CPUs, NVIDIA GPUs, Intel Xeon Phi, and the Optalysys optical processor) with different programming models. The spectral transform dwarf represents a detailed example of the co-design cycle of an ESCAPE dwarf. The dwarf concept has proven to be extremely useful for the rapid prototyping of alternative algorithms and their interaction with hardware, e.g. through the use of a domain-specific language (DSL). Manual adaptations have led to substantial accelerations of key algorithms in numerical weather prediction (NWP) but are not a general recipe for the performance portability of complex NWP models. Existing DSLs are found to require further evolution but are promising tools for achieving the latter. Measurements of energy and time to solution suggest that a future focus needs to be on exploiting the simultaneous use of all available resources in hybrid CPU-GPU arrangements.
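As a rough analogue of the computational pattern inside a spectral transform dwarf, the sketch below differentiates a periodic field by transforming to spectral space, scaling by the wavenumber, and transforming back. Operational models such as the IFS use spherical-harmonic transforms on the sphere; this 1-D Fourier version is an assumption-laden toy meant only to show the pattern (forward transform, pointwise scale, inverse transform) that the project's co-design work optimises across CPUs, GPUs, and other accelerators.

```python
import numpy as np

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
field = np.sin(3 * x)                        # grid-point field on a periodic domain

# Integer wavenumbers in FFT ordering, multiplied by i for differentiation
ik = 1j * np.fft.fftfreq(n, d=1.0 / n)

spectral = np.fft.fft(field)                 # forward (grid -> spectral) transform
derivative = np.fft.ifft(ik * spectral).real # pointwise scale, then inverse transform

# For a smooth periodic field the result matches the analytic derivative
# 3*cos(3x) to near machine precision.
max_error = np.max(np.abs(derivative - 3 * np.cos(3 * x)))
```

The performance-critical parts are the transforms themselves and the data movement between grid-point and spectral representations, which is why this motif is a natural candidate for architecture-specific prototyping.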

    Energy-efficient through-life smart design, manufacturing and operation of ships in an industry 4.0 environment

    Energy efficiency is an important factor in the marine industry, helping to reduce manufacturing and operational costs as well as the impact on the environment. In the face of global competition and pressure for cost-effectiveness, ship builders and operators today require a major overhaul of the entire ship design, manufacturing, and operation process to achieve these goals. This paper highlights smart design, manufacturing, and operation as the way forward in an industry 4.0 (i4) era, from designing for better energy efficiency to more intelligent ships and smart through-life operation. The paper (i) draws parallels between ship design, manufacturing, and operation processes, (ii) identifies key challenges facing such temporal (lifecycle) as opposed to spatial (mass) products, (iii) proposes a closed-loop ship lifecycle framework, and (iv) outlines potential future directions in the smart design, manufacturing, and operation of ships in an industry 4.0 value chain, so as to achieve more energy-efficient vessels. Through computational intelligence and cyber-physical integration, we envision that industry 4.0 can revolutionise ship design, manufacturing, and operations in a smart product through-life process in the near future.

    CERN openlab Whitepaper on Future IT Challenges in Scientific Research

    This whitepaper describes the major IT challenges in scientific research at CERN and several other European and international research laboratories and projects. Each challenge is exemplified through a set of concrete use cases drawn from the requirements of large-scale scientific programs. The paper is based on contributions from many researchers and IT experts of the participating laboratories, as well as input from the existing CERN openlab industrial sponsors. The views expressed in this document are those of the individual contributors and do not necessarily reflect the views of their organisations and/or affiliates.