
    Collaborative simulation and scientific big data analysis: Illustration for sustainability in natural hazards management and chemical process engineering

    Classical approaches for remote visualization and collaboration used in Computer-Aided Design and Engineering (CAD/E) applications are no longer appropriate due to the increasing amount of data generated, especially over standard networks. We introduce a lightweight computing platform for scientific simulation, collaboration in engineering, 3D visualization and big data management. This ICT-based platform provides scientists with an “easy-to-integrate” generic tool, enabling worldwide collaboration and remote processing for any kind of data. Its service-oriented architecture is based on the cloud computing paradigm and relies on standard internet technologies to remain efficient over a wide range of networks and clients. In this paper, we discuss the need for innovation in (i) pre- and post-processing visualization services, (ii) scalable compression and transmission methods for large 3D scientific data sets, (iii) collaborative virtual environments, and (iv) collaboration across multiple CAD/E domains. We propose our open platform for collaborative simulation and scientific big data analysis. The platform is now available as an open project with all core components licensed under LGPL V2.1. We provide two examples of the platform's use in CAD/E for sustainability engineering: one academic application and one industrial case study. First, we consider chemical process engineering, showing the development of a domain-specific service. With the rise of global warming concerns and the growing importance of sustainable development, chemical process engineering has become increasingly environmentally oriented: the chemical engineer must now take into account not only the engineering and economic criteria of a process, but also its environmental and social performance. Second, an example from natural hazards management illustrates the efficiency of our approach for remote collaboration involving big data exchange and analysis between distant locations. Finally, we underline the platform's benefits and open it to future activities in innovation techniques and inventive design.
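    To make the service-oriented idea concrete, the sketch below shows how a thin client might talk to such a remote simulation and visualization service over standard web protocols. It is a minimal illustration only, assuming a generic REST-style interface: the base URL, endpoint paths, parameters, and JSON fields are hypothetical and are not taken from the platform described above.

```python
# Minimal sketch of a thin client for a remote simulation/visualization
# service over plain HTTP. All endpoints and field names are hypothetical;
# this is not the actual API of the platform described in the abstract.
import requests

BASE_URL = "https://example.org/api"  # placeholder service URL

def upload_dataset(path: str) -> str:
    """Upload a large scientific data set and return its server-side id."""
    with open(path, "rb") as f:
        resp = requests.post(f"{BASE_URL}/datasets", files={"file": f})
    resp.raise_for_status()
    return resp.json()["dataset_id"]  # hypothetical response field

def request_compressed_view(dataset_id: str, level_of_detail: int) -> bytes:
    """Ask the server for a compressed, reduced-resolution 3D view that can be
    streamed to a lightweight client over a standard network."""
    resp = requests.get(
        f"{BASE_URL}/datasets/{dataset_id}/view",
        params={"lod": level_of_detail, "format": "glb"},
    )
    resp.raise_for_status()
    return resp.content

if __name__ == "__main__":
    ds_id = upload_dataset("simulation_results.vtk")
    preview = request_compressed_view(ds_id, level_of_detail=2)
    with open("preview.glb", "wb") as out:
        out.write(preview)
```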

    Leveraging data-driven infrastructure management to facilitate AIOps for big data applications and operations

    As institutions increasingly shift to distributed and containerized application deployments on remote heterogeneous cloud/cluster infrastructures, the cost and difficulty of efficiently managing and maintaining data-intensive applications have risen. An emerging solution to this issue is Data-Driven Infrastructure Management (DDIM), where resource-management decisions are made based on data aspects and operations (at both the infrastructure and application levels). This chapter will introduce readers to the core concepts underpinning DDIM, based on experience gained from development of the Kubernetes-based BigDataStack DDIM platform (https://bigdatastack.eu/). The chapter covers multiple important BDV topics, including development, deployment, and operations for cluster/cloud-based big data applications, as well as data-driven analytics and artificial intelligence for smart automated infrastructure self-management. Readers will gain important insights into how next-generation DDIM platforms function, as well as how they can be used in practical deployments to improve quality of service for big data applications. This chapter relates to the Data Processing Architectures technical priority of the European Big Data Value Strategic Research & Innovation Agenda [33], as well as to the Data Processing Architectures horizontal concern and the Engineering and DevOps for Building Big Data Value vertical concern. The chapter also relates to the Reasoning and Decision Making cross-sectorial technology enablers of the AI, Data and Robotics Strategic Research, Innovation & Deployment Agenda [34].
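    As a rough illustration of the data-driven idea, and not of the BigDataStack implementation itself, the sketch below scales a containerized data-processing deployment according to an application-level backlog metric. The deployment name, namespace, metric source, and thresholds are assumptions made purely for the example; only the Kubernetes Python client calls are standard.

```python
# Illustrative data-driven scaling loop: adjust the replica count of a
# data-processing deployment based on an application-level backlog metric.
# This is NOT the BigDataStack DDIM implementation; names, the metric stub,
# and thresholds are hypothetical.
import time
from kubernetes import client, config

DEPLOYMENT = "stream-processor"   # hypothetical deployment name
NAMESPACE = "data-apps"           # hypothetical namespace
RECORDS_PER_REPLICA = 10_000      # assumed per-replica processing capacity

def read_backlog() -> int:
    """Placeholder for querying a monitoring system for an application-level
    metric (e.g. unprocessed records in a queue)."""
    return 42_000  # stubbed value for the sketch

def desired_replicas(backlog: int, lo: int = 1, hi: int = 20) -> int:
    """Ceiling division of backlog by per-replica capacity, clamped to [lo, hi]."""
    return max(lo, min(hi, -(-backlog // RECORDS_PER_REPLICA)))

def reconcile() -> None:
    config.load_kube_config()      # or load_incluster_config() inside a pod
    apps = client.AppsV1Api()
    target = desired_replicas(read_backlog())
    apps.patch_namespaced_deployment_scale(
        name=DEPLOYMENT,
        namespace=NAMESPACE,
        body={"spec": {"replicas": target}},
    )

if __name__ == "__main__":
    while True:
        reconcile()
        time.sleep(60)             # re-evaluate the data-driven decision every minute
```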

    Big data, modeling, simulation, computational platform and holistic approaches for the fourth industrial revolution

    Traditionally, the mathematical process starts from proving the existence and uniqueness of a solution using theorems, corollaries, lemmas and propositions, dealing with simple, non-complex models. Such existence and uniqueness proofs constrain the infinite space of possible solutions, but the approach is limited to small-scale simulations on a single desktop CPU, where accuracy, consistency and stability are easily controlled at small data scales. The fourth industrial revolution (4iR), however, recasts this mathematical process as the advent of cyber-physical systems involving entirely new capabilities for researchers and machines (Xing, 2017). From a numerical perspective, 4iR requires a transition from simple models and small-scale simulations to complex models and big data in order to represent real-world applications digitally, which is both a dialectical challenge and an exciting opportunity. Big data analytics and its classification therefore offer a way to overcome these limitations. Applications of 4iR highlight extensions in terms of models, derivatives and discretization, dimensions of space and time, behavior of initial and boundary conditions, grid generation, data extraction, numerical methods, and high-resolution image processing. In statistics, big data is characterized by data growth; from a numerical perspective, however, a few classification strategies are investigated, each tied to a specific classifier tool. This paper investigates a conceptual framework for big data classification: governing the mathematical modeling, selecting a suitable numerical method, handling large sparse simulations, and investigating parallel computing on a high performance computing (HPC) platform. The framework helps big data providers, algorithm providers and system analysts classify problems and recommend specific strategies for generating, handling and analyzing big data. All of these perspectives take a holistic view of technology: in the current research, the conceptual framework is described in holistic terms, and 4iR takes a holistic approach to explaining the importance of big data, complex modeling, large sparse simulation and HPC platforms. Numerical analysis and parallel performance evaluation are the indicators used to assess each classification strategy. This research supports accurate decisions, predictions and trending practice on how to obtain approximate solutions for science and engineering applications. In conclusion, classification strategies are proposed for generating fine granular meshes and for identifying the root causes of failures and issues in real-time solutions. Furthermore, big data-driven methods and the evolution of data transfer accelerate technology transfer, boosting economic and social development in the 4iR (Xing, 2017; Marwala et al., 2017).
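    As a small, generic illustration of the "large sparse simulation" step mentioned above, and not of the paper's own framework, the sketch below assembles a sparse linear system from a standard five-point finite-difference discretization of the 2D Poisson equation and solves it with an iterative Krylov method; the grid size, source term, and iteration limit are arbitrary choices for the example, and NumPy/SciPy are assumed to be available.

```python
# Generic sketch: assemble a large sparse system from a 5-point discretization
# of the 2D Poisson equation and solve it with conjugate gradient (CG).
# Grid size, right-hand side, and iteration limit are arbitrary.
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import cg

def poisson_2d(n: int):
    """Sparse matrix of -Laplace(u) on an n x n interior grid with h = 1/(n+1)."""
    main = 2.0 * np.ones(n)
    off = -1.0 * np.ones(n - 1)
    t = diags([off, main, off], [-1, 0, 1])   # 1D second-difference matrix
    i = identity(n)
    return (kron(i, t) + kron(t, i)) * (n + 1) ** 2

n = 256                           # 65,536 unknowns, yet the matrix stays very sparse
A = poisson_2d(n).tocsr()
b = np.ones(n * n)                # constant source term

x, info = cg(A, b, maxiter=5000)  # iterative solver; no dense factorization needed
print("converged" if info == 0 else f"cg stopped with info={info}")
```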

    Software for Wearable Devices: Challenges and Opportunities

    Wearable devices are a new form of mobile computer system that provides exclusive and user-personalized services. Wearable devices bring new issues and challenges to computer science and technology. This paper summarizes the development process and the categories of wearable devices. In addition, we present new key issues arising in aspects of wearable devices, including operating systems, database management systems, network communication protocols, application development platforms, privacy and security, energy consumption, human-computer interaction, software engineering, and big data.
    Comment: 6 pages, 1 figure, for Compsac 201

    Lessons Learned from a Decade of Providing Interactive, On-Demand High Performance Computing to Scientists and Engineers

    For decades, the use of HPC systems was limited to those in the physical sciences who had mastered their domain in conjunction with a deep understanding of HPC architectures and algorithms. During these same decades, consumer computing device advances produced tablets and smartphones that allow millions of children to interactively develop and share code projects across the globe. As the HPC community faces the challenges associated with guiding researchers from disciplines using high productivity interactive tools to effective use of HPC systems, it seems appropriate to revisit the assumptions surrounding the necessary skills required for access to large computational systems. For over a decade, MIT Lincoln Laboratory has been supporting interactive, on-demand high performance computing by seamlessly integrating familiar high productivity tools to provide users with an increased number of design turns, rapid prototyping capability, and faster time to insight. In this paper, we discuss the lessons learned while supporting interactive, on-demand high performance computing from the perspectives of the users and the team supporting the users and the system. Building on these lessons, we present an overview of current needs and the technical solutions we are building to lower the barrier to entry for new users from the humanities, social, and biological sciences.
    Comment: 15 pages, 3 figures, First Workshop on Interactive High Performance Computing (WIHPC) 2018 held in conjunction with ISC High Performance 2018 in Frankfurt, German

    Model-driven Engineering IDE for Quality Assessment of Data-intensive Applications

    This article introduces a model-driven engineering (MDE) integrated development environment (IDE) for Data-Intensive Cloud Applications (DIA) with iterative quality enhancements. As part of the H2020 DICE project (ICT-9-2014, id 644869), a framework is being constructed, composed of a set of tools developed to support a new MDE methodology. One of these tools is the IDE, which acts as the front-end of the methodology and plays a pivotal role in integrating the other tools of the framework. The IDE enables designers to model everything from the architectural structure of the general application, together with its properties and QoS/QoD annotations, down to the deployment model. In addition to the designer, administrators, quality assurance engineers and software architects may also run and examine the output of the design and analysis tools in order to assess DIA quality in an iterative process.