
    Technologies and Applications for Big Data Value

This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions to major industrial areas. The book starts with an introductory chapter that positions the following chapters in terms of their contributions to the technology frameworks that are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, “Technologies and Methods”, contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, “Processes and Applications”, details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the European data community's nucleus to bring together businesses with leading researchers to harness the value of data to benefit society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, and machine learning and AI; and second, practitioners and industry experts engaged in data-driven systems, software design, and deployment projects who are interested in employing these advanced methods to address real-world problems.

    AI-assisted Automated Workflow for Real-time X-ray Ptychography Data Analysis via Federated Resources

We present an end-to-end automated workflow that uses large-scale remote compute resources and an embedded GPU platform at the edge to enable AI/ML-accelerated real-time analysis of data collected for x-ray ptychography. Ptychography is a lensless method used to image samples through a simultaneous numerical inversion of a large number of diffraction patterns from adjacent overlapping scan positions. This acquisition method can enable nanoscale imaging with x-rays and electrons, but it often requires very large experimental datasets and commensurately high turnaround times, which can limit experimental capabilities such as real-time experimental steering and low-latency monitoring. In this work, we introduce a software system that can automate ptychography data analysis tasks. We accelerate the data analysis pipeline by using a modified version of PtychoNN, an ML-based approach to the phase retrieval problem that is two orders of magnitude faster than traditional iterative methods. Further, our system coordinates and overlaps different data analysis tasks to minimize synchronization overhead between different stages of the workflow. We evaluate our workflow system with real-world experimental workloads from the 26ID beamline at the Advanced Photon Source and the ThetaGPU cluster at the Argonne Leadership Computing Facility. Comment: 7 pages, 1 figure; to be published in the High Performance Computing for Imaging Conference, Electronic Imaging (HPCI 2023).
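
    The central systems idea here is pipelining: acquisition, inference, and monitoring run concurrently and hand data off through queues instead of executing serially. The sketch below illustrates that pattern in Python under stated assumptions; it is not the authors' code, and fake_phase_retrieval is a toy stand-in for the PtychoNN GPU inference the paper actually uses.

```python
# Minimal sketch of an overlapped (pipelined) analysis workflow, assuming
# queue-based hand-offs between stages. fake_phase_retrieval is a toy
# stand-in for PtychoNN inference; the real system runs on an edge GPU
# and federates work to remote compute resources.
import queue
import threading

import numpy as np

raw_q = queue.Queue(maxsize=64)     # diffraction patterns from the detector
result_q = queue.Queue(maxsize=64)  # reconstructed real-space patches
STOP = object()                     # sentinel to shut the pipeline down

def fake_phase_retrieval(pattern):
    # Toy inversion standing in for the ML model: one pattern in, one
    # real-space patch out, with no coupling across scan positions.
    return np.fft.ifft2(pattern).real

def producer(n_scans):
    # Simulates streaming acquisition at adjacent overlapping positions.
    for i in range(n_scans):
        raw_q.put((i, np.random.rand(128, 128)))
    raw_q.put(STOP)

def inference_worker():
    # Consumes patterns as they arrive, overlapping with acquisition.
    while True:
        item = raw_q.get()
        if item is STOP:
            result_q.put(STOP)
            break
        idx, pattern = item
        result_q.put((idx, fake_phase_retrieval(pattern)))

def monitor():
    # Low-latency monitoring: results appear per scan position instead
    # of only after the full dataset has been collected.
    while True:
        item = result_q.get()
        if item is STOP:
            break
        idx, patch = item
        print(f"scan {idx}: patch mean = {patch.mean():.4f}")

threads = [threading.Thread(target=t) for t in
           (lambda: producer(16), inference_worker, monitor)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

    Because each stage blocks only on its own queue, a slow stage backs up gracefully while the others keep working, which is what minimizes the synchronization overhead the abstract refers to.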

    Task Scheduling Based on Grey Wolf Optimizer Algorithm for Smart Meter Embedded Operating System

In recent years, with the rapid development of electric power informatization, smart meters have been gradually evolving into intelligent IoT devices. Smart meters not only measure user status but also interconnect and communicate with cell phones, smart homes, and other cloud-connected devices, and these core functions are handled by the smart meter's embedded operating system. Due to the dynamic heterogeneity between the user program side and the system processing side of the embedded system, resource allocation and task scheduling are challenging problems for smart meter embedded operating systems. Smart meters need to achieve fast response and the shortest completion time for user-side requests, while also balancing the load across processing nodes to ensure the reliability of the embedded system. In this paper, based on the Grey Wolf Optimizer, we study the scheduling of service program nodes in the smart meter operating system and analyze why traditional scheduling algorithms struggle to find the optimal solution. Compared with traditional algorithms and classical swarm intelligence algorithms, the proposed algorithm avoids becoming trapped in local optima, allocates operating system tasks quickly, effectively shortens task scheduling time, ensures the real-time performance of multi-task scheduling, and achieves load balance across the system. Finally, the effectiveness of the algorithm is verified by simulation experiments.
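
    To make the optimizer concrete, here is a minimal Grey Wolf Optimizer sketch applied to a toy task-to-node assignment problem, with makespan as the fitness. The task counts, costs, and parameters are hypothetical, and rounding continuous wolf positions to node indices is one common adaptation of GWO to scheduling, not necessarily the encoding used in the paper.

```python
# Grey Wolf Optimizer sketch for task-to-node assignment. Each wolf is a
# candidate schedule; fitness is the makespan (completion time of the
# busiest node). All sizes and costs below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_tasks, n_nodes = 30, 4
task_cost = rng.uniform(1.0, 10.0, n_tasks)   # processing time per task

def makespan(position):
    # Decode a continuous position into discrete node assignments and
    # return the load of the most heavily loaded node.
    assign = np.clip(position.round().astype(int), 0, n_nodes - 1)
    loads = np.zeros(n_nodes)
    np.add.at(loads, assign, task_cost)
    return loads.max()

n_wolves, n_iter = 20, 100
wolves = rng.uniform(0, n_nodes - 1, (n_wolves, n_tasks))

for t in range(n_iter):
    a = 2 - 2 * t / n_iter                    # exploration factor: 2 -> 0
    fitness = np.array([makespan(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(fitness)[:3]]  # three leaders
    for i in range(n_wolves):
        new = np.zeros(n_tasks)
        for leader in (alpha, beta, delta):
            # Standard GWO position update, pulled toward each leader.
            r1, r2 = rng.random(n_tasks), rng.random(n_tasks)
            A, C = 2 * a * r1 - a, 2 * r2
            new += leader - A * np.abs(C * leader - wolves[i])
        wolves[i] = np.clip(new / 3, 0, n_nodes - 1)

best = min(wolves, key=makespan)
print("best makespan:", makespan(best))
```

    The decaying factor a is what trades exploration for exploitation over the run, which is the mechanism GWO-style methods rely on to escape local optima.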

    FGQT Q04 - Standardization Roadmap on Quantum Technologies [written by the CEN-CENELEC Focus Group on Quantum Technologies (FGQT)]

In 2018, the European Commission launched its long-term and large-scale Quantum Technology FET Flagship Program. The European Commission is also very interested in boosting standards for quantum technologies (QT). The Quantum Flagship has its own cooperation and coordination activities to “coordinate national strategies and activities”, and in its “Quantum Manifesto” [1] it explicitly advises forming “advisory boards” to promote collaboration in standardization. The CEN/CENELEC Focus Group on Quantum Technologies (FGQT) was formed in June 2020 with the goal of supporting the plans of the Commission. Currently, a multitude of standardization activities in QT are ongoing worldwide. While there is overlap in certain areas, other areas of this wide technological field are not being addressed at all. A coordinated approach will be highly beneficial to unleash the full potential of standardization for speeding up progress—also because the pool of standardization experts available for quantum technologies is still very limited. Furthermore, not all areas are yet “ready for standardization”: while in some fields early standardization is capable of boosting progress, in other areas it may be a problem. Thus, an assessment of the standardization readiness of the different areas is also required. The FGQT was established to identify standardization needs and opportunities for the entire field of QT, with the final goal of boosting the establishment of new industries in Europe and, consequently, the development and engineering of unprecedented novel devices and infrastructures for the benefit of European citizens. The QT standardization roadmap follows a constructive approach, starting with basic enabling technologies, from which QT components and subsystems are constructed, which in turn are assembled into QT systems that form composite systems, constituting the building blocks for use cases. Thus, the roadmap is structured to closely match the categories of the EC quantum technology FET Flagship Program: quantum communication; quantum computing and simulation; and quantum metrology, sensing, and enhanced imaging, while the basic enabling technologies and sub-systems are organized in two pools, thus supporting re-use across the system categories. The separate types of QT unit systems then form the foundations of general QT infrastructures or composite systems. On the level of use cases, the QT standardization roadmap describes basic domains of applicability, so-called “meta use cases”, while the detailed use cases are listed in a separate FGQT document: “FGQT Q05 Use Cases”. Finally, the QT standardization roadmap presents an outlook and conclusions, including a prioritization of the individual identified standardization needs in the form of sequence diagrams (Gantt charts). This approach differs slightly from the QT “Pillar design” of the EU Quantum Flagship but, in our opinion, extends it and is better adapted to standardization purposes, while the former is optimally suited as a research program design. The FGQT is an open group of European-based experts working in QT research areas or enabling technologies, and of developers of components, products, or services related to QT.
If you are based in Europe and are interested in guidelines and standards to help set up a research infrastructure, or to structure and boost your market relevance; if you want to improve coordination with your stakeholders and are interested in coordination and exchange with other experts in the field of QT—please consider joining the CEN/CENELEC FGQT. NOTE 1: European QT standards development in CEN/CENELEC will take place in the new JTC 22 QT (Joint Technical Committee 22 on Quantum Technologies). The work in JTC 22 QT will be guided by the present roadmap document, and it is expected that the FGQT roadmap-development activity will be absorbed and continued by JTC 22 QT.

    Analytical Challenges and Metrological Approaches to Ensuring Dietary Supplement Quality: International Perspectives

The increased utilization of metrology resources and the expanded application of its approaches in the development of internationally agreed-upon measurements can lay the basis for regulatory harmonization, support reproducible research, and advance scientific understanding, especially of dietary supplements and herbal medicines. Yet metrology is often underappreciated and underutilized in dealing with the many challenges presented by these chemically complex preparations. This article discusses the utility of applying rigorous analytical techniques and adopting metrological principles more widely in studying dietary supplement products and ingredients, particularly medicinal plants and other botanicals. An assessment of current and emerging dietary supplement characterization methods is provided, including targeted and non-targeted techniques, as well as data analysis and evaluation approaches, with a focus on chemometrics, toxicity, dosage form performance, and data management. Quality assessment, statistical methods, and optimized methods for data management are also discussed. Case studies provide examples of applying metrological principles in the thorough analytical characterization of supplement composition to clarify their health effects. A new frontier for metrology in dietary supplement science is described, including opportunities to improve methods for analysis and data management, to develop relevant standards and good practices, and to communicate these developments to researchers and analysts, as well as to regulatory and policy decision makers in the public and private sectors. The promotion of closer interactions between analytical, clinical, and pharmaceutical scientists involved in research and product development and metrologists who develop standards and methodological guidelines is critical to advancing research on dietary supplement characterization and health effects.

    An Industrial Data Analysis and Supervision Framework for Predictive Manufacturing Systems

Due to advancements in the field of Information and Communication Technologies in the modern interconnected world, the manufacturing industry is becoming an increasingly data-rich environment, with large volumes of data generated on a daily basis, presenting a new set of opportunities to be explored towards improving the efficiency and quality of production processes. This can be done through the development of so-called Predictive Manufacturing Systems. These systems aim to improve manufacturing processes through a combination of concepts such as Cyber-Physical Production Systems, Machine Learning, and real-time Data Analytics in order to predict future states and events in production. This can be used in a wide array of applications, including predictive maintenance policies, improving quality control through the early detection of faults and defects, or optimizing energy consumption, to name a few. The research efforts presented in this document therefore focus on the design and development of a generic framework to guide the implementation of predictive manufacturing systems through a set of common requirements and components. This approach aims to enable manufacturers to extract, analyse, interpret, and transform their data into actionable knowledge that can be leveraged into a business advantage. To this end, a list of goals and functional and non-functional requirements is defined for these systems based on a thorough literature review and empirical knowledge. Subsequently, the Intelligent Data Analysis and Real-Time Supervision (IDARTS) framework is proposed, along with a detailed description of each of its main components. Finally, a pilot implementation is presented for each of these components, followed by the demonstration of the proposed framework in three different scenarios covering several use cases in varied real-world industrial areas. In this way, the proposed work aims to provide a common foundation for the full realization of Predictive Manufacturing Systems.
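
    As a concrete illustration of the kind of predictive component such a framework would host, the sketch below trains an anomaly detector on historical "healthy" sensor data and scores live readings, a common predictive-maintenance pattern. The features, data, threshold, and model choice (scikit-learn's IsolationForest) are illustrative assumptions, not part of IDARTS itself.

```python
# Toy predictive-maintenance component: fit a model on historical data
# from normal operation, then supervise incoming readings in real time.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Historical "healthy machine" data: [vibration, temperature] features.
healthy = rng.normal(loc=[0.5, 60.0], scale=[0.05, 1.5], size=(500, 2))
model = IsolationForest(random_state=1).fit(healthy)

def supervise(reading):
    # Real-time supervision step: score one incoming sensor reading;
    # negative scores indicate a likely anomaly (possible early fault).
    score = model.decision_function([reading])[0]
    if score < 0:
        print(f"ALERT: reading {reading} looks anomalous (score={score:.3f})")
    else:
        print(f"ok: reading {reading} (score={score:.3f})")

supervise([0.51, 60.2])   # normal operation
supervise([0.92, 71.0])   # drifting vibration/temperature -> early warning
```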

    Business analytics in industry 4.0: a systematic review

Recently, the term “Industry 4.0” has emerged to characterize several Information and Communication Technology (ICT) adoptions in production processes (e.g., the Internet-of-Things, the implementation of digital production support information technologies). Business Analytics is often used within Industry 4.0, incorporating its data intelligence (e.g., statistical analysis, predictive modelling, optimization) expert system component. In this paper, we perform a Systematic Literature Review (SLR) of the usage of Business Analytics within the Industry 4.0 concept, covering a selection of 169 papers obtained from six major scientific publication sources from 2010 to March 2020. The selected papers were first classified into three major types: Practical Application, Reviews, and Framework Proposal. Then, we analysed in more detail the practical application studies, which were further divided into the three main categories of the Gartner analytical maturity model: Descriptive Analytics, Predictive Analytics, and Prescriptive Analytics. In particular, we characterized the distinct analytics studies in terms of the industry application and data context used, impact (in terms of their Technology Readiness Level), and selected data modelling method. Our SLR analysis provides a mapping of how data-based Industry 4.0 expert systems are currently used, also disclosing research gaps and future research opportunities. The work of P. Cortez was supported by FCT - Fundação para a Ciência e Tecnologia within the R&D Units Project Scope UIDB/00319/2020. We would like to thank the three anonymous reviewers for their helpful suggestions.
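
    The three Gartner maturity levels used to classify the surveyed studies can be illustrated with a small synthetic example: descriptive analytics summarizes what happened, predictive analytics fits a model to forecast what will happen, and prescriptive analytics searches that model for the best action. The data and model below are toy stand-ins, not drawn from the reviewed papers.

```python
# Toy illustration of the Gartner analytical maturity levels on a
# synthetic production dataset (machine speed vs. defect rate).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
speed = rng.uniform(50, 100, 200)                    # machine speed setting
defects = 0.002 * (speed - 60) ** 2 + rng.normal(0, 0.5, 200)

# Descriptive analytics: summarize what happened.
print("mean defect rate:", defects.mean().round(3))

# Predictive analytics: model what will happen at a new setting.
X = np.column_stack([speed, speed ** 2])
model = LinearRegression().fit(X, defects)
print("predicted defects at speed 90:",
      model.predict([[90, 90 ** 2]])[0].round(3))

# Prescriptive analytics: search the settings for the best predicted outcome.
grid = np.linspace(50, 100, 51)
preds = model.predict(np.column_stack([grid, grid ** 2]))
print("recommended speed:", grid[np.argmin(preds)])
```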