
    D3M: automated data-driven decision making

    Data undoubtedly has an impact on society. Storing, processing, and analyzing large amounts of available data is currently one of the key success factors for an organization. Nonetheless, we are now witnessing a change, represented by huge and heterogeneous amounts of data. Thus, in order to carry out these data exploitation tasks, organizations must first perform data integration, combining data from multiple sources to yield a unified view over them. In this paper, we report on the Automated Data-Driven Decision Making (D3M) project, whose main objective is to provide a mature software solution for automatic data integration with advanced decision-making capabilities. This paper has been funded by the Spanish Agencia Estatal de Investigación (AEI) under project / funding scheme PDC2021-121195-I00.
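
    As a hedged illustration of the integration step this abstract refers to, the following minimal Python sketch maps records from two heterogeneous sources onto one unified view. The source names, fields, and mappings are invented for illustration; D3M's stated goal is to automate the discovery and application of such mappings.

        # Two sources describing the same entities under different schemas.
        crm_rows = [{"cust_id": 1, "full_name": "Ada Lovelace", "city": "London"}]
        erp_rows = [{"id": "C-1", "name": "Ada Lovelace", "location": "London, UK"}]

        # Hand-written schema mappings (source field -> unified field); these
        # are hypothetical, not taken from the project.
        MAPPINGS = {
            "crm": {"cust_id": "customer_id", "full_name": "name", "city": "city"},
            "erp": {"id": "customer_id", "name": "name", "location": "city"},
        }

        def to_unified(source: str, row: dict) -> dict:
            """Rename a row's fields according to its source's schema mapping."""
            mapping = MAPPINGS[source]
            return {mapping[k]: v for k, v in row.items() if k in mapping}

        # The unified view: one record shape regardless of origin.
        unified = [to_unified("crm", r) for r in crm_rows] + \
                  [to_unified("erp", r) for r in erp_rows]
        print(unified)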

    EU H2020 MSCA RISE Project FIRST - “virtual Factories: Interoperation suppoRting buSiness innovation”

    FIRST (“virtual Factories: Interoperation suppoRting buSiness innovation”) is a European H2020 project funded by the Research and Innovation Staff Exchange (RISE) Work Programme as part of the Marie Skłodowska-Curie actions. The project concerns Manufacturing 2.0 and aims to provide new technology and methodology to describe manufacturing assets; to compose and integrate existing services into collaborative virtual manufacturing processes; and to deal with the evolution of changes. This chapter provides an overview of the state of the art for the research topics related to the project's research objectives, and then presents the progress the project has achieved so far towards the implementation of the proposed innovations.
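
    To make the notion of composing described services into a collaborative virtual manufacturing process concrete, here is a small Python sketch. The asset names, capabilities, and work-item fields are hypothetical assumptions, not FIRST's actual models.

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Service:
            name: str                    # the manufacturing asset offering it
            capability: str              # what it provides, e.g. "milling"
            run: Callable[[dict], dict]  # transformation applied to a work item

        def compose(catalog: list[Service], order: list[str]) -> Callable[[dict], dict]:
            """Chain catalog services, selected by capability, into one process."""
            by_capability = {s.capability: s for s in catalog}
            steps = [by_capability[c] for c in order]
            def process(item: dict) -> dict:
                for step in steps:
                    item = step.run(item)
                return item
            return process

        # Hypothetical asset catalog and a two-step virtual process.
        catalog = [
            Service("MillBot-3", "milling", lambda i: {**i, "milled": True}),
            Service("PaintCell-A", "painting", lambda i: {**i, "painted": True}),
        ]
        virtual_process = compose(catalog, ["milling", "painting"])
        print(virtual_process({"part": "bracket"}))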

    Data Spaces

    This open access book aims to educate data space designers to understand what is required to create a successful data space. It explores cutting-edge theory, technologies, methodologies, and best practices for data spaces for both industrial and personal data, and provides the reader with a basis for understanding the design, deployment, and future directions of data spaces. The book captures the early lessons and experience in creating data spaces. It arranges these contributions into three parts covering design, deployment, and future directions, respectively. The first part explores the design space of data spaces. Its chapters detail organisational design for data spaces, data platforms, data governance, federated learning, personal data sharing, data marketplaces, and hybrid artificial intelligence for data spaces. The second part describes the use of data spaces within real-world deployments. Its chapters are co-authored with industry experts and include case studies of data spaces in sectors including Industry 4.0, food safety, FinTech, health care, and energy. The third and final part details future directions for data spaces, including challenges and opportunities for common European data spaces and privacy-preserving techniques for trustworthy data sharing. The book is of interest to two primary audiences: first, researchers interested in data management and data sharing, and second, practitioners and industry experts engaged in data-driven systems where the sharing and exchange of data within an ecosystem are critical.

    Planning for the Lifecycle Management and Long-Term Preservation of Research Data: A Federated Approach

    Outcomes of the grant are archived here. The “data deluge” is a recent but increasingly well-understood phenomenon of scientific and social inquiry. Large-scale research instruments extend our observational power by many orders of magnitude but at the same time generate massive amounts of data. Researchers work feverishly to document and preserve changing or disappearing habitats, cultures, languages, and artifacts, resulting in volumes of media in various formats. New software tools mine a growing universe of historical and modern texts and connect the dots in our semantic environment. Libraries, archives, and museums undertake digitization programs, creating broad access to unique cultural heritage resources for research. Global-scale research collaborations with hundreds or thousands of participants drive the creation of massive amounts of data, most of which cannot be recreated if lost. The University of Kansas (KU) Libraries, in collaboration with two partners, the Greater Western Library Alliance (GWLA) and the Great Plains Network (GPN), received an IMLS National Leadership Grant designed to leverage collective strengths and create a proposal for a scalable and federated approach to the lifecycle management of research data based on the needs of GPN and GWLA member institutions. Institute for Museum and Library Services LG-51-12-0695-1.

    Strategies of development and maintenance in supervision, control, synchronization, data acquisition and processing in light sources

    Programa Oficial de Doutoramento en Tecnoloxías da Información e as Comunicacións. 5032V01. Particle accelerators and photon sources are constantly evolving, attaining cutting-edge technologies to push the limits forward and explore new domains. The control systems are a crucial part of these installations and are required to provide flexible solutions for new and challenging experiments, with different kinds of detectors, setups, sample environments, and procedures. Experiment proposals are more ambitious at each call and often go a step beyond the capabilities of the instrumentation. Detectors must be faster, with higher efficiency, more resolution, and more bandwidth, and able to synchronize with other detectors of all kinds (scalar, one-, or two-dimensional), taking into account their singularities and homogenizing the data acquisition. This work examines the control and data acquisition systems of particle accelerators and X-ray / light sources and explores new requirements and challenges regarding synchronization and data acquisition bandwidth, as well as optimization and cost-efficiency in design, operation, and support. It also studies different solutions adapted to each environment.
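
    As one concrete example of the synchronization problem described above, the Python sketch below aligns a detector that samples faster than the experiment's trigger rate onto a shared trigger timeline using a sample-and-hold policy. The rates, timestamps, and policy are illustrative assumptions, not the mechanism used in the thesis.

        import bisect

        def align(trigger_ts, detector_ts, readings):
            """For each trigger time, keep the latest reading at or before it."""
            aligned = []
            for t in trigger_ts:
                i = bisect.bisect_right(detector_ts, t) - 1
                aligned.append(readings[i] if i >= 0 else None)
            return aligned

        triggers = [0.0, 0.1, 0.2, 0.3]                      # shared 10 Hz timeline
        scalar_ts = [round(0.02 * k, 2) for k in range(20)]  # a 50 Hz scalar counter
        scalar_vals = list(range(20))
        print(align(triggers, scalar_ts, scalar_vals))       # one reading per trigger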

    Modeling Industry 4.0 based fog computing environments for application analysis and deployment

    The extension of the Cloud to the Edge of the network through Fog Computing can have a significant impact on the reliability and latencies of deployed applications. Recent papers have suggested a shift from VM- and container-based deployments to a shared environment among applications to better utilize resources. Unfortunately, the existing deployment and optimization methods pay little attention to developing and identifying complete models of such systems, which may cause large inaccuracies between simulated and physical run-time parameters. Existing models do not account for application interdependence or the locality of application resources, which causes extra communication and processing delays. This paper addresses these issues by carrying out experiments in both cloud and edge systems at various scales and with various applications. It analyses the outcomes to derive a new reference model with data-driven parameter formulations and representations that help to understand the effect of migration on these systems. As a result, we obtain a more complete characterization of the fog environment. This, together with tailored optimization methods that can handle the heterogeneity and scale of the fog, can improve the overall system run-time parameters and improve constraint satisfaction. An Industry 4.0 based case study with different scenarios was used to analyze and validate the effectiveness of the proposed model. Tests were deployed on physical and virtual environments at different scales. The advantages of the model-based optimization methods were validated in real physical environments. Based on these tests, we have found that our model is 90% accurate on load and delay predictions for application deployments in both cloud and edge environments.
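
    To illustrate what a data-driven parameter formulation for delay could look like, the sketch below fits a linear delay model from invented deployment measurements, with a locality term so that application interdependence shows up in the predictions. The feature set, measurements, and resulting coefficients are assumptions for illustration, not the paper's actual model.

        import numpy as np

        # One row per measured deployment:
        # [bias, node load, remote (non-local) calls per request]
        X = np.array([
            [1.0, 0.2, 0],
            [1.0, 0.5, 0],
            [1.0, 0.2, 3],
            [1.0, 0.8, 5],
            [1.0, 0.6, 2],
        ])
        y = np.array([12.0, 18.0, 27.0, 49.0, 31.0])  # measured delays (ms)

        # Least-squares fit of: delay ~ b0 + b1*load + b2*remote_calls
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        def predict_delay(load: float, remote_calls: int) -> float:
            """Predicted delay (ms) for a candidate placement."""
            return float(coef @ np.array([1.0, load, remote_calls]))

        # Migrating an interdependent app onto the same node trades a higher
        # node load for the removal of remote calls:
        print(predict_delay(0.6, 4), "->", predict_delay(0.7, 0))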