
    J-PET Framework: Software platform for PET tomography data reconstruction and analysis

    J-PET Framework is an open-source software platform for data analysis, written in C++ and based on the ROOT package. It provides a common environment for implementing reconstruction, calibration and filtering procedures, as well as for user-level analyses of Positron Emission Tomography data. The library contains a set of building blocks that can be combined, even by users with little programming experience, into chains of processing tasks through a convenient, simple and well-documented API. The generic input-output interface allows processing data from various sources: low-level data from the tomography acquisition system or from diagnostic setups such as digital oscilloscopes, as well as high-level tomography structures, e.g. sinograms or lists of lines-of-response. Moreover, the environment can be interfaced with Monte Carlo simulation packages such as GEANT and GATE, which are commonly used in the medical scientific community. Comment: 14 pages, 5 figures
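    The building-block/chain idea described in the abstract can be sketched as follows. This is a hypothetical illustration in Python, not the actual J-PET C++ API; the names Task and Chain, and the event fields, are invented for the example.

    ```python
    class Task:
        """One processing step: consumes an event and returns a transformed
        event, or None if the event is filtered out."""
        def __init__(self, fn):
            self.fn = fn

        def run(self, event):
            return self.fn(event)

    class Chain:
        """Combines tasks sequentially, the way users are said to combine
        building blocks into chains of processing tasks."""
        def __init__(self, tasks):
            self.tasks = tasks

        def process(self, events):
            out = []
            for event in events:
                for task in self.tasks:
                    event = task.run(event)
                    if event is None:   # rejected by a filter task
                        break
                else:
                    out.append(event)
            return out

    # Example: calibrate raw timestamps, then keep only events inside a window.
    calibrate = Task(lambda e: {**e, "t": e["t"] - 0.5})
    filter_window = Task(lambda e: e if e["t"] < 10.0 else None)

    chain = Chain([calibrate, filter_window])
    print(chain.process([{"t": 3.0}, {"t": 12.0}]))  # only the first event survives
    ```

    The point of the design is that a user writes only small, single-purpose tasks; the chain handles sequencing and filtering uniformly.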

    A Framework for Design-Time Testing of Service-Based Applications at BPEL Level

    Software applications created on top of the service-oriented architecture (SOA) are increasingly popular, but testing them remains a challenge. In this paper, a framework named TASSA for testing the functional and non-functional behaviour of service-based applications is presented. The paper focuses on the concept of design-time testing, the corresponding testing approach, and the architectural integration of the constituent TASSA tools. The individual TASSA tools, together with sample validation scenarios, have already been presented, along with a general view of their relation. This paper's contribution is the structured testing approach, based on the integral use of the tools and their architectural integration. The framework is based on SOA principles and is composable depending on user requirements. The work reported in this paper was supported by a research project funded by the National Scientific Fund, Bulgarian Ministry of Education, Youth and Science, via agreement no. DOO2-182

    Survey on Additive Manufacturing, Cloud 3D Printing and Services

    Cloud Manufacturing (CM) is the concept of using manufacturing resources in a service-oriented way over the Internet. Recent developments in Additive Manufacturing (AM) are making it possible to utilise resources ad hoc as replacements for traditional manufacturing resources in the event of spontaneous problems in established manufacturing processes. To be of use in these scenarios, AM resources must adhere to a strict principle of transparency and service composition, in adherence to the Cloud Computing (CC) paradigm. With this review we provide an overview of CM, AM and related domains, and present the historical development of scientific research in these fields, starting from 2002. Part of this work is also a meta-review of the domain to further detail its development and structure.

    Knowledge-based support in Non-Destructive Testing for health monitoring of aircraft structures

    Maintenance manuals include general methods and procedures for industrial maintenance, and they contain information about the principles of maintenance methods. In particular, Non-Destructive Testing (NDT) methods are important for the detection of aeronautical defects, and they can be used on various kinds of material and in different environments. Conventional non-destructive evaluation inspections are done at periodic maintenance checks. Usually, the list of tools used in a maintenance program is simply located in the introduction of the manuals, without any detail regarding their characteristics, except for a short description of the manufacturer and the tasks in which they are employed. Improving the identification of maintenance tools is needed to manage the set of equipment and establish a system of equivalence: it is necessary to have a consistent maintenance conceptualization, flexible enough to fit all current equipment, but also all equipment likely to be added or used in the future. Our contribution is the formal specification of a system of functional equivalences that can facilitate maintenance activities by providing means to determine whether one tool can be substituted for another by observing their key parameters among the identified characteristics. The reasoning mechanisms of conceptual graphs constitute the baseline elements to measure the fit or misfit between an equipment model and a maintenance activity model. Graph operations are used for processing answers to a query, and this graph-based search method is in line with the logical view of information retrieval. The methodology described supports knowledge formalization and the capitalization of experienced NDT practitioners' know-how. As a result, it enables the selection of an NDT technique and outlines its capabilities along with acceptable alternatives.
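    The substitution test at the heart of the abstract — can one tool replace another for a given maintenance activity, judged by its key parameters — can be sketched as a simple requirement-coverage check. This is an illustrative Python sketch, not the authors' conceptual-graph implementation; the parameter names and data layout are invented for the example.

    ```python
    def can_substitute(tool, activity):
        """True if every characteristic the activity requires is met or
        exceeded by the candidate tool's parameters."""
        for name, required in activity["requires"].items():
            value = tool["params"].get(name)
            if value is None or value < required:
                return False
        return True

    # Hypothetical equipment and activity models.
    ultrasonic_probe = {"params": {"frequency_mhz": 5.0, "max_depth_mm": 50}}
    inspection = {"requires": {"frequency_mhz": 2.5, "max_depth_mm": 30}}

    print(can_substitute(ultrasonic_probe, inspection))  # True: all requirements covered
    ```

    The conceptual-graph approach in the paper generalises this idea: graph projection plays the role of the coverage check, so partial matches ("unfit") can also be quantified rather than just rejected.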

    Darwinian Data Structure Selection

    Data structure selection and tuning is laborious but can vastly improve an application's performance and memory footprint. Some data structures share a common interface and enjoy multiple implementations. We call them Darwinian Data Structures (DDS), since we can subject their implementations to survival of the fittest. We introduce ARTEMIS, a multi-objective, cloud-based, search-based optimisation framework that automatically finds an optimal, tuned DDS modulo a test suite, then changes an application to use that DDS. ARTEMIS achieves substantial performance improvements for every project in 5 Java projects from the DaCapo benchmark, 8 popular projects and 30 uniformly sampled projects from GitHub. For execution time, CPU usage, and memory consumption, ARTEMIS finds at least one solution that improves all measures for 86% (37/43) of the projects. The median improvement across the best solutions is 4.8%, 10.1%, and 5.1% for runtime, memory and CPU usage. These aggregate results understate ARTEMIS's potential impact. Some of the benchmarks it improves are libraries or utility functions. Two examples are gson, a ubiquitous Java serialization framework, and xalan, Apache's XML transformation tool. ARTEMIS improves gson by 16.5%, 1% and 2.2% for memory, runtime, and CPU; ARTEMIS improves xalan's memory consumption by 23.5%. Every client of these projects will benefit from these performance improvements. Comment: 11 pages
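    The "survival of the fittest" idea can be illustrated in miniature: benchmark interchangeable implementations of one interface against a representative workload and keep the best. ARTEMIS itself is multi-objective, cloud-based, and rewrites Java source; this Python sketch optimises a single measure (wall-clock time) locally, and the workload is invented for the example.

    ```python
    import timeit

    def workload(container_cls):
        """A membership-heavy workload run against a candidate container."""
        c = container_cls()
        for i in range(1000):
            # list exposes append(); set exposes add()
            c.append(i) if hasattr(c, "append") else c.add(i)
        return sum(1 for i in range(0, 1000, 7) if i in c)

    # Candidate implementations sharing the workload's interface.
    candidates = [list, set]
    timings = {cls.__name__: timeit.timeit(lambda cls=cls: workload(cls), number=50)
               for cls in candidates}
    fittest = min(timings, key=timings.get)
    print(fittest)
    ```

    For this membership-heavy workload, set should win, since `in` is O(1) for a set but O(n) for a list; a different workload (say, ordered iteration) could crown a different candidate, which is exactly why the selection is made "modulo a test suite" rather than once and for all.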

    WS-GUARD: enhancing UDDI Registries with on-line testing capabilities

    Abstract: This thesis investigates the Service Oriented Architecture and in particular the runtime discovery of Web services, through the development of an empowered UDDI registry called WS-GUARD (Guaranteeing Uddi Audition at Registration and Discovery). We start by presenting the Audition framework, a specially conceived framework that applies the idea of testing during Web service registration in the UDDI registry, and then we study the practical implications of its implementation, focusing on the most advanced Web service technologies. This thesis aims at modifying and extending the registration protocol of Web services into UDDI registries in order to introduce a testing phase before actual service publishing: only those services that pass the audition are admitted into the registry and become publicly available at runtime. A complete prototype implementation of WS-GUARD is described and analysed. Analytical summary (translated from the Italian): The thesis investigated the Service Oriented Architecture setting, and in particular the run-time discovery of Web services, through the realisation of an enhanced UDDI registry named WS-GUARD (Guaranteeing Uddi Audition at Registration and Discovery). The main objective of the work was the modification of the registration protocols of the UDDI registry, introducing a testing phase prior to the traditional registration phase. By admitting for registration only those services that pass the verification phase, the aim is to provide stronger guarantees on the quality of the services that will be dynamically discoverable at execution time. The thesis discusses the proposed modifications and provides a real implementation of them.
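    The audition idea — run a test phase at registration time and publish only services that pass — can be sketched as follows. This is a hypothetical Python illustration, not the WS-GUARD/UDDI protocol; the Registry class, the echo service, and the test suite are invented for the example.

    ```python
    class Registry:
        """A registry that auditions services before publishing them."""
        def __init__(self):
            self.published = {}

        def register(self, name, service, audition_tests):
            """Admit the service only if every audition test passes."""
            if all(test(service) for test in audition_tests):
                self.published[name] = service
                return True
            return False   # failed the audition: never becomes discoverable

    # Example service and its audition suite.
    def echo_service(msg):
        return msg

    tests = [lambda s: s("ping") == "ping", lambda s: s("") == ""]

    registry = Registry()
    print(registry.register("echo", echo_service, tests))  # True: admitted
    print("echo" in registry.published)                    # True
    ```

    The key property, as in the thesis, is that clients discovering services at runtime only ever see entries that have already passed the audition, so discovery quality is enforced at publication rather than at invocation.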

    Study supporting the interim evaluation of the innovation principle. Final Report November 2019

    The European Commission has recognised the importance of a more innovation-oriented EU acquis, gradually exploring the ways in which EU rules can support innovation. The ‘innovation principle’ was introduced to ensure that whenever policy is developed, the impact on innovation is fully assessed. However, as further discussed in this Study, the exact contours of the innovation principle have been shaped very gradually within the context of the EU better regulation agenda: originally advocated by industry in the context of the precautionary principle, the innovation principle has gradually been given a more articulate and consistent role, which aims at complementing the precautionary principle by increasing the salience of impacts on innovation during all phases of the policy cycle. This Study presents an evaluation of the current implementation of the innovation principle, limited to two of its three components, i.e. the Research and Innovation Tool included in the Better Regulation Toolbox, and the innovation deals. As a preliminary caveat, it is important to recall that the implementation of the innovation principle is still in its infancy, and thus the Study only represents a very early assessment of the extent to which the innovation principle is being correctly implemented, and whether changes would be required to make the principle more effective and useful in the context of the EU better regulation agenda. The main finding is that the innovation principle has the potential to contribute to the quality and future-proof nature of EU policy, but that significant changes and effort will be needed for this potential to fully materialise. The most evident areas for improvement are related to the lack of a clear legal basis, the lack of a widely acknowledged definition, the lack of awareness among EU officials and stakeholders, and the lack of adequate skills among those that are called to implement the innovation principle.
As a result of these problems, the impact of the innovation principle on the innovation-friendliness of the EU acquis has been limited so far. The Commission should clarify in official documents that the innovation principle does not entail a de-regulatory approach, and is not incompatible with the precautionary principle: this would also help to have the principle fully recognised and endorsed by all EU institutions, as well as by civil society, often concerned with the possible anti-regulatory narrative around the innovation principle in stakeholder discussions. Apart from clarifications, and further dissemination and training, major improvements are possible in the near future, especially if the innovation principle is brought fully in line with the evolving data-driven nature of digital innovation and provides more guidance to the Commission on how to design experimental regulation, including inter alia so-called ‘regulatory sandboxes’. Finally, the Commission should ensure that the innovation principle is given prominence with the transition to the Horizon Europe programme, in particular due to the anticipated launch of ‘missions’ in key domains.

    IMP Science Gateway: from the Portal to the Hub of Virtual Experimental Labs in Materials Science

    "Science gateway" (SG) denotes a user-friendly, intuitive interface between scientists (or scientific communities) and different software components + various distributed computing infrastructures (DCIs) (like grids, clouds, clusters), where researchers can focus on their scientific goals rather than on the peculiarities of the software/DCI. The "IMP Science Gateway Portal" (http://scigate.imp.kiev.ua) for complex workflow management and integration of distributed computing resources (like clusters, service grids, desktop grids, clouds) is presented. It is created on the basis of WS-PGRADE and gUSE technologies, where WS-PGRADE is designed for science workflow operation and gUSE for smooth integration of available resources for parallel and distributed computing in various heterogeneous DCIs. Typical scientific workflows, with possible scenarios of their preparation and usage, are presented. Several typical use cases for these science applications (scientific workflows) are considered for molecular dynamics (MD) simulations of the complex behavior of various nanostructures (nanoindentation of graphene layers, defect system relaxation in metal nanocrystals, thermal stability of boron nitride nanotubes, etc.). The user experience is analyzed in the context of practical applications for MD simulations in materials science, physics and nanotechnologies with available heterogeneous DCIs.
In conclusion, the "science gateway" approach, a workflow manager (like WS-PGRADE) + a DCI resources manager (like gUSE), makes it possible to use an SG portal (like the "IMP Science Gateway Portal") in a very promising way, namely as a hub of various virtual experimental labs (different software components + various resource requirements) in the context of practical MD applications in materials science, physics, chemistry, biology, and nanotechnologies. Comment: 6 pages, 5 figures, 3 tables; 6th International Workshop on Science Gateways, IWSG-2014 (Dublin, Ireland, 3-5 June, 2014). arXiv admin note: substantial text overlap with arXiv:1404.545

    Marketing relations and communication infrastructure development in the banking sector based on big data mining

    Purpose: The article aims to study the methodological tools for applying technologies of intelligent analysis of big data in the modern digital space, the further implementation of which can become the basis for implementing the marketing relations concept in the banking sector of the Russian Federation's economy. Structure/Methodology/Approach: For the development of marketing relations in the banking sector in the digital economy, it seems necessary: first, to identify the opportunities and advantages of big data mining in banking marketing; second, to identify the sources and methods of processing big data; third, to study examples of the successful use of big data mining by Russian banks and to formulate recommendations on implementing big data technologies in a digital banking marketing strategy. Findings: The authors' analysis showed that big data processing of open online and offline sources of information significantly increases the amount of data available for intelligent analysis, as a result of which the interaction between the bank and the target client reaches a new level of partnership. Practical Implications: The conclusions and generalizations of the study can be applied in the practice of managing financial institutions. The results of the study can be used by bank management to form a digital marketing strategy for long-term communication. Originality/Value: The main contribution of this study is that the authors have identified the main directions of using big data in relationship marketing to generate additional profit, as well as the possibilities of intelligent analysis of the client base, aimed at expanding market share and retaining customers in the banking sector of the economy.