1,984 research outputs found

    Metadata Exploitation in Large-scale Data Migration Projects

    The inherent complexity of large-scale information integration efforts has led to the proliferation of numerous metadata capabilities to improve project management, quality control and governance. In this paper, we use complex information integration projects in the context of SAP application consolidation to analyze several new metadata capabilities that enable improved governance and control of data quality. Further, we identify key focus areas for shaping future industrial and academic research efforts by investigating certain unaddressed aspects of these capabilities that often negatively impact information integration projects.

    Visually Analyzing Company-wide Software Service Dependencies: An Industrial Case Study

    Managing dependencies between software services is a crucial task for any company operating cloud applications. Visualizations can help to understand and maintain these complex dependencies. In this paper, we present a force-directed service dependency visualization and filtering tool that has been developed and used within SAP. The tool's use cases include guiding service retirement as well as understanding service deployment landscapes and their relationship to the company's organizational structure. We report how we built and adapted the tool under strict time constraints to address the requirements of our users. We further share insights on how we enabled internal adoption. For us, starting with a minimal viable visualization and then quickly responding to user feedback was essential for convincing users of the tool's value. The final version of the tool enabled users to visually understand company-wide service consumption, supporting data-driven decision making. Comment: 5 pages, 3 figures, 1 table; 11th IEEE Working Conference on Software Visualization (VISSOFT 2023).
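The abstract does not include the tool's implementation; as a rough illustration of the force-directed technique it names, the following is a minimal Fruchterman-Reingold spring embedder over a hypothetical service call graph (all service names are invented, not taken from the paper):

```python
import math
import random

def force_directed_layout(nodes, edges, iterations=200, width=1.0, height=1.0):
    """Minimal Fruchterman-Reingold spring embedder.

    nodes: list of node ids; edges: list of (u, v) dependency pairs.
    Returns {node: (x, y)} positions inside the width x height rectangle.
    """
    random.seed(42)  # fixed seed so the layout is reproducible
    pos = {n: (random.random(), random.random()) for n in nodes}
    k = math.sqrt((width * height) / len(nodes))  # ideal edge length
    t = width / 10.0  # "temperature": caps per-step node movement
    for _ in range(iterations):
        disp = {n: [0.0, 0.0] for n in nodes}
        # Repulsive force between every pair of nodes
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                dx = pos[u][0] - pos[v][0]
                dy = pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d
                disp[u][0] += dx / d * f; disp[u][1] += dy / d * f
                disp[v][0] -= dx / d * f; disp[v][1] -= dy / d * f
        # Attractive force along each dependency edge
        for u, v in edges:
            dx = pos[u][0] - pos[v][0]
            dy = pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k
            disp[u][0] -= dx / d * f; disp[u][1] -= dy / d * f
            disp[v][0] += dx / d * f; disp[v][1] += dy / d * f
        # Move nodes (capped by temperature), then cool down
        for n in nodes:
            dx, dy = disp[n]
            d = math.hypot(dx, dy) or 1e-9
            step = min(d, t)
            x = min(width, max(0.0, pos[n][0] + dx / d * step))
            y = min(height, max(0.0, pos[n][1] + dy / d * step))
            pos[n] = (x, y)
        t *= 0.95
    return pos

# Hypothetical service dependency graph: an edge (a, b) means "a calls b".
services = ["checkout", "payments", "inventory", "auth", "ledger"]
calls = [("checkout", "payments"), ("checkout", "inventory"),
         ("checkout", "auth"), ("payments", "auth"), ("payments", "ledger")]
layout = force_directed_layout(services, calls)
```

In practice a library layout (e.g., a graph library's spring layout) plus interactive filtering would replace this hand-rolled loop; the sketch only shows the repulsion/attraction balance that makes dependency clusters visible.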

    Conservation science in NOAA’s National Marine Sanctuaries: description and recent accomplishments

    This report describes cases relating to the management of national marine sanctuaries in which certain scientific information was required so managers could make decisions that effectively protected trust resources. The cases presented represent only a fraction of the difficult issues that marine sanctuary managers deal with daily. They include, among others, problems related to wildlife disturbance, vessel routing, marine reserve placement, watershed management, oil spill response, and habitat restoration. Scientific approaches to address these problems vary significantly, and include literature surveys, data mining, field studies (monitoring, mapping, observations, and measurement), geospatial and biogeographic analysis, and modeling. In most cases there is also an element of expert consultation and collaboration among multiple partners, agencies with resource protection responsibilities, and other users and stakeholders. The resulting management responses may involve direct intervention (e.g., for spill response or habitat restoration issues), proposal of boundary alternatives for marine sanctuaries or reserves, changes in agency policy or regulations, making recommendations to other agencies with resource protection responsibilities, proposing changes to international or domestic shipping rules, or development of new education or outreach programs. (PDF contains 37 pages.)

    SIMDAT


    Preservation process modelling (including a review of semantic process modelling and workflow languages)

    This report describes in a formalised way a comprehensive set of processes for digital preservation. These processes are drawn from a series of relevant projects and standards from the preservation community, including OAIS, TRAC, PLANETS and others. The result is intended to be used as a generic baseline from which those interested in audiovisual preservation can refer to, extract and customise processes to fit their specific AV preservation needs.

    Data Quality Management in Corporate Practice

    The 21st century is characterized by a rising quantity and importance of data and information. Companies utilize these to gain and maintain competitive advantages, and therefore require data and information in both high quantity and high quality. But while the amount of data collected is steadily increasing, the same is not necessarily true for data quality. To assure high data quality, the concept of Data Quality Management (DQM) has been established, incorporating elements such as the assessment of data quality as well as its improvement. In order to discuss the issue of Data Quality Management, this paper pursues the following goals: (1) a systematic literature search for publications regarding Data Quality Management (scientific contributions, practice reports, etc.); (2) provision of a structured overview of the identified references and the research material; (3) analysis and evaluation of the scientific contributions with regard to methodology and theoretical foundation; (4) an account of the current expression of DQM in practice, differentiated by organization type and industry (based upon the entire research material), as well as an assessment of the situation (how well the design recommendations are grounded in research results); (5) a summary of unresolved issues and challenges, based upon the research material.

    Building an embedded enterprise performance management solution: an exploratory case study

    Project Work presented as partial requirement for obtaining the master's degree in Statistics and Information Systems and Information Technologies Management. Nowadays most companies struggle to manage large volumes of data and spend a lot of money on storing and capturing it. To benefit from the stored data, enterprises implement Business Intelligence solutions and technology-driven processes. The most significant advantage of BI is providing actionable information for data-driven business decisions by executives and managers. Since technology is evolving very fast, Business Intelligence processes are becoming more advanced every day. These advancements promote accountability, visibility, timely actionable information, increased return on investment, connected business processes, standardized management processes and augmented organizational flexibility. In relationship with BI, enterprise performance management provides more predictable answers to these advancements by improving planning, budgeting, financial reporting, and consolidation. Therefore, this study aims to contribute to a better understanding of the implementation process of embedded Enterprise Performance Management solutions on ERP-embedded BI platforms by revealing its methodology, steps, significant milestones, and the effectiveness of the organizational structure. The embedded approach is maintained by a Business Intelligence based Business Planning and Consolidation tool on an Enterprise Resource Planning system. Embedded Enterprise Performance Management solutions consist of analysis reporting, business planning, and consolidation; together these cover budgeting, planning, and consolidation as an integrated whole. The implementation of the artefact aims to satisfy market competition requirements and to meet financial demands originating from growth at the organizational level. There are several studies in the literature that focus on the critical success factors of BI projects, but few that focus mainly on the process evaluation of embedded enterprise performance management solutions and their success in organizations. This study is an exploratory design research case study of a group company specialized in language translation in 30 countries on five continents.

    Helmholtz Portfolio Theme Large-Scale Data Management and Analysis (LSDMA)

    The Helmholtz Association funded the "Large-Scale Data Management and Analysis" portfolio theme from 2012 to 2016. Four Helmholtz centres, six universities and another research institution in Germany joined to enable data-intensive science by optimising data life cycles in selected scientific communities. In our Data Life Cycle Labs, data experts performed joint R&D together with the scientific communities. The Data Services Integration Team focused on generic solutions applied by several communities.

    Towards a Rosetta Stone for translating data between information systems

    Information systems are an important organizational asset and offer numerous benefits. However, organizations face continued challenges when upgrading ageing information systems, and the data contained within them, to newer platforms. This article explores, through conversations with information systems professionals in four organizations, the potential development of a ‘Rosetta Stone’, which can translate data between systems and be used to help overcome various challenges associated with their modernization. Despite mixed feedback regarding the Rosetta Stone concept from interviewees, solutions highlighted in the literature combined with participant feedback presented theories for its development, primarily as a tool to enable meaningful interpretation of data, rather than direct translation. The conclusion reflects on the data collected to recommend a framework for how the tool might be developed, which has the potential to be of significant interest to practitioners, open-source communities and organizations.
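The article favours interpretation over direct translation; one minimal way to sketch that idea (all field names and conversions below are invented for illustration, not taken from the article) is a mapping table that translates known fields and preserves unknown ones for later human interpretation:

```python
# A hypothetical field-mapping "Rosetta Stone": each entry maps a legacy
# field name to a target field name plus a conversion function.
MAPPING = {
    "cust_no": ("customer_id", str),
    "dob":     ("date_of_birth", lambda v: v.replace("/", "-")),
    "bal":     ("balance_eur", float),
}

def translate_record(record, mapping=MAPPING):
    """Translate one legacy record into the target schema.

    Fields absent from the mapping are kept under 'unmapped' rather than
    dropped, so a person can still interpret them later: translation as
    an interpretation aid, not a lossless converter.
    """
    out, unmapped = {}, {}
    for field, value in record.items():
        if field in mapping:
            target, convert = mapping[field]
            out[target] = convert(value)
        else:
            unmapped[field] = value
    if unmapped:
        out["unmapped"] = unmapped
    return out

legacy = {"cust_no": 1042, "dob": "1980/01/31",
          "bal": "99.50", "legacy_flag": "Y"}
modern = translate_record(legacy)
# modern == {"customer_id": "1042", "date_of_birth": "1980-01-31",
#            "balance_eur": 99.5, "unmapped": {"legacy_flag": "Y"}}
```

Keeping the `unmapped` bucket is the design choice that matches the article's conclusion: the tool surfaces data it cannot confidently translate instead of silently discarding or guessing at it.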