
    Designing Traceability into Big Data Systems

    Providing an appropriate level of accessibility and traceability to data or process elements (so-called Items) in large volumes of data, often Cloud-resident, is an essential requirement in the Big Data era. Enterprise-wide data systems need to be designed from the outset to support usage of such Items across the spectrum of business use rather than from any specific application view. The design philosophy advocated in this paper is to drive the design process using a so-called description-driven approach which enriches models with meta-data and description and focuses the design process on Item re-use, thereby promoting traceability. Details are given of the description-driven design of big data systems at CERN, in health informatics and in business process management. Evidence is presented that the approach leads to design simplicity and consequent ease of management thanks to loose typing and the adoption of a unified approach to Item management and usage. Comment: 10 pages; 6 figures; in Proceedings of the 5th Annual International Conference on ICT: Big Data, Cloud and Security (ICT-BDCS 2015), Singapore, July 2015. arXiv admin note: text overlap with arXiv:1402.5764, arXiv:1402.575
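
    The abstract describes the description-driven idea only at the architectural level. The Python sketch below is a hedged illustration of that idea, not the authors' implementation: the names ItemDescription, Item and TraceLog are invented here. The point it shows is that an Item's type lives in data (its description, enriched with metadata) rather than in code, which gives the loose typing mentioned above, and that a single generic trace log records every use of an Item to support traceability.

    # Minimal sketch, assuming hypothetical names; not the paper's system.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Any

    @dataclass
    class ItemDescription:
        """Metadata describing a class of Items (the 'description' layer)."""
        name: str
        version: str
        attributes: dict[str, str]   # attribute name -> declared type, held as data

    @dataclass
    class Item:
        """A data or process element whose structure comes from its description."""
        item_id: str
        description: ItemDescription   # the Item carries its own description
        values: dict[str, Any] = field(default_factory=dict)

    class TraceLog:
        """Records every use of an Item so its lineage can be reconstructed."""
        def __init__(self) -> None:
            self._events: list[tuple[str, str, str]] = []

        def record(self, item: Item, action: str) -> None:
            self._events.append(
                (datetime.now(timezone.utc).isoformat(), item.item_id, action)
            )

        def history(self, item_id: str) -> list[tuple[str, str, str]]:
            return [e for e in self._events if e[1] == item_id]

    # The same generic machinery handles a physics run, a patient record or a
    # business-process step, because the difference lives in the description.
    run_desc = ItemDescription("DetectorRun", "1.0", {"energy_tev": "float"})
    run = Item("run-42", run_desc, {"energy_tev": 13.0})
    log = TraceLog()
    log.record(run, "created")
    log.record(run, "read-by-analysis-job")
    print(log.history("run-42"))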

    A situational approach for the definition and tailoring of a data-driven software evolution method

    Successful software evolution heavily depends on the selection of the right features to be included in the next release. Such selection is difficult, and companies often report bad experiences about user acceptance. To overcome this challenge, there is an increasing number of approaches that propose intensive use of data to drive evolution. This trend has motivated the SUPERSEDE method, which proposes the collection and analysis of user feedback and monitoring data as the baseline to elicit and prioritize requirements, which are then used to plan the next release. However, every company may be interested in tailoring this method depending on factors like project size, scope, etc. In order to provide a systematic approach, we propose the use of Situational Method Engineering to describe SUPERSEDE and guide its tailoring to a particular context. Peer reviewed. Postprint (author's final draft).
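
    The sketch below illustrates, under stated assumptions, the Situational Method Engineering idea the abstract refers to: a method is assembled from fragments, and situational factors of the company or project decide which fragments are included. The fragments, context factors and selection rules shown here are invented for illustration and are not taken from SUPERSEDE or the paper.

    # Illustrative sketch only; fragment names and rules are hypothetical.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Context:
        project_size: str      # e.g. "small" or "large"
        has_end_users: bool    # can explicit user feedback be collected?
        runtime_monitoring: bool

    @dataclass(frozen=True)
    class MethodFragment:
        name: str
        def applies(self, ctx: Context) -> bool:
            raise NotImplementedError

    @dataclass(frozen=True)
    class FeedbackCollection(MethodFragment):
        def applies(self, ctx: Context) -> bool:
            return ctx.has_end_users

    @dataclass(frozen=True)
    class MonitoringAnalysis(MethodFragment):
        def applies(self, ctx: Context) -> bool:
            return ctx.runtime_monitoring

    @dataclass(frozen=True)
    class ReleasePlanning(MethodFragment):
        def applies(self, ctx: Context) -> bool:
            return True   # always part of the method

    def tailor(fragments: list[MethodFragment], ctx: Context) -> list[str]:
        """Assemble the situational method: keep only fragments that fit the context."""
        return [f.name for f in fragments if f.applies(ctx)]

    catalogue = [
        FeedbackCollection("collect user feedback"),
        MonitoringAnalysis("analyse monitoring data"),
        ReleasePlanning("prioritise and plan next release"),
    ]
    print(tailor(catalogue, Context("small", has_end_users=True, runtime_monitoring=False)))
    # -> ['collect user feedback', 'prioritise and plan next release']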

    Adding Value to Statistics in the Data Revolution Age

    Like many statistical offices, and in accordance with the European Statistical System commitment to Vision 2020, Istat has since the second half of 2014 implemented its internal standardisation and industrialisation process within the framework of a common Business Architecture. Istat's modernisation programme aims at building services and infrastructures within a plug-and-play framework to foster innovation, promote reuse and move towards full integration and interoperability of statistical processes, consistent with a service-oriented architecture. This is expected to lead to higher effectiveness and productivity by improving the quality of statistical information and reducing the response burden. This paper addresses the strategy adopted by Istat, which focuses on exploiting administrative data and new data sources in order to achieve its key goals and enhance value for users. The strategy rests on a set of priorities: services centred on users and stakeholders, and Linked Open Data, which enables machine-to-machine data and metadata integration through the definition of common statistical ontologies and semantics.
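
    As a hedged illustration of the Linked Open Data point above, the sketch below publishes one statistical observation and its metadata as RDF so that another machine can consume it against a shared vocabulary. The namespace, property names and values are made up for illustration; Istat's actual ontologies are not reproduced here. It uses the rdflib library (pip install rdflib).

    # Illustrative only: hypothetical ontology and made-up values.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, XSD

    STAT = Namespace("http://example.org/statistical-ontology#")   # hypothetical ontology

    g = Graph()
    g.bind("stat", STAT)

    obs = URIRef("http://example.org/observation/unemployment-2014-Q3")
    g.add((obs, RDF.type, STAT.Observation))
    g.add((obs, STAT.indicator, Literal("unemployment rate")))
    g.add((obs, STAT.referencePeriod, Literal("2014-Q3")))
    g.add((obs, STAT.value, Literal("12.6", datatype=XSD.decimal)))   # illustrative figure
    g.add((obs, STAT.source, Literal("administrative data")))

    # Serialise as Turtle: this is what a machine-to-machine consumer would fetch.
    print(g.serialize(format="turtle"))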

    Software Engineering Timeline: major areas of interest and multidisciplinary trends

    Software engineering; evolution. Society today cannot run without software and, by extension, without Software Engineering. Since the discipline emerged in 1968, practitioners have learned valuable lessons that have contributed to current practice. Some of those lessons have become outdated, but many are still relevant and widely used. From the personal and admittedly incomplete perspective of the authors, this paper not only reviews the major milestones and areas of interest in the Software Engineering timeline, helping software engineers to appreciate the current state of things, but also tries to give some insight into the trends that this complex engineering discipline will see in the near future.

    Safety-Critical Systems and Agile Development: A Mapping Study

    In the last decades, agile methods have had a huge impact on how software is developed. In many cases, this has led to significant benefits, such as higher quality and faster delivery of software to customers. However, safety-critical systems have largely been excluded from the benefits of agile methods. Products that include safety-critical aspects therefore face a situation in which the development of the safety-critical parts can significantly limit the potential speed-up through agile methods, not only for the full product but also for the non-safety-critical parts. For such products, the ability to develop safety-critical software in an agile way will generate a competitive advantage. In order to enable future research in this important area, we present in this paper a mapping of the current state of practice based on a mixed-method approach. Starting from a workshop with experts from six large Swedish product development companies, we develop a lens for our analysis. We then present a systematic mapping study on safety-critical systems and agile development through this lens in order to map potential benefits, challenges, and solution candidates for guiding future research. Comment: Accepted at Euromicro Conf. on Software Engineering and Advanced Applications 2018, Prague, Czech Republic