
    BOA: Framework for Automated Builds

    Managing large-scale software products is a complex software engineering task. Automation of the software development, release and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms and a distributed environment are typical factors. This paper describes the Build and Output Analyzer (BOA) framework and its components, which have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of existing versions.
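    A minimal sketch of the kind of nightly rebuild-and-report loop such a framework coordinates is shown below; the build commands, report location and JSON report format are hypothetical placeholders for illustration, not BOA's actual interface.

        import datetime
        import json
        import pathlib
        import subprocess

        # Hypothetical project settings; BOA's real configuration is not described in the abstract.
        BUILD_CMD = ["make", "-j4"]
        TEST_CMD = ["make", "test"]
        REPORT_DIR = pathlib.Path("build_reports")

        def run_step(name, cmd):
            """Run one build or test step and capture its outcome for later analysis."""
            result = subprocess.run(cmd, capture_output=True, text=True)
            return {
                "step": name,
                "command": " ".join(cmd),
                "returncode": result.returncode,
                "stderr_tail": result.stderr[-2000:],  # keep only the tail for the report
            }

        def nightly_rebuild():
            """Perform a regular rebuild of the development code and archive the outcome."""
            steps = [run_step("build", BUILD_CMD), run_step("test", TEST_CMD)]
            REPORT_DIR.mkdir(exist_ok=True)
            stamp = datetime.date.today().isoformat()
            (REPORT_DIR / f"report-{stamp}.json").write_text(json.dumps(steps, indent=2))
            return all(s["returncode"] == 0 for s in steps)

        if __name__ == "__main__":
            print("build OK" if nightly_rebuild() else "build FAILED")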

    Configuration management and software measurement in the Ground Systems Development Environment (GSDE)

    A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software is described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for the SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.

    GeneLink: a database to facilitate genetic studies of complex traits

    BACKGROUND: In contrast to gene-mapping studies of simple Mendelian disorders, genetic analyses of complex traits are far more challenging, and high quality data management systems are often critical to the success of these projects. To minimize the difficulties inherent in complex trait studies, we have developed GeneLink, a Web-accessible, password-protected Sybase database. RESULTS: GeneLink is a powerful tool for complex trait mapping, enabling genotypic data to be easily merged with pedigree and extensive phenotypic data. Specifically designed to facilitate large-scale (multi-center) genetic linkage or association studies, GeneLink securely and efficiently handles large amounts of data and provides additional features to facilitate quality control and data analysis by existing software packages. These include the ability to download chromosome-specific data files containing marker data in map order in various formats appropriate for downstream analyses (e.g., GAS and LINKAGE). Furthermore, an unlimited number of phenotypes (either qualitative or quantitative) can be stored and analyzed. Finally, GeneLink generates several quality assurance reports, including genotyping success rates of specified DNA samples or success and heterozygosity rates for specified markers. CONCLUSIONS: GeneLink has already proven to be an invaluable tool for complex trait mapping studies and is discussed primarily in the context of our large, multi-center study of hereditary prostate cancer (HPC). GeneLink is freely available at
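    As an illustration of the kind of quality assurance report described above, the sketch below computes per-marker genotyping success and heterozygosity rates from a small in-memory table; the record layout and marker names are assumptions for illustration, not GeneLink's actual database schema.

        from collections import defaultdict

        # Hypothetical genotype records: (sample_id, marker, allele1, allele2);
        # None alleles denote a failed genotyping call.
        genotypes = [
            ("S1", "D17S1301", "A", "B"),
            ("S2", "D17S1301", "A", "A"),
            ("S3", "D17S1301", None, None),
            ("S1", "D1S518", "B", "B"),
            ("S2", "D1S518", "A", "B"),
        ]

        def marker_qc(records):
            """Per-marker genotyping success rate and heterozygosity rate."""
            per_marker = defaultdict(list)
            for _sample, marker, a1, a2 in records:
                per_marker[marker].append((a1, a2))
            report = {}
            for marker, calls in per_marker.items():
                typed = [c for c in calls if None not in c]
                success = len(typed) / len(calls)
                het = sum(a1 != a2 for a1, a2 in typed) / len(typed) if typed else 0.0
                report[marker] = {"success_rate": success, "heterozygosity": het}
            return report

        print(marker_qc(genotypes))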

    Monitoring the CMS strip tracker readout system

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and to steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system, describe its software components, which are already in place, and the various monitoring streams available, and report our experiences of operating and monitoring a large-scale system.
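    The monitoring logic itself is not detailed in the abstract; the sketch below is a hypothetical illustration of the error-to-recovery pattern it describes (real-time feedback plus automatic recovery actions), with invented check names and thresholds.

        import logging

        logging.basicConfig(level=logging.INFO)
        log = logging.getLogger("daq-monitor")

        # Hypothetical per-check error limits; the real system's checks are not given in the abstract.
        LIMITS = {"sync_error_count": 10, "buffer_occupancy": 0.9}

        def read_metrics():
            """Stand-in for querying the DAQ monitoring streams."""
            return {"sync_error_count": 3, "buffer_occupancy": 0.95}

        def recover(check):
            """Placeholder for an automatic recovery action, e.g. resynchronising a readout board."""
            log.warning("triggering recovery action for %s", check)

        def monitor_once():
            metrics = read_metrics()
            for check, limit in LIMITS.items():
                value = metrics[check]
                if value > limit:
                    log.error("%s = %s exceeds limit %s", check, value, limit)
                    recover(check)  # automatic recovery in case of errors
                else:
                    log.info("%s = %s within limits", check, value)

        if __name__ == "__main__":
            monitor_once()  # in practice this would run periodically and archive results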

    The NEST software development infrastructure

    Software development in the Computational Sciences has reached a critical level of complexity in recent years. This “complexity bottleneck” occurs both for the programming languages and technologies used during development and for the infrastructure needed to sustain the development of large-scale software projects and keep the code base manageable [1]. As development shifts from specialized, solution-tailored in-house code (often written by a single developer or only a few developers) towards more general software packages written by larger teams of programmers, it becomes inevitable to use professional software engineering tools in the realm of scientific software development as well. In addition, the move to collaboration-based large-scale projects (e.g. BrainScaleS) also means a larger user base, which depends and relies on the quality and correctness of the code. In this contribution, we present the tools and infrastructure that have been introduced over the years to support the development of NEST, a simulator for large networks of spiking neurons [2]. In particular, we show our use of:
    • version control systems
    • bug tracking software
    • web-based wiki and blog engines
    • frameworks for carrying out unit tests (see the sketch below)
    • systems for continuous integration
    References:
    [1] Gregory Wilson (2006). Where's the Real Bottleneck in Scientific Computing? American Scientist, 94(1): 5-6, doi:10.1511/2006.1.5.
    [2] Marc-Oliver Gewaltig and Markus Diesmann (2007). NEST (Neural Simulation Tool), Scholarpedia, 2(4): 1430.
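    As a minimal illustration of the unit-testing item above, the sketch below tests a toy leaky-integrator model against its analytic solution; it uses Python's standard unittest module only and does not rely on NEST's actual API.

        import math
        import unittest

        def lif_free_decay(v0, tau_m, dt, steps):
            """Euler integration of a leaky membrane potential with no input current."""
            v = v0
            for _ in range(steps):
                v += -v / tau_m * dt
            return v

        class TestLeakyIntegrator(unittest.TestCase):
            def test_decay_matches_analytic_solution(self):
                v0, tau_m, dt, steps = 10.0, 20.0, 0.01, 1000
                numeric = lif_free_decay(v0, tau_m, dt, steps)
                analytic = v0 * math.exp(-steps * dt / tau_m)
                # Euler integration introduces a small error; a tolerance keeps the test stable.
                self.assertAlmostEqual(numeric, analytic, places=2)

        if __name__ == "__main__":
            unittest.main()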

    Mindcontrol: a web application for brain segmentation quality control

    Tissue classification plays a crucial role in the investigation of normal neural development, brain-behavior relationships, and the disease mechanisms of many psychiatric and neurological illnesses. Ensuring the accuracy of tissue classification is important for quality research and, in particular, the translation of imaging biomarkers to clinical practice. Assessment with the human eye is vital to correct various errors inherent to all currently available segmentation algorithms. Manual quality assurance becomes methodologically difficult at a large scale, a problem of increasing importance as the number of data sets continues to rise. To make this process more efficient, we have developed Mindcontrol, an open-source web application for the collaborative quality control of neuroimaging processing outputs. The Mindcontrol platform consists of a dashboard to organize data, descriptive visualizations to explore the data, an imaging viewer, and an in-browser annotation and editing toolbox for data curation and quality control. Mindcontrol is flexible and can be configured for the outputs of any software package in any data organization structure. Example configurations for three large, open-source datasets are presented: the 1000 Functional Connectomes Project (FCP), the Consortium for Reliability and Reproducibility (CoRR), and the Autism Brain Imaging Data Exchange (ABIDE) Collection. These demo applications link descriptive quality control metrics, regional brain volumes, and thickness scalars to a 3D imaging viewer and editing module, resulting in an easy-to-implement quality control protocol that can be scaled for any size and complexity of study.
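    Mindcontrol's actual configuration schema is not reproduced here; the sketch below only illustrates, with invented field names and file patterns, how a QC dashboard might be pointed at the outputs of an arbitrary segmentation pipeline.

        # Hypothetical configuration mapping segmentation outputs to QC dashboard entries.
        qc_config = {
            "dataset": "example_study",
            "modules": [
                {
                    "name": "tissue_segmentation",  # assumed module name
                    "image_pattern": "derivatives/seg/{subject}/aseg.nii.gz",
                    "metrics": ["brain_volume_mm3", "mean_cortical_thickness_mm"],
                    "editable": True,  # allow in-browser label edits
                }
            ],
        }

        def entries_for_subjects(config, subjects):
            """Expand the configuration into one QC entry per subject and module."""
            entries = []
            for module in config["modules"]:
                for subject in subjects:
                    entries.append({
                        "subject": subject,
                        "module": module["name"],
                        "image": module["image_pattern"].format(subject=subject),
                        "status": "unchecked",
                    })
            return entries

        print(entries_for_subjects(qc_config, ["sub-01", "sub-02"]))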

    Time At Your Service: Schedulability Analysis of Real-Time and Distributed Services

    Software today is distributed over several processing units. At a large scale this may span the globe via the internet; at the micro scale, software may be distributed over several small processing units embedded in one device. Real-time distributed software and services need to be timely and respond to requests in time. The quality of service of real-time software depends on how it schedules its tasks for execution. In state-of-the-art programming of distributed software, for example in Java, scheduling is left to the underlying infrastructure, in particular the operating system, and is no longer under the control of the application. In this thesis, we introduce a software paradigm based on object orientation in which real-time concurrent objects are enabled to specify their own scheduling strategy. We developed high-level formal models for specifying distributed software based on this paradigm, in which quality-of-service requirements are specified as deadlines on performing and finishing tasks. At this level we developed techniques to verify that these requirements are satisfied. This research has opened the way to a new approach to modeling and analyzing a range of applications, such as continuous planning in the context of logistics software in a dynamic environment, as well as developing software for multi-core systems. Industrial companies (DEAL services) and research centers (the Uppsala Programming for Multicore Architectures Research Center, UPMARC) have already shown interest in the results of this thesis.
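    The thesis defines its models in a formal specification language rather than executable code; as a loose, hypothetical illustration only, the sketch below shows an object that schedules its own pending tasks by earliest deadline and reports deadline misses.

        import heapq
        from dataclasses import dataclass, field
        from typing import Callable

        @dataclass(order=True)
        class Task:
            deadline: float  # absolute deadline, in abstract time units
            name: str = field(compare=False)
            action: Callable[[], None] = field(compare=False)
            duration: float = field(compare=False, default=1.0)

        class RealTimeObject:
            """Toy concurrent object that owns its scheduling policy (earliest deadline first)."""

            def __init__(self):
                self.queue = []
                self.clock = 0.0

            def invoke(self, task):
                heapq.heappush(self.queue, task)

            def run(self):
                while self.queue:
                    task = heapq.heappop(self.queue)  # EDF: earliest deadline first
                    self.clock += task.duration
                    task.action()
                    status = "ok" if self.clock <= task.deadline else "DEADLINE MISSED"
                    print(f"t={self.clock:4.1f}  {task.name}: {status}")

        obj = RealTimeObject()
        obj.invoke(Task(deadline=5.0, name="update_plan", action=lambda: None, duration=2.0))
        obj.invoke(Task(deadline=3.0, name="report_status", action=lambda: None, duration=2.0))
        obj.run()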

    Towards Modeling Software Quality of Virtual Reality Applications from Users' Perspectives

    Virtual Reality (VR) technology has become increasingly popular in recent years as a key enabler of the Metaverse. VR applications have unique characteristics, including revolutionized human-computer interaction mechanisms, that distinguish them from traditional software. Hence, user expectations for the software quality of VR applications diverge from those for traditional software. Investigating these quality expectations is crucial for the effective development and maintenance of VR applications, yet it remains an under-explored area in prior research. To bridge the gap, we conduct the first large-scale empirical study to model the software quality of VR applications from users' perspectives. To this end, we analyze 1,132,056 user reviews of 14,150 VR applications across seven app stores through a semi-automatic review mining approach. We construct a taxonomy of 12 software quality attributes that are of major concern to VR users. Our analysis reveals that VR-specific quality attributes are of utmost importance to users; these attributes are closely related to the unique properties of VR applications, such as revolutionized interaction mechanisms and immersive experiences. Our examination of relevant user complaints reveals the major factors impacting user satisfaction with VR-specific quality attributes. We identify that poor design or implementation of the movement mechanisms, control mechanisms, multimedia systems, and physics can significantly degrade the user experience. Moreover, we discuss the implications of our findings for VR quality assurance, for both developers and researchers, to shed light on future work. For instance, we suggest that developers implement sufficient accessibility and comfort options so that users with mobility limitations, sensory impairments, and other specific needs can customize the interaction mechanisms. Our datasets and results will be released to facilitate follow-up studies.
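    The paper's semi-automatic mining pipeline is not specified in the abstract; as a purely hypothetical illustration of the kind of keyword-based first pass such studies often start from (not the authors' actual method), consider:

        # Hypothetical keyword lexicon mapping review text to candidate quality attributes;
        # the paper's taxonomy and pipeline are more elaborate and include manual validation.
        ATTRIBUTE_KEYWORDS = {
            "movement mechanism": ["teleport", "locomotion", "walking"],
            "control mechanism": ["controller", "grip", "button mapping"],
            "comfort": ["motion sickness", "nausea", "dizzy"],
            "performance": ["lag", "frame rate", "stutter"],
        }

        def tag_review(text):
            """Return the candidate quality attributes mentioned in one review."""
            lowered = text.lower()
            return [attr for attr, words in ATTRIBUTE_KEYWORDS.items()
                    if any(word in lowered for word in words)]

        reviews = [
            "The teleport locomotion makes me dizzy after ten minutes.",
            "Frame rate drops badly in the hub area.",
        ]
        for review in reviews:
            print(tag_review(review), "<-", review)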

    Improving Software Systems By Flow Control Analysis

    Using agile methods during the implementation of a system that must meet mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of the artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the “big-picture model”. Analysis of the flow graph helped to avoid a large number of defects whose repair cost in extreme cases could have undermined the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big-picture model” improves control of the quality parameters of the software architecture. The article also attempts to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.
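    The article's formalization is not reproduced in the abstract; the sketch below is a minimal, hypothetical illustration of deriving a message flow graph from structured requirement triples (sender, message, receiver) and running a simple reachability check over it.

        from collections import defaultdict

        # Hypothetical triples extracted from formalized natural-language requirements.
        requirements = [
            ("Sensor", "MeasurementReport", "Controller"),
            ("Controller", "ActuateCommand", "Actuator"),
            ("Controller", "StatusUpdate", "OperatorConsole"),
        ]

        def build_flow_graph(triples):
            """Build an adjacency map of message flows between (sub)system components."""
            graph = defaultdict(list)
            for sender, message, receiver in triples:
                graph[sender].append((message, receiver))
            return graph

        def unreachable_from(graph, entry):
            """Consistency check: components never reached from the entry component."""
            seen, stack = set(), [entry]
            while stack:
                node = stack.pop()
                if node in seen:
                    continue
                seen.add(node)
                stack.extend(receiver for _msg, receiver in graph.get(node, []))
            all_nodes = set(graph) | {r for edges in graph.values() for _m, r in edges}
            return all_nodes - seen

        graph = build_flow_graph(requirements)
        print(dict(graph))
        print("unreachable from Sensor:", unreachable_from(graph, "Sensor"))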