
    A Case for Human-Driven Software Development

    Human-Computer Interaction (HCI) plays a critical role in software systems, especially those targeting vulnerable individuals (e.g., assistive technologies). However, there is a gap between well-tooled software development methodologies and HCI techniques, which are generally isolated from the development toolchain and require specific expertise. In this paper, we propose a human-driven software development methodology that makes the User Interface (UI) a full-fledged dimension of software design. To make this methodology useful in practice, a UI design language and a user modeling language are integrated into a tool suite that guides stakeholders during the development process while ensuring conformance between the UI design and its implementation.

    An Automated Framework for Detecting Change in the Source Code and Test Case Change Recommendation

    Improvements and acceleration in software development have contributed to high-quality services in all domains and fields of industry, driving increasing demand for high-quality software development. The industry is adopting highly skilled human resources, advanced methodologies, and technologies to meet this demand and accelerate the development life cycle. One of the biggest challenges in the software development life cycle is change management between versions of the source code. Various reasons, such as changing requirements or adopting available updates or technological upgrades, can cause the source code to change between versions. Change management affects the correctness of the software service's release and the number of test cases. It is often observed that the development life cycle is delayed due to a lack of proper version control and repetitive testing iterations. Hence the demand for better version-control-driven test case reduction methods cannot be ignored. Parallel research attempts have proposed several version control mechanisms. Nevertheless, most version controls are criticized for not contributing toward test case generation or reduction. This work therefore proposes a novel probabilistic rule-based test case reduction method to simplify testing and version control in software development. Software developers widely adopt refactoring to make efficient changes, such as restructuring code and functionality or applying changes in requirements. This work demonstrates very high accuracy for change detection and management, which in turn yields higher accuracy for test case reduction. The outcome of this work is to reduce software development time and make the software development industry more efficient.
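The abstract does not give the authors' algorithm, but the idea of probabilistic rule-based test case reduction can be sketched as follows; the coverage encoding, the selection threshold, and the probability rule (fraction of covered functions that changed) are all illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: select only the test cases likely affected by a
# detected source-code change. The rule "probability = fraction of the
# covered functions that changed" is an assumed stand-in for the paper's
# probabilistic rules.

def reduce_test_suite(test_cases, changed_functions, threshold=0.5):
    """Keep test cases whose probability of being affected by the
    detected changes meets the threshold, most likely first."""
    selected = []
    for test in test_cases:
        covered = set(test["covers"])
        if not covered:
            continue
        p_affected = len(covered & changed_functions) / len(covered)
        if p_affected >= threshold:
            selected.append((test["name"], p_affected))
    # Run the most likely affected tests first.
    return sorted(selected, key=lambda t: -t[1])

suite = [
    {"name": "test_login", "covers": ["auth", "session"]},
    {"name": "test_report", "covers": ["report"]},
    {"name": "test_audit", "covers": ["auth", "report", "log"]},
]
print(reduce_test_suite(suite, {"auth", "session"}))
# → [('test_login', 1.0)]
```

With a threshold of 0.5, only `test_login` (all covered functions changed) survives; `test_audit` touches one changed function out of three and is dropped, which is the kind of suite shrinkage the abstract targets.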

    Quality-driven optimized resource allocation

    Assuring good software product quality necessitates a managed software process. Periodic product evaluation (inspection and testing) should be executed during the development process in order to simultaneously guarantee the timeliness and quality aspects of the development workflow. A faithful prediction of the effort needed forms the basis of project management (PM) for performing a proper human resource allocation to the different development and QA activities. However, even robust resource demand and quality estimation tools, like COCOMO II and COQUALMO, do not sufficiently cover the timeliness point of view due to their static nature. Correspondingly, continuous quality monitoring and quality-driven supervisory control of the development process have become vital aspects of PM. A well-established complementary approach uses the Weibull model to describe the dynamics of the development and QA process by a mathematical model based on observations gained during development. Supervisory PM control has to concentrate development and QA resources to eliminate quality bottlenecks, as different parts (modules) of the product under development may reveal different defect density levels. Nevertheless, traditional heuristic quality management is unable to perform optimal resource allocation in the case of complex target programs. This paper presents a model-based, quality-driven optimized resource allocation method. It combines the COQUALMO model as an early quality predictor with empirical knowledge formulated by a Weibull model gained through continuous monitoring of the QA process flow. An exact mathematical optimization technique is used for human resource allocation, such as tester assignment.
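The Weibull-type defect detection dynamics the abstract refers to can be illustrated with a minimal sketch; the parameter values (total expected defects, scale, shape) are assumptions for demonstration, not calibrated to any real project, and the proportional allocation shown is a naive stand-in for the paper's exact optimization.

```python
# Illustrative sketch of a Weibull cumulative defect model and a naive
# tester allocation based on its predictions. All numbers are assumed.
import math

def weibull_defects(t, a, b, c):
    """Expected cumulative defects found by time t:
    a = total expected defects, b = scale, c = shape."""
    return a * (1.0 - math.exp(-((t / b) ** c)))

def allocate_testers(remaining_defects, total_testers):
    """Naive proportional allocation of testers to modules by their
    predicted remaining defects (the paper uses exact optimization)."""
    total = sum(remaining_defects.values())
    return {m: round(total_testers * d / total)
            for m, d in remaining_defects.items()}

# With a=100 defects, scale b=10 weeks, shape c=2, defect discovery
# saturates: ~63 defects are expected to be found by week 10.
remaining = {
    "module_A": 100 - weibull_defects(10, a=100, b=10, c=2),  # ~36.8 left
    "module_B": 50 - weibull_defects(10, a=50, b=5, c=2),     # ~0.9 left
}
```

The sketch shows why supervisory control matters: module_A is predicted to still hide most of its defects at week 10, so testers should be concentrated there rather than spread evenly.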

    Measuring Occupants' Behaviour for Buildings' Dynamic Cosimulation

    Measuring and identifying human behaviours are key aspects of supporting the simulation processes that play a significant role in the design and management of buildings (and cities). In fact, layout assessments and control strategies are deeply influenced by the prediction of building performance. However, omitting the human component from building-related processes leads to large discrepancies between actual and simulated outcomes. This paper presents a methodology for measuring specific human behaviours in buildings and developing human-in-the-loop design applied to retrofit and renovation interventions. The framework covers detailed building monitoring, the development of stochastic and data-driven behavioural models, and their coupling with energy simulation software using a cosimulation approach. The methodology has been applied to a real case study to illustrate its applicability. A one-year monitoring campaign was carried out through a dedicated sensor network to record data and identify the triggers of users' actions. Then, two stochastic behavioural models (one for predicting light switching and one for window opening) were developed using the measured data and coupled with the IESVE simulation software. A simplified energy model of the case study was created to test the behavioural approach. The outcomes highlight that the behavioural approach provides more accurate results than a standard one when compared to real profiles: adopting behavioural profiles reduces the discrepancy with respect to real profiles by up to 58% for light switching and 26% for ventilation, in comparison to standard profiles. Using data-driven techniques to include the human component in simulation processes would lead to better predictions of both energy use and occupants' comfort sensations. These aspects can also be included in building control processes (e.g., building management systems) to enhance environmental and system management.
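A minimal sketch of the kind of stochastic light-switching model the abstract describes: switch-on probability modelled as a logistic function of indoor illuminance, sampled as a Bernoulli draw at each occupant arrival. The coefficient values here are illustrative assumptions, not the paper's fitted parameters.

```python
# Hedged sketch of a stochastic light-switching behavioural model:
# darker rooms make an arriving occupant more likely to switch on the
# light. Coefficients a and b are assumed, not fitted values.
import math
import random

def p_switch_on(illuminance_lux, a=-0.0175, b=4.0835):
    """Probability that an arriving occupant switches the light on,
    as a logistic function of indoor illuminance."""
    return 1.0 / (1.0 + math.exp(-(a * illuminance_lux + b)))

def simulate_arrival(illuminance_lux, rng=random):
    """One Bernoulli draw: True means the light is switched on."""
    return rng.random() < p_switch_on(illuminance_lux)
```

In a cosimulation loop, the energy simulator would pass the current illuminance to `simulate_arrival` at every occupant-arrival event and feed the resulting lighting state back into the thermal and electrical calculation.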

    An Evaluation View of an Ensemble Artefact for Decision Support using Action Design Research

    This paper investigates the integration of content, context and process (CCP) into the Action Design Research (ADR) framework to account for the interplay of organisational issues in artefact design and development. The investigation is conducted through a case study in which successive ICT student teams incrementally build, over several semesters, a tailored, low-cost business intelligence (BI) system as an ensemble artefact for an organisation in the not-for-profit (NFP) sector. During project development, CCP's human-centred approach to evaluation complements ADR's more prescribed technology-driven software testing. The integration of CCP into ADR as an evaluation view offers a holistic approach to assessing an ensemble artefact. The resultant conceptual framework is presented as a model with an explication of unexpected design and research outcomes.

    An Interdisciplinary Model for Graphical Representation

    The paper questions whether data-driven and problem-driven models are sufficient for software to automatically produce a meaningful graphical representation of scientific findings. The paper presents descriptive and prescriptive case studies to understand the benefits and shortcomings of existing models that aim to provide graphical representations of datasets. First, the paper considers datasets from the field of software metrics and shows that existing models can provide the expected outcomes for descriptive scientific studies. Second, the paper presents datasets from the fields of human mobility and sustainable development, and shows that a more comprehensive model is needed for prescriptive scientific fields requiring interdisciplinary research. Finally, an interdisciplinary problem-driven model is proposed to guide software users, and specifically scientists, in producing meaningful graphical representations of research findings. The proposal is based not only on a data-driven and/or problem-driven model but also on the different knowledge domains and scientific aims of the experts, who can provide the information needed for a higher-order structure of the data, supporting the graphical representation output.

    Service-Oriented Architecture Modeling: Bridging the Gap between Structure and Behavior

    Model-driven development of large-scale software systems is highly likely to produce models that describe the systems from many diverse perspectives using a variety of modeling languages. Checking and maintaining the consistency of information captured in such multi-modeling environments is known to be challenging. In this paper we describe an approach to systematically synchronize multi-models. The approach specifically addresses the problem of synchronizing business processes and domain models in a Service-Oriented Architecture development environment. In the approach, the human effort required to synchronize independently developed models is supplemented with significant automated support. This process is used to identify concept divergences, that is, concepts in one model that cannot be matched with concepts in the other model. We automate the propagation of divergence resolution decisions across the conflicting models. We illustrate the approach using models developed for a Car Crash Crisis Management System (CCCMS), a case study problem used to assess Aspect-Oriented Modeling approaches.
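The concept-divergence check described above can be sketched in a few lines; treating each model as a set of concept names and matching by exact name is a simplifying assumption, as the paper's actual approach likely uses richer correspondence rules between business processes and domain models.

```python
# Illustrative sketch of concept-divergence detection between two
# models: anything present in one model but unmatched in the other is
# flagged for human resolution. Name-based matching is an assumption.

def find_divergences(model_a, model_b):
    """Return concepts present in one model but unmatched in the other."""
    only_a = sorted(set(model_a) - set(model_b))
    only_b = sorted(set(model_b) - set(model_a))
    return {"unmatched_in_a": only_a, "unmatched_in_b": only_b}

# Hypothetical CCCMS-style concept sets:
divergences = find_divergences(
    {"Crisis", "Worker", "Mission"},
    {"Crisis", "Mission", "Resource"},
)
```

Once a human decides how a divergence is resolved (rename, merge, or add), that decision can be propagated automatically to the other model, which is the automation step the abstract emphasizes.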

    Case Study of the Space Shuttle Cockpit Avionics Upgrade Software

    The purpose of the Space Shuttle Cockpit Avionics Upgrade project was to reduce crew workload and improve situational awareness. The upgrade was to augment the Shuttle avionics system with new hardware and software. An early version of this system was used for one month by multiple teams of astronauts to gather human factors statistics in the Space Shuttle Motion Simulator at the Johnson Space Center. The results were compiled by NASA Ames Research Center, and it was determined that the system provided a better-than-expected increase in situational awareness and reduction in crew workload. Even with all of the benefits of the system, NASA cancelled the project towards the end of the development cycle. A major success of this project was the validation of the hardware architecture and software design. This was significant because the project incorporated new technology and approaches for the development of human-rated space software. This paper serves as a case study to document knowledge gained and techniques that can be applied to future space avionics development efforts. The major technological advances were the use of reflective memory concepts for data acquisition and the incorporation of Commercial Off-the-Shelf (COTS) products in a human-rated space avionics system. The infused COTS products included a real-time operating system, a resident linker and loader, a display generation tool set, and a network data manager. Among the successful design concepts were the engineering of identical outputs in multiple avionics boxes using an event-driven approach and inter-computer communication, a reconfigurable data acquisition engine, and a dynamic bus bandwidth allocation algorithm. Other significant experiences captured were the use of prototyping to reduce risk and striking the correct balance between object-oriented and functional programming.

    Timed Refinement for Verification of Real-Time Object Code Programs

    Real-time systems such as medical devices, surgical robots, and microprocessors are safety-critical applications that have hard timing constraints. The correctness of real-time systems is important, as failure may result in severe consequences such as loss of money, time, or human life. These real-time systems have software to control their behavior. Typically, this software has source code which is converted to object code and then executed on safety-critical embedded devices. Therefore, it is important to ensure that both the source code and the object code are error-free. When dealing with safety-critical systems, formal verification techniques have laid the foundation for ensuring software correctness. Refinement-based techniques in formal verification can be used for the verification of real-time interrupt-driven object code. This dissertation presents an automated tool that verifies the functional and timing correctness of real-time interrupt-driven object code programs. The tool has been developed in three stages. In the first stage, a novel timed refinement procedure that checks timing properties was developed and applied to six case studies. The required model and an abstraction technique were generated manually. The results indicate that the proposed abstraction technique reduces the size of the implementation model by at least four orders of magnitude. In the second stage, the proposed abstraction technique was automated and applied to thirty different case studies. The results indicate that the automated abstraction technique can easily reduce the model size, which in turn significantly reduces the verification time. In the final stage, two new automated algorithms were proposed to check functional properties through safety and liveness. These algorithms were applied to the same thirty case studies.
    The results indicate that functional verification can be performed in less than a second for the reduced model. The benefits of automating the verification process for real-time interrupt-driven object code include: 1) the overall size of the implementation model is reduced significantly; 2) verification completes within a reasonable time; and 3) the process can be applied multiple times during system development. Several parts of this dissertation were funded by a grant from the United States Government and the generous support of the American people through the United States Department of State and the United States Agency for International Development (USAID) under the Pakistan-U.S. Science & Technology Cooperation Program. The contents do not necessarily reflect the views of the United States Government.
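The core idea of timed refinement in the abstract, that every implementation step must be matched by a specification step and meet its timing bound, can be sketched over explicit transition systems; the tuple encoding, the cycle counts, and the event-indexed bounds are illustrative assumptions, not the dissertation's formalism.

```python
# Hedged sketch of a timed refinement check: each observable step of the
# implementation must (1) exist in the specification (functional
# correctness) and (2) respect the spec's timing bound for that event.
# The encoding below is an assumption made for illustration.

def refines(impl_transitions, spec_transitions, time_bounds):
    """impl_transitions: iterable of (state, event, next_state, cycles).
    spec_transitions: set of (state, event, next_state).
    time_bounds: max allowed cycles per event."""
    for (state, event, nxt, cycles) in impl_transitions:
        if (state, event, nxt) not in spec_transitions:
            return False  # functional mismatch with the specification
        if cycles > time_bounds.get(event, float("inf")):
            return False  # hard timing constraint violated
    return True

spec = {("idle", "irq", "handle"), ("handle", "ret", "idle")}
bounds = {"irq": 20, "ret": 10}
ok = refines(
    [("idle", "irq", "handle", 12), ("handle", "ret", "idle", 8)],
    spec, bounds,
)
```

The abstraction stage the dissertation automates would shrink `impl_transitions` (e.g., by collapsing internal steps) before a check like this runs, which is why the reduced models verify in under a second.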