
    Insights into the Importance-Performance Paradox of Software Product Attributes

    The importance-performance analysis (IPA) is widely used for identifying which quality attributes should be improved to maximize user satisfaction. The two-dimensional grid of IPA is based on user-perceived attribute importance and performance. If the user-perceived importance of an attribute is high but its performance is low, then enhancing the performance of that attribute is likely to result in higher user satisfaction. However, some studies have found that user-perceived importance may itself depend on attribute performance, which confounds the IPA. Yet no study has investigated whether this phenomenon applies to IS (Information Systems) products. This study, conducted with users of an ERP system, shows that the user-perceived importance of an attribute is indeed dependent on its performance. For some attributes, users overestimate importance when performance is low and underestimate it when performance is high, while for others the reverse is the case. Implications of this phenomenon for practice are discussed.
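    The classic IPA grid described above can be sketched in a few lines. This is a minimal illustration, not the study's method: the quadrant labels follow the conventional IPA terminology, and the cut-off values and scores are assumed example numbers.

```python
def ipa_quadrant(importance, performance, i_cut=3.0, p_cut=3.0):
    # Classify an attribute into one of the four classic IPA quadrants.
    # Cut-offs are assumed scale midpoints (e.g. on a 1-5 rating scale).
    if importance >= i_cut and performance < p_cut:
        return "concentrate here"       # high importance, low performance
    if importance >= i_cut and performance >= p_cut:
        return "keep up the good work"  # high importance, high performance
    if importance < i_cut and performance < p_cut:
        return "low priority"           # low importance, low performance
    return "possible overkill"          # low importance, high performance

# Example: a highly important but poorly performing attribute is the
# improvement candidate the abstract refers to.
print(ipa_quadrant(4.5, 2.0))  # -> concentrate here
```

    The study's finding complicates this picture: because stated importance shifts with performance, an attribute's quadrant may move once its performance changes.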

    GTTA3: An Extension of the General Theory of Technology Adoption

    The value perspective on technology adoption has many advantages. These include identifying user benefits beyond perceived usefulness and perceived enjoyment, and user costs beyond effort expectancy (the inverse of ease of use). However, although benefits and sacrifices (costs) are traditionally considered the two dimensions of value, we suggest that a third dimension, user needs, is missing in these conceptualizations. The value perception of users depends not only on the benefits provided versus the costs incurred but also on the needs profile of the users. This tripartite conceptualization of value is useful in providing deeper insights into technology adoption.

    Have the Agile Values endured? An empirical investigation on the 20th anniversary of the Agile Manifesto (2001)

    This study investigates whether the Agile Values introduced in the Agile Manifesto (2001) have endured two decades later and whether they are still relevant to software developers. Further, are they positively correlated with work and affective outcomes of software development projects? We find out by conducting a survey with team members of 58 software development projects in one of the largest global IT firms. To our surprise, we find that all four Agile Values have endured. The Agile Values still resonated with software developers. Additionally, overall, the values were positively correlated with team motivation, project effectiveness, and project innovation. However, they were negatively correlated with project efficiency and had no correlation with work exhaustion of team members. As expected, projects using Agile and plan-driven methodologies showed differential findings.

    What more can software development learn from Agile manufacturing? A roadmap on the 20th anniversary of the Agile manifesto

    The concept of agility originated in manufacturing and was later adopted by the software development discipline. In this article we argue that, in the process, some important aspects of agility theory have been either ignored or misinterpreted. A historical review of the evolving paradigms and practices in software development and manufacturing, on the 20th anniversary of the Agile Manifesto (2001), suggests that if the ideas and principles underlying agility are faithfully implemented, significant improvements in the software development process would follow.

    How Do Users Choose Between Technologies? Insights from a User Value Perspective

    TAM (Technology Acceptance Model) is concerned with why workers reject or accept technological tools provided to support their work. TAM assumes the worker does not get a choice in the tools they use; their only choice is to use or not use the tool. However, in today's changing work environment, employees often use different technologies to accomplish the same work. In this context, we examine how users choose the tools they use at the workplace. A correct understanding of this will not only enable organizations deploying these technologies to influence which tools their employees use but will also help providers of these technological tools to design them for maximum adoption among users.

    A Brief History of Software Development and Manufacturing

    In this article we trace the roots and maturation of software development methods and practices through a comparative study. We observe that the evolution of software development methods has mirrored the evolution of manufacturing paradigms. Further, our investigation reveals that changes in software development methods have lagged changes in manufacturing paradigms, indicating that the source of inspiration for software development methods and practices is manufacturing, and not the other way around. This investigation is useful and timely, especially in the context of the plan-driven versus agile methods conundrum. It helps us acquire an in-depth understanding of how software development methods originated and why some of them have prevailed while others have not. Further, these insights help us assess the relevance of current practices and methods of software development and predict their future trajectory.

    A Theory of Agile Software Development

    The Agile Software Development Method (ASDM), in its present form, is guided by the Agile Manifesto, which consists of an Agile philosophy and a set of 12 principles. Despite the apparent effect of the Agile philosophy and principles on the practice of software development around the world, neither its theoretical contribution nor its theoretical base has yet been articulated. In response to calls in the literature, in this study we propose and articulate a theory of ASDM to describe and explain its effects. The theory is based on a synthesis of the key concepts underlying the Agile principles and is expressed as a model of relationships. The article describes the theory formulation process and elaborates its key propositions. The limitations of the proposed theory and areas of future research are discussed.

    Methods for Massive, Reliable, and Timely Access for Wireless Internet of Things (IoT)


    Model for Quantitative Estimation of Functionality Influence on the Final Value of a Software Product

    The gap between software development requirements and the available resources of software developers continues to widen. This requires changes in the development and organization of software development. This study introduced a quantitative software development management methodology that estimates the relative importance of functionalities and the risk of their retention or abandonment, which determine the final value of the software product. The final value of a software product is interpreted as a function of its requirements and functionalities, represented as a computational graph (called a software product graph). The software product graph allows the relative importance of functionalities to be estimated by calculating the corresponding partial derivatives of the value function. The risk of not implementing a functionality was estimated as the resulting reduction in the product's final value. This model was applied to two EU projects, CareHD and vINCI. In vINCI, the functionalities with the most significant added value to the application were developed based on the implemented model, and those contributing the least value were abandoned. Optimization was not implemented in the CareHD project, which proceeded as initially designed. Consequently, only 71% of CareHD's potential value was achieved. The proposed model enables rational management and organization of software product development, with real-time quantitative evaluation of functionality impacts and assessment of the risks of omitting functionalities without significant impact. Quantitative evaluation of these impacts and risks is possible based on the proposed algorithm, which is the core of the model. This model is a tool for the rational organization and development of software products.
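    The core idea, ranking functionalities by the partial derivatives of a value function, can be sketched as follows. This is a hypothetical illustration, not the paper's actual model: the value function, functionality names, and scores are assumed for the example, and the derivatives are approximated numerically rather than via a computational graph.

```python
def product_value(f):
    # Toy value function over three functionality scores in [0, 1]:
    # a weighted mix with diminishing returns for "reporting" and
    # increasing returns for "export" (illustrative assumptions).
    return 0.5 * f["reporting"] ** 0.5 + 0.3 * f["alerts"] + 0.2 * f["export"] ** 2

def importance(f, name, eps=1e-6):
    # Numerical partial derivative dV/df_name at the current scores,
    # standing in for the paper's computational-graph differentiation.
    bumped = dict(f)
    bumped[name] += eps
    return (product_value(bumped) - product_value(f)) / eps

scores = {"reporting": 0.4, "alerts": 0.8, "export": 0.6}

# Rank functionalities by their marginal contribution to product value;
# low-ranked ones are candidates for abandonment, as in the vINCI case.
ranked = sorted(scores, key=lambda n: importance(scores, n), reverse=True)
print(ranked)  # -> ['reporting', 'alerts', 'export']
```

    The same derivatives quantify the risk of omitting a functionality: dropping a high-derivative functionality costs more final product value than dropping a low-derivative one.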

    QoS framework for video streaming in home networks

    In this thesis we present a new SNR-scalable video coding scheme. An important advantage of the proposed scheme is that it requires just a standard video decoder for processing each layer. The quality of the delivered video depends on the allocation of bit rates to the base and enhancement layers; for a given total bit rate, the combination with a bigger base layer delivers higher quality. The absence of dependencies between frames in enhancement layers makes the system resilient to losses of arbitrary frames from an enhancement layer; furthermore, that property can be used in a more controlled fashion. An important characteristic of any video streaming scheme is the ability to handle network bandwidth fluctuations. We developed a streaming technique that observes the network conditions and, based on the observations, reconfigures the layer configuration to achieve the best possible quality. A change in the network conditions forces a change in the number of layers or the bit rate of these layers, and knowledge of the network conditions allows delivery of higher-quality video by choosing an optimal layer configuration. When the network degrades, the amount of data transmitted per second is decreased by skipping frames from an enhancement layer on the sender side. The presented video coding scheme allows skipping any frame from an enhancement layer, thus enabling efficient real-time control over transmission at the network level and fine-grained control over the decoding of video data. The proposed methodology is not MPEG-2 specific and can be applied to other coding standards. We also developed a terminal resource manager that enables trade-offs between quality and resource consumption through the use of scalable video coding in combination with scalable video algorithms. The controller developed for the decoding process optimizes the perceived quality with respect to the available CPU power and the amount of input data. The controller does not depend on the type of scalability technique and can therefore be used with any scalable video. It uses a strategy created offline by means of a Markov Decision Process (MDP). During the evaluation it was found that the correctness of the controller's behavior depends on the correctness of the MDP parameter settings, so user tests should be employed to find the optimal settings.
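    The sender-side reconfiguration step described above can be sketched as a simple admission rule: always transmit the base layer, then add as many enhancement layers as the measured bandwidth allows. This is an illustrative sketch under assumed per-layer bit rates, not the thesis's actual algorithm (which also adapts the layers' bit rates and uses an MDP-derived strategy on the decoder side).

```python
def choose_layers(available_kbps, base_kbps=500, enh_kbps=250, max_enh=3):
    # Pick a layer configuration for an SNR-scalable stream given the
    # currently observed bandwidth. Bit rates are assumed example values.
    if available_kbps < base_kbps:
        return None  # cannot sustain even the base layer
    # Enhancement layers are independent, so any number of them can be
    # added (or their frames skipped) without breaking decodability.
    n_enh = min(max_enh, int((available_kbps - base_kbps) // enh_kbps))
    return {"base_kbps": base_kbps, "enhancement_layers": n_enh}

print(choose_layers(1100))  # -> {'base_kbps': 500, 'enhancement_layers': 2}
```

    Because enhancement-layer frames carry no inter-frame dependencies, degrading from, say, two enhancement layers to one only requires skipping frames at the sender; the receiver's standard decoder is unaffected.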