
    An Empirical Study on Decision making for Quality Requirements

    [Context] Quality requirements are important for product success yet often handled poorly. Problems with scope decisions lead to delayed handling and an unbalanced scope. [Objective] This study characterizes the scope decision process to understand the influencing factors and properties affecting the scope decision of quality requirements. [Method] We studied one company's scope decision process over a period of five years. We analyzed the decision artifacts and interviewed experienced engineers involved in the scope decision process. [Results] Features addressing quality aspects explicitly are a minor part (4.41%) of all features handled. The phase of the product line seems to influence the prevalence and acceptance rate of quality features. Lastly, relying on external stakeholders and upfront analysis seems to lead to long lead-times and an insufficient quality requirements scope. [Conclusions] There is a need to make quality requirements explicit in the scope decision process. We propose a scope decision process at both a strategic level and a tactical level: the former addresses long-term planning, while the latter caters for a speedy process. Furthermore, we believe it is key to balance stakeholder input with feedback from usage and the market in a more direct way than through a long plan-driven process.

    Model-driven Enterprise Systems Configuration

    Enterprise Systems potentially lead to significant efficiency gains but require a well-conducted configuration process. A promising idea to manage and simplify the configuration process is based on the premise of using reference models for this task. Our paper continues along this idea and delivers a two-fold contribution: first, we present a generic process for the task of model-driven Enterprise Systems configuration, including the steps of (a) specification of configurable reference models, (b) configuration of configurable reference models, (c) transformation of configured reference models to regular build time models, (d) deployment of the generated build time models, (e) controlling of implementation models to provide input to the configuration, and (f) consolidation of implementation models to provide input to reference model specification. We discuss inputs and outputs as well as the involvement of different roles and validation mechanisms. Second, we present an instantiation case of this generic process for Enterprise Systems configuration based on Configurable EPCs.
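The core of steps (a)-(d) can be illustrated with a minimal sketch (all model and element names below are hypothetical, not from the paper): a configurable reference model whose optional elements are switched on or off by configuration decisions and then transformed into a regular build-time model.

```python
# Minimal sketch of steps (a)-(d): specify a configurable reference model,
# apply configuration decisions, and derive a regular build-time model.
# Element names are illustrative assumptions, not taken from the paper.

from dataclasses import dataclass, field

@dataclass
class ReferenceModel:
    """A reference model whose elements may be marked configurable."""
    name: str
    elements: dict = field(default_factory=dict)  # element -> configurable?

def configure(model: ReferenceModel, decisions: dict) -> ReferenceModel:
    """Step (b): keep mandatory elements; keep configurable ones only if
    the configuration decision enables them."""
    kept = {}
    for elem, configurable in model.elements.items():
        if not configurable or decisions.get(elem, True):
            kept[elem] = False  # after configuration, nothing is variable
    return ReferenceModel(f"{model.name} (configured)", kept)

def to_build_time_model(model: ReferenceModel) -> list:
    """Step (c): transform the configured model into a build-time artifact
    (here simply the sorted list of remaining elements)."""
    return sorted(model.elements)

# Step (a): a reference model with one configurable element
reference = ReferenceModel("Order-to-Cash", {
    "check_credit": True,       # configurable: may be switched off
    "ship_goods": False,        # mandatory
    "invoice_customer": False,  # mandatory
})

# Steps (b)-(c): configure the model, then derive the build-time model
configured = configure(reference, {"check_credit": False})
build_time = to_build_time_model(configured)
print(build_time)  # ['invoice_customer', 'ship_goods']
```

In a real instantiation the configured model would be an executable process model (e.g., derived from a Configurable EPC) rather than a plain list, but the enable/disable-then-transform flow is the same.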

    Using Feature Models for Distributed Deployment in Extended Smart Home Architecture

    Nowadays, the smart home extends beyond the house itself to encompass connected platforms on the Cloud as well as mobile personal devices. This Smart Home Extended Architecture (SHEA) helps customers remain in touch with their home everywhere and at any time. The continual increase of connected devices in the home and outside within the SHEA multiplies the deployment possibilities for any application. Therefore, the SHEA should from now on be taken as the actual target platform for smart home application deployment. Every home is different, and applications offer different services according to customer preferences. To manage this variability, we extend feature modeling from the software product line domain with deployment constraints, and we present an example of a model that could address this deployment challenge.
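One plausible reading of a feature model extended with deployment constraints can be sketched as follows (the feature and platform names are invented for illustration): each feature carries a constraint listing the SHEA platforms it may run on, and a proposed deployment is checked against those constraints and the devices actually available.

```python
# Illustrative sketch (hypothetical feature/platform names): a feature model
# extended with deployment constraints, checked against the platforms
# available in a given Smart Home Extended Architecture (SHEA).

FEATURE_MODEL = {
    # feature -> set of platforms it may be deployed on (deployment constraint)
    "motion_detection": {"home_gateway"},
    "video_storage": {"cloud"},
    "remote_alert": {"mobile", "cloud"},
}

def valid_deployment(selection: dict, available: set) -> bool:
    """A deployment is valid if every selected feature is placed on an
    available platform permitted by its deployment constraint."""
    return all(
        platform in FEATURE_MODEL[feature] and platform in available
        for feature, platform in selection.items()
    )

available_platforms = {"home_gateway", "cloud", "mobile"}
deployment = {
    "motion_detection": "home_gateway",
    "video_storage": "cloud",
    "remote_alert": "mobile",
}
print(valid_deployment(deployment, available_platforms))  # True

# Violates video_storage's constraint (cloud only):
print(valid_deployment({"video_storage": "home_gateway"},
                       available_platforms))  # False
```

A full feature model would also encode mandatory/optional features and cross-tree constraints; the point here is only how per-feature deployment constraints restrict the placement choices that the SHEA multiplies.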

    A Quality Model for Actionable Analytics in Rapid Software Development

    Background: Accessing relevant data on the product, process, and usage perspectives of software as well as integrating and analyzing such data is crucial for getting reliable and timely actionable insights aimed at continuously managing software quality in Rapid Software Development (RSD). In this context, several software analytics tools have been developed in recent years. However, there is a lack of explainable software analytics that software practitioners trust. Aims: We aimed at creating a quality model (called Q-Rapids quality model) for actionable analytics in RSD, implementing it, and evaluating its understandability and relevance. Method: We performed workshops at four companies in order to determine relevant metrics as well as product and process factors. We also elicited how these metrics and factors are used and interpreted by practitioners when making decisions in RSD. We specified the Q-Rapids quality model by comparing and integrating the results of the four workshops. Then we implemented the Q-Rapids tool to support the usage of the Q-Rapids quality model as well as the gathering, integration, and analysis of the required data. Afterwards we installed the Q-Rapids tool in the four companies and performed semi-structured interviews with eight product owners to evaluate the understandability and relevance of the Q-Rapids quality model. Results: The participants of the evaluation perceived the metrics as well as the product and process factors of the Q-Rapids quality model as understandable. Also, they considered the Q-Rapids quality model relevant for identifying product and process deficiencies (e.g., blocking code situations). 
Conclusions: By means of heterogeneous data sources, the Q-Rapids quality model enables detecting problems that would take more time to find manually and adds transparency among the perspectives of system, process, and usage.

Comment: This is an Author's Accepted Manuscript of a paper to be published by IEEE in the 44th Euromicro Conference on Software Engineering and Advanced Applications (SEAA) 2018. The final authenticated version will be available online.
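The general shape of such a quality model can be sketched as a small aggregation: normalized metrics gathered from heterogeneous sources are combined into product and process factors. The factor names, metrics, and weights below are illustrative assumptions, not the actual Q-Rapids model.

```python
# Hedged sketch of a hierarchical quality model: normalized metrics in [0, 1]
# are aggregated into factor scores. All names and weights are illustrative
# assumptions, not the actual Q-Rapids quality model.

METRICS = {
    "test_success_rate": 0.95,    # fraction of passing tests
    "code_coverage": 0.70,        # fraction of covered lines
    "blocked_tasks_ratio": 0.10,  # fraction of blocked issues (lower is better)
}

FACTORS = {
    # factor -> list of (metric, weight, higher_is_better)
    "code_quality": [("test_success_rate", 0.5, True),
                     ("code_coverage", 0.5, True)],
    "blocking": [("blocked_tasks_ratio", 1.0, False)],
}

def assess(metrics: dict, factors: dict) -> dict:
    """Aggregate normalized metric values into factor scores in [0, 1]."""
    scores = {}
    for factor, parts in factors.items():
        total = 0.0
        for metric, weight, higher_is_better in parts:
            value = metrics[metric]
            # Invert metrics where a lower raw value means better quality
            total += weight * (value if higher_is_better else 1.0 - value)
        scores[factor] = round(total, 3)
    return scores

print(assess(METRICS, FACTORS))
# {'code_quality': 0.825, 'blocking': 0.9}
```

A tool built on such a model would continuously refresh the metric values from its data sources and surface low factor scores (e.g., a dropping "blocking" score signaling blocking code situations) to product owners.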

    Human Resource and Employment Practices in Telecommunications Services, 1980-1998

    [Excerpt] In the academic literature on manufacturing, much research and debate have focused on whether firms are adopting some form of “high-performance” or “high-involvement” work organization based on such practices as employee participation, teams, and increased discretion, skills, and training for frontline workers (Ichniowski et al., 1996; Kochan and Osterman, 1994; MacDuffie, 1995). Whereas many firms in the telecommunications industry flirted with these ideas in the 1980s, they did not prove to be a lasting source of inspiration for the redesign of work and employment practices. Rather, work restructuring in telecommunications services has been driven by the ability of firms to leverage network and information technologies to reduce labor costs and create customer segmentation strategies. “Good jobs” versus “bad jobs,” or higher versus lower wage jobs, do not vary according to whether firms adopt a high-involvement model. They vary along two other dimensions: (1) within firms and occupations, by the value-added of the customer segment that an employee group serves; and (2) across firms, by union and nonunion status. We believe that this customer segmentation strategy is becoming a more general model for employment practices in large-scale service operations; telecommunications services firms may be somewhat more advanced than other service firms in adopting this strategy because of certain unique industry characteristics. The scale economies of network technology are such that once a company builds the network infrastructure to a customer’s specifications, the cost of additional services is essentially zero. As a result, and notwithstanding technological uncertainty, all of the industry’s major players are attempting to take advantage of system economies inherent in the nature of the product market and technology to provide customized packages of multimedia products to identified market segments. 
They have organized into market-driven business units providing differentiated services to large businesses and institutions, small businesses, and residential customers. They have used information technologies and process reengineering to customize specific services to different segments according to customer needs and ability to pay. Variation in work and employment practices, or labor market segmentation, follows product market segmentation. As a result, much of the variation in employment practices in this industry is within firms and within occupations according to market segment rather than across firms. In addition, despite market deregulation beginning in 1984 and opportunities for new entrants, a tightly led oligopoly structure is replacing the regulated Bell System monopoly. Former Bell System companies, the giants of the regulated period, continue to dominate market share in the post-1984 period. Older players and new entrants alike are merging and consolidating in order to have access to multimedia markets. What is striking in this industry, therefore, is the relative lack of variation in management and employment practices across firms after more than a decade of experience with deregulation. We attribute this lack of variation to three major sources. (1) Technological advances and network economics provide incentives for mergers, organizational consolidation, and, as indicated above, similar business strategies. (2) The former Bell System companies have deep institutional ties, and they continue to benchmark against and imitate each other, so that ideas about restructuring have diffused quickly among them. (3) Despite overall deunionization in the industry, they continue to have high unionization rates; de facto pattern bargaining within the Bell System has remained quite strong. Therefore, similar employment practices based on inherited collective bargaining agreements continue to exist across former Bell System firms.

    Product Strategies and Survival in Schumpeterian Environments: Evidence from the US Security Software Industry

    This paper seeks to explore the drivers of survival in environments characterized by high rates of entry and exit, fragmented market shares, a rapid pace of product innovation, and a proliferation of young ventures. The paper aims to underscore the role played by post-entry product strategies, along with their interaction, after carefully controlling for "at entry" factors and demographic conditions. Based on a population of 270 firms that entered the US security software industry between 1989 and 1998, we find evidence that surviving entities are those that are more aggressive in the adoption of versioning and portfolio broadening strategies. In particular, focusing on either of these two strategies leads to a higher probability of survival as opposed to adopting a mixed strategy.

    Keywords: Survival; Versioning; Portfolio broadening; Young ventures; Software

    Consolidation of Customized Product Copies into Software Product Lines

    In software development, project constraints lead to customer-specific variants created by copying and adapting the product. During this process, modifications are scattered all over the code. Although this is flexible and efficient in the short term, a Software Product Line (SPL) offers better results in the long term regarding cost reduction, time-to-market, and quality attributes. This book presents a novel approach named SPLevo, which consolidates customized product copies into an SPL.
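The underlying consolidation idea can be sketched in miniature (this is an illustrative abstraction, not the SPLevo implementation, and the element names are invented): compare two customized product copies element by element, keep the shared parts as the common core, and record each difference as a candidate variation point for the product line.

```python
# Minimal sketch of copy consolidation (not the SPLevo implementation):
# split two customized product copies into a common core and candidate
# variation points. Element and variant names are illustrative.

def find_variation_points(copy_a: dict, copy_b: dict):
    """Return (common elements, variation points) for two product copies,
    each represented as element -> chosen variant."""
    common = {k: v for k, v in copy_a.items() if copy_b.get(k) == v}
    variation_points = {
        k: (copy_a.get(k), copy_b.get(k))  # None = absent in that copy
        for k in copy_a.keys() | copy_b.keys()
        if copy_a.get(k) != copy_b.get(k)
    }
    return common, variation_points

product_a = {"checkout": "standard", "payment": "credit_card", "report": "pdf"}
product_b = {"checkout": "standard", "payment": "invoice", "shipping": "express"}

common, variants = find_variation_points(product_a, product_b)
print(sorted(common))    # ['checkout']
print(sorted(variants))  # ['payment', 'report', 'shipping']
```

In practice the comparison runs over code models rather than flat dictionaries, and each detected variation point must then be refined into an explicit variability mechanism of the consolidated SPL.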

    Consumption, planned obsolescence and waste

    In the five decades since Vance Packard published The Waste Makers (1960), planned product obsolescence has developed in many subtle and sophisticated ways. Yet its social and environmental impact remains largely unacknowledged; planned obsolescence continues to be elaborated and to undermine consumer choice, increase the costs of owning and using products, accelerate the destruction of useful objects, and result in higher levels of ecological spoiling. It is a phenomenon widely acknowledged though little discussed. Conceptual and empirical detail is discussed in relation to: (i) ‘in-built’ technological obsolescence, the design, development, and incorporation of functionally fragile components leading to premature malfunction; (ii) stylistic obsolescence, the styling or fashioning of myriad consumer objects such that they are deemed to have ‘worn out’ stylistically and aesthetically before they have failed functionally; and (iii) the ‘superfluous within the necessary’, the over-elaboration of products such that they are functionally ‘overprogrammed’, the specific design of many objects such that they cannot be repaired or adapted for alternate uses, and the way that many products urge and often require the subsequent consumption of extra goods and services simply to maintain them.
