
    Fundamental Laws and Assumptions of Software Maintenance

    Researchers must pay far more attention to discovering and validating the principles that underlie software maintenance and evolution. This was one of the major conclusions reached during the International Workshop on Empirical Studies of Software Maintenance. This workshop, held in November 1996 in Monterey, California, brought together an international group of researchers to discuss the successes, challenges and open issues in software maintenance and evolution. This article documents the discussion of the subgroup on fundamental laws and assumptions of software maintenance. The participants of this group included researchers in software engineering, the behavioral sciences, information systems and statistics. Their main conclusion was that insufficient effort has been paid to synthesizing research conjectures into validated theories, and that this problem has slowed progress in software maintenance. To help remedy this situation they made the following recommendations: (1) when we use empirical methods, an explicit goal should be to develop theories; (2) we should look to other disciplines for help where appropriate; and (3) our studies should use a wider range of empirical methods. (Also cross-referenced as UMIACS-TR-97-21.)

    Formal change impact analyses for emulated control software

    Processor emulators are a software tool for allowing legacy computer programs to be executed on a modern processor. In the past, emulators have been used in non-critical applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, allowing their behaviours to be compared directly.
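
    As a minimal sketch of this idea (in Python rather than the authors' formal notation, with all identifiers invented), both systems can be reduced to a step function over a shared abstract state, and behaviour preservation on a given input becomes a direct comparison of the traces the two systems unfold:

        # A common semantics: each system is a step function over a shared
        # abstract state (registers, variables; a clock entry could be added
        # to capture timing properties). All names here are hypothetical.
        from typing import Callable, Dict, List

        State = Dict[str, int]           # abstract state under the common semantics
        Step = Callable[[State], State]  # one execution step of a system

        def trace(step: Step, init: State, n: int) -> List[State]:
            """Unfold n steps of a system's behaviour from an initial state."""
            states, s = [dict(init)], dict(init)
            for _ in range(n):
                s = step(s)
                states.append(dict(s))
            return states

        def preserves_behaviour(old: Step, new: Step, init: State, n: int) -> bool:
            """The emulation is faithful on this input if the traces coincide."""
            return trace(old, init, n) == trace(new, init, n)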

    Empirical studies of open source evolution

    Copyright @ 2008 Springer-Verlag. This chapter presents a sample of empirical studies of Open Source Software (OSS) evolution. According to these studies, the classical results from studies of proprietary software evolution, such as Lehman's laws of software evolution, might need to be revised, if not fully then at least in part, to account for the OSS observations. The chapter also summarises what appears to be the empirical status of each of Lehman's laws with respect to OSS, highlights the threats to validity that frequently emerge in these empirical studies, and discusses related topics for further research.
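
    As a toy illustration of the kind of measurement such studies perform (the release data below is invented), one can test Lehman's law of continuing growth by checking how often system size increases between releases:

        # Hypothetical (release, size-in-modules) pairs for an imaginary project.
        releases = [("1.0", 120), ("1.1", 135), ("1.2", 150), ("2.0", 148), ("2.1", 170)]

        def continuing_growth(history):
            """Fraction of release-to-release transitions in which size grew."""
            grew = sum(1 for (_, a), (_, b) in zip(history, history[1:]) if b > a)
            return grew / (len(history) - 1)

        print(f"size grew in {continuing_growth(releases):.0%} of transitions")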

    A framework for the simulation of structural software evolution

    This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2008 ACM. As functionality is added to an aging piece of software, its original design and structure will tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating empirical data in sufficient quantity and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support the claims and the generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation, factors that, in real-world system development, can significantly influence evolutionary structures.
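
    A minimal sketch of one such evolutionary effect, under invented parameters rather than the framework's actual model: new modules attach preferentially to already heavily coupled modules, so coupling concentrates in a few hubs, approximating the drift toward a spaghetti architecture:

        import random

        def simulate(steps: int = 200, links_per_module: int = 2, seed: int = 1):
            """Grow a fictitious code base by preferential attachment and
            report the mean and maximum module coupling."""
            random.seed(seed)
            coupling = [0, 0]  # coupling count of each existing module
            for _ in range(steps):
                weights = [c + 1 for c in coupling]  # hubs attract more links
                targets = random.choices(range(len(coupling)),
                                         weights=weights, k=links_per_module)
                for t in targets:
                    coupling[t] += 1
                coupling.append(links_per_module)  # the new module's own links
            return sum(coupling) / len(coupling), max(coupling)

        mean_c, max_c = simulate()
        print(f"mean coupling {mean_c:.1f}, largest hub coupling {max_c}")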

    Method for finding metabolic properties based on the general growth law. Liver examples. A general framework for biological modeling

    We propose a method for finding the metabolic parameters of cells, organs and whole organisms, based on the previously discovered general growth law. Building on the obtained results and an analysis of available biological models, we propose a general framework for modeling biological phenomena and discuss how it can be used in the Virtual Liver Network project. The foundational idea of the study is that the growth of cells, organs, systems and whole organisms is influenced not only by biomolecular machinery but also by biophysical mechanisms acting at different scale levels. In particular, the general growth law uniquely defines the distribution of nutritional resources between maintenance needs and biomass synthesis at each phase of growth and at each scale level. We exemplify the approach by considering the metabolic properties of growing human and dog livers and liver transplants. A procedure for verifying the obtained results is also introduced. We found that the two examined dogs have high metabolic rates, consuming about 0.62 and 1 gram of nutrients per cubic centimeter of liver per day, and verified this using the proposed verification procedure. We also evaluated the consumption rate of nutrients in human livers, determining it to be about 0.088 gram of nutrients per cubic centimeter of liver per day for males, and about 0.098 for females. This noticeable difference can be explained by evolutionary development, which required females to have greater liver processing capacity to support pregnancy. We also found how much of the nutrient intake goes to biomass synthesis and maintenance at each phase of liver and liver transplant growth. The obtained results demonstrate that the proposed approach can be used for finding the metabolic characteristics of cells, organs and whole organisms, which can further serve as important inputs for many applications in biology (protein expression), biotechnology (synthesis of substances) and medicine. Comment: 20 pages, 6 figures, 4 tables
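
    The abstract does not reproduce the general growth law's exact form, so the following is schematic only: a hypothetical growth ratio G(t) splits a measured total consumption rate between biomass synthesis and maintenance. The total rate used (0.088 g of nutrients per cm^3 of liver per day) is the male human figure quoted above; the exponential form of G(t) and its time constant are invented placeholders:

        import math

        def growth_ratio(t: float, tau: float = 1.0) -> float:
            """Hypothetical fraction of nutrient influx routed to biomass
            synthesis, decaying as the organ approaches its final size."""
            return math.exp(-t / tau)

        def split_consumption(total_rate: float, t: float):
            """Divide a total consumption rate (g per cm^3 per day) between
            synthesis and maintenance at growth phase t."""
            g = growth_ratio(t)
            return total_rate * g, total_rate * (1 - g)

        synthesis, maintenance = split_consumption(0.088, t=0.5)
        print(f"synthesis {synthesis:.4f}, maintenance {maintenance:.4f} g/cm^3/day")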

    Robustness - a challenge also for the 21st century: A review of robustness phenomena in technical, biological and social systems as well as robust approaches in engineering, computer science, operations research and decision aiding

    Notions of robustness exist in many facets. They come from different disciplines and reflect different worldviews. Consequently, they very often contradict each other, which makes the term less applicable in a general context. Robustness approaches are often limited to the specific problems for which they have been developed. This means that notions and definitions may prove wrong when put into another domain of validity, i.e. another context. A definition might be correct in a specific context but need not hold in another. Therefore, in order to speak of robustness we need to specify the domain of validity, i.e. the system, property and uncertainty of interest. As proved by Ho et al. in an optimization context with finite and discrete domains, without prior knowledge about the problem there exists no solution whatsoever that is more robust than any other. Similar to the results of the No Free Lunch Theorems of Optimization (NFLTs), we have to exploit the problem structure in order to make a solution more robust. This optimization problem is directly linked to a robustness/fragility tradeoff which has been observed in many contexts, e.g. the 'robust, yet fragile' property of HOT (Highly Optimized Tolerance) systems. Another issue is that robustness is tightly bound to other phenomena, such as complexity, for which there is likewise no clear definition or theoretical framework. Consequently, this review tries to find common aspects across many different approaches and phenomena rather than to build a general theorem of robustness, which might not exist anyway, because complex phenomena often need to be described from a pluralistic view to address as many aspects of a phenomenon as possible. First, many different robustness problems from many different disciplines are reviewed. Second, common aspects are discussed, in particular the relationship between functional and structural properties. This paper argues that robustness phenomena remain a challenge for the 21st century. Robustness is a useful quality of a model or system in terms of the 'maintenance of some desired system characteristics despite fluctuations in the behaviour of its component parts or its environment' (see [Carlson and Doyle, 2002], p. 2). We define robustness phenomena as solutions with balanced tradeoffs, and robust design principles and robustness measures as means to balance tradeoffs.
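
    As a toy numeric illustration of this robustness/fragility tradeoff (all figures invented): the solution that is optimal under the nominal scenario can be the most fragile under perturbation, while a minimax choice trades nominal performance for a worst-case guarantee:

        # performance[solution][scenario]; scenario 0 is nominal, 1-2 are perturbed
        performance = {
            "tuned":  [10.0, 2.0, 1.0],  # excellent nominally, fragile elsewhere
            "robust": [7.0, 6.5, 6.0],   # balanced across the uncertainty set
        }

        nominal_best = max(performance, key=lambda s: performance[s][0])
        minimax_best = max(performance, key=lambda s: min(performance[s]))

        print(f"nominal optimum: {nominal_best}")   # -> tuned
        print(f"minimax optimum: {minimax_best}")   # -> robust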

    Trademarks and Competition: The Recent History


    The Crypto-democracy and the Trustworthy

    In the current architecture of the Internet, there is a strong asymmetry in terms of power between the entities that gather and process personal data (e.g., major Internet companies, telecom operators, cloud providers, ...) and the individuals from whom this personal data originates. In particular, individuals have no choice but to blindly trust that these entities will respect their privacy and protect their personal data. In this position paper, we address this issue by proposing a utopian crypto-democracy model based on existing scientific achievements from the field of cryptography. More precisely, our main objective is to show that cryptographic primitives, in particular secure multiparty computation, offer a practical solution to protect privacy while minimizing trust assumptions. In the crypto-democracy envisioned, individuals do not have to trust a single physical entity with their personal data; rather, their data is distributed among several institutions. Together these institutions form a virtual entity called the Trustworthy, which is responsible for the storage of this data but can also compute on it (provided that all the institutions agree). Finally, we also propose a realistic proof-of-concept of the Trustworthy, in which the roles of the institutions are played by universities. This proof-of-concept would have an important impact in demonstrating the possibilities offered by the crypto-democracy paradigm. Comment: DPM 201
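
    A minimal sketch of one primitive such a design could build on, additive secret sharing over a prime field: each institution holds a random-looking share, so no single institution learns the value, yet aggregates can be computed share-by-share. This illustrates the general secure multiparty computation technique, not the paper's specific protocol; the modulus and all names are illustrative:

        import secrets

        P = 2**61 - 1  # prime modulus for share arithmetic

        def share(value: int, n_institutions: int):
            """Split a value into n additive shares modulo P."""
            shares = [secrets.randbelow(P) for _ in range(n_institutions - 1)]
            shares.append((value - sum(shares)) % P)
            return shares

        def reconstruct(shares):
            return sum(shares) % P

        # Two individuals entrust their data to three institutions.
        a_shares, b_shares = share(42, 3), share(100, 3)
        # Each institution locally adds the two shares it holds...
        sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
        # ...and only the aggregate is ever reconstructed.
        assert reconstruct(sum_shares) == 142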