120 research outputs found

    Remodularization Analysis Using Semantic Clustering

    Get PDF
    In this paper, we report on an experience of using and adapting Semantic Clustering to evaluate software remodularizations. Semantic Clustering is an approach that relies on information retrieval and clustering techniques to extract sets of similar classes in a system, according to their vocabularies. We adapted Semantic Clustering to support remodularization analysis and evaluated our adaptation using six real-world remodularizations of four software systems. We report that Semantic Clustering and conceptual metrics can be used to express and explain the intention of the architects when performing common modularization operators, such as module decomposition.
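
    A minimal sketch of the general idea behind vocabulary-based clustering of classes, assuming each class is reduced to a bag of identifier terms. TF-IDF with agglomerative clustering stands in here for the LSI-based pipeline that Semantic Clustering actually uses, and the class names and vocabularies are invented for illustration.

```python
# Sketch: group classes by the similarity of their vocabularies.
# TF-IDF + agglomerative clustering is a stand-in for the paper's
# LSI-based Semantic Clustering; data below is purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import AgglomerativeClustering

class_vocabularies = {
    "OrderService":    "order create cancel total price customer",
    "InvoiceService":  "invoice total price tax customer print",
    "UserRepository":  "user find save delete database query",
    "OrderRepository": "order find save delete database query",
}

names = list(class_vocabularies)
tfidf = TfidfVectorizer().fit_transform(class_vocabularies.values())

# Classes with similar vocabularies (low cosine distance) end up together.
labels = AgglomerativeClustering(
    n_clusters=2, metric="cosine", linkage="average"
).fit_predict(tfidf.toarray())

for name, label in zip(names, labels):
    print(label, name)
```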

    Dimensionality Reduction of Quality Objectives for Web Services Design Modularization

    Full text link
    With the increasing use of service-oriented architecture (SOA) in new software development, there is a growing and urgent need to improve current practice in service-oriented design. To improve the design of Web services, the search for Web service interface modularization solutions deals, in general, with a large set of conflicting quality metrics. Deciding which quality metrics are used to evaluate generated solutions, and how, is always left to the designer, and some of these objectives can be correlated or conflicting. In this paper, we propose a dimensionality reduction approach based on the Non-dominated Sorting Genetic Algorithm (NSGA-II) to address the Web service re-modularization problem. Our approach aims at finding the best reduced set of objectives (i.e., quality metrics) that can generate near-optimal Web service modularization solutions to fix quality issues in Web service interfaces. The algorithm starts with a large number of interface design quality metrics as objectives (e.g. coupling, cohesion, number of ports, number of port types, and number of antipatterns), which are reduced based on the nonlinear correlation information entropy (NCIE). The statistical analysis of our results, based on a set of 22 real-world Web services provided by Amazon and Yahoo, confirms that our dimensionality reduction approach to Web service interface modularization significantly reduced the number of objectives (to a minimum of 2 on several case studies) and performed significantly better than state-of-the-art modularization techniques in terms of generating well-designed Web service interfaces for users. Master of Science, Software Engineering, College of Engineering & Computer Science, University of Michigan-Dearborn. https://deepblue.lib.umich.edu/bitstream/2027.42/145687/1/Thesis Report_Hussein Skaf.pdf (Thesis Report_Hussein Skaf.pdf: Thesis)
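
    A rough Python sketch of the correlation-based objective reduction idea, assuming candidate modularization solutions have already been scored on each quality metric. Plain Pearson correlation stands in for the NCIE measure used in the work, and the metric names and values are invented.

```python
# Sketch: drop objectives that are strongly correlated with objectives
# already kept. Pearson correlation is a stand-in for NCIE; the threshold
# and the data are illustrative only.
import numpy as np

def reduce_objectives(values: np.ndarray, names: list[str], threshold: float = 0.9):
    """values: rows = candidate solutions, columns = objective scores."""
    corr = np.abs(np.corrcoef(values, rowvar=False))
    keep = []
    for j, _ in enumerate(names):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return [names[j] for j in keep]

names = ["coupling", "cohesion", "num_ports", "num_port_types", "num_antipatterns"]
values = np.array([
    [0.8, 0.3, 12, 11, 7],
    [0.6, 0.5, 10,  9, 2],
    [0.5, 0.6,  9,  9, 6],
    [0.3, 0.8,  7,  6, 1],
    [0.2, 0.9,  6,  6, 3],
])
print(reduce_objectives(values, names))
```

    On this made-up data the five objectives collapse to two (coupling and number of antipatterns), which mirrors the kind of reduction the abstract reports, though the real approach embeds the reduction inside the NSGA-II search.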

    Software (re)modularization: Fight against the structure erosion and migration preparation

    No full text
    Software systems, and in particular object-oriented systems, are models of the real world that manipulate representations of its entities through models of its processes. The real world is not static: new laws are created, competitors offer new functionalities, users have renewed expectations of what a computer should offer them, memory constraints are added, etc. As a result, software systems must be continuously updated or face the risk of becoming gradually outdated and irrelevant [34]. In the meantime, details and multiple abstraction levels result in a high level of complexity, and completely analyzing real software systems is impractical. For example, the Windows operating system consists of more than 60 million lines of code (500,000 pages printed double-sided, about 16 times the Encyclopedia Universalis). Maintaining such large applications is a trade-off between having to change a model that nobody can understand in detail and limiting the impact of possible changes. Beyond maintenance, a good structure gives software systems good qualities for migration towards modern paradigms such as web services or components, and the problem of architecture extraction is very close to the classical remodularization problem.

    Intelligent Web Services Architecture Evolution Via An Automated Learning-Based Refactoring Framework

    Full text link
    Architecture degradation can have a fundamental impact on software quality and productivity, resulting in an inability to support new features, increasing technical debt and leading to significant losses. While code-level refactoring is widely studied and well supported by tools, architecture-level refactorings, such as repackaging to group related features into one component, or retrofitting files into patterns, remain expensive and risky. Several domains, such as Web services, heavily depend on complex architectures to design and implement interface-level operations, provided by companies such as FedEx, eBay, Google, Yahoo and PayPal to end-users. The objectives of this work are: (1) to advance our ability to support complex architecture refactoring by explicitly defining Web service anti-patterns at various levels of abstraction, (2) to enable complex refactorings by learning from user feedback and creating reusable/personalized refactoring strategies to augment intelligent designers' interaction that will guide low-level refactoring automation with high-level abstractions, and (3) to enable intelligent architecture evolution by detecting, quantifying, prioritizing, fixing and predicting design technical debts. We proposed various approaches and tools based on intelligent computational search techniques for (a) predicting and detecting multi-level Web service antipatterns, (b) creating an interactive refactoring framework that integrates refactoring path recommendation, design-level human abstraction, and code-level refactoring automation with user feedback using interactive multi-objective search, and (c) automatically learning reusable and personalized refactoring strategies for Web services by abstracting recurring refactoring patterns from Web service releases. Based on empirical validations performed on both large open-source and industrial services from multiple providers (eBay, Amazon, FedEx and Yahoo), we found that the proposed approaches advance our understanding of the correlation and mutual impact between service antipatterns at different levels, revealing when, where and how architecture-level anti-patterns affect the quality of services. The interactive refactoring framework enables, based on several controlled experiments, human-based, domain-specific abstraction and high-level design to guide automated code-level atomic refactoring steps for service decompositions. The reusable refactoring strategy packages recurring refactoring activities into automatable units, improving refactoring path recommendation and further reducing time-consuming and error-prone human intervention. Ph.D., College of Engineering & Computer Science, University of Michigan-Dearborn. https://deepblue.lib.umich.edu/bitstream/2027.42/142810/1/Wang Final Dissertation.pdf (Wang Final Dissertation.pdf: Dissertation)
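
    As a concrete illustration of interface-level antipattern detection, the following is a hedged Python sketch of a simple metric-and-threshold rule for a "god object" Web service interface. The threshold, port-type name, and operation list are invented; detectors in this line of work typically combine several metrics rather than a single count.

```python
# Sketch: flag a port type that exposes an unusually large number of
# operations as a decomposition candidate ("god object" interface).
# The threshold and example data are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PortType:
    name: str
    operations: list[str]

def is_god_interface(port_type: PortType, max_operations: int = 20) -> bool:
    return len(port_type.operations) > max_operations

ec2 = PortType("AmazonEC2PortType", [f"operation_{i}" for i in range(35)])
print(ec2.name, "->", "flag for decomposition" if is_god_interface(ec2) else "ok")
```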

    Activity Report 2012. Project-Team RMOD. Analyses and Languages Constructs for Object-Oriented Application Evolution

    Get PDF
    Activity Report 2012 Project-Team RMOD Analyses and Languages Constructs for Object-Oriented Application Evolution

    Project-Team RMoD 2013 Activity Report

    Get PDF
    Activity Report 2013 Project-Team RMOD Analyses and Languages Constructs for Object-Oriented Application Evolution

    Types and concept analysis for legacy systems

    Get PDF
    We combine type inference and concept analysis in order to gain insight into legacy software systems. Type inference for Cobol yields the types for variables and program parameters. These types are used to perform mathematical concept analysis on legacy systems. We have developed ConceptRefinery, a tool for interactively manipulating concepts. We show how this tool facilitates experiments with concept analysis, and lets reengineers employ their knowledge of the legacy system to refine the results of concept analysis.
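
    A small Python sketch of the concept-analysis step, assuming variables and their inferred type facts are available as a binary object-attribute relation. The variable names and attributes are invented and do not come from ConceptRefinery.

```python
# Sketch: brute-force formal concept analysis over an object-attribute
# relation (objects = Cobol variables, attributes = inferred type facts).
# A formal concept is a pair (extent, intent) closed in both directions.
from itertools import combinations

relation = {
    "CUST-ID":   {"numeric", "key"},
    "ORDER-ID":  {"numeric", "key"},
    "CUST-NAME": {"alphanumeric"},
    "ORDER-QTY": {"numeric", "quantity"},
}

def common_attributes(objs):
    sets = [relation[o] for o in objs]
    return set.intersection(*sets) if sets else set()

def objects_with(attrs):
    return {o for o, a in relation.items() if attrs <= a}

concepts = set()
objs = list(relation)
for r in range(1, len(objs) + 1):
    for group in combinations(objs, r):
        attrs = common_attributes(group)
        extent = objects_with(attrs)
        concepts.add((frozenset(extent), frozenset(attrs)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```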

    Why and How Java Developers Break APIs

    Full text link
    Modern software development depends on APIs to reuse code and increase productivity. Like most software systems, these libraries and frameworks also evolve, which may break existing clients. However, the main reasons to introduce breaking changes in APIs are unclear. Therefore, in this paper, we report the results of an almost 4-month-long field study with the developers of 400 popular Java libraries and frameworks. We configured an infrastructure to observe all changes in these libraries and to detect breaking changes shortly after their introduction in the code. After identifying breaking changes, we asked the developers to explain the reasons behind their decision to change the APIs. During the study, we identified 59 breaking changes, confirmed by the developers of 19 projects. By analyzing the developers' answers, we report that breaking changes are mostly motivated by the need to implement new features, by the desire to make the APIs simpler and with fewer elements, and by the need to improve maintainability. We conclude by providing suggestions to language designers, tool builders, software engineering researchers and API developers. Comment: Accepted at the International Conference on Software Analysis, Evolution and Reengineering (SANER 2018); 11 pages.
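
    A minimal Python sketch of the kind of check such an infrastructure performs when diffing consecutive releases, assuming a library's public API is summarized as a set of method signatures. The signatures are invented; the study itself observes real Java code history rather than hand-written sets.

```python
# Sketch: diff the public API of two releases; removed or changed
# signatures are candidate breaking changes, additions are usually safe.
# Signatures below are illustrative, not from any real library.
old_api = {"Parser.parse(String)", "Formatter.format(Date)", "Formatter.format(Date, Locale)"}
new_api = {"Parser.parse(String)", "Formatter.format(Date, Locale)", "Formatter.formatIso(Date)"}

removed = old_api - new_api   # may break existing clients
added = new_api - old_api     # typically backward compatible

print("potentially breaking:", sorted(removed))
print("non-breaking:", sorted(added))
```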