630 research outputs found

    A Low-Effort Recommendation System with High Accuracy - A New Approach with Ranked Pareto-Fronts

    In recent studies on recommendation systems, choice-based conjoint analysis has been suggested as a method for measuring consumer preferences. This approach achieves high recommendation accuracy and does not suffer from the start-up problem, because it can also produce recommendations for new consumers or for new products. However, the method requires massive consumer input, which causes consumer reluctance. In a simulation study, we demonstrate the high accuracy, but also the high user effort, of a utility-based recommendation system that uses choice-based conjoint analysis with hierarchical Bayes estimation. To reduce the conflict between consumer effort and recommendation accuracy, we develop a novel approach that shows only Pareto-efficient alternatives and ranks them according to the number of dominated attributes. We demonstrate that, in terms of the decision accuracy of the recommended products, the ranked Pareto-front approach performs better than a recommendation system that employs choice-based conjoint analysis, while the consumer's effort is kept low and comparable to that of simple systems that require little consumer input.
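
    The core of the ranked Pareto-front idea can be illustrated with a short sketch: keep only the alternatives that no other alternative dominates, then order the survivors by a dominance-based count. The ranking rule below (counting attribute-wise wins against the other efficient alternatives) and the toy product data are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch, not the authors' exact algorithm: filter a product set to its
# Pareto front, then rank the survivors by a simple attribute-dominance count.
# Assumes higher attribute values are better; the toy products below are made up.

def dominates(a, b):
    """True if a is at least as good as b on every attribute and strictly better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def ranked_pareto_front(products):
    """Return Pareto-efficient products, ranked by how many attribute-wise
    comparisons they win against the other efficient alternatives."""
    front = [p for p in products
             if not any(dominates(q, p) for q in products if q is not p)]
    def score(p):
        return sum(x > y
                   for q in front if q is not p
                   for x, y in zip(p, q))
    return sorted(front, key=score, reverse=True)

# Toy catalogue: (battery life, camera quality, negated price) so "higher is better" holds.
products = [(10, 7, -300), (8, 9, -250), (6, 6, -400), (9, 8, -280)]
print(ranked_pareto_front(products))
```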

    Methodology for Comparison of Algorithms for Real-World Multi-objective Optimization Problems: Space Surveillance Network Design

    Space Situational Awareness (SSA) is an activity vital to protecting national and commercial satellites from damage or destruction due to collisions. Recent research has demonstrated a methodology using evolutionary algorithms (EAs) intended to develop near-optimal Space Surveillance Network (SSN) architectures in the sense of low cost, low latency, and high resolution. That research is extended here by (1) developing and applying a methodology to compare the performance of two or more algorithms against this problem, and (2) analyzing the effects of using reduced data sets in those searches. Computational experiments are presented in which the performance of five multi-objective search algorithms is compared using four binary comparison methods, each quantifying the relationship between two solution sets in a different way. Relative rankings reveal strengths and weaknesses of the evaluated algorithms, empowering researchers to select the best algorithm for their specific needs. The use of reduced data sets is shown to be useful for producing relative rankings of algorithms that are representative of the rankings produced using the full set.
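
    A binary comparison method takes two non-dominated solution sets and quantifies how one relates to the other. The sketch below shows one widely used example, the coverage metric C(A, B); the four methods used in the study above are not identified in this abstract, so this is an illustration of the idea rather than the paper's own measures.

```python
# Zitzler's coverage metric C(A, B): the fraction of solutions in B that are weakly
# dominated by at least one solution in A (minimization assumed on all objectives).
# Shown as a generic example of a binary comparison method; the toy data are made up.

def weakly_dominates(a, b):
    return all(x <= y for x, y in zip(a, b))

def coverage(A, B):
    """C(A, B) in [0, 1]; C(A, B) = 1 means every point of B is covered by A."""
    return sum(any(weakly_dominates(a, b) for a in A) for b in B) / len(B)

# Toy objective vectors (cost, latency) from two hypothetical search algorithms.
A = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
B = [(1.5, 5.0), (2.0, 4.0), (5.0, 1.0)]
print(coverage(A, B), coverage(B, A))  # C is not symmetric, so report both directions
```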

    A multi-objective performance optimisation framework for video coding

    Digital video technologies have become an essential part of the way visual information is created, consumed and communicated. However, due to the unprecedented growth of digital video technologies, competition for bandwidth resources has become fierce, which has highlighted a critical need for optimising the performance of video encoders. This is a dual optimisation problem, wherein the objective is to reduce the buffer and memory requirements while maintaining the quality of the encoded video. Additionally, through the analysis of existing video compression techniques, it was found that the operation of video encoders requires the optimisation of numerous decision parameters to achieve the best trade-offs between the factors that affect visual quality, given the resource limitations arising from operational constraints such as memory and complexity. The research in this thesis has focused on optimising the performance of the H.264/AVC video encoder, a process that involved finding solutions for multiple conflicting objectives. As part of this research, an automated tool was developed for optimising video compression to achieve an optimal trade-off between bit rate and visual quality, given maximum allowed memory and computational complexity constraints, within a diverse range of scene environments. The evaluation of this optimisation framework has highlighted the effectiveness of the developed solution.
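
    In outline, this kind of constrained rate-quality optimisation can be framed as searching an encoder's parameter space, discarding configurations that violate the memory and complexity limits, and keeping the non-dominated bit-rate/quality trade-offs. The parameter names, cost model and limits in the sketch below are hypothetical stand-ins, not the actual H.264/AVC encoder interface or the thesis' framework.

```python
# Minimal sketch of a constrained bit-rate/quality trade-off search.
# evaluate(), the parameter grid and the limits are invented for illustration only.
import itertools

SEARCH_SPACE = {
    "qp": [22, 27, 32, 37],       # quantisation parameter
    "ref_frames": [1, 2, 4],      # number of reference frames
    "search_range": [8, 16, 32],  # motion-estimation search range
}
MAX_MEMORY_MB, MAX_COMPLEXITY = 64, 1.0e6

def evaluate(cfg):
    """Hypothetical cost model returning (bitrate, distortion, memory, complexity)."""
    bitrate = 5000 / cfg["qp"] * (1 + 0.05 * cfg["ref_frames"])
    distortion = cfg["qp"] * 1.5 - 0.5 * cfg["ref_frames"] - 0.02 * cfg["search_range"]
    memory = 16 * cfg["ref_frames"]
    complexity = 1e4 * cfg["ref_frames"] * cfg["search_range"]
    return bitrate, distortion, memory, complexity

def feasible_front():
    keys = list(SEARCH_SPACE)
    candidates = []
    for values in itertools.product(*SEARCH_SPACE.values()):
        cfg = dict(zip(keys, values))
        bitrate, distortion, memory, complexity = evaluate(cfg)
        if memory <= MAX_MEMORY_MB and complexity <= MAX_COMPLEXITY:
            candidates.append((bitrate, distortion, cfg))
    # keep configurations not dominated in (bitrate, distortion), both minimised
    return [c for c in candidates
            if not any(o[0] <= c[0] and o[1] <= c[1] and o[:2] != c[:2]
                       for o in candidates)]

for bitrate, distortion, cfg in sorted(feasible_front(), key=lambda t: t[:2]):
    print(f"{cfg}  bitrate={bitrate:.0f}  distortion={distortion:.1f}")
```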

    Intelligent Web Services Architecture Evolution Via An Automated Learning-Based Refactoring Framework

    Architecture degradation can have a fundamental impact on software quality and productivity, resulting in an inability to support new features, increasing technical debt and leading to significant losses. While code-level refactoring is widely studied and well supported by tools, architecture-level refactorings, such as repackaging to group related features into one component, or retrofitting files into patterns, remain expensive and risky. Several domains, such as Web services, heavily depend on complex architectures to design and implement interface-level operations, provided by companies such as FedEx, eBay, Google, Yahoo and PayPal, to end-users. The objectives of this work are: (1) to advance our ability to support complex architecture refactoring by explicitly defining Web service anti-patterns at various levels of abstraction, (2) to enable complex refactorings by learning from user feedback and creating reusable/personalized refactoring strategies that augment intelligent designers' interaction and guide low-level refactoring automation with high-level abstractions, and (3) to enable intelligent architecture evolution by detecting, quantifying, prioritizing, fixing and predicting design technical debt. We proposed various approaches and tools based on intelligent computational search techniques for (a) predicting and detecting multi-level Web service anti-patterns, (b) creating an interactive refactoring framework that integrates refactoring path recommendation, design-level human abstraction, and code-level refactoring automation with user feedback using interactive multi-objective search, and (c) automatically learning reusable and personalized refactoring strategies for Web services by abstracting recurring refactoring patterns from Web service releases. Based on empirical validations performed on both large open-source and industrial services from multiple providers (eBay, Amazon, FedEx and Yahoo), we found that the proposed approaches advance our understanding of the correlation and mutual impact between service anti-patterns at different levels, revealing when, where and how architecture-level anti-patterns affect the quality of services. The interactive refactoring framework enables, based on several controlled experiments, human-based, domain-specific abstraction and high-level design to guide automated code-level atomic refactoring steps for service decompositions. The reusable refactoring strategy packages recurring refactoring activities into automatable units, improving refactoring path recommendation and further reducing time-consuming and error-prone human intervention. Ph.D. dissertation, College of Engineering & Computer Science, University of Michigan-Dearborn. https://deepblue.lib.umich.edu/bitstream/2027.42/142810/1/Wang Final Dissertation.pdf
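
    As a concrete flavour of item (a), an anti-pattern detector can be expressed as a rule over interface-level metrics. The rule below for a "God Object Web Service" (an interface that concentrates many loosely related operations) is a hand-written illustration with assumed metrics and thresholds; the dissertation instead learns such detection rules through computational search.

```python
# Illustrative, hand-written rule for one well-known Web service anti-pattern.
# The cohesion proxy and thresholds are assumptions for the sake of the example.
from dataclasses import dataclass

@dataclass
class ServiceInterface:
    name: str
    operations: list                   # operation names exposed by the port type
    avg_operation_similarity: float    # 0..1 cohesion proxy (e.g. term overlap between names)

def is_god_object_service(svc, max_ops=20, min_cohesion=0.3):
    """Flag interfaces that expose many operations with weak cohesion between them."""
    return len(svc.operations) > max_ops and svc.avg_operation_similarity < min_cohesion

svc = ServiceInterface("ShippingAPI", [f"op{i}" for i in range(35)], 0.12)
print(is_god_object_service(svc))  # True under these illustrative thresholds
```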

    Intelligent Software Bugs Localization, Triage and Prioritization

    One of the most time-consuming software maintenance tasks is the localization of software bugs, especially in large systems. Developers have to follow a tedious process to reproduce the abnormal behavior and then inspect a large number of files in order to resolve the bugs. Furthermore, software developers are usually overwhelmed with several reports of critical bugs to be addressed urgently and simultaneously. The management of these bugs is a complex problem due to limited resources and deadline pressure. Another critical task in this process is to assign an appropriate priority to each bug and eventually assign it to the right developer for resolution. Several approaches have been proposed for bug localization; the majority of them recommend classes as outputs, which may still require high inspection effort. In addition, there is a significant difference between the natural language used in bug reports and the programming language, which limits the efficiency of existing approaches since most of them are mainly based on lexical similarity. Most existing studies treat bug reports in isolation when assigning them to developers. They also lack an understanding of the dynamics of changing bug priorities. Thus, developers may spend considerable cognitive effort moving between completely unrelated bug reports. To address these challenges, we proposed the following research contributions: 1. We proposed an automated approach to find and rank the potential classes and methods in order to localize software defects. Our approach finds a good balance between minimizing the number of recommended classes and maximizing the relevance of the proposed solution using a hybrid multi-objective optimization algorithm combining local and global search. Our hybrid multi-objective approach is able to successfully locate the true buggy methods within the top 10 recommendations for over 78% of the bug reports, leading to a significant reduction of developers' effort compared to class-level bug localization techniques. 2. We proposed an automated bug triage approach based on the dependencies between several open bug reports. We defined the dependency between two bug reports as the number of common files to be inspected to localize the bugs. Then, we adopted multi-objective search to rank the bug reports for programmers. The results show a significant time reduction of over 30% in localizing the bugs simultaneously compared to the traditional bug prioritization technique based only on priorities. 3. We performed an empirical study to observe and understand the changes in bug priority in order to build a 3-W model of Why and When bug priorities change, and Who performs the change. We conducted interviews and a survey with practitioners, and performed a quantitative analysis of a large database of bug reports. As a result, we observed frequent changes in bug priorities and their impact on delaying critical bug fixes, especially before shipping a new release. Ph.D. dissertation, College of Engineering & Computer Science, University of Michigan-Dearborn. http://deepblue.lib.umich.edu/bitstream/2027.42/170906/1/Rafi Almhana Final Dissertation.pdf
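
    Contribution 2 rests on a simple, concrete measure: the dependency between two bug reports is the number of suspect files they share. The sketch below implements that measure together with a greedy ordering that keeps related reports adjacent; the greedy heuristic and the toy data are illustrative stand-ins for the dissertation's multi-objective search.

```python
# Dependency between two bug reports = number of suspect files they have in common
# (as defined in the abstract above). The greedy ordering is an illustrative heuristic,
# not the dissertation's multi-objective formulation; the report data are made up.

def dependency(files_a, files_b):
    """Number of suspect files common to two bug reports."""
    return len(set(files_a) & set(files_b))

def greedy_order(reports):
    """reports: dict of report id -> list of suspect files. Start anywhere, then
    repeatedly pick the unvisited report sharing the most files with the current one."""
    remaining = dict(reports)
    current_id, current_files = remaining.popitem()
    order = [current_id]
    while remaining:
        next_id = max(remaining, key=lambda r: dependency(current_files, remaining[r]))
        current_files = remaining.pop(next_id)
        order.append(next_id)
    return order

reports = {
    "BUG-1": ["a.java", "b.java", "c.java"],
    "BUG-2": ["c.java", "d.java"],
    "BUG-3": ["b.java", "d.java", "e.java"],
}
print(greedy_order(reports))
```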

    Search-Based Software Bugs Localization, Triage and Prioritization

    http://deepblue.lib.umich.edu/bitstream/2027.42/170552/1/PhD_Dissertation___Rafi_Almhana__Copy_ (1).pdf

    Tools for evolutionary acquisition : a study of Multi-Attribute Tradespace Exploration (MATE) applied to the Space Based Radar (SBR)

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2003. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Statement of responsibility on t.p. reads: 2nd Lieutenant Timothy J. Spaulding, USAF. Includes bibliographical references (p. 139-142). By Timothy J. Spaulding. S.M.

    Numerical and Evolutionary Optimization 2020

    This book was compiled after the 8th International Workshop on Numerical and Evolutionary Optimization (NEO) and collects papers on the intersection of the two research areas covered at the workshop: numerical optimization and evolutionary search techniques. The contributions focus on the design of fast and reliable methods lying across these two paradigms, and the resulting techniques are applicable to a broad class of real-world problems, such as pattern recognition, routing, energy, production lines, prediction, and modeling, among others. This volume is intended to serve as a useful reference for mathematicians, engineers, and computer scientists exploring current issues and solutions emerging from these mathematical and computational methods and their applications.

    Multiobjective Best Theory Diagrams for cross-ply composite plates employing polynomial, zig-zag, trigonometric and exponential thickness expansions

    This paper presents Best Theory Diagrams (BTDs) for plates considering all the displacement and stress components as objectives. The BTD is a diagram from which the minimum number of terms that have to be used to achieve a desired accuracy can be read. Maclaurin, zig-zag, trigonometric and exponential expansions are employed for the static analysis of cross-ply composite plates. The Equivalent Single Layer (ESL) approach is considered, and the Unified Formulation developed by Carrera is used. The governing equations are derived from the Principle of Virtual Displacements (PVD), and Navier-type closed-form solutions are adopted. BTDs are obtained using the Axiomatic/Asymptotic Method (AAM) and genetic algorithms (GA). The results show that the BTD can be used as a tool to assess the accuracy and computational efficiency of any structural model and to draw guidelines for developing structural models. The inclusion of the multiobjective capability extends the BTD's validity to recognizing the role played by each output parameter in the refinement of a structural model.
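
    Conceptually, a BTD answers the question: for each admissible number of retained expansion terms, which combination of terms gives the lowest error against a reference solution? The sketch below enumerates term subsets with a made-up error model in place of an actual Carrera Unified Formulation plate analysis; in practice a genetic algorithm replaces the brute-force enumeration for larger term sets.

```python
# Conceptual sketch of building a Best Theory Diagram: for each model size n, find the
# subset of n expansion terms with minimum error. The term labels and error() model are
# invented stand-ins; real evaluations come from CUF-based plate analyses.
from itertools import combinations

TERMS = ["u_z1", "u_z2", "u_z3", "zigzag", "sin_z", "exp_z"]   # illustrative term labels

def error(active_terms):
    """Hypothetical error model: richer term sets yield lower error."""
    weights = {"u_z1": 0.40, "u_z2": 0.25, "u_z3": 0.10,
               "zigzag": 0.15, "sin_z": 0.06, "exp_z": 0.04}
    return 1.0 - sum(weights[t] for t in active_terms)

def best_theory_diagram(terms):
    """Map each model size n to the term subset of size n with the lowest error."""
    return {n: min(combinations(terms, n), key=error)
            for n in range(1, len(terms) + 1)}

for n, subset in best_theory_diagram(TERMS).items():
    print(n, subset, round(error(subset), 2))
```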

    Life-cycle optimization of building performance: a collection of case studies

    The building sector is one of the largest contributors to energy demand and environmental impact in developed countries, together with industry and transport. The European Union introduced the concept of the nearly zero-energy building (nZEB) and promoted deep renovation of the existing building stock with the aim of reducing the energy consumption and environmental impacts of the building sector. The design of an nZEB, and in general of a low-energy building, involves different aspects such as economic cost, indoor comfort, energy consumption, life-cycle environmental impacts, and the different points of view of policy makers, investors and inhabitants. Thus, a multi-criteria approach is often required in the design process to manage these potentially conflicting domains. In particular, one of the most suitable approaches is to integrate the preliminary building design (or renovation) phase into a multi-objective optimization problem, which makes it possible to rapidly compare many alternatives and to identify the most suitable interventions.
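
    At its simplest, framing a renovation decision this way means scoring each candidate retrofit package on the competing objectives and keeping the non-dominated ones. The package data and the two objectives (life-cycle cost, primary energy) in the sketch below are illustrative assumptions, not figures from the case studies collected in this work.

```python
# Toy multi-objective comparison of retrofit packages; all numbers are invented.
packages = {   # name: (life-cycle cost in kEUR, primary energy in kWh/m2 per year)
    "baseline":          (120, 210),
    "insulation":        (150, 150),
    "insulation+PV":     (175, 95),
    "insulation+PV+HP":  (205, 60),
    "PV only":           (160, 180),
}

def pareto_optimal(options):
    """Keep packages for which no other package is both cheaper and less energy-intensive."""
    return {name: obj for name, obj in options.items()
            if not any(o[0] <= obj[0] and o[1] <= obj[1] and o != obj
                       for o in options.values())}

for name, (cost, energy) in pareto_optimal(packages).items():
    print(f"{name}: {cost} kEUR, {energy} kWh/m2 per year")
```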