
    ExplainIt! -- A declarative root-cause analysis engine for time series data (extended version)

    We present ExplainIt!, a declarative, unsupervised root-cause analysis engine that uses time series monitoring data from large complex systems such as data centres. ExplainIt! empowers operators to succinctly specify a large number of causal hypotheses to search for causes of interesting events. ExplainIt! then ranks these hypotheses, reducing the number of causal dependencies from hundreds of thousands to a handful for human understanding. We show how a declarative language, such as SQL, can be effective in declaratively enumerating hypotheses that probe the structure of an unknown probabilistic graphical causal model of the underlying system. Our thesis is that databases are in a unique position to enable users to rapidly explore the possible causal mechanisms in data collected from diverse sources. We empirically demonstrate how ExplainIt! has helped us resolve over 30 performance issues in a commercial product since late 2014, of which we discuss a few cases in detail. Comment: SIGMOD Industry Track 201
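
    As a rough illustration of the hypothesis-ranking idea described above (not ExplainIt!'s actual algorithm, which is specified declaratively in SQL over a probabilistic graphical causal model), the following Python sketch scores each candidate metric by how much its lagged values improve the prediction of an effect series, then keeps the top few. The function names, the Granger-style scoring rule, and the metrics dictionary are all illustrative assumptions.

```python
# A minimal, hypothetical sketch of hypothesis ranking over time series.
# Names (score_hypothesis, metrics) and the lagged-regression scoring rule are
# illustrative assumptions, not the scoring used by ExplainIt! itself.
import numpy as np

def score_hypothesis(cause: np.ndarray, effect: np.ndarray, lag: int = 1) -> float:
    """Score how much a lagged 'cause' series helps predict the 'effect' series,
    relative to predicting it from its own past (a crude Granger-style check)."""
    x_past = effect[:-lag]                      # effect's own history
    x_next = effect[lag:]                       # effect one step ahead
    c_past = cause[:-lag]                       # candidate cause's history

    # Residual variance using only the effect's own past
    base = np.polyfit(x_past, x_next, 1)
    err_base = np.var(x_next - np.polyval(base, x_past))

    # Residual variance when the candidate cause is added as a predictor
    A = np.column_stack([x_past, c_past, np.ones_like(x_past)])
    coef, *_ = np.linalg.lstsq(A, x_next, rcond=None)
    err_full = np.var(x_next - A @ coef)

    return (err_base - err_full) / (err_base + 1e-12)  # relative improvement

def rank_hypotheses(metrics: dict[str, np.ndarray], effect_name: str, top_k: int = 5):
    """Rank every other metric as a candidate cause of `effect_name`."""
    effect = metrics[effect_name]
    scores = {name: score_hypothesis(series, effect)
              for name, series in metrics.items() if name != effect_name}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
```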

    Energy efficiency parametric design tool in the framework of holistic ship design optimization

    Recent International Maritime Organization (IMO) decisions with respect to measures to reduce the emissions from maritime greenhouse gases (GHGs) suggest that the collaboration of all major stakeholders of shipbuilding and ship operations is required to address this complex techno-economical and highly political problem efficiently. This eventually calls for the development of proper design, operational knowledge, and assessment tools for the energy-efficient design and operation of ships, as suggested by the Second IMO GHG Study (2009). This type of coordination of the efforts of many maritime stakeholders, with often conflicting professional interests but ultimately a common aim of optimal ship design and operation solutions, has been addressed within a methodology developed in the EU-funded Logistics-Based (LOGBASED) Design Project (2004–2007). Based on the knowledge base developed within this project, a new parametric design software tool (PDT) has been developed by the National Technical University of Athens, Ship Design Laboratory (NTUA-SDL), for implementing an energy efficiency design and management procedure. The PDT is an integral part of an earlier developed holistic ship design optimization approach by NTUA-SDL that addresses the multi-objective ship design optimization problem. It provides Pareto-optimum solutions and a complete mapping of the design space in a comprehensive way for the final assessment and decision by all the involved stakeholders. The application of the tool to the design of a large oil tanker and, alternatively, to container ships is elaborated in the presented paper.
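
    To make the multi-objective aspect concrete, the sketch below extracts a Pareto front from a set of evaluated candidate designs, assuming two objectives to be minimized (building cost and an emissions index). The objectives, values, and design names are invented for illustration and are not the actual objectives or solver of the NTUA-SDL tool.

```python
# A minimal sketch of Pareto-front extraction for a multi-objective design study.
# The two objectives (cost, emissions index) and the candidate designs are
# invented for illustration; the actual tool's objectives and solver differ.
from dataclasses import dataclass

@dataclass
class Design:
    name: str
    cost: float        # e.g. building cost (lower is better)
    emissions: float   # e.g. an energy-efficiency / GHG index (lower is better)

def dominates(a: Design, b: Design) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    return (a.cost <= b.cost and a.emissions <= b.emissions and
            (a.cost < b.cost or a.emissions < b.emissions))

def pareto_front(designs: list[Design]) -> list[Design]:
    """Keep only designs that no other design dominates."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other is not d)]

candidates = [
    Design("tanker_A", cost=95.0, emissions=4.2),
    Design("tanker_B", cost=88.0, emissions=4.9),
    Design("tanker_C", cost=102.0, emissions=3.8),
    Design("tanker_D", cost=99.0, emissions=4.5),   # dominated by tanker_A
]
for d in pareto_front(candidates):
    print(d.name, d.cost, d.emissions)
```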

    Reducing Electricity Demand Charge for Data Centers with Partial Execution

    Data centers consume a large amount of energy and incur substantial electricity cost. In this paper, we study the familiar problem of reducing data center energy cost with two new perspectives. First, we find, through an empirical study of contracts from electric utilities powering Google data centers, that demand charge per kW for the maximum power used is a major component of the total cost. Second, many services such as Web search tolerate partial execution of the requests because the response quality is a concave function of processing time. Data from Microsoft Bing search engine confirms this observation. We propose a simple idea of using partial execution to reduce the peak power demand and energy cost of data centers. We systematically study the problem of scheduling partial execution with stringent SLAs on response quality. For a single data center, we derive an optimal algorithm to solve the workload scheduling problem. In the case of multiple geo-distributed data centers, the demand of each data center is controlled by the request routing algorithm, which makes the problem much more involved. We decouple the two aspects, and develop a distributed optimization algorithm to solve the large-scale request routing problem. Trace-driven simulations show that partial execution reduces cost by 3%–10.5% for one data center, and by 15.5% for geo-distributed data centers together with request routing. Comment: 12 page
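
    The core trade-off (peak power versus a concave response-quality curve) can be sketched as follows. The quality function, power model, and parameter values are illustrative assumptions, not the paper's optimal scheduling or request-routing algorithms: the sketch simply picks the largest per-request execution fraction that fits under a peak-power budget while still meeting a minimum-quality SLA.

```python
# A minimal sketch of the partial-execution idea: when total demand would exceed a
# power budget, truncate request processing, subject to a minimum-quality SLA.
# The concave quality curve, power model, and parameter values are illustrative
# assumptions, not the paper's optimal algorithm.
import math

def quality(fraction: float) -> float:
    """Concave response quality as a function of the executed fraction of work."""
    return math.log1p(9 * fraction) / math.log(10)   # quality(1.0) == 1.0

def schedule_partial_execution(num_requests: int,
                               full_power_per_request: float,
                               power_cap: float,
                               min_quality: float) -> float:
    """Return the per-request execution fraction that fits under the power cap."""
    # Fraction allowed by the power budget (power assumed proportional to work done)
    budget_fraction = min(1.0, power_cap / (num_requests * full_power_per_request))
    if quality(budget_fraction) < min_quality:
        raise ValueError("power cap too low to meet the quality SLA")
    return budget_fraction

frac = schedule_partial_execution(num_requests=10_000,
                                  full_power_per_request=0.05,   # kW per request
                                  power_cap=400.0,               # kW peak budget
                                  min_quality=0.9)
print(f"execute {frac:.0%} of each request, quality ~ {quality(frac):.3f}")
```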

    Search engine ranking factors analysis: Moz digital marketing company survey study

    Project Work presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. The use of the Internet increases every year worldwide, for multiple purposes and at significant rates. In the same way, access to business and personal web pages that allow commercial transactions follows these high growth rates. Many studies on this subject have pointed out that it is important for most businesses to have a web presence. According to most authors, the key to being found by the right target audience for a product or service, at the right moment, lies with search engines (SE). However, search engines' website ranking algorithms have changed frequently in recent years. To keep up with this evolution, Search Engine Optimization (SEO) professionals must constantly adapt to the changing ranking strategies in the search engines' schemes of work. In this work the author explores a wide range of factors that may influence search engine result pages (SERPs) and examines recent aspects of user experience on a website that are of increasing importance for the optimization of web pages, their internal and external links, and their technical components. In addition, user actions and involvement on a website appear to be key factors that Google will probably continue to use to determine website rank in SERPs. As an empirical field, efforts to discover the ranking factors behind SE website promotion rest on trial and error, and there is no official knowledge base regarding these protected secrets kept by the major players of this valuable market. Due to the lack of published academic research in this area, the present work documents SE ranking factors based on survey data from a large number of companies in the digital marketing segment. At the end of the project the author presents the state of the art in this field of study, as well as the market's evolving perception of the subject, based heavily on practical experiments and the most recent literature in the area. Moreover, the debate about the limits of digital marketing is growing. Given the powerful influence of SE on markets and people's behaviour, the data and considerations presented here raise an important forum of discussion, now and in the future, concerning ethics and socially acceptable limits and controls over personal information on the internet.

    Using entropy and AHP-TOPSIS for comprehensive evaluation of internet shopping malls and solution optimality

    Consumers are switching from offline to online shopping for almost everything, and for this reason Internet shopping malls (ISMs) now play a crucial role in the economy. Assessing and ranking them is a critical task that can exploit Internet shopping malls' information resources when approached in a scientific way, and there are many methods for the evaluation and ranking of e-commerce sites. Considering Traffic Rank, Inbound Links, Competition, Speed, and Keyword Statistics, Multi-Criteria Decision Making (MCDM) methods have rarely been used in the literature to rank Internet Shopping Malls (ISMs) on the basis of primary/secondary data for these influencing factors. This study is therefore unique in narrowing that gap in the literature by employing MCDM methods, namely Entropy and the Analytic Hierarchy Process (AHP) to derive the weights of the influencing factors, and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) to rank the Internet Shopping Malls (ISMs). After the ranking over the selected criteria is obtained, a solution optimality step is carried out to find the average ideal solution matrix. Conclusions and managerial implications of the study are also discussed.
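
    The entropy-weighting plus TOPSIS pipeline mentioned above follows a standard recipe; a minimal sketch on an invented decision matrix is shown below. The alternatives, criteria values, and the assumption that all criteria are benefit criteria are illustrative and do not come from the study's data.

```python
# A minimal sketch of entropy weighting plus TOPSIS ranking, on an invented
# decision matrix (rows = shopping malls, columns = criteria such as traffic
# rank, inbound links, speed). Values and names are illustrative assumptions.
import numpy as np

X = np.array([              # 3 alternatives x 4 benefit criteria (higher is better)
    [0.85, 120.0, 7.5, 0.92],
    [0.70, 200.0, 6.0, 0.88],
    [0.90,  90.0, 8.2, 0.95],
])

# --- Entropy weights ---
P = X / X.sum(axis=0)                            # column-normalised proportions
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)       # assumes all entries > 0
weights = (1 - entropy) / (1 - entropy).sum()    # higher divergence -> higher weight

# --- TOPSIS ---
R = X / np.sqrt((X ** 2).sum(axis=0))            # vector normalisation
V = R * weights                                  # weighted normalised matrix
ideal, anti_ideal = V.max(axis=0), V.min(axis=0) # all criteria treated as benefits
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti_ideal) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)

for i in np.argsort(-closeness):
    print(f"mall_{i + 1}: closeness = {closeness[i]:.3f}")
```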

    PERFORMANCE EVALUATION ON QUALITY OF ASIAN AIRLINES WEBSITES – AN AHP APPROACH

    In recent years, many people have devoted their efforts to the issue of website quality. The concept of quality consists of many criteria: a quality-of-service perspective, a user perspective, a content perspective, or indeed a usability perspective. Because of its potentially instant worldwide audience, a website's quality and reliability are crucial. The very special nature of web applications and websites poses unique software testing challenges. Webmasters, web application developers, and website quality assurance managers need tools and methods that can match these new needs. This research conducts a set of tests to measure the website quality of Asian flag-carrier airlines using online web diagnostic tools. We propose a methodology for determining and evaluating the best airline websites based on multiple criteria of website quality. The approach has been implemented using the Analytical Hierarchy Process (AHP): the proposed model uses AHP pairwise comparisons and its measurement scale to generate weights for the criteria, which yields a fairer preference ordering of the criteria. The result of this study confirms that Asian airlines' websites neglect performance and quality criteria.
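
    For the AHP weighting step, a minimal sketch of deriving criteria weights from a Saaty-scale pairwise comparison matrix via the principal eigenvector, together with a consistency check, is shown below. The three criteria and the judgement values are invented for illustration and do not come from the study.

```python
# A minimal sketch of deriving AHP criteria weights from a pairwise comparison
# matrix via the principal eigenvector, with a consistency ratio check.
# The three criteria and the judgement values are illustrative assumptions.
import numpy as np

criteria = ["load_time", "broken_links", "markup_validity"]
A = np.array([          # Saaty-scale judgements (A[i, j] = importance of i over j)
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio (random index RI for n = 3 is 0.58 in Saaty's table)
n = A.shape[0]
ci = (eigvals.real[principal] - n) / (n - 1)
cr = ci / 0.58

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio = {cr:.3f}  (acceptable if < 0.10)")
```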