
    The Intuitive Appeal of Explainable Machines

    Algorithmic decision-making has become synonymous with inexplicable decision-making, but what makes algorithms so difficult to explain? This Article examines what sets machine learning apart from other ways of developing rules for decision-making and the problems these properties pose for explanation. We show that machine learning models can be both inscrutable and nonintuitive and that these are related, but distinct, properties. Calls for explanation have treated these problems as one and the same, but disentangling the two reveals that they demand very different responses. Dealing with inscrutability requires providing a sensible description of the rules; addressing nonintuitiveness requires providing a satisfying explanation for why the rules are what they are. Existing laws like the Fair Credit Reporting Act (FCRA), the Equal Credit Opportunity Act (ECOA), and the General Data Protection Regulation (GDPR), as well as techniques within machine learning, are focused almost entirely on the problem of inscrutability. While such techniques could allow a machine learning system to comply with existing law, doing so may not help if the goal is to assess whether the basis for decision-making is normatively defensible. In most cases, intuition serves as the unacknowledged bridge between a descriptive account and a normative evaluation. But because machine learning is often valued for its ability to uncover statistical relationships that defy intuition, relying on intuition is not a satisfying approach. This Article thus argues for other mechanisms for normative evaluation. To know why the rules are what they are, one must seek explanations of the process behind a model's development, not just explanations of the model itself.

    iSee: a case-based reasoning platform for the design of explanation experiences.

    Explainable Artificial Intelligence (XAI) is an emerging field within Artificial Intelligence (AI) that has provided many methods that enable humans to understand and interpret the outcomes of AI systems. However, deciding on the best explanation approach for a given AI problem is currently a challenging decision-making task. This paper presents the iSee project, which aims to address some of the XAI challenges by providing a unifying platform where personalized explanation experiences are generated using Case-Based Reasoning. An explanation experience includes the proposed solution to a particular explainability problem and its corresponding evaluation, provided by the end user. The ultimate goal is to provide an open catalog of explanation experiences that can be transferred to other scenarios where trustworthy AI is required.
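    As a rough illustration of the case-based reasoning idea described in this abstract, the sketch below retrieves the most similar past "explanation experience" for a new explainability problem. The case attributes, similarity weights, and catalog entries are invented for this sketch and are not the iSee platform's actual data model or API.

```python
from dataclasses import dataclass

# Hypothetical CBR retrieval over a catalog of past explanation experiences.
# All attribute names, weights, and example cases are illustrative assumptions.

@dataclass
class ExplanationExperience:
    task: str            # e.g. "image-classification", "credit-scoring"
    model_family: str    # e.g. "cnn", "gradient-boosting"
    audience: str        # e.g. "domain-expert", "end-user"
    xai_method: str      # the explanation approach that was applied
    user_rating: float   # evaluation provided by the end user (0..1)

def similarity(query: dict, case: ExplanationExperience) -> float:
    """Weighted attribute matching between a new problem and a past case."""
    weights = {"task": 0.5, "model_family": 0.3, "audience": 0.2}
    return sum(w for attr, w in weights.items() if query[attr] == getattr(case, attr))

def recommend_explanation(query: dict, catalog: list) -> ExplanationExperience:
    """Retrieve the most similar past experience, preferring better-rated ones on ties."""
    return max(catalog, key=lambda c: (similarity(query, c), c.user_rating))

catalog = [
    ExplanationExperience("image-classification", "cnn", "end-user", "saliency-maps", 0.8),
    ExplanationExperience("credit-scoring", "gradient-boosting", "domain-expert", "SHAP", 0.9),
]
query = {"task": "credit-scoring", "model_family": "gradient-boosting", "audience": "end-user"}
print(recommend_explanation(query, catalog).xai_method)  # -> "SHAP"
```

    In a fuller system, the retrieved experience would then be reused, revised against the new problem, and retained back into the catalog, following the usual retrieve-reuse-revise-retain cycle of case-based reasoning.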

    A Personalized Recommender System Based on Explanation Facilities Using Collaborative Filtering

    Collaborative filtering (CF) is the most successful recommendation method, but its widespread use has exposed some limitations, such as sparsity, scalability, and the black box problem. Many researchers have focused on the sparsity and scalability problems, but few have tried to solve the black box problem. Most CF recommender systems are black boxes, providing no transparency into how a recommendation is produced. This research proposes an improved CF recommender system with explanation facilities to overcome the black box problem. Explanation facilities make it possible to expose the reasoning and data behind a recommendation, and thus provide a mechanism for handling errors that come with a recommendation. Furthermore, web usage mining and product taxonomy are used to enhance recommendation quality in e-commerce environments. For this purpose, a recommender system named WebCF-Exp (Web usage mining driven Collaborative Filtering with Explanation facilities) is developed. To test the performance of WebCF-Exp, the EBIB research internet shopping mall and explanation interfaces are developed, and experiments are conducted with the data provided by the EBIB research internet shopping mall.
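    To make the idea of an explanation facility for collaborative filtering concrete, the sketch below is a minimal user-based CF predictor that also reports which neighboring users and ratings drove the recommendation. The ratings matrix, similarity measure, and explanation format are illustrative assumptions, not the WebCF-Exp system itself.

```python
import numpy as np

# Minimal user-based collaborative filtering with an explanation facility.
# Rows are users, columns are items, 0 means "unrated". All data is made up.

def cosine_sim(a, b):
    """Cosine similarity computed over co-rated items only."""
    mask = (a > 0) & (b > 0)
    if not mask.any():
        return 0.0
    return float(np.dot(a[mask], b[mask]) /
                 (np.linalg.norm(a[mask]) * np.linalg.norm(b[mask])))

def recommend_with_explanation(ratings, user, item, k=2):
    """Predict a rating for (user, item) and expose the neighbors behind it."""
    sims = [(u, cosine_sim(ratings[user], ratings[u]))
            for u in range(len(ratings)) if u != user and ratings[u, item] > 0]
    neighbours = sorted(sims, key=lambda x: x[1], reverse=True)[:k]
    denom = sum(s for _, s in neighbours)
    if denom == 0:
        return None, []
    prediction = sum(s * ratings[u, item] for u, s in neighbours) / denom
    explanation = [f"user {u} (similarity {s:.2f}) rated this item {ratings[u, item]}"
                   for u, s in neighbours]
    return prediction, explanation

ratings = np.array([[5, 3, 0, 1],
                    [4, 0, 4, 1],
                    [1, 1, 5, 4]])
score, why = recommend_with_explanation(ratings, user=0, item=2)
print(round(score, 2), why)
```

    Exposing the neighbors and their ratings is one simple way to reveal "the reasoning and data behind a recommendation"; a production system would typically enrich this with web usage data and product-taxonomy information, as the abstract describes.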