
    An analysis of the cost and benefit of search interactions

    Interactive Information Retrieval (IR) systems often provide features and functions, such as query suggestions and relevance feedback, that a user may or may not decide to use. The decision to take such an option has associated costs and may lead to some benefit. Thus, a savvy user would take the decisions that maximise their net benefit. In this paper, we formally model the costs and benefits of the various decisions that users, implicitly or explicitly, make when searching. We consider and analyse the following scenarios: (i) how long should a user's query be? (ii) should the user pose a specific or a vague query? (iii) should the user take a suggestion or re-formulate? (iv) when should a user employ relevance feedback? and (v) when would the "find similar" functionality be worthwhile to the user? To this end, we build a series of cost-benefit models exploring a variety of parameters that affect the decisions at play. Through these analyses, we draw a number of insights into the different decisions, provide explanations for observed behaviours, and generate numerous testable hypotheses. This work serves not only as a basis for future empirical work, but also as a template for developing other cost-benefit models of human-computer interaction.
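
    A minimal sketch of the kind of net-benefit comparison described above, in Python; the options, cost figures, and benefit figures are illustrative assumptions, not values from the paper:

        # Toy cost-benefit comparison for one search decision: accept a
        # query suggestion vs. reformulate manually. All values are
        # hypothetical and assumed to be on a common utility scale.

        def net_benefit(benefit: float, cost: float) -> float:
            """Net benefit of an interaction option."""
            return benefit - cost

        # Assumed cost (effort expended) and expected benefit (utility
        # gained) for each option -- purely illustrative numbers.
        options = {
            "accept_suggestion": {"cost": 2.0, "benefit": 3.5},
            "reformulate":       {"cost": 8.0, "benefit": 4.0},
        }

        # A rational user picks the option with the highest net benefit.
        best = max(options, key=lambda o: net_benefit(options[o]["benefit"],
                                                      options[o]["cost"]))
        print(best)  # -> accept_suggestion under these assumed values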

    Building cost-benefit models of information interactions

    Modeling how people interact with search interfaces has been of particular interest and importance to the field of Interactive Information Retrieval. Recently, there has been a move towards developing formal models of the interaction between the user and the system, whether to: (i) run a simulation, (ii) conduct an economic analysis, (iii) measure system performance, or (iv) simply better understand user interactions and hypothesise about user behaviours. Such models consider the costs and the benefits that arise through interaction with the interface/system and the information surfaced during the course of that interaction. In this half-day tutorial, we will focus on describing a series of cost-benefit models that have been proposed in the literature and how they have been applied in various scenarios. The tutorial will be structured into two parts. First, we will provide an overview of Decision Theory and Cost-Benefit Analysis techniques, and how they can be and have been applied to a variety of Interactive Information Retrieval scenarios: for example, when do facets help? under what conditions are query suggestions useful? and is it better to bookmark or re-find? The second part of the tutorial will be dedicated to building cost-benefit models, where we will discuss different techniques to build and develop such models. In the practical session, we will also discuss how costs and benefits can be estimated, and how the models can help inform and guide experimentation. During the tutorial, participants will be challenged to build cost models for a number of problems (or even bring their own problems to solve).
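
    As a worked illustration of the style of analysis covered, consider the bookmark-or-re-find question. With symbols introduced here purely for illustration (not the tutorial's notation), bookmarking pays off when the expected cost of the bookmark route is lower, i.e. when

        C_b + p \, C_a \;<\; p \, C_r

    where C_b is the one-off cost of creating the bookmark, C_a the cost of accessing it later, C_r the cost of re-finding the page from scratch, and p the probability of needing the page again. Rearranging gives the break-even re-visit probability p > C_b / (C_r - C_a), assuming C_r > C_a.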

    Balancing and Sequencing of Mixed Model Assembly Lines

    Assembly lines are cost-efficient production systems that mass-produce identical products. Due to customer demand, manufacturers use mixed-model assembly lines to produce customized products that are not identical. To stay efficient, management decisions for the line, such as the number of workers and the assignment of assembly tasks to stations, need to be optimized to increase throughput and decrease cost. In each station, the work to be done depends on the exact product configuration and is not consistent across all products. In this dissertation, a mixed-model line balancing integer program (IP) that considers parallel workers, zoning, task assignment, and ergonomic constraints, with the objective of minimizing the number of workers, is proposed. Upon observing the limitations of the IP, a Constraint Programming (CP) model based on the CPLEX CP Optimizer is developed to solve larger assembly line balancing problems. Data from an automotive OEM are used to assess the performance of both the IP and CP models. Using the OEM data, we show that the CP model outperforms the IP model on larger problems. A sensitivity analysis is conducted to assess the cost of enforcing some of the constraints on computational complexity, and the extent of violations of these constraints once they are disabled. Results show that some of the constraints are helpful in reducing the computation time; specifically, the assignment constraints in which decision variables are fixed or bounded result in a smaller search space. Finally, since the line balance for a mixed-model line is based on average task durations, we propose a mixed-model sequencing model that minimizes the number of overload situations that might occur due to variability in task times by providing an optimal production sequence. We consider the skip policy to manage overload situations and allow interactions between stations via worker swimming. An IP model formulation is proposed, and a GRASP heuristic is developed to solve the problem. Data from the literature are used to assess the performance of the developed heuristic and to show the benefit of swimming in reducing work overload situations.
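
    For orientation, a simplified core of this kind of line-balancing integer program, in LaTeX; it keeps only single-assignment, cycle-time, and precedence constraints (the parallel-worker, zoning, and ergonomic constraints of the dissertation are omitted), and the notation is introduced here for illustration:

        \min\ \sum_{k} y_k
        \text{s.t.}\quad \sum_{k} x_{ik} = 1 \quad \forall i,
        \qquad \sum_{i} t_i\, x_{ik} \le c\, y_k \quad \forall k,
        \qquad \sum_{k} k\, x_{ik} \le \sum_{k} k\, x_{jk} \quad \forall (i,j) \in P,
        \qquad x_{ik},\ y_k \in \{0,1\}.

    Here x_{ik} = 1 if task i is assigned to station k, y_k = 1 if station k is opened (a stand-in for staffing it with a worker), t_i is the mixed-model average duration of task i, c is the cycle time, and P is the set of precedence pairs; minimizing the sum of the y_k minimizes the number of open stations.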

    Parallelizing RRT on large-scale distributed-memory architectures

    This paper addresses the problem of parallelizing the Rapidly-exploring Random Tree (RRT) algorithm on large-scale distributed-memory architectures, using the Message Passing Interface (MPI). We compare three parallel versions of RRT based on classical parallelization schemes. We evaluate them on different motion planning problems and analyze the various factors influencing their performance.
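
    For orientation, a minimal sequential 2-D RRT sketch in Python -- the baseline that such MPI versions parallelise; collision checking is omitted and all parameters (step size, goal bias, tolerance) are illustrative:

        import math, random

        def rrt(start, goal, n_iters=2000, step=0.05, goal_tol=0.05):
            """Grow a Rapidly-exploring Random Tree from start toward goal
            in the unit square. Returns the tree as a {child: parent} map."""
            tree = {start: None}
            for _ in range(n_iters):
                # Sample a random configuration (goal-biased 5% of the time).
                q = goal if random.random() < 0.05 else (random.random(), random.random())
                # Find the nearest node already in the tree (linear scan).
                near = min(tree, key=lambda v: math.dist(v, q))
                d = math.dist(near, q)
                if d == 0:
                    continue
                # Steer from the nearest node toward the sample by one step.
                new = (near[0] + step * (q[0] - near[0]) / d,
                       near[1] + step * (q[1] - near[1]) / d)
                tree[new] = near  # a real planner would collision-check here
                if math.dist(new, goal) < goal_tol:
                    tree[goal] = new  # goal reached; recover path via parents
                    return tree
            return tree

        tree = rrt((0.1, 0.1), (0.9, 0.9))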

    Surface-based constraints on target selection and distractor rejection: Evidence from preview search

    In preview search, when an observer ignores an early-appearing set of distractors, detection of new targets that share the colour of this preview can subsequently be impeded. This "negative carry-over effect" has been attributed to an active inhibitory process targeted against the old items and, inadvertently, their features. Here we extend negative carry-over effects to the case of stereoscopically defined surfaces of coplanar elements without common features. In Experiment 1, observers previewed distractors in one surface (1000 ms) before being presented with the target and new distractors divided over the old surface and a new surface either above or below the old one. Participants were slower and less efficient at detecting targets in the old surface. In Experiment 2, in both the first and second displays, the items were divided over two planes in a 66/33% proportion, such that no new planes appeared following the preview and there was no majority of items in any one plane in the final combined display. The results showed that participants were slower to detect the target when it occurred in the old majority surface. Experiment 3 held constant the 2-D properties of the stimuli while varying the presence of binocular depth cues. The carry-over effect occurred only in the presence of binocular depth cues, ruling out any account of the results in terms of 2-D cues. The results suggest that well-formed surfaces, in addition to simple features, may be targets for inhibition in search.

    Where’s the evidence? A systematic review of economic analyses of residential aged care infrastructure

    Background: Residential care infrastructure, in terms of the characteristics of the organisation (such as proprietary status, size, and location) and the physical environment, has been found to directly influence resident outcomes. This review aimed to summarise the existing literature on economic evaluations of residential care infrastructure. Methods: A systematic review of English-language articles using AgeLine, CINAHL, Econlit, Informit (databases in Health; Business and Law; Social Sciences), Medline, ProQuest, Scopus, and Web of Science, with retrieval up to 14 December 2015. The search strategy combined terms relating to nursing homes, economics, and older people. Full economic evaluations, partial economic evaluations, and randomised trials reporting more limited economic information, such as estimates of resource use or costs of interventions, were included. Data were extracted using predefined data fields and synthesized in a narrative summary to address the stated review objective. Results: Fourteen studies containing an economic component were identified. None of the identified studies attempted to systematically link costs and outcomes in the form of a cost-benefit, cost-effectiveness, or cost-utility analysis. There was wide variation in the approaches taken to valuing the outcomes associated with different residential care infrastructures: 8 studies utilized various clinical outcomes as proxies for the quality of care provided, and 2 focused on resident outcomes including agitation, quality of life, and the quality of care interactions. Only 2 studies included residents living with dementia. Conclusions: Robust economic evidence is needed to inform aged care facility design. Future research should focus on identifying appropriate and meaningful outcome measures that can be used at a service-planning level, as well as the broader health benefits and cost-saving potential of different organisational and environmental characteristics in residential care.

    Search algorithms as a framework for the optimization of drug combinations

    Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but at present they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms, originally developed for digital communication, modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of age-related declines in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs using only one third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for the selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6-9 interventions in 80-90% of tests, compared with 15-30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort, and suggests a general strategy for its solution.
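
    A hedged sketch of the general idea in Python -- a simple stochastic local search over discrete dose combinations, standing in for the authors' communication-theory-derived algorithms, with a synthetic scoring function in place of a biological assay:

        import random

        def dose_search(score, n_drugs=6, n_doses=3, iters=200, seed=0):
            """Stochastic local search over dose vectors (one dose level per
            drug). `score` maps a dose tuple to a measured response; in the
            paper's setting this would be an experimental assay."""
            rng = random.Random(seed)
            best = tuple(rng.randrange(n_doses) for _ in range(n_drugs))
            best_s = score(best)
            for _ in range(iters):
                # Perturb one drug's dose level (a single-coordinate move).
                cand = list(best)
                cand[rng.randrange(n_drugs)] = rng.randrange(n_doses)
                cand = tuple(cand)
                s = score(cand)
                if s > best_s:  # keep the move only if the response improves
                    best, best_s = cand, s
            return best, best_s

        # Purely illustrative response surface peaked at a hidden optimum.
        optimum = (2, 0, 1, 1, 0, 2)
        toy_score = lambda d: -sum((a - b) ** 2 for a, b in zip(d, optimum))
        print(dose_search(toy_score))  # typically reaches the optimum using
                                       # far fewer than 3**6 = 729 assays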

    Snowmass CF1 Summary: WIMP Dark Matter Direct Detection

    As part of the Snowmass process, the Cosmic Frontier WIMP Direct Detection subgroup (CF1) has drawn on input from the Cosmic Frontier and the broader Particle Physics community to produce this document. The charge to CF1 was (a) to summarize the current status and projected sensitivity of WIMP direct detection experiments worldwide, (b) to motivate WIMP dark matter searches over a broad parameter space by examining a spectrum of WIMP models, (c) to establish a community consensus on the type of experimental program required to explore that parameter space, and (d) to identify the common infrastructure required to practically meet those goals.

    Target absent trials in configural contextual cuing

    In contextual cueing (CC), reaction times to find targets in repeated displays are faster than in displays that have never been seen before. This has been demonstrated using target-distractor configurations, global background colors, naturalistic scenes, and the co-variation of the target with distractors. The majority of CC studies have used displays in which the target is always present. This paper investigates what happens when the target is sometimes absent. Experiment 1 shows that, although configural CC occurs in displays where the target is always present, there is no CC when the target is always absent. Experiment 2 shows that there is no CC when the same spatial layout can be both target-present and target-absent on different trials. The presence of distractors in locations that contain targets on other trials appears to interfere with CC and even disrupts the expression of previously learned contexts (Experiments 3-5). The results show that it is the target-distractor associations that are important in producing CC and, consistent with a response-selection account, changing the response type from an orientation task to a detection task removes the CC effect.