
    Fourier analysis of frequency domain discrete event simulation experiments

    Frequency Domain Experiments (FDEs) were first used in discrete-event simulation to perform system parameter sensitivity analysis for factor screening in stochastic system simulations. FDEs are based on the intuitive assertion that if one or more system parameters are oscillated at fixed frequencies throughout a simulation run, then oscillations at the same frequencies will be induced in the system's response. Spectral (Fourier) analysis of these induced oscillations is then used to characterize and analyze the system. Since their introduction 12 years ago, significant work has been done to extend the applicability of FDEs to regression analysis, simulation optimization and gradient estimation. Two fundamental theoretical and data-analysis FDE problems remain, however; both are addressed in this dissertation.

    To perform an FDE Fourier analysis, a sampled data sequence of response observations is used; i.e., the selected system response is sampled using a suitable oscillation (sampling) index. The choice of an appropriate oscillation index is an open problem in the literature known as the FDE indexing problem. This dissertation presents a solution to the FDE indexing problem. Specifically, an FDE Fourier data analysis algorithm is developed which uses the simulation clock as the oscillation index. This algorithm is based on the well-established theory of counting (Poisson) processes. The algorithm is implemented and tested on a variety of systems, including several networks of nonstationary M/G/1 queues.

    To justify the use of Fourier methods, a basic FDE model assumption is that if a particular system response statistic is sensitive to a system parameter, then sinusoidal variation of that system parameter at a fixed frequency will induce similar sinusoidal variation in the response statistic at the same frequency. There is, however, a lack of theoretical support for this model assumption. This dissertation provides some of that theoretical support: the FDE Fourier data analysis algorithm developed in this dissertation is used to analyze the frequency response of an M/M/1 queuing system. An equation is derived which accurately characterizes the extent to which the departure process from an M/M/1 queuing system can be modeled as an amplitude-modulated, phase-shifted version of the oscillated arrival process.
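
    As a rough illustration of the FDE assertion described above (not the dissertation's clock-indexed algorithm), the sketch below oscillates a hypothetical parameter sinusoidally, observes a synthetic response with an assumed linear sensitivity, and checks the response spectrum for power at the driving frequency. The names `theta`, `drive_freq` and the response model are illustrative assumptions only.
```python
# Minimal sketch of the basic FDE idea: oscillate a parameter, look for the
# induced oscillation in the response spectrum.
import numpy as np

rng = np.random.default_rng(0)

n_obs = 4000                      # number of response observations
drive_freq = 0.05                 # oscillation frequency (cycles per observation)
k = np.arange(n_obs)              # oscillation (sampling) index

theta0, amp = 1.0, 0.2            # nominal parameter value and oscillation amplitude
theta = theta0 + amp * np.sin(2 * np.pi * drive_freq * k)

# Stand-in response: an assumed sensitivity to theta plus noise.  In a real FDE
# this would be a statistic collected from the discrete-event simulation.
response = 3.0 * theta + rng.normal(scale=0.5, size=n_obs)

# Spectral (Fourier) analysis of the induced oscillations.
spectrum = np.abs(np.fft.rfft(response - response.mean())) ** 2
freqs = np.fft.rfftfreq(n_obs)

peak = freqs[np.argmax(spectrum)]
print(f"dominant response frequency ~ {peak:.3f} (drive was {drive_freq})")
```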

    The uncertain representation ranking framework for concept-based video retrieval

    Concept-based video retrieval often relies on imperfect and uncertain concept detectors. We propose a general ranking framework to define effective and robust ranking functions by explicitly addressing detector uncertainty. It can cope with multiple concept-based representations per video segment, and it allows the re-use of effective text retrieval functions which are defined on similar representations. The final ranking status value is a weighted combination of two components: the expected value of the possible scores, which represents the risk-neutral choice, and the scores' standard deviation, which represents the risk or opportunity that the score for the actual representation is higher. The framework consistently improves search performance in the shot retrieval task and the segment retrieval task over several baselines in five TRECVid collections and two collections which use simulated detectors of varying performance.
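
    A minimal sketch of the ranking idea described above, under assumed names: each segment has several possible concept-based representations, each with a detector-derived probability and a text-retrieval-style score, and the ranking status value combines the expected score with the scores' standard deviation through a risk parameter `b`. This is an illustration, not the paper's implementation.
```python
import math

def ranking_status_value(scored_reps, b=1.0):
    """scored_reps: list of (probability, score) pairs for one video segment."""
    total_p = sum(p for p, _ in scored_reps)
    probs = [p / total_p for p, _ in scored_reps]            # normalise probabilities
    expected = sum(p * s for p, (_, s) in zip(probs, scored_reps))
    variance = sum(p * (s - expected) ** 2 for p, (_, s) in zip(probs, scored_reps))
    # Risk-neutral expectation plus a weighted standard-deviation term.
    return expected + b * math.sqrt(variance)

# Example: two candidate representations of one shot.
print(ranking_status_value([(0.7, 2.1), (0.3, 3.4)], b=0.5))
```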

    A survey on the use of relevance feedback for information access systems

    Users of online search engines often find it difficult to express their need for information in the form of a query. However, if users can identify examples of the kind of documents they require, they can employ a technique known as relevance feedback. Relevance feedback covers a range of techniques intended to improve a user's query and facilitate retrieval of information relevant to the user's information need. In this paper we survey relevance feedback techniques. We study both automatic techniques, in which the system modifies the user's query, and interactive techniques, in which the user has control over query modification. We also consider specific interfaces to relevance feedback systems and characteristics of searchers that can affect the use and success of relevance feedback systems.
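
    As one concrete example of automatic query modification of the kind such a survey covers, the sketch below uses a Rocchio-style update over term-weight vectors; the parameter values and vectors are illustrative assumptions, not results from the paper.
```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """query: 1-D term-weight vector; relevant/nonrelevant: lists of document vectors."""
    q = alpha * query
    if relevant:
        q = q + beta * np.mean(relevant, axis=0)       # move towards relevant docs
    if nonrelevant:
        q = q - gamma * np.mean(nonrelevant, axis=0)   # move away from nonrelevant docs
    return np.clip(q, 0.0, None)                       # negative weights are usually dropped

# Tiny example over a 4-term vocabulary.
q0 = np.array([1.0, 0.0, 0.5, 0.0])
rel = [np.array([0.9, 0.8, 0.0, 0.0])]
nonrel = [np.array([0.0, 0.0, 0.0, 1.0])]
print(rocchio(q0, rel, nonrel))
```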

    Fluid-Structure Interaction Simulation of a Coriolis Mass Flowmeter using a Lattice Boltzmann Method

    In this paper we use a fluid-structure interaction (FSI) approach to simulate a Coriolis mass flowmeter (CMF). The fluid dynamics are calculated by the open-source framework OpenLB, based on the lattice Boltzmann method (LBM). For the structural dynamics we employ the open-source software Elmer, an implementation of the finite element method (FEM). A staggered coupling approach between the two software packages is presented. The finite element mesh is created by the mesh generator Gmsh to ensure a completely open-source workflow. The eigenmodes of the CMF, which are calculated by modal analysis, are compared with measurement data. Using the estimated excitation frequency, a fully coupled, partitioned FSI simulation is applied to simulate the phase shift of the investigated CMF design. The calculated phase-shift values are in good agreement with the measurement data and verify the suitability of the model to numerically describe the working principle of a CMF.
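
    The sketch below shows, without tying itself to the OpenLB or Elmer APIs, one way the phase shift between two pickup signals of a Coriolis tube could be extracted from time series that a coupled simulation might produce; the sampling rate, drive frequency and synthetic signals are assumptions for illustration.
```python
import numpy as np

fs = 10_000.0                         # assumed sampling rate in Hz
f_drive = 250.0                       # assumed excitation frequency in Hz
t = np.arange(0, 0.2, 1.0 / fs)       # 50 full drive periods

true_shift = np.deg2rad(2.0)          # synthetic phase shift between the pickups
inlet = np.sin(2 * np.pi * f_drive * t)
outlet = np.sin(2 * np.pi * f_drive * t - true_shift)

# Project both signals onto the drive frequency and compare their phases.
basis = np.exp(-2j * np.pi * f_drive * t)
a_in = np.sum(inlet * basis)
a_out = np.sum(outlet * basis)
delta = np.angle(a_out * np.conj(a_in))   # phase of outlet relative to inlet

print(f"outlet lags inlet by {np.rad2deg(-delta):.2f} deg (synthetic truth: 2.00 deg)")
```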

    Porqpine: a peer-to-peer search engine

    In this paper, we present a fully distributed and collaborative search engine for web pages: Porqpine. This system uses a novel query-based model and collaborative filtering techniques in order to obtain user-customized results. All knowledge about users and profiles is stored in each user node's application. Overall, the system is a multi-agent system that runs on the computers of the user community. The nodes interact in a peer-to-peer fashion in order to create a truly distributed search engine where information is completely distributed among all the nodes in the network. Moreover, the system preserves the privacy of user queries and results by maintaining the anonymity of the queries' consumers and the results' producers. The knowledge required by the system to work is captured implicitly by monitoring users' actions, not only within the system's interface but also within one of the most popular web browsers. Thus, users are not required to explicitly feed knowledge about their interests into the system, since this process is done automatically. In this manner, users obtain the benefits of a personalized search engine just by installing the application on their computer. Porqpine is not intended to replace conventional centralized search engines entirely, but to complement them by issuing more accurate and personalized results.
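
    As a highly simplified, hypothetical sketch of peer-to-peer query propagation of the general kind a distributed search engine could use (this is not Porqpine's actual protocol; the class, field names and TTL scheme are invented for illustration), each peer answers from its local knowledge and forwards the query to neighbours with a bounded hop count.
```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Peer:
    name: str
    local_index: dict[str, float]                     # term -> locally learned relevance
    neighbours: list["Peer"] = field(default_factory=list)

    def search(self, term: str, ttl: int = 2, seen: set[str] | None = None) -> dict[str, float]:
        """Answer from the local index, then forward to neighbours while ttl > 0."""
        seen = seen if seen is not None else set()
        seen.add(self.name)
        results = {}
        if term in self.local_index:
            results[self.name] = self.local_index[term]
        if ttl > 0:
            for peer in self.neighbours:
                if peer.name not in seen:
                    results.update(peer.search(term, ttl - 1, seen))
        return results

a, b, c = Peer("a", {"lattice": 0.9}), Peer("b", {}), Peer("c", {"lattice": 0.4})
a.neighbours, b.neighbours = [b], [c]
print(a.search("lattice"))    # {'a': 0.9, 'c': 0.4}
```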

    MADmap: A Massively Parallel Maximum-Likelihood Cosmic Microwave Background Map-Maker

    MADmap is a software application used to produce maximum-likelihood images of the sky from time-ordered data which include correlated noise, such as those gathered by Cosmic Microwave Background (CMB) experiments. It works efficiently on platforms ranging from small workstations to the most massively parallel supercomputers. Map-making is a critical step in the analysis of all CMB data sets, and the maximum-likelihood approach is the most accurate and widely applicable algorithm; however, it is a computationally challenging task. This challenge will only increase with the next generation of ground-based, balloon-borne and satellite CMB polarization experiments. The faintness of the B-mode signal that these experiments seek to measure requires them to gather enormous data sets. MADmap is already being run on up to O(10^{11}) time samples, O(10^8) pixels and O(10^4) cores, with ongoing work to scale to the next generation of data sets and supercomputers. We describe MADmap's algorithm, which is based around a preconditioned conjugate gradient solver, fast Fourier transforms and sparse matrix operations. We highlight MADmap's ability to address problems typically encountered in the analysis of realistic CMB data sets and describe its application to simulations of the Planck and EBEX experiments. The massively parallel and distributed implementation is detailed, and scaling complexities are given for the resources required. MADmap is capable of analysing the largest data sets now being collected on computing resources currently available, and we argue that, given Moore's Law, MADmap will be capable of reducing the most massive projected data sets.
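
    The toy sketch below solves the standard maximum-likelihood map-making system (P^T N^{-1} P) m = P^T N^{-1} d that codes of this kind iterate on, here with white noise and SciPy's plain conjugate-gradient solver standing in for a real preconditioned solver; the sizes and names are illustrative and do not reflect MADmap's interface.
```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import cg

rng = np.random.default_rng(1)
n_samples, n_pixels = 20_000, 50

# Pointing matrix P: each time sample hits exactly one sky pixel.
hit_pix = rng.integers(0, n_pixels, size=n_samples)
P = csr_matrix((np.ones(n_samples), (np.arange(n_samples), hit_pix)),
               shape=(n_samples, n_pixels))

sky = rng.normal(size=n_pixels)                          # "true" map
tod = P @ sky + rng.normal(scale=0.1, size=n_samples)    # time-ordered data d

# With white noise, N^-1 is proportional to the identity, so the normal
# equations reduce to (P^T P) m = P^T d, solved iteratively.
A = (P.T @ P).tocsr()
b = P.T @ tod
m, info = cg(A, b)

print("CG converged:", info == 0,
      " rms map error:", np.sqrt(np.mean((m - sky) ** 2)))
```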