
    Online Bin Covering: Expectations vs. Guarantees

    Bin covering is the dual of classic bin packing: the goal is to cover as many bins as possible, where covering a bin means packing items of total size at least one into it. For online bin covering, competitive analysis fails to distinguish between most algorithms of interest; all "reasonable" algorithms have a competitive ratio of 1/2. To get a better understanding of the combinatorial difficulties of the problem, we therefore turn to other performance measures, namely relative worst order, random order, and max/max analysis, as well as analyzing input with restricted or uniformly distributed item sizes. In this way, our study also supplements the ongoing systematic studies of the relative strengths of various performance measures. Two classic algorithms for online bin packing with natural dual versions are Harmonic and Next-Fit. Even though the algorithms are quite different in nature, their dual versions are not separated by competitive analysis. We make the case that when guarantees are needed, even under restricted input sequences, dual Harmonic is preferable. In addition, we establish quite robust theoretical results showing that if items come from a uniform distribution, or even if just the ordering of items is uniformly random, then dual Next-Fit is the right choice.
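    To make one of the two dual algorithms concrete, here is a minimal Python sketch of dual Next-Fit, the greedy strategy of filling a single open bin until it is covered. This is an illustrative reconstruction from the standard definition, not code from the paper.

```python
def dual_next_fit(items):
    """Dual Next-Fit for online bin covering: pour each arriving item
    into the single open bin; once the bin's total size reaches 1 it
    counts as covered, and a fresh bin is opened."""
    covered = 0
    level = 0.0
    for size in items:
        level += size
        if level >= 1.0:
            covered += 1
            level = 0.0
    return covered

# Six items of size 0.6: each pair covers a bin, so three bins are covered.
print(dual_next_fit([0.6] * 6))  # -> 3
```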

    Weakly Equivalent Arrays

    The (extensional) theory of arrays is widely used to model systems; hence, efficient decision procedures are needed to model-check such systems. Current decision procedures for the theory of arrays saturate the read-over-write and extensionality axioms originally proposed by McCarthy. Various filters are used to limit the number of axiom instantiations while preserving completeness. We present an algorithm that lazily instantiates lemmas based on weak equivalence classes. These lemmas are easier to interpolate as they only contain existing terms. We formally define weak equivalence and show correctness of the resulting decision procedure.
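    As a rough illustration of the underlying idea (a sketch, not the paper's implementation), weak equivalence can be tracked with a union-find structure: a store term differs from its base array at a single index, so arrays connected through store edges agree on all but finitely many indices. The term encoding below is hypothetical.

```python
class WeakEquivalence:
    """Union-find over array terms; arrays linked by a store edge differ
    in at most one index, so members of a class are weakly equivalent."""

    def __init__(self):
        self.parent = {}

    def find(self, a):
        self.parent.setdefault(a, a)
        while self.parent[a] != a:
            self.parent[a] = self.parent[self.parent[a]]  # path halving
            a = self.parent[a]
        return a

    def add_store(self, array, store_term):
        # store(array, i, v) and array differ only at index i.
        self.parent[self.find(store_term)] = self.find(array)

    def weakly_equivalent(self, a, b):
        return self.find(a) == self.find(b)

we = WeakEquivalence()
we.add_store("a", "store(a,i,v)")
print(we.weakly_equivalent("store(a,i,v)", "a"))  # -> True
```

    Lemma instantiation can then be restricted to select terms over arrays in the same class, which is what keeps the generated lemmas over existing terms only.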

    No Eigenvalue in Finite Quantum Electrodynamics

    We re-examine Quantum Electrodynamics (QED) with a massless electron as a finite quantum field theory, as advocated by Gell-Mann-Low, Baker-Johnson, Adler, Jackiw and others. We analyze the Dyson-Schwinger equation satisfied by the massless electron in finite QED and conclude that the theory admits no nontrivial eigenvalue for the fine structure constant.
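    For context, the "eigenvalue" in question is the standard condition of the finite-QED program: the bare fine-structure constant must sit at a nontrivial zero of the Gell-Mann-Low function. A sketch of that condition in LaTeX (the symbol psi follows common usage and is an assumption, not necessarily the paper's notation):

```latex
% Eigenvalue condition of the finite-QED program: the bare fine-structure
% constant \alpha_0 must be a nontrivial zero of the Gell-Mann--Low
% function \psi (notation assumed, following common usage).
\[
  \psi(\alpha_0) = 0, \qquad \alpha_0 \neq 0,
\]
% which the Dyson--Schwinger analysis of the abstract argues is ruled out.
```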

    Simplicial Quantum Gravity on a Computer

    We describe a method of Monte-Carlo simulations of simplicial quantum gravity coupled to matter fields. We concentrate mainly on the problem of efficiently implementing the random, dynamical triangulation and on building a detailed-balance condition into the elementary transformations of the triangulation. We propose a method of auto-tuning the parameters needed to balance simulations of the canonical ensemble. This method allows us to prepare a whole set of jobs and is therefore very useful in systematically determining the phase diagram in the two-dimensional coupling space. It is of particular importance when the jobs are run on a parallel machine.
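    The detailed-balance requirement mentioned above is the usual Markov-chain condition pi(A) P(A->B) = pi(B) P(B->A) with pi proportional to exp(-S). A generic Metropolis-Hastings acceptance step satisfying it might look as follows; this is an illustrative sketch, not the authors' update, with the proposal asymmetry between triangulations entering through the move counts.

```python
import math
import random

def metropolis_accept(delta_action, n_moves_from, n_moves_to):
    """Accept a proposed triangulation move so that detailed balance
    pi(A) P(A->B) = pi(B) P(B->A) holds for pi ~ exp(-S).
    delta_action is S(B) - S(A); n_moves_from / n_moves_to count the
    elementary moves available in the current / proposed triangulation,
    correcting for the asymmetric proposal probabilities."""
    ratio = (n_moves_from / n_moves_to) * math.exp(-delta_action)
    return random.random() < min(1.0, ratio)
```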

    Comparative Analysis of the K-Means and Fuzzy C-Means Algorithms (Case Study: Information Systems Thesis Topics)

    The performance of each algorithm is very important, as is the selection of a thesis topic by final-year students. Clustering groups data without predefined class labels and can be used to label data whose class is not yet known. The method used is CRISP-DM, which proceeds through business understanding, data understanding, data preparation, modeling, evaluation, and deployment. The algorithms used to form the clusters are K-Means and Fuzzy C-Means, both non-hierarchical clustering methods. RapidMiner 7.0 is used to perform the clustering; the attributes used are academic year, sex, and thesis topic. The result of this research is a time-based efficiency comparison, which serves as feedback for choosing an algorithm in further case studies.
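    As a rough, self-contained sketch of such a timing comparison (the study itself uses RapidMiner 7.0; the data below is synthetic and the Fuzzy C-Means implementation is a textbook version, not RapidMiner's):

```python
import time
import numpy as np
from sklearn.cluster import KMeans

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-Means: a soft membership matrix U replaces
    K-Means' hard assignments."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                    # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted centroids
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)      # standard FCM update
    return centers, U

X = np.random.default_rng(1).random((500, 3))  # synthetic stand-in data

t0 = time.perf_counter()
KMeans(n_clusters=3, n_init=10).fit(X)
t1 = time.perf_counter()
fuzzy_c_means(X, c=3)
t2 = time.perf_counter()
print(f"K-Means: {t1 - t0:.3f}s, Fuzzy C-Means: {t2 - t1:.3f}s")
```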

    3D tomography of cells in micro-channels

    We combine confocal imaging, microfluidics and image analysis to record 3D images of cells in flow. This enables us to recover the full 3D representation of several hundred living cells per minute. Whereas 3D confocal imaging has thus far been limited to static specimens, we overcome this restriction and present a method to access the 3D shape of moving objects. The key to our approach is a tilted arrangement of the micro-channel with respect to the focal plane of the microscope. This forces cells to traverse the focal plane in an inclined manner. As a consequence, individual layers of passing cells are recorded, which can then be assembled to obtain the volumetric representation. The full 3D information allows for a detailed comparison with theoretical and numerical predictions that is infeasible with, e.g., 2D imaging. Our technique is exemplified by studying flowing red blood cells in a micro-channel reflecting the conditions prevailing in the microvasculature. We observe two very different types of shapes: `croissants' and `slippers'. Additionally, we perform 3D numerical simulations of our experiment to confirm the observations. Since 3D confocal imaging of cells in flow has not yet been realized, we see high potential in the field of flow cytometry, where cell classification thus far mostly relies on 1D scattering and fluorescence signals.
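    A toy sketch of the assembly step, under the simplifying assumptions of a constant flow speed and a known per-frame displacement (both hypothetical parameters, not values from the paper):

```python
import numpy as np

def assemble_volume(frames, shift_px_per_frame):
    """Stack successive confocal frames of a cell crossing the tilted
    focal plane into a 3D volume, undoing the in-plane displacement
    accumulated between consecutive frames (assumed constant)."""
    volume = np.zeros((len(frames),) + frames[0].shape)
    for z, frame in enumerate(frames):
        shift = int(round(z * shift_px_per_frame))
        # np.roll wraps at the border, which is acceptable for this toy
        # example; a real pipeline would crop or pad instead.
        volume[z] = np.roll(frame, -shift, axis=1)  # realign along flow axis
    return volume
```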