Computer classification of linear codes
We present algorithms for classification of linear codes over finite fields,
based on canonical augmentation and on lattice point enumeration. We apply
these algorithms to obtain classification results over fields with 2, 3 and 4
elements. We validate a correct implementation of the algorithms with known
classification results from the literature, which we partially extend to larger
ranges of parameters.
Comment: 18 pages, 9 tables; this paper is a merge and extension of
arXiv:1907.10363 and arXiv:1912.0935
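As a minimal illustration of the kind of computation underlying such classifications (not the paper's canonical-augmentation or lattice-point-enumeration algorithms), the sketch below computes the minimum distance of a small binary linear code by enumerating all codewords; invariants like this are used to distinguish inequivalent codes.

```python
# Hedged sketch: minimum distance of a binary linear code by brute-force
# codeword enumeration. Feasible only for small dimension k, since the
# code has 2^k codewords.
from itertools import product

def min_distance(generator_rows):
    """Minimum Hamming weight over all nonzero codewords of the binary
    code spanned by generator_rows (lists of 0/1 of equal length)."""
    k = len(generator_rows)
    n = len(generator_rows[0])
    best = n
    for coeffs in product([0, 1], repeat=k):
        if not any(coeffs):
            continue  # skip the all-zero codeword
        word = [0] * n
        for c, row in zip(coeffs, generator_rows):
            if c:
                word = [(w + r) % 2 for w, r in zip(word, row)]
        best = min(best, sum(word))
    return best

# Generator matrix of the [7,4,3] binary Hamming code
G = [
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]
print(min_distance(G))  # 3
```

Real classification software replaces this exponential enumeration with pruned searches such as the lattice point enumeration described above.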
LinCode - computer classification of linear codes
We present an algorithm for the classification of linear codes over finite
fields, based on lattice point enumeration. We validate a correct
implementation of our algorithm with known classification results from the
literature, which we partially extend to larger ranges of parameters.
Comment: 12 pages, 5 tables
Optimal binary linear codes of dimension at most seven
We classify optimal [n,k,d] binary linear codes of dimension ⩽7, with one exception, where by optimal we mean that no [n−1,k,d], [n+1,k+1,d], or [n+1,k,d+1] code exists. In particular, we present (new) classification results for codes with parameters [40,7,18], [43,7,20], [59,7,28], [75,7,36], [79,7,38], [82,7,40], [87,7,42], and [90,7,44]. These classifications are accomplished with the aid of the first author's computer program Extension for extending from residual codes, and the second author's program Split.
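A standard feasibility check in searches for optimal codes like those above is the classical Griesmer bound, a lower bound on the length n of an [n, k, d] code. The sketch below is textbook material, not the authors' Extension or Split programs.

```python
# Hedged sketch: the Griesmer bound n >= sum_{i=0}^{k-1} ceil(d / q^i)
# for an [n, k, d] linear code over GF(q). Parameters violating the
# bound can be discarded without any search.
from math import ceil

def griesmer_bound(k, d, q=2):
    """Smallest length n permitted by the Griesmer bound for an [n,k,d]_q code."""
    return sum(ceil(d / q**i) for i in range(k))

print(griesmer_bound(7, 18))  # 39, so no [38, 7, 18] binary code exists
```

For k = 7 and d = 18 the bound gives n ⩾ 39, so the classified [40,7,18] codes sit one unit above it; closing such gaps is exactly where exhaustive classification is needed.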
A generalization of the cylinder conjecture for divisible codes
We extend the original cylinder conjecture on point sets in affine
three-dimensional space to the more general framework of divisible linear codes
over finite fields and their classification. Through a mix of linear
programming, combinatorial techniques and computer enumeration, we investigate
the structural properties of these codes. In this way, we can prove a reduction
theorem for a generalization of the cylinder conjecture, show some instances
where it does not hold, and prove its validity for small field sizes. In
particular, we correct a flawed proof of the original cylinder conjecture for
one field size and present the first proof for another.
Comment: 16 pages
Exemplar codes for facial attributes and tattoo recognition
When implementing real-world computer vision systems, researchers can use mid-level representations as a tool to adjust the trade-off between accuracy and efficiency. Unfortunately, existing mid-level representations that improve accuracy tend to decrease efficiency, or are specifically tailored to work well within one pipeline or vision problem at the exclusion of others. We introduce a novel, efficient mid-level representation that improves classification efficiency without sacrificing accuracy. Our Exemplar Codes are based on linear classifiers and probability normalization from extreme value theory. We apply Exemplar Codes to two problems: facial attribute extraction and tattoo classification. In these settings, our Exemplar Codes are competitive with the state of the art and offer efficiency benefits, making it possible to achieve high accuracy even on commodity hardware with a low computational budget.
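The probability normalization mentioned above can be sketched in the spirit of extreme value theory: fit a Weibull distribution to the extreme tail of a classifier's non-match scores and use its CDF to map raw scores to [0, 1]. This is a hedged illustration of the general EVT recipe, not the authors' implementation; the synthetic scores and the tail size of 50 are assumptions.

```python
# Hedged sketch: EVT-based score normalization. A Weibull distribution
# is fit to the largest non-match scores of a (here synthetic) linear
# classifier; the Weibull CDF then turns a raw score into a
# probability-like confidence value.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
non_match_scores = rng.normal(loc=-1.0, scale=0.5, size=1000)  # synthetic

# Fit the Weibull to the extreme tail of the non-match distribution.
tail = np.sort(non_match_scores)[-50:]
shift = tail.min() - 1e-6          # Weibull support starts at 0
shape, loc, scale = weibull_min.fit(tail - shift, floc=0)

def normalize(raw_score):
    """Probability-like confidence that raw_score exceeds the non-match tail."""
    return weibull_min.cdf(raw_score - shift, shape, loc=loc, scale=scale)
```

Because the CDF is monotone, normalization preserves the ranking of scores while giving them a calibrated probabilistic reading.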
Semi-Supervised Sparse Coding
Sparse coding approximates the data sample as a sparse linear combination of
some basic codewords and uses the sparse codes as new representations. In this
paper, we investigate learning discriminative sparse codes by sparse coding in
a semi-supervised manner, where only a few training samples are labeled. By
using the manifold structure spanned by the data set of both labeled and
unlabeled samples and the constraints provided by the labels of the labeled
samples, we learn the variable class labels for all the samples. Furthermore,
to improve the discriminative ability of the learned sparse codes, we assume
that the class labels could be predicted from the sparse codes directly using a
linear classifier. By solving the codebook, sparse codes, class labels and
classifier parameters simultaneously in a unified objective function, we
develop a semi-supervised sparse coding algorithm. Experiments on two
real-world pattern recognition problems demonstrate the advantage of the
proposed method over supervised sparse coding methods on partially labeled
data sets.
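The sparse-coding building block of the approach above can be sketched as follows (this is the generic inference step with a fixed codebook solved by ISTA, not the paper's full semi-supervised algorithm, which also learns the codebook, labels, and classifier jointly).

```python
# Hedged sketch: given a fixed codebook D, infer a sparse code s for a
# sample x by solving  min_s 0.5*||x - D s||^2 + lam*||s||_1  with
# ISTA (iterative shrinkage-thresholding).
import numpy as np

def ista_sparse_code(D, x, lam=0.1, n_iter=500):
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ s - x)                  # gradient of the quadratic term
        z = s - g / L                          # gradient step
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return s

rng = np.random.default_rng(0)
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)                 # unit-norm codewords
true = np.zeros(50)
true[[3, 17]] = [1.0, -0.5]                    # 2-sparse ground-truth code
x = D @ true
s = ista_sparse_code(D, x)                     # s is sparse, approximating `true`
```

The semi-supervised extension described in the abstract couples this inference with a manifold regularizer over labeled and unlabeled samples and a linear classifier on the codes, optimized in one objective.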