Dependence of ground state energy of classical n-vector spins on n
We study the ground state energy E_G(n) of N classical n-vector spins with
the Hamiltonian H = - \sum_{i>j} J_ij S_i . S_j, where S_i and S_j are n-vectors
and the coupling constants J_ij are arbitrary. We prove that E_G(n) is
independent of n for all n > n_{max}(N) = floor((sqrt(8N+1)-1) / 2) . We show
that this bound is the best possible. We also derive an upper bound for E_G(m)
in terms of E_G(n), for m<n. We obtain an upper bound on the frustration in the
system, as measured by F(n), which is defined to be (\sum_{i>j} |J_ij| +
E_G(n)) / (\sum_{i>j} |J_ij|). We describe a procedure for constructing a set
of J_ij's such that an arbitrary given state, {S_i}, is the ground state.
Comment: 6 pages, 2 figures, submitted to Physical Review
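The two closed-form quantities in this abstract — the energy of a spin configuration under H and the dimension bound n_max(N) — can be written down directly. A minimal sketch (function names are ours, not the paper's):

```python
import math
import numpy as np

def n_max(N):
    # n_max(N) = floor((sqrt(8N+1)-1)/2): for spin dimension n > n_max(N),
    # the ground state energy E_G(n) no longer depends on n.
    return math.floor((math.sqrt(8 * N + 1) - 1) / 2)

def energy(J, S):
    # H = -sum_{i>j} J_ij S_i . S_j for N unit n-vector spins,
    # S given as an (N x n) array, J as an (N x N) coupling matrix.
    N = S.shape[0]
    E = 0.0
    for i in range(N):
        for j in range(i):
            E -= J[i, j] * (S[i] @ S[j])
    return E
```

For example, two parallel spins with J_10 = 1 give energy -1, the ferromagnetic minimum for that single bond.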
Maximizing Algebraic Connectivity of Constrained Graphs in Adversarial Environments
This paper aims to maximize algebraic connectivity of networks via topology
design under the presence of constraints and an adversary. We are concerned
with three problems. First, we formulate the topology design problem of
adding edges to an initial graph as a concave maximization, which introduces a
nonconvex binary decision variable, subject to general convex constraints on
the feasible edge set. Unlike previous methods, ours is not greedy and can
accommodate these additional constraints. We also study a scenario in which a
coordinator must selectively
protect edges of the network from a chance of failure due to a physical
disturbance or adversarial attack. The coordinator needs to strategically
respond to the adversary's action without presupposed knowledge of the
adversary's feasible attack actions. We propose three heuristic algorithms for
the coordinator to accomplish the objective and identify worst-case preventive
solutions. Each algorithm is shown to be effective in simulation, and we
discuss their relative performance.
Comment: 8 pages, submitted to European Control Conference 201
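The objective being maximized here, the algebraic connectivity, is the second-smallest eigenvalue of the graph Laplacian. A sketch of how to evaluate it (ours, not the paper's code):

```python
import numpy as np

def algebraic_connectivity(adj):
    # Algebraic connectivity (Fiedler value) = second-smallest eigenvalue
    # of the graph Laplacian L = D - A, where D is the degree matrix.
    adj = np.asarray(adj, dtype=float)
    L = np.diag(adj.sum(axis=1)) - adj
    return np.linalg.eigvalsh(L)[1]
```

Adding edges can only increase this value: a 3-vertex path has connectivity 1, while closing it into a triangle raises it to 3, which is why edge-addition topology design targets this quantity.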
Polynomiality for Bin Packing with a Constant Number of Item Types
We consider the bin packing problem with d different item sizes s_i and item
multiplicities a_i, where all numbers are given in binary encoding. This
problem formulation is also known as the 1-dimensional cutting stock problem.
In this work, we provide an algorithm which, for constant d, solves bin
packing in polynomial time. This was an open problem for all d >= 3.
In fact, for constant d our algorithm solves the following problem in
polynomial time: given two d-dimensional polytopes P and Q, find the smallest
number of integer points in P whose sum lies in Q.
Our approach also applies to high multiplicity scheduling problems in which
the number of copies of each job type is given in binary encoding and each type
comes with certain parameters such as release dates, processing times and
deadlines. We show that a variety of high multiplicity scheduling problems can
be solved in polynomial time if the number of job types is constant.
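To make the problem statement concrete: with d item sizes s_i, multiplicities a_i, and a bin capacity, one can enumerate the feasible single-bin "configurations" and search for the fewest that cover all items. This tiny brute-force reference (ours) is exponential and only illustrates the problem, not the paper's polynomial-time algorithm:

```python
from functools import lru_cache
from itertools import product

def min_bins(sizes, mults, capacity):
    # A configuration is a count vector (one count per item type)
    # whose total size fits in one bin.
    ranges = [range(m + 1) for m in mults]
    configs = [c for c in product(*ranges)
               if any(c) and sum(s * k for s, k in zip(sizes, c)) <= capacity]

    @lru_cache(maxsize=None)
    def solve(rem):
        # Fewest bins needed to pack the remaining multiplicities `rem`.
        if not any(rem):
            return 0
        best = float("inf")
        for c in configs:
            if all(k <= r for k, r in zip(c, rem)):
                best = min(best, 1 + solve(tuple(r - k for r, k in zip(rem, c))))
        return best

    return solve(tuple(mults))
```

For instance, sizes (3, 5) with multiplicities (2, 1) and capacity 8 pack into 2 bins: {3, 5} and {3}.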
Symmetric Submodular Function Minimization Under Hereditary Family Constraints
We present an efficient algorithm to find non-empty minimizers of a symmetric
submodular function over any family of sets closed under inclusion. This
includes, for example, families defined by a cardinality constraint, a knapsack
constraint, a matroid independence constraint, or any combination of such
constraints. Our algorithm makes O(n^3) oracle calls to the submodular
function, where n is the cardinality of the ground set. In contrast, the
problem of minimizing a general submodular function under a cardinality
constraint is known to be inapproximable within o(sqrt(n / ln n)) (Svitkina
and Fleischer [2008]).
The algorithm is similar to an algorithm of Nagamochi and Ibaraki [1998] to
find all nontrivial inclusionwise minimal minimizers of a symmetric submodular
function over a set of cardinality n using O(n^3) oracle calls. Their
procedure in turn is based on Queyranne's algorithm [1998] to minimize a
symmetric submodular function.
Comment: 13 pages, Submitted to SODA 201
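The canonical symmetric submodular function is a graph cut, f(S) = f(V \ S), and a cardinality bound |S| <= k is the simplest hereditary (inclusion-closed) family. This brute-force sketch (ours) only illustrates the objective being minimized; it enumerates exponentially many sets, whereas the paper's algorithm uses polynomially many oracle calls:

```python
from itertools import combinations

def cut_value(edges, S):
    # f(S) = number of edges crossing the cut (S, V \ S).
    # Cut functions are symmetric submodular: f(S) = f(V \ S).
    S = set(S)
    return sum(1 for u, v in edges if (u in S) != (v in S))

def min_cut_under_cardinality(edges, ground, k):
    # Nonempty minimizer over the hereditary family {S : 1 <= |S| <= k},
    # found by full enumeration (illustration only).
    best, best_S = float("inf"), None
    for r in range(1, k + 1):
        for S in combinations(ground, r):
            v = cut_value(edges, S)
            if v < best:
                best, best_S = v, set(S)
    return best, best_S
```

On a triangle {0,1,2} with a pendant vertex 3 attached to 2, the minimizer under k = 2 is the singleton {3} with cut value 1.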
Community Detection in Hypergraphs, Spiked Tensor Models, and Sum-of-Squares
We study the problem of community detection in hypergraphs under a stochastic
block model. Similarly to how the stochastic block model in graphs suggests
studying spiked random matrices, our model motivates investigating statistical
and computational limits of exact recovery in a certain spiked tensor model. In
contrast with the matrix case, the spiked model naturally arising from
community detection in hypergraphs is different from the one arising in the
so-called tensor Principal Component Analysis model. We investigate the
effectiveness of algorithms in the Sum-of-Squares hierarchy on these models.
Interestingly, our results suggest that these two apparently similar models
exhibit significantly different computational-to-statistical gaps.
Comment: In proceedings of 2017 International Conference on Sampling Theory
and Applications (SampTA)
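As a concrete picture of the setting, here is a toy 3-uniform hypergraph stochastic block model with two equal communities; the parameterization (p_in for within-community hyperedges, p_out otherwise) is ours for illustration and is not necessarily the paper's exact model:

```python
import random
from itertools import combinations

def sample_hsbm(n, p_in, p_out, seed=0):
    # Two equal communities; a hyperedge on {i, j, k} appears with
    # probability p_in if all three vertices share a community,
    # and with probability p_out otherwise.
    rng = random.Random(seed)
    label = [0] * (n // 2) + [1] * (n - n // 2)
    edges = []
    for i, j, k in combinations(range(n), 3):
        p = p_in if label[i] == label[j] == label[k] else p_out
        if rng.random() < p:
            edges.append((i, j, k))
    return label, edges
```

Exact recovery asks whether the labels can be reconstructed (up to swapping the two communities) from the sampled hyperedges alone.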
Validation and determination of a reference interval for Canine HbA1c using an immunoturbidimetric assay
Background:
Hemoglobin A1c (HbA1c) provides a reliable measure of glycemic control over 2–3 months in human diabetes mellitus. In dogs, the presence of HbA1c has been demonstrated, but there are no validated commercial assays.
Objective:
The purpose of the study was to validate a commercially available automated immunoturbidimetric assay for canine HbA1c and determine an RI in a hospital population.
Methods:
The specificity of the assay was assessed by inducing glycosylation in vitro using isolated canine hemoglobin, repeatability by measuring canine samples 5 times in succession, long term inter-assay imprecision by measuring supplied control materials, stability using samples stored at 4°C over 5 days and −20°C over 8 weeks, linearity by mixing samples of known HbA1c in differing proportions, and the effect of anticoagulants with paired samples. An RI was determined using EDTA-anticoagulated blood samples from 60 nondiabetic hospitalized animals of various ages and breeds. Hemoglobin A1c was also measured in 10 diabetic dogs.
Results:
The concentration of HbA1c increased proportionally with glucose concentration in vitro. For repeat measurements, the CV was 4.08% (range 1.16–6.10%). Samples were stable for 5 days at 4°C. The assay was linear within the assessed range. Heparin- and EDTA-anticoagulated blood provided comparable results. The RI for HbA1c was 9–18.5 mmol/mol. There was no apparent effect of age or breed on HbA1c. In diabetic dogs, HbA1c ranged from 14 to 48 mmol/mol.
Conclusions:
The assay provides a reliable method for canine HbA1c measurement with good analytic performance.
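The two summary statistics reported above reduce to simple formulas: the repeatability CV is 100 x SD / mean over replicate measurements, and the RI check is a range test against 9–18.5 mmol/mol. A sketch (function names are ours; thresholds are from the Results):

```python
import statistics

def coefficient_of_variation(values):
    # Repeatability CV% = 100 * sample SD / mean of replicate
    # measurements (the study reports an overall CV of 4.08%).
    return 100 * statistics.stdev(values) / statistics.mean(values)

def within_reference_interval(hba1c, low=9.0, high=18.5):
    # RI for canine HbA1c from this study: 9–18.5 mmol/mol.
    return low <= hba1c <= high
```

All 10 diabetic dogs in the study (14–48 mmol/mol) would span both sides of the upper RI limit under this check.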