Benchmarking quantum co-processors in an application-centric, hardware-agnostic and scalable way
Existing protocols for benchmarking current quantum co-processors fail to
meet the usual standards for assessing the performance of
High-Performance-Computing platforms. After a concise review of these
protocols -- whether at the gate, circuit or application level -- we introduce
a new benchmark, dubbed Atos Q-score (TM), that is application-centric,
hardware-agnostic and scalable to quantum advantage processor sizes and beyond.
The Q-score measures the maximum number of qubits that can be used effectively
to solve the MaxCut combinatorial optimization problem with the Quantum
Approximate Optimization Algorithm. We give a robust definition of the notion
of effective performance by introducing an improved approximation ratio based
on the scaling of random and optimal algorithms. We illustrate the behavior of
Q-score using perfect and noisy simulations of quantum processors. Finally, we
provide an open-source implementation of Q-score that makes it easy to compute
the Q-score of any quantum hardware.
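The abstract's "improved approximation ratio" rescales a solver's cut value between a random baseline and the optimum. The sketch below illustrates that kind of rescaling; the function names and the exact normalization are illustrative assumptions, not the paper's definition.

```python
def random_cut_value(edges):
    """Expected cut value of a uniformly random partition.

    A random assignment cuts each edge with probability 1/2,
    so for unweighted MaxCut the expectation is simply |E| / 2."""
    return len(edges) / 2

def beta_score(solver_value, random_value, optimal_value):
    """Rescaled approximation ratio: 0 for the random baseline, 1 at the
    optimum. The Q-score's ratio is in this spirit (random vs. optimal
    scaling); this particular formula is an assumption for illustration."""
    return (solver_value - random_value) / (optimal_value - random_value)

# Toy example: a 4-cycle has 4 edges, maximum cut 4, random baseline 2.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
rnd = random_cut_value(edges)        # 2.0
print(beta_score(3.0, rnd, 4.0))     # a cut of value 3 scores 0.5
```

A plain ratio solver/optimum would overstate performance, since even a random cut already achieves half the edges; subtracting the random baseline is what makes the score informative.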
On the minimum bisection of random regular graphs
In this paper we give new asymptotically almost sure lower and upper bounds
on the bisection width of random regular graphs. The main contribution is a
new lower bound on the bisection width of , based on a first moment
method together with a structural decomposition of the graph, thereby improving
a 27-year-old result of Kostochka and Melnikov. We also give a complementary
upper bound of , combining known spectral ideas with original
combinatorial insights. Developing this approach further, with the help of
Monte Carlo simulations, we obtain a non-rigorous upper bound of .
(Comment: 48 pages, 20 figures)
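The abstract mentions Monte Carlo simulations for the non-rigorous bound. A minimal sketch of that style of experiment, assuming a configuration-model sampler and a simple pair-swap local search (the paper's actual procedure is not specified in the abstract):

```python
import random

def random_regular_graph(n, d, seed=0):
    """Configuration-model sample of a d-regular multigraph on n vertices
    (n*d must be even); loops and multi-edges are kept for simplicity."""
    rng = random.Random(seed)
    stubs = [v for v in range(n) for _ in range(d)]
    rng.shuffle(stubs)
    return [(stubs[i], stubs[i + 1]) for i in range(0, len(stubs), 2)]

def bisection_width(edges, side):
    """Number of edges crossing the bisection given by side[v] in {0, 1}."""
    return sum(1 for u, v in edges if side[u] != side[v])

def local_search_bisection(n, edges, sweeps=200, seed=0):
    """Greedy pair-swap search over balanced bisections: a hypothetical
    stand-in for the paper's Monte Carlo procedure."""
    rng = random.Random(seed)
    side = [i % 2 for i in range(n)]          # balanced starting bisection
    best = bisection_width(edges, side)
    for _ in range(sweeps):
        u = rng.choice([v for v in range(n) if side[v] == 0])
        w = rng.choice([v for v in range(n) if side[v] == 1])
        side[u], side[w] = 1, 0               # swap keeps the sides balanced
        cand = bisection_width(edges, side)
        if cand <= best:
            best = cand
        else:
            side[u], side[w] = 0, 1           # revert a non-improving swap
    return best

edges = random_regular_graph(100, 3)
print(local_search_bisection(100, edges))     # exact value varies with seed
```

Repeating this over many samples and fitting the average width against n*d is the usual way such simulations yield a (non-rigorous) constant in front of n.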
MAX CUT in Weighted Random Intersection Graphs and Discrepancy of Sparse Random Set Systems
Let be a set of vertices, a set of labels, and let
be an matrix of independent Bernoulli random
variables with success probability . A random instance
of the weighted random intersection graph model
is constructed by drawing an edge with weight
between any two vertices for which this weight is larger than 0. In this
paper we give an average-case analysis of Weighted Max Cut, assuming the
input is a weighted random intersection graph, i.e. given
we wish to find a partition of into two
sets so that the total weight of the edges having one endpoint in each set is
maximized. We initially prove concentration of the weight of a maximum cut of
around its expected value, and then show that,
when the number of labels is much smaller than the number of vertices, a random
partition of the vertices achieves asymptotically optimal cut weight with high
probability (whp). Furthermore, in the case and constant average degree,
we show that whp, a majority type algorithm outputs a cut with weight that is
larger than the weight of a random cut by a multiplicative constant strictly
larger than 1. Then, we highlight a connection between the computational
problem of finding a weighted maximum cut in
and the problem of finding a 2-coloring with minimum discrepancy for a set
system with incidence matrix . We exploit this connection
by proposing a (weak) bipartization algorithm for the case that, when it terminates, its output can be used to find
a 2-coloring with minimum discrepancy in . Finally, we prove that, whp
this 2-coloring corresponds to a bipartition with maximum cut-weight in
. (Comment: 18 pages)
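To make the model concrete, here is a small sketch of a weighted random intersection graph and one plausible reading of a majority-type cut rule (each label gets a random side, each vertex follows the majority of its labels). The sampler and the rule are illustrative assumptions; the paper's algorithm may differ.

```python
import random

def sample_label_matrix(n, m, p, seed=0):
    """n x m Bernoulli(p) label matrix R; the weight of edge {u, v} is the
    number of labels the two vertices share (inner product of rows)."""
    rng = random.Random(seed)
    return [[1 if rng.random() < p else 0 for _ in range(m)]
            for _ in range(n)]

def cut_weight(R, side):
    """Total weight of edges with one endpoint on each side."""
    n, w = len(R), 0
    for u in range(n):
        for v in range(u + 1, n):
            if side[u] != side[v]:
                w += sum(a * b for a, b in zip(R[u], R[v]))
    return w

def majority_cut(R, seed=0):
    """Hypothetical majority-type rule: assign each label a random side,
    then place each vertex on the side favored by most of its labels."""
    rng = random.Random(seed)
    m = len(R[0])
    label_side = [rng.choice([0, 1]) for _ in range(m)]
    side = []
    for row in R:
        votes = sum(1 if label_side[j] else -1 for j in range(m) if row[j])
        side.append(1 if votes > 0 else 0)
    return side

R = sample_label_matrix(n=30, m=5, p=0.2)
random_side = [i % 2 for i in range(30)]
print(cut_weight(R, majority_cut(R)), cut_weight(R, random_side))
```

The intuition the rule exploits is that when there are few labels, vertices sharing a label should sit together relative to the label's side, so the heavy edges induced by each label tend not to be wasted inside one part.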
On the max-cut of sparse random graphs
We consider the problem of estimating the size of a maximum cut (Max-Cut problem) in a random Erdős-Rényi graph on n nodes and ⌊cn⌋ edges. It is shown in Coppersmith et al. that the size of the maximum cut in this graph, normalized by the number of nodes, belongs to the asymptotic region [c/2 + 0.37613√c, c/2 + 0.58870√c] with high probability (w.h.p.) as n increases, for all sufficiently large c. The upper bound was obtained by application of the first moment method, and the lower bound was obtained by algorithmically constructing a cut which achieves the stated value. In this paper, we improve both the upper and lower bounds by introducing a novel bounding technique. Specifically, we establish that the size of the maximum cut normalized by the number of nodes belongs to the interval [c/2 + 0.47523√c, c/2 + 0.55909√c] w.h.p. as n increases, for all sufficiently large c. Instead of considering the expected number of cuts achieving a particular value, as is done in the application of the first moment method, we observe that every maximum-size cut satisfies a certain local optimality property, and we compute the expected number of cuts with a given value satisfying this local optimality property. Estimating this expectation amounts to solving a rather involved two-dimensional large deviations problem. We solve this underlying large deviations problem asymptotically as c increases and use it to obtain an improved upper bound on the Max-Cut value. The lower bound is obtained by application of the second moment method, coupled with the same local optimality constraint, and is shown to work up to the stated lower bound value c/2 + 0.47523√c. It is worth noting that both bounds are stronger than the ones obtained by standard first and second moment methods. Finally, we also obtain an improved lower bound of (Formula presented.) on the Max-Cut for the random cubic graph, or any cubic graph with large girth, improving the previous best bound of 1.33773n.
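The local optimality property used above is easy to state: in a maximum cut, every vertex has at least half of its neighbors on the opposite side (otherwise flipping it would strictly increase the cut). The sketch below, with an assumed G(n, ⌊cn⌋) sampler, flips violating vertices until that property holds; any cut it produces contains at least half of all edges.

```python
import random

def gnm_random_graph(n, c, seed=0):
    """Random graph on n nodes with floor(c * n) edges, as in the abstract."""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < int(c * n):
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            edges.add((min(u, v), max(u, v)))
    return list(edges)

def locally_optimal_cut(n, edges, seed=0):
    """Flip vertices until every vertex has at least half of its neighbors
    across the cut; each flip strictly increases the cut, so this stops."""
    rng = random.Random(seed)
    side = [rng.choice([0, 1]) for _ in range(n)]
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    improved = True
    while improved:
        improved = False
        for v in range(n):
            cross = sum(1 for u in adj[v] if side[u] != side[v])
            if 2 * cross < len(adj[v]):   # more same-side than cross neighbors
                side[v] ^= 1              # flipping strictly gains cut edges
                improved = True
    return sum(1 for u, v in edges if side[u] != side[v])

n, c = 200, 4.0
cut = locally_optimal_cut(n, gnm_random_graph(n, c))
print(cut, c * n / 2)   # locally optimal cut vs. the c*n/2 random baseline
```

Counting only cuts with this property, rather than all cuts of a given size, is what sharpens the first and second moment bounds in the paper: the moment computations range over a much smaller set of candidate cuts.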