Giant dipole resonance with exact treatment of thermal fluctuations
The shape fluctuations due to thermal effects in giant dipole resonance
(GDR) observables are calculated using exact free energies evaluated at
fixed spin and temperature. The results are compared with Landau-theory
calculations based on a parameterized free energy. Landau theory is found to
be insufficient when shell effects dominate.
Comment: 5 pages, 2 figures
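The averaging described above is, in the standard treatment, a Boltzmann-weighted integral of the observable over deformation space, ⟨O⟩ = ∫ O e^{-F/T} dτ / ∫ e^{-F/T} dτ. A minimal numerical sketch, assuming a purely hypothetical harmonic free-energy surface F(β) = aβ² and the conventional β⁴ volume element (the paper itself uses exact free energies at fixed spin and temperature, not this toy form):

```python
import math

def thermal_average(observable, free_energy, T, beta_max=2.0, n=2000):
    """Boltzmann-weighted average of an observable over shape fluctuations,
    using the beta^4 deformation-space volume element and midpoint-rule
    integration over the quadrupole deformation beta."""
    dbeta = beta_max / n
    num = den = 0.0
    for i in range(n):
        beta = (i + 0.5) * dbeta
        w = beta**4 * math.exp(-free_energy(beta) / T)  # volume element * weight
        num += observable(beta) * w
        den += w
    return num / den

# Hypothetical harmonic free-energy surface; the stiffness is arbitrary.
F = lambda beta: 50.0 * beta**2

mean_beta_low = thermal_average(lambda b: b, F, T=1.0)
mean_beta_high = thermal_average(lambda b: b, F, T=2.0)
```

Raising T broadens the weight and shifts the average deformation upward, which is the mechanism behind the growth of the GDR width with temperature.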
Local Graph Coloring and Index Coding
We present a novel upper bound for the optimal index coding rate. Our bound
uses a graph theoretic quantity called the local chromatic number. We show how
a good local coloring can be used to create a good index code. The local
coloring is used as an alignment guide to assign index coding vectors from a
general position MDS code. We further show that a natural LP relaxation yields
an even stronger index code. Our bounds provably outperform the state of the
art on index coding, though by at most a constant factor.
Comment: 14 pages, 3 figures; a conference version submitted to ISIT 2013;
typos corrected
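The local chromatic number used above is the minimum, over all proper colorings, of the largest number of distinct colors appearing in any closed neighborhood. A brute-force sketch on a toy undirected 5-cycle (the paper works with directed side-information graphs; the function name and instance here are illustrative only):

```python
from itertools import product

def local_chromatic_number(n, edges):
    """Minimize, over all proper colorings, the maximum number of
    distinct colors seen in any closed neighborhood N[v]."""
    adj = {v: {v} for v in range(n)}  # closed neighborhoods
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    best = n
    for coloring in product(range(n), repeat=n):
        if any(coloring[u] == coloring[v] for u, v in edges):
            continue  # not a proper coloring
        local = max(len({coloring[w] for w in adj[v]}) for v in range(n))
        best = min(best, local)
    return best

cycle5 = [(i, (i + 1) % 5) for i in range(5)]
lcn = local_chromatic_number(5, cycle5)
print(lcn)  # -> 3: any proper coloring of an odd cycle shows 3 colors somewhere
```

The 5-cycle needs three colors, and no proper coloring can keep every closed neighborhood down to two, so the brute force returns 3.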
Graph Theory versus Minimum Rank for Index Coding
We obtain novel index coding schemes and show that they provably outperform
all previously known graph theoretic bounds. Further, we establish a rather
strong negative result: all known graph theoretic bounds are within a
logarithmic factor of the chromatic number. This is in striking contrast to
minrank, since prior work has shown that minrank can outperform the chromatic
number by a polynomial factor in some cases. The conclusion is that no known
graph theoretic bound is much stronger than the chromatic number.
Comment: 8 pages, 2 figures. Submitted to ISIT 201
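Minrank, the benchmark above, is the minimum GF(2)-rank of a matrix that fits the side-information graph: nonzero diagonal, zeros on non-edges, free entries on edges. A brute-force sketch for the pentagon, the classic example with minrank 3 (all names here are illustrative):

```python
from itertools import product

def gf2_rank(rows):
    """Rank of a binary matrix over GF(2); rows given as integer bitmasks."""
    basis = {}  # leading-bit position -> reduced row
    for row in rows:
        while row:
            top = row.bit_length() - 1
            if top not in basis:
                basis[top] = row
                break
            row ^= basis[top]  # eliminate the leading bit
    return len(basis)

def minrank_gf2(n, edges):
    """Brute-force minrank: minimum GF(2)-rank over matrices with 1s on
    the diagonal, free entries on edges, and 0s on non-edges."""
    free = [(i, j) for i in range(n) for j in range(n)
            if i != j and ((i, j) in edges or (j, i) in edges)]
    best = n
    for bits in product((0, 1), repeat=len(free)):
        rows = [1 << i for i in range(n)]  # start from the identity
        for (i, j), bit in zip(free, bits):
            if bit:
                rows[i] |= 1 << j
        best = min(best, gf2_rank(rows))
    return best

pentagon = {(i, (i + 1) % 5) for i in range(5)}
mr = minrank_gf2(5, pentagon)
print(mr)  # -> 3, the known value for the pentagon over GF(2)
```

The search space is only 2^10 matrices for the pentagon, so exhaustive enumeration is instant; the known answer 3 beats the trivial rank-5 identity fit.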
On Approximating the Sum-Rate for Multiple-Unicasts
We study upper bounds on the sum-rate of multiple-unicasts. We approximate
the Generalized Network Sharing Bound (GNS cut) of the multiple-unicasts
network coding problem with independent sources. Our approximation
algorithm runs in polynomial time and yields an upper bound on the joint
source entropy rate that is within a multiplicative factor of the GNS cut. It
further yields a vector-linear network code that achieves a joint source
entropy rate within a multiplicative factor of the GNS cut, but \emph{not}
with independent sources: the code induces a correlation pattern among the
sources.
Our second contribution is a separation result for vector-linear network
codes: for any given field, there exist networks for which the optimum
sum-rate supported by vector-linear codes over that field with independent
sources is multiplicatively separated from the optimum joint entropy rate
supported by a code that allows correlation between sources. Finally, we
establish a similar separation result for the asymmetric optimum vector-linear
sum-rates achieved over two distinct fields with independent sources,
revealing that the choice of field can heavily impact the performance of a
linear network code.
Comment: 10 pages; a shorter version appeared at ISIT (International Symposium
on Information Theory) 2015; some typos corrected
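That field choice matters for linear codes can be glimpsed already at the level of plain matrix rank. A toy illustration (not the paper's construction): the point-line incidence matrix of the Fano plane has rank 4 over GF(2) but rank 6 over GF(3):

```python
def rank_mod_p(matrix, p):
    """Rank of an integer matrix modulo a prime p, by Gaussian elimination."""
    m = [[x % p for x in row] for row in matrix]
    rank, rows, cols = 0, len(m), len(m[0])
    for col in range(cols):
        piv = next((r for r in range(rank, rows) if m[r][col]), None)
        if piv is None:
            continue
        m[rank], m[piv] = m[piv], m[rank]
        inv = pow(m[rank][col], p - 2, p)  # inverse mod prime p (Fermat)
        m[rank] = [(x * inv) % p for x in m[rank]]
        for r in range(rows):
            if r != rank and m[r][col]:
                f = m[r][col]
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Point-line incidence matrix of the Fano plane (7 points, 7 lines).
lines = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5),
         (1, 4, 6), (2, 3, 6), (2, 4, 5)]
fano = [[1 if p in line else 0 for p in range(7)] for line in lines]

r2, r3 = rank_mod_p(fano, 2), rank_mod_p(fano, 3)
print(r2, r3)  # -> 4 6: the same matrix loses rank only in characteristic 2
```

A linear code built from such a matrix would behave very differently over the two fields, which is the rank-level shadow of the sum-rate separation established in the abstract.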