Sampling Arborescences in Parallel
We study the problem of sampling a uniformly random directed rooted spanning tree, also known as an arborescence, from a possibly weighted directed graph. Classically, this problem has long been known to be polynomial-time solvable; the exact number of arborescences can be computed by a determinant [Tutte, 1948], and sampling can be reduced to counting [Jerrum et al., 1986; Jerrum and Sinclair, 1996]. However, the classic reduction from sampling to counting seems to be inherently sequential. This raises the question of designing efficient parallel algorithms for sampling. We show that sampling arborescences can be done in RNC.
For several well-studied combinatorial structures, counting can be reduced to the computation of a determinant, which is known to be in NC [Csanky, 1975]. These include arborescences, planar graph perfect matchings, Eulerian tours in digraphs, and determinantal point processes. However, not much is known about efficient parallel sampling of these structures. Our work is a step towards resolving this mystery.
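The counting side mentioned above can be made concrete. By Tutte's directed matrix-tree theorem, the (weighted) number of arborescences oriented toward a chosen root equals the determinant of a Laplacian minor. The following is a minimal sketch, not from the paper; the function name, the adjacency convention, and the toward-the-root orientation are illustrative choices:

```python
import numpy as np

def count_arborescences(n, edges, root):
    """Directed matrix-tree theorem (Tutte): the weighted count of spanning
    arborescences with every edge oriented toward `root` equals the
    determinant of the out-degree Laplacian with the root's row and column
    deleted.  `edges` is a list of (u, v) or (u, v, weight) directed edges."""
    L = np.zeros((n, n))
    for e in edges:
        u, v = e[0], e[1]
        w = e[2] if len(e) > 2 else 1.0
        L[u, u] += w          # weighted out-degree on the diagonal
        L[u, v] -= w          # minus the (weighted) adjacency
    keep = [i for i in range(n) if i != root]
    return float(np.linalg.det(L[np.ix_(keep, keep)]))
```

For an unweighted digraph the returned determinant is an integer count; e.g., the complete digraph on 3 vertices has 3 arborescences toward any fixed root.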
MODELS FOR GREENFIELD AND INCREMENTAL CELLULAR NETWORK PLANNING
Mobility, as provided in cellular networks, is largely determined by the location of the base stations, which in turn depends on the quantity of base stations available to provide coverage. It is therefore not surprising that the quantity and resulting placement of base stations not only impact service delivery but also carry a large implementation cost. Generally, the more base stations required to provide coverage, the greater the cost of implementing and operating the radio network. This thesis proposes a modified optimization model to aid the cell planning process. This model, unlike those surveyed, is applicable to both greenfield and incremental network designs. The variation in model design is fundamental to ensuring cost-effective growth and expansion of cellular networks. Numerical studies of the modified model applied to both abstract and real system configurations are carried out using MATLAB. Terrain data from Kampala, Uganda, was used to aid the study. Results show that antenna height significantly determines the solution of the objective function. In addition, it is shown that slight variations in the cost relation between antenna height and site construction requirements can be used decisively for predefined, targeted network planning. A comparison is also made between an actual network installation and the estimates provided by the model. As expected, results from the study show that the difference between the estimated count and the actual count can be adequately minimized by slight variations in antenna height requirements.
Quadratic Speedups in Parallel Sampling from Determinantal Distributions
We study the problem of parallelizing sampling from distributions related to determinants: symmetric, nonsymmetric, and partition-constrained determinantal point processes, as well as planar perfect matchings. For these distributions, the partition function, a.k.a. the count, can be obtained via matrix determinants, a highly parallelizable computation; Csanky proved it is in NC. However, parallel counting does not automatically translate to parallel sampling, as classic reductions between the two are inherently sequential. We show that a nearly quadratic parallel speedup over sequential sampling can be achieved for all the aforementioned distributions. If the distribution is supported on subsets of size of a ground set, we show how to approximately produce a sample in time with polynomially many processors for any constant. In the two special cases of symmetric determinantal point processes and planar perfect matchings, our bound improves to and we show how to sample exactly in these cases.
As our main technical contribution, we fully characterize the limits of batching for the steps of sampling-to-counting reductions. We observe that only steps can be batched together if we strive for exact sampling, even in the case of nonsymmetric determinantal point processes. However, we show that for approximate sampling, steps can be batched together, for any entropically independent distribution, which includes all mentioned classes of determinantal point processes. Entropic independence and related notions have been the source of breakthroughs in Markov chain analysis in recent years, so we expect our framework to prove useful for distributions beyond those studied in this work.
Comment: 33 pages, SPAA 202
Contribution to resource management in cellular access networks with limited backhaul capacity
The radio interface of mobile communication systems is usually regarded as the only capacity bottleneck in the radio access network. However, as newer and more efficient radio interfaces are deployed, and as data and multimedia traffic keeps growing, there is increasing concern that the transport (backhaul) infrastructure of the cellular network may become the bottleneck in some scenarios. In this context, the thesis focuses on developing resource management techniques that jointly consider resource management on the radio interface and in the backhaul. This leads to a new paradigm in which backhaul resources are considered not only at the dimensioning stage but are also included in the resource management problem.
On this basis, the first objective of the thesis is to assess the capacity requirements of radio access networks that use IP as the transport technology, in line with recent network architecture trends. In particular, we analyze the impact of an IP-based transport solution on the transport capacity needed to meet the quality-of-service requirements of the access network. The evaluation is carried out in the context of the UMTS radio access network, where a detailed characterization of the Iub interface is provided. The capacity requirement analysis covers two different scenarios: dedicated channels and high-speed channels. Subsequently, in order to fully exploit the resources available in the radio access and the backhaul, this thesis proposes a joint resource management framework whose main idea is to incorporate transport network metrics into the resource management problem. To assess the benefits of the proposed framework, the thesis focuses on the base-station assignment problem as a strategy for distributing traffic among base stations according to the load levels on both the radio interface and the backhaul. This problem is first analyzed for a generic radio access network by defining an analytical model based on Markov chains, which makes it possible to compute the capacity gain achievable by the proposed base-station assignment strategy. The analysis is then extended to specific radio access technologies. In particular, in the context of WCDMA networks, a base-station assignment algorithm based on simulated annealing is developed, whose objective is to maximize a utility function reflecting how well the assignments satisfy the radio and transport resources. Finally, the thesis addresses the design and evaluation of a base-station assignment algorithm for future OFDMA-based broadband systems. In this case, the base-station assignment problem is modeled as an optimization problem using a framework of utility functions and resource cost functions. The resulting problem, which assumes resource constraints on both the radio interface and the backhaul, is mapped to a known optimization problem, the Multiple-Choice Multidimensional Knapsack Problem (MMKP). A heuristic base-station assignment algorithm is then developed and evaluated against assignment schemes based exclusively on radio criteria. The proposed algorithm relies on Lagrange multipliers and is designed to simultaneously exploit load balancing on the radio interface and the backhaul.
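To make the base-station assignment step concrete: an MMKP with Lagrange multipliers can be attacked by pricing the per-resource capacity constraints, letting each user independently pick its best base station under the penalized utility, and adjusting prices by a subgradient step. The following toy sketch is illustrative only; the instance layout, function name, and step size are assumptions, not the thesis algorithm:

```python
def mmkp_lagrangian(utility, cost, capacity, iters=200, step=0.1):
    """Lagrangian-relaxation heuristic for the Multiple-Choice
    Multidimensional Knapsack Problem.

    utility[g][i]  : utility of picking item i in group g (one item per group)
    cost[g][i][r]  : consumption of resource r by that item
    capacity[r]    : available amount of resource r
    Returns (choice, total_utility) for the best feasible selection found,
    or (None, 0.0) if none was encountered."""
    G, R = len(utility), len(capacity)
    lam = [0.0] * R          # Lagrange multipliers (prices) on the resources
    best = (None, 0.0)
    for _ in range(iters):
        # each group independently picks the item with the best penalized utility
        choice = [max(range(len(utility[g])),
                      key=lambda i: utility[g][i]
                                    - sum(lam[r] * cost[g][i][r] for r in range(R)))
                  for g in range(G)]
        used = [sum(cost[g][choice[g]][r] for g in range(G)) for r in range(R)]
        if all(used[r] <= capacity[r] for r in range(R)):
            val = sum(utility[g][choice[g]] for g in range(G))
            if val > best[1]:
                best = (choice, val)
        # subgradient step: raise prices on overloaded resources
        lam = [max(0.0, lam[r] + step * (used[r] - capacity[r])) for r in range(R)]
    return best
```

In a radio/backhaul reading, each group is a user, each item a candidate base station, and the resource vector collects radio and backhaul consumption; the rising prices steer users away from congested cells.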
Graph Sparsification by Edge-Connectivity and Random Spanning Trees
We present new approaches to constructing graph sparsifiers: weighted subgraphs for which every cut has the same value as the original graph, up to a factor of . Our first approach independently samples each edge with probability inversely proportional to the edge-connectivity between and . The fact that this approach produces a sparsifier resolves a question posed by Benczúr and Karger (2002). Concurrent work of Hariharan and Panigrahi also resolves this question. Our second approach constructs a sparsifier by forming the union of several uniformly random spanning trees. Both of our approaches produce sparsifiers with edges. Our proofs are based on extensions of Karger's contraction algorithm, which may be of independent interest.
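The second approach needs uniformly random spanning trees. One classical way to sample them is Wilson's loop-erased random-walk algorithm; the sketch below is illustrative for unweighted graphs (the function name and adjacency-list format are assumptions, not from the paper):

```python
import random

def wilson_spanning_tree(adj, root=0, rng=random):
    """Sample a uniformly random spanning tree of a connected undirected
    graph via Wilson's algorithm.  adj[v] lists the neighbors of vertex v;
    returns parent[v] for every vertex (parent[root] is None)."""
    n = len(adj)
    in_tree = [False] * n
    parent = [None] * n
    nxt = [None] * n
    in_tree[root] = True
    for start in range(n):
        # random walk until the current tree is hit; revisiting a vertex
        # overwrites nxt[.], which performs the loop erasure implicitly
        u = start
        while not in_tree[u]:
            nxt[u] = rng.choice(adj[u])
            u = nxt[u]
        # attach the loop-erased path from `start` to the tree
        u = start
        while not in_tree[u]:
            in_tree[u] = True
            parent[u] = nxt[u]
            u = parent[u]
    return parent
```

A sparsifier candidate in the spirit of the abstract is then the appropriately reweighted union of several independent samples.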
Techniques for computing with low-independence randomness
Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1990. Includes bibliographical references (p. 105-110). By John Taylor Rompel, Ph.D.
Hadwiger Integration of Definable Functions
This thesis defines and classifies valuations on definable functionals. The intrinsic volumes are valuations on tame subsets of R^n, and by easy extension, valuations on functionals on R^n with finitely many level sets, each a tame subset of R^n. We extend these valuations, which we call Hadwiger integrals, to definable functionals on R^n, and present some important properties of the valuations. With the appropriate topologies on the set of definable functionals, we obtain dual classification theorems for general valuations on such functionals. We also explore integral transforms, convergence results, and applications of the Hadwiger integrals.
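For orientation, the central object can be written down explicitly. Following the standard definition in the Euler-calculus literature (the notation here may differ from the thesis itself), the k-th lower Hadwiger integral of a definable function h on R^n is built from the intrinsic volumes mu_k of its level sets:

```latex
\[
  \int_{\mathbb{R}^n} h \,\lfloor d\mu_k \rfloor
  \;=\; \int_0^\infty \Bigl( \mu_k\bigl(\{h \ge s\}\bigr)
        - \mu_k\bigl(\{h \le -s\}\bigr) \Bigr)\, ds .
\]
```

Taking mu_0 = chi (the Euler characteristic) recovers Euler integration, while mu_n recovers the Lebesgue integral, so the Hadwiger integrals interpolate between the two.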
Inductive biases and metaknowledge representations for search-based optimization
"What I do not understand, I can still create."- H. Sayama
The following work follows closely the aforementioned bonmot. Guided by questions such as: ``How can evolutionary processes exhibit learning behavior and consolidate knowledge?´´, ``What are cognitive models of problem-solving?´´ and ``How can we harness these altogether as computational techniques?´´, we clarify within this work essentials required to implement them for metaheuristic search and optimization.We therefore look into existing models of computational problem-solvers and compare these with existing methodology in literature. Particularly, we find that the meta-learning model, which frames problem-solving in terms of domain-specific inductive biases and the arbitration thereof through means of high-level abstractions resolves outstanding issues with methodology proposed within the literature. Noteworthy, it can be also related to ongoing research on algorithm selection and configuration frameworks. We therefore look in what it means to implement such a model by first identifying inductive biases in terms of algorithm components and modeling these with density estimation techniques. And secondly, propose methodology to process metadata generated by optimization algorithms in an automated manner through means of deep pattern recognition architectures for spatio-temporal feature extraction. At last we look into an exemplary shape optimization problem which allows us to gain insight into what it means to apply our methodology to application scenarios. We end our work with a discussion on future possible directions to explore and discuss the limitations of such frameworks for system deployment