Odd Paths, Cycles and T-joins: Connections and Algorithms
Minimizing the weight of an edge set satisfying parity constraints is a
challenging branch of combinatorial optimization as witnessed by the binary
hypergraph chapter of Alexander Schrijver's book ``Combinatorial Optimization''
(Chapter 80). This area contains relevant graph theory problems including open
cases of the Max Cut problem and some multiflow problems. We clarify the
interconnections of these problems and establish three levels of difficulty.
On the one hand, we prove that the Shortest Odd Path problem in an undirected
graph without cycles of negative total weight and several related problems are
NP-hard, settling a long-standing open question asked by Lov\'asz (Open Problem
27 in Schrijver's book ``Combinatorial Optimization''). On the other hand, we
provide a polynomial-time algorithm for the closely related and well-studied
Minimum-weight Odd {s,t}-Join problem for non-negative weights, whose
complexity, however, was not known; more generally, we solve the Minimum-weight
Odd T-Join problem in FPT time when parameterized by |T|. If negative
weights are also allowed, then finding a minimum-weight odd {s,t}-join is
equivalent to the Minimum-weight Odd T-Join problem for arbitrary weights,
which is only conjectured to be polynomially solvable. The analogous
problems for digraphs are also considered. Comment: 24 pages, 2 figures
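The T-join terminology above can be made concrete with a brute-force sketch (our own illustration on a hypothetical toy instance, not the paper's algorithm): a T-join is an edge set whose odd-degree vertices are exactly T, and the "odd" variant additionally requires the edge set to have odd cardinality. Note that with T = ∅ the odd variant asks for the lightest odd-size disjoint union of cycles, which is how odd cycles enter the picture.

```python
from itertools import combinations

def min_weight_t_join(vertices, weighted_edges, T, odd=False):
    """Brute-force minimum weight of a T-join: an edge set whose odd-degree
    vertices are exactly T; with odd=True the edge set must also have odd
    cardinality (an "odd T-join"). Exponential -- toy sizes only."""
    best = None
    edges = list(weighted_edges)
    for r in range(len(edges) + 1):
        if odd and r % 2 == 0:
            continue
        for subset in combinations(edges, r):
            degree = {v: 0 for v in vertices}
            for u, v, _ in subset:
                degree[u] += 1
                degree[v] += 1
            if {v for v in vertices if degree[v] % 2} == set(T):
                w = sum(w for _, _, w in subset)
                if best is None or w < best:
                    best = w
    return best

# Triangle with unit weights (hypothetical toy instance).
tri = [("a", "b", 1), ("b", "c", 1), ("a", "c", 1)]
print(min_weight_t_join("abc", tri, {"a", "b"}, odd=True))  # → 1 (edge ab)
print(min_weight_t_join("abc", tri, set(), odd=True))       # → 3 (whole cycle)
```

The second call shows the cycle connection: for T = ∅ no single edge works (it creates two odd-degree vertices), so the cheapest odd-size solution is the full triangle.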
Combinatorial Optimization on Group-Labeled Graphs
Degree type: Doctorate by coursework. Thesis committee: (Chair) Prof. Satoru Iwata (University of Tokyo), Prof. Kunihiko Sadakane (University of Tokyo), Prof. Hiroshi Imai (University of Tokyo), Prof. Ken-ichi Kawarabayashi (National Institute of Informatics), and Assoc. Prof. Hiroshi Hirai (University of Tokyo). University of Tokyo (東京大学)
A constraint programming approach to the additional relay placement problem in wireless sensor networks
A Wireless Sensor Network (WSN) is composed of many sensor nodes which transmit their data wirelessly over a multi-hop network to data sinks. Since WSNs are subject to node failures, the network topology should be robust, so that when a failure does occur, data delivery can continue from all surviving nodes. A WSN is k-robust if an alternate length-constrained route to a sink is available for each surviving node after the failure of up to k-1 nodes. Determining whether a network is k-robust is an NP-complete problem. We develop a Constraint Programming (CP) approach for solving this problem, which outperforms a Mixed-Integer Programming (MIP) model on larger problems. A network can be made robust by deploying extra relay nodes, and we extend our CP approach to an optimisation problem by using QuickXplain to search for a minimal set of relays, and compare it to a state-of-the-art local search approach.
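The k-robustness definition above can be checked directly by brute force (a sketch of the decision problem only, on a hypothetical toy topology — not the paper's CP model — and assuming that only sensor, i.e. non-sink, nodes fail):

```python
from itertools import combinations
from collections import deque

def hop_counts(adj, sinks, failed):
    """Multi-source BFS from all sinks, skipping failed nodes."""
    dist = {s: 0 for s in sinks}
    queue = deque(dist)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in failed and v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def is_k_robust(adj, sinks, k, max_hops):
    """True iff after any failure of up to k-1 sensor (non-sink) nodes, every
    surviving sensor still reaches a sink within max_hops hops. Exponential
    in k -- illustration only."""
    sensors = [v for v in adj if v not in sinks]
    for r in range(k):
        for failed in map(set, combinations(sensors, r)):
            dist = hop_counts(adj, sinks, failed)
            survivors = (v for v in sensors if v not in failed)
            if any(dist.get(v, max_hops + 1) > max_hops for v in survivors):
                return False
    return True

# One sink "S" with four sensors in a ring (hypothetical toy topology).
adj = {"S": ["a", "b"], "a": ["S", "b", "c"], "b": ["S", "a", "d"],
       "c": ["a", "d"], "d": ["b", "c"]}
print(is_k_robust(adj, {"S"}, k=2, max_hops=3))  # → True
print(is_k_robust(adj, {"S"}, k=2, max_hops=2))  # → False
```

The second call fails because losing node "a" forces "c" onto the 3-hop detour c-d-b-S, which violates the 2-hop length constraint — exactly the kind of counterexample the CP solver searches for.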
Tree Graphs and Orthogonal Spanning Tree Decompositions
Given a graph G, we construct T(G), called the tree graph of G. The vertices of T(G) are the spanning trees of G, with edges between vertices when their respective spanning trees differ only by a single edge. In this paper we detail many new results concerning tree graphs, involving topics such as clique decomposition, planarity, and automorphism groups. We also investigate and present a number of new results on orthogonal tree decompositions of complete graphs.
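The tree-graph construction above is simple enough to sketch directly (our own brute-force illustration for tiny graphs, not taken from the paper): enumerate (n-1)-edge subsets, keep the acyclic ones as spanning trees, and connect two trees when they differ by exactly one edge.

```python
from itertools import combinations

def spanning_trees(vertices, edges):
    """All spanning trees of a small graph: each (n-1)-edge acyclic subset
    is a spanning tree. Uses union-find to detect cycles."""
    n = len(vertices)
    trees = []
    for subset in combinations(edges, n - 1):
        parent = {v: v for v in vertices}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v
        acyclic = True
        for u, v in subset:
            ru, rv = find(u), find(v)
            if ru == rv:
                acyclic = False  # this subset contains a cycle
                break
            parent[ru] = rv
        if acyclic:
            trees.append(frozenset(subset))
    return trees

def tree_graph(vertices, edges):
    """Build T(G): spanning trees as vertices, adjacency = differ by one edge."""
    trees = spanning_trees(vertices, edges)
    adjacency = [(s, t) for s, t in combinations(trees, 2) if len(s - t) == 1]
    return trees, adjacency

# K3: three spanning trees, pairwise adjacent, so T(K3) is itself a triangle.
trees, adj = tree_graph("abc", [("a", "b"), ("b", "c"), ("a", "c")])
print(len(trees), len(adj))  # → 3 3
```

The K3 example shows the smallest nontrivial case: every pair of its three spanning trees shares one edge, so the tree graph T(K3) is again a triangle.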
Bilingual dictionary generation and enrichment via graph exploration
In recent years, we have witnessed a steady growth of linguistic information represented and exposed as linked data on the Web. Such linguistic linked data have stimulated the development and use of openly available linguistic knowledge graphs, as is the case with the Apertium RDF, a collection of interconnected bilingual dictionaries represented and accessible through Semantic Web standards. In this work, we explore techniques that exploit the graph nature of bilingual dictionaries to automatically infer new links (translations). We build upon a cycle density based method: partitioning the graph into biconnected components for a speed-up, and simplifying the pipeline through a careful structural analysis that reduces hyperparameter tuning requirements. We also analyse the shortcomings of traditional evaluation metrics used for translation inference and propose to complement them with new ones, both-word precision (BWP) and both-word recall (BWR), aimed at being more informative of algorithmic improvements. Over twenty-seven language pairs, our algorithm produces dictionaries about 70% the size of existing Apertium RDF dictionaries at a high BWP of 85% from scratch within a minute. Human evaluation shows that 78% of the additional translations generated for dictionary enrichment are correct as well. We further describe an interesting use-case: inferring synonyms within a single language, on which our initial human-based evaluation shows an average accuracy of 84%. We release our tool as free/open-source software which can not only be applied to RDF data and Apertium dictionaries, but is also easily usable for other formats and communities.
This work was partially funded by the Prêt-à-LLOD project within the European Union's Horizon 2020 research and innovation programme under grant agreement no. 825182.
This article is also based upon work from COST Action CA18209 NexusLinguarum, "European network for Web-centred linguistic data science", supported by COST (European Cooperation in Science and Technology). It has been also partially supported by the Spanish projects TIN2016-78011-C4-3-R and PID2020-113903RB-I00 (AEI/FEDER, UE), by DGA/FEDER, and by the Agencia Estatal de Investigación of the Spanish Ministry of Economy and Competitiveness and the European Social Fund through the "Ramón y Cajal" program (RYC2019-028112-I).
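The biconnected-component speed-up mentioned above rests on a standard fact: every cycle of a graph lies entirely inside one biconnected component, so a cycle-based inference pipeline can process components independently. A minimal sketch of the standard Hopcroft-Tarjan decomposition follows (the toy graph is hypothetical, not Apertium data):

```python
def biconnected_components(adj):
    """Hopcroft-Tarjan DFS: partition the edges of an undirected simple graph
    into biconnected components. Every cycle lies inside a single component,
    so partitioning first confines any cycle search to one component."""
    disc, low, stack, comps, timer = {}, {}, [], [], [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v in adj[u]:
            if v == parent:
                continue
            if v not in disc:                 # tree edge
                stack.append((u, v))
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] >= disc[u]:         # u separates v's subtree
                    comp = []
                    while not comp or comp[-1] != (u, v):
                        comp.append(stack.pop())
                    comps.append(comp)
            elif disc[v] < disc[u]:           # back edge to an ancestor
                stack.append((u, v))
                low[u] = min(low[u], disc[v])

    for u in adj:
        if u not in disc:
            dfs(u, None)
    return comps

# Two triangles sharing the cut vertex "c" (hypothetical toy word graph).
adj = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d", "e"],
       "d": ["c", "e"], "e": ["c", "d"]}
print(sorted(len(comp) for comp in biconnected_components(adj)))  # → [3, 3]
```

Here the cut vertex "c" splits the graph into two 3-edge components, so any cycle-density computation can run on each triangle separately instead of on the whole graph.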
Embedding strategies for adiabatic quantum computation
Official Master's Degree in Quantum Science and Technology (Màster Oficial de Ciència i Tecnologia Quàntiques), Facultat de Física, Universitat de Barcelona. Academic year: 2021-2022. Tutor: Marta P. Estarellas; co-tutors: Matthias Werner, Ana Palacios, Jordi Riu.
Quantum Annealing (QA) is an alternative to gate-based Quantum Computation (QC) for solving problems not efficiently tractable on classical devices. At present, QA holds an advantage over gate-based QC in the noisy intermediate-scale quantum (NISQ) era because it requires less error correction and avoids the associated resource overhead. However, hardware limitations in terms of connectivity and feasible interactions create incompatibilities between the chip and the structure of the problem, leading to what is called the graph embedding problem. To circumvent this obstacle, we first analyse the current solutions based on heuristic algorithms and their limitations. We then explore the potential of a digital assisted annealing (DaA) approach. The novelty of this technique lies in the fact that the state generated by the quantum annealer is used as the initial state of a variational circuit, whose role is to approach a target solution the annealer could not reach by itself due to its hardware limitations. We complete this thesis with a detailed study of the performance of our approach in different scenarios, and outline branches we would like to explore.
VLSI Design
This book presents recent advances in the design of nanometer VLSI chips. The selected topics present open problems and challenges on important themes ranging from design tools, new post-silicon devices, and GPU-based parallel computing to emerging 3D integration and antenna design. The book consists of two parts, with chapters such as: VLSI design for multi-sensor smart systems on a chip; Three-dimensional integrated circuits design for thousand-core processors; Parallel symbolic analysis of large analog circuits on GPU platforms; Algorithms for CAD tools VLSI design; and A multilevel memetic algorithm for large SAT-encoded problems.
On Extractors and Exposure-Resilient Functions for Sublogarithmic Entropy
We study resilient functions and exposure-resilient functions in the low-entropy regime. A resilient function (a.k.a. deterministic extractor for oblivious bit-fixing sources) maps any distribution on n-bit strings in which k bits are uniformly random and the rest are fixed into an output distribution that is close to uniform. With exposure-resilient functions, all the input bits are random, but we ask that the output be close to uniform conditioned on any subset of n - k input bits. In this paper, we focus on the case that k is sublogarithmic in n.
We simplify and improve an explicit construction of resilient functions for k sublogarithmic in n due to Kamp and Zuckerman (SICOMP 2006), achieving error exponentially small in k rather than polynomially small in k. Our main result is that when k is sublogarithmic in n, the short output length of this construction (O(log k) output bits) is optimal for extractors computable by a large class of space-bounded streaming algorithms.
Next, we show that a random function is a resilient function with high probability if and only if k is superlogarithmic in n, suggesting that our main result may apply more generally. In contrast, we show that a random function is a static (resp. adaptive) exposure-resilient function with high probability even if k is as small as a constant (resp. log log n). No explicit exposure-resilient functions achieving these parameters are known.
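The resilient-function definition above can be tested exhaustively at toy sizes (our own brute-force illustration, not the construction from the paper): enumerate every oblivious bit-fixing source with k free positions and measure how far the output distribution is from uniform.

```python
from itertools import combinations, product

def resilience_error(f, n, k, out_bits):
    """Worst-case statistical distance from uniform of f's output, over all
    oblivious bit-fixing sources: k positions uniformly random, n-k fixed.
    Exponential brute force -- tiny n only."""
    worst = 0.0
    for free in combinations(range(n), k):
        fixed = [i for i in range(n) if i not in free]
        for fixing in product((0, 1), repeat=n - k):
            counts = {}
            for free_vals in product((0, 1), repeat=k):
                x = [0] * n
                for i, b in zip(fixed, fixing):
                    x[i] = b
                for i, b in zip(free, free_vals):
                    x[i] = b
                y = f(tuple(x))              # f outputs an int < 2**out_bits
                counts[y] = counts.get(y, 0) + 1
            total, outputs = 2 ** k, 2 ** out_bits
            sd = sum(abs(counts.get(y, 0) / total - 1 / outputs)
                     for y in range(outputs)) / 2
            worst = max(worst, sd)
    return worst

xor3 = lambda x: x[0] ^ x[1] ^ x[2]   # parity: perfectly resilient for k >= 1
and3 = lambda x: x[0] & x[1] & x[2]   # AND: badly non-resilient

print(resilience_error(xor3, 3, 1, 1))  # → 0.0
print(resilience_error(and3, 3, 1, 1))  # → 0.5
```

Parity achieves error 0 because a single uniform free bit already makes the XOR uniform, while AND fails completely: fixing any input bit to 0 pins the output to 0, giving statistical distance 1/2.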