Approximating the generalized terminal backup problem via half-integral multiflow relaxation
We consider a network design problem called the generalized terminal backup
problem. Whereas earlier work investigated the edge-connectivity constraints
only, we consider both edge- and node-connectivity constraints for this
problem. A major contribution of this paper is the development of a strongly
polynomial-time 4/3-approximation algorithm for the problem. Specifically, we
show that a linear programming relaxation of the problem is half-integral, and
that the half-integral optimal solution can be rounded to a 4/3-approximate
solution. We also prove that the linear programming relaxation of the problem
with the edge-connectivity constraints is equivalent to minimizing the cost of
half-integral multiflows that satisfy flow demands given at the terminals. This
observation yields a strongly polynomial-time algorithm for computing a
minimum-cost half-integral multiflow under flow demand constraints.
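Half-integrality of an LP relaxation can be seen in miniature in the fractional matching LP of a triangle, a classical toy analogue (not the relaxation studied in the paper): the optimum is attained only at a point with all coordinates equal to 1/2. A minimal sketch:

```python
from itertools import product

# Fractional matching LP on the triangle K3 (a toy analogue, not the
# paper's relaxation): maximize x12 + x13 + x23 subject to, at each
# vertex, the two incident edge variables summing to at most 1.
# Summing all three constraints gives 2*(x12 + x13 + x23) <= 3, so the
# LP optimum is at most 3/2; the half-integral point (1/2, 1/2, 1/2)
# attains it, so searching half-integral points suffices here.

def feasible(x12, x13, x23):
    return (x12 + x13 <= 1 and x12 + x23 <= 1 and x13 + x23 <= 1
            and all(0 <= v <= 1 for v in (x12, x13, x23)))

best = max(sum(p) for p in product([0.0, 0.5, 1.0], repeat=3) if feasible(*p))
print(best)  # 1.5, attained only at the all-halves point
```

The integral optimum (a matching) has value 1, so the LP exhibits exactly the kind of half-integral gap that rounding arguments exploit.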
Discrete Convex Functions on Graphs and Their Algorithmic Applications
The present article is an exposition of a theory of discrete convex functions
on certain graph structures, developed by the author in recent years. This
theory is a spin-off of discrete convex analysis by Murota, and is motivated by
combinatorial dualities in multiflow problems and the complexity classification
of facility location problems on graphs. We outline the theory and its
algorithmic applications to combinatorial optimization problems.
Node-Connectivity Terminal Backup, Separately-Capacitated Multiflow, and Discrete Convexity
The terminal backup problems (Anshelevich and Karagiozova (2011)) form a
class of network design problems: Given an undirected graph with a requirement
on terminals, the goal is to find a minimum cost subgraph satisfying the
connectivity requirement. The node-connectivity terminal backup problem
requires each terminal to be connected to the other terminals by a prescribed
number of node-disjoint paths. It is not known whether this problem is NP-hard
or polynomial-time tractable. Fukunaga (2016) gave a 4/3-approximation
algorithm based on an LP-rounding scheme using a general LP solver. In this
paper, we develop a combinatorial algorithm that finds a half-integral optimal
solution of the relaxed LP in polynomial time, with a running-time bound
expressed in terms of the number of nodes, the number of edges, the number of
terminals, the maximum edge-cost, the maximum edge-capacity, and the time
complexity of a max-flow algorithm on the network. The algorithm implies that
the 4/3-approximation algorithm for the node-connectivity terminal backup
problem can also be implemented efficiently.
In designing the algorithm, we explore a connection between the
node-connectivity terminal backup problem and a new type of multiflow, called
a separately-capacitated multiflow. We prove a min-max theorem that extends
the Lov\'{a}sz-Cherkassky theorem to the node-capacity setting. Our results
build on discrete convexity in the node-connectivity terminal backup problem.
Comment: A preliminary version of this paper appeared in the proceedings of
the 47th International Colloquium on Automata, Languages, and Programming
(ICALP 2020).
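The classical (edge-capacity) Lov\'{a}sz-Cherkassky theorem states that the maximum total value of a multiflow connecting a terminal set T equals half the sum, over the terminals t, of the max-flow value from t to the remaining terminals, and that a half-integral optimal multiflow exists. A minimal sketch of computing this bound with a textbook Edmonds-Karp max-flow (the star graph and unit capacities are illustrative assumptions):

```python
from collections import deque

def max_flow(edges, s, t):
    """Edmonds-Karp on an undirected graph given as (u, v, capacity) triples."""
    cap, adj = {}, {}
    for u, v, c in edges:
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap[(v, u)] = cap.get((v, u), 0) + c
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    flow = 0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj.get(u, ()):
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        # trace the augmenting path back from t and push the bottleneck
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[e] for e in path)
        for u, v in path:
            cap[(u, v)] -= push
            cap[(v, u)] += push
        flow += push

def cherkassky_bound(edges, terminals):
    """Half the sum over terminals t of maxflow(t, T - t), via a super-sink."""
    total = 0
    for t in terminals:
        others = [x for x in terminals if x != t]
        extra = [(x, "_sink", float("inf")) for x in others]
        total += max_flow(edges + extra, t, "_sink")
    return total / 2

# Toy star: center c, unit-capacity edges to terminals t1, t2, t3.
# Each terminal has max-flow 1 to the others, so the bound is 3/2,
# achieved only by a half-integral multiflow (three paths of value 1/2).
edges = [("c", "t1", 1), ("c", "t2", 1), ("c", "t3", 1)]
print(cherkassky_bound(edges, ["t1", "t2", "t3"]))  # 1.5
```

The paper's contribution extends this kind of min-max relation to node capacities; the sketch above covers only the classical edge-capacitated bound.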
On the Complexity of t-Closeness Anonymization and Related Problems
An important issue in releasing individual data is to protect the sensitive
information from being leaked and maliciously utilized. Famous
privacy-preserving principles that aim to ensure both data privacy and data
integrity, such as k-anonymity and ℓ-diversity, have been extensively studied
both theoretically and empirically. Nonetheless, these widely adopted
principles are still insufficient to prevent attribute disclosure if the
attacker has partial knowledge of the overall sensitive data distribution. The
t-closeness principle has been proposed to fix this, and it also has the
benefit of supporting numerical sensitive attributes. However, in contrast to
k-anonymity and ℓ-diversity, the theoretical aspects of t-closeness have not
been well investigated.
We initiate the first systematic theoretical study of the t-closeness
principle under the commonly used attribute-suppression model. We prove that
for every constant t in the range considered, it is NP-hard to find an optimal
t-closeness generalization of a given table. The proof consists of several
reductions, each of which works for different values of t, which together
cover the full range. To complement this negative result, we also provide
exact and fixed-parameter algorithms. Finally, we answer some open questions
regarding the complexity of k-anonymity and ℓ-diversity left open in the
literature.
Comment: An extended abstract to appear in DASFAA 201
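For a numerical (ordered) sensitive attribute, t-closeness is typically measured with the Earth Mover's Distance between an equivalence class's value distribution and the table-wide distribution; for ordered values this reduces to a normalized sum of cumulative differences. A minimal sketch (the salary buckets and numbers are illustrative assumptions):

```python
def emd_ordered(p, q):
    """Earth Mover's Distance between two distributions over the same
    ordered domain, with the ground distance normalized so the two
    extreme values are at distance 1 (the t-closeness convention)."""
    assert len(p) == len(q)
    cum, dist = 0.0, 0.0
    for pi, qi in zip(p, q):
        cum += pi - qi           # running surplus that must be moved right
        dist += abs(cum)
    return dist / (len(p) - 1)

def satisfies_t_closeness(class_dist, overall_dist, t):
    """A class satisfies t-closeness if its sensitive-value distribution
    is within EMD t of the table-wide distribution."""
    return emd_ordered(class_dist, overall_dist) <= t

# Overall salary distribution over 3 ordered buckets vs. one equivalence
# class whose members all fall in the lowest bucket (illustrative data):
overall = [1/3, 1/3, 1/3]
leaky_class = [1.0, 0.0, 0.0]
print(emd_ordered(leaky_class, overall))  # 0.5: violates, e.g., 0.2-closeness
```

Such a check is the feasibility side of the problem; the hardness result above concerns finding a minimum-suppression table in which every class passes it.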
Building Clusters with Lower-Bounded Sizes
Classical clustering problems search for a partition of objects into a fixed number of clusters. In many scenarios, however, the number of clusters is not known or not necessarily fixed. Further, clusters are sometimes only considered significant if they have a certain size. We discuss clustering into sets of minimum cardinality k without a fixed number of sets and present a general model for these types of problems. This general framework allows the comparison of different measures for assessing the quality of a clustering. We specifically consider nine quality measures and classify the complexity of the resulting problems with respect to k. Further, we derive some polynomial-time solvable cases for k = 2 with connections to matching-type problems, which, among other graph problems, are then used to compute approximations for larger values of k.
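For k = 2 the matching connection can be made concrete: partitioning an even number of objects into clusters of size two while minimizing total intra-cluster distance is exactly a minimum-weight perfect matching. A brute-force sketch for tiny instances (the point set and distance function are illustrative assumptions; a real solver would use Edmonds' blossom algorithm):

```python
def min_weight_pairing(items, dist):
    """Enumerate all perfect pairings of an even-sized list and return
    the cheapest one with its total intra-pair distance.
    Exponential time; suitable only for tiny instances."""
    if not items:
        return 0.0, []
    first, rest = items[0], items[1:]
    best_cost, best_pairs = float("inf"), None
    for i, partner in enumerate(rest):
        sub_cost, sub_pairs = min_weight_pairing(rest[:i] + rest[i+1:], dist)
        cost = dist(first, partner) + sub_cost
        if cost < best_cost:
            best_cost, best_pairs = cost, [(first, partner)] + sub_pairs
    return best_cost, best_pairs

# Points on a line with absolute-difference distance (illustrative data):
# the optimal size-2 clusters pair up nearby points.
points = [0, 1, 10, 11]
cost, clusters = min_weight_pairing(points, lambda a, b: abs(a - b))
print(cost, clusters)  # minimum cost 2.0 with clusters [(0, 1), (10, 11)]
```

The quality measure here (sum of intra-cluster distances) is one of many; the framework in the paper compares several such measures.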
Resource Buying Games
In resource buying games a set of players jointly buys a subset of a finite
resource set E (e.g., machines, edges, or nodes in a digraph). The cost of a
resource e depends on the number (or load) of players using e, and has to be
paid completely by the players before it becomes available. Each player i needs
at least one set of a predefined family S_i in 2^E to be available. Thus,
resource buying games can be seen as a variant of congestion games in which the
load-dependent costs of the resources can be shared arbitrarily among the
players. A strategy of player i in resource buying games is a tuple consisting
of one of i's desired configurations S_i together with a payment vector p_i in
R^E_+ indicating how much i is willing to contribute towards the purchase of
the chosen resources. In this paper, we study the existence and computational
complexity of pure Nash equilibria (PNE, for short) of resource buying games.
In contrast to classical congestion games for which equilibria are guaranteed
to exist, the existence of equilibria in resource buying games strongly depends
on the underlying structure of the S_i's and the behavior of the cost
functions. We show that for marginally non-increasing cost functions, matroids
are exactly the right structure to consider, and that resource buying games
with marginally non-decreasing cost functions always admit a PNE.
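A cost function c on a resource is marginally non-increasing when the increments c(x) - c(x-1) weakly decrease with the load x (economies of scale), and marginally non-decreasing when they weakly increase (congestion effects). A minimal sketch of these checks on tabulated costs (the example cost tables are illustrative assumptions):

```python
def marginals(costs):
    """Increments c(x) - c(x-1) of a cost function tabulated as c(0..n)."""
    return [b - a for a, b in zip(costs, costs[1:])]

def marginally_nonincreasing(costs):
    m = marginals(costs)
    return all(later <= earlier for earlier, later in zip(m, m[1:]))

def marginally_nondecreasing(costs):
    m = marginals(costs)
    return all(later >= earlier for earlier, later in zip(m, m[1:]))

# Buying capacity in bulk gets cheaper per unit: increments 5, 4, 3.
print(marginally_nonincreasing([0, 5, 9, 12]))   # True
# Congestion-like costs grow faster with load: increments 1, 2, 3.
print(marginally_nondecreasing([0, 1, 3, 6]))    # True
```

These are exactly the two regimes the existence results distinguish: matroid structure matters in the non-increasing case, while the non-decreasing case always admits a PNE.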
Comparison of MRI Spectroscopy software packages performance and application on HCV-infected patients’ real data
Bachelor's final project in Biomedical Engineering. Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona. Academic year: 2022-2023. Tutors/Directors: Sala Llonch, Roser; Laredo Gregorio, Carlos.
1H MRS is regarded as a pioneering methodology for inspecting brain metabolism
and appraising health status. Post-processing is required to obtain explicit
metabolite quantification values from which a diagnosis can be derived. To
address this need, multiple software packages have recently been developed and
released, leading to a heterogeneous assortment of spectroscopic
image-processing tools that lack standardization and regulation.
The present study therefore assesses the coherence and consistency of
metabolite estimation outputs, measuring result variability through inter- and
intra-correlation analyses across the selected programs: LCModel, Osprey,
TARQUIN, and the spant toolbox. The examination is performed on a collection
of single-voxel (SVS), short-TE, 3T Siemens PRESS spectroscopic acquisitions
from 83 subjects, including healthy controls and HCV-infected patients
receiving DAA treatment. The analytical core of the project assesses software
performance through a Python script that automatically computes and displays
the sought results. The statistical measures used to draw conclusions are the
coefficient of determination (R²), Pearson's coefficient (r), and the
intraclass correlation coefficient (ICC), complemented by box plots, raincloud
plots, and scatter plots to ease data visualization. A clinical analysis is
also carried out on the same basis; its purpose is to reveal the actual effect
of DAA treatment on HCV-infected patients in terms of metabolite concentration
alteration and hypothetical restoration. The conclusions reveal clear and
concerning variability among MRS platforms, compromising the rigor, precision,
and standardization demanded in this discipline, since quantification results
are mutually inconsistent; the inconsistencies do not, however, appear to
affect or contradict medical determinations in a way that would jeopardize
patients' health. Extending the analysis to a larger cohort of subjects would
nevertheless reinforce these conclusions.
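The agreement statistics the study relies on take only a few lines of code; a minimal pure-Python sketch of Pearson's r and the derived R² for paired measurements from two hypothetical software packages (the package names and sample values are illustrative assumptions, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient between paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Metabolite concentrations reported by two packages on the same scans
# (hypothetical values): strongly but not perfectly correlated.
pkg_a = [1.0, 1.2, 0.9, 1.5, 1.1]
pkg_b = [1.1, 1.3, 1.0, 1.4, 1.2]
r = pearson_r(pkg_a, pkg_b)
r_squared = r ** 2  # coefficient of determination for a linear fit
print(round(r, 3), round(r_squared, 3))
```

ICC additionally separates between-subject from between-rater variance and is usually taken from a statistics library rather than hand-rolled.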