
    Approximating the generalized terminal backup problem via half-integral multiflow relaxation

    We consider a network design problem called the generalized terminal backup problem. Whereas earlier work investigated only edge-connectivity constraints, we consider both edge- and node-connectivity constraints for this problem. A major contribution of this paper is a strongly polynomial-time 4/3-approximation algorithm for the problem. Specifically, we show that a linear programming relaxation of the problem is half-integral, and that the half-integral optimal solution can be rounded to a 4/3-approximate solution. We also prove that the linear programming relaxation of the problem with edge-connectivity constraints is equivalent to minimizing the cost of a half-integral multiflow that satisfies flow demands given at the terminals. This observation yields a strongly polynomial-time algorithm for computing a minimum-cost half-integral multiflow under flow demand constraints.
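    For intuition, the half-integrality result concerns the natural cut-covering LP relaxation of the edge-connectivity version; the following display is a sketch under assumptions rather than the paper's exact formulation, with T the terminal set, r(t) the requirement of terminal t, c the edge costs, and u an assumed capacity upper bound:

        \min \sum_{e \in E} c_e x_e
        \quad \text{s.t.} \quad
        \sum_{e \in \delta(S)} x_e \ge r(t)
        \quad \forall\, t \in T,\ \forall\, S \subseteq V \text{ with } S \cap T = \{t\},
        \qquad 0 \le x_e \le u_e .

    Half-integrality means an optimal x can be chosen with every x_e a multiple of 1/2; as stated above, such a solution can then be rounded to a feasible solution of cost at most 4/3 times the LP optimum.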

    Discrete Convex Functions on Graphs and Their Algorithmic Applications

    The present article is an exposition of a theory of discrete convex functions on certain graph structures, developed by the author in recent years. This theory is a spin-off of discrete convex analysis by Murota, and is motivated by combinatorial dualities in multiflow problems and the complexity classification of facility location problems on graphs. We outline the theory and its algorithmic applications to combinatorial optimization problems.

    Node-Connectivity Terminal Backup, Separately-Capacitated Multiflow, and Discrete Convexity

    The terminal backup problems (Anshelevich and Karagiozova, 2011) form a class of network design problems: given an undirected graph with a requirement on terminals, the goal is to find a minimum-cost subgraph satisfying the connectivity requirement. The node-connectivity terminal backup problem requires each terminal to be connected to other terminals by a specified number of node-disjoint paths. It is not known whether this problem is NP-hard or tractable. Fukunaga (2016) gave a 4/3-approximation algorithm based on an LP-rounding scheme using a general LP solver. In this paper, we develop a combinatorial algorithm for the relaxed LP that finds a half-integral optimal solution in O(m log(nUA) · MF(kn, m + k²n)) time, where n is the number of nodes, m is the number of edges, k is the number of terminals, A is the maximum edge cost, U is the maximum edge capacity, and MF(n', m') is the time complexity of a max-flow algorithm in a network with n' nodes and m' edges. The algorithm implies that the 4/3-approximation algorithm for the node-connectivity terminal backup problem can also be implemented efficiently. For the design of the algorithm, we explore a connection between the node-connectivity terminal backup problem and a new type of multiflow, called a separately-capacitated multiflow. We show a min-max theorem that extends the Lovász–Cherkassky theorem to the node-capacity setting. Our results build on discrete convexity in the node-connectivity terminal backup problem. Comment: A preliminary version of this paper appeared in the proceedings of the 47th International Colloquium on Automata, Languages and Programming (ICALP 2020).
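    For context, the classical edge-capacitated Lovász–Cherkassky theorem that the min-max result above extends can be sketched as follows (the separately-capacitated, node-capacity version proved in the paper is more delicate and is not reproduced here): for a graph with edge capacities and terminal set T,

        \max_{f:\ T\text{-multiflow}} \operatorname{val}(f) \;=\; \frac{1}{2} \sum_{t \in T} \lambda\bigl(t,\, T \setminus \{t\}\bigr),

    where λ(t, T∖{t}) is the minimum capacity of a cut separating terminal t from the remaining terminals, and the maximum is attained by a half-integral multiflow.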

    On the Complexity of t-Closeness Anonymization and Related Problems

    An important issue in releasing individual data is to protect the sensitive information from being leaked and maliciously utilized. Famous privacy preserving principles that aim to ensure both data privacy and data integrity, such as k-anonymity and l-diversity, have been extensively studied both theoretically and empirically. Nonetheless, these widely-adopted principles are still insufficient to prevent attribute disclosure if the attacker has partial knowledge about the overall sensitive data distribution. The t-closeness principle has been proposed to fix this, which also has the benefit of supporting numerical sensitive attributes. However, in contrast to k-anonymity and l-diversity, the theoretical aspect of t-closeness has not been well investigated. We initiate the first systematic theoretical study on the t-closeness principle under the commonly-used attribute suppression model. We prove that for every constant t such that 0 ≤ t < 1, it is NP-hard to find an optimal t-closeness generalization of a given table. The proof consists of several reductions each of which works for different values of t, which together cover the full range. To complement this negative result, we also provide exact and fixed-parameter algorithms. Finally, we answer some open questions regarding the complexity of k-anonymity and l-diversity left in the literature. Comment: An extended abstract to appear in DASFAA 201
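    As a concrete illustration of the principle being analyzed (a hedged sketch under assumptions, not the paper's construction): for a categorical sensitive attribute with the equal-distance ground metric, the Earth Mover's Distance used by t-closeness reduces to total variation distance, so checking whether a partition into equivalence classes is t-close can be done as follows.

        # Hedged sketch: check t-closeness of a partition into equivalence classes
        # for a categorical sensitive attribute, using the equal-distance EMD,
        # which coincides with total variation distance (all names illustrative).
        from collections import Counter

        def distribution(values):
            counts = Counter(values)
            n = len(values)
            return {v: c / n for v, c in counts.items()}

        def tv_distance(p, q):
            keys = set(p) | set(q)
            return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

        def satisfies_t_closeness(classes, t):
            # classes: one list of sensitive values per equivalence class
            overall = distribution([v for cls in classes for v in cls])
            return all(tv_distance(distribution(cls), overall) <= t for cls in classes)

        # Two equivalence classes over a binary sensitive attribute
        print(satisfies_t_closeness([["flu", "flu", "cancer"], ["cancer", "flu"]], t=0.3))  # True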

    Building Clusters with Lower-Bounded Sizes

    Classical clustering problems search for a partition of objects into a fixed number of clusters. In many scenarios, however, the number of clusters is not known or not necessarily fixed. Further, clusters are sometimes only considered to be of significance if they have a certain size. We discuss clustering into sets of minimum cardinality k without a fixed number of sets and present a general model for these types of problems. This general framework allows the comparison of different measures to assess the quality of a clustering. We specifically consider nine quality measures and classify the complexity of the resulting problems with respect to k. Further, we derive some polynomial-time solvable cases for k = 2 with connections to matching-type problems, which, among other graph problems, are then used to compute approximations for larger values of k.
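    To make the matching connection for k = 2 concrete, here is an illustrative sketch (assumed pairwise dissimilarities, not one of the paper's nine quality measures): pairing all objects by a minimum-cost perfect matching yields clusters of size exactly two, computed below with networkx by negating edge weights.

        # Illustrative sketch: cluster objects into pairs via a minimum-cost
        # perfect matching (maximum-weight matching on negated weights).
        import networkx as nx

        def pair_clusters(dissimilarity):
            # dissimilarity: symmetric dict of dicts, d[a][b] = cost of clustering a with b
            G = nx.Graph()
            items = list(dissimilarity)
            for i, a in enumerate(items):
                for b in items[i + 1:]:
                    G.add_edge(a, b, weight=-dissimilarity[a][b])
            matching = nx.max_weight_matching(G, maxcardinality=True)
            return [set(pair) for pair in matching]

        d = {"a": {"b": 1.0, "c": 4.0, "d": 3.0},
             "b": {"a": 1.0, "c": 2.0, "d": 5.0},
             "c": {"a": 4.0, "b": 2.0, "d": 1.5},
             "d": {"a": 3.0, "b": 5.0, "c": 1.5}}
        print(pair_clusters(d))  # e.g. [{'a', 'b'}, {'c', 'd'}]

    Note that this toy variant produces clusters of size exactly two, whereas the general model only lower-bounds the cluster sizes.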

    Resource Buying Games

    In resource buying games, a set of players jointly buys a subset of a finite resource set E (e.g., machines, edges, or nodes in a digraph). The cost of a resource e depends on the number (or load) of players using e, and has to be paid completely by the players before it becomes available. Each player i needs at least one set of a predefined family S_i ⊆ 2^E to be available. Thus, resource buying games can be seen as a variant of congestion games in which the load-dependent costs of the resources can be shared arbitrarily among the players. A strategy of player i in resource buying games is a tuple consisting of one of i's desired configurations S_i together with a payment vector p_i in R^E_+ indicating how much i is willing to contribute towards the purchase of the chosen resources. In this paper, we study the existence and computational complexity of pure Nash equilibria (PNE, for short) of resource buying games. In contrast to classical congestion games, for which equilibria are guaranteed to exist, the existence of equilibria in resource buying games strongly depends on the underlying structure of the S_i's and the behavior of the cost functions. We show that for marginally non-increasing cost functions, matroids are exactly the right structure to consider, and that resource buying games with marginally non-decreasing cost functions always admit a PNE.
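    The strategy structure can be made concrete with a small sketch (a toy model with illustrative names and constant, load-independent costs, which are in particular marginally non-increasing): a resource is bought once the players' contributions cover its cost, and a strategy profile is feasible when every player has some desired set fully available.

        # Toy sketch of a resource buying game profile with constant costs.
        def bought_resources(costs, payments):
            # costs: {resource: cost}; payments: {player: {resource: contribution}}
            totals = {e: sum(p.get(e, 0.0) for p in payments.values()) for e in costs}
            return {e for e, c in costs.items() if totals[e] >= c}

        def profile_is_feasible(costs, payments, desired_sets):
            # desired_sets: {player: list of sets of resources, i.e. the family S_i}
            available = bought_resources(costs, payments)
            return all(any(S <= available for S in family) for family in desired_sets.values())

        costs = {"e1": 2.0, "e2": 3.0, "e3": 1.0}
        payments = {"p1": {"e1": 2.0}, "p2": {"e3": 1.0}}
        desired = {"p1": [{"e1"}], "p2": [{"e1", "e3"}, {"e2"}]}
        print(profile_is_feasible(costs, payments, desired))  # True: e1 and e3 are bought

    Verifying a pure Nash equilibrium would additionally require checking that no player can lower her total payment by switching to another desired set and payment vector, which this toy sketch does not attempt.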

    Comparison of MRI Spectroscopy software packages performance and application on HCV-infected patients’ real data

    Bachelor's final degree project in Biomedical Engineering. Faculty of Medicine and Health Sciences. Universitat de Barcelona. Academic year: 2022-2023. Tutor/Director: Sala Llonch, Roser; Laredo Gregorio, Carlos. 1H MRS is conceived as a pioneering methodology for brain metabolism inspection and health status appraisal. Post-processing interventions are required to obtain explicit metabolite quantification values from which to derive a diagnosis. To address this operation, multiple software packages have been developed and launched in recent years, leading to an amorphous assortment of spectroscopic image processing tools with a lack of standardization and regulation. The current study therefore intends to judge the coherence and consistency of compound estimation outputs, in terms of result variability, through intercorrelation and intracorrelation analyses between the appointed programs: LCModel, Osprey, TARQUIN, and the spant toolbox. The examination is performed on an 83-subject collection of SVS short-TE 3T SIEMENS PRESS spectroscopic acquisitions, including healthy controls and HCV-infected patients assisted with DAA treatment. The analytical core of the project assesses software performance through a Python script that automatically computes and displays the sought results. The statistical tests providing enough information to draw substantial conclusions stem from the coefficient of determination (R²), Pearson's coefficient (r), and the intraclass correlation coefficient (ICC), together with boxplots, raincloud plots, and scatter plots easing data visualization. A clinical implementation is also entailed on the same basis, whose purpose is to reveal the actual effect of DAA treatment on HCV-infected patients by means of metabolite concentration alteration and hypothetical restoration. The conclusions declare evident and alarming variability among MRS platforms, compromising the rigor, sharpness, and systematization demanded in this discipline, since quantification results hold incoherences, although these do not seem to affect or oppose medical determinations or jeopardize patient health. However, it would be interesting to extend the analysis to a greater cohort of subjects to reinforce the findings and reach more solid conclusions.
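    A minimal sketch of the kind of between-software agreement analysis described above (the file name, column names, and the pingouin dependency are assumptions, not the thesis script itself): Pearson's r, R², and ICC for one metabolite quantified by two packages across subjects.

        # Hedged sketch of a between-software agreement analysis.
        import pandas as pd
        import pingouin as pg
        from scipy.stats import pearsonr

        df = pd.read_csv("naa_estimates.csv")   # assumed columns: subject, lcmodel, osprey
        r, p = pearsonr(df["lcmodel"], df["osprey"])
        print(f"Pearson r = {r:.3f}, R^2 = {r**2:.3f}, p = {p:.3g}")

        # ICC from a long-format table (one row per subject-software pair)
        long_df = df.melt(id_vars="subject", value_vars=["lcmodel", "osprey"],
                          var_name="software", value_name="naa")
        icc = pg.intraclass_corr(data=long_df, targets="subject",
                                 raters="software", ratings="naa")
        print(icc[["Type", "ICC"]])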