
    Upper and Lower Bounds for Weak Backdoor Set Detection

    We obtain upper and lower bounds on the running times of exponential-time algorithms for the detection of weak backdoor sets of 3CNF formulas, considering various base classes. These results include (omitting polynomial factors): (i) a 4.54^k algorithm to detect whether there is a weak backdoor set of at most k variables into the class of Horn formulas; (ii) a 2.27^k algorithm to detect whether there is a weak backdoor set of at most k variables into the class of Krom formulas. These bounds improve the earlier known bound of 6^k. We also prove a 2^k lower bound for these problems, subject to the Strong Exponential Time Hypothesis.
    Comment: A short version will appear in the proceedings of the 16th International Conference on Theory and Applications of Satisfiability Testing
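    The abstract states the bounds but not the branching rules. As a purely illustrative baseline (not the paper's 4.54^k or 2.27^k algorithms), the definition of a weak backdoor set into Horn can be checked by exhaustive search; all function names below are hypothetical and the formulas are kept tiny on purpose.

    ```python
    from itertools import combinations, product

    def reduce_formula(clauses, assignment):
        """Simplify a CNF formula (clauses are tuples of nonzero ints; -v means NOT v)
        under a partial assignment {var: bool}; None signals a falsified clause."""
        reduced = []
        for clause in clauses:
            remaining, satisfied = [], False
            for lit in clause:
                v = abs(lit)
                if v in assignment:
                    if (lit > 0) == assignment[v]:
                        satisfied = True
                        break
                else:
                    remaining.append(lit)
            if satisfied:
                continue
            if not remaining:
                return None  # empty clause: falsified
            reduced.append(tuple(remaining))
        return reduced

    def is_horn(clauses):
        """Horn: at most one positive literal per clause."""
        return all(sum(1 for lit in c if lit > 0) <= 1 for c in clauses)

    def satisfiable(clauses):
        """Brute-force SAT check; fine for tiny illustrative formulas."""
        variables = sorted({abs(l) for c in clauses for l in c})
        for bits in product([False, True], repeat=len(variables)):
            a = dict(zip(variables, bits))
            if all(any((l > 0) == a[abs(l)] for l in c) for c in clauses):
                return True
        return False

    def has_weak_horn_backdoor(clauses, k):
        """Is there a set B of at most k variables and an assignment to B whose
        reduced formula is Horn and satisfiable? Exhaustive search over B."""
        variables = sorted({abs(l) for c in clauses for l in c})
        for size in range(k + 1):
            for B in combinations(variables, size):
                for bits in product([False, True], repeat=size):
                    red = reduce_formula(clauses, dict(zip(B, bits)))
                    if red is not None and is_horn(red) and satisfiable(red):
                        return True
        return False
    ```

    For example, the non-Horn formula (x1 OR x2 OR x3) AND (NOT x1 OR x2 OR x3) has the weak backdoor set {x2}: setting x2 true satisfies both clauses, leaving the (trivially Horn, satisfiable) empty formula.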

    Explaining Snapshots of Network Diffusions: Structural and Hardness Results

    Much research has been done on the diffusion of ideas or technologies on social networks, including the Influence Maximization problem and many of its variations. Here, we investigate a type of inverse problem. Given a snapshot of the diffusion process, we seek to understand whether the snapshot is feasible for a given dynamic, i.e., whether there is a limited number of nodes whose initial adoption can result in the snapshot in finite time. While similar questions have been considered for epidemic dynamics, here we consider this problem for variations of the deterministic Linear Threshold Model, which is more appropriate for modeling strategic agents. Specifically, we consider both sequential and simultaneous dynamics, with and without deactivations allowed. Even though we show hardness results for all the variations we consider, the case of sequential dynamics with deactivations allowed turns out to be significantly harder than the others. In contrast, sequential dynamics make the problem trivial on cliques, even though its complexity for simultaneous dynamics is unknown. We complement our hardness results with structural insights that can help better understand diffusions on social networks under various dynamics.
    Comment: 14 pages, 3 figures
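    The feasibility question for the simultaneous dynamics (without deactivations) can be made concrete with a small sketch. This is an exponential brute force over seed sets, not any algorithm from the paper, and all names are hypothetical; thresholds are integer counts of active neighbors.

    ```python
    from itertools import combinations

    def snapshot_feasible_from(neighbors, thresholds, seeds, snapshot):
        """Run simultaneous deterministic Linear Threshold dynamics (no deactivations)
        from the seed set; report whether the active set equals the snapshot at any step."""
        active = set(seeds)
        while True:
            if active == snapshot:
                return True
            newly = {v for v in thresholds
                     if v not in active
                     and sum(1 for u in neighbors[v] if u in active) >= thresholds[v]}
            if not newly:
                return False  # fixed point reached without hitting the snapshot
            active |= newly

    def snapshot_feasible(neighbors, thresholds, snapshot, budget):
        """Brute force over all seed sets of size <= budget (exponential; illustrative only)."""
        nodes = list(thresholds)
        for size in range(budget + 1):
            for seeds in combinations(nodes, size):
                if snapshot_feasible_from(neighbors, thresholds, set(seeds), snapshot):
                    return True
        return False
    ```

    On the path 1-2-3 with all thresholds 1, the snapshot {1, 3} is infeasible with one seed (any single seed eventually activates everything, passing through a connected interval), but feasible with two seeds at time zero.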

    Using contracted solution graphs for solving reconfiguration problems

    We introduce a dynamic programming method for solving reconfiguration problems, based on contracted solution graphs, which are obtained from solution graphs by performing an appropriate series of edge contractions that decrease the graph size without losing any critical information needed to solve the reconfiguration problem under consideration. As an example, we consider a well-studied problem: given two k-colorings alpha and beta of a graph G, can alpha be modified into beta by recoloring one vertex of G at a time, while maintaining a k-coloring throughout? By applying our method in combination with a thorough exploitation of the graph structure, we obtain a polynomial-time algorithm for (k-2)-connected chordal graphs.
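    The uncontracted solution graph underlying this method can be explored directly on tiny instances: its nodes are the proper k-colorings and its edges are single-vertex recolorings. The sketch below is the naive BFS baseline that the paper's contraction technique improves on, not the paper's algorithm; the function names are hypothetical.

    ```python
    from itertools import product
    from collections import deque

    def proper_colorings(edges, n, k):
        """All proper k-colorings of a graph on vertices 0..n-1 (exponential; tiny graphs only)."""
        for c in product(range(k), repeat=n):
            if all(c[u] != c[v] for u, v in edges):
                yield c

    def recolorable(edges, n, k, alpha, beta):
        """BFS over the solution graph: can alpha reach beta by recoloring one
        vertex at a time while staying a proper k-coloring throughout?"""
        alpha, beta = tuple(alpha), tuple(beta)
        valid = set(proper_colorings(edges, n, k))
        if alpha not in valid or beta not in valid:
            return False
        queue, seen = deque([alpha]), {alpha}
        while queue:
            c = queue.popleft()
            if c == beta:
                return True
            for v in range(n):
                for colour in range(k):
                    if colour != c[v]:
                        nxt = c[:v] + (colour,) + c[v + 1:]
                        if nxt in valid and nxt not in seen:
                            seen.add(nxt)
                            queue.append(nxt)
        return False
    ```

    A triangle with k = 3 illustrates why the problem is nontrivial: every vertex sees both other colors, so every 3-coloring is frozen and distinct colorings are mutually unreachable, whereas on a path with k = 3 recoloring sequences exist.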

    Radiative corrections to deeply virtual Compton scattering

    We discuss possibilities for measuring deeply virtual Compton scattering amplitudes via different asymmetries in order to access the underlying skewed parton distributions. Perturbative one-loop coefficient functions and two-loop evolution kernels, calculated recently by a tentative use of the residual conformal symmetry of QCD, are used for a model-dependent numerical estimation of the scattering amplitudes.
    Comment: 9 pages LaTeX, 3 figures, czjphyse.cls required. Talk given by D. Müller at Int. Workshop "PRAHA-Spin99", Prague, Sept. 6-11, 1999

    Deeply virtual Compton scattering in next-to-leading order

    We study the amplitude of deeply virtual Compton scattering at next-to-leading order of perturbation theory, including two-loop evolution effects, for different sets of skewed parton distributions (SPDs). It turns out that in the minimal subtraction scheme the relative radiative corrections are of order 20-50%. We analyze the dependence of our predictions on the choice of SPD, which will make it possible to discriminate between candidate SPD models using future high-precision experimental data, and briefly discuss the theoretical uncertainties induced by the radiative corrections.
    Comment: 10 pages, LaTeX, 3 figures

    Surface roughness during depositional growth and sublimation of ice crystals

    Full version of an earlier discussion paper (Chou et al. 2018). Ice surface properties can modify the scattering properties of atmospheric ice crystals and therefore affect the radiative properties of mixed-phase and cirrus clouds. The Ice Roughness Investigation System (IRIS) is a new laboratory setup designed to investigate the conditions under which roughness develops on single ice crystals, based on their size, morphology and growth conditions (relative humidity and temperature). Ice roughness is quantified through the analysis of speckle in 2-D light-scattering patterns. Characterization of the setup shows that a supersaturation of 20 % with respect to ice and a temperature at the sample position as low as -40 °C can be achieved within IRIS. Investigations of the influence of humidity show that higher supersaturations with respect to ice lead to enhanced roughness and irregularities of ice crystal surfaces. Moreover, relative-humidity oscillations lead to a gradual ratcheting-up of roughness and irregularities as the crystals undergo repeated growth-sublimation cycles. This memory effect also appears to result in reduced growth rates in later cycles. Thus growth history, as well as supersaturation and temperature, influences ice crystal growth and properties; future atmospheric models may benefit from including this history in the cloud evolution process, allowing a more accurate representation not just of roughness but also of crystal size, and possibly of electrification properties.
    Peer reviewed

    Augmenting graphs to minimize the diameter

    We study the problem of augmenting a weighted graph by inserting edges of bounded total cost while minimizing the diameter of the augmented graph. Our main result is an FPT 4-approximation algorithm for the problem.
    Comment: 15 pages, 3 figures
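    The objective can be made concrete with a small sketch: evaluate the diameter via Floyd-Warshall and brute-force over candidate edge subsets within the cost budget. This exponential baseline is not the paper's FPT 4-approximation; all names are hypothetical.

    ```python
    from itertools import combinations

    def apsp(n, edges):
        """Floyd-Warshall all-pairs shortest paths on a weighted undirected graph.
        edges: list of (u, v, weight) with vertices 0..n-1."""
        INF = float('inf')
        d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
        for u, v, w in edges:
            d[u][v] = min(d[u][v], w)
            d[v][u] = min(d[v][u], w)
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[i][k] + d[k][j]
        return d

    def diameter(n, edges):
        """Largest shortest-path distance between any vertex pair."""
        d = apsp(n, edges)
        return max(max(row) for row in d)

    def best_augmentation(n, edges, candidates, budget):
        """Brute force: pick a subset of candidate edges (u, v, weight, cost)
        of total cost <= budget minimizing the diameter. Exponential; baseline only."""
        best = diameter(n, edges)
        for r in range(1, len(candidates) + 1):
            for subset in combinations(candidates, r):
                if sum(c for *_, c in subset) <= budget:
                    aug = edges + [(u, v, w) for u, v, w, _ in subset]
                    best = min(best, diameter(n, aug))
        return best
    ```

    On a unit-weight path 0-1-2-3 (diameter 3), closing it into a 4-cycle with one unit-cost candidate edge reduces the diameter to 2.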

    Bounded Search Tree Algorithms for Parameterized Cograph Deletion: Efficient Branching Rules by Exploiting Structures of Special Graph Classes

    Many fixed-parameter tractable algorithms using a bounded search tree have been repeatedly improved, often by describing a larger number of branching rules involving an increasingly complex case analysis. We introduce a novel and general search strategy that branches on the forbidden subgraphs of a relaxed graph class. By using the class of P_4-sparse graphs as the relaxed graph class, we obtain efficient bounded search tree algorithms for several parameterized deletion problems. We give the first non-trivial bounded search tree algorithms for the cograph edge-deletion problem and the trivially perfect edge-deletion problem. For the cograph vertex-deletion problem, a refined analysis of the running time of our simple bounded search tree algorithm gives a faster exponential factor than that of algorithms designed with the help of complicated case distinctions and non-trivial running time analysis [21] or computer-aided branching rules [11].
    Comment: 23 pages. Accepted in Discrete Mathematics, Algorithms and Applications (DMAA)
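    The search-tree idea rests on a forbidden-subgraph characterization: cographs are exactly the P_4-free graphs, so an induced P_4 certifies non-membership and drives the branching. The sketch below shows the classical direct 4^k branching for cograph vertex deletion, not the paper's refined P_4-sparse relaxation; names are hypothetical and the P_4 search is brute force.

    ```python
    from itertools import permutations

    def find_induced_p4(adj):
        """Return an induced path on four vertices (a, b, c, d) if one exists, else None.
        adj: dict mapping each vertex to its set of neighbors. Brute force over
        4-tuples; fine for small illustrative graphs."""
        vs = list(adj)
        for a, b, c, d in permutations(vs, 4):
            if (b in adj[a] and c in adj[b] and d in adj[c]
                    and c not in adj[a] and d not in adj[a] and d not in adj[b]):
                return (a, b, c, d)
        return None

    def cograph_vertex_deletion(adj, k):
        """Bounded search tree: while an induced P4 remains, branch on deleting
        one of its four vertices; depth bounded by k, giving at most 4^k leaves."""
        p4 = find_induced_p4(adj)
        if p4 is None:
            return True  # already a cograph
        if k == 0:
            return False  # budget exhausted but a P4 remains
        for v in p4:
            sub = {u: adj[u] - {v} for u in adj if u != v}
            if cograph_vertex_deletion(sub, k - 1):
                return True
        return False
    ```

    Branching on a relaxed class with richer forbidden subgraphs, as the abstract describes, shrinks the branching factor below this naive 4 per level.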

    Parameterized complexity of the MINCCA problem on graphs of bounded decomposability

    In an edge-colored graph, the cost incurred at a vertex of a path when two incident edges with different colors are traversed is called the reload or changeover cost. The "Minimum Changeover Cost Arborescence" (MINCCA) problem consists in finding an arborescence with a given root vertex such that the total changeover cost at the internal vertices is minimized. It has recently been proved by Gözüpek et al. [TCS 2016] that the problem is FPT when parameterized by the treewidth and the maximum degree of the input graph. In this article we present the following results for the MINCCA problem:
    - it is W[1]-hard parameterized by the treedepth of the input graph, even on graphs of average degree at most 8; in particular, it is W[1]-hard parameterized by the treewidth of the input graph, which answers the main open problem of Gözüpek et al. [TCS 2016];
    - it is W[1]-hard on multigraphs parameterized by the tree-cutwidth of the input multigraph;
    - it is FPT parameterized by the star tree-cutwidth of the input graph, a slightly restricted version of tree-cutwidth; this result strictly generalizes the FPT result of Gözüpek et al. [TCS 2016];
    - it remains NP-hard on planar graphs even when restricted to instances with at most 6 colors and 0/1 symmetric costs, or to instances with at most 8 colors, maximum degree bounded by 4, and 0/1 symmetric costs.
    Comment: 25 pages, 11 figures
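    The objective function itself is easy to state in code. The sketch below evaluates the total changeover cost of a given arborescence (it does not solve MINCCA); the representation via parent pointers and the function name are choices made here, not taken from the paper.

    ```python
    def changeover_cost(parent, edge_color, cost):
        """Total changeover (reload) cost of an arborescence.
        parent: dict vertex -> parent (None for the root); the edge parent[v] -> v
        has color edge_color[v]. cost: dict (c_in, c_out) -> nonnegative cost,
        charged at every internal vertex where the incoming and outgoing edge
        colors differ (same-color traversal is free)."""
        total = 0
        for v, p in parent.items():
            if p is None:
                continue  # v is the root; no incoming edge
            c_out = edge_color[v]          # color of the edge p -> v
            if parent.get(p) is not None:  # p is internal: it has an incoming edge
                c_in = edge_color[p]       # color of the edge into p
                if c_in != c_out:
                    total += cost[(c_in, c_out)]
        return total
    ```

    For a root 0 with edges 0->1 (red), 1->2 (blue), and 1->3 (red), only the red-to-blue change at vertex 1 is charged; continuing in red to vertex 3 costs nothing.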

    Particle characterization at the Cape Verde atmospheric observatory during the 2007 RHaMBLe intensive

    The chemical characterization of high-volume (HV) filter and Berner impactor (BI) PM samples collected during RHaMBLe (Reactive Halogens in the Marine Boundary Layer) 2007 shows that Cape Verde aerosol particles are mainly composed of sea salt, mineral dust and associated water. Minor components are nss-salts, OC and EC. The influence of the African continent on the aerosol composition was generally small, but air masses that came from south-western Europe crossing the Canary Islands transported dust to the sampling site together with other loadings. The mean PM10 mass concentration was 17 μg/m3 from impactor samples and 24.2 μg/m3 from HV filter samples. Non-sea-salt (nss) components of PM were found in the submicron fractions and nitrate in the coarse-mode fraction. Bromide was found in all samples at strongly depleted concentrations in the range 1-8 ng/m3 compared to fresh sea salt aerosol, indicating intense atmospheric halogen chemistry. Loss of bromide by reaction with ozone during the long sampling times is assumed, resulting in bromide deficits of 82±12% in coarse-mode impactor samples and 88±6% in filter samples. The chloride deficit was 8% and 1% for the coarse-mode particles (3.5-10 μm; 1.2-3.5 μm) and 21% for filter samples. On 14 May, with high mineral dust loads, the maxima of OC (1.71 μg/m3) and EC (1.25 μg/m3) were also measured. The minimum of TC (0.25 μg/m3) was detected during the period 25 to 27 May, when pure marine air masses arrived. The mass fraction of carbonaceous material decreases with increasing particle size, from 60% for the ultrafine particles to 2.5% in coarse-mode PM. Total iron (dust vs. non-dust periods: 0.53 vs. 0.06 μg/m3), calcium (0.22 vs. 0.03 μg/m3) and potassium (0.33 vs. 0.02 μg/m3) were found to be good indicators of dust periods because of their strongly increased concentrations in the 1.2-3.5 μm fraction compared to the non-dust periods.
For the organic constituents, oxalate (78-151 ng/m3) and methanesulfonic acid (MSA, 25-100 ng/m3) are the major compounds identified. A good correlation between nss-sulphate and MSA was found for the majority of days, indicating active DMS chemistry and low anthropogenic influence.