
    Effects of changing mosquito host searching behaviour on the cost effectiveness of a mass distribution of long-lasting, insecticidal nets: a modelling study

    The effectiveness of long-lasting, insecticidal nets (LLINs) in preventing malaria is threatened by the changing biting behaviour of mosquitoes, from nocturnal and endophagic to crepuscular and exophagic, and by their increasing resistance to insecticides. Using epidemiological stochastic simulation models, we studied the impact of a mass LLIN distribution on Plasmodium falciparum malaria. Specifically, we looked at impact in terms of episodes prevented during the effective life of the batch and in terms of net health benefits (NHB) expressed in disability adjusted life years (DALYs) averted, depending on biting behaviour, resistance (as measured in experimental hut studies), and on pre-intervention transmission levels. Results were very sensitive to assumptions about the probabilistic nature of host searching behaviour. With a shift towards crepuscular biting, under the assumption that individual mosquitoes repeat their behaviour each gonotrophic cycle, LLIN effectiveness was far less than when individual mosquitoes were assumed to vary their behaviour between gonotrophic cycles. LLIN effectiveness was equally sensitive to variations in host-searching behaviour (if repeated) and to variations in resistance. LLIN effectiveness was most sensitive to the pre-intervention transmission level, with LLINs being least effective at both very low and very high transmission levels, and most effective at around four infectious bites per adult per year. A single LLIN distribution round remained cost effective, except in transmission settings with a pre-intervention inoculation rate of over 128 bites per year and with resistant mosquitoes that displayed a high proportion (over 40%) of determined crepuscular host searching, where some model variants showed negative NHB. Shifts towards crepuscular host searching behaviour can be as important in reducing LLIN effectiveness and cost effectiveness as resistance to pyrethroids. As resistance to insecticides is likely to slow down the development of behavioural resistance and vice versa, the two types of resistance are unlikely to occur within the same mosquito population. LLINs are likely cost effective interventions against malaria, even in areas with strong resistance to pyrethroids or where a large proportion of host-mosquito contact occurs during times when LLIN users are not under their nets.
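    A minimal sketch of the cost-effectiveness bookkeeping used above: net health benefits (NHB) in DALY terms are the DALYs averted minus the health that the money spent could have bought elsewhere at a given cost-effectiveness threshold. The function name, the threshold, and the example numbers below are illustrative assumptions, not values from the study.

```python
def net_health_benefit(dalys_averted, intervention_cost, threshold_per_daly):
    """NHB expressed in DALYs averted: positive values indicate a cost-effective
    intervention at the chosen threshold (cost per DALY averted)."""
    return dalys_averted - intervention_cost / threshold_per_daly

# Hypothetical numbers, for illustration only.
nhb = net_health_benefit(dalys_averted=1200.0,
                         intervention_cost=150_000.0,
                         threshold_per_daly=150.0)
print(f"Net health benefit: {nhb:.0f} DALYs averted")  # negative => not cost effective
```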

    Parameterized Complexity of Problems in Coalitional Resource Games

    Coalition formation is a key topic in multi-agent systems. Coalitions enable agents to achieve goals that they may not have been able to achieve on their own. Previous work has shown problems in coalitional games to be computationally hard. Wooldridge and Dunne (Artificial Intelligence 2006) studied the classical computational complexity of several natural decision problems in Coalitional Resource Games (CRG) - games in which each agent is endowed with a set of resources and coalitions can bring about a set of goals if they are collectively endowed with the necessary amount of resources. The input of coalitional resource games bundles together several elements, e.g., the agent set Ag, the goal set G, the resource set R, etc. Shrot, Aumann and Kraus (AAMAS 2009) examine coalition formation problems in the CRG model using the theory of Parameterized Complexity. Their refined analysis shows that not all parts of the input act equally: some instances of the problem are indeed tractable while others still remain intractable. We answer an important question left open by Shrot, Aumann and Kraus by showing that the SC Problem (checking whether a Coalition is Successful) is W[1]-hard when parameterized by the size of the coalition. Then, via a single theme of reduction from SC, we are able to show that various problems related to resources, resource bounds and resource conflicts introduced by Wooldridge et al. are (1) W[1]-hard or co-W[1]-hard when parameterized by the size of the coalition, (2) para-NP-hard or co-para-NP-hard when parameterized by |R|, and (3) FPT when parameterized by either |G| or |Ag|+|R|.
    Comment: This is the full version of a paper that will appear in the proceedings of AAAI 201
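    To make the SC Problem concrete, the sketch below is a brute-force check of coalition success in the CRG model: it asks whether some set of goals satisfies every coalition member while the coalition's pooled resources cover the goals' total requirements. The data layout (per-agent goal sets and endowments, per-goal requirements) and all names are assumptions for illustration; the exponential enumeration is exactly the kind of blow-up the hardness results above suggest cannot be avoided in general.

```python
from itertools import combinations

def is_successful(coalition, goal_sets, requires, endowment, resources):
    """Brute-force SC check for a non-empty coalition.
    goal_sets[i]: set of goals that satisfy agent i
    requires[g][r]: amount of resource r needed to achieve goal g
    endowment[i][r]: amount of resource r agent i contributes"""
    candidate_goals = set().union(*(goal_sets[i] for i in coalition))
    pooled = {r: sum(endowment[i].get(r, 0) for i in coalition) for r in resources}
    # Try every non-empty subset of goals the members care about (exponential).
    for size in range(1, len(candidate_goals) + 1):
        for chosen in map(set, combinations(candidate_goals, size)):
            if any(not (goal_sets[i] & chosen) for i in coalition):
                continue  # some member achieves none of its goals
            needed = {r: sum(requires[g].get(r, 0) for g in chosen) for r in resources}
            if all(needed[r] <= pooled[r] for r in resources):
                return True
    return False
```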

    Fixed-Parameter Tractability of Directed Multiway Cut Parameterized by the Size of the Cutset

    Given a directed graph $G$, a set of $k$ terminals and an integer $p$, the \textsc{Directed Vertex Multiway Cut} problem asks if there is a set $S$ of at most $p$ (nonterminal) vertices whose removal disconnects each terminal from all other terminals. \textsc{Directed Edge Multiway Cut} is the analogous problem where $S$ is a set of at most $p$ edges. These two problems indeed are known to be equivalent. A natural generalization of the multiway cut is the \emph{multicut} problem, in which we want to disconnect only a set of $k$ given pairs instead of all pairs. Marx (Theor. Comp. Sci. 2006) showed that in undirected graphs multiway cut is fixed-parameter tractable (FPT) parameterized by $p$. Marx and Razgon (STOC 2011) showed that undirected multicut is FPT and directed multicut is W[1]-hard parameterized by $p$. We complete the picture here by our main result which is that both \textsc{Directed Vertex Multiway Cut} and \textsc{Directed Edge Multiway Cut} can be solved in time $2^{2^{O(p)}} n^{O(1)}$, i.e., FPT parameterized by the size $p$ of the cutset of the solution. This answers an open question raised by Marx (Theor. Comp. Sci. 2006) and Marx and Razgon (STOC 2011). It follows from our result that \textsc{Directed Multicut} is FPT for the case of $k=2$ terminal pairs, which answers another open problem raised in Marx and Razgon (STOC 2011).
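    The snippet below is only a verifier for the problem definition (not the FPT algorithm of the paper): it checks whether deleting a candidate set of nonterminal vertices leaves no directed path from any terminal to any other terminal. The adjacency-dict representation is an assumption for illustration.

```python
from collections import deque

def is_directed_vertex_multiway_cut(adj, terminals, cut):
    """adj: dict mapping each vertex to an iterable of out-neighbours (directed graph).
    Returns True iff removing the vertices in `cut` disconnects every terminal
    from every other terminal."""
    removed = set(cut)
    for s in terminals:
        seen, queue = {s}, deque([s])  # BFS from s in the graph with `cut` removed
        while queue:
            u = queue.popleft()
            for v in adj.get(u, ()):
                if v not in removed and v not in seen:
                    seen.add(v)
                    queue.append(v)
        if any(t in seen for t in terminals if t != s):
            return False
    return True
```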

    Tight Bounds for Gomory-Hu-like Cut Counting

    By a classical result of Gomory and Hu (1961), in every edge-weighted graph $G=(V,E,w)$, the minimum $st$-cut values, when ranging over all $s,t\in V$, take at most $|V|-1$ distinct values. That is, these $\binom{|V|}{2}$ instances exhibit redundancy factor $\Omega(|V|)$. They further showed how to construct from $G$ a tree $(V,E',w')$ that stores all minimum $st$-cut values. Motivated by this result, we obtain tight bounds for the redundancy factor of several generalizations of the minimum $st$-cut problem.
    1. Group-Cut: Consider the minimum $(A,B)$-cut, ranging over all subsets $A,B\subseteq V$ of given sizes $|A|=\alpha$ and $|B|=\beta$. The redundancy factor is $\Omega_{\alpha,\beta}(|V|)$.
    2. Multiway-Cut: Consider the minimum cut separating every two vertices of $S\subseteq V$, ranging over all subsets of a given size $|S|=k$. The redundancy factor is $\Omega_{k}(|V|)$.
    3. Multicut: Consider the minimum cut separating every demand pair in $D\subseteq V\times V$, ranging over collections of $|D|=k$ demand pairs. The redundancy factor is $\Omega_{k}(|V|^k)$. This result is a bit surprising, as the redundancy factor is much larger than in the first two problems.
    A natural application of these bounds is to construct small data structures that store all relevant cut values, like the Gomory-Hu tree. We initiate this direction by giving some upper and lower bounds.
    Comment: This version contains additional references to previous work (which have some overlap with our results), see Bibliographic Update 1.
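    The Gomory-Hu redundancy phenomenon is easy to observe on a small instance: compute all pairwise minimum $st$-cut values and count how many distinct values occur. The sketch below uses networkx purely as an illustration (not a tool from the paper); the example graph is arbitrary.

```python
import itertools
import networkx as nx

# Small weighted graph; 'capacity' is the attribute networkx's min-cut routines use.
G = nx.Graph()
G.add_weighted_edges_from(
    [(0, 1, 3), (1, 2, 1), (2, 3, 4), (3, 0, 2), (0, 2, 5)], weight="capacity"
)

cut_values = {
    (s, t): nx.minimum_cut_value(G, s, t, capacity="capacity")
    for s, t in itertools.combinations(G.nodes, 2)
}
# Gomory-Hu (1961): at most |V| - 1 distinct values among the binom(|V|, 2) pairs.
print(len(cut_values), "pairs, but only", len(set(cut_values.values())), "distinct cut values")
```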

    What drives the change in UK household energy expenditure and associated CO2 emissions? Implication and forecast to 2020

    Given the amount of direct and indirect CO2 emissions attributable to UK households, policy makers need a good understanding of the structure of household energy expenditure and the impact of both economic and non-economic factors when considering policies to reduce future emissions. To help achieve this, the Structural Time Series Model is used here to estimate UK ‘transport’ and ‘housing’ energy expenditure equations for 1964-2009. This allows for the estimation of a stochastic trend to measure the underlying energy expenditure trend and hence capture the non-trivial impact of ‘non-economic factors’ on household ‘transport’ and ‘housing’ energy expenditure; as well as the impact of the traditional ‘economic factors’ of income and price. The estimated equations are used to show that given current expectations, CO2 attributable to ‘transport’ and ‘housing’ expenditures will not fall by 29% (or 40%) in 2020 compared to 1990, and is therefore not consistent with the latest UK total CO2 reduction target. Hence, the message for policy makers is that in addition to economic incentives such as taxes, which might be needed to help restrain future energy expenditure, other policies that attempt to influence lifestyles and behaviours also need to be considered.
    Keywords: Household energy expenditure; CO2 emissions; Structural Time Series Model
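    A structural (unobserved-components) time-series regression of the kind described can be sketched with statsmodels: a stochastic local linear trend stands in for the ‘non-economic factors’ while income and price enter as observed regressors. The file name, column names and exact specification below are assumptions for illustration, not the authors' estimated equations.

```python
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

# Hypothetical annual data with the dependent expenditure series and two regressors.
df = pd.read_csv("uk_household_energy.csv", index_col="year")

model = UnobservedComponents(
    df["energy_expenditure"],     # e.g. 'transport' or 'housing' expenditure
    level="local linear trend",   # stochastic underlying energy expenditure trend
    exog=df[["income", "price"]], # traditional 'economic factors'
)
result = model.fit(disp=False)
print(result.summary())
```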

    Parameterized Approximation Algorithms for Bidirected Steiner Network Problems

    The Directed Steiner Network (DSN) problem takes as input a directed edge-weighted graph $G=(V,E)$ and a set $\mathcal{D}\subseteq V\times V$ of $k$ demand pairs. The aim is to compute the cheapest network $N\subseteq G$ for which there is an $s\to t$ path for each $(s,t)\in\mathcal{D}$. It is known that this problem is notoriously hard as there is no $k^{1/4-o(1)}$-approximation algorithm under Gap-ETH, even when parametrizing the runtime by $k$ [Dinur & Manurangsi, ITCS 2018]. In light of this, we systematically study several special cases of DSN and determine their parameterized approximability for the parameter $k$. For the bi-DSN$_\text{Planar}$ problem, the aim is to compute a planar optimum solution $N\subseteq G$ in a bidirected graph $G$, i.e., for every edge $uv$ of $G$ the reverse edge $vu$ exists and has the same weight. This problem is a generalization of several well-studied special cases. Our main result is that this problem admits a parameterized approximation scheme (PAS) for $k$. We also prove that our result is tight in the sense that (a) the runtime of our PAS cannot be significantly improved, and (b) it is unlikely that a PAS exists for any generalization of bi-DSN$_\text{Planar}$, unless FPT=W[1]. One important special case of DSN is the Strongly Connected Steiner Subgraph (SCSS) problem, for which the solution network $N\subseteq G$ needs to strongly connect a given set of $k$ terminals. It has been observed before that for SCSS a parameterized $2$-approximation exists when parameterized by $k$ [Chitnis et al., IPEC 2013]. We give a tight inapproximability result by showing that for $k$ no parameterized $(2-\varepsilon)$-approximation algorithm exists under Gap-ETH. Additionally we show that when restricting the input of SCSS to bidirected graphs, the problem remains NP-hard but becomes FPT for $k$.
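    The defining "bidirected" restriction is simple to state in code: every directed edge must have its reverse present with the same weight. The sketch below is just that sanity check, with an arc-to-weight dictionary as an assumed representation; it is unrelated to the approximation schemes themselves.

```python
def is_bidirected(weighted_arcs):
    """weighted_arcs: dict mapping each directed edge (u, v) to its weight.
    Returns True iff every arc has its reverse arc present with equal weight."""
    return all(
        weighted_arcs.get((v, u)) == w for (u, v), w in weighted_arcs.items()
    )

# Example: a bidirected triangle.
arcs = {(0, 1): 2, (1, 0): 2, (1, 2): 5, (2, 1): 5, (0, 2): 1, (2, 0): 1}
assert is_bidirected(arcs)
```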

    Possible Discrimination between Gamma Rays and Hadrons using Cerenkov Photon Timing Measurements

    The Atmospheric Čerenkov Technique is an established methodology to study TeV energy gamma rays. However, the challenging problem has always been the poor signal to noise ratio due to the presence of abundant cosmic rays. Several ingenious techniques have been employed to alleviate this problem, most of which are centred around the Čerenkov image characteristics. However, there are not many techniques available for improving the signal to noise ratio of the data from wavefront sampling observations. One such possible technique is to use the Čerenkov photon arrival times and identify the species dependent characteristics in them. Here we carry out systematic Monte Carlo simulation studies of the timing information of Čerenkov photons at the observation level. We have parameterized the shape of the Čerenkov shower front as well as the pulse shapes in terms of experimentally measurable quantities. We demonstrate the sensitivity of the curvature of the shower front, the pulse shape parameters as well as the photon arrival time jitter to the primary species and show their efficiency in improving the signal to noise ratio. The effect of limiting the Čerenkov telescope opening angle by using a circular focal point mask on the efficacy of the parameters has also been studied for each of the parameters. The radius of the shower front, the pulse decay time and the photon arrival time jitter have been found to be the most promising parameters which could be used to discriminate $\gamma$-ray events from the background. We also find that the efficiency of the first two parameters increases with zenith angle and the efficiency of pulse decay time decreases with increasing altitude of observation.
    Comment: 30 pages, 5 postscript figures, uses elsart.sty; To appear in Astroparticle Physic
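    As a rough illustration of one of the parameters above, the radius of the shower front can be estimated by fitting a spherical front to photon arrival times as a function of core distance. The model, the synthetic data and the use of scipy's curve_fit below are assumptions for illustration, not the simulation or fitting procedure of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

C_LIGHT = 0.29979  # speed of light in m/ns

def spherical_front(r, t0, R):
    """Arrival time (ns) at core distance r (m) for a spherical front of radius R
    centred above the impact point: extra path length is sqrt(R^2 + r^2) - R."""
    return t0 + (np.sqrt(R**2 + r**2) - R) / C_LIGHT

# Synthetic timing data: true radius 9 km, 0.2 ns timing jitter.
r_core = np.linspace(10.0, 250.0, 25)
t_arrival = spherical_front(r_core, t0=5.0, R=9000.0) + np.random.normal(0.0, 0.2, r_core.size)

popt, _ = curve_fit(spherical_front, r_core, t_arrival, p0=[0.0, 5000.0])
print(f"fitted t0 = {popt[0]:.2f} ns, shower front radius R = {popt[1]:.0f} m")
```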

    Preventing Unraveling in Social Networks Gets Harder

    The behavior of users in social networks is often observed to be affected by the actions of their friends. Bhawalkar et al. \cite{bhawalkar-icalp} introduced a formal mathematical model for user engagement in social networks where each individual derives a benefit proportional to the number of its friends which are engaged. Given a threshold degree $k$, the equilibrium for this model is a maximal subgraph whose minimum degree is $\geq k$. However, the dropping out of individuals with degrees less than $k$ might lead to a cascading effect of iterated withdrawals such that the size of the equilibrium subgraph becomes very small. To overcome this, some special vertices called "anchors" are introduced: these vertices need not have large degree. Bhawalkar et al. \cite{bhawalkar-icalp} considered the \textsc{Anchored $k$-Core} problem: Given a graph $G$ and integers $b$, $k$ and $p$, do there exist sets of vertices $B\subseteq H\subseteq V(G)$ such that $|B|\leq b$, $|H|\geq p$ and every vertex $v\in H\setminus B$ has degree at least $k$ in the induced subgraph $G[H]$? They showed that the problem is NP-hard for $k\geq 2$ and gave some inapproximability and fixed-parameter intractability results. In this paper we give improved hardness results for this problem. In particular we show that the \textsc{Anchored $k$-Core} problem is W[1]-hard parameterized by $p$, even for $k=3$. This improves the result of Bhawalkar et al. \cite{bhawalkar-icalp} (who show W[2]-hardness parameterized by $b$) as our parameter is always bigger since $p\geq b$. Then we answer a question of Bhawalkar et al. \cite{bhawalkar-icalp} by showing that the \textsc{Anchored $k$-Core} problem remains NP-hard on planar graphs for all $k\geq 3$, even if the maximum degree of the graph is $k+2$. Finally we show that the problem is FPT on planar graphs parameterized by $b$ for all $k\geq 7$.
    Comment: To appear in AAAI 201
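    For a fixed anchor set B, the resulting equilibrium is easy to compute by the usual peeling argument: repeatedly delete non-anchor vertices whose remaining degree is below k. The sketch below does exactly that (graph representation assumed); the hard part, as the abstract shows, is choosing the anchors in the first place.

```python
def anchored_k_core(adj, k, anchors):
    """adj: dict mapping each vertex to a set of neighbours (undirected graph).
    Returns the maximum vertex set H containing the anchors in which every
    non-anchor vertex has degree at least k inside H."""
    alive = set(adj)
    changed = True
    while changed:
        changed = False
        for v in list(alive):
            if v in anchors:
                continue  # anchors are never deleted
            if sum(1 for u in adj[v] if u in alive) < k:
                alive.remove(v)
                changed = True
    return alive
```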

    Parameterized Streaming Algorithms for Vertex Cover

    As graphs continue to grow in size, we seek ways to effectively process such data at scale. The model of streaming graph processing, in which a compact summary is maintained as each edge insertion/deletion is observed, is an attractive one. However, few results are known for optimization problems over such dynamic graph streams. In this paper, we introduce a new approach to handling graph streams, by instead seeking solutions for the parameterized versions of these problems, where we are given a parameter $k$ and the objective is to decide whether there is a solution bounded by $k$. By combining kernelization techniques with randomized sketch structures, we obtain the first streaming algorithms for the parameterized versions of the Vertex Cover problem. We consider the following three models for a graph stream on $n$ nodes:
    1. The insertion-only model, where edges can only be added.
    2. The dynamic model, where edges can be both inserted and deleted.
    3. The \emph{promised} dynamic model, where we are guaranteed that at each timestamp there is a solution of size at most $k$.
    In each of these three models we are able to design parameterized streaming algorithms for the Vertex Cover problem. We are also able to show matching lower bounds for the space complexity of our algorithms. (Due to the arXiv limit of 1920 characters for the abstract field, please see the abstract in the paper for a detailed description of our results.)
    Comment: Fixed some typo
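    For reference, the parameterized question being streamed is the classic offline one below: decide whether a graph has a vertex cover of size at most k. This textbook bounded search tree (branch on either endpoint of an uncovered edge, O(2^k) leaves) only illustrates the parameterized objective; it is not the kernelization-plus-sketching streaming algorithm of the paper.

```python
def has_vertex_cover(edges, k):
    """edges: list of (u, v) pairs. Returns True iff at most k vertices
    suffice to touch every edge."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = edges[0]
    # One of the two endpoints of an uncovered edge must be in the cover.
    return (has_vertex_cover([e for e in edges if u not in e], k - 1)
            or has_vertex_cover([e for e in edges if v not in e], k - 1))
```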