
    Connectivity Oracles for Graphs Subject to Vertex Failures

    We introduce new data structures for answering connectivity queries in graphs subject to batched vertex failures. A deterministic structure processes a batch of $d \le d_\star$ failed vertices in $\tilde{O}(d^3)$ time and thereafter answers connectivity queries in $O(d)$ time. It occupies space $O(d_\star m \log n)$. We develop a randomized Monte Carlo version of our data structure with update time $\tilde{O}(d^2)$, query time $O(d)$, and space $\tilde{O}(m)$ for any failure bound $d \le n$. This is the first connectivity oracle for general graphs that can efficiently deal with an unbounded number of vertex failures. We also develop a more efficient Monte Carlo edge-failure connectivity oracle. Using space $O(n \log^2 n)$, $d$ edge failures are processed in $O(d \log d \log\log n)$ time and thereafter connectivity queries are answered in $O(\log\log n)$ time; answers are correct w.h.p. Our data structures are based on a new decomposition theorem for an undirected graph $G=(V,E)$, which is of independent interest. It states that for any terminal set $U \subseteq V$ we can remove a set $B$ of $|U|/(s-2)$ vertices such that the remaining graph contains a Steiner forest for $U - B$ with maximum degree $s$.
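    To make the query semantics concrete, here is a minimal baseline, not the paper's oracle: recompute connectivity in $G - F$ with a search that refuses to enter failed vertices. This costs $O(n+m)$ per query, which is exactly what the preprocessing above avoids; the function name and adjacency-map representation are illustrative assumptions.

```python
from collections import deque

def connected_avoiding(adj, failed, u, v):
    """Is u connected to v in G - F? Naive O(n + m) baseline.

    `adj` maps each vertex to a list of neighbors; `failed` is the
    batch F of failed vertices. The oracle in the abstract answers
    the same question in O(d) time after O~(d^3) batch processing.
    """
    if u in failed or v in failed:
        return False
    seen, queue = {u}, deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return True
        for y in adj[x]:
            if y not in failed and y not in seen:
                seen.add(y)
                queue.append(y)
    return False

# A path 0 - 1 - 2 is disconnected once the middle vertex fails.
print(connected_avoiding({0: [1], 1: [0, 2], 2: [1]}, {1}, 0, 2))  # False
```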

    Can We Say More Now? A Closer Look at Online Public Opinion Change in China

    This study examined the pattern of online public opinion change in China by investigating the top-hit blog post of each day, together with its comments, from July 2009 to March 2012 on a prominent Chinese website, and then discussed potential factors that affected the formation of online public opinion. The extent of freedom of online public opinion during this period showed regular fluctuations. Whether commentators registered criticism was influenced by four factors. First and most important, a negative tone in blog posts increased criticism and a positive tone decreased it, showing that, consistent with two-step flow theory, news flowing from the media to the public is amplified and interpreted by influential bloggers. Second, and comparable in importance to the first factor, international news events decreased criticism because the public strongly supported the Chinese government, whereas national and local events had no effect. Third, a negative tone in the events discussed in blogs increased criticism, which means that the mass media did have some direct influence through negative, but not positive, events. Fourth, when the government censored blogs and comments, the public shied away from criticism because their posts would probably be removed.

    Evaluating the Impacts of Antidepressant Use on the Risk of Dementia

    Dementia is a clinical syndrome caused by neurodegeneration or cerebrovascular injury. Patients with dementia suffer from deterioration in memory, thinking, behavior and the ability to perform everyday activities. Since there are no cures or disease-modifying therapies for dementia, there is much interest in identifying modifiable risk factors that may help prevent or slow the progression of cognitive decline. Medications are a common focus of this type of research. Importantly, according to a report from the Centers for Disease Control and Prevention (CDC), 19.1% of the population aged 60 and over reported taking antidepressants during 2011-2014, and this figure has been increasing. However, antidepressant use among the elderly may be concerning because of potentially harmful effects on cognition. To assess the impacts of antidepressants on the risk of dementia, we conducted three consecutive projects.

    In the first project, a retrospective cohort study using a marginal structural Cox proportional hazards regression model with inverse probability weighting (IPW) was conducted to evaluate the average causal effects of different classes of antidepressants on the risk of dementia. Potential causal effects of selective serotonin reuptake inhibitors (SSRIs), serotonin and norepinephrine reuptake inhibitors (SNRIs), atypical antidepressants (AAs) and tricyclic antidepressants (TCAs) on the risk of dementia were observed at the 0.05 significance level. Multiple sensitivity analyses supported these findings.

    Unmeasured confounding is a threat to the validity of causal inference methods. In evaluating the effects of antidepressants, it is important to consider how common comorbidities of depression, such as sleep disorders, may affect both the exposure to antidepressants and the onset of cognitive impairment. In this dissertation, sleep apnea and rapid-eye-movement behavior disorder (RBD) were unmeasured, and thus uncontrolled, confounders for the association between antidepressant use and the risk of dementia. In the second project, a bias factor formula for two binary unmeasured confounders was derived in order to account for these variables. A Monte Carlo analysis was implemented to estimate the distribution of the bias factor for each class of antidepressant, and the effects of antidepressants on the risk of dementia adjusted for both measured and unmeasured confounders were estimated. Sleep apnea and RBD attenuated the effect estimates for SSRIs, SNRIs and AAs on the risk of dementia.

    In the third project, to account for potential time-varying confounding and observed time-varying treatment, a multi-state Markov chain with three transient states (normal cognition, mild cognitive impairment (MCI), and impaired but not MCI) and two absorbing states (dementia and death) was used to estimate the probabilities of moving among these finite, mutually exclusive cognitive states. This analysis also allowed participants to recover from mild impairments (i.e., MCI, impaired but not MCI) to normal cognition, and accounted for the competing risk of death prior to dementia. These findings supported the results of the main analysis in the first project.
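    As a rough illustration of the first project's approach, the sketch below fits an IPW-weighted Cox model for a single baseline exposure. This is a deliberate simplification of the time-varying marginal structural model the dissertation describes, and the function and column names are hypothetical.

```python
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def ipw_cox(df, exposure, duration, event, confounders):
    """Point-exposure sketch of an IPW-weighted Cox model.

    `df` is a pandas DataFrame with a 0/1 exposure column. Fit a
    propensity model P(exposure | confounders), form stabilized
    inverse-probability weights, then fit a weighted Cox model with
    a robust (sandwich) variance.
    """
    ps_model = LogisticRegression(max_iter=1000).fit(df[confounders], df[exposure])
    ps = ps_model.predict_proba(df[confounders])[:, 1]
    p_exposed = df[exposure].mean()
    # Stabilized weights: marginal exposure probability over the propensity score.
    w = df[exposure] * p_exposed / ps + (1 - df[exposure]) * (1 - p_exposed) / (1 - ps)
    data = df[[duration, event, exposure]].copy()
    data["ipw"] = w
    cph = CoxPHFitter()
    cph.fit(data, duration_col=duration, event_col=event,
            weights_col="ipw", robust=True)
    return cph
```

    Stabilized weights keep the pseudo-population roughly the size of the sample and tame extreme weights, which is why they are preferred over unstabilized ones in practice.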

    The nonparametric analysis of interval-censored failure time data

    By interval-censored failure time data, we mean that the failure time of interest is observed only to belong to some window or interval, instead of being known exactly. One would obtain an interval-censored observation for a survival event if a subject had not experienced the event at one follow-up time but had experienced it by the next follow-up time. Interval-censored data include right-censored data (Kalbfleisch and Prentice, 2002) as a special case. Nonparametric comparison of survival functions is one of the main tasks in failure time studies such as clinical trials. For interval-censored failure time data, a few nonparametric test procedures have been developed. However, the existing nonparametric tests impose strict restrictions, and practical demands call for new tests. This dissertation consists of four parts. In the first part, we propose a new class of test procedures whose asymptotic distributions are established under both the null and alternative hypotheses, since none of the existing test procedures can be used if one intends to perform power or sample size calculations under the alternative hypothesis. Numerical results from a simulation study assess the finite-sample performance of the proposed test procedure. We also applied the proposed method to a real data set arising from an AIDS clinical trial concerning the opportunistic infection cytomegalovirus (CMV). The second part of this dissertation focuses on nonparametric tests for interval-censored data with unequal censoring. One common drawback or restriction of the nonparametric test procedures in the literature is that they apply only to situations where the observation processes follow the same distribution across treatment groups. To remove this restriction, a test procedure is proposed that takes into account the difference between the distributions of the censoring variables. The asymptotic distribution of the test statistic is also established.
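    The definition in the first sentence is easy to mechanize. The sketch below (illustrative names, assumed visit schedule) converts an exact event time and a follow-up schedule into the interval actually observed, with right-censoring falling out as the special case $R = \infty$.

```python
def interval_censor(visit_times, event_time):
    """Turn an exact failure time into an interval-censored observation.

    Returns (L, R] such that the event is known to lie in that window:
    L is the last visit at which the event had not yet occurred and R
    is the first visit at which it had. If the event falls after the
    last visit, the observation is right-censored: (last visit, inf).
    """
    left = 0.0
    for t in sorted(visit_times):
        if event_time <= t:
            return (left, t)        # event observed within (left, t]
        left = t
    return (left, float("inf"))     # right-censoring as a special case

# Example: visits at 3, 6, 9 months; true event at 7.5 months -> (6, 9].
print(interval_censor([3, 6, 9], 7.5))
```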

    Vector and Spinor Decomposition of SU(2) Gauge Potential, their Equivalence and Knot Structure in SU(2) Chern-Simons Theory

    In this paper, the spinor and vector decompositions of the SU(2) gauge potential are presented and their equivalence is established via a simple proposal. We also obtain the action of the Faddeev nonlinear O(3) sigma model from SU(2) massive gauge field theory, which is proposed according to the gauge-invariance principle. Finally, the knot structure in SU(2) Chern-Simons field theory is discussed in terms of the $\phi$-mapping topological current theory. The topological charge of the knot is characterized by the Hopf indices and the Brouwer degrees of the $\phi$-mapping.
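    For reference, the Faddeev nonlinear O(3) sigma model mentioned above is usually written in the following standard form (quoted from the general literature rather than from this paper; normalizations and signs vary by convention):

```latex
% Faddeev model: unit vector field \vec n with \vec n \cdot \vec n = 1.
\mathcal{L} = m^2\, \partial_\mu \vec n \cdot \partial^\mu \vec n
            - \frac{1}{e^2}\, H_{\mu\nu} H^{\mu\nu},
\qquad
H_{\mu\nu} = \vec n \cdot \left( \partial_\mu \vec n \times \partial_\nu \vec n \right)
```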

    An Improved Algorithm for Incremental DFS Tree in Undirected Graphs

    Depth first search (DFS) tree is one of the most well-known data structures for designing efficient graph algorithms. Given an undirected graph $G=(V,E)$ with $n$ vertices and $m$ edges, the textbook algorithm takes $O(n+m)$ time to construct a DFS tree. In this paper, we study the problem of maintaining a DFS tree when the graph is undergoing incremental updates. Formally, we show: given an arbitrary online sequence of edge or vertex insertions, there is an algorithm that reports a DFS tree in $O(n)$ worst case time per operation, and requires $O(\min\{m \log n, n^2\})$ preprocessing time. Our result improves the previous $O(n \log^3 n)$ worst case update time algorithm by Baswana et al. and the $O(n \log n)$ time by Nakamura and Sadakane, and matches the trivial $\Omega(n)$ lower bound when it is required to explicitly output a DFS tree. Our result builds on the framework introduced in the breakthrough work by Baswana et al., together with a novel use of a tree-partition lemma by Duan and Zhan, and the celebrated fractional cascading technique by Chazelle and Guibas.
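    For contrast with the incremental setting, here is a minimal sketch of the static textbook $O(n+m)$ construction the abstract mentions; the adjacency-map representation is an assumption.

```python
def dfs_tree(adj, root):
    """Textbook O(n + m) construction of a DFS tree.

    `adj` maps each vertex to its neighbor list. Returns a parent map
    with parent[root] = None. An explicit stack of neighbor iterators
    avoids recursion-depth limits on large graphs.
    """
    parent = {root: None}
    stack = [(root, iter(adj[root]))]
    while stack:
        v, it = stack[-1]
        advanced = False
        for w in it:                      # resume v's neighbor scan
            if w not in parent:
                parent[w] = v             # tree edge v -> w
                stack.append((w, iter(adj[w])))
                advanced = True
                break
        if not advanced:                  # v fully explored
            stack.pop()
    return parent

# Example: star rooted at 0 yields parent pointers back to 0.
print(dfs_tree({0: [1, 2], 1: [0], 2: [0]}, 0))
```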

    Approximating All-Pair Bounded-Leg Shortest Path and APSP-AF in Truly-Subcubic Time

    In the bounded-leg shortest path (BLSP) problem, we are given a weighted graph $G$ with nonnegative edge lengths, and we want to answer queries of the form "what is the shortest path from $u$ to $v$ when only edges of length $\le L$ are considered?". In the all-pairs shortest path for all flows (APSP-AF) problem, the queries instead restrict attention to edges of capacity $\ge f$. In this article we give an $\tilde{O}(n^{(\omega+3)/2}\epsilon^{-3/2}\log W)$ time algorithm to compute a data structure that answers APSP-AF queries in $O(\log(\epsilon^{-1}\log (nW)))$ time and achieves a $(1+\epsilon)$-approximation, where $\omega < 2.373$ is the exponent of matrix multiplication, $W$ is the upper bound on the integer edge lengths, and $n$ is the number of vertices. This is the first truly-subcubic time algorithm for these problems on dense graphs. Our algorithm utilizes the $O(n^{(\omega+3)/2})$ time max-min product algorithm [Duan and Pettie 2009]. Since the all-pair bottleneck path (APBP) problem, which is equivalent to max-min product, can be seen as all-pair reachability for all flows, our approach indeed shows that these problems are almost equivalent in the approximation sense.
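    The max-min product at the heart of this approach is simple to state. Below is the naive cubic-time definition (illustrative Python, not the cited subcubic algorithm), which the $O(n^{(\omega+3)/2})$ algorithm of Duan and Pettie accelerates.

```python
def max_min_product(A, B):
    """Naive O(n^3) max-min product: C[i][j] = max_k min(A[i][k], B[k][j]).

    This is the algebraic primitive behind all-pair bottleneck paths:
    C[i][j] is the widest bottleneck over all two-hop routes i -> k -> j.
    """
    n = len(A)
    return [[max(min(A[i][k], B[k][j]) for k in range(n))
             for j in range(n)]
            for i in range(n)]

# Example: squaring a bottleneck matrix improves the 0->2 bottleneck via 1.
M = [[9, 5, 0],
     [0, 9, 4],
     [0, 0, 9]]
print(max_min_product(M, M)[0][2])  # 4 = min(5, 4) along 0 -> 1 -> 2
```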