
    Exacting Eccentricity for Small-World Networks

    © 2018 IEEE. This paper studies the efficiency of computing the exact eccentricity distribution of a small-world network. The eccentricity distribution reflects the importance of each node in a graph, which is beneficial for graph analysis. Moreover, it is key to computing two fundamental graph characteristics: diameter and radius. Existing eccentricity computation algorithms, however, are either inefficient in handling the large-scale networks emerging in practice nowadays or approximate algorithms that are inappropriate for small-world networks. We propose an efficient approach for exact eccentricity computation. Our approach is based on a plethora of insights into the bottlenecks of existing algorithms: one-node eccentricity computation and the upper/lower bound update. Extensive experiments demonstrate that our approach outperforms the state of the art by up to three orders of magnitude on real large small-world networks.
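
    The bound-update bottleneck named in the abstract is easiest to see in the classic bound-refinement scheme this line of work builds on. Below is a minimal Python sketch of that generic scheme (in the style of Takes and Kosters), not the paper's own algorithm; the graph is assumed to be a connected, unweighted adjacency-list dict.

        from collections import deque

        def bfs_distances(adj, src):
            # Unweighted single-source shortest distances by BFS.
            dist = {src: 0}
            queue = deque([src])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            return dist

        def all_eccentricities(adj):
            # Exact eccentricities with triangle-inequality bound pruning:
            # each BFS settles its root exactly and tightens every other
            # node's bounds, so many nodes never need their own BFS.
            lo = {v: 0 for v in adj}
            hi = {v: float("inf") for v in adj}
            ecc, todo = {}, set(adj)
            while todo:
                u = max(todo, key=lambda v: hi[v])  # simple selection heuristic
                dist = bfs_distances(adj, u)
                ecc[u] = max(dist.values())
                todo.discard(u)
                for v in list(todo):
                    lo[v] = max(lo[v], dist[v], ecc[u] - dist[v])
                    hi[v] = min(hi[v], ecc[u] + dist[v])
                    if lo[v] == hi[v]:  # bounds met: v is settled for free
                        ecc[v] = lo[v]
                        todo.discard(v)
            return ecc

    The one-node computation is the BFS; the bound update is the inner loop. Both are exactly the hot spots the paper targets.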

    A Mini Review of Node Centrality Metrics in Biological Networks


    Measuring Graphs with Shortest Distances

    University of Technology Sydney, Faculty of Engineering and Information Technology.

    Shortest distances characterize the pairwise relationships among nodes in a graph. Given a graph with a node set and an edge set, the shortest distance between two nodes is defined as the minimum path length between them. Computing the shortest distance between two nodes is a fundamental graph operation, usable both as a primary function and as a building block for applications. Given the important roles of shortest distances, this thesis focuses on the efficient computation of shortest-distance-based measures on graphs.

    Firstly, we investigate how to accelerate the index time of 2-hop labeling. 2-hop labeling approaches are widely adopted to speed up the online performance of shortest-distance queries. The construction of the 2-hop labeling, however, can be exhaustive, especially on big graphs. For a major category of large graphs, small-world networks, the state-of-the-art approach is Pruned Landmark Labeling (PLL). However, PLL's strong sequential nature hinders it from being parallelized. This becomes an urgent issue on massive small-world networks whose index can hardly be constructed by a single thread within a reasonable time. Based on a dependency analysis of PLL, we propose a Parallelized Shortest-distance Labeling (PSL) scheme that exploits parallelism to shorten the index time (a single-threaded sketch of PLL follows this abstract).

    Secondly, we study how to reduce the index size of 2-hop labeling. While the index time can be shortened by parallelized labeling, the index size becomes the bottleneck for a massive real graph with a relatively large treewidth: the 2-hop labeling can hardly be constructed due to the oversized index. We disclose the theoretical relationships between the graph treewidth and 2-hop labeling's index size and query time. To scale up distance labeling, we propose the Core-Tree (CT) Index, which dramatically reduces the index size, enabling the CT-Index to handle massive graphs that no existing approach can process.

    Thirdly, we compute and maintain the eccentricities of all nodes. Given a graph, eccentricity measures the shortest distance from each node to its farthest node. Existing eccentricity computation algorithms are not scalable enough to handle real large networks. Our solution optimizes existing eccentricity computation algorithms at their bottlenecks, one-node eccentricity computation and the upper/lower bound update, based on a line of original insights; it also provides the first algorithm for maintaining the eccentricities of a dynamic graph without recomputing the eccentricity distribution upon each edge update. Extensive empirical studies validate the efficiency of our techniques.
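
    To make the 2-hop labeling idea concrete, here is a minimal single-threaded Python sketch of pruned landmark labeling and the resulting distance query. It illustrates the generic scheme that PSL parallelizes, not the thesis's implementation; the degree-descending hub order is a common heuristic and an assumption here.

        from collections import deque

        def build_pll_index(adj):
            # Pruned Landmark Labeling: BFS from each hub in priority order,
            # pruning nodes whose distance is already answered by labels of
            # earlier, higher-priority hubs.
            labels = {v: {} for v in adj}  # v -> {hub: distance}

            def query(u, v):
                common = labels[u].keys() & labels[v].keys()
                return min((labels[u][h] + labels[v][h] for h in common),
                           default=float("inf"))

            order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
            for hub in order:
                dist = {hub: 0}
                queue = deque([hub])
                while queue:
                    u = queue.popleft()
                    if query(hub, u) <= dist[u]:
                        continue  # pruned: earlier hubs already cover this pair
                    labels[u][hub] = dist[u]
                    for w in adj[u]:
                        if w not in dist:
                            dist[w] = dist[u] + 1
                            queue.append(w)
            return labels, query

    A query is then just a merge of two small label sets: labels, query = build_pll_index(adj); query(s, t). The pruning check is what makes the BFS order-dependent, which is precisely the sequential dependency the thesis analyzes.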

    Hyperbolicity Computation through Dominating Sets

    Hyperbolicity is a graph parameter related to how much a graph resembles a tree with respect to distances. Its computation is challenging: the main approaches consist in scanning all quadruples of the graph or in using fast matrix multiplication as a building block, neither of which is practical for large graphs. In this paper, we propose and evaluate an approach that uses a hierarchy of distance-k dominating sets to reduce the search space. Compared to the previous best practical algorithms, this technique enables us to compute the hyperbolicity of graphs of unprecedented size (up to a million nodes), speeds up the computation on previously attainable graphs by up to three orders of magnitude, and reduces memory consumption by up to more than a factor of 23.
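
    For reference, the quadruple-scanning baseline that the dominating-set hierarchy prunes is the Gromov four-point condition, as in this brute-force O(n^4) Python sketch; d is assumed to be a precomputed all-pairs distance table (e.g. a dict of BFS distance dicts).

        from itertools import combinations

        def hyperbolicity_naive(nodes, d):
            # Four-point condition: for each quadruple, form the three
            # pairwise distance sums; delta is half the gap between the
            # two largest. Hyperbolicity is the maximum over quadruples.
            delta = 0.0
            for u, v, w, x in combinations(nodes, 4):
                s = sorted((d[u][v] + d[w][x],
                            d[u][w] + d[v][x],
                            d[u][x] + d[v][w]), reverse=True)
                delta = max(delta, (s[0] - s[1]) / 2)
            return delta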

    Enumeration of far-apart pairs by decreasing distance for faster hyperbolicity computation

    Hyperbolicity is a graph parameter that indicates how much the shortest-path distance metric of a graph deviates from a tree metric. It is used in various fields such as networking, security, and bioinformatics for the classification of complex networks, the design of routing schemes, and the analysis of graph algorithms. Despite recent progress, computing the hyperbolicity of a graph remains challenging. Indeed, the best known algorithm has time complexity O(n^{3.69}), which is prohibitive for large graphs, and the most efficient algorithms in practice have space complexity O(n^2). Thus, both time and space are bottlenecks for computing the hyperbolicity. In this paper, we design a tool for enumerating all far-apart pairs of a graph by decreasing distance. A node pair (u, v) of a graph is far-apart if v is a leaf of every shortest-path tree rooted at u and u is a leaf of every shortest-path tree rooted at v. This notion was previously used to drastically reduce the computation time for hyperbolicity in practice. However, it required computing the distance matrix to sort all pairs of nodes by decreasing distance, which already requires an infeasible amount of memory for medium-sized graphs. We present a new data structure that avoids this memory bottleneck in practice and for the first time enables computing the hyperbolicity of several large graphs that were far out of reach using previous algorithms. For some instances, we reduce the memory consumption by at least two orders of magnitude. Furthermore, we show that for many graphs, only a very small fraction of far-apart pairs have to be considered for the hyperbolicity computation, explaining this drastic reduction of memory. As iterating over far-apart pairs in decreasing order without storing them explicitly is a very general tool, we believe that our approach may also be relevant to other problems.
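
    The far-apart definition reduces to a simple local test on BFS distances: v is a leaf of every shortest-path tree rooted at u exactly when no neighbor of v lies strictly farther from u than v does. A minimal Python sketch of this test, assuming precomputed BFS distance dicts (not the paper's enumeration data structure):

        def leaf_in_all_sp_trees(adj, dist_u, v):
            # If some neighbor w of v had dist_u[w] == dist_u[v] + 1, then v
            # could serve as w's parent in a shortest-path tree rooted at u,
            # making v an internal node of that tree.
            return all(dist_u[w] <= dist_u[v] for w in adj[v])

        def is_far_apart(adj, dist_u, dist_v, u, v):
            # dist_u / dist_v: BFS distance dicts from u and from v.
            return (leaf_in_all_sp_trees(adj, dist_u, v) and
                    leaf_in_all_sp_trees(adj, dist_v, u))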

    A bibliography (with abstracts) on gas-lubricated bearings: interim report

    Gas-lubricated bearings: annotated bibliography.

    Contributions to Statistical Image Analysis for High Content Screening.

    Images of cells incubated with fluorescent small-molecule probes can be used to infer where the compounds distribute within cells. Identifying the spatial pattern of compound localization within each cell is a very important problem for which adequate statistical methods do not yet exist.

    First, we asked whether a classifier for subcellular localization categories can be developed based on a training set of manually classified cells. Given the challenges of the images, such as uneven field illumination, low resolution, high noise, variation in intensity and contrast, and cell-to-cell variability in probe distributions, we constructed texture features from contrast quantiles conditioned on intensities; classification of artificial cells with the same marginal distribution but different conditional distributions supported that this conditioning approach helps distinguish different localization distributions. Using these conditional features, we obtained satisfactory performance in image classification and performed dimension reduction and data visualization.

    As high-content images are subject to several major forms of artifacts, we are interested in the implications of measurement errors and artifacts for our ability to draw scientifically meaningful conclusions from high-content images. Specifically, we considered three forms of artifacts: saturation, blurring, and additive noise. For each type of artifact, we artificially introduced larger amounts and aimed to understand the bias via the Simulation Extrapolation (SIMEX) method (a generic sketch follows this abstract), applied to the measurement errors for pairwise centroid distances, the degree of eccentricity in the class-specific distributions, and the angles between the dominant axes of variability for different categories.

    Finally, we briefly considered the analysis of time-point images, where small-molecule studies will be more focused. Specifically, we considered the evolving patterns of subcellular staining from the moment a compound is introduced into the cell culture medium to the point that a steady-state distribution is reached. We constructed the degree to which the subcellular staining pattern is concentrated in or near the nucleus as the feature of the time-course data set, and aimed to determine whether different compounds accumulate in different regions at different times, as characterized by their position in the cell relative to the nucleus.

    Ph.D. Statistics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/91460/1/liufy_1.pd
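
    For readers unfamiliar with SIMEX, here is a generic, minimal Python sketch of the simulation-extrapolation recipe (add extra noise at increasing levels, fit the trend, extrapolate back to the no-noise case). It is not the thesis's pipeline; the quadratic extrapolant, noise levels, and numpy-array input are assumptions.

        import numpy as np

        def simex(estimate, x_obs, noise_sd, lambdas=(0.5, 1.0, 1.5, 2.0),
                  reps=100, seed=0):
            # estimate: function mapping a data array to a scalar statistic.
            # Extra Gaussian noise with variance lam * noise_sd**2 is added,
            # the mean statistic is fit as a quadratic in lam, and the fit
            # is evaluated at lam = -1 to approximate the noise-free value.
            rng = np.random.default_rng(seed)
            lams = [0.0] + list(lambdas)
            means = [estimate(x_obs)]
            for lam in lambdas:
                sims = [estimate(x_obs + rng.normal(0.0,
                                                    noise_sd * np.sqrt(lam),
                                                    size=x_obs.shape))
                        for _ in range(reps)]
                means.append(float(np.mean(sims)))
            coeffs = np.polyfit(lams, means, deg=2)
            return float(np.polyval(coeffs, -1.0))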

    Self-neglect and adult safeguarding: findings from research

    This report was commissioned by the Department of Health (DH) and examines the concept of self-neglect. The relationship between self-neglect and safeguarding in the UK is a difficult one, partly because the current definition of abuse specifies harmful actions by someone other than the individual at risk. Safeguarding Adults Boards' policies and procedures commonly contain no reference to self-neglect; occasionally they explicitly exclude it or set criteria for its inclusion. The perceptions of people who neglect themselves have not been extensively researched, but where they have, emerging themes are pride in self-sufficiency, connectedness to place and possessions, and behaviour that attempts to preserve continuity of identity and control. Traumatic histories and life-changing effects are also present in individuals' own accounts of their situation. Self-neglect is reported mainly as occurring in older people, although it is also associated with mental ill health. Differentiation between inability and unwillingness to care for oneself, and capacity to understand the consequences of one's actions, are crucial determinants of response. Professional tolerance of self-neglect as a lifestyle choice is higher than when it accompanies physical or mental impairment. Professionals express uncertainty about causation and intervention.

    The Midwestern Aristocracy: Anders Zorn's Portraits in Gilded Age St. Louis

    To the American aristocracy of the Gilded Age, painted portraits functioned as pictorial symbols of one's taste, power, and status. This thesis evaluates the motivations of a provincial elite in St. Louis, Missouri, and sees their taste for portraits by the Swedish artist Anders Zorn as the result of the intersection of myriad cultural and ethnic allegiances. Situating Zorn as a trans-Atlantic artist, this thesis functions as a patronage study, evaluating the portraits and goals of specific St. Louis patrons and analyzing Zorn's role as an active agent in the art market, leveraging his public persona to establish aesthetic authority over his patrons. The first section of this thesis evaluates the nuances of conspicuous consumption, gender roles, politics, and ethnicity that undergirded Gilded Age St. Louis. The second section is a formal and contextual analysis of Zorn's portraits of Adolphus Busch, Lilly Eberhard Anheuser, and Halsey Cooley Ives, reflecting these contemporary St. Louis realities. It examines the complicated concept of "American-ness" negotiated by the city's upwardly mobile German American art patrons, the elite's efforts to establish European and Aesthetic art through the School and Museum of Fine Arts at Washington University, and how art contributed to setting and policing boundaries of status and taste in the city. The final section discusses the circumstances around Anders Zorn's lawsuit against the St. Louis millionaire and patron Henry Clay Pierce, analyzing competing Gilded Age conceptions of a portrait's purpose. Zorn's dispute with Pierce soon became public, with the conflict drawing in many local artists and revealing the industrial-age tension between the economic control wielded by the patron and the aesthetic authority granted to the socially entrenched artist.

    Advisor: Wendy Kat
