
    Measurements of eye lens doses in interventional cardiology using OSL and electronic dosemeters

    The purpose of this paper is to test the suitability of OSL and electronic dosemeters for estimating eye lens doses in an interventional cardiology environment. Using TLDs as reference detectors, personal dose equivalent was measured in phantoms and during clinical procedures. For phantom measurements, OSL dose values showed an average difference of 215% versus TLD. Tests carried out with other electronic dosemeters revealed differences of up to ±20% versus TLD. With dosemeters positioned outside the goggles and when TLD doses were above 20 µSv, the average difference between OSL and TLD was 29%. Eye lens doses of almost 700 µSv per procedure were measured in two cases out of a sample of 33 individual clinical procedures, showing the risk of high exposure to the lens of the eye when protection rules are not followed. The differences found between OSL and TLD are acceptable for the purpose and range of doses measured in the survey.
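    The 215% and 29% figures above are average relative differences against the TLD reference. A minimal sketch of how such an average relative difference can be computed; the dose values here are hypothetical, not taken from the study:

```python
# Hypothetical paired readings from a test dosemeter (OSL) and the TLD reference.
osl_doses = [95.0, 410.0, 36.0, 120.0]   # hypothetical OSL readings (uSv)
tld_doses = [30.0, 130.0, 11.0, 40.0]    # hypothetical TLD reference readings (uSv)

# Relative difference of each OSL reading with respect to its TLD reference.
rel_diff = [100.0 * (o - t) / t for o, t in zip(osl_doses, tld_doses)]
print(f"average OSL vs. TLD difference: {sum(rel_diff) / len(rel_diff):.0f}%")
```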

    Size reduction of complex networks preserving modularity

    The ubiquity of modular structure in real-world complex networks has become the focus of many attempts to understand the interplay between network topology and functionality. The best approaches to identifying modular structure are based on the optimization of a quality function known as modularity. However, this optimization is a hard task because the problem is NP-hard. Here we propose an exact method for reducing the size of weighted (directed and undirected) complex networks while keeping their modularity invariant. This size reduction allows heuristic algorithms that optimize modularity to explore the modularity landscape more thoroughly. We compare the modularity obtained in several real complex networks using the Extremal Optimization algorithm before and after the size reduction, showing the improvement obtained. We speculate that the proposed analytical size reduction could be extended to an exact coarse graining of the network in the scope of real-space renormalization.
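    The quantity kept invariant by the reduction is the modularity of a weighted network. A minimal sketch of that quantity, assuming the usual directed generalisation with node strengths in place of degrees; the paper's reduction procedure itself is not reproduced here:

```python
import numpy as np

def directed_weighted_modularity(W, communities):
    """Modularity Q of a weighted, directed network.

    W           -- adjacency matrix, W[i][j] = weight of edge i -> j
    communities -- communities[i] = community label of node i

    Uses the standard directed generalisation
    Q = (1/w) * sum_ij (W_ij - s_out_i * s_in_j / w) * delta(c_i, c_j).
    """
    W = np.asarray(W, dtype=float)
    w = W.sum()                       # total edge weight
    s_out = W.sum(axis=1)             # out-strength of each node
    s_in = W.sum(axis=0)              # in-strength of each node
    c = np.asarray(communities)
    same = c[:, None] == c[None, :]   # delta(c_i, c_j)
    return ((W - np.outer(s_out, s_in) / w) * same).sum() / w

# Toy example: two 2-node groups with heavier internal weights.
W = [[0, 3, 0, 1],
     [3, 0, 1, 0],
     [0, 1, 0, 4],
     [1, 0, 4, 0]]
print(directed_weighted_modularity(W, [0, 0, 1, 1]))
```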

    Enhance the Efficiency of Heuristic Algorithm for Maximizing Modularity Q

    Modularity Q is an important function for identifying community structure in complex networks. In this paper, we prove that the modularity maximization problem is equivalent to a nonconvex quadratic programming problem. This result provides a simple way to improve the efficiency of heuristic algorithms for maximizing modularity Q. Numerical results demonstrate that the approach is very effective.
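    The paper's exact quadratic-programming formulation is not reproduced in the abstract; as a reference point, Newman's standard two-group form already makes the quadratic (and, because the matrix is indefinite, nonconvex) structure of modularity maximization explicit:

```latex
% Newman's two-group formulation; B is the modularity matrix, A the
% adjacency matrix, k_i the node degrees, m the number of edges, and
% s_i = +1 or -1 encodes the group assignment of node i.
\[
  Q \;=\; \frac{1}{4m}\,\mathbf{s}^{\mathsf T} B\,\mathbf{s},
  \qquad
  B_{ij} \;=\; A_{ij} - \frac{k_i k_j}{2m},
  \qquad
  s_i \in \{-1, +1\}.
\]
% Maximizing Q over s (or its continuous relaxation) is a quadratic
% program whose matrix B is in general indefinite, hence nonconvex.
```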

    A Political Winner’s Curse: Why Preventive Policies Pass Parliament so Narrowly

    Preventive policy measures such as bailouts often pass parliament very narrowly. We present a model of asymmetric information between politicians and voters which rationalizes this narrow parliamentary outcome. A successful preventive policy impedes the verification of its own necessity. When policy intervention is necessary but voters disagree ex ante, individual politicians have an incentive to lose the vote in parliament in order to be rewarded by voters ex post. Comfortable vote margins induce incentives to move to the losing faction to avoid this winner's curse. In equilibrium, parliamentary votes on preventive policies are thus likely to end with very narrow margins.
    Policy measures intended to prevent an impending crisis, such as bank bailouts or financial assistance to distressed states, often receive only a narrow majority in parliament. This paper presents a political-economy model of asymmetric information between politicians and voters that explains these narrow parliamentary votes. By assumption, politicians have an informational advantage over voters before the parliamentary vote (ex ante) regarding the necessity of the preventive policy measure. Even after the decision on the measure (ex post), voters only learn whether it was necessary if it was not implemented and the consequences of the missing crisis prevention become visible. If the preventive policy is in fact necessary to avert damage but voters do not believe this ex ante, an interesting constellation arises: if politicians follow the voters' ex-ante wishes and the policy is consequently not implemented, the economic damage occurs. This becomes evident ex post, and voters punish the politicians for their flawed policy at the next election. If, on the other hand, politicians decide to implement the preventive policy, the damage can be averted. However, voters then remain uncertain ex post about whether the measure was actually necessary, and thus retain their ex-ante view. In this case, too, politicians are punished for a policy perceived as flawed. This confronts an individual member of parliament with a situation described in the paper as a winner's curse: he only gains the voters' approval if the policy passes parliament but he voted against it, or if the policy fails to obtain a majority but he voted for it. With a clear majority expected, individual incentives therefore arise to defect to the minority. These incentives to defect increase the probability of a narrow voting outcome.

    Comparing community structure identification

    We compare recent approaches to community structure identification in terms of sensitivity and computational cost. The recently proposed modularity measure is revisited, and the performance of the methods, as applied to ad hoc networks with known community structure, is compared. We find that the most accurate methods tend to be more computationally expensive, and that both aspects need to be considered when choosing a method for practical purposes. The work is intended as an introduction as well as a proposal for a standard benchmark test of community detection methods.
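    The "ad hoc networks with known community structure" used in such comparisons are typically planted-partition graphs: 128 nodes in four groups of 32, average degree 16, with the number of external links per node controlling the difficulty. A minimal sketch of generating one, assuming networkx and a hypothetical value of z_out:

```python
import networkx as nx

# Benchmark parameters: 4 groups of 32 nodes, average degree ~16,
# z_out external links per node (hypothetical choice of difficulty).
n_groups, group_size, avg_degree, z_out = 4, 32, 16, 6
p_out = z_out / (group_size * (n_groups - 1))            # inter-group link probability
p_in = (avg_degree - z_out) / (group_size - 1)            # intra-group link probability

G = nx.planted_partition_graph(n_groups, group_size, p_in, p_out, seed=42)
truth = [node // group_size for node in G.nodes()]        # known community labels
print(G.number_of_nodes(), G.number_of_edges())
```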

    Urban traffic from the perspective of dual graph

    In this paper, urban traffic is modeled using a dual graph representation of the urban transportation network, in which roads are mapped to nodes and intersections are mapped to links. The proposed model considers both the navigation of vehicles on the network and the motion of vehicles along roads. Road capacity and the ability of vehicles to turn at intersections are naturally incorporated in the model. The overall capacity of the system can be quantified by a phase transition from free flow to congestion. Simulation results show that the system's capacity depends greatly on the topology of the transportation network. In general, a well-planned grid can hold more vehicles, and its overall capacity is much larger than that of a growing scale-free network.
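    A minimal sketch of the dual mapping described above, assuming networkx and a small hypothetical set of intersections; the traffic dynamics on top of the dual graph are not shown:

```python
import itertools
import networkx as nx

# Hypothetical primal data: intersection -> roads meeting there.
intersections = {
    "A": ["Main St", "1st Ave"],
    "B": ["Main St", "2nd Ave"],
    "C": ["1st Ave", "2nd Ave", "Ring Rd"],
}

# Dual graph: roads become nodes, and two roads are linked whenever
# they meet at an intersection.
dual = nx.Graph()
for crossing, roads in intersections.items():
    for r1, r2 in itertools.combinations(roads, 2):
        dual.add_edge(r1, r2, intersection=crossing)

print(dual.nodes())          # roads as nodes
print(dual.edges(data=True)) # intersections as links
```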

    Integrating fluctuations into distribution of resources in transportation networks

    We propose a resource distribution strategy to reduce the average travel time in a transportation network at a fixed generation rate. Suppose that there are essential resources needed to avoid congestion in the network, as well as some extra resources. The strategy distributes the essential resources according to the average loads on the vertices and integrates the fluctuations of the instantaneous loads into the distribution of the extra resources. The fluctuations are calculated under the assumption of unlimited resources, and this calculation is folded into the calculation of the average loads without increasing the time complexity. Simulation results show that the fluctuation-integrated strategy provides shorter average travel times than a previous distribution strategy while keeping similar robustness. The strategy is especially beneficial when the extra resources are scarce and the network is heterogeneous and lightly loaded.
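    A minimal sketch of the distribution idea, under the assumption that "fluctuations" are summarised by the standard deviation of the instantaneous loads; the load values and the size of the extra pool are hypothetical, not the paper's exact algorithm:

```python
import numpy as np

avg_load = np.array([10.0, 25.0, 5.0, 40.0])   # hypothetical mean load per vertex
fluct = np.array([2.0, 8.0, 1.0, 3.0])         # hypothetical load std. dev. per vertex
extra_pool = 20.0                               # extra resources to distribute

essential = avg_load                            # essential share follows the averages
extra = extra_pool * fluct / fluct.sum()        # extra share follows the fluctuations
capacity = essential + extra                    # total resources per vertex
print(capacity)
```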

    Polymerase chain reaction detection of avipox and avian papillomavirus in naturally infected wild birds: comparisons of blood, swab and tissue samples

    Avian poxvirus (avipox) is widely reported from avian species, causing cutaneous or mucosal lesions. Mortality rates of up to 100% are recorded in some hosts. Three major avipox clades are recognized. Several diagnostic techniques have been reported, with molecular techniques used only recently. Avipox has been reported from 278 different avian species, but only 111 of these reports involved sequence and/or strain identification. Collecting samples from wild birds is challenging, as only a few wild bird individuals or species may be symptomatic. In addition, sampling regimes are tightly regulated, and the most efficient sampling method, whole bird collection, is ethically challenging. In this study, three alternative sampling techniques (blood, cutaneous swabs and tissue biopsies) from symptomatic wild birds were examined. Polymerase chain reaction was used to detect avipoxvirus and avian papillomavirus (which also induces cutaneous lesions in birds). Four out of 14 tissue samples were positive for papillomavirus, but all 29 blood samples and all 22 swab samples were negative. For avipox, all 29 blood samples were negative, but 6/22 swabs and 9/14 tissue samples were positive. The difference between the numbers of positives generated from tissue samples and from swabs was not significant. The difference in avipox-positive specimens between paired swab (4/6) and tissue (6/6) samples was also not significant. These results therefore do not show the superiority of swab or tissue samples over each other. However, both swab (6/22) and tissue (8/9) samples yielded significantly more avipox-positive cases than blood samples, which are therefore not recommended for sampling these viruses.
    The authors thank bird ringers from Alula and Monticola, especially Alfredo Ortega and Chechu Aguirre, for help with the capture and ringing of birds, which made this project possible. Thanks to Álvaro Ramírez for samples. This research was funded by the Ministerio de Ciencia e Innovación, Spain (grant number: CGL2010-15734/BOS). R.A.J.W. was supported by the Programa Internacional de Captación de Talento (PICATA) de Moncloa Campus de Excelencia Internacional while writing the manuscript.
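    The abstract does not name the significance test used; one common choice for such small 2x2 tables is Fisher's exact test, sketched here (assuming SciPy) for the swab (6/22) versus blood (0/29) comparison reported above:

```python
from scipy.stats import fisher_exact

#                 positive  negative
swab = [6, 22 - 6]    # avipox-positive vs. negative swab samples
blood = [0, 29 - 0]   # avipox-positive vs. negative blood samples

odds_ratio, p_value = fisher_exact([swab, blood])
print(f"Fisher's exact test p-value: {p_value:.4f}")
```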

    Who is the best player ever? A complex network analysis of the history of professional tennis

    We consider all matches played by professional tennis players between 1968 and 2010 and, on the basis of this data set, construct a directed and weighted network of contacts. The resulting graph shows complex features typical of many real networked systems studied in the literature. We develop a diffusion algorithm and apply it to the tennis contact network in order to rank professional players. Jimmy Connors is identified as the best player in the history of tennis according to our ranking procedure. We perform a complete analysis by determining the best players on specific playing surfaces as well as the best ones in each of the years covered by the data set. The results of our technique are compared to those of two other well-established methods. In general, we observe that our ranking method performs better: it has higher predictive power and does not require the arbitrary introduction of external criteria for the correct assessment of the quality of players. The present work provides novel evidence of the utility of the tools and methods of network theory in real applications.
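    A minimal sketch in the spirit of the ranking described above, not the paper's exact algorithm: each match adds a weighted edge from the loser to the winner, and a PageRank-like diffusion yields a prestige score per player. The matches below are hypothetical and networkx is assumed:

```python
import networkx as nx

matches = [                      # (winner, loser) pairs, hypothetical
    ("Connors", "Borg"), ("Connors", "McEnroe"),
    ("Borg", "McEnroe"), ("McEnroe", "Connors"),
    ("Connors", "Borg"),
]

# Directed, weighted contact network: loser -> winner, weight = number of wins.
G = nx.DiGraph()
for winner, loser in matches:
    if G.has_edge(loser, winner):
        G[loser][winner]["weight"] += 1
    else:
        G.add_edge(loser, winner, weight=1)

# Diffusion-style prestige score (PageRank on the weighted match network).
prestige = nx.pagerank(G, alpha=0.85, weight="weight")
for player, score in sorted(prestige.items(), key=lambda kv: -kv[1]):
    print(f"{player}: {score:.3f}")
```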