    Efficient Generation of Geographically Accurate Transit Maps

    We present LOOM (Line-Ordering Optimized Maps), a fully automatic generator of geographically accurate transit maps. The input to LOOM is data about the lines of a given transit network, namely for each line, the sequence of stations it serves and the geographical course the vehicles of this line take. We parse this data from GTFS, the prevailing standard for public transit data. LOOM proceeds in three stages: (1) construct a so-called line graph, where edges correspond to segments of the network with the same set of lines following the same course; (2) construct an ILP that yields a line ordering for each edge which minimizes the total number of line crossings and line separations; (3) based on the line graph and the ILP solution, draw the map. As a naive ILP formulation is too demanding, we derive a new custom-tailored formulation which requires significantly fewer constraints. Furthermore, we present engineering techniques which use structural properties of the line graph to further reduce the ILP size. For the subway network of New York, we can reduce the number of constraints from 229,000 in the naive ILP formulation to about 4,500 with our techniques, enabling solution times of less than a second. Since our maps respect the geography of the transit network, they can be used for tiles and overlays in typical map services. Previous research work either did not take the geographical course of the lines into account, or was concerned with schematic maps without optimizing line crossings or line separations.
    Comment: 7 pages.
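    As a concrete illustration of what stage (2) optimizes, the sketch below brute-forces the line ordering on two consecutive edges of a toy line graph and counts pairwise crossings; the network, line names, and brute-force approach are illustrative stand-ins, not LOOM's actual ILP formulation.

    from itertools import combinations, permutations

    # Toy line graph: two consecutive edges carrying the same three lines
    # (hypothetical data, not a real transit network).
    lines = ["A", "B", "C"]

    def crossings(order1, order2):
        """Two lines cross between consecutive edges if their relative
        order flips from the first edge to the second."""
        flips = 0
        for a, b in combinations(lines, 2):
            before = order1.index(a) < order1.index(b)
            after = order2.index(a) < order2.index(b)
            flips += before != after
        return flips

    # Exhaustive search over orderings of both edges; an ILP encodes the same
    # objective with pairwise ordering variables and scales to real networks.
    best = min(
        ((o1, o2) for o1 in permutations(lines) for o2 in permutations(lines)),
        key=lambda pair: crossings(*pair),
    )
    print(best, crossings(*best))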

    THRET: Threshold Regression with Endogenous Threshold Variables

    This paper extends the simple threshold regression framework of Hansen (2000) and Caner and Hansen (2004) to allow for endogeneity of the threshold variable. We develop a concentrated two-stage least squares (C2SLS) estimator of the threshold parameter that is based on an inverse Mills ratio bias correction. Our method also allows for the endogeneity of the slope variables. We show that our estimator is consistent and investigate its performance using a Monte Carlo simulation that indicates the applicability of the method in finite samples. We also illustrate its usefulness with an empirical example from economic growth.
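    For context, the baseline model that this work extends is the two-regime threshold regression of Hansen (2000); in illustrative notation,

    \[
    y_i = \beta_1' x_i \,\mathbf{1}\{q_i \le \gamma\} + \beta_2' x_i \,\mathbf{1}\{q_i > \gamma\} + e_i,
    \]

    where \(q_i\) is the threshold variable, \(\gamma\) is the unknown threshold, and \(\mathbf{1}\{\cdot\}\) is the indicator function. The extension described in the abstract drops the requirement that \(q_i\) be exogenous and corrects the resulting selection-type bias with an inverse Mills ratio term inside the concentrated two-stage least squares procedure.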

    Second-line antiretroviral therapy in resource-limited settings: the experience of Médecins Sans Frontières

    OBJECTIVES: To describe the use of second-line protease-inhibitor regimens in Médecins Sans Frontières HIV programmes, and to determine switch rates, clinical outcomes, and factors associated with survival. DESIGN/METHODS: We used patient data from 62 Médecins Sans Frontières programmes and included all adults (> 15 years) who were antiretroviral therapy-naive at the start of antiretroviral therapy and who switched to a protease inhibitor-containing regimen, with at least one nucleoside reverse transcriptase inhibitor change, after more than 6 months of non-nucleoside reverse transcriptase inhibitor first-line use. Cumulative switch rates and survival curves were estimated using Kaplan-Meier methods, and mortality predictors were investigated using Poisson regression. RESULTS: Of 48,338 adults followed on antiretroviral therapy, 370 switched to a second-line regimen after a median of 20 months (switch rate 4.8/1000 person-years). Median CD4 cell count at switch was 99 cells/microl (interquartile range 39-200; n = 244). A lopinavir/ritonavir-based regimen was given to 51% of patients and a nelfinavir-based regimen to 43%; 29% changed one nucleoside reverse transcriptase inhibitor and 71% changed two nucleoside reverse transcriptase inhibitors. Median follow-up on second-line antiretroviral therapy was 8 months, and the probability of remaining in care at 12 months was 0.86. Median CD4 gains were 90 cells/microl at 6 months and 135 cells/microl at 12 months. Death rates were higher in patients in World Health Organization stage 4 at antiretroviral therapy initiation and in those with a CD4 nadir count of less than 50 cells/microl. CONCLUSION: The rate of switch to second-line treatment in antiretroviral therapy-naive adults on non-nucleoside reverse transcriptase inhibitor-based first-line antiretroviral therapy was relatively low, with good early outcomes observed in protease inhibitor-based second-line regimens. Severe immunosuppression was associated with increased mortality on second-line treatment.
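    As a rough sketch of the survival machinery mentioned in the methods, the following product-limit (Kaplan-Meier) estimator shows how retention and survival curves of this kind are computed; the function and the toy numbers are invented for illustration and are not the MSF cohort data.

    import numpy as np

    def kaplan_meier(time, event):
        """Product-limit (Kaplan-Meier) survival estimate.
        time  : follow-up duration per patient (e.g. months on second-line ART)
        event : 1 if the endpoint occurred, 0 if censored.
        Returns a list of (t, S(t)) pairs."""
        time, event = np.asarray(time), np.asarray(event)
        surv, at_risk, curve = 1.0, len(time), []
        for t in np.unique(time):
            d = int(np.sum((time == t) & (event == 1)))
            surv *= (at_risk - d) / at_risk        # product-limit step
            curve.append((float(t), surv))
            at_risk -= int(np.sum(time == t))      # drop events and censored at t
        return curve

    # Toy usage with invented numbers:
    print(kaplan_meier(time=[2, 5, 5, 8, 12, 12], event=[1, 0, 1, 1, 0, 0]))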

    Design of trials for interrupting the transmission of endemic pathogens

    Many interventions against infectious diseases have geographically diffuse effects. This leads to contamination between arms in cluster-randomized trials (CRTs). Pathogen elimination is the goal of many intervention programs against infectious agents, but contamination means that standard CRT designs and analyses do not provide inferences about the potential of interventions to interrupt pathogen transmission at maximum scale-up. A generic model of disease transmission was used to simulate infections in stepped wedge cluster-randomized trials (SWCRTs) of a transmission-reducing intervention, where the intervention has spatially diffuse effects. Simulations of such trials were then used to examine the potential of such designs for providing generalizable causal inferences about the impact of such interventions, including measurements of the contamination effects. The simulations were applied to the geography of Rusinga Island, Lake Victoria, Kenya, the site of the SolarMal trial on the use of odor-baited mosquito traps to eliminate Plasmodium falciparum malaria, and were used to compare variants of the proposed SWCRT designs for the SolarMal trial. Measures of contamination effects were found that could be assessed in the simulated trials. These measures, inspired by analyses of trials of insecticide-treated nets against malaria and applied to the geography of the SolarMal trial, were found to be robust to different variants of SWCRT design. Analyses of the likely extent of contamination effects supported the choice of cluster size for the trial. The SWCRT is an appropriate design for trials that assess the feasibility of local elimination of a pathogen. The effects of incomplete coverage can be estimated by analyzing the extent of contamination between arms in such trials, and the estimates also support inferences about causality. The SolarMal example illustrates how generic transmission models incorporating spatial smoothing can be used to simulate such trials for power calculation and optimization of cluster size and randomization strategies. The approach is applicable to a range of infectious diseases transmitted via environmental reservoirs or via arthropod vectors.
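    A simulation of the kind described above can be prototyped in a few lines. The sketch below places clusters on an invented grid, rolls the intervention out in a stepped-wedge order, and lets protection spill over between clusters through an exponential distance kernel; the geography, kernel, and parameters are arbitrary assumptions and are not taken from the SolarMal trial.

    import numpy as np

    rng = np.random.default_rng(0)
    n_clusters, n_steps = 16, 8

    # Hypothetical cluster centroids and stepped-wedge rollout order.
    xy = rng.uniform(0, 10, size=(n_clusters, 2))
    rollout = rng.permutation(n_clusters) % n_steps

    def coverage(step):
        """1 if a cluster has received the intervention by this step, else 0."""
        return (rollout < step).astype(float)

    def effective_coverage(cov, scale=2.0):
        """Spatially diffuse protection: each cluster also benefits from covered
        neighbours, weighted by exp(-distance / scale)."""
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        w = np.exp(-d / scale)
        return (w @ cov) / w.sum(axis=1)

    baseline_risk, efficacy = 0.30, 0.6
    for step in range(n_steps + 1):
        risk = baseline_risk * (1 - efficacy * effective_coverage(coverage(step)))
        uncovered = coverage(step) == 0
        if uncovered.any():
            # Falling risk in not-yet-covered clusters is the contamination effect.
            print(step, round(float(risk[uncovered].mean()), 3))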

    Ontology of common sense geographic phenomena: Foundations for interoperable multilingual geospatial databases

    Information may be defined as the conceptual or communicable part of the content of mental acts. The content of mental acts includes sensory data as well as concepts, and particular as well as general information. An information system is an external (non-mental) system designed to store such content. Information systems afford indirect transmission of content between people, some of whom put information into the system while others use it. In order for communication to happen, the conceptual systems of the originators and users of the information must be sufficiently similar. A formal conceptual framework that can provide the basis for exchange of information is termed an ontology. In its most fundamental form, ontology studies the most basic constituents of reality. Traditionally, ontology seeks to reflect structures that are independent of thought and cognition. The term ontology is used more broadly in artificial intelligence and software engineering, to refer to the conceptual basis for an information system.

    Describing and Understanding Neighborhood Characteristics through Online Social Media

    Geotagged data can be used to describe regions in the world and discover local themes. However, not all data produced within a region is necessarily specifically descriptive of that area. To surface the content that is characteristic for a region, we present the geographical hierarchy model (GHM), a probabilistic model based on the assumption that data observed in a region is a random mixture of content that pertains to different levels of a hierarchy. We apply the GHM to a dataset of 8 million Flickr photos in order to discriminate between content (i.e., tags) that specifically characterizes a region (e.g., neighborhood) and content that characterizes surrounding areas or more general themes. Knowledge of the discriminative and non-discriminative terms used throughout the hierarchy enables us to quantify the uniqueness of a given region and to compare similar but distant regions. Our evaluation demonstrates that our model improves upon traditional Naive Bayes classification by 47% and hierarchical TF-IDF by 27%. We further highlight the differences and commonalities with human reasoning about what is locally characteristic for a neighborhood, distilled from ten interviews and a survey that covered themes such as time, events, and prior regional knowledge.
    Comment: Accepted in WWW 2015, Florence, Italy.
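    The intuition behind separating region-specific from general content can be illustrated with a toy calculation (this is not the GHM itself): a tag is attributed to the hierarchy level where its relative frequency peaks. The tags and counts below are invented.

    from collections import Counter

    # Invented tag counts at three levels of a geographic hierarchy.
    neighborhood = Counter({"mission": 40, "mural": 25, "food": 30, "sf": 50})
    city         = Counter({"sf": 500, "goldengate": 120, "food": 200, "mural": 40})
    world        = Counter({"travel": 10_000, "food": 8_000, "sf": 600})

    def level_scores(tag, levels):
        """Normalized relative frequency of a tag at each hierarchy level; the
        level where it peaks is taken as the level the tag characterizes."""
        freqs = [lvl[tag] / sum(lvl.values()) for lvl in levels]
        total = sum(freqs) or 1.0
        return [round(f / total, 2) for f in freqs]

    for tag in ["mural", "food", "sf"]:
        print(tag, level_scores(tag, [neighborhood, city, world]))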

    Ontological Foundations for Geographic Information Science

    We propose as a UCGIS research priority the topic of “Ontological Foundations for Geographic Information.” Under this umbrella we unify several interrelated research subfields, each of which deals with different perspectives on geospatial ontologies and their roles in geographic information science. While each of these subfields could be addressed separately, we believe it is important to address ontological research in a unitary, systematic fashion, embracing conceptual issues concerning what would be required to establish an exhaustive ontology of the geospatial domain, issues relating to the choice of appropriate methods for formalizing ontologies, and considerations regarding the design of ontology-driven information systems. This integrated approach is necessary, because there is a strong dependency between the methods used to specify an ontology, and the conceptual richness, robustness and tractability of the ontology itself. Likewise, information system implementations are needed as testbeds of the usefulness of every aspect of an exhaustive ontology of the geospatial domain. None of the current UCGIS research priorities provides such an integrative perspective, and therefore the topic of “Ontological Foundations for Geographic Information Science” is unique.

    Critical mass and the dependency of research quality on group size

    Academic research groups are treated as complex systems and their cooperative behaviour is analysed from a mathematical and statistical viewpoint. Contrary to the naive expectation that the quality of a research group is simply given by the mean calibre of its individual scientists, we show that intra-group interactions play a dominant role. Our model manifests phenomena akin to phase transitions which are brought about by these interactions, and which facilitate the quantification of the notion of critical mass for research groups. We present these critical masses for many academic areas. A consequence of our analysis is that overall research performance of a given discipline is improved by supporting medium-sized groups over large ones, while small groups must strive to achieve critical mass.
    Comment: 16 pages, 6 figures consisting of 16 panels. Presentation and reference list improved for version
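    One way to see how pairwise interactions can produce a critical mass is the following back-of-the-envelope toy, which is illustrative only and not the paper's actual model. Suppose a group of \(N\) researchers has total strength \(S_N = aN + b\binom{N}{2}\) while every pair can interact, but each member can sustain at most about \(N_c\) collaborations; then per-capita quality behaves roughly as

    \[
    \frac{S_N}{N} \;\approx\;
    \begin{cases}
    a + \tfrac{b}{2}\,(N - 1), & N \le N_c,\\
    a + \tfrac{b}{2}\,(N_c - 1), & N > N_c,
    \end{cases}
    \]

    rising with group size up to the critical mass and flattening beyond it, which is the qualitative behaviour the abstract attributes to intra-group interactions.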