
    Dosing pole recommendations for lymphatic filariasis elimination: A height-weight quantile regression modeling approach

    BACKGROUND: The World Health Organization (WHO) currently recommends height- or age-based dosing as alternatives to weight-based dosing for mass drug administration (MDA) programs for lymphatic filariasis (LF) elimination. The goals of our study were to compare these alternative dosing strategies to weight-based dosing and to develop and evaluate new height-based dosing pole scenarios. METHODOLOGY/PRINCIPAL FINDINGS: Age, height, and weight data were collected from >26,000 individuals in five countries during a cluster randomized LF clinical trial. Weight-based dosing for diethylcarbamazine (DEC; 6 mg/kg) and ivermectin (IVM; 200 µg/kg), with tablet numbers derived from a table of weight intervals, was treated as the gold standard for this study. Following WHO-recommended age-based dosing of DEC and height-based dosing of IVM would have resulted in 32% and 27% of individuals receiving treatment doses below those recommended by weight-based dosing for DEC and IVM, respectively. Underdosing would have been especially common in adult males, who tend to have the highest LF prevalence in many endemic areas. We used a 3-step modeling approach to develop and evaluate new dosing pole cutoffs. First, we analyzed the clinical trial data using quantile regression to predict weight from height. We then used the weight predictions to develop new dosing pole cutoff values. Finally, we compared different dosing pole cutoffs and the age- and height-based WHO dosing recommendations to weight-based dosing. We considered hundreds of scenarios, including country- and sex-specific dosing poles. A simple dosing pole with a 6-tablet maximum for both DEC and IVM reduced the underdosing rate by 30% and 21%, respectively, and was nearly as effective as more complex pole combinations for reducing underdosing. CONCLUSIONS/SIGNIFICANCE: Using a novel modeling approach, we developed a simple dosing pole that would markedly reduce underdosing for DEC and IVM in MDA programs compared to the currently WHO-recommended height- or age-based dosing.
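
    The 3-step approach described in this abstract can be illustrated with a minimal sketch: fit a quantile regression of weight on height, convert predicted weights to tablet counts, and read off the height intervals that become dosing pole cutoffs. The column names, the protective quantile, the 100 mg tablet strength, and the 0.5 cm height grid below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the 3-step dosing pole idea, assuming a pandas DataFrame
# with columns "height_cm" and "weight_kg" (hypothetical names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf


def height_cutoffs(df, q=0.25, dose_mg_per_kg=6.0, tablet_mg=100.0, max_tablets=6):
    """Map height intervals to tablet counts via quantile regression of weight on height."""
    # Step 1: predict weight from height at a low quantile so that most
    # individuals of a given height receive at least the target mg/kg dose.
    fit = smf.quantreg("weight_kg ~ height_cm", df).fit(q=q)
    heights = np.arange(df["height_cm"].min(), df["height_cm"].max() + 0.5, 0.5)
    pred_weight = fit.predict(pd.DataFrame({"height_cm": heights}))

    # Step 2: convert predicted weight to a tablet count, capped at max_tablets.
    tablets = np.clip(np.ceil(pred_weight * dose_mg_per_kg / tablet_mg), 1, max_tablets)

    # Step 3: report the height range assigned to each tablet count; these
    # ranges are candidate dosing pole cutoffs to compare against
    # weight-based dosing.
    cutoffs = {}
    for n in range(1, max_tablets + 1):
        mask = np.asarray(tablets == n)
        if mask.any():
            cutoffs[n] = (float(heights[mask].min()), float(heights[mask].max()))
    return cutoffs
```

    Using a quantile below the median is one way to bias the pole against underdosing, since most individuals of a given height will then weigh more than the prediction used to set their tablet count.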

    A Methodology for Robust Multiproxy Paleoclimate Reconstructions and Modeling of Temperature Conditional Quantiles

    Great strides have been made in the field of reconstructing past temperatures from models relating temperature to temperature-sensitive paleoclimate proxies. One of the goals of such reconstructions is to assess whether the current climate is anomalous in a millennial context. These regression-based approaches model the conditional mean of the temperature distribution as a function of paleoclimate proxies (or vice versa). Some of the recent focus in the area has been on methods that help reduce the uncertainty inherent in such statistical paleoclimate reconstructions, with the ultimate goal of improving the confidence that can be attached to such endeavors. A second important scientific focus is forward models for proxies, the goal of which is to understand how paleoclimate proxies are driven by temperature and other environmental variables. In this paper we introduce novel statistical methodology for (1) quantile regression with autoregressive residual structure, (2) estimation of the corresponding model parameters, and (3) development of a rigorous framework for specifying uncertainty estimates of quantities of interest, yielding (4) statistical byproducts that address the two scientific foci discussed above. Our statistical methodology demonstrably produces a more robust reconstruction than is possible with conditional-mean-fitting methods. Our reconstruction shares some of the common features of past reconstructions, but also yields useful insights. More importantly, we are able to demonstrate significantly smaller uncertainty than that from previous regression methods. In addition, the quantile regression component allows us to model, in a more complete and flexible way than least squares, the conditional distribution of temperature given proxies. This relationship can be used to inform forward models of how proxies are driven by temperature.
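
    As a rough illustration of the combination this abstract describes, the sketch below fits quantile regressions of temperature on a single proxy and then fits an AR(1) model to the residuals of the median fit. The paper develops a joint estimation framework; this two-stage stand-in, with synthetic data and assumed variable names ("temp", "proxy"), only shows the two ingredients side by side.

```python
# Simplified two-stage stand-in: conditional quantiles of temperature given a
# proxy, plus an AR(1) fit to the median-regression residuals.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
n = 500
proxy = rng.normal(size=n)

# AR(1) noise with coefficient 0.6 mimics temporally correlated residuals.
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + rng.normal(scale=0.5)
temp = 0.8 * proxy + eps

df = pd.DataFrame({"temp": temp, "proxy": proxy})

# Conditional quantiles of temperature given the proxy (lower, median, upper).
fits = {q: smf.quantreg("temp ~ proxy", df).fit(q=q) for q in (0.1, 0.5, 0.9)}

# Residual autocorrelation from the median fit, captured by an AR(1) model.
resid = df["temp"] - fits[0.5].predict(df)
ar1 = AutoReg(resid, lags=1).fit()

print({q: round(f.params["proxy"], 3) for q, f in fits.items()})
print(ar1.params)
```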

    Spatial Variation and Interpolation of Wind Speed Statistics and Its Implication in Design Wind Load

    Consideration of wind load is important for the design of engineered structures. Codification of wind load for structural design requires the estimation of the quantiles, or return period values, of the annual maximum wind speed. The extreme wind speeds are estimated from the measured wind records at different meteorological stations and are affected by the length of the wind record (i.e., the sample size) and by other factors such as the surrounding terrain. This study focuses on 1) the spatial interpolation of wind speed statistics, 2) the potential of using regional frequency analysis to estimate the extreme wind speed, and 3) the reliability of structures designed at sites with and without sample size effects. For the spatial interpolation, both the code-recommended values of the wind speed and those based on at-site analysis are used, and commonly used spatial interpolation methods, including 8 deterministic methods and 6 geostatistical methods, have been applied. The preferred methods for each data set are determined based on (leave-one-out) cross-validation analyses. It is shown that the preferred method depends on the considered data set; ordinary kriging is preferred if a single method is to be selected for all considered data sets. The historical wind records at available meteorological stations are often short or unavailable, and the limited sample size causes uncertainty in the estimated quantiles. To deal with this data insufficiency, regional frequency analysis was applied to the annual maximum wind speed data from 235 Canadian meteorological stations to calculate the quantiles of the annual maximum wind speed for Canada. The obtained estimates of the quantiles of the extreme wind speed based on regional frequency analysis are compared with those obtained directly from the at-site analysis. The analysis uses k-means, hierarchical, and self-organizing map clustering to explore potential clusters or regions; statistical tests are then applied to identify homogeneous regions for subsequent regional frequency analysis. Results indicate that k-means is the preferred exploratory tool for the considered data and that the generalized extreme value distribution provides a better fit to the data than the Gumbel distribution for regional frequency analysis. However, the former is associated with low values of the upper bound that do not influence the estimation of 10- to 50-year return period values of annual maximum wind speed but do influence return period values greater than 500 years. Based on these observations, regional frequency analysis may not be needed as an alternative to the at-site analysis. Furthermore, since the estimated quantiles of the extreme wind speed at a site are uncertain due to the limited sample size, the effect of this statistical uncertainty on the estimated return period value of the wind speed and on structural reliability is investigated, and two strategies for specifying the factored design wind load, (i) a low return period for the nominal wind speed combined with a wind load factor greater than one, and (ii) a high return period for the nominal wind speed combined with a unity wind load factor, are evaluated to determine the optimal one.
Results indicate that at least 20 years of usable data are needed for a station to be included in the extreme value analysis. The first strategy is preferred to cope with the sample size effect for design at a particular site or in a region with a statistically homogeneous wind climate, while the second strategy is recommended for code making for a country with a spatially varying coefficient of variation of annual maximum wind speed, since it leads to better reliability consistency.
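
    The basic at-site extreme value step referred to in this abstract can be sketched as fitting a GEV distribution to a station's annual maximum wind speeds and reading off T-year return period values. The synthetic "station" record below is purely illustrative; a real analysis would use the measured series.

```python
# Sketch: at-site GEV fit and return period values for annual maximum wind speed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Pretend 40 years of annual maximum wind speed (m/s) at a single station.
annual_max = stats.gumbel_r.rvs(loc=25.0, scale=4.0, size=40, random_state=rng)

# scipy parameterizes the GEV with shape c = -xi; c = 0 recovers the Gumbel case.
shape, loc, scale = stats.genextreme.fit(annual_max)

for T in (10, 50, 500):
    p = 1.0 - 1.0 / T  # non-exceedance probability for a T-year return period
    v_T = stats.genextreme.ppf(p, shape, loc=loc, scale=scale)
    print(f"{T:4d}-year return period wind speed: {v_T:5.1f} m/s")
```

    Comparing the 500-year value from a bounded GEV fit with the Gumbel value shows why the fitted upper bound matters only for the longest return periods, as the abstract notes.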

    A Latent Parameter Node-Centric Model for Spatial Networks

    Get PDF
    Spatial networks, in which nodes and edges are embedded in space, play a vital role in the study of complex systems. For example, many social networks attach geo-location information to each user, allowing the study not only of topological interactions between users, but of spatial interactions as well. The defining property of spatial networks is that edge distances are associated with a cost, which may subtly influence the topology of the network. However, the cost function over distance is rarely known, so developing a model of connections in spatial networks is a difficult task. In this paper, we introduce a novel model for capturing the interaction between spatial effects and network structure. Our approach represents a unique combination of ideas from latent variable statistical models and spatial network modeling. In contrast to previous work, we view the ability to form long- or short-distance connections as dependent on the individual nodes involved. For example, a node's specific surroundings (e.g., network structure and node density) may make it more likely to form a long-distance link than other nodes with the same degree. To capture this information, we attach a latent variable to each node that represents the node's spatial reach. These variables are inferred from the network structure using a Markov chain Monte Carlo algorithm. We experimentally evaluate our proposed model on four types of real-world spatial networks (transportation, biological, infrastructure, and social). We apply our model to the task of link prediction and achieve up to a 35% improvement over previous approaches in terms of the area under the ROC curve. Additionally, we show that our model is particularly helpful for predicting links between nodes with low degrees. In these cases, we see much larger improvements over previous models.
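
    A toy version of the node-centric latent variable idea described here: give each node i a latent "reach" r_i, assume an edge (i, j) forms with probability sigmoid(r_i + r_j - d_ij) for spatial distance d_ij, and infer the reaches with a random-walk Metropolis sampler. The paper's likelihood, prior, and sampler differ; this sketch only illustrates the general construction.

```python
# Toy node-centric latent "reach" model with a random-walk Metropolis sampler.
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def log_lik(adj, dist, reach):
    """Bernoulli log-likelihood of the observed edges under the reach model."""
    p = sigmoid(reach[:, None] + reach[None, :] - dist)
    iu = np.triu_indices_from(adj, k=1)  # count each undirected pair once
    return np.sum(adj[iu] * np.log(p[iu] + 1e-12)
                  + (1 - adj[iu]) * np.log(1 - p[iu] + 1e-12))


def infer_reach(adj, dist, n_iter=5000, step=0.1, seed=0):
    """Random-walk Metropolis over the vector of node reaches (flat prior)."""
    rng = np.random.default_rng(seed)
    reach = np.zeros(adj.shape[0])
    ll = log_lik(adj, dist, reach)
    for _ in range(n_iter):
        proposal = reach + step * rng.normal(size=reach.size)
        ll_prop = log_lik(adj, dist, proposal)
        if np.log(rng.uniform()) < ll_prop - ll:  # Metropolis acceptance
            reach, ll = proposal, ll_prop
    return reach  # inferred reaches can then score candidate links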
```