    Informed microarchitecture design space exploration using workload dynamics

    Program runtime characteristics exhibit significant variation. As microprocessor architectures become more complex, their efficiency depends on their capability to adapt to workload dynamics. Moreover, with the approaching billion-transistor microprocessor era, it is not always economical or feasible to design processors with thermal cooling and reliability redundancy capabilities that target an application’s worst-case scenario. Therefore, analyzing complex workload dynamics early, at the microarchitecture design stage, is crucial to forecast workload runtime behavior across architecture design alternatives and to evaluate the efficiency of workload scenario-based architecture optimizations. Existing methods focus exclusively on predicting aggregated workload behavior. In this paper, we propose accurate and efficient techniques and models to reason about workload dynamics across the microarchitecture design space without using detailed cycle-level simulations. Our proposed techniques employ wavelet-based multiresolution decomposition and neural-network-based non-linear regression modeling. We extensively evaluate the efficiency of our predictive models in forecasting the performance, power and reliability workload dynamics that the SPEC CPU 2000 benchmarks manifest on high-performance microprocessors, using a microarchitecture design space of 9 key parameters. Our results show that the models achieve high accuracy in revealing workload dynamic behavior across a large microarchitecture design space. We also demonstrate that the proposed techniques can be used to efficiently explore workload scenario-driven architecture optimizations.
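
    As a rough illustration of the kind of pipeline this abstract describes (wavelet multiresolution decomposition of a time-varying workload signal followed by non-linear regression over design parameters), the sketch below uses PyWavelets and scikit-learn on synthetic data. The trace, the 9 normalized parameters and all model settings are placeholder assumptions, not the authors' actual setup.

        # Hedged sketch: wavelet features + neural-network regression for workload dynamics.
        # Synthetic data only; the real study uses SPEC CPU 2000 traces and 9 design parameters.
        import numpy as np
        import pywt
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        n_configs, trace_len = 64, 128          # hypothetical design points and trace length
        X = rng.uniform(0, 1, (n_configs, 9))   # 9 normalized microarchitecture parameters
        traces = (np.sin(np.linspace(0, 8 * np.pi, trace_len)) * X[:, [0]]
                  + 0.1 * rng.standard_normal((n_configs, trace_len)))

        # 1. Multiresolution decomposition: one flat coefficient vector per sampled configuration.
        Y = np.asarray([np.concatenate(pywt.wavedec(t, "db4", level=3)) for t in traces])

        # 2. Non-linear regression from design parameters to wavelet coefficients.
        model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(X, Y)

        # 3. Predict the dynamics of an unseen configuration and reconstruct its trace.
        x_new = rng.uniform(0, 1, (1, 9))
        pred = model.predict(x_new)[0]
        # Split the flat coefficient vector back into the wavedec structure before waverec.
        template = pywt.wavedec(traces[0], "db4", level=3)
        splits = np.cumsum([len(c) for c in template])[:-1]
        rebuilt = pywt.waverec(np.split(pred, splits), "db4")

    Predicting the coefficient bands and reconstructing the trace mirrors the idea of forecasting workload dynamics rather than a single aggregate statistic.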

    Opinion formation about childhood immunization and disease spread on networks

    People are physically and socially connected with each other. These connections form two, probably overlapping, networks: a biological network, through which physical contact occurs, and a social network, through which information diffuses. In my thesis research, I study how pediatric disease spread on the biological network between and within households relates to information sharing on the social network of households (parents, in this case) via information cascades. I focus mainly on the Erdos-Renyi network model; in particular, I use two different but overlapping Erdos-Renyi networks for the biological and social networks in the model. I use agent-based stochastic simulations implemented in MATLAB to study the modeling results.
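
    A minimal sketch of this two-network setup is given below, using Python and networkx rather than the MATLAB implementation the thesis reports; the infection and opinion-update probabilities are invented for illustration.

        # Hedged sketch: two overlapping Erdos-Renyi networks over the same households,
        # one for physical contact (disease) and one for information sharing (opinions).
        # All parameters are illustrative placeholders.
        import random
        import networkx as nx

        random.seed(1)
        n_households = 500
        bio_net = nx.gnp_random_graph(n_households, 0.01, seed=1)   # physical-contact network
        soc_net = nx.gnp_random_graph(n_households, 0.02, seed=2)   # information network

        vaccinated = {i: random.random() < 0.7 for i in range(n_households)}
        infected = {random.randrange(n_households)}

        for step in range(50):
            # Disease spreads along biological edges to unvaccinated neighbours.
            new_cases = {v for u in infected for v in bio_net.neighbors(u)
                         if not vaccinated[v] and random.random() < 0.05}
            infected |= new_cases
            # News of local infections cascades along social edges and can flip a
            # household's vaccination decision (a crude information cascade).
            for u in list(infected):
                for v in soc_net.neighbors(u):
                    if random.random() < 0.02:
                        vaccinated[v] = True

        print(len(infected), "households ever infected")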

    Stochastic-optimization of equipment productivity in multi-seam formations

    Short- and long-range planning and execution for multi-seam coal formations (MSFs) are challenging because of their complex extraction mechanisms. Stripping equipment selection and scheduling are functions of the physical dynamics of the mine and the operational mechanisms of its components; equipment productivity is therefore dependent on these parameters. Previous research studies did not incorporate quantitative relationships between equipment productivities and extraction dynamics in MSFs. The intrinsic variability of excavation and spoiling dynamics must also form part of existing models. This research formulates quantitative relationships of equipment productivities using Branch-and-Bound algorithms and Lagrange parameterization approaches. The stochastic processes are resolved via Monte Carlo/Latin hypercube simulation techniques within the @RISK framework. The model was demonstrated on a bituminous coal mining case in the Appalachian coalfield. The simulated results showed a 3.51% improvement in mining cost and a 0.19% increment in net present value. A 76.95 yd³ drop in productivity per unit change in cycle time was recorded for sub-optimal equipment schedules. The geologic variability and equipment operational parameters restricted any possible change in the cost function. A 50.3% chance of the mining cost increasing above its current value was driven by the volume of material re-handled, with a regression coefficient of 0.52. The study advances the optimization process in mine planning and scheduling algorithms to efficiently capture future uncertainties surrounding multivariate random functions. The main novelty is the application of stochastic-optimization procedures to improve equipment productivity in MSFs --Abstract, page iii
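
    As a hedged stand-in for the @RISK Monte Carlo/Latin hypercube workflow mentioned above, the sketch below propagates assumed cycle-time, fill-factor and swell-factor distributions through a simple productivity and cost calculation in Python; the bounds, bucket size and cost figures are illustrative, not the study's values.

        # Hedged sketch: Latin hypercube Monte Carlo for stripping-equipment productivity.
        # Distribution bounds, the productivity formula and the cost figures are assumptions.
        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=3, seed=42)
        u = sampler.random(n=10_000)
        # Columns: cycle time (min), bucket fill factor (-), swell factor (-).
        lo, hi = [0.45, 0.80, 1.20], [0.75, 0.95, 1.45]
        cycle, fill, swell = qmc.scale(u, lo, hi).T

        bucket_yd3 = 80.0                                    # assumed dragline bucket size
        prod_yd3_per_hr = bucket_yd3 * fill / swell * (60.0 / cycle)
        annual_cost, annual_hours = 10e6, 5000               # assumed owning + operating cost
        cost_per_yd3 = annual_cost / (prod_yd3_per_hr * annual_hours)

        print("mean productivity (yd3/hr):", round(prod_yd3_per_hr.mean(), 1))
        print("P(cost rises above $0.45/yd3):", (cost_per_yd3 > 0.45).mean())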

    Android application for calculating the relative hazard from pyroclastic flows

    ARHAT (Android Relative Hazard Assessment Tool) is an application for mobile devices that, through an internet connection, lets users query the results of computational modelling of pyroclastic flows carried out with the TITAN 2D software after those results have been analysed. The app presents the user with the percentage increase or decrease in relative hazard at one location with respect to another, using the ANDROID operating system to deploy the project on mobile phones. ARHAT was developed for two volcanoes that already had scenario-modelling studies performed with TITAN 2D. The first case is the San Cristóbal volcano, located in south-western Nicaragua, with a total of 700 pyroclastic-flow simulations around the crater. The second case study is the Galeras volcano, located in south-western Colombia, with 5,600 pyroclastic-flow simulations. Unlike the crater of San Cristóbal, whose diameter is relatively small, the Galeras crater has 8 flow-initiation sources placed around it in order to cover the flanks of the volcano. Using probability theory, two main approaches were applied: the frequentist methodology and Bayesian analysis. Finally, following the software development model known as Extreme Programming, a software tool was designed to access the resulting probabilistic information. This required implementing a client-server architecture in which the Android application is the client and a Servlet handles the management and querying of the information on the server.
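
    The two probabilistic treatments mentioned (frequentist and Bayesian) can be illustrated with a small sketch that compares the chance of a simulated flow reaching two locations; the counts below are invented and are not the San Cristóbal or Galeras results.

        # Hedged sketch of frequentist vs Bayesian relative hazard from simulated flow counts.
        # Counts are placeholders, not TITAN 2D outputs for any real volcano.
        from scipy import stats

        n_runs = 700                      # simulated flows (e.g. TITAN 2D scenarios)
        hits_a, hits_b = 84, 126          # runs reaching location A and location B (assumed)

        # Frequentist: relative hazard as a ratio of observed frequencies.
        p_a, p_b = hits_a / n_runs, hits_b / n_runs
        print(f"Location B hazard is {(p_b / p_a - 1) * 100:.1f}% higher than A (frequentist)")

        # Bayesian: Beta(1, 1) prior updated by the simulation counts.
        post_a = stats.beta(1 + hits_a, 1 + n_runs - hits_a)
        post_b = stats.beta(1 + hits_b, 1 + n_runs - hits_b)
        ratio = post_b.rvs(100_000, random_state=0) / post_a.rvs(100_000, random_state=1)
        print("posterior mean relative hazard B/A:", ratio.mean())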

    Generalized belief change with imprecise probabilities and graphical models

    We provide a theoretical investigation of probabilistic belief revision in complex frameworks, under extended conditions of uncertainty, inconsistency and imprecision. We motivate our kinematical approach by specializing our discussion to probabilistic reasoning with graphical models, whose modular representation allows for efficient inference. Most results in this direction derive from the work of Chan and Darwiche (2005), which first proved the inter-reducibility of virtual and probabilistic evidence. These forms of information, deeply distinct in their meaning, are extended to the conditional and imprecise frameworks, allowing further generalizations, e.g. to experts' qualitative assessments. Belief aggregation and iterated revision of a rational agent's beliefs are also explored.
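
    A tiny numerical illustration of the two evidence types discussed (probabilistic evidence fixing a new marginal versus virtual evidence supplying likelihood ratios, in the spirit of Chan and Darwiche's reduction) is sketched below; the three-state variable and its numbers are made up.

        # Hedged illustration: probabilistic ("Jeffrey") evidence fixes a new marginal,
        # while virtual ("Pearl") evidence supplies likelihood ratios; one can be
        # converted into the other. The numbers are invented.
        import numpy as np

        prior = np.array([0.5, 0.3, 0.2])          # P(X) before the report arrives

        # Probabilistic evidence: a source asserts the new marginal directly.
        jeffrey_target = np.array([0.7, 0.2, 0.1])

        # Equivalent virtual evidence: likelihood ratios proportional to q_i / p_i,
        # applied by pointwise product and renormalisation.
        lam = jeffrey_target / prior
        virtual_posterior = prior * lam
        virtual_posterior /= virtual_posterior.sum()

        print(np.allclose(virtual_posterior, jeffrey_target))   # True: the two revisions agree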

    Evaluation of land management impacts on low flows in northern England

    Low flows are becoming an increasing issue in the UK. The effect of a growing population on water supply demand is bringing the risk of extreme low flows to the attention of water and environmental managers across the country. Summer droughts in the Lake District in 2010, which followed winter flooding, raised the question of whether land management can be applied to reduce low-flow risk in the area. This is the issue considered in this project. This master's thesis, funded by the EU Adaptive Land-use for Flood Alleviation (ALFA) project, set out to discover whether land management, through vegetation change or changes in farming practices, could help reduce the risk of extreme low flows in Cumbria, England. The hydrological model CRUM3 was applied to simulate the river discharge of the Dacre Beck under different land management change scenarios. Sensitivity analysis and a rigorous Generalised Likelihood Uncertainty Estimation (GLUE) experiment demonstrated the model's ability to predict low-flow discharges as well as flood peaks. Results of the vegetation change scenarios showed that a cover of natural grassland provided the best water supply to the river during low flows. Each additional 1% of the catchment area converted to natural grassland resulted in a 1% increase in stream discharge during extreme low-flow periods. The location of the land assigned to vegetation change was shown to be insignificant. Scenarios of improved agricultural practice were modelled to simulate the reduction of compaction in the catchment by soil aeration. These revealed larger increases in river discharge during extreme low flows than the vegetation changes. Though the compaction scenarios were theoretical, feasible increases in low-flow discharge could reach 100%. Since flooding has also been a proven issue in this region, the scenarios were also assessed for their impacts on high flows. The vegetation type most beneficial at reducing high flows was deciduous woodland, though this was seen to have a negative effect on low flows. Natural grassland had a negligible effect on catchment high flows. Compaction reduction, however, was found to be a potential simultaneous management solution to both high and low flows: whilst potentially increasing low flows by up to 100%, it could also decrease high flows by up to 8%. Further research would be required to make accurate estimates of the potential improvements to high and low flows, but this project has demonstrated that reducing compaction is clearly beneficial to the catchment hydrology
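
    A minimal sketch of a GLUE-style analysis, in the spirit of the one applied to CRUM3, is given below; the toy recession model, the invented low-flow record and the acceptance threshold are placeholders for the real rainfall-runoff model and likelihood measure.

        # Hedged sketch of GLUE: sample parameter sets, score each simulation with a
        # likelihood measure (Nash-Sutcliffe efficiency here), and keep only the
        # "behavioural" sets above a threshold. The toy model stands in for CRUM3.
        import numpy as np

        rng = np.random.default_rng(7)
        obs = np.array([1.2, 0.9, 0.7, 0.6, 0.55, 0.5, 0.48])   # invented low-flow record (m3/s)

        def toy_model(k, s0, t=np.arange(7)):
            """Placeholder linear-store recession, not CRUM3 itself."""
            return s0 * np.exp(-k * t)

        def nse(sim, obs):
            return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        params = np.column_stack([rng.uniform(0.01, 0.5, 5000),   # recession constant k
                                  rng.uniform(0.5, 2.0, 5000)])   # initial storage s0
        scores = np.array([nse(toy_model(k, s0), obs) for k, s0 in params])
        behavioural = params[scores > 0.7]                        # GLUE acceptance threshold
        print(len(behavioural), "behavioural parameter sets retained")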

    Estimating Unsaturated Flow Properties in Coarse Conglomeratic Sediment

    In this dissertation, I address the lack of knowledge of unsaturated flow in coarse, conglomeratic sediment by determining whether functional θ-ψ-K relationships, specifically van Genuchten-Mualem (VGM) relationships, developed to predict unsaturated flow in relatively fine-grained sediment, can be directly applied to coarse, conglomeratic sediment. In the summer of 2011, a field-scale infiltration test was conducted at the Boise Hydrogeophysical Research Site to determine if functional θ-ψ-K relationships could be applied to infiltration in coarse, conglomeratic sediment, and to estimate parameter values for the VGM relationships. Vertically and laterally distributed ψ(t) and θ(t) measurements were made within the infiltration volume during the test, and geophysical data and core samples were used to determine material structure and distribution for model development. A four-material, 1D layered model was first used with a Metropolis-Hastings search to fit partial ψ(t) and θ(t) data and determine if VGM relationships are appropriate for unsaturated flow in coarse, conglomeratic sediment. The 1D model accurately fit a subset of the observed data, implying that VGM relationships were applicable, and predicted low uncertainty in the θ(ψ) and K(ψ) curves for three of the four materials, although high uncertainty was observed in individual parameter values (σ/μ > 50%). A four-material, 2D model was then constructed to incorporate variations in material thickness (lateral heterogeneity) and to fit all ψ(t) and θ(t) data. A direct-search optimization with this model showed that fitting θ(t) and ψ(t) data simultaneously was not possible due to additional lateral heterogeneity within one of the material layers, so a five-material, 2D model was constructed. Direct-search optimization using this model successfully fit the full θ(t) and ψ(t) data sets, and Latin hypercube sampling was used to estimate final parameter uncertainty. These results showed further reduction of uncertainty in parameter values compared to the 1D model (σ/μ < 15% for all parameters, with up to a 36% reduction in σ/μ for some individual parameters). Results from both the 1D and 2D models show that unsaturated flow relationships developed for agricultural soils (e.g., the VGM models) may be used to predict flow and moisture distribution in coarse, conglomeratic sediment. This implies limited obstruction by cobbles at low saturation and a very high capacity for infiltration in these types of materials under natural conditions. A method was also developed and presented in this dissertation which uses reflection travel time from time-lapse ground-penetrating radar (GPR) profiles to estimate changes in θ in the vadose zone. The method was applied to the infiltration test data but failed to accurately reproduce the observed GPR travel-time data, a failure attributed primarily to uncertainty in picking GPR reflection times
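
    The functional θ-ψ-K relationships referred to here are the standard van Genuchten-Mualem forms; a minimal sketch is given below with generic illustration parameters, not the values estimated at the Boise Hydrogeophysical Research Site.

        # Hedged sketch of the van Genuchten-Mualem relationships the dissertation fits.
        # Parameter values are generic illustration values, not the BHRS estimates.
        import numpy as np

        theta_r, theta_s = 0.05, 0.35    # residual and saturated water content (-)
        alpha, n, K_s = 1.5, 2.0, 1e-4   # alpha (1/m), shape n (-), saturated K (m/s)
        m = 1.0 - 1.0 / n

        def theta(psi):
            """van Genuchten water retention theta(psi); psi is suction head in metres (>= 0)."""
            Se = (1.0 + (alpha * psi) ** n) ** (-m)
            return theta_r + (theta_s - theta_r) * Se

        def K(psi):
            """Mualem-van Genuchten unsaturated hydraulic conductivity K(psi)."""
            Se = (1.0 + (alpha * psi) ** n) ** (-m)
            return K_s * np.sqrt(Se) * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2

        psi = np.linspace(0.0, 5.0, 6)
        print(np.round(theta(psi), 3))   # water content falls with increasing suction
        print(K(psi))                    # conductivity falls much faster than theta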

    The scale-up of PrEP for HIV prevention in high-risk women in sub-Saharan Africa: use of mathematical modelling to inform policy making

    Background: Women in sub-Saharan Africa carry a disproportionate burden and risk of HIV. In this context, women at high HIV risk are not a discrete group, but rather lie on a spectrum of risk shaped by a multitude of behavioural, economic, structural, cultural and geographic factors. Pre-exposure prophylaxis (PrEP) is a promising new HIV prevention method, effective at reducing HIV risk when adhered to. However, the results of PrEP trials and implementation studies to date reveal challenges in women's programme retention and drug adherence. There are also concerns that behavioural disinhibition (reductions in condom use) following the introduction of PrEP may further limit its ability to avert infections. In the context of HIV resource limitations and decreasing donor budgets, this thesis uses mathematical modelling to assess strategies for PrEP scale-up for women across a spectrum of risk in sub-Saharan Africa, accounting for heterogeneities in HIV risk factors and PrEP programme outcomes. Considering the challenges faced by policy makers in using mathematical models, often considered complex 'black boxes', to guide decision making, this thesis also assesses the contexts in which simple models are sufficient to guide policy making around the introduction of a new HIV prevention intervention. Methods: This thesis adopts mathematical modelling approaches to inform HIV policy making. First, a simple static model of HIV risk to female sex workers is developed and used to assess the impact of behavioural disinhibition on PrEP's ability to avert HIV infections. The static model formulation is then extended with dynamic effects to account for the downstream effects of population interactions. The models account for heterogeneities in women's HIV risk factors and PrEP programme outcomes, and for the low levels of PrEP programme retention and adherence reported in studies. The outcomes of the static and dynamic formulations are compared over different time horizons and epidemic contexts, to contribute to understanding of how much modelling complexity is needed to inform HIV policy. Finally, the static model is refined to represent women across a more broadly defined spectrum of risk: women aged 15-24 years, 25-34 years and 35-49 years, and female sex workers. The models are parameterised to case-study countries spanning a range of high HIV-burden contexts in sub-Saharan Africa (South Africa, Zimbabwe and Kenya) and used to assess strategies for PrEP scale-up in each country, considering cost-effectiveness and population-level impact. Conclusions: PrEP is likely to be of benefit in reducing HIV risk in women across a spectrum of HIV risk in sub-Saharan Africa, even if reductions in condom use occur. PrEP will be most cost-effective for individuals at greatest HIV risk, such as female sex workers. However, PrEP has the potential to significantly reduce the number of new infections at population level if made widely available beyond those at highest individual risk, including to women in the general population. Strategies for PrEP scale-up will need to weigh the potential cost-effectiveness and population-level impact of PrEP against the potential for integrating PrEP into a wide range of national services and at community level, in order to significantly bring down costs and improve cost-effectiveness in resource-constrained environments.
    Static models can be sufficiently robust to inform policy making around the introduction of new HIV prevention interventions in high HIV-burden settings over short-to-medium time horizons of up to 5 years, where the underlying HIV epidemic has reached equilibrium. Over longer timeframes, and in contexts where the underlying HIV epidemic is still evolving (except over very short horizons of less than a year), static models may under-emphasize situations of programmatic importance, and dynamic models will be more appropriate to guide decision making.
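
    As a hedged illustration of the kind of static (non-dynamic) calculation such a thesis typically starts from, the sketch below estimates annual infections averted by PrEP in a single risk group, with an optional reduction in condom use; every number is a placeholder rather than a fitted value.

        # Hedged sketch: static model of annual HIV infections averted by PrEP in one
        # risk group, with an assumed condom-use reduction (behavioural disinhibition).
        # All parameters are illustrative assumptions, not estimates from the thesis.
        group_size = 10_000          # women in the risk group
        acts_per_year = 200          # sexual acts per woman-year (assumed)
        beta = 0.002                 # per-act transmission probability (assumed)
        prevalence = 0.25            # HIV prevalence among partners (assumed)
        condom_use, condom_eff = 0.6, 0.8
        prep_cov, prep_adherence, prep_eff = 0.5, 0.7, 0.9

        def annual_risk(condom_use, on_prep):
            """Probability a woman acquires HIV over one year under the static assumptions."""
            per_act = beta * prevalence * (1 - condom_eff * condom_use)
            if on_prep:
                per_act *= 1 - prep_eff * prep_adherence
            return 1 - (1 - per_act) ** acts_per_year

        base = group_size * annual_risk(condom_use, on_prep=False)
        # With PrEP, covered women are assumed to reduce condom use by 20% (disinhibition).
        with_prep = group_size * (
            prep_cov * annual_risk(condom_use * 0.8, on_prep=True)
            + (1 - prep_cov) * annual_risk(condom_use, on_prep=False))
        print("infections averted per year:", round(base - with_prep))

    Being static, this calculation ignores onward transmission from infections averted, which is exactly the downstream effect the dynamic model formulation adds.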