
    ASCR/HEP Exascale Requirements Review Report

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected demand on the 2025 timescale is at least two orders of magnitude greater than what is currently available, and in some cases more. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze such large and complex datasets. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments; such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To make the best use of ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) the ability to map workflows onto HPC resources, c) the ability of ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.
    Comment: 77 pages, 13 figures; draft report, subject to further revision

    Home Value Protection: Final Report

    The following report provides an overview of a Home Value Protection (HVP) product to evaluate the practicality of making such a program more widely available and to provide background for anyone considering such a plan. The paper is based largely on the Home Value Protection product established in Syracuse, New York, in 2002; a number of the authors of this paper participated in the establishment of the Syracuse Home Value Protection program. The paper contains four sections. 1: Investor Outreach. This section provides background information about the Syracuse program, the current and potential participants and the roles they might play, a review of a few of the ways such a program could be implemented, and links to various media coverage. 2: Index Research. The Syracuse program measured changes in house values by a real estate index for the area (rather than the individual house sale price), and this section evaluates a number of different index methods using four markets' historical data to see how well the different indexes would have performed with an HVP product (had it been available). 3: Capital Requirements & Pricing. This section provides a model for estimating the pricing requirements and capital required for a program across multiple markets; while not exhaustive, this approach provides a useful reference and starting point for anyone evaluating investment in such a program. 4: Regulatory Environment. This section provides information on some of the regulatory entities across the markets used in the analysis. Because of the variations in the way an HVP product could be implemented, regulations could apply in a variety of ways, and this section can only offer a starting point for potential investors or participants.
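    As a rough sketch of the index-based mechanism described above, the Python example below computes a payout from the change in an area price index rather than from the individual sale price. The function name, the coverage parameter, and all numbers are illustrative assumptions, not the pricing model developed in the report.

```python
# Hypothetical index-linked payout calculation (illustrative only; not the
# report's pricing or capital model).

def hvp_payout(enroll_index: float, sale_index: float,
               protected_value: float, coverage: float = 1.0) -> float:
    """Payout owed at sale under an index-based Home Value Protection contract.

    enroll_index, sale_index: area house-price index at enrollment and at sale.
    protected_value: dollar value protected when the owner enrolled.
    coverage: fraction of the measured index decline that is reimbursed.
    """
    decline = max(0.0, (enroll_index - sale_index) / enroll_index)
    return coverage * decline * protected_value


# Example: the area index falls 15% (120 -> 102) on a $200,000 protected value.
print(hvp_payout(120.0, 102.0, 200_000.0))  # 30000.0
```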

    Can geocomputation save urban simulation? Throw some agents into the mixture, simmer and wait ...

    There are indications that the current generation of simulation models in practical, operational use has reached the limits of its usefulness under existing specifications. The relative stasis in operational urban modeling contrasts with simulation efforts in other disciplines, where techniques, theories, and ideas drawn from computation and complexity studies are revitalizing the ways in which we conceptualize, understand, and model real-world phenomena. Many of these concepts and methodologies are applicable to operational urban systems simulation. Indeed, in many cases, ideas from computation and complexity studies—often clustered under the collective term of geocomputation, as they apply to geography—are ideally suited to the simulation of urban dynamics. However, several obstructions stand in the way of their successful use in operational urban geographic simulation, particularly as regards the capacity of these methodologies to handle top-down dynamics in urban systems. This paper presents a framework for developing a hybrid model for urban geographic simulation and discusses some of the imposing barriers to innovation in this field. The framework infuses approaches derived from geocomputation and complexity with standard techniques that have been tried and tested in operational land-use and transport simulation. Macro-scale dynamics that operate from the top down are handled by traditional land-use and transport models, while micro-scale dynamics that work from the bottom up are delegated to agent-based models and cellular automata. The two methodologies are fused in a modular fashion using a system of feedback mechanisms. As a proof-of-concept exercise, a micro-model of residential location has been developed with a view to hybridization. The model mixes cellular automata and multi-agent approaches and is formulated so as to interface with meso-models at a higher scale.
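    To make the macro/micro coupling concrete, the sketch below pairs a stand-in top-down land-use model with a toy agent/cellular-automata layer, passing an allocation constraint down and an aggregate feedback figure back up at each step. The class names, the 1% allocation rule, and the random placement rule are assumptions made for illustration, not the authors' model.

```python
import random

# Illustrative hybrid coupling only: a stand-in macro model constrains a toy
# micro model, which feeds an aggregate back up (not the authors' implementation).

class MacroLandUseModel:
    """Stand-in for a traditional top-down land-use/transport model."""
    def __init__(self, regional_households: int):
        self.regional_households = regional_households

    def allocate(self) -> int:
        # Top-down step: decide how many new households the zone receives
        # this period (assumed here to be 1% of the regional total).
        return int(self.regional_households * 0.01)


class MicroResidentialModel:
    """Stand-in for the bottom-up agent-based/cellular-automata layer."""
    def __init__(self, grid_size: int):
        self.grid = [[0] * grid_size for _ in range(grid_size)]

    def place_agents(self, n_households: int) -> int:
        # Bottom-up step: each household agent occupies a random empty cell
        # (a real model would use neighbourhood and accessibility rules).
        empty = [(r, c) for r, row in enumerate(self.grid)
                 for c, v in enumerate(row) if v == 0]
        for r, c in random.sample(empty, min(n_households, len(empty))):
            self.grid[r][c] = 1
        return sum(map(sum, self.grid))  # occupied cells, fed back upward


macro = MacroLandUseModel(regional_households=50_000)
micro = MicroResidentialModel(grid_size=50)
for step in range(5):
    demand = macro.allocate()              # macro -> micro constraint
    occupied = micro.place_agents(demand)  # micro -> macro feedback
    macro.regional_households += occupied // 100  # crude upward feedback of growth
print(occupied)
```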

    Toward Reliable Contention-aware Data Dissemination in Multi-hop Cognitive Radio Ad Hoc Networks

    This paper introduces a new channel selection strategy, SURF, for reliable contention-aware data dissemination in multi-hop cognitive radio networks. The key challenge is to select channels providing a good tradeoff between connectivity and contention. In other words, the channels to select are those with good opportunities for communication due to (1) low primary radio node (PR) activity and (2) limited contention among the cognitive radio nodes (CRs) accessing that channel. By dynamically exploring the residual resources on channels and monitoring the number of CRs on a particular channel, SURF builds a connected network with limited contention where reliable communication can take place. Through simulations, we study the performance of SURF compared with three other related approaches. Simulation results confirm that our approach is effective in selecting the best channels for efficient and reliable multi-hop data dissemination.
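    The scoring heuristic below is a minimal sketch of the connectivity/contention tradeoff the abstract describes: it prefers channels with low primary-radio activity and with a CR population close to an assumed sweet spot. The weighting, the expected_neighbors parameter, and the example values are assumptions, not SURF's actual metric.

```python
from dataclasses import dataclass

# Illustrative channel scoring only; SURF's real metric in the paper may differ.

@dataclass
class Channel:
    channel_id: int
    pr_activity: float  # estimated probability that a primary radio (PR) is active, 0..1
    n_crs: int          # cognitive radio (CR) nodes observed contending on this channel


def channel_score(ch: Channel, expected_neighbors: int = 10) -> float:
    """Higher is better: favour idle channels whose CR count is near a sweet spot
    (too few CRs hurts connectivity, too many raises contention)."""
    availability = 1.0 - ch.pr_activity
    crowding = abs(ch.n_crs - expected_neighbors) / max(expected_neighbors, ch.n_crs)
    return availability * (1.0 - crowding)


channels = [Channel(1, pr_activity=0.8, n_crs=12),
            Channel(2, pr_activity=0.2, n_crs=3),
            Channel(3, pr_activity=0.3, n_crs=9)]
best = max(channels, key=channel_score)
print(best.channel_id)  # 3: mostly idle and close to the connectivity sweet spot
```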

    A comparative assessment of methodologies used to evaluate competition policy

    Research by academics and competition agencies on evaluating competition policy has grown rapidly during the last two decades. This paper surveys the literature in order to (i) assess the fitness for purpose of the main quantitative methodologies employed, and (ii) identify the main undeveloped areas and unanswered questions for future research. It suggests that policy evaluation is necessarily an imprecise science and that all existing methodologies have strengths and limitations. The areas where further work is most pressingly needed include: understanding why Article 102 cases are only infrequently evaluated; the need to bring conscious discussion of the counterfactual firmly into the foreground; and a wider definition of policy that includes success in deterrence and detection. At the heart of the discussion is the impact of selection bias on most aspects of evaluation. These topics are the focus of ongoing work in the CCP.
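    As a toy illustration of why the counterfactual matters in such evaluations (all numbers invented, not drawn from the paper or from CCP work), the snippet below contrasts a naive before/after comparison with a difference-in-differences estimate that nets out the trend observed in a comparable, unaffected market.

```python
# Invented numbers for illustration: average prices before/after an intervention
# in an affected market and in a comparable control market.
treated_before, treated_after = 100.0, 112.0
control_before, control_after = 100.0, 108.0

naive_effect = treated_after - treated_before  # ignores what would have happened anyway
did_effect = (treated_after - treated_before) - (control_after - control_before)

print(f"naive before/after estimate: {naive_effect:+.1f}")  # +12.0
print(f"difference-in-differences:   {did_effect:+.1f}")    # +4.0 once the common trend is netted out
```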