12,962 research outputs found

    High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material. Comment: 72 pages

    Administrative Management Capacity in Out-of-School Time Organizations: An Exploratory Study

    Based on interviews with sixteen high-quality out-of-school time (OST) program providers, identifies the managerial and administrative needs of OST nonprofits, such as financial and human resources management and information technology, and suggests solutions.

    De-perimeterisation as a cycle: tearing down and rebuilding security perimeters

    If an organisation wants to secure its IT assets, where should the security mechanisms be placed? The traditional view is the hard-shell model, in which an organisation secures all its assets behind a fixed security border: what is inside the security perimeter is more or less trusted; what is outside is not. Due to changes in technologies, business processes and their legal environments, this approach is no longer adequate. This paper examines this process, which the Jericho Forum has coined de-perimeterisation. We analyse and define the concepts of perimeter and de-perimeterisation, and show that there is a long-term trend in which de-perimeterisation is iteratively accelerated and decelerated. In times of accelerated de-perimeterisation, technical and organisational changes take place by which connectivity between organisations and their environment scales up significantly. In times of deceleration, technical and organisational security measures are taken to decrease the security risks that come with de-perimeterisation, a movement that we call re-perimeterisation. We identify the technical and organisational mechanisms that facilitate de-perimeterisation and re-perimeterisation, and discuss the forces that cause organisations to alternate between these two movements.

    Enhancing Job Scheduling of an Atmospheric Intensive Data Application

    Nowadays, e-Science applications involve a great deal of data in order to produce more accurate analyses. One such application domain is Radio Occultation, which processes satellite data. Grid Processing Management is a geographically distributed physical infrastructure, based on Grid Computing, implemented for the overall Radio Occultation processing analysis. After a brief description of the algorithms adopted to characterize atmospheric profiles, the paper presents an improvement to job scheduling aimed at decreasing processing time and optimizing resource utilization. Grid computing capacity is extended with virtual machines added to the existing physical Grid in order to satisfy temporary job requests. Scheduling also plays an important role in the infrastructure and is handled by a pair of schedulers developed to manage data automatically.
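    The abstract above describes extending a physical Grid with temporary virtual machines when job demand spikes. As a rough, hypothetical illustration of that idea (not the paper's actual scheduler; the class and function names below are invented for this sketch), a minimal threshold-based dispatcher in Python could look like the following.

    # Illustrative sketch only: a threshold-based scheduler that adds virtual
    # workers when the pending job queue outgrows the fixed physical capacity,
    # mirroring the idea of extending the Grid with VMs for temporary job
    # requests. All names here are hypothetical, not the paper's components.
    from collections import deque
    from dataclasses import dataclass, field

    @dataclass
    class Job:
        job_id: str
        input_files: list          # e.g. Radio Occultation satellite data products

    @dataclass
    class ElasticGridScheduler:
        physical_slots: int        # fixed capacity of the physical Grid
        max_virtual_slots: int     # ceiling on temporary VM workers
        queue: deque = field(default_factory=deque)
        virtual_slots: int = 0     # VMs currently provisioned

        def submit(self, job: Job) -> None:
            self.queue.append(job)

        def schedule(self) -> list:
            """Dispatch as many queued jobs as capacity allows, provisioning
            extra virtual slots when the backlog exceeds physical capacity."""
            backlog = len(self.queue)
            if backlog > self.physical_slots:
                needed = min(backlog - self.physical_slots,
                             self.max_virtual_slots - self.virtual_slots)
                self.virtual_slots += max(needed, 0)    # "provision" temporary VMs
            capacity = self.physical_slots + self.virtual_slots
            dispatched = [self.queue.popleft() for _ in range(min(capacity, backlog))]
            if not self.queue:
                self.virtual_slots = 0                  # release VMs once the burst is over
            return dispatched

    if __name__ == "__main__":
        sched = ElasticGridScheduler(physical_slots=2, max_virtual_slots=4)
        for i in range(5):
            sched.submit(Job(job_id=f"ro-profile-{i}", input_files=[]))
        print([j.job_id for j in sched.schedule()])     # 5 jobs dispatched: 2 physical + 3 virtual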

    Analysis domain model for shared virtual environments

    The field of shared virtual environments, which also encompasses online games and social 3D environments, has a system landscape consisting of multiple solutions that share great functional overlap. However, there is little system interoperability between the different solutions. A shared virtual environment has an associated problem domain that is highly complex, raising difficult challenges for the development process, starting with the architectural design of the underlying system. This paper has two main contributions. The first is a broad domain analysis of shared virtual environments, which enables developers to have a better understanding of the whole rather than only the part(s). The second is a reference domain model for discussing and describing solutions: the Analysis Domain Model.
