
    Productivity and Software Development Effort Estimation in High-Performance Computing

    Ever-increasing demands for computational power are accompanied by rising electrical power needs and growing complexity in hardware and software designs. Accordingly, increasing expenses for hardware, electrical power, and programming tighten the rein on available budgets. Hence, informed decision making on how to invest available budgets is more important than ever. Especially for procurements, a quantitative metric is needed to predict the cost effectiveness of an HPC center.

    In this work, I set up models and methodologies to support the HPC procurement process of German HPC centers. I model cost effectiveness as a productivity figure of merit of HPC centers by defining a ratio of the scientific outcome generated over the lifetime of the HPC system to its total cost of ownership (TCO). I further define scientific outcome as the number of scientific-application runs to embrace the multi-job nature of an HPC system in a meaningful way. I investigate the predictability of the productivity model's parameters and show their robustness towards errors in various real-world HPC setups. Case studies further verify the model's applicability, e.g., to compare hardware setups or to optimize system lifetime.

    I continue by investigating the total ownership costs of HPC centers as part of the productivity metric. I model TCO by splitting expenses into one-time and annual costs, node-based and node-type-based costs, as well as system-dependent and application-dependent costs. Furthermore, I discuss the quantification and predictability of all TCO components.

    I tackle the challenge of estimating HPC software development effort as a TCO component of increasing importance. For that, I establish a methodology based on a so-called performance life-cycle, which describes the relationship of effort to the performance achieved by spending that effort. To identify further impact factors on application development effort, I apply ranking surveys that reveal priorities for quantifying effects. One such effect is the developer's pre-knowledge in HPC, whose quantification is addressed by confidence ratings in so-called knowledge surveys. I also examine the quantification of the impact of the parallel programming model by proposing a pattern-based approach. Since meaningful quantifications rely on sufficient and appropriate data sets, I broaden previous human-subject-based data collections by introducing tools and methods for a community effort. Finally, I present the applicability of my methodologies and models in a case study that covers a real-world application from aeroacoustics simulation.
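    The abstract describes the productivity metric only in words. As a rough illustration under assumed inputs (the function names, the simplified cost split, and all numbers below are hypothetical and not taken from the thesis), the ratio of application runs over the system lifetime to the total cost of ownership could be sketched as follows:

        # Minimal sketch of a productivity figure of merit: application runs over the
        # system lifetime divided by total cost of ownership (one-time plus annual costs).
        # All names and numbers are illustrative assumptions, not values from the thesis.

        def tco(one_time_costs, annual_costs, lifetime_years):
            """Total cost of ownership: one-time costs plus annual costs over the lifetime."""
            return one_time_costs + annual_costs * lifetime_years

        def productivity(app_runs_per_year, lifetime_years, one_time_costs, annual_costs):
            """Scientific outcome (application runs over the lifetime) per unit of cost."""
            outcome = app_runs_per_year * lifetime_years
            return outcome / tco(one_time_costs, annual_costs, lifetime_years)

        # Hypothetical comparison of two hardware setups with different cost structures.
        setup_a = productivity(app_runs_per_year=50_000, lifetime_years=5,
                               one_time_costs=10e6, annual_costs=2e6)
        setup_b = productivity(app_runs_per_year=65_000, lifetime_years=5,
                               one_time_costs=14e6, annual_costs=1.5e6)
        print(f"Setup A: {setup_a:.4f} runs per currency unit")
        print(f"Setup B: {setup_b:.4f} runs per currency unit")

    Within this simplified sketch, varying lifetime_years also hints at the lifetime-optimization use case mentioned above, since it shifts the balance between one-time and annual costs in the denominator while scaling the outcome in the numerator.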