
    Scalable Applications on Heterogeneous System Architectures: A Systematic Performance Analysis Framework

    The efficient parallel execution of scientific applications is a key challenge in high-performance computing (HPC). With the growing parallelism and heterogeneity of compute resources, as well as increasingly complex software, performance analysis has become an indispensable tool in the development and optimization of parallel programs. This thesis presents a framework for the systematic performance analysis of scalable, heterogeneous applications. Based on event traces, it automatically detects the critical path as well as inefficiencies that result in waiting or idle time, e.g. due to load imbalances between parallel execution streams. As a prerequisite for the analysis of heterogeneous programs, this thesis specifies inefficiency patterns for computation offloading. Furthermore, it made an essential contribution to the development of tool interfaces for OpenACC and OpenMP, which enable portable data acquisition and subsequent analysis for programs with offload directives. These interfaces are now part of the latest OpenACC and OpenMP API specifications. The aforementioned work, existing preliminary work, and established analysis methods are combined into a generic analysis process that can be applied across programming models. Based on the detection of wait or idle states, which can propagate over several levels of parallelism, the analysis identifies wasted computing resources and their root causes, as well as the critical-path share of each program region. It thus determines the influence of program regions on the load balancing between execution streams and on the program runtime. The analysis results include a summary of the detected inefficiency patterns and a program trace enhanced with information about wait states, their causes, and the critical path. In addition, a ranking based on the amount of waiting time a program region caused on the critical path highlights the program regions most relevant for optimization.
The scalability of the proposed performance analysis and its implementation is demonstrated using High-Performance Linpack (HPL), while the analysis results are validated with synthetic programs. A scientific application that uses MPI, OpenMP, and CUDA simultaneously is investigated in order to show the applicability of the analysis.
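The ranking step the abstract describes, ordering program regions by the waiting time they cause on the critical path, can be sketched roughly as follows. This is an illustrative toy only: the event-record fields (`region`, `caused_wait_time`, `on_critical_path`) and the function name are assumptions for the sketch, not the framework's actual data model or API.

```python
# Toy sketch: rank program regions by the waiting time they cause on the
# critical path. Field names are hypothetical, not the framework's real ones.
from collections import defaultdict

def rank_regions(events):
    """Return (region, total_caused_wait_time) pairs, worst offender first.

    `events` is a list of dicts with assumed fields:
      region           -- name of the program region
      caused_wait_time -- waiting time (seconds) this region caused elsewhere
      on_critical_path -- whether that waiting time lies on the critical path
    """
    waited = defaultdict(float)
    for ev in events:
        if ev["on_critical_path"]:
            waited[ev["region"]] += ev["caused_wait_time"]
    # Regions causing the most critical-path waiting time are the most
    # promising optimization targets, so sort them to the front.
    return sorted(waited.items(), key=lambda kv: kv[1], reverse=True)

trace = [
    {"region": "compute_kernel", "caused_wait_time": 0.8, "on_critical_path": True},
    {"region": "mpi_allreduce",  "caused_wait_time": 2.5, "on_critical_path": True},
    {"region": "io_write",       "caused_wait_time": 1.0, "on_critical_path": False},
]
ranking = rank_regions(trace)
```

Waiting time off the critical path (`io_write` above) is excluded because, by definition, shortening it cannot shorten the program runtime.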

    Features correlation-based workflows for high-performance computing systems diagnosis

    Analysing failures to improve the reliability of high-performance computing systems and data centres is important. The primary source of information for diagnosing system failures is the system logs, and it is widely known that finding the cause of a system failure using only system logs is incomplete. Resource utilisation data, which has recently become available, is another potentially useful source of information for failure analysis. However, large High-Performance Computing (HPC) systems generate a lot of data, and processing this huge amount of data presents a significant challenge for online failure diagnosis. Most work on failure diagnosis has studied only errors that lead to system failures; there is little work that studies, on real data, errors which lead to either a system failure or a recovery. In this thesis, we design, implement and evaluate two failure diagnostics frameworks, named CORRMEXT and EXERMEST, built from Data Type Extraction, Feature Extraction, Correlation and Time-bin Extraction modules. CORRMEXT integrates the Data Type Extraction, Correlation and Time-bin Extraction modules; it identifies error cases that occur frequently and reports the success and failure of error recovery protocols. EXERMEST integrates the Feature Extraction and Correlation modules; it extracts significant errors and resource use counters and identifies error cases that are rare. We apply both diagnostics frameworks to the resource use data and system logs of three HPC systems operated by the Texas Advanced Computing Center (TACC).
Our results show that: (i) multiple correlation methods are required to identify more of the dates on which groups of correlated resource use counters and groups of correlated errors occur, (ii) the earliest hour of change in system behaviour can only be identified by using the correlated resource use counters and correlated errors, (iii) multiple feature extraction methods are required for identifying the rare error cases, and (iv) time-bins of multiple granularities are necessary for identifying the rare error cases. CORRMEXT and EXERMEST are available in the public domain to support system administrators in failure diagnosis.
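Finding (iv), that time-bins of multiple granularities are needed, can be illustrated with a toy sketch: a pattern that correlates strongly at one bin size can vanish entirely at another, because coarser bins average it away. The `pearson` and `bin_series` helpers below are hypothetical stand-ins, not CORRMEXT's or EXERMEST's real interfaces.

```python
# Illustrative toy only: correlating a resource-use counter with an error
# count at several time-bin granularities. Names are assumptions.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient; 0.0 if either series is constant."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def bin_series(samples, bin_size):
    """Sum (timestamp, value) samples into time-bins of `bin_size` seconds."""
    bins = {}
    for t, v in samples:
        bins[t // bin_size] = bins.get(t // bin_size, 0) + v
    return [bins.get(i, 0) for i in range(max(bins) + 1)]

# Toy data sampled once per second: a counter that ramps up every 10 s,
# and errors that fire only near the top of each ramp.
counter = [(t, t % 10) for t in range(60)]
errors  = [(t, 1 if t % 10 > 7 else 0) for t in range(60)]

# Try several granularities, as the thesis's finding (iv) suggests.
correlations = {size: pearson(bin_series(counter, size),
                              bin_series(errors, size))
                for size in (5, 10, 30)}
```

On this toy data the 5-second bins expose a perfect correlation, while 10-second bins (one full ramp per bin) flatten both series into constants and the correlation disappears, which is exactly why a single granularity can miss rare error cases.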

    John Wesley's pioneer work in the field of applied Christianity.

    Typewritten sheets in cover. Thesis (M.A.)--Boston University. This item was digitized by the Internet Archive.

    Review of M. Peillon, Welfare in Ireland


    Cognitive control of attention, emotion, and memory : an ERP study

    Unwanted retrieval of negative memories can be problematic for many clinical populations. The Think/No-Think (T/NT) task (Anderson & Green, 2001) is a paradigm for studying cognitive control during cued recall: participants view a cue item and are asked to consciously retrieve (think) or interrupt retrieval (no-think) of the associated target item. Eyer (2009) found that self-reported mindfulness was correlated with T/NT cued recall, suggesting a relationship between control of memory retrieval and a general cognitive control skill. The current study measured event-related potentials (ERPs; i.e., electrical brain responses time-locked to cue presentation) for negative and neutral stimuli on the T/NT task to assess cognitive control during retrieval. Method: Participants (N = 35) completed questionnaires (e.g., mindfulness, intrusive thoughts) and cognitive tasks related to cognitive control (e.g., attention, working memory span). ERPs were then recorded during the T/NT task, followed by a final cued recall test. Results: Analyses of the ERPs found evidence for somewhat separable neural networks for control of memory retrieval and for processing the emotional content of the target pictures, with some time windows exhibiting only a main effect of strategy or of emotional valence. However, there was widespread evidence for interactions of these subsystems across a range of time latencies post cue presentation. Of particular note was a significant Strategy x Valence interaction for the early P1 component (125–164 ms). The overall size of the N2 (250–324 ms) peak was correlated with a wide range of self-report and cognitive test measures of cognitive control at frontal electrode sites. Discussion: The present study adds to knowledge of the timing of control processes during performance of the T/NT task through its use of ERP methodology.
The effect of the emotional valence of the to-be-recalled target on the early P1 ERP component suggests surprisingly early emotional processing during memory retrieval. The present results also suggest that at least some of the control processes used during the T/NT task are part of a larger general-purpose cognitive control system, and that individual traits exert important and varying influences on the cognitive control of emotional memories.

    Doctor of Philosophy

    Despite decades of awareness and research, cancer continues to grow as a threat to public health. This prevalence indicates the continued importance of attending to how cancer is covered and constructed in public health campaigns ("official" discourses) and mainstream news coverage ("common" discourses), particularly since the latter frequently shapes public perceptions about the disease and the former educates populations about the disease. In this dissertation, I assess and evaluate the differences and similarities between official and common discourses of health, paying particular attention to the existence, location, and mobilization of fissures between these discourses, especially as these fissures could indicate the pervasive discourses around particular cancers that patients are likely to have encountered and that may influence their perceptions of the disease, their experiences, and appropriate treatment. I am guided by four questions: (a) What are the differences, if any, between official and common health discourses of, respectively, breast, bladder, and skin cancers? (b) How are health providers, patients, and specific cancers rhetorically characterized, respectively, within and across official and common discourses? (c) How are individual and structural responsibility (or unaccountability) rhetorically mobilized across these different health conditions? (d) What are the implications of these findings for health information, education, promotion, and intervention efforts? I answer these questions through a critical rhetorical analysis of two distinct sets of texts for each cancer type under examination here: official/institutional discourses broadly disseminated to the public about these cancers, and mainstream news coverage. Analysis of these texts suggests that, in each case, official discourses characterize cancer, patients, and the medical establishment in ways that are distinct from common discourses.
In doing so, this study contributes to extant health communication literature by continuing to parse established knowledge about assumptions of patient responsibility and the role of structural entities in the fight against cancer. This study also complicates the official/common binary in order to apprehend a potential middle-ground discourse between official and vernacular discourses, thus resurfacing and redefining the notion of the "common" in order to account for the continued blurring of the line between media producer and consumer.