    Beyond Reuse Distance Analysis: Dynamic Analysis for Characterization of Data Locality Potential

    Emerging computer architectures will feature drastically decreased flops/byte (ratio of peak processing rate to memory bandwidth) as highlighted by recent studies on Exascale architectural trends. Further, flops are getting cheaper while the energy cost of data movement is increasingly dominant. The understanding and characterization of data locality properties of computations is critical in order to guide efforts to enhance data locality. Reuse distance analysis of memory address traces is a valuable tool to perform data locality characterization of programs. A single reuse distance analysis can be used to estimate the number of cache misses in a fully associative LRU cache of any size, thereby providing estimates on the minimum bandwidth requirements at different levels of the memory hierarchy to avoid being bandwidth bound. However, such an analysis only holds for the particular execution order that produced the trace. It cannot estimate potential improvement in data locality through dependence-preserving transformations that change the execution schedule of the operations in the computation. In this article, we develop a novel dynamic analysis approach to characterize the inherent locality properties of a computation and thereby assess the potential for data locality enhancement via dependence-preserving transformations. The execution trace of a code is analyzed to extract a computational directed acyclic graph (CDAG) of the data dependences. The CDAG is then partitioned into convex subsets, and the convex partitioning is used to reorder the operations in the execution trace to enhance data locality. The approach enables us to go beyond reuse distance analysis of a single specific order of execution of the operations of a computation in characterizing its data locality properties. It can serve a valuable role in identifying promising code regions for manual transformation, as well as in assessing the effectiveness of compiler transformations for data locality enhancement. We demonstrate the effectiveness of the approach using a number of benchmarks, including case studies where the potential shown by the analysis is exploited to achieve lower data movement costs and better performance. Comment: Transactions on Architecture and Code Optimization (2014).
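    To make the reuse-distance idea concrete, the sketch below computes LRU stack distances from a toy address trace and reads off the miss count of a fully associative LRU cache of any size from that single analysis; the trace, names and O(N*M) implementation are illustrative only and do not reproduce the paper's CDAG-based convex partitioning.

```python
from collections import Counter
from math import inf

def reuse_distances(trace):
    """LRU stack distance of every access: the number of distinct addresses
    touched since the previous access to the same address (inf = cold miss).
    Simple O(N*M) version, fine for short illustrative traces."""
    stack, dists = [], []          # most recently used address sits at the end
    for addr in trace:
        if addr in stack:
            dists.append(len(stack) - 1 - stack.index(addr))
            stack.remove(addr)
        else:
            dists.append(inf)
        stack.append(addr)
    return dists

def lru_misses(dists, cache_lines):
    """A fully associative LRU cache of `cache_lines` lines misses exactly
    the accesses whose reuse distance is >= cache_lines."""
    return sum(1 for d in dists if d >= cache_lines)

trace = ["a", "b", "c", "a", "b", "d", "a", "c"]   # toy address trace
dists = reuse_distances(trace)
print("reuse distance histogram:", Counter(dists))
for size in (1, 2, 3, 4):
    print(f"cache of {size} lines -> {lru_misses(dists, size)} misses")
```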

    Risk mitigation decisions for IT security

    Enterprises must manage their information risk as part of their larger operational risk management program. Managers must choose how to control for such information risk. This article defines the flow risk reduction problem and presents a formal model using a workflow framework. Three different control placement methods are introduced to solve the problem, and a comparative analysis is presented using a robust test set of 162 simulations. One year of simulated attacks is used to validate the quality of the solutions. We find that the math programming control placement method yields substantial improvements in terms of risk reduction and risk reduction on investment when compared to heuristics that would typically be used by managers to solve the problem. The contribution of this research is to provide managers with methods to substantially reduce information and security risks, while obtaining significantly better returns on their security investments. By using a workflow approach to control placement, which guides the manager to examine the entire infrastructure in a holistic manner, this research is unique in that it enables information risk to be examined strategically. © 2014 ACM
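    As a rough illustration of control placement on a workflow, and not the article's formal model or its math-programming method, the sketch below greedily buys the control with the best expected-loss reduction per unit cost until the budget is exhausted; the node names, costs and risk figures are invented.

```python
def greedy_control_placement(workflow, budget):
    """Toy heuristic: repeatedly place the control with the best
    expected-loss reduction per unit cost while it fits in the budget.
    workflow: {node: {"risk": expected_loss, "cost": control_cost,
                      "reduction": fraction_of_risk_removed}}"""
    placed, spent = [], 0.0
    residual_risk = sum(v["risk"] for v in workflow.values())
    remaining = dict(workflow)
    while True:
        affordable = {n: v for n, v in remaining.items()
                      if spent + v["cost"] <= budget}
        if not affordable:
            break
        best = max(affordable,
                   key=lambda n: affordable[n]["risk"] * affordable[n]["reduction"]
                                 / affordable[n]["cost"])
        spent += remaining[best]["cost"]
        residual_risk -= remaining[best]["risk"] * remaining[best]["reduction"]
        placed.append(best)
        del remaining[best]
    return placed, spent, residual_risk

workflow = {                       # invented example figures
    "ingest":  {"risk": 50.0,  "cost": 10.0, "reduction": 0.6},
    "approve": {"risk": 80.0,  "cost": 25.0, "reduction": 0.9},
    "payout":  {"risk": 120.0, "cost": 40.0, "reduction": 0.8},
}
print(greedy_control_placement(workflow, budget=60.0))
```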

    User profiles matching for different social networks based on faces embeddings

    It is common practice nowadays to use multiple social networks for different social roles. Despite this, these networks differ in content type, communication and style of speech. If we intend to understand human behaviour as a key feature for recommender systems, banking risk assessment or sociological research, this is better achieved by combining data from different social media. In this paper, we propose a new approach for user profile matching across social media based on embeddings of publicly available users' face photos, and conduct an experimental study of its efficiency. Our approach is robust to changes in content and style across certain social media. Comment: Submitted to the HAIS 2019 conference.
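    A minimal sketch of the matching step, assuming face embeddings have already been extracted for each profile photo; the cosine threshold, the greedy one-to-one matching and the synthetic 128-dimensional vectors below are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def match_profiles(embeds_a, embeds_b, threshold=0.8):
    """Greedily pair profiles from network A with profiles from network B
    by cosine similarity of their face embeddings; pairs below `threshold`
    stay unmatched. embeds_*: {profile_id: 1-D embedding vector}."""
    pairs, used_b = [], set()
    for a_id, a_vec in embeds_a.items():
        scored = sorted(((cosine(a_vec, b_vec), b_id)
                         for b_id, b_vec in embeds_b.items()
                         if b_id not in used_b), reverse=True)
        if scored and scored[0][0] >= threshold:
            sim, b_id = scored[0]
            pairs.append((a_id, b_id, round(sim, 3)))
            used_b.add(b_id)
    return pairs

# Synthetic demo: two "people" whose photos on both networks yield
# slightly noisy versions of the same underlying embedding.
rng = np.random.default_rng(0)
base = {u: rng.normal(size=128) for u in ("alice", "bob")}
net_a = {u: v + rng.normal(scale=0.05, size=128) for u, v in base.items()}
net_b = {u + "_x": v + rng.normal(scale=0.05, size=128) for u, v in base.items()}
print(match_profiles(net_a, net_b))
```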

    Co-Scheduling Algorithms for High-Throughput Workload Execution

    This paper investigates co-scheduling algorithms for processing a set of parallel applications. Instead of executing each application one by one, using a maximum degree of parallelism for each of them, we aim at scheduling several applications concurrently. We partition the original application set into a series of packs, which are executed one by one. A pack comprises several applications, each of them with an assigned number of processors, with the constraint that the total number of processors assigned within a pack does not exceed the maximum number of available processors. The objective is to determine a partition into packs, and an assignment of processors to applications, that minimize the sum of the execution times of the packs. We thoroughly study the complexity of this optimization problem, and propose several heuristics that exhibit very good performance on a variety of workloads, whose application execution times model profiles of parallel scientific codes. We show that co-scheduling leads to faster workload completion times and faster response times on average (hence increasing system throughput and saving energy), for significant benefits over traditional scheduling from both the user and system perspectives.
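    As a toy illustration of pack-based co-scheduling, and not one of the paper's heuristics, the sketch below assumes each application arrives with a fixed processor request and execution time, then packs them first-fit-decreasing so that the sum of pack times (each pack costing the maximum time of its members, since they run concurrently) stays small; the workload figures are invented.

```python
def co_schedule(apps, total_procs):
    """First-fit-decreasing heuristic. apps: list of (name, procs, time).
    A pack's applications run concurrently, so a pack costs the max time of
    its members; the objective is the sum of pack costs."""
    packs = []  # each pack: {"apps": [...], "procs": used, "time": max time}
    for name, procs, time in sorted(apps, key=lambda a: a[2], reverse=True):
        for pack in packs:
            if pack["procs"] + procs <= total_procs:
                pack["apps"].append(name)
                pack["procs"] += procs
                pack["time"] = max(pack["time"], time)
                break
        else:  # no existing pack has room: open a new one
            packs.append({"apps": [name], "procs": procs, "time": time})
    return packs, sum(p["time"] for p in packs)

apps = [("lu", 8, 40.0), ("cfd", 16, 35.0), ("fft", 8, 30.0),
        ("qcd", 16, 20.0), ("io", 4, 5.0)]          # invented workload
packs, total_time = co_schedule(apps, total_procs=32)
print("sum of pack times:", total_time)
for p in packs:
    print(p)
```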

    Heuristics and biases in cardiovascular disease prevention: How can we improve communication about risk, benefits and harms?

    Objective: Cardiovascular disease (CVD) prevention guidelines recommend medication based on the probability of a heart attack/stroke in the next 5–10 years. However, heuristics and biases make risk communication challenging for doctors. This study explored how patients interpret personalised CVD risk results presented in varying formats and timeframes.
    Methods: GPs recruited 25 patients with CVD risk factors and varying medication history. Participants were asked to ‘think aloud’ while using two CVD risk calculators that present probabilistic risk in different ways, within a semi-structured interview. Transcribed audio-recordings were coded using Framework Analysis.
    Results: Key themes were: 1) numbers lack meaning without a reference point; 2) risk results need to be both credible and novel; 3) selective attention to intervention effects. Risk categories (low/moderate/high) provided meaningful context, but short-term risk results were not credible if they did not match expectations. Colour-coded icon arrays showing the effect of age and interventions were seen as novel and motivating. Those on medication focused on benefits, while others focused on harms.
    Conclusion: CVD risk formats need to be tailored to patient expectations and experiences in order to counteract heuristics and biases.
    Practice implications: Doctors need access to multiple CVD risk formats to communicate effectively about CVD prevention.

    Swarm Intelligence Based Multi-phase OPF For Peak Power Loss Reduction In A Smart Grid

    Recently there has been increasing interest in improving smart grid efficiency using computational intelligence. A key challenge in future smart grids is designing an Optimal Power Flow (OPF) tool to solve important planning problems, including optimal DG capacities. Although a number of OPF tools exist for balanced networks, there is a lack of research on unbalanced multi-phase distribution networks. In this paper, a new OPF technique is proposed for the DG capacity planning of a smart grid. In the formulation of the proposed algorithm, a multi-phase power distribution system is considered that has unbalanced loading, voltage control and reactive power compensation devices. The proposed algorithm is built upon a co-simulation framework that optimizes the objective by adapting constriction-factor Particle Swarm Optimization. The proposed multi-phase OPF technique is validated using the IEEE 8500-node benchmark distribution system. Comment: IEEE PES GM 2014, Washington DC, USA.
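    For reference, a minimal constriction-factor PSO in the style the abstract mentions. The objective below is a toy quadratic stand-in, since the actual DG-capacity objective would be evaluated through the power-flow co-simulation; the bounds, swarm size and per-phase numbers are assumptions.

```python
import math
import random

def constriction_pso(objective, dim, bounds, iters=200, swarm=20,
                     c1=2.05, c2=2.05, seed=1):
    """Clerc constriction-factor PSO minimizing `objective` over a box."""
    phi = c1 + c2                                        # must exceed 4
    chi = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = chi * (vel[i][d]
                                   + c1 * r1 * (pbest[i][d] - pos[i][d])
                                   + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp the particle to the feasible box
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in objective: losses grow quadratically away from an ideal
# per-phase DG capacity of 1.5 MW (purely illustrative numbers).
loss = lambda x: sum((xi - 1.5) ** 2 for xi in x)
print(constriction_pso(loss, dim=3, bounds=(0.0, 5.0)))
```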