
    Survey of new vector computers: the CRAY 1S from Cray Research, the CYBER 205 from CDC and the parallel computer from ICL - architecture and programming

    Problems which can arise with vector and parallel computers are discussed in a user-oriented context. Emphasis is placed on the algorithms used and the programming techniques adopted. Three recently developed supercomputers are examined, and typical application examples are given in CRAY FORTRAN, CYBER 205 FORTRAN and DAP (distributed array processor) FORTRAN. The performance of the three systems is compared. The addition of parts of two N x N arrays is considered, and the influence of the architecture on the algorithms and programming language is demonstrated. Numerical analysis of magnetohydrodynamic differential equations by an explicit difference method is illustrated, showing very good results for all three systems. The prognosis for supercomputer development is assessed.
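    As a rough modern analogue of the survey's running example, the sketch below contrasts a scalar element-by-element loop with the single array-section statement that a vector machine executes; NumPy stands in here for the array syntax of the three FORTRAN dialects, and the array size and half-size sub-block are arbitrary assumptions.

```python
import numpy as np

# Arbitrary illustrative size; the abstract only says "parts of two N x N arrays".
N = 256
a = np.random.rand(N, N)
b = np.random.rand(N, N)
c = np.zeros_like(a)

# Scalar formulation: one element per iteration, as on a conventional
# serial processor.
for i in range(N // 2):
    for j in range(N // 2):
        c[i, j] = a[i, j] + b[i, j]

# Vector formulation: the whole sub-block is a single array operation,
# the same idea as the array-section syntax of CRAY FORTRAN, CYBER 205
# FORTRAN and DAP FORTRAN, with no explicit element loop.
c[:N // 2, :N // 2] = a[:N // 2, :N // 2] + b[:N // 2, :N // 2]
```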

    Lattice gauge theories, dynamical fermions and parallel computation

    SIGLE. Available from British Library Document Supply Centre (BLDSC), DSC:D71683/87, United Kingdom.

    Trusted execution: applications and verification

    Useful security properties arise from sealing data to specific units of code. Modern processors featuring Intel’s TXT and AMD’s SVM achieve this through a process of measured and trusted execution: only code which has the correct measurement can access the data, and this code runs in an environment protected from observation and interference. We discuss the history of attempts to provide security for hardware platforms, and review the literature in the field. We propose some applications which would benefit from trusted execution, and discuss the functionality it enables. We present in more detail a novel variation on Diffie-Hellman key exchange which removes some reliance on random number generation. We present a modelling language with primitives for trusted execution, along with its semantics, and we characterise an attacker who has access to all the capabilities of the hardware. In order to achieve automatic analysis of systems using trusted execution without searching a potentially infinite state space, we define transformations that reduce the number of times the attacker needs to use trusted execution to a pre-determined bound. Given reasonable assumptions, we prove the soundness of the transformation: no secrecy attacks are lost by applying it. We then describe using the StatVerif extensions to ProVerif to model the bounded invocations of trusted execution. We show the analysis of realistic systems, for which we provide case studies.
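    For orientation, here is a minimal sketch of textbook finite-field Diffie-Hellman in Python. The thesis's variation, which removes some reliance on random number generation, is not detailed in the abstract and is not reproduced here; the toy prime and generator are illustrative assumptions, not deployment parameters.

```python
import secrets

# Toy group parameters for illustration only: p is the Mersenne prime
# 2^61 - 1. Real deployments use standardized groups (e.g. RFC 3526).
p = 2**61 - 1
g = 3

# Each party draws a fresh private exponent. This is the step whose
# dependence on good randomness the thesis's variation aims to reduce.
a = secrets.randbelow(p - 2) + 1  # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1  # Bob's secret exponent

A = pow(g, a, p)  # Alice sends A to Bob
B = pow(g, b, p)  # Bob sends B to Alice

# Both sides derive the same shared secret g^(a*b) mod p.
assert pow(B, a, p) == pow(A, b, p)
```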

    Biological sequence comparison on a parallel computer


    Parallel algorithm for large scale electronic structure calculations

    SIGLE. Available from British Library Document Supply Centre (BLDSC), DSC:DX84152, United Kingdom.

    The rhetoric of Americanisation: social construction and the British computer industry in the Post-World War II period

    This research seeks to understand the process of technological development in the UK and the specific role of a ‘rhetoric of Americanisation’ in that process. The concept of a ‘rhetoric of Americanisation’ is developed throughout the thesis through a study of the computer industry in the UK in the post-war period. Specifically, the thesis discusses the perceived threat of America: how actors in the network of innovation within the British computer industry saw it as a threat, and the effect this perception had on those operating in the networks of construction in the industry. The reaction to this threat was not a simple one, however; the story is marked by sectional interests and technopolitical machination attempting to capture this rhetoric of ‘threat’ and ‘falling behind’. The thesis explores in detail the concepts of ‘threat’ and ‘falling behind’, or more simply the ‘rhetoric of Americanisation’, and the effect they had on the development of the British computer industry. What form did the process of capture and modification by sectional interests within government and industry take, and what impact did this have on the British computer industry?

    In answering these questions, the thesis first develops a concept of a British culture of computing, which acts as the surface of emergence for various ideologies of innovation within the social networks that made up the computer industry in the UK. In developing this understanding, the fundamental distinction between the US and UK cultures of computing is explored, which in turn allows a concept of how Americanisation emerged as a rhetorical construct. Under the influence of a ‘rhetoric of Americanisation’, the culture of computing in the UK began to change, as did the process through which government and industry interacted in the development of computing technologies.

    The second half of the thesis develops a more nuanced and complete view of the nature of innovation in computing in the UK in the sixties, through an understanding of the networks of interaction between government and industry and how these networks were reconfigured through a ‘rhetoric of Americanisation’. As a result, the thesis arrives at a more complete view of change and development within the British computer industry and of how interaction with government influences that change.

    Internal dynamics of galaxy clusters from cosmological hydrodynamical simulations

    Galaxy clusters are the most massive systems in the Universe. They are usually located at the nodes of the cosmic web, from which they continuously accrete matter. In this work, by combining cosmological simulations and local Universe observations, we examine several properties of the different collisionless tracers of the internal dynamics of galaxy clusters, namely Dark Matter (DM), stars, and galaxies, to gain insight into the main processes operating in structure formation and evolution. We base our analysis on the DIANOGA zoom-in simulation set, which is composed of 29 Lagrangian regions simulated at different levels of resolution and under varying physical conditions (full hydrodynamical and/or N-body simulations).

    Recent measurements (Biviano et al. 2013, 2016; Capasso et al. 2019) of the pseudo-entropy (σ² / ρ^(2/3), where σ is the velocity dispersion and ρ the density of the collisionless tracer) allowed us to study its role in the evolution of clusters as a dynamical attractor (e.g., Taylor et al. 2001; Dehnen et al. 2005). Its fingerprint is a universal radial profile described by a simple power law. We find good agreement in both normalisation and slope between observations and simulations. A significant tension is present for the member-galaxy population; we discuss in detail the probable reasons behind this finding.

    A large body of spectroscopic measurements (Loubser et al. 2018; Sohn et al. 2020, 2021) provides a large statistical sample with which to study the dynamics of the Brightest Cluster Galaxy (BCG). We compare scaling relations between the BCG and cluster velocity dispersions and the corresponding masses: we find generally good agreement with observational results for the former and significant tension for the latter. We analyse the key features of the velocity dispersion profiles, as traced by stars, DM, and galaxies (Sartoris et al. 2020), and find them in excellent agreement with simulations. We also quantify the impact of the IntraCluster Light (ICL) on these measurements.

    Furthermore, given the dynamical distinction between BCG and ICL, we developed a Machine Learning (ML) method based on a supervised Random Forest to classify stars in simulated galaxy clusters into these two classes. We employ matched stellar catalogues (built from a modified version of Subfind; Dolag et al. 2010) to train and test the classifier. The input features are cluster mass, normalised particle clustercentric distance, and rest-frame velocity. The model correctly identifies most of the stars, while the largest errors occur at the BCG outskirts, where the differences between the physical properties of the two components are less obvious. We find that our classifier provides consistent results in simulations for clusters at z < 1, across different numerical resolutions and feedback implementations.

    The last part of the project focused on creating an ML framework to bridge the observational analysis with predictions from simulations. Measuring the ICL in observations is a difficult task, often addressed by fitting functional profiles to the BCG+ICL light profile, with different approaches often providing significantly different results. We developed a method based on convolutional neural networks to identify the ICL distribution in mock images of galaxy clusters, according to the dynamical classification we routinely perform in simulations. We construct several sets of mock images based on different observables (i.e., magnitudes, line-of-sight velocity, and velocity dispersion) that can be employed as input by the network to predict the ICL distribution in such images. This project has highlighted the dependence of the ICL build-up on the numerical resolution of the simulations, a problem which requires further investigation.
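    A minimal sketch of the supervised Random Forest setup described above, using scikit-learn and the three input features named in the abstract. The feature ranges and labels below are random placeholders, not the DIANOGA catalogues or the thesis's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder stand-in for a matched stellar catalogue: each star
# particle carries the three features named in the abstract plus a
# BCG/ICL label from the dynamical classification.
rng = np.random.default_rng(0)
n = 10_000
X = np.column_stack([
    rng.uniform(1e14, 1e15, n),  # host cluster mass (assumed range, M_sun)
    rng.uniform(0.0, 1.0, n),    # normalised clustercentric distance
    rng.normal(0.0, 500.0, n),   # rest-frame velocity (km/s)
])
y = rng.integers(0, 2, n)        # 0 = BCG star, 1 = ICL star (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```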