4,655 research outputs found

    Energy Saving Potential of Idle Pacman Supercomputing Nodes

    To determine the energy saving potential of suspending idle supercomputing nodes without sacrificing efficiency, my research involved the setup of a compute node power usage monitoring system. This system measures how much power each node draws at its different levels of operation using an automated Expect script. The script automates tasks with interactive command-line interfaces to perform the power measurement readings. Steps required for the power usage monitoring system include remotely logging into the Pacman Penguin compute cluster power distribution units (PDUs), feeding commands to the PDUs, and storing the returned data. Using a Python script, the data are then parsed into a more coherent format and written to a common file format for analysis. With this system, the Arctic Region Supercomputing Center (ARSC) will be able to determine how much energy is used during different levels of load intensity on the Pacman supercomputer and how much energy can be saved by suspending unnecessary nodes during periods of reduced activity. Power utilization by supercomputers is of major interest to those who design and purchase them. Since 2008, the leading source of worldwide supercomputer speed rankings has also included power consumption and power efficiency values. Because digital computers use electricity to perform computation, larger computers tend to consume more energy and produce more heat. Pacman, an acronym for Pacific Area Climate Monitoring and Analysis Network, is a high performance supercomputer designed for large compute- and memory-intensive jobs. Pacman is composed of the following general computational nodes:
    • 256 four-core compute nodes, each containing two dual-core 2.6 GHz AMD Opteron processors
    • 20 twelve-core compute nodes, each containing two six-core 2.6 GHz AMD Opteron processors
    • 88 sixteen-core compute nodes, each containing two eight-core 2.3 GHz AMD Opteron processors
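    The parsing stage of the pipeline described above can be sketched in Python. The abstract does not give the actual format returned by the Pacman PDUs, so the raw text layout, the regular expression, and the function names below are illustrative assumptions, not the author's implementation:

    ```python
    import csv
    import io
    import re

    # Hypothetical example of raw text captured from a PDU's interactive
    # command-line interface by the Expect script; the real Pacman PDU
    # output format is not specified in the abstract.
    RAW_PDU_OUTPUT = """\
    Outlet 1: node001  Power: 312 W
    Outlet 2: node002  Power: 298 W
    Outlet 3: node003  Power: 15 W
    """

    LINE_RE = re.compile(r"Outlet\s+(\d+):\s+(\S+)\s+Power:\s+(\d+)\s*W")

    def parse_pdu_output(raw):
        """Extract (outlet, node, watts) tuples from raw PDU text."""
        rows = []
        for line in raw.splitlines():
            m = LINE_RE.match(line.strip())
            if m:
                rows.append((int(m.group(1)), m.group(2), int(m.group(3))))
        return rows

    def to_csv(rows):
        """Write parsed readings to a common file format (CSV) for analysis."""
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["outlet", "node", "watts"])
        writer.writerows(rows)
        return buf.getvalue()

    print(to_csv(parse_pdu_output(RAW_PDU_OUTPUT)))
    ```

    Comparing the per-node watt readings gathered this way across idle, partial, and full load would give the per-level figures needed to estimate the savings from suspending idle nodes.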

    Social Impact Bonds: Overview and Considerations

    One of the hottest topics in human services is "pay-for-success" approaches to government contracting. In this era of tight budgets and increased skepticism about the effectiveness of government-funded programs, the idea that the government could pay only for proven results has a broad appeal. And those who have identified prevention-focused models that have the potential to improve long-term outcomes and save the government money are deeply frustrated that they have been unable to attract the funding needed to take these programs to scale. Some advocates for expanded prevention efforts are confident that these programs could thrive under pay for success and see such an approach as a way to break out of the harmful cycle where what limited funds are available must be used to provide services for those who are already in crisis, and there are rarely sufficient funds to pay for prevention

    The staying power of self interest: Kenya's unshakeable elites

    Kenyan political discourse has been dominated by elites and the politics of self-interest since independence. Kenya is notorious for its levels of corruption and questionable levels of accountability. The persistence of this status quo has defined the Kenyan democratic transition as one of 'dominant power politics' and epitomises Ghanaian economist George Ayittey's description of the 'African Vampire State'. The post-election violence and controversy in 2007 – 2008 led to both a new constitution and a power-sharing agreement between the two major parties; these events have been described as the 'rebirth' of Kenya. This paper confronts the extent to which this 'rebirth' has had an impact on the contemporary political and economic climate within Kenya and assesses the extent to which elite dominance has shaped and continues to shape Kenyan political discourse

    An improved double-toroidal spectrometer for gas phase (e,2e) studies

    A new spectrometer is described for measuring the momentum distributions of scattered electrons arising from electron-atom and electron-molecule ionization experiments. It incorporates and builds on elements from a number of previous designs, namely, a source of polarized electrons and two high-efficiency electrostatic electron energy analyzers. The analyzers each comprise a seven-element retarding-electrostatic lens system, four toroidal-sector electrodes, and a fast position-and-time-sensitive two-dimensional delay-line detector. Results are presented for the electron-impact-induced ionization of helium and the elastic scattering of electrons from argon and helium which demonstrate that high levels of momentum resolution and data-collection efficiency are achieved. Problematic aspects regarding variations in collection efficiency over the accepted momentum phase space are addressed and a methodology for their correction presented. Principles behind the present design and previous designs for electrostatic analyzers based around electrodes of toroidal-sector geometry are discussed and a framework is provided for optimizing future devices. The assistance of the Australian-German Research Cooperation Scheme and the Australian Research Council through Grant No. DP0452553 and a 1998 ARC RIEF grant is gratefully acknowledged

    Static Data Structure Lower Bounds Imply Rigidity

    We show that static data structure lower bounds in the group (linear) model imply semi-explicit lower bounds on matrix rigidity. In particular, we prove that an explicit lower bound of $t \geq \omega(\log^2 n)$ on the cell-probe complexity of linear data structures in the group model, even against arbitrarily small linear space ($s = (1+\varepsilon)n$), would already imply a semi-explicit ($\mathbf{P}^{NP}$) construction of rigid matrices with significantly better parameters than the current state of the art (Alon, Panigrahy and Yekhanin, 2009). Our results further assert that polynomial ($t \geq n^{\delta}$) data structure lower bounds against near-optimal space would imply super-linear circuit lower bounds for log-depth linear circuits (a four-decade open question). In the succinct space regime ($s = n + o(n)$), we show that any improvement on current cell-probe lower bounds in the linear model would also imply new rigidity bounds. Our results rely on a new connection between the "inner" and "outer" dimensions of a matrix (Paturi and Pudlak, 2006), and on a new reduction from worst-case to average-case rigidity, which is of independent interest

    Dropouts and Their Relationship to Guidance

    No abstract provided by author

    Aortic Stenosis

    Aortic stenosis is the progressive and permanent narrowing of the aortic valve that is located between the left ventricle and the aorta. The pathophysiology is endothelial damage to the valve resulting in lipid penetration, calcific changes and valve stiffness. Major risk factors for aortic stenosis are natural aging >60 years (atherosclerotic changes in vasculature) and male gender. In the early phases of aortic stenosis, the body compensates via hypertrophy of the left ventricle to accommodate the increased pressure gradient. Progression is typically over years to decades until decreased outflow of blood leads to inadequate perfusion of major organ systems, including the heart itself. Patients do not typically have symptoms until flow is severely obstructed, with a valve diameter <1.0 cm². When patients develop symptoms (primarily heart failure symptoms: syncope, exercise intolerance, chest pain), the stenosis is severe, and prognosis is poor without treatment. Currently, there are no pharmacological treatments proven to slow progression of aortic stenosis. Therefore, patients must undergo surgical aortic valve replacement (SAVR) or transcatheter aortic valve replacement (TAVR). TAVR procedures are designed to be a safer alternative for older individuals who do not meet the criteria for SAVR candidacy. A case study involving a 97-year-old male who underwent a TAVR is discussed. In conclusion, aortic stenosis is the progressive narrowing of the aortic valve caused by endothelial damage that leads to valve stiffening and decreased outflow. Once the valve obstructs flow, heart failure can arise and ultimately causes death if not treated

    Measuring eccentricity in binary black hole inspirals with gravitational waves

    When binary black holes form in the field, it is expected that their orbits typically circularize before coalescence. In galactic nuclei and globular clusters, binary black holes can form dynamically. Recent results suggest that $\approx 5\%$ of mergers in globular clusters result from three-body interactions. These three-body interactions are expected to induce significant orbital eccentricity $\gtrsim 0.1$ when they enter the Advanced LIGO band at a gravitational-wave frequency of 10 Hz. Measurements of binary black hole eccentricity therefore provide a means for determining whether or not dynamical formation is the primary channel for producing binary black hole mergers. We present a framework for performing Bayesian parameter estimation on gravitational-wave observations of black hole inspirals. Using this framework, and employing the non-spinning, inspiral-only EccentricFD waveform approximant, we determine the minimum detectable eccentricity for an event with masses and distance similar to GW150914. At design sensitivity, we find that the current generation of advanced observatories will be sensitive to orbital eccentricities of $\gtrsim 0.05$ at a gravitational-wave frequency of 10 Hz, demonstrating that existing detectors can use eccentricity to distinguish between circular field binaries and globular cluster triples. We compare this result to eccentricity distributions predicted to result from three black hole binary formation channels, showing that measurements of eccentricity could be used to infer the population properties of binary black holes. Comment: 12 pages, 7 figures, 2 tables

    Coherence Length, Coherence Width and the Helium Atom Microscope

    This paper is an accompaniment to the three reports on the Helium Atom Microscope, written in the years 1991 and 1992, which assess various design scenarios for an atom microscope, anticipate their performance and identify hurdles to overcome. It was written between 1991 and 1993 while the author was an Alexander von Humboldt Fellow, hosted by the Max Planck Institut für Strömungsforschung, Göttingen, and under the guidance of Professor J. P. Toennies, when together they were investigating the feasibility of building such a device. This paper focuses on the concepts of coherence length and coherence width, both of which should be considered when optimizing an atom microscope design