48,585 research outputs found

    Geddes at UCL “There was something more in town planning than met the eye!”

    Patrick Geddes was at UCL from 1877 to 1878, although as a student of physiology, not yet as the ‘father of British Town Planning’ he was to become. We explore his time there and the links he had both back to Charles Darwin and forward to Patrick Abercrombie. This is part of our wider quest to assess the impact of Geddes on evolutionary theory in the study of cities and planning, on which we plan a more substantial paper that we will, in due course, post on this website.

    I’m so Self-Conscious: Kanye West’s Rhetorical Wrestling with Theodicy and Nihilism

    Whether Kanye’s plea to God is to intervene because “the devil’s trying to break [him] down,” or that he (Kanye) is “tryna keep [his] faith,” Kanye West’s lamentations communicate his struggle against succumbing to suffering in the world. Despite the twelve-year span between “Jesus Walks” and “Ultralight Beam,” Kanye West’s rhetoric in both songs attempts to make meaning of theodicy (suffering) while simultaneously combating nihilism (the lack of hope). As a professed Christian who articulates the multiplicity of God through Jesus and himself (Kanye West), affirmed on “I Am a God,” a track from his 2013 album Yeezus, West complicates religiosity and self-consciousness. He does so by situating himself as both God and human: recognizing the limitations of a God who has yet to impact his situation as a Black man in America, and his human self that operates as a venerated deity. West’s consciousness is an amalgamation of his warring with theodicy and nihilism. My essay implements a theo-rhetorical analysis of “Jesus Walks” and “Ultralight Beam,” exploring meaning-making processes of locating God. In doing so, I define theodicy and nihilism as repelling mores that aid in self-preservation for West.

    Chapter 9 Gene Drive Strategies for Population Replacement

    Gene drive systems are selfish genetic elements capable of spreading into a population despite a fitness cost. A variety of these systems have been proposed for spreading disease-refractory genes into mosquito populations, thus reducing their ability to transmit diseases such as malaria and dengue fever to humans. Some have also been proposed for suppressing mosquito populations. We assess the alignment of these systems with design criteria for their safety and efficacy. Systems such as homing endonuclease genes, which manipulate inheritance through DNA cleavage and repair, are highly invasive and well-suited to population suppression efforts. Systems such as Medea, which use combinations of toxins and antidotes to favor their own inheritance, are highly stable and suitable for replacing mosquito populations with disease-refractory varieties. These systems offer much promise for future vector-borne disease control.
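The toxin–antidote logic behind a Medea element can be sketched as a one-locus genotype-frequency recursion: mm offspring of Medea-carrying mothers are killed by the maternal toxin, which favors the element even when carriers pay a fitness cost. The model below is an illustrative toy, not taken from the chapter; the 5% carrier cost, the 25% release frequency, and the deterministic random-mating assumptions are all invented for the sketch.

```python
def medea_step(freqs, c=0.05):
    """One generation of random mating with a Medea element.
    freqs = (MM, Mm, mm) adult genotype frequencies;
    c = fitness cost borne by Medea carriers (illustrative value)."""
    MM, Mm, mm = freqs
    w = [MM * (1 - c), Mm * (1 - c), mm]       # carrier cost acts on adults
    MM, Mm, mm = [v / sum(w) for v in w]
    p = MM + Mm / 2                            # Medea-allele frequency in gametes
    q = 1 - p
    # Offspring genotypes by mother genotype; the maternal toxin kills the mm
    # offspring of Medea-carrying mothers, so only mm mothers leave mm young.
    o_MM = MM * p + Mm * p / 2
    o_Mm = MM * q + Mm / 2 + mm * p
    o_mm = mm * q
    tot = o_MM + o_Mm + o_mm
    return (o_MM / tot, o_Mm / tot, o_mm / tot)

# release heterozygous carriers at 25% of the population
x = (0.0, 0.25, 0.75)
for _ in range(40):
    x = medea_step(x)
print(x[0] + x[1] / 2 > 0.125)   # → True: the element spreads despite its cost
```

Iterating the recursion shows the characteristic Medea behavior the abstract describes: the allele frequency rises generation by generation even though carriers are less fit.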

    Stellar contributions to the hard X-ray galactic ridge

    The number density of serendipitous sources in galactic-plane Einstein Observatory IPC fields is compared with predictions based on the intensity of the HEAO-1 A2 unresolved hard X-ray galactic ridge emission. It is concluded that theoretically predicted X-ray source populations of luminosity 8 x 10^32 to 3 x 10^34 erg s^-1 have 2 keV to 10 keV local surface densities of less than approximately 8 x 10^-4 L(32) pc^-2 and are unlikely to be the dominant contributors to the hard X-ray ridge. An estimate for Be/neutron-star binary systems, such as X Persei, gives a 2 keV to 10 keV local surface density of approximately 26 x 10^-5 L(32) pc^-2. Stellar systems of low luminosity are more likely contributors. Both RS CVn and cataclysmic variable systems together contribute 43% ± 18% of the ridge. A more sensitive measurement of the ridge's hard X-ray spectrum should reveal Fe-line emission. We speculate that dM stars are further major contributors.

    Minimum entropy restoration using FPGAs and high-level techniques

    One of the greatest perceived barriers to the widespread use of FPGAs in image processing is the difficulty for application specialists of developing algorithms on reconfigurable hardware. Minimum entropy deconvolution (MED) techniques have been shown to be effective in the restoration of star-field images. This paper reports on an attempt to implement a MED algorithm using simulated annealing, first on a microprocessor, then on an FPGA. The FPGA implementation uses DIME-C, a C-to-gates compiler, coupled with a low-level core library to simplify the design task. Analysis of the C code and output from the DIME-C compiler guided the code optimisation. The paper reports on the design effort that this entailed and the resultant performance improvements.
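The annealing loop at the heart of such a restoration can be sketched in a few lines. This toy omits the deconvolution (data-fidelity) constraint a real MED restoration needs and simply minimises an entropy objective over a 1-D "image"; the step size and cooling rate are illustrative assumptions, not the paper's settings.

```python
import math
import random

def entropy(img):
    # Shannon entropy of the normalised pixel intensities
    s = sum(img)
    ps = [v / s for v in img if v > 0]
    return -sum(p * math.log(p) for p in ps)

def anneal(img, steps=20000, t0=1.0, seed=0):
    """Toy simulated-annealing loop: perturb one pixel at a time, accept
    uphill moves with Boltzmann probability, and cool geometrically."""
    rng = random.Random(seed)
    cur = list(img)
    e_cur = entropy(cur)
    best, e_best = list(cur), e_cur
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(cur))
        cand = list(cur)
        cand[i] = max(1e-9, cand[i] + rng.uniform(-0.1, 0.1))
        e_new = entropy(cand)
        if e_new < e_cur or rng.random() < math.exp((e_cur - e_new) / t):
            cur, e_cur = cand, e_new
            if e_cur < e_best:
                best, e_best = list(cur), e_cur
        t *= 0.9995            # geometric cooling schedule
    return best, e_best

blurred = [0.2, 0.5, 1.0, 0.5, 0.2, 0.1, 0.3, 0.1]   # toy blurred star field
restored, e = anneal(blurred)
print(e <= entropy(blurred))   # → True: entropy reduced, image made sparser
```

Lower entropy concentrates intensity into fewer pixels, which is why the criterion suits sparse star fields; the same accept/reject kernel is what gets mapped onto the FPGA.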

    Performance of Particle Flow Calorimetry at CLIC

    The particle flow approach to calorimetry can provide unprecedented jet energy resolution at a future high energy collider, such as the International Linear Collider (ILC). However, the use of particle flow calorimetry at the proposed multi-TeV Compact Linear Collider (CLIC) poses a number of significant new challenges. At higher jet energies, detector occupancies increase, and it becomes increasingly difficult to resolve energy deposits from individual particles. The experimental conditions at CLIC are also significantly more challenging than those at previous electron-positron colliders, with increased levels of beam-induced backgrounds combined with a bunch spacing of only 0.5 ns. This paper describes the modifications made to the PandoraPFA particle flow algorithm to improve the jet energy reconstruction for jet energies above 250 GeV. It then introduces a combination of timing and p_T cuts that can be applied to reconstructed particles in order to significantly reduce the background. A systematic study is performed to understand the dependence of the jet energy resolution on the jet energy and angle, and the physics performance is assessed via a study of the energy and mass resolution of W and Z particles in the presence of background at CLIC. Finally, the missing transverse momentum resolution is presented, and the fake missing momentum is quantified. The results presented in this paper demonstrate that high granularity particle flow calorimetry leads to a robust and high resolution reconstruction of jet energies and di-jet masses at CLIC.
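The combined timing and p_T selection the abstract mentions can be illustrated with a minimal filter over reconstructed particles: a particle survives if its calorimeter timing is consistent with the physics bunch crossing, or if its p_T is high enough that beam-induced background is negligible. The field names and cut values below are illustrative assumptions, not the tuned selections from the paper.

```python
from dataclasses import dataclass

@dataclass
class PFO:                      # reconstructed particle-flow object (toy)
    pt: float                   # transverse momentum [GeV]
    cluster_time: float         # mean calorimeter hit time [ns]

def select(pfos, t_max=2.0, pt_min=0.75):
    """Keep a PFO if it is in time with the physics bunch crossing OR
    hard enough that background contamination is negligible.
    t_max and pt_min are illustrative, not the published values."""
    return [p for p in pfos if p.cluster_time < t_max or p.pt > pt_min]

event = [PFO(45.0, 0.3),   # hard jet particle, in time: kept
         PFO(0.4, 6.2),    # soft, late background particle: rejected
         PFO(1.2, 5.0)]    # late but hard: kept via the pT escape clause
print(len(select(event)))  # → 2
```

In practice such cuts are applied per particle type and detector region, which is why combining the two criteria removes background pile-up without biting into the hard jet constituents.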

    Causal inference for continuous-time processes when covariates are observed only at discrete times

    Most of the work on the structural nested model and g-estimation for causal inference in longitudinal data assumes a discrete-time underlying data generating process. However, in some observational studies, it is more reasonable to assume that the data are generated from a continuous-time process and are only observable at discrete time points. When these circumstances arise, the sequential randomization assumption in the observed discrete-time data, which is essential in justifying discrete-time g-estimation, may not be reasonable. Under a deterministic model, we discuss other useful assumptions that guarantee the consistency of discrete-time g-estimation. In more general cases, when those assumptions are violated, we propose a controlling-the-future method that performs at least as well as g-estimation in most scenarios and which provides consistent estimation in some cases where g-estimation is severely inconsistent. We apply the methods discussed in this paper to simulated data, as well as to a data set collected following a massive flood in Bangladesh, estimating the effect of diarrhea on children's height. Results from different methods are compared in both simulation and the real application. Published at http://dx.doi.org/10.1214/10-AOS830 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
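The core idea of g-estimation is to choose the causal parameter psi that makes the "treatment-free" outcome H(psi) = Y - psi*A mean-independent of treatment given covariates. The single-time-point toy below sketches this; the data-generating model, the crude linear fit for E[A|L], and the grid search are invented illustrations, not the paper's continuous-time methodology.

```python
import math
import random

random.seed(1)
true_psi, n = 2.0, 5000
# Toy data: L confounds treatment A and the baseline outcome, but A is
# randomized given L (sequential randomization), so g-estimation applies.
data = []
for _ in range(n):
    L = random.gauss(0, 1)
    A = 1 if random.random() < 1 / (1 + math.exp(-L)) else 0
    Y0 = 3 * L + random.gauss(0, 1)      # counterfactual outcome under A = 0
    data.append((L, A, Y0 + true_psi * A))

def estimating_eq(psi):
    """G-estimation moment: at the true psi, H(psi) = Y - psi*A equals the
    treatment-free outcome, so it is uncorrelated with the treatment
    residual A - E[A|L]. E[A|L] is fit here by a crude in-sample linear
    regression, purely for illustration."""
    mL = sum(l for l, a, y in data) / n
    mA = sum(a for l, a, y in data) / n
    b = (sum((l - mL) * (a - mA) for l, a, y in data)
         / sum((l - mL) ** 2 for l, a, y in data))
    return sum((a - mA - b * (l - mL)) * (y - psi * a) for l, a, y in data)

# grid-search the root of the estimating equation over psi in [0, 4]
psi_hat = min((p / 100 for p in range(401)),
              key=lambda p: abs(estimating_eq(p)))
print(abs(psi_hat - true_psi) < 0.2)   # → True: recovers the causal effect
```

Note that a naive regression of Y on A alone would be biased here because L drives both treatment and outcome; zeroing the estimating equation removes that confounding.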

    Restoration of star-field images using high-level languages and core libraries

    Research into the use of FPGAs in image processing began in earnest at the beginning of the 1990s. Since then, many thousands of publications have pointed to the computational capabilities of FPGAs. During this time, FPGAs have seen the application space to which they are applicable grow in tandem with their logic densities. When investigating a particular application, researchers compare FPGAs with alternative technologies such as Digital Signal Processors (DSPs), Application-Specific Integrated Circuits (ASICs), microprocessors and vector processors. The metrics for comparison depend on the needs of the application, and include such measurements as: raw performance, power consumption, unit cost, board footprint, non-recurring engineering cost, design time and design cost. The key metrics for a particular application may also include ratios of these metrics, e.g. power/performance, or performance/unit cost. The work detailed in this paper compares a 90nm-process commodity microprocessor with a platform based around a 90nm-process FPGA, focussing on design time and raw performance. The application chosen for implementation was a minimum entropy restoration of star-field images (see [1] for an introduction), with simulated annealing used to converge towards the globally-optimum solution. This application was not chosen in the belief that it would particularly suit one technology over another, but was instead selected as being representative of a computationally intense image-processing application.