10,869 research outputs found

    Theoretical calculation of the electromagnetic response of a radially layered model moon (technical report)


    The art of being human : a project for general philosophy of science

    Throughout the medieval and modern periods, in various sacred and secular guises, the unification of all forms of knowledge under the rubric of ‘science’ has been taken as the prerogative of humanity as a species. However, as our sense of species privilege has been called increasingly into question, so too has the very salience of ‘humanity’ and ‘science’ as general categories, let alone ones that might bear some essential relationship to each other. After showing how the ascendant Stanford School in the philosophy of science has contributed to this joint demystification of ‘humanity’ and ‘science’, I proceed on a more positive note to a conceptual framework for making sense of science as the art of being human. My understanding of ‘science’ is indebted to the red thread that runs from Christian theology through the Scientific Revolution and Enlightenment to the Humboldtian revival of the university as the site for the synthesis of knowledge as the culmination of self-development. Especially salient to this idea is science’s epistemic capacity to manage modality (i.e. to determine the conditions under which possibilities can be actualised) and its political capacity to organize humanity into projects of universal concern. However, the challenge facing such an ideal in the twenty-first century is that the predicate ‘human’ may be projected in three quite distinct ways, governed by what I call ‘ecological’, ‘biomedical’ and ‘cybernetic’ interests. Which one of these future humanities would claim today’s humans as proper ancestors, and whether these futures could co-habit the same world, thus become two important questions that general philosophy of science will need to address in the coming years.

    Domestication as innovation : the entanglement of techniques, technology and chance in the domestication of cereal crops

    The origins of agriculture involved pathways of domestication in which human behaviours and plant genetic adaptations were entangled. These changes resulted in consequences that were unintended at the start of the process. This paper highlights some of the key innovations in human behaviours, such as soil preparation, harvesting and threshing, and how these were coupled with genetic ‘innovations’ within plant populations. We identify a number of ‘traps’ for early cultivators, including the needs for extra labour expenditure on crop-processing and soil fertility maintenance, but also linked gains in terms of potential crop yields. Compilations of quantitative data across a few different crops for the traits of non-shattering and seed size are discussed in terms of the apparently slow process of domestication, and parallels and differences between different regional pathways are identified. We highlight the need to bridge the gap between a Neolithic archaeobotanical focus on domestication and a focus of later periods on crop-processing activities and labour organization. In addition, archaeobotanical data provide a basis for rethinking previous assumptions about how plant genetic data should be related to the origins of agriculture, and we contrast two alternative hypotheses: gradual evolution with low selection pressure versus a metastable equilibrium that prolonged the persistence of ‘semi-domesticated’ populations. Our revised understanding of the innovations involved in plant domestication highlights the need for new approaches to collecting, modelling and integrating genetic data and archaeobotanical evidence.

    Supervised Competitive Learning

    Supervised Competitive Learning (SCL) assembles a set of learning modules into a supervised learning system to address the stability-plasticity dilemma. Each learning module acts as a similarity detector for a prototype, and includes prototype resetting (akin to that of ART) to respond to new prototypes. SCL has usually employed backpropagation networks as the learning modules. It has been tested with two feature extractors: about 30 energy-based features, and a combination of energy-based and graphical features (about 60). About 75 subjects have been involved. In recent testing (15 college students), SCL recognized 99% (energy features only) of test digits, 91% (energy) and 96.6% (energy/graphical) of test letters, and 85% of test gestures (energy/graphical). SCL has also been tested with fuzzy sets as learning modules for recognizing handwritten digits and handwritten gestures, recognizing 97% of test digits and 91% of test gestures.

    Public geographies II: being organic

    This second report on ‘public geographies' considers the diverse, emergent and shifting spaces of engaging with and in public/s. Taking as its focus the more ‘organic’ rather than ‘traditional’ approach to doing public geography, as discussed in the first report, it explores the multiple and unorthodox ways in which engagements across academic-public spheres play out, and what such engagements may mean for geography/ers. The report first explores the role of the internet in ‘enabling conversations', generating a range of opportunities for public geography through websites, wikis, blogs, file-sharing sites, discussion forums and more, thinking critically about how technologies may enable/disable certain kinds of publicly engaged activities. It then considers issues of process and praxis: how collaborations with groups/communities/organizations beyond academia are often unplanned, serendipitous encounters that evolve organically into research/learning/teaching endeavours; but also that personal politics/positionality bring an agency to bear upon whether we, as academics, follow the leads we may stumble upon. The report concludes with a provocative question: given that many non-academics appear to be doing some amazing and inspiring projects and activities, thoughtful, critical and (arguably) examples of organic public geographies, what then is academia’s role?

    Supervised Competitive Learning Part I: SCL with Backpropagation Networks

    SCL assembles a set of learning modules into a supervised learning system to address the stability-plasticity dilemma. Each learning module acts as a similarity detector for a prototype, and includes prototype resetting (akin to that of ART) to respond to new prototypes. Here (Part I) we report SCL results using back-propagation networks as the learning modules. We used two feature extractors: about 30 energy-based features, and a combination of energy-based and graphical features (about 60). SCL recognized 98% (energy) and 99% (energy/graphical) of test digits, and 91% (energy) and 96% (energy/graphical) of test letters. In the accompanying paper (Part II), we report the results of SCL using fuzzy sets as learning modules for recognizing handwritten digits.
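    The architecture described in the abstract (per-prototype similarity modules plus an ART-style reset that commits a new prototype when no existing one matches well enough) can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation: the cosine-similarity detector, the vigilance threshold and the update rule are all assumptions standing in for the back-propagation modules of the paper.

    ```python
    import numpy as np

    class SCL:
        """Minimal sketch of Supervised Competitive Learning: each class keeps
        prototype vectors; the most similar prototype of the labelled class is
        updated, and an ART-style vigilance test triggers a prototype reset
        (a fresh module) when no existing prototype matches well enough."""

        def __init__(self, vigilance=0.5, lr=0.1):
            self.vigilance = vigilance   # minimum similarity before a new prototype is created
            self.lr = lr                 # learning rate for moving the winning prototype
            self.prototypes = []         # list of (vector, label) pairs

        @staticmethod
        def _similarity(a, b):
            # cosine similarity plays the role of the module's similarity detector
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def fit_one(self, x, label):
            x = np.asarray(x, dtype=float)
            # find the best-matching prototype of the correct class
            best, best_sim = None, -1.0
            for i, (p, y) in enumerate(self.prototypes):
                if y == label:
                    s = self._similarity(x, p)
                    if s > best_sim:
                        best, best_sim = i, s
            if best is None or best_sim < self.vigilance:
                # prototype reset: commit a fresh module for this new pattern
                self.prototypes.append((x.copy(), label))
            else:
                p, y = self.prototypes[best]
                self.prototypes[best] = (p + self.lr * (x - p), y)

        def predict(self, x):
            x = np.asarray(x, dtype=float)
            sims = [(self._similarity(x, p), y) for p, y in self.prototypes]
            return max(sims)[1]
    ```

    Training on one example per class creates one prototype each; later inputs close to a stored prototype refine it instead of spawning new modules, which is how the scheme retains old prototypes (stability) while remaining open to new ones (plasticity).
    
    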

    What is wrong with a big house?

    In Australia in the 1950s, the average house size was approximately 100 m². By 2008, the average size of a new house had risen to approximately 238 m², i.e. an increase of nearly 140%. Over the same period, occupancy levels have fallen by nearly one third, from 3.7 to 2.5 persons per household. The aim of this paper is to contrast the total and per capita resource demand (direct and embodied energy, water and materials) for two houses typical of their respective eras and draw some conclusions from the results. Using the software Autodesk Revit Architecture and drawings for typical 1950 and 2009 houses, the material quantities for these dwellings have been determined. Using known coefficients, the embodied energy and water in the materials have been calculated. Operating energy requirements have been calculated using NatHERS estimates. Water requirements have been calculated using historical and current water data. The greenhouse gas emissions associated with the resource use have also been calculated using established coefficients. Results are compared on a per capita basis. The research found that although the energy to operate the modern house and annual water use had fallen, the embodied energy and associated greenhouse gas emissions from material use had risen significantly. This was driven by the size of the house and the change in construction practices.
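    The headline comparison can be reproduced directly from the figures quoted in the abstract (house sizes and occupancy are from the text; the per-capita arithmetic is ours):

    ```python
    # Figures quoted in the abstract
    size_1950, size_2008 = 100.0, 238.0   # average house size, m^2
    occ_1950, occ_2008 = 3.7, 2.5         # persons per household

    # Growth in total floor area ("an increase of nearly 140%")
    growth = (size_2008 - size_1950) / size_1950 * 100

    # Per-capita floor area, which grows even faster because occupancy fell
    per_capita_1950 = size_1950 / occ_1950
    per_capita_2008 = size_2008 / occ_2008

    print(f"{growth:.0f}% larger")  # → 138% larger
    print(f"{per_capita_1950:.1f} -> {per_capita_2008:.1f} m^2 per person")
    ```

    The per-capita figure roughly triples (about 27 to 95 m² per person), which is why per-capita embodied energy rises so sharply even as operating energy falls.
    
    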

    Biologically Inspired Feedback Design for Drosophila Flight

    We use a biologically motivated model of the Drosophila's flight mechanics and sensor processing to design a feedback control scheme to regulate forward flight. The model used for insect flight is the grand unified fly (GUF) [3] simulation, consisting of rigid body kinematics, aerodynamic forces and moments, sensory systems, and a 3D environment model. We seek to design a control algorithm that will convert the sensory signals into proper wing beat commands to regulate forward flight. Modulating the wing beat frequency and mean stroke angle produces changes in the flight envelope. The sensory signals consist of estimates of rotational velocity from the haltere organs and translational velocity estimates from visual elementary motion detectors (EMDs) and matched retinal velocity filters. The controller is designed based on a longitudinal model of the flight dynamics. Feedforward commands are generated based on a desired forward velocity. The dynamics are linearized around this operating point and a feedback controller is designed to correct deviations from the operating point. The control algorithm is implemented in the GUF simulator, achieves the desired tracking of the forward reference velocities, and exhibits biologically realistic responses.
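    The control structure described above (a feedforward command that holds the operating point, plus feedback correcting deviations from it) can be sketched on a toy scalar model of the forward-velocity dynamics. The constants `a`, `b` and the gain `k` below are illustrative assumptions, not parameters of the GUF model:

    ```python
    # Hypothetical linearized longitudinal model: dv/dt = -a*v + b*u,
    # where v is forward velocity and u is the wing-beat command
    # (standing in for wing beat frequency / mean stroke angle).
    a, b = 0.5, 2.0   # illustrative dynamics constants
    k = 1.5           # feedback gain correcting deviations
    v_ref = 0.3       # desired forward velocity (arbitrary units)

    dt, T = 0.01, 10.0
    v = 0.0           # start at rest
    for _ in range(int(T / dt)):
        e = v_ref - v            # tracking error
        u_ff = a * v_ref / b     # feedforward command at the operating point
        u = u_ff + k * e         # feedforward + feedback wing-beat command
        v += dt * (-a * v + b * u)   # forward-Euler step of the dynamics

    print(round(v, 3))  # → 0.3 (settles at the reference velocity)
    ```

    At equilibrium the feedforward term alone holds `v = v_ref`, so the feedback term only has to reject deviations, mirroring the linearize-then-correct design in the abstract.
    
    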

    Neutrino-Neutrino Scattering and Matter-Enhanced Neutrino Flavor Transformation in Supernovae

    We examine matter-enhanced neutrino flavor transformation ($\nu_{\tau(\mu)} \rightleftharpoons \nu_e$) in the region above the neutrino sphere in Type II supernovae. Our treatment explicitly includes contributions to the neutrino-propagation Hamiltonian from neutrino-neutrino forward scattering. A proper inclusion of these contributions shows that they have a completely negligible effect on the range of $\nu_e$-$\nu_{\tau(\mu)}$ vacuum mass-squared difference, $\delta m^2$, and vacuum mixing angle, $\theta$ (or equivalently $\sin^2 2\theta$), required for enhanced supernova shock re-heating. When neutrino background effects are included, we find that r-process nucleosynthesis from neutrino-heated supernova ejecta remains a sensitive probe of the mixing between a light $\nu_e$ and a $\nu_{\tau(\mu)}$ with a cosmologically significant mass. Neutrino-neutrino scattering contributions are found to have a generally small effect on the $(\delta m^2, \sin^2 2\theta)$ parameter region probed by r-process nucleosynthesis. We point out that the nonlinear effects of the neutrino background extend the range of sensitivity of r-process nucleosynthesis to smaller values of $\delta m^2$.
    Comment: 38 pages, tex, DOE/ER/40561-150-INT94-00-6
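    For orientation, the standard two-flavor matter-enhancement effect behind such analyses can be sketched numerically. This uses only the textbook MSW formula for the effective mixing in an electron background (it omits the neutrino-neutrino background term that is the paper's actual subject); the numbers are illustrative:

    ```python
    def sin2_2theta_matter(sin2_2theta_vac, A):
        """Effective two-flavor mixing in matter.

        A = 2*sqrt(2)*G_F*n_e*E / delta_m^2 is the dimensionless matter
        potential; A = cos(2*theta) is the MSW resonance, where even a tiny
        vacuum mixing becomes maximal. Assumes theta < pi/4 so that
        cos(2*theta) = sqrt(1 - sin^2(2*theta)) is positive.
        """
        cos_2theta = (1.0 - sin2_2theta_vac) ** 0.5
        return sin2_2theta_vac / (sin2_2theta_vac + (cos_2theta - A) ** 2)

    vac = 1e-4                                  # small vacuum sin^2(2*theta)
    print(sin2_2theta_matter(vac, 0.0))         # vacuum limit: ~1e-4
    print(sin2_2theta_matter(vac, (1 - vac) ** 0.5))  # on resonance → 1.0
    ```

    The resonance is what lets supernova densities probe mixing angles far smaller than vacuum oscillation experiments can reach.
    
    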

    Simulation of primordial object formation

    We have included the chemical rate network responsible for the formation of molecular hydrogen in the N-body hydrodynamic code, Hydra, in order to study the formation of the first cosmological objects at redshifts between 10 and 50. We have tested our implementation of the chemical and cooling processes by comparing N-body top hat simulations with theoretical predictions from a semi-analytic model and found them to be in good agreement. We find that post-virialization properties are insensitive to the initial abundance of molecular hydrogen. Our main objective was to determine the minimum mass ($M_{SG}(z)$) of perturbations that could become self-gravitating (a prerequisite for star formation), and the redshift at which this occurred. We have developed a robust indicator for detecting the presence of a self-gravitating cloud in our simulations and find that we can do so with a baryonic particle mass-resolution of 40 solar masses. We have performed cosmological simulations of primordial objects and find that the objects' masses and the redshift at which they become self-gravitating agree well with the $M_{SG}(z)$ results from the top hat simulations. Once a critical molecular hydrogen fractional abundance of about 0.0005 has formed in an object, the cooling time drops below the dynamical time at the centre of the cloud and the gas free-falls in the dark matter potential wells, becoming self-gravitating a dynamical time later.
    Comment: 45 pages, 17 figures, submitted to Ap
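    The collapse criterion quoted in the abstract (H₂ fraction above ~5e-4 and cooling time shorter than the dynamical time) can be expressed as a small check. The free-fall formula is standard; the example density and cooling time below are illustrative, not values from the paper:

    ```python
    import math

    G = 6.674e-8      # gravitational constant, cgs (cm^3 g^-1 s^-2)
    m_H = 1.67e-24    # hydrogen atom mass, g

    def t_dyn(n):
        """Dynamical (free-fall) time for hydrogen number density n [cm^-3]:
        t_dyn = sqrt(3*pi / (32*G*rho))."""
        rho = n * m_H
        return math.sqrt(3 * math.pi / (32 * G * rho))

    def can_collapse(f_H2, t_cool, n, f_crit=5e-4):
        """Criterion from the abstract: enough molecular hydrogen AND cooling
        faster than the dynamical time at the cloud centre."""
        return f_H2 >= f_crit and t_cool < t_dyn(n)

    # Illustrative cloud at n = 100 cm^-3: t_dyn ≈ 1.6e14 s (~5 Myr)
    print(f"{t_dyn(100):.1e}")
    ```

    A cloud that passes `can_collapse` is expected, per the abstract, to become self-gravitating roughly one dynamical time later.
    
    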