
    Post-Impact Thermal Evolution of Porous Planetesimals

    Impacts between planetesimals have largely been ruled out as a heat source in the early Solar System by calculations that show them to be inefficient and unlikely to cause global heating. However, the long-term, localized thermal effects of impacts on planetesimals have never been fully quantified. Here, we simulate a range of impact scenarios between planetesimals to determine the post-impact thermal histories of the parent bodies, and hence the importance of impact heating in the thermal evolution of planetesimals. We find that, on a local scale, heating material to petrologic type 6 is achievable for a range of impact velocities and initial porosities, and that impact melting is possible in porous material at velocities above 4 km/s. Burial of heated impactor material beneath the impact crater is common, insulating that material and allowing the parent body to retain the heat for extended periods (~ millions of years). Cooling rates at 773 K are typically 1 - 1000 K/Ma, matching a wide range of metallographic cooling rate measurements from chondritic materials. While the heating presented here is localized to the impact site, multiple impacts over the lifetime of a parent body are likely to have occurred. Moreover, as most meteorite samples are on the centimeter to meter scale, the localized effects of impact heating cannot be ignored.
    Comment: 38 pages, 9 figures. Revised for Geochimica et Cosmochimica Acta (sorry, they do not accept LaTeX).
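The ~million-year heat retention described above is consistent with a simple conductive diffusion estimate, t ~ L^2 / kappa. The sketch below is an order-of-magnitude illustration with assumed values (thermal diffusivity and length scale are my assumptions, not the paper's simulation parameters):

```python
# Order-of-magnitude conductive cooling timescale, t ~ L^2 / kappa, for a
# heated region of size/burial depth L. All numbers below are illustrative
# assumptions, not parameters taken from the paper.

def cooling_timescale_yr(L_m, kappa_m2_s):
    """Diffusive timescale L^2 / kappa, converted from seconds to years."""
    seconds_per_year = 3.156e7
    return L_m ** 2 / kappa_m2_s / seconds_per_year

# Porous chondritic material conducts heat poorly; assuming
# kappa ~ 1e-7 m^2/s and a ~1 km buried heated region gives a
# timescale of a few 10^5 years, i.e. approaching the Myr scale.
t_yr = cooling_timescale_yr(1.0e3, 1.0e-7)
```

Larger or more deeply buried heated regions scale this estimate up quadratically with L.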

    Comparison of two sampling protocols and four home-range estimators using radio-tracking data from urban badgers Meles meles

    Radio-telemetry is often the method of choice for studies of species whose behaviour is difficult to observe directly. However, considerable debate has ensued about the best way of deriving home-range estimates. In recent years, kernel estimators have become the most widely used method, together with the oldest and simplest method, the minimum convex polygon (MCP). More recently, it has been suggested that the local convex hull (LCH) might be more appropriate than kernel methods in cases where an animal’s home range includes a priori inaccessible areas. Yet another method, the Brownian bridge (BB), explicitly uses autocorrelated data to determine movement paths and, ultimately, home ranges or migration routes of animals. Whereas several studies have used simulation techniques to compare these different methods, few have used data from real animals. We used radio-telemetric data from urban badgers Meles meles to compare two sampling protocols (10-minute vs at least 30-minute inter-fix intervals) and four home-range estimators (MCP, fixed kernels (FK), LCH and BB). We used a multi-response permutation procedure and randomisation tests to compare overall patterns of fixes and degree of overlap of home ranges estimated using data from different sampling protocols, and a general linear model to compare the influence of sampling protocols and home-range estimator on the size of habitat patches. The shape of the estimated home ranges was influenced by sampling protocol in some cases. By contrast, the sizes and proportions of different habitats within home ranges were influenced by estimator type but not by sampling protocol. LCH performed consistently better than FK, and is especially appropriate for patchy study areas containing frequent no-go zones. However, we recommend using LCH in combination with other methods to estimate total range size, because LCH tended to produce smaller estimates than any other method. 
    Results relating to BB are preliminary but suggest that this method is unsuitable for species in which range size is small compared to average travel speed.
    Funding: Marie-Curie Intra-European Fellowship (BSSUB - 24007); Defra WSC contract WM0304. Wildlife Biology granted the permit to upload the article to this repository.
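Of the estimators compared above, the minimum convex polygon is simple enough to sketch directly: the MCP home range is the convex hull of all fixes, and its area follows from the shoelace formula. The coordinates and function names below are hypothetical:

```python
# Minimum convex polygon (MCP) home-range estimate: the convex hull of all
# radio-tracking fixes, with its area from the shoelace formula.
# Coordinates are hypothetical (e.g. metres in a local grid).

def _cross(o, a, b):
    """2D cross product of vectors OA and OB; >0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and _cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and _cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def mcp_area(points):
    """Area of the MCP via the shoelace formula over the hull vertices."""
    hull = convex_hull(points)
    s = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

fixes = [(0, 0), (10, 0), (10, 10), (0, 10), (5, 5), (2, 7)]
area = mcp_area(fixes)  # 100.0: interior fixes do not change the hull
```

Because every fix is included, MCP is notoriously sensitive to outlying excursions, which is one reason the study combines it with LCH and kernel methods.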

    Phase equilibrium modeling for high temperature metallization on GaAs solar cells

    Recent trends in performance specifications and functional requirements have created a need for high-temperature metallization technology for survivable DOD space systems and for enhanced solar cell reliability. The temperature-composition phase diagrams of selected binary and ternary systems were reviewed to determine the temperatures and types of phase transformations present in the alloy systems. Of paramount interest are the liquid-solid and solid-solid transformations. These data are being used to aid the selection of electrical contact materials for gallium arsenide solar cells. Published phase diagram data for binary systems are readily available; however, information for ternary systems is limited. A computer model is being developed that will enable phase equilibrium predictions for ternary systems where experimental data are lacking.
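For the binary systems mentioned above, the basic two-phase equilibrium calculation is the lever rule, which gives the phase fractions at a given overall composition. A minimal sketch (the compositions are illustrative weight fractions, not data for any specific contact alloy):

```python
# Lever rule for a binary two-phase (alpha + beta) field: the weight
# fraction of beta at overall composition c0, given the equilibrium
# compositions of the two coexisting phases at that temperature.
# Compositions below are illustrative, not measured alloy data.

def lever_rule(c0, c_alpha, c_beta):
    """Weight fraction of the beta phase for overall composition c0."""
    lo, hi = min(c_alpha, c_beta), max(c_alpha, c_beta)
    if not lo <= c0 <= hi:
        raise ValueError("c0 must lie between the two phase compositions")
    return (c0 - c_alpha) / (c_beta - c_alpha)

# e.g. overall composition 0.40 between phase compositions 0.10 and 0.90:
f_beta = lever_rule(0.40, 0.10, 0.90)  # 0.375, so 37.5 % beta by weight
```

Ternary systems replace this one-dimensional interpolation with tie-line and tie-triangle constructions, which is where tabulated data thin out and a predictive model becomes useful.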

    Focussing quantum states

    Does the size of atoms present a lower limit to the size of electronic structures that can be fabricated in solids? This limit can be overcome by using devices that exploit quantum mechanical scattering of electron waves at atoms arranged in focussing geometries on selected surfaces. Calculations reveal that features smaller than a hydrogen atom can be obtained. These structures are potentially useful for device applications and offer a route to the fabrication of ultrafine and well defined tips for scanning tunneling microscopy.
    Comment: 4 pages, 4 figures.
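For context on the length scales involved, typical low-energy electron de Broglie wavelengths can be compared with the Bohr radius; the sub-atomic feature sizes reported above arise from interference of focused scattered waves, not from the wavelength alone. A small sketch with standard physical constants:

```python
import math

# Non-relativistic de Broglie wavelength of a free electron, compared with
# the Bohr radius. Standard constants; the energies are illustrative.

def de_broglie_nm(energy_eV):
    """Electron de Broglie wavelength in nm: lambda = h / sqrt(2 m E)."""
    h = 6.62607015e-34                # Planck constant, J s
    m = 9.1093837015e-31              # electron mass, kg
    E = energy_eV * 1.602176634e-19   # energy in J
    return h / math.sqrt(2.0 * m * E) * 1e9

bohr_radius_nm = 0.0529

lam_1eV = de_broglie_nm(1.0)      # ~1.23 nm at 1 eV
lam_100eV = de_broglie_nm(100.0)  # ~0.12 nm at 100 eV
```

Even at 100 eV the wavelength is still larger than the Bohr radius, which is why coherent focusing by arranged scatterers, rather than wavelength reduction alone, is the route to sub-atomic feature sizes.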

    Real-time monocular SLAM: Why filter?

    While the most accurate solution to off-line structure from motion (SFM) problems is undoubtedly to extract as much correspondence information as possible and perform global optimisation, sequential methods suitable for live video streams must approximate this to fit within fixed computational bounds. Two quite different approaches to real-time SFM, also called monocular SLAM (Simultaneous Localisation and Mapping), have proven successful, but they sparsify the problem in different ways. Filtering methods marginalise out past poses and summarise the information gained over time with a probability distribution. Keyframe methods retain the optimisation approach of global bundle adjustment, but computationally must select only a small number of past frames to process. In this paper we perform the first rigorous analysis of the relative advantages of filtering and sparse optimisation for sequential monocular SLAM. A series of experiments in simulation, as well as using a real image SLAM system, were performed by means of covariance propagation and Monte Carlo methods, and comparisons made using a combined cost/accuracy measure. With some well-discussed reservations, we conclude that while filtering may have a niche in systems with low processing resources, in most modern applications keyframe optimisation gives the most accuracy per unit of computing time.
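The marginalisation performed by filtering methods can be made concrete with a toy Gaussian example: removing past poses from an information matrix is a Schur complement. The sketch below uses random matrices, not a real SLAM system, and verifies the standard identity while illustrating why marginalisation densifies the remaining problem:

```python
import numpy as np

# Toy Gaussian illustration: marginalising variables out of an information
# matrix is a Schur complement. Random SPD matrices stand in for a real
# SLAM estimation problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Sigma = A @ A.T + 5.0 * np.eye(5)   # joint covariance (SPD by construction)
H = np.linalg.inv(Sigma)            # joint information matrix

keep, marg = [0, 1], [2, 3, 4]      # e.g. current pose vs past poses
H_kk = H[np.ix_(keep, keep)]
H_km = H[np.ix_(keep, marg)]
H_mm = H[np.ix_(marg, marg)]

# Information matrix of the marginal distribution over the kept variables:
H_marg = H_kk - H_km @ np.linalg.solve(H_mm, H_km.T)

# Sanity check: inverting H_marg recovers the covariance sub-block,
# i.e. simply reading the kept rows/columns off Sigma.
# Note that H_marg is generally dense even when H is sparse ("fill-in"),
# one computational argument for keyframe methods over filtering.
```

The fill-in effect is why a filter's state covariance becomes fully correlated over time, whereas keyframe bundle adjustment preserves the sparse structure of the original problem.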

    Bioinformatics tools for analysing viral genomic data

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.
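One of the pipeline steps summarised above, consensus calling, can be illustrated with a toy majority-vote sketch. The function name, gap character and depth threshold are assumptions for illustration, not the defaults of any particular tool:

```python
from collections import Counter

# Toy consensus caller: majority vote per alignment column, emitting 'N'
# where coverage falls below a minimum depth. Real tools additionally
# weight by base quality and handle indels properly.

def consensus(aligned_reads, min_depth=2):
    """Majority-vote consensus over gapped, aligned read strings."""
    length = max(len(r) for r in aligned_reads)
    out = []
    for i in range(length):
        # Collect the non-gap bases covering this column.
        column = [r[i] for r in aligned_reads if i < len(r) and r[i] != '-']
        if len(column) < min_depth:
            out.append('N')             # insufficient coverage
        else:
            out.append(Counter(column).most_common(1)[0][0])
    return ''.join(out)

reads = ["ACGT", "ACGT", "ACGA"]
seq = consensus(reads)  # "ACGT": the final A is outvoted 2:1
```

Variant calling is essentially the complement of this step: instead of emitting the majority base, it records columns where a significant minority disagrees with the reference.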

    Simple strong glass forming models: mean-field solution with activation

    We introduce simple models, inspired by previous models for froths and covalent glasses, with trivial equilibrium properties but dynamical behaviour characteristic of strong glass forming systems. These models are also a generalization of backgammon or urn models to a non-constant number of particles, where entropic barriers are replaced by energy barriers, allowing for the existence of activated processes. We formulate a mean-field version of the models, which keeps most of the features of the finite dimensional ones, and solve analytically the out-of-equilibrium dynamics in the low temperature regime where activation plays an essential role.
    Comment: 18 pages, 9 figures.
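For reference, the standard backgammon/urn model that these models generalize can be simulated in a few lines. The sketch below is the fixed-particle-number model with energy E = -(number of empty boxes) and Metropolis dynamics; parameters are illustrative, and this is deliberately not the authors' generalized, non-constant-particle-number model:

```python
import math
import random

# Standard backgammon (urn) model: N particles in N boxes, energy
# E = -(number of empty boxes), Metropolis dynamics at temperature T.
# Relaxation slows dramatically at low T as emptying a box requires
# finding the rare boxes still holding a single particle.

def backgammon_mc(n_boxes=50, steps=20000, T=0.3, seed=1):
    """Run Metropolis dynamics; return the final energy."""
    rng = random.Random(seed)
    occ = [1] * n_boxes              # one particle per box: E = 0 initially
    energy = -occ.count(0)
    for _ in range(steps):
        src = rng.randrange(n_boxes)
        if occ[src] == 0:
            continue                 # no particle to move
        dst = rng.randrange(n_boxes)
        if dst == src:
            continue
        dE = 0
        if occ[src] == 1:
            dE -= 1                  # src becomes empty: energy drops
        if occ[dst] == 0:
            dE += 1                  # dst ceases to be empty: energy rises
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            occ[src] -= 1
            occ[dst] += 1
            energy += dE
    return energy
```

At low T the system condenses particles into ever fewer boxes, and the slow search for the last singly-occupied boxes is the entropic-barrier mechanism that the paper replaces with genuine energy barriers.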

    Glassy behaviour in an exactly solved spin system with a ferromagnetic transition

    We show that applying simple dynamical rules to Baxter's eight-vertex model leads to a system which resembles a glass-forming liquid. There are analogies with liquid, supercooled liquid, glassy and crystalline states. The disordered phases exhibit strong dynamical heterogeneity at low temperatures, which may be described in terms of an emergent mobility field. Their dynamics are well-described by a simple model with trivial thermodynamics, but an emergent kinetic constraint. We show that the (second order) thermodynamic transition to the ordered phase may be interpreted in terms of confinement of the excitations in the mobility field. We also describe the aging of disordered states towards the ordered phase, in terms of simple rate equations.
    Comment: 11 pages.

    Caught in a ‘spiral’. Barriers to healthy eating and dietary health promotion needs from the perspective of unemployed young people and their service providers

    The number of young people in Europe who are not in education, employment or training (NEET) is increasing. Given that young people from disadvantaged backgrounds tend to have diets of poor nutritional quality, this exploratory study sought to understand the barriers and facilitators to healthy eating and the dietary health promotion needs of unemployed young people aged 16–20 years. Three focus group discussions were held with young people (n = 14), and six individual interviews and one paired interview were conducted with service providers (n = 7). Data were recorded, transcribed verbatim and thematically content analysed. Themes were then fitted to social cognitive theory (SCT). Despite an understanding of the principles of healthy eating, a ‘spiral’ of interrelated social, economic and associated psychological problems was perceived to render food and health of little value and low priority for the young people. The story related by the young people, and corroborated by the service providers, was of a lack of personal and vicarious experience with food. The proliferation and proximity of fast food outlets and the high perceived cost of ‘healthy’ compared to ‘junk’ food left the young people low in self-efficacy and perceived control to make healthier food choices. Agency was instead expressed through consumption of junk food and drugs. Both the young people and the service providers agreed that, for dietary health promotion efforts to succeed, social problems needed to be addressed and agency encouraged through the (individual and collective) active engagement of the young people themselves.