
    RAST Model: Simulation of Tensiotraces to Facilitate Drophead Engineering

    Tensiography is a technique that determines the physical and chemical properties of a liquid by illuminating a growing pendant drop from within using a source fibre. Light reflected internally at the surface of the drop is received by a collector fibre and converted into an electrical signal called a tensiotrace, a graph of reflected light as a function of drop volume. The instrument that obtains this signal is called a multianalyser. A numerical model that simulates tensiotraces through a raytracing analysis of the multianalyser (RAST: Raytracing Analysis for the Simulation of Tensiotraces) has been developed to define theoretically how the tensiotrace describes the physical and chemical properties of a liquid. The purpose of this study is to investigate the model as an engineering/design assistant leading to discoveries and improvements to the multianalyser.
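
    The physical core of such a raytracer is the reflection of each ray at the liquid-air surface of the drop. Below is a minimal sketch of that single step only; the function name and the restriction to one unpolarized Fresnel reflection are illustrative assumptions, not the published RAST model, which traces full ray bundles through the actual drop geometry.

```python
import numpy as np

def fresnel_reflectance(theta_i, n_liquid=1.33, n_air=1.0):
    """Unpolarized Fresnel reflectance for a ray hitting the drop surface
    from inside the liquid at incidence angle theta_i (radians).

    Beyond the critical angle the ray is totally internally reflected
    (R = 1), the regime that carries most of the tensiotrace signal.
    """
    sin_t = n_liquid * np.sin(theta_i) / n_air      # Snell's law
    tir = sin_t >= 1.0                              # total internal reflection
    theta_t = np.arcsin(np.clip(sin_t, 0.0, 1.0))
    r_s = (np.sin(theta_i - theta_t) / np.sin(theta_i + theta_t)) ** 2
    r_p = (np.tan(theta_i - theta_t) / np.tan(theta_i + theta_t)) ** 2
    return np.where(tir, 1.0, 0.5 * (r_s + r_p))

# Example: reflectance rises sharply towards the critical angle
# (~48.8 degrees for water against air), beyond which R = 1.
angles = np.radians([20.0, 35.0, 45.0, 50.0])
print(fresnel_reflectance(angles))
```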

    The Properties of X-ray Cold Fronts in a Statistical Sample of Simulated Galaxy Clusters

    We examine the incidence of cold fronts in a large sample of galaxy clusters extracted from a (512 h^-1 Mpc) hydrodynamic/N-body cosmological simulation with adiabatic gas physics computed with the Enzo adaptive mesh refinement code. This simulation contains a sample of roughly 4000 galaxy clusters with M > 10^14 M_sun at z=0. For each simulated galaxy cluster, we have created mock 0.3-8.0 keV X-ray observations and spectroscopic-like temperature maps. We have searched these maps with a new automated algorithm to identify the presence of cold fronts in projection. Using a threshold of a minimum of 10 cold front pixels in our images, corresponding to a total comoving length L_cf > 156 h^-1 kpc, we find that roughly 10-12% of all projections in a mass-limited sample would be classified as cold front clusters. Interestingly, the fraction of clusters with extended cold front features in our synthetic maps of a mass-limited sample trends only weakly with redshift out to z=1.0. However, when using different selection functions, including a simulated flux limit, the trend with redshift changes significantly. The likelihood of finding cold fronts in the simulated clusters in our sample is a strong function of cluster mass. In clusters with M > 7.5x10^14 M_sun the cold front fraction is 40-50%. We also show that the presence of cold fronts is strongly correlated with disturbed morphology as measured by quantitative structure measures. Finally, we find that the incidence of cold fronts in the simulated cluster images is strongly dependent on baryonic physics. Comment: 16 pages, 21 figures; accepted to ApJ.
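
    The classification rule in the abstract reduces to a simple pixel-count threshold applied to the output of the cold-front detector. A minimal sketch follows; the function names and boolean-mask input are our assumptions, and the detection algorithm itself, which operates on the spectroscopic-like temperature maps, is not reproduced here.

```python
import numpy as np

# 10 cold-front pixels correspond to L_cf = 156 kpc/h of total comoving
# length, i.e. each image pixel spans 15.6 kpc/h.
PIXEL_SCALE_KPC_H = 15.6
MIN_PIXELS = 10

def is_cold_front_cluster(cold_front_mask):
    """cold_front_mask: boolean image, True where the automated search
    flagged a cold-front pixel in this projection."""
    n_pixels = int(np.count_nonzero(cold_front_mask))
    return n_pixels >= MIN_PIXELS        # equivalently L_cf >= 156 kpc/h

def cold_front_fraction(masks):
    """Fraction of projections classified as cold-front clusters."""
    return float(np.mean([is_cold_front_cluster(m) for m in masks]))
```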

    The Santa Fe Light Cone Simulation Project: II. The Prospects for Direct Detection of the WHIM with SZE Surveys

    Detection of the Warm-Hot Intergalactic Medium (WHIM) using Sunyaev-Zeldovich effect (SZE) surveys is an intriguing possibility, and one that may allow observers to quantify the amount of "missing baryons" in the WHIM phase. We estimate the necessary sensitivity for detecting low density WHIM gas with the South Pole Telescope (SPT) and Planck Surveyor for a synthetic 100 square degree sky survey. This survey is generated from a very large, high dynamic range adaptive mesh refinement cosmological simulation performed with the Enzo code. We find that for a modest increase in the SPT survey sensitivity (a factor of 2-4), the WHIM gas makes a detectable contribution to the integrated sky signal. For a Planck-like satellite, similar detections are possible with a more significant increase in sensitivity (a factor of 8-10). We point out that for the WHIM gas, the kinematic SZE signal can sometimes dominate the thermal SZE where the thermal SZE decrement is maximal (150 GHz), and that using the combination of the two increases the chance of WHIM detection using SZE surveys. However, we find no evidence of unique features in the thermal SZE angular power spectrum that might aid in its detection. Interestingly, there are differences in the power spectrum of the kinematic SZE which, while they may not allow us to detect the WHIM directly, could make it an important contaminant in cosmological analyses of the kSZE-derived velocity field. Corrections derived from numerical simulations may be necessary to account for this contamination. Comment: 9 pages; submitted to the Astrophysical Journal.
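
    The frequency behaviour the abstract relies on (a thermal decrement near 150 GHz, and a kinematic signal with no spectral distortion of its own) follows from the standard non-relativistic SZE formulas. A short sketch with illustrative WHIM-like parameter values; the numbers below are ours, not from the paper.

```python
import numpy as np

H = 6.62607015e-34        # Planck constant, J s
KB = 1.380649e-23         # Boltzmann constant, J/K
T_CMB = 2.725             # CMB temperature, K
C = 2.99792458e8          # speed of light, m/s
ME_C2_KEV = 511.0         # electron rest energy, keV

def thermal_sze(nu_ghz, kT_e_kev, tau):
    """Non-relativistic thermal SZE: dT/T = y * (x*coth(x/2) - 4)."""
    x = H * nu_ghz * 1e9 / (KB * T_CMB)
    y = (kT_e_kev / ME_C2_KEV) * tau          # Compton y parameter
    return y * (x / np.tanh(x / 2.0) - 4.0)

def kinematic_sze(tau, v_los_kms):
    """Kinematic SZE: dT/T = -tau * v_los / c (positive v_los = receding)."""
    return -tau * v_los_kms * 1e3 / C

# Illustrative warm-hot gas: kT_e = 0.1 keV, tau = 1e-5, v_los = 300 km/s.
print(thermal_sze(150.0, 0.1, 1e-5))   # ~ -1.9e-9: small thermal decrement
print(kinematic_sze(1e-5, 300.0))      # ~ -1.0e-8: kSZE dominates here
```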

    Human-ecodynamics and the intertidal zones of the Zanzibar Archipelago

    The intertidal zone, covering the nearshore fringe of coasts and islands and extending from the high-water mark to areas that remain fully submerged, encompasses a range of habitats containing resources that are as important to modern populations as they were to humans in prehistory. Effectively bridging land and sea, intertidal environments are extremely dynamic, requiring complexity and variability in how people engaged with them in the past, much as they do in the present. Here we review and reconsider environmental, archaeological, and modern socio-ecological evidence from the Zanzibar Archipelago on eastern Africa’s Swahili coast, focusing on marine molluscs to gain insight into the trajectories of human engagement with nearshore habitats and resources. We highlight the potential drivers of change and/or stability in human-intertidal interactions through time and space, set against a backdrop of the significant socio-economic and socio-ecological changes apparent in the archipelago, and along the Swahili coast, during the late Holocene.

    Contents:
    1 Introduction
    2 Background
    2.1 Unguja and Pemba Islands, Zanzibar Archipelago
    2.2 Archaeological and historical overview
    2.3 Study site locations, descriptions and chronology
    2.3.1 Northern Pemba: Pango la Watoro and Msuka Mjini
    2.3.2 Southern Pemba: Ras Mkumbuu
    2.3.3 Northern Unguja: Fukuchani and Mvuleni
    2.3.4 Southern Unguja: Unguja Ukuu, Kuumbi Cave and Mifupani
    2.4 Palaeoenvironmental context
    3 Materials and methods
    3.1 Identification and abundance
    3.2 Richness, nestedness and taxonomic composition
    3.3 Diversity indices
    3.4 Molluscan zonation and benthic habitat attribution
    4 Results
    4.1 Assemblage characteristics
    4.2 Richness and nestedness
    4.3 Taxonomic composition
    4.4 Assemblage diversity
    4.5 Zonation and benthic habitat analysis
    5 Discussion
    6 Conclusion
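
    Of the methods listed, the diversity indices (section 3.3) are the most directly computable from taxon abundance counts. The paper does not specify which indices it uses; as a sketch, here are the Shannon and Simpson indices commonly applied to molluscan assemblages, with hypothetical example counts.

```python
import numpy as np

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i), skipping absent taxa."""
    c = np.asarray(counts, dtype=float)
    p = c[c > 0] / c.sum()
    return float(-np.sum(p * np.log(p)))

def simpson_index(counts):
    """Simpson diversity 1 - D = 1 - sum(p_i^2); higher = more even."""
    c = np.asarray(counts, dtype=float)
    p = c / c.sum()
    return float(1.0 - np.sum(p ** 2))

# Example: shell counts per taxon in one (hypothetical) excavated assemblage.
assemblage = [120, 80, 40, 10, 5]
print(shannon_index(assemblage), simpson_index(assemblage))
```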

    Rapid detection of human blood in triatomines (kissing bugs) utilizing a lateral flow immunochromatographic assay - A pilot study

    BACKGROUND DNA- and proteomics-based techniques are currently used to identify a triatomine human blood meal. These methods are time consuming and require access to laboratories with sophisticated equipment and trained personnel. OBJECTIVES We tested a rapid and specific immunochromatographic assay (which detects human blood in forensic samples) to determine whether human blood was present in triatomines and their fecal excreta. METHODS We fed Triatoma rubida human blood (positive control) or mouse blood (negative control) and performed the assay on the abdominal contents and fecal excreta. Triatomine field specimens collected in and around human habitations, and their excreta, were also tested. FINDINGS The assay was positive in triatomines fed human blood (N = 5/5) and in fecal excreta from bugs known to have ingested human blood (N = 5/5). Bugs fed mouse blood (N = 15/15) and their fecal excreta (N = 8/8) were negative for human blood. Human blood was detected in 47% (N = 23/49) of field-collected triatomines, representing six different species. MAIN CONCLUSIONS This pilot study shows that this rapid and specific test may have applications in triatomine research. Further study is needed to determine the sensitivity of this assay compared with well-established techniques, such as DNA- and proteomics-based methodologies, and the assay's application in the field.
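
    For context on that 47% figure, a binomial interval shows the precision a pilot sample of 49 bugs buys. This back-of-envelope calculation is ours, not part of the study.

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k out of n."""
    p = k / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# 23 of 49 field-collected bugs tested positive for human blood.
lo, hi = wilson_ci(23, 49)
print(f"47% detection rate, 95% CI ~ {lo:.0%} to {hi:.0%}")  # ~34% to 61%
```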

    Metaphoric coherence: Distinguishing verbal metaphor from 'anomaly'

    Theories and computational models of metaphor comprehension generally circumvent the question of metaphor versus “anomaly” in favor of a treatment of metaphor versus literal language. Making the distinction between metaphoric and “anomalous” expressions is subject to wide variation in judgment, yet humans agree that some potentially metaphoric expressions are much more comprehensible than others. In the context of a program which interprets simple isolated sentences that are potential instances of cross‐modal and other verbal metaphor, I consider some possible coherence criteria which must be satisfied for an expression to be “conceivable” metaphorically. Metaphoric constraints on object nominals are represented as abstracted or extended along with the invariant structural components of the verb meaning in a metaphor. This approach distinguishes what is preserved in metaphoric extension from what is “violated”, thus addressing both “similarity” and “dissimilarity” views of metaphor. The role and potential limits of represented abstracted properties and constraints are discussed as they relate to the recognition of incoherent semantic combinations and the rejection or adjustment of metaphoric interpretations.
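
    As a toy illustration of the central idea, that a metaphoric reading preserves the verb's invariant structure while abstracting a violated selectional constraint, whereas an anomaly satisfies neither constraint: all names and properties below are hypothetical, and the program described in the paper uses a far richer representation.

```python
# A verb meaning pairs invariant structural components with selectional
# constraints on its object nominal.  A literal reading needs the literal
# constraint; a metaphoric reading is "conceivable" only if the object
# satisfies an abstraction of that constraint; otherwise the expression
# is rejected as anomalous.
VERBS = {
    # verb: (invariant structure, literal constraint, abstracted constraint)
    "pour": ("cause-motion-of-mass", "liquid", "continuous-quantity"),
}
NOMINAL_PROPS = {
    "water":  {"liquid", "continuous-quantity"},
    "praise": {"continuous-quantity"},       # abstract mass noun
    "pebble": {"discrete-object"},
}

def interpret(verb, obj):
    structure, literal, abstracted = VERBS[verb]
    props = NOMINAL_PROPS[obj]
    if literal in props:
        return f"literal: {structure}({obj})"
    if abstracted in props:                  # constraint kept in abstracted form
        return f"metaphoric: {structure}({obj})"
    return "anomalous: no coherent interpretation"

for obj in ("water", "praise", "pebble"):
    print(f"pour {obj} ->", interpret("pour", obj))
```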

    Photoelectron diffraction: from phenomenological demonstration to practical tool

    The potential of photoelectron diffraction—exploiting the coherent interference of directly emitted and elastically scattered components of the photoelectron wavefield emitted from a core level of a surface atom to obtain structural information—was first appreciated in the 1970s. The first demonstrations of the effect were published towards the end of that decade, but the method has now entered the mainstream armoury of surface structure determination. This short review has two objectives: first, to outline the way the idea emerged and evolved in my own collaboration with Neville Smith and his colleagues at Bell Labs in the early years; second, to provide some insight into the current state of the art in applying (scanned-energy mode) photoelectron diffraction to two key issues in quantitative surface structure determination, namely complexity and precision. In this regard, a particularly powerful aspect of photoelectron diffraction is its elemental and chemical-state specificity.
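
    In scanned-energy mode the structural information sits in the interference modulation as a function of photoelectron wavenumber. A schematic single-scattering, plane-wave sketch of that modulation follows; it is illustrative only, since quantitative analyses use full multiple-scattering calculations, and the function name and parameter values are our assumptions.

```python
import numpy as np

def phd_modulation(k, r, theta, f_amp=0.1, phase=0.0):
    """Schematic chi(k): interference of the directly emitted wave with a
    wave elastically scattered by one neighbour at distance r (Angstrom),
    where theta is the angle between the emitter-scatterer axis and the
    emission direction.  The path-length difference is r*(1 - cos(theta)),
    so backscattering (theta = pi) oscillates as cos(2*k*r + phase).
    """
    return (f_amp / r) * np.cos(k * r * (1.0 - np.cos(theta)) + phase)

# Sweep k (scanned-energy mode); in backscattering geometry the oscillation
# period directly encodes the emitter-scatterer distance r.
k = np.linspace(3.0, 12.0, 400)      # photoelectron wavenumber, 1/Angstrom
chi = phd_modulation(k, r=2.0, theta=np.pi)
print(chi[:5])
```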

    An argument for the use of Aristotelian method in bioethics

    The main claim of this paper is that the method outlined and used in Aristotle's Ethics is an appropriate and credible one to use in bioethics. Here “appropriate” means that the method is capable of establishing claims and developing concepts in bioethics, and “credible” means that the method has some plausibility: it is not open to obvious and immediate objection. The paper begins by suggesting why this claim matters and then gives a brief outline of Aristotle's method. The main argument is made in three stages. First, it is argued that Aristotelian method is credible because it compares favourably with alternatives. In this section it is shown that Aristotelian method is not vulnerable to criticisms made both of methods that give a primary place to moral theory (such as utilitarianism) and of those that eschew moral theory (such as casuistry and social science approaches). As such, it compares favourably with these other approaches, which are vulnerable to at least some of these criticisms. Second, the appropriateness of Aristotelian method is indicated by outlining how it would deal with a particular case. Finally, it is argued that the success of Aristotle's philosophy is suggestive of both the credibility and appropriateness of his method.