
    Fragmentation of Nuclei at Intermediate and High Energies in Modified Cascade Model

    Nuclear multifragmentation, together with the evaporation and fission channels of the disintegration of excited remnants in nucleus-nucleus collisions, has been implemented using percolation theory and the intranuclear cascade model. Colliding nuclei are treated as face-centered-cubic lattices with nucleons occupying the nodes of the lattice, and the site-bond percolation model is used. The code can be applied to calculations of nuclear fragmentation in spallation and multifragmentation reactions. Comment: 19 pages, 10 figures
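The lattice construction described above can be illustrated with a minimal site-bond percolation sketch (this is an invented toy, not the code the abstract describes): sites of a face-centered-cubic lattice are occupied with probability `p_site`, bonds between nearest neighbours are open with probability `p_bond`, and the connected clusters play the role of nuclear fragments.

```python
import random
from itertools import product

def fcc_sites(n):
    # FCC lattice: integer points with even coordinate sum inside an n^3 box
    return [p for p in product(range(n), repeat=3) if sum(p) % 2 == 0]

def neighbors(p):
    # The 12 nearest neighbours of an FCC site (offsets with two nonzero +/-1 entries)
    x, y, z = p
    offs = [(1, 1, 0), (1, -1, 0), (-1, 1, 0), (-1, -1, 0),
            (1, 0, 1), (1, 0, -1), (-1, 0, 1), (-1, 0, -1),
            (0, 1, 1), (0, 1, -1), (0, -1, 1), (0, -1, -1)]
    return [(x + dx, y + dy, z + dz) for dx, dy, dz in offs]

def percolate(n=8, p_site=0.9, p_bond=0.5, seed=0):
    """Return fragment (cluster) sizes, largest first, for one realization."""
    rng = random.Random(seed)
    sites = [s for s in fcc_sites(n) if rng.random() < p_site]  # site dilution
    index = {s: i for i, s in enumerate(sites)}
    parent = list(range(len(sites)))          # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    for s in sites:                           # open bonds join clusters
        for q in neighbors(s):
            if q in index and rng.random() < p_bond:
                ri, rj = find(index[s]), find(index[q])
                if ri != rj:
                    parent[ri] = rj

    sizes = {}
    for i in range(len(sites)):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return sorted(sizes.values(), reverse=True)

frags = percolate()
print(len(frags), frags[:3])   # number of fragments and the three largest
```

Varying `p_site` and `p_bond` sweeps the system through the percolation transition, which is what maps excitation energy onto fragment multiplicity in this class of models.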

    High-precision radiocarbon dating of the construction phase of Oakbank Crannog, Loch Tay, Perthshire

    Many of the Loch Tay crannogs were built in the Early Iron Age, and so calibration of the radiocarbon ages produces very broad calendar age ranges due to the well-documented Hallstatt plateau in the calibration curve. However, the large oak timbers that were used in the construction of some of the crannogs potentially provide a means of improving the precision of the dating through subdividing them into decadal or subdecadal increments, dating them to high precision and wiggle-matching the resulting data to the master <sup>14</sup>C calibration curve. We obtained a sample from 1 oak timber from Oakbank Crannog comprising 70 rings (Sample OB06 WMS 1, T103), including sapwood that was complete to the bark edge. The timber is situated on the northeast edge of the main living area of the crannog and, as a large and strong oak pile, would have been a useful support in more than 1 phase of occupation; it may be related to the earliest construction phase of the site. This was sectioned into 5-yr increments and dated to a precision of approximately ±8–16 <sup>14</sup>C yr (1 σ). The wiggle-match predicts that the last ring dated was formed around 500 BC (maximum range of 520–465 BC) and should be taken as indicative of the likely time of construction of Oakbank Crannog. This is a considerable improvement on the estimates based on single <sup>14</sup>C ages made on oak samples, which typically encompassed the period from around 800–400 BC.
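The wiggle-matching step can be illustrated with a toy least-squares sketch. Everything below is invented for illustration: the real analysis uses the IntCal calibration curve and dedicated Bayesian software, whereas here a synthetic curve with a trend plus wiggles stands in, and a floating four-block ring sequence is slid along it to find the calendar year of the last ring that minimizes chi-square.

```python
import numpy as np

# Toy calibration curve (an invented stand-in for IntCal, with an overall
# trend plus wiggles): calendar year (negative = BC) -> conventional 14C age
cal_years = np.arange(-900, -300)
cal_c14 = 2000.0 - cal_years + 40.0 * np.sin(cal_years / 50.0)

def wiggle_match(offsets, c14_ages, c14_errs):
    """Slide a floating ring sequence along the curve and return the
    calendar year of the last ring that minimizes chi-square.
    offsets[i] = calendar years of dated block i before the last ring."""
    offsets = np.asarray(offsets, dtype=float)
    best_year, best_chi2 = None, np.inf
    for end_year in cal_years:
        years = end_year - offsets
        if years.min() < cal_years[0]:
            continue                      # sequence falls off the curve
        curve = np.interp(years, cal_years, cal_c14)
        chi2 = float(np.sum(((c14_ages - curve) / c14_errs) ** 2))
        if chi2 < best_chi2:
            best_year, best_chi2 = int(end_year), chi2
    return best_year, best_chi2

# Synthetic sequence: blocks centred 60, 40, 20, 0 yr before the last ring,
# "measured" from the toy curve at a true end year of 500 BC with
# +/-10 yr (1 sigma) counting errors
true_end, offsets = -500, np.array([60.0, 40.0, 20.0, 0.0])
rng = np.random.default_rng(1)
meas = np.interp(true_end - offsets, cal_years, cal_c14) + rng.normal(0, 10, 4)
end, chi2 = wiggle_match(offsets, meas, np.full(4, 10.0))
print(end)   # recovered calendar year of the last ring
```

The fixed, known ring spacing is what defeats the plateau: individual ages slide freely along flat stretches of the curve, but the whole sequence can only fit where the wiggle pattern matches.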

    Formation of molecular oxygen in ultracold O + OH reaction

    We discuss the formation of molecular oxygen in ultracold collisions between hydroxyl radicals and atomic oxygen. A time-independent quantum formalism based on hyperspherical coordinates is employed for the calculations. Elastic, inelastic and reactive cross sections as well as the vibrational and rotational populations of the product O2 molecules are reported. A J-shifting approximation is used to compute the rate coefficients. At temperatures T = 10–100 mK, for which the OH molecules have been cooled and trapped experimentally, the elastic and reactive rate coefficients are of comparable magnitude, while at colder temperatures, T < 1 mK, the formation of molecular oxygen becomes the dominant pathway. The validity of a classical capture model to describe cold collisions of OH and O is also discussed. While very good agreement is found between classical and quantum results at T = 0.3 K, at higher temperatures the quantum calculations predict a larger rate coefficient than the classical model, in agreement with experimental data for the O + OH reaction. The zero-temperature limiting value of the rate coefficient is predicted to be about 6×10^{-12} cm^3 molecule^{-1} s^{-1}, a value comparable to that of barrierless alkali-metal atom–dimer systems and about a factor of five larger than that of the tunneling-dominated F + H2 reaction. Comment: 9 pages, 8 figures
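The classical capture model mentioned above can be sketched for a generic barrierless reaction governed by a long-range -C6/r^6 potential (a Langevin-type estimate; the C6 value below is hypothetical and nothing here reproduces the paper's quantum results). Maximizing the centrifugal barrier gives the capture cross section sigma(E) = (3*pi/2^(2/3)) * (C6/E)^(1/3), which is then thermally averaged over a Maxwell-Boltzmann collision-energy distribution; the resulting rate scales as T^(1/6).

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant, J/K

def capture_rate(T, C6, mu):
    """Thermally averaged classical capture rate (m^3/s) for a -C6/r^6
    long-range potential, with sigma(E) = (3*pi/2**(2/3)) * (C6/E)**(1/3)."""
    E = np.linspace(1e-30, 60.0 * KB * T, 200_000)       # collision energies (J)
    sigma = 3.0 * np.pi / 2.0 ** (2.0 / 3.0) * (C6 / E) ** (1.0 / 3.0)
    v = np.sqrt(2.0 * E / mu)                            # relative speed
    # Maxwell-Boltzmann distribution of collision energy
    f = (2.0 / np.sqrt(np.pi)) * np.sqrt(E) * (KB * T) ** -1.5 * np.exp(-E / (KB * T))
    y = sigma * v * f
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(E)))  # trapezoid rule

# Illustrative inputs only: reduced mass of O + OH and an assumed C6
mu = (16.0 * 17.0 / 33.0) * 1.66054e-27   # kg
C6 = 3.0e-78                              # J m^6 (hypothetical, not a fitted value)
k10, k100 = capture_rate(0.010, C6, mu), capture_rate(0.100, C6, mu)
print(k10, k100, k100 / k10)              # ratio should be close to 10**(1/6)
```

The weak T^(1/6) dependence is the signature of barrierless capture; deviations of the quantum rate from this curve, as reported in the abstract, quantify where the classical picture breaks down.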

    Immune compromise in HIV-1/HTLV-1 coinfection with paradoxical resolution of CD4 lymphocytosis during antiretroviral therapy: a case report

    Human immunodeficiency virus type-1 (HIV-1) and human T lymphotropic virus type-1 (HTLV-1) infections have complex effects on adaptive immunity, with specific tropism for, but contrasting effects on, CD4 T lymphocytes: depletion with HIV-1, proliferation with HTLV-1. Impaired T lymphocyte function occurs early in HIV-1 infection, but opportunistic infections (OIs) rarely occur in the absence of CD4 lymphopenia. In the unusual case where an HIV-1 infected individual with a high CD4 count presents with recurrent OIs, a clinician is faced with the possibility of a second underlying comorbidity. We present a case of pseudo-adult T cell leukemia/lymphoma (ATLL) in HIV-1/HTLV-1 coinfection where the individual fulfilled Shimoyama criteria for chronic ATLL and had pulmonary Mycobacterium kansasii, despite a high CD4 lymphocyte count. However, there was no evidence of clonal T-cell proliferation by T-cell receptor gene rearrangement studies nor of monoclonal HTLV-1 integration by high-throughput sequencing. Mutually beneficial interplay between HIV-1 and HTLV-1, maintaining high-level HIV-1 and HTLV-1 viremia and proliferation of poorly functional CD4 cells despite chronicity of infection, is a postulated mechanism. Despite good microbiological response to antimycobacterial therapy, the patient remained systemically unwell with refractory anemia. Subsequent initiation of combined antiretroviral therapy led to paradoxical resolution of CD4 T lymphocytosis as well as HIV-1 viral suppression and decreased HTLV-1 proviral load. This is proposed to be the result of attenuation of immune activation following HIV-1 virological control. This case illustrates the importance of screening for HTLV-1 in HIV-1 patients with an appropriate clinical presentation and epidemiological risk factors, and explores mechanisms for the complex effects of HIV-1/HTLV-1 coinfection on adaptive immunity.

    Random billiards with wall temperature and associated Markov chains

    By a random billiard we mean a billiard system in which the standard specular reflection rule is replaced with a Markov transition probability operator P that, at each collision of the billiard particle with the boundary of the billiard domain, gives the probability distribution of the post-collision velocity for a given pre-collision velocity. A random billiard with microstructure (RBM) is a random billiard for which P is derived from a choice of geometric/mechanical structure on the boundary of the billiard domain. RBMs provide simple and explicit mechanical models of particle-surface interaction that can incorporate thermal effects and permit a detailed study of thermostatic action from the perspective of the standard theory of Markov chains on general state spaces. We focus on the operator P itself and how it relates to the mechanical/geometric features of the microstructure, such as mass ratios, curvatures, and potentials. The main results are as follows: (1) we characterize the stationary probabilities (equilibrium states) of P and show how standard equilibrium distributions studied in classical statistical mechanics, such as the Maxwell-Boltzmann distribution and the Knudsen cosine law, arise naturally as generalized invariant billiard measures; (2) we obtain some basic functional-theoretic properties of P. Under very general conditions, we show that P is a self-adjoint operator of norm 1 on an appropriate Hilbert space. In a simple but illustrative example, we show that P is a compact (Hilbert-Schmidt) operator. This leads to the issue of relating the spectrum of eigenvalues of P to the features of the microstructure; (3) we explore the latter issue both analytically and numerically in a few representative examples; (4) we present a general algorithm for simulating these Markov chains based on a geometric description of the invariant volumes of classical statistical mechanics.
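A minimal instance of such a Markov chain on post-collision velocities is the classical Maxwell accommodation wall, sketched below (an illustrative special case, not the paper's general algorithm): with probability alpha a collision thermalizes the particle at the wall temperature, drawing the normal velocity component from the Knudsen cosine law (a Rayleigh distribution) and the tangential component from a Gaussian; otherwise the reflection is specular and the velocity is unchanged. The stationary mean kinetic energy of the chain is then 1.5 * T_wall (in units with k_B = m = 1).

```python
import math
import random

def maxwell_wall_step(v, T_wall=1.0, m=1.0, alpha=0.5, rng=random):
    """One step of the random-billiard Markov chain (units k_B = 1).
    v = (v_t, v_n): tangential and outgoing-normal velocity components.
    With probability alpha the wall thermalizes the particle; otherwise
    the reflection is specular and v is returned unchanged."""
    if rng.random() < alpha:
        s = math.sqrt(T_wall / m)
        v_t = rng.gauss(0.0, s)                                   # Gaussian tangential
        v_n = s * math.sqrt(-2.0 * math.log(1.0 - rng.random()))  # Rayleigh (Knudsen cosine law)
        return (v_t, v_n)
    return v

def run_chain(n, seed=0):
    """Mean kinetic energy along n collisions, starting from v = (0, 1)."""
    rng = random.Random(seed)
    v, total = (0.0, 1.0), 0.0
    for _ in range(n):
        v = maxwell_wall_step(v, rng=rng)
        total += 0.5 * (v[0] ** 2 + v[1] ** 2)
    return total / n

print(run_chain(200000))   # converges toward 1.5 * T_wall
```

Here alpha interpolates between the deterministic specular billiard (alpha = 0) and the fully diffusive thermal wall (alpha = 1), which is the simplest way to see how P encodes a degree of thermostatic action.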

    Effects of Manual and Automatic Natural Ventilation Control Strategies on Thermal Comfort, Indoor Air Quality and Energy Consumption

    Occupants of naturally ventilated buildings can tolerate wider ranges of temperature and Indoor Air Quality (IAQ) if they have more control over their environment. Meanwhile, due to the complexity of advanced natural ventilation (ANV) strategies, introducing some form of automatic control is essential, even though automatic controls limit the occupants’ control over their environment. Therefore, it is essential to understand the performance of ANV systems and occupants’ behaviours in order to identify a balance between automatic and manual controls that enhances the performance of ANV systems while maintaining the occupants’ comfort. The aim of the work reported in this paper is to evaluate the effects of a retrofitted ANV system with manual and automatic controls on thermal comfort, indoor air quality and energy consumption in an open-plan office building in the UK. Physical measurements were used to study the building performance in terms of thermal comfort, IAQ and energy consumption. The results revealed that occupants were much more aware of thermal comfort than of IAQ. Therefore, relying on the occupants to control the ventilation system would considerably increase the risk of poor IAQ in buildings. Moreover, introducing automatic controls did not affect the thermal comfort conditions for those who understood and actively controlled the ANV system, while the situation improved for those occupants who were not active. Results of this study showed that introducing automated natural ventilation helped to reduce energy consumption by 8%.

    Vaccinations, infections and antibacterials in the first grass pollen season of life and risk of later hayfever

    Published source: Bremner, S. A., Carey, I. M., DeWilde, S., Richards, N., Maier, W. C., Hilton, S. R., Strachan, D. P. and Cook, D. G. (2007), Vaccinations, infections and antibacterials in the first grass pollen season of life and risk of later hayfever. Clinical & Experimental Allergy, 37: 512–517. doi: 10.1111/j.1365-2222.2007.02697.

    Strategies for protecting intellectual property when using CUDA applications on graphics processing units

    Get PDF
    Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse-engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications on reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.

    Implementing an apparent-horizon finder in three dimensions

    Locating apparent horizons is not only important for a complete understanding of numerically generated spacetimes, but it may also be a crucial component of the technique for evolving black-hole spacetimes accurately. A scheme proposed by Libson et al., based on expanding the location of the apparent horizon in terms of symmetric trace-free tensors, seems very promising for use with three-dimensional numerical data sets. In this paper, we generalize this scheme and perform a number of code tests to fully calibrate its behavior in black-hole spacetimes similar to those we expect to encounter in solving the binary black-hole coalescence problem. An important aspect of the generalization is that we can compute the symmetric trace-free tensor expansion to any order. This enables us to determine how far we must carry the expansion to achieve results of a desired accuracy. To accomplish this generalization, we describe a new and very convenient set of recurrence relations which apply to symmetric trace-free tensors. Comment: 14 pages (RevTeX 3.0), 3 figures
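In axisymmetry, a symmetric trace-free tensor expansion of the horizon radius is equivalent to a Legendre expansion of r(theta), so the trade-off between expansion order and accuracy mentioned above can be illustrated with a small sketch. The distorted-sphere shape below is invented for illustration, not a horizon from a numerical-relativity data set.

```python
import numpy as np
from numpy.polynomial import legendre

# Target surface: an axisymmetric distorted sphere standing in for a horizon,
# parametrized by its radius as a function of x = cos(theta)
theta = np.linspace(0.0, np.pi, 400)
x = np.cos(theta)
r_true = 1.0 + 0.3 * np.exp(-2.0 * (x - 0.5) ** 2)   # smooth, invented shape

def truncation_error(lmax):
    """Max residual of a degree-lmax Legendre fit of r(theta), i.e. the
    error of truncating the (axisymmetric) expansion at order lmax."""
    coeffs = legendre.legfit(x, r_true, lmax)         # least-squares fit
    return float(np.max(np.abs(legendre.legval(x, coeffs) - r_true)))

for lmax in (2, 4, 8, 12):
    print(lmax, truncation_error(lmax))   # error shrinks rapidly with order
```

For a smooth surface the residual falls off rapidly with the truncation order, which is the behavior that lets one pick the lowest order meeting a desired accuracy, as the paper's calibration tests do for the full three-dimensional tensor expansion.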