
    Analysis of plasmas generated by fission fragments

    A kinetic model is developed for a plasma generated by fission fragments, and the results are employed to study a helium plasma generated in a tube coated with fissionable material. Because both the heavy particles and the electrons play important roles in creating the plasma, their effects are considered simultaneously. The calculations are carried out for a range of neutron fluxes and pressures. In general, the predictions of the theory are in good agreement with available intensity measurements. Moreover, the theory predicts the experimentally measured inversions. However, the calculated gain coefficients are such that lasing is not expected to take place in a helium plasma generated by fission fragments. The effects of an externally applied electric field are also considered.
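
    For context only, a hedged sketch (not taken from the paper) of the standard small-signal gain relation that such gain-coefficient calculations typically rest on, written in LaTeX; the paper's actual kinetic expressions and level populations are not reproduced here:

        % Small-signal gain coefficient of a transition 2 -> 1 with lineshape g(\nu)
        \gamma(\nu) = \frac{\lambda^{2} A_{21}}{8\pi}\, g(\nu)
                      \left( N_{2} - \frac{g_{2}}{g_{1}}\, N_{1} \right)
        % A population inversion (positive bracketed term) is necessary but not
        % sufficient for lasing: \gamma(\nu) must also exceed the losses, which is
        % why measured inversions can coexist with gains too small for lasing.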

    Thermodynamic properties of UF6 at high temperatures

    The equilibrium composition and the thermodynamic properties of the mixture resulting from the decomposition of uranium hexafluoride are calculated for temperatures ranging from 600 K to 4000 K and pressures from 0.01 atmospheres to 10 atmospheres.
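
    As background, a hedged sketch of the standard equilibrium machinery behind such calculations; the dissociation step shown is illustrative, not a reaction list taken from the paper:

        % Illustrative dissociation step: UF6 <-> UF5 + F
        K_{p}(T) = \exp\!\left(-\frac{\Delta G^{\circ}(T)}{R\,T}\right),
        \qquad
        \Delta G^{\circ}(T) = \sum_{i} \nu_{i}\, \Delta_{f} G_{i}^{\circ}(T)
        % The mass-action relation for each step, combined with conservation of the
        % U and F elements, fixes the equilibrium mole fractions at a given T and p,
        % from which the mixture's thermodynamic properties follow.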

    The electron Boltzmann equation in a plasma generated by fission fragments

    A Boltzmann equation formulation is presented for the determination of the electron distribution function in a plasma generated by fission fragments. The formulation takes into consideration ambipolar diffusion, elastic and inelastic collisions, recombination and ionization, and allows for the fact that the primary electrons are not monoenergetic. Calculations for He in a tube coated with fissionable material show that, over a wide range of pressures and neutron fluxes, the distribution function is non-Maxwellian, but the electrons are essentially thermal. Moreover, about a third of the energy of the primary electrons is transferred to the inelastic levels of He. This fraction of energy transfer is almost independent of pressure and neutron flux but increases sharply in the presence of a sustainer electric field.
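
    As a generic statement of the balance the abstract describes (the paper's exact operators, including its ambipolar-diffusion and source terms, are not reproduced here), the electron energy distribution f(\varepsilon) satisfies a steady-state kinetic equation of the form:

        % Steady state: the collision, transport and source terms balance
        \left(\frac{\partial f}{\partial t}\right)_{\mathrm{elastic}}
        + \left(\frac{\partial f}{\partial t}\right)_{\mathrm{inelastic}}
        + \left(\frac{\partial f}{\partial t}\right)_{\mathrm{ion/rec}}
        + \left(\frac{\partial f}{\partial t}\right)_{\mathrm{diffusion}}
        + S(\varepsilon) = 0
        % S(\varepsilon) is the primary-electron source from fission-fragment
        % slowing down, which is not monoenergetic.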

    GPU-Based Volume Rendering of Noisy Multi-Spectral Astronomical Data

    Traditional analysis techniques may not be sufficient for astronomers to make the best use of the data sets that current and future instruments, such as the Square Kilometre Array and its Pathfinders, will produce. By utilizing the incredible pattern-recognition ability of the human mind, scientific visualization provides an excellent opportunity for astronomers to gain valuable new insight and understanding of their data, particularly when used interactively in 3D. The goal of our work is to establish the feasibility of a real-time 3D monitoring system for data going into the Australian SKA Pathfinder archive. Based on CUDA, an increasingly popular development tool, our work utilizes the massively parallel architecture of modern graphics processing units (GPUs) to provide astronomers with interactive 3D volume rendering of multi-spectral data sets. Unlike other approaches, we target real-time interactive visualization of data sets larger than GPU memory while giving special attention to data with a low signal-to-noise ratio - two critical aspects for astronomy that are missing from most existing scientific visualization software packages. Our framework enables the astronomer to interact with the geometrical representation of the data and to control the volume rendering process to generate a better representation of their data sets. Comment: 4 pages, 1 figure, to appear in the proceedings of ADASS XIX, Oct 4-8 2009, Sapporo, Japan (ASP Conf. Series)
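
    The renderer described above is CUDA-based; as a rough, hedged illustration of the out-of-core idea it relies on (processing the cube in bricks smaller than device memory and compositing partial results), here is a minimal NumPy sketch of a bricked maximum-intensity projection. All names and the brick size are invented for illustration and are not the authors' API:

        import numpy as np

        def read_bricks(cube, depth):
            """Yield the cube in bricks of `depth` spectral channels, standing in
            for an out-of-core reader that streams bricks from disk."""
            for start in range(0, cube.shape[0], depth):
                yield cube[start:start + depth]

        def mip(cube, depth=32):
            """Maximum-intensity projection along the spectral axis, computed brick
            by brick so the whole cube never has to be resident at once."""
            projection = None
            for brick in read_bricks(cube, depth):
                partial = brick.max(axis=0)
                projection = partial if projection is None else np.maximum(projection, partial)
            return projection

        # toy 256-channel cube with Gaussian noise, mimicking a low signal-to-noise data set
        cube = np.random.normal(0.0, 1.0, size=(256, 128, 128)).astype(np.float32)
        print(mip(cube).shape)   # (128, 128)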

    Unleashing the Power of Distributed CPU/GPU Architectures: Massive Astronomical Data Analysis and Visualization case study

    Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in data set size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such data set sizes, which exceed today's single-machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with the goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure to handle both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a "software as a service" manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure. Comment: 4 pages, 1 figure, to appear in the proceedings of ADASS XXI, ed. P.Ballester and D.Egret, ASP Conf. Series
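
    As a hedged sketch of how the "simple data analysis tasks" mentioned above can be split across workers and merged (names, bin counts, and data are illustrative assumptions, not the authors' framework), a chunked histogram and minimum/maximum in Python:

        import numpy as np
        from multiprocessing import Pool

        BINS, VALUE_RANGE = 64, (0.0, 1.0)   # binning fixed up front so partial histograms can be merged

        def partial_stats(chunk):
            """Per-worker statistics over one chunk of the data set."""
            hist, _ = np.histogram(chunk, bins=BINS, range=VALUE_RANGE)
            return hist, float(chunk.min()), float(chunk.max())

        def merge(results):
            """Combine per-chunk results into a global histogram, minimum, and maximum."""
            hists, mins, maxs = zip(*results)
            return np.sum(hists, axis=0), min(mins), max(maxs)

        if __name__ == "__main__":
            chunks = [np.random.rand(1_000_000) for _ in range(8)]   # stand-ins for bricks of a large cube
            with Pool(processes=4) as pool:
                hist, lo, hi = merge(pool.map(partial_stats, chunks))
            print(int(hist.sum()), lo, hi)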

    Modeling and Analysis of Content Caching in Wireless Small Cell Networks

    Network densification with small cell base stations is a promising solution to satisfy future data traffic demands. However, increasing small cell base station density alone does not ensure better user quality of experience and incurs high operational expenditures. Therefore, content caching on different network elements has been proposed as a means of offloading the backhaul by caching strategic contents at the network edge, thereby reducing latency. In this paper, we investigate cache-enabled small cells in which we model and characterize the outage probability, defined as the probability of not satisfying users' requests over a given coverage area. We analytically derive a closed-form expression for the outage probability as a function of the signal-to-interference ratio, cache size, small cell base station density, and threshold distance. By modeling the distribution of base stations as a Poisson point process, we derive the probability of finding a specific content within a threshold distance and the optimal small cell base station density that achieves a given target cache hit probability. Furthermore, simulations are performed to validate the analytical model. Comment: accepted for publication, IEEE ISWCS 201
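
    The closed-form expressions above are the paper's own; as background only, a hedged Python sketch of the standard thinned-PPP relation that underlies the probability of finding a cached copy within a threshold distance (all parameter values and names below are illustrative assumptions, not the paper's):

        import numpy as np

        def hit_probability(bs_density, cache_prob, radius):
            """Probability that at least one small cell base station within `radius`
            holds the requested content, when base stations form a homogeneous Poisson
            point process of intensity `bs_density` (per unit area) and each caches the
            content independently with probability `cache_prob`; caching stations then
            form a thinned PPP of intensity cache_prob * bs_density."""
            return 1.0 - np.exp(-cache_prob * bs_density * np.pi * radius ** 2)

        def density_for_target(target, cache_prob, radius):
            """Smallest base station density meeting a target cache hit probability."""
            return -np.log(1.0 - target) / (cache_prob * np.pi * radius ** 2)

        # illustrative numbers: 50 base stations per km^2, 20% cache the file, 200 m threshold
        lam = 50e-6                                        # intensity per m^2
        print(hit_probability(lam, 0.2, 200.0))            # ~0.72
        print(density_for_target(0.9, 0.2, 200.0) * 1e6)   # ~92 base stations per km^2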