
    Particle-in-cell simulation of a mildly relativistic collision of an electron-ion plasma carrying a quasi-parallel magnetic field: Electron acceleration and magnetic field amplification at supernova shocks

    Plasma processes close to SNR shocks result in the amplification of magnetic fields and in the acceleration of electrons, injecting them into the diffusive acceleration mechanism. The acceleration of electrons and the magnetic field amplification by the collision of two plasma clouds, each consisting of electrons and ions, at a speed of 0.5c is investigated. A quasi-parallel guiding magnetic field, a cloud density ratio of 10 and a plasma temperature of 25 keV are considered. A quasi-planar shock forms at the front of the dense plasma cloud. It is mediated by a circularly left-hand polarized electromagnetic wave with an electric field component along the guiding magnetic field. Its propagation direction is close to that of the guiding field and orthogonal to the collision boundary. It has a low frequency and a wavelength equal to several times the ion inertial length, which would be indicative of a dispersive Alfvén wave close to the ion cyclotron resonance frequency of the left-handed mode (ion whistler), provided that the frequency is appropriate. However, it moves at the super-Alfvénic plasma collision speed, suggesting that it is an Alfvén precursor or a nonlinear MHD wave such as a Short Large-Amplitude Magnetic Structure (SLAMS). The growth of the magnetic amplitude of this wave to values well in excess of those of the quasi-parallel guiding field and of the filamentation modes results in a quasi-perpendicular shock. We present evidence for the instability of this mode to a four-wave interaction. The waves developing upstream of the dense cloud give rise to electron acceleration ahead of the collision boundary. Energy equipartition between the ions and the electrons is established at the shock, and the electrons are accelerated to relativistic speeds. Comment: 16 pages, 18 figures, accepted for publication by Astron & Astrophy
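
    The abstract above hinges on two plasma scales, the Alfvén speed and the ion inertial length, and on whether the 0.5c collision speed is super-Alfvénic. A minimal Python sketch of these quantities, using purely illustrative values of the guiding field and ion density that are not taken from the simulation:

    import math

    # Illustrative upstream parameters (hypothetical, not the paper's setup)
    n_i    = 1.0e7        # ion number density [m^-3]
    B0     = 1.0e-7       # quasi-parallel guiding magnetic field [T]
    c      = 3.0e8        # speed of light [m/s]
    v_coll = 0.5 * c      # plasma collision speed quoted in the abstract [m/s]

    mu0  = 4.0e-7 * math.pi   # vacuum permeability [H/m]
    eps0 = 8.854e-12          # vacuum permittivity [F/m]
    m_i  = 1.673e-27          # proton mass [kg]
    q_e  = 1.602e-19          # elementary charge [C]

    # Alfven speed and ion inertial length (the wavelength scale of the mediating wave)
    v_A      = B0 / math.sqrt(mu0 * n_i * m_i)
    omega_pi = math.sqrt(n_i * q_e**2 / (eps0 * m_i))
    d_i      = c / omega_pi

    print(f"Alfven speed         v_A = {v_A:.2e} m/s")
    print(f"Ion inertial length  d_i = {d_i:.2e} m")
    print(f"Alfvenic Mach number M_A = {v_coll / v_A:.0f}  (> 1: super-Alfvenic collision)")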

    Neutrinos from photo-hadronic interactions in Pks2155-304

    The high-peaked BL Lac object Pks2155-304 shows strong variability across wavelengths, from optical up to TeV energies. A giant flare lasting around 1 hour at X-ray and TeV energies was observed in 2006. In this context, it is essential to understand the physical processes in terms of the primary spectrum and the radiation emitted, since high-energy emission can arise in both leptonic and hadronic processes. In this contribution, we investigate the possibility of neutrino production in photo-hadronic interactions. In particular, we predict a direct correlation between optical and TeV energies at sufficiently high optical radiation fields. We show that in the blazar Pks2155-304, the optical emission in the low state is sufficient to lead to photo-hadronic interactions and therefore to the production of high-energy photons. Comment: contribution to RICAP 2009 and ICRC 2009 - both papers are combined in one draft. 11 pages, 3 figures
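
    For orientation, photo-hadronic neutrino production is usually dominated by the Delta(1232) resonance; a standard back-of-the-envelope threshold estimate (generic, not the specific model of this contribution) shows why an optical target photon field is energetically sufficient:

    s = m_p^2 c^4 + 2 E_p \,\varepsilon_\gamma\,(1 - \beta_p \cos\theta) \;\gtrsim\; m_\Delta^2 c^4
    \;\Rightarrow\; E_p \,\varepsilon_\gamma \;\gtrsim\; \frac{(m_\Delta^2 - m_p^2)\,c^4}{4} \;\approx\; 0.16\ \mathrm{GeV}^2
    \quad (\text{head-on collision},\ \beta_p \to 1)

    For optical target photons with \varepsilon_\gamma \approx 2 eV this requires protons of order E_p \gtrsim 10^{17} eV, and since each neutrino from the subsequent pion decay carries roughly E_p/20, the resulting neutrino energies fall in the PeV range.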

    Software development effort estimation using function points and simpler functional measures: a comparison

    Background: Functional Size Measures are widely used for estimating the development effort of software. After the introduction of Function Points, a few "simplified" measures have been proposed, aiming to make measurement simpler and quicker, but also to make measures applicable when fully detailed software specifications are not yet available. It has been shown that, in general, software size measures expressed in Function Points do not support more accurate effort estimation than simplified measures. Objective: Many practitioners believe that when considering "complex" projects, i.e., projects that involve many complex transactions and data, traditional Function Points measures support more accurate estimates than simpler functional size measures that do not account for greater-than-average complexity. In this paper, we aim to produce evidence that confirms or disproves such a belief. Method: Based on a dataset that contains both effort and size data, an empirical study is performed to provide evidence concerning the relations that link functional size (measured in different ways) and development effort. Results: Our analysis shows that there is no statistically significant evidence that Function Points are generally better at estimating more complex projects than simpler measures. Function Points appeared better in some specific conditions, but in those conditions they also performed worse than simpler measures when dealing with less complex projects. Conclusions: Traditional Function Points do not seem to effectively account for software complexity. To improve effort estimation, researchers should probably dedicate their effort to devising a way of measuring software complexity that can be used in effort models together with (traditional or simplified) functional size measures
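
    Studies of this kind typically compare size measures by fitting an effort model of the form effort = a * size^b to each measure and contrasting the residual errors. A minimal sketch, assuming hypothetical size and effort figures rather than the paper's dataset, and using the mean absolute residual as the accuracy indicator:

    import numpy as np

    # Hypothetical projects (NOT the paper's data): sizes in two measures and effort
    ufp    = np.array([120, 250, 480, 700, 950], dtype=float)      # IFPUG Function Points
    simple = np.array([ 30,  60, 110, 170, 230], dtype=float)      # a simpler functional measure
    effort = np.array([900, 2100, 4300, 6800, 9500], dtype=float)  # person-hours

    def fit_loglog(size, effort):
        """Fit effort = a * size^b via ordinary least squares on log-transformed data."""
        b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
        return np.exp(log_a), b

    for name, size in [("Function Points", ufp), ("simpler measure", simple)]:
        a, b = fit_loglog(size, effort)
        mar = np.mean(np.abs(effort - a * size**b))   # mean absolute residual
        print(f"{name:16s}: effort ~ {a:.2f} * size^{b:.2f}, MAR = {mar:.0f} person-hours")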

    Estimating functional size of software with confidence intervals

    In many projects, software functional size is measured via the IFPUG (International Function Point Users Group) Function Point Analysis method. However, applying Function Point Analysis using the IFPUG process is possible only when functional user requirements are known completely and in detail. To solve this problem, several early estimation methods have been proposed and have become de facto standard processes. Among these, a prominent one is the ‘NESMA (Netherlands Software Metrics Association) estimated’ (also known as High-level Function Point Analysis) method. The NESMA estimated method simplifies the measurement by assigning fixed weights to Base Functional Components, instead of determining the weights via the detailed analysis of data and transactions. This makes the process faster and cheaper, and applicable when some details concerning data and transactions are not yet known. The accuracy of this method has been evaluated, including via large-scale empirical studies, showing that the yielded approximate measures are sufficiently accurate for practical usage. However, a limitation of the method is that it provides only a point estimate of size, while other methods can provide confidence intervals, i.e., they indicate with a given confidence level that the size to be estimated lies in a range. In this paper, we aim to enhance the NESMA estimated method with the possibility of computing a confidence interval. To this end, we carry out an empirical study using data from real-life projects. The proposed approach appears effective. We expect that the ability to estimate that the size of an application lies in a range will help project managers deal with the risks connected with inevitable estimation errors
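
    As a rough illustration of the idea, the sketch below computes a NESMA estimated size from Base Functional Component counts using the commonly cited fixed weights (data functions rated Low, transactions rated Average) and widens it into an interval using a placeholder relative-error bound; the interval construction used in the paper may differ, and all numbers are hypothetical:

    # Commonly cited fixed weights of the NESMA 'estimated' method
    NESMA_WEIGHTS = {"ILF": 7, "EIF": 5, "EI": 4, "EO": 5, "EQ": 4}

    def nesma_estimated_size(counts):
        """counts: dict mapping BFC type -> number of identified functions."""
        return sum(NESMA_WEIGHTS[t] * n for t, n in counts.items())

    def size_interval(size, rel_error=0.25):
        """Placeholder interval: assumes the method's relative error observed on
        historical projects stays within +/- rel_error (an invented value)."""
        return size * (1.0 - rel_error), size * (1.0 + rel_error)

    counts = {"ILF": 10, "EIF": 4, "EI": 25, "EO": 18, "EQ": 12}   # hypothetical project
    size = nesma_estimated_size(counts)
    low, high = size_interval(size)
    print(f"NESMA estimated size: {size} UFP, interval ~ [{low:.0f}, {high:.0f}] UFP")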

    Using locally weighted regression to estimate the functional size of software: a preliminary study

    In software engineering, measuring software functional size via the IFPUG (International Function Point Users Group) Function Point Analysis standard manual process can be a long and expensive activity. To solve this problem, several early estimation methods have been proposed and have become de facto standard processes. Among these, a prominent one is High-level Function Point Analysis. Recently, the Simple Function Point method has been released by IFPUG; although it is a proper measurement method, it is highly convertible to traditional Function Points and may be used as an estimation method. Both High-level Function Point Analysis and Simple Function Points skip the difficult and time-consuming activities needed to weight data and transaction functions. This makes the process faster and cheaper, but yields approximate measures. The accuracy of these methods has been evaluated, including via large-scale empirical studies, showing that the yielded approximate measures are sufficiently accurate for practical usage. In this paper, locally weighted regression is applied to the problem outlined above. This empirical study shows that estimates obtained via locally weighted regression are more accurate than those obtained via High-level Function Point Analysis, but are not substantially better than those yielded by alternative estimation methods using linear regression. The Simple Function Point method appears to yield measures that are well correlated with those obtained via standard measurement. In conclusion, locally weighted regression appears to be effective and accurate enough for estimating software functional size
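
    A minimal sketch of the core technique named in the title, locally weighted regression, applied to predicting functional size from a quickly countable proxy; the proxy, kernel, bandwidth and data are all illustrative assumptions, not the paper's experimental setup:

    import numpy as np

    # Hypothetical historical projects: x = quickly countable size proxy
    # (e.g., number of transactions), y = measured IFPUG Function Points.
    x_hist = np.array([12, 20, 35, 40, 55, 70, 90, 110, 140, 180], dtype=float)
    y_hist = np.array([80, 130, 220, 240, 330, 410, 530, 620, 800, 1000], dtype=float)

    def lwr_predict(x0, x, y, tau=30.0):
        """Locally weighted linear regression: fit a line around x0, giving more
        weight to historical projects whose proxy size is close to x0."""
        k = np.exp(-((x - x0) ** 2) / (2.0 * tau ** 2))   # Gaussian kernel weights
        slope, intercept = np.polyfit(x, y, 1, w=np.sqrt(k))
        return slope * x0 + intercept

    new_project_proxy = 65.0
    print(f"Estimated functional size: {lwr_predict(new_project_proxy, x_hist, y_hist):.0f} FP")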

    Recollimation Shocks in Magnetized Relativistic Jets

    We have performed two-dimensional special-relativistic magnetohydrodynamic simulations of non-equilibrium over-pressured relativistic jets in cylindrical geometry. Multiple stationary recollimation shock and rarefaction structures are produced along the jet by the nonlinear interaction of shocks and rarefaction waves excited at the interface between the jet and the surrounding ambient medium. Although the jet is initially kinematically dominated, we have considered axial, toroidal and helical magnetic fields to investigate the effects of different magnetic-field topologies and strengths on the recollimation structures. We find that an axial field introduces a larger effective gas pressure and leads to stronger recollimation shocks and rarefactions, resulting in larger flow variations; the jet boost grows quadratically with the initial magnetic field. On the other hand, a toroidal field leads to weaker recollimation shocks and rarefactions, significantly modifying the jet structure after the first recollimation rarefaction and shock, and the jet boost decreases systematically. For a helical field, instead, the behaviour depends on the magnetic pitch, with a phenomenology that ranges between those seen for axial and toroidal magnetic fields. In general, however, a helical magnetic field yields a more complex shock and rarefaction substructure close to the inlet that significantly modifies the jet structure. The differences in shock structure resulting from different field configurations and strengths may have observable consequences for disturbances propagating through a stationary recollimation shock. Comment: 14 pages, 15 figures and 1 table, accepted for publication in Ap
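
    For readers unfamiliar with the term, one common definition of the magnetic pitch used in studies of helically magnetized jets (the exact convention adopted in this paper may differ) is

    P = \frac{R \, B_z}{B_\phi},

    so that the axial-field and toroidal-field configurations discussed above correspond to the limits P \to \infty and P \to 0, respectively.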

    Cosmic Plasmas and Electromagnetic Phenomena

    During the past few decades, plasma science has witnessed great growth in laboratory studies, in simulations, and in space. Plasma is the most common phase of ordinary matter in the universe. It is a state in which ionized matter (even as little as 1% ionized) becomes highly electrically conductive, so that long-range electric and magnetic fields dominate its behavior. Cosmic plasmas are mostly associated with stars, supernovae, pulsars and neutron stars, and quasars and active galaxies in the vicinity of black holes (i.e., their jets and accretion disks). Cosmic plasma phenomena can be studied with different methods, such as laboratory experiments, astrophysical observations, and theoretical/computational approaches (e.g., MHD and particle-in-cell simulations). They exhibit a multitude of complex magnetohydrodynamic behaviors, acceleration, radiation, turbulence, and various instability phenomena. This Special Issue addresses the growing need for plasma science principles in astrophysics and presents our current understanding of the physics of astrophysical plasmas and their electromagnetic behaviors and properties (e.g., shocks, waves, turbulence, instabilities, collimation, acceleration and radiation), both microscopically and macroscopically. This Special Issue provides a series of state-of-the-art reviews from international experts in the field of cosmic plasmas and electromagnetic phenomena using theoretical approaches, astrophysical observations, laboratory experiments, and state-of-the-art simulation studies