118 research outputs found

    Primary gamma ray selection in a hybrid timing/imaging Cherenkov array

    This work is a methodological study of hybrid reconstruction techniques for combined imaging/timing Cherenkov observations. An array of this type is to be realized at the TAIGA gamma-ray observatory, intended for very-high-energy gamma-ray astronomy (>30 TeV); it combines the cost-effective timing-array technique with imaging telescopes. Hybrid operation of the two techniques offers a relatively inexpensive way to build a large-area array. The joint approach to gamma-event selection was investigated on both types of simulated data: the image parameters from the telescopes and the shower parameters reconstructed from the timing array. The optimal set of imaging and shower parameters to be combined is identified, and the cosmic-ray background suppression factor is calculated as a function of distance and energy. The optimal selection technique suppresses the cosmic-ray background by about two orders of magnitude at distances up to 450 m for energies above 50 TeV.
    Comment: 4 pages, 5 figures; proceedings of the 19th International Symposium on Very High Energy Cosmic Ray Interactions (ISVHECRI 2016)

    Role of gluons in soft and semi-hard multiple hadron production in pp collisions at LHC

    Hadron inclusive spectra in pp collisions are analyzed within the modified quark-gluon string model, including both the longitudinal and transverse motion of quarks in the proton, over a wide range of initial energies. A self-consistent analysis shows that the experimental data on the inclusive spectra of light hadrons such as pions and kaons at ISR energies can be satisfactorily described at transverse momenta not larger than 1-2 GeV/c. We discuss the difficulties of applying this model at energies above the ISR and suggest including the gluon distribution in the proton unintegrated over the internal transverse momentum. This leads to an increase in the inclusive spectra of hadrons and allows us to extend the satisfactory description of the data in the central rapidity region to energies higher than ISR.
    Comment: 19 pages, 20 figures

    Unintegrated gluon distribution and soft pp collisions at LHC

    We found a parameterization of the unintegrated gluon distribution from the best description of the LHC data on the inclusive spectra of hadrons produced in pp collisions in the mid-rapidity region at small transverse momenta. It differs from the one obtained within perturbative QCD only at low intrinsic transverse momenta k_t. Applying this distribution to the analysis of e-p deep inelastic scattering (DIS) yields results that do not contradict the H1 and ZEUS data on the structure functions at low x. Thus a connection between soft processes at the LHC and low-x physics at HERA is found.
    Comment: 8 pages, 4 figures. Contributed to the 3rd Workshop on Multi-Parton Interactions at the LHC (MPI11), Hamburg, 21-25 November 2011

    Hard-core Radius of Nucleons within the Induced Surface Tension Approach

    In this work we discuss a novel approach to modeling the hadronic and nuclear matter equations of state using the induced surface tension concept. Since the resulting equations of state, classical and quantum, are among the most successful in describing the properties of low-density phases of strongly interacting matter, they place strong restrictions on the possible value of the hard-core radius of nucleons. We therefore perform a detailed analysis of the value that follows from hadronic and nuclear matter properties and find its most trustworthy range: the hard-core radius of nucleons is 0.30--0.36 fm. A comparison with the phenomenology of neutron stars implies that the hard-core radius of nucleons has to be temperature and density dependent.
    Comment: 12 pages, 4 figures, references added, typos corrected

    Software implementation of the main cluster analysis tools

    This article discusses an approach to building a software suite that implements cluster analysis methods. A number of cluster analysis tools for processing an initial data set are analyzed, along with their software implementation and the difficulties of applying cluster analysis in practice. The data are treated as factual material that supplies information for the problem under study and forms the basis for discussion, analysis, and decision-making. Cluster analysis is a procedure that combines objects or variables into groups according to a given rule. The work groups multivariate data using proximity measures such as the sample correlation coefficient and its modulus, the cosine of the angle between vectors, and the Euclidean distance. The authors propose methods for grouping by centers, by nearest neighbor, and by selected standards. The results can be used by analysts when designing a data analysis pipeline and can improve the efficiency of clustering algorithms. The practical significance of the developed algorithms is demonstrated in a software package written in C++ in the Visual Studio environment.
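    The proximity measures and nearest-neighbor grouping described in this abstract can be sketched in C++ roughly as follows. This is a minimal illustration, not the authors' actual package: the function names, the distance threshold, and the single-pass assignment rule are assumptions made for the example.

    ```cpp
    #include <cassert>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Euclidean distance between two feature vectors of equal length.
    double euclidean(const std::vector<double>& a, const std::vector<double>& b) {
        double s = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i)
            s += (a[i] - b[i]) * (a[i] - b[i]);
        return std::sqrt(s);
    }

    // Cosine of the angle between two vectors (another proximity measure
    // mentioned in the abstract).
    double cosine(const std::vector<double>& a, const std::vector<double>& b) {
        double dot = 0.0, na = 0.0, nb = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) {
            dot += a[i] * b[i];
            na  += a[i] * a[i];
            nb  += b[i] * b[i];
        }
        return dot / (std::sqrt(na) * std::sqrt(nb));
    }

    // Nearest-neighbor grouping (assumed variant): each point joins the
    // cluster of its nearest already-labeled point if that distance is
    // below `threshold`; otherwise it starts a new cluster.
    std::vector<int> nearest_neighbor_groups(
        const std::vector<std::vector<double>>& pts, double threshold) {
        std::vector<int> label(pts.size(), -1);
        int next = 0;
        for (std::size_t i = 0; i < pts.size(); ++i) {
            double best = threshold;
            int bestLabel = -1;
            for (std::size_t j = 0; j < i; ++j) {
                double d = euclidean(pts[i], pts[j]);
                if (d < best) { best = d; bestLabel = label[j]; }
            }
            label[i] = (bestLabel >= 0) ? bestLabel : next++;
        }
        return label;
    }

    int main() {
        // Two tight pairs of 2-D points: expect two clusters.
        std::vector<std::vector<double>> pts = {
            {0.0, 0.0}, {0.1, 0.0}, {5.0, 5.0}, {5.1, 5.0}};
        std::vector<int> l = nearest_neighbor_groups(pts, 1.0);
        assert(l[0] == l[1]);
        assert(l[2] == l[3]);
        assert(l[0] != l[2]);
        assert(std::abs(cosine({1.0, 0.0}, {1.0, 0.0}) - 1.0) < 1e-12);
        return 0;
    }
    ```

    The grouping-by-centers and grouping-by-standards methods from the abstract would follow the same pattern, replacing the nearest-labeled-point rule with a distance to fixed centers or reference objects.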