81 research outputs found

    Document Word Clouds: Visualising Web Documents as Tag Clouds to Aid Users in Relevance Decisions

    Get PDF
    ΠΔρÎčέχΔÎč Ï„Îż Ï€Î»ÎźÏÎ”Ï‚ ÎșÎ”ÎŻÎŒÎ”ÎœÎżInformation Retrieval systems spend a great effort on determining the significant terms in a document. When, instead, a user is looking at a document he cannot benefit from such information. He has to read the text to understand which words are important. In this paper we take a look at the idea of enhancing the perception of web documents with visualisation techniques borrowed from the tag clouds of Web 2.0. Highlighting the important words in a document by using a larger font size allows to get a quick impression of the relevant concepts in a text. As this process does not depend on a user query it can also be used for explorative search. A user study showed, that already simple TF-IDF values used as notion of word importance helped the users to decide quicker, whether or not a document is relevant to a topic

    Semantic contextualisation of social tag-based profiles and item recommendations

    Full text link
    Proceedings of the 12th International Conference, EC-Web 2011, Toulouse, France, August 30 - September 1, 2011. The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-23014-1_9. We present an approach that efficiently identifies the semantic meanings and contexts of social tags within a particular folksonomy, and exploits them to build contextualised tag-based user and item profiles. We apply our approach to a dataset obtained from the Delicious social bookmarking system, and evaluate it through two experiments: a user study consisting of manual judgements of tag disambiguation and contextualisation cases, and an offline study measuring the performance of several tag-powered item recommendation algorithms using contextualised profiles. The results show that our approach accurately determines the actual semantic meanings and contexts of tag annotations, and allows item recommenders to achieve better precision and recall in their predictions. This work was supported by the Spanish Ministry of Science and Innovation (TIN2008-06566-C04-02) and the Community of Madrid (CCG10-UAM/TIC-5877).
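    A common building block for this kind of tag disambiguation is comparing tags by their co-occurrence profiles within the folksonomy. The sketch below is an assumption about one plausible ingredient, not the paper's actual algorithm: it builds a co-occurrence vector for a tag and compares vectors by cosine similarity.

```python
import math
from collections import Counter

def cooccurrence_profile(tag, annotations):
    """Vector of tags that co-occur with `tag` across annotations.

    annotations: iterable of tag sets, one per (user, item) bookmark.
    """
    profile = Counter()
    for tags in annotations:
        if tag in tags:
            profile.update(t for t in tags if t != tag)
    return profile

def cosine(a, b):
    """Cosine similarity between two sparse count vectors (Counters)."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0
```

    Clustering such profiles would separate, say, bookmarks where "apple" co-occurs with "fruit" from those where it co-occurs with "mac" — the kind of contextualisation the paper evaluates.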

    Results of the Search for Strange Quark Matter and Q-balls with the SLIM Experiment

    Full text link
    The SLIM experiment at the Chacaltaya high-altitude laboratory was sensitive to nuclearites and Q-balls, which could be present in the cosmic radiation as possible Dark Matter components. It was also sensitive to strangelets, i.e. small lumps of Strange Quark Matter predicted at such altitudes by various phenomenological models. The analysis of 427 m^2 of Nuclear Track Detectors exposed for 4.22 years showed no candidate event. New upper limits on the flux of downgoing nuclearites and Q-balls at the 90% C.L. were established. The null result also restricts models for strangelet propagation through the Earth's atmosphere. Comment: 14 pages, 11 EPS figures

    Time-Sensitive User Profile for Optimizing Search Personalization

    Get PDF
    Thanks to social Web services, Web search engines have the opportunity to offer personalized search results that better fit the user's information needs and interests. To achieve this goal, many personalized search approaches explore a user's social Web interactions to extract his preferences and interests, and use them to model his profile. In our approach, the user profile is implicitly represented as a vector of weighted terms corresponding to the user's interests extracted from his online social activities. As user interests may change over time, we propose to weight profile terms not only according to the content of these activities but also according to their freshness. More precisely, the weights are adjusted with a temporal feature. To evaluate our approach, we model the user profile from data collected from Twitter, then rerank initial search results according to the user profile. Moreover, we demonstrated the significance of the temporal feature by comparing our method with baseline models that do not consider the user profile dynamics.
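    One standard way to realise such a temporal feature is to discount each activity's term counts by an exponential decay in its age. The abstract does not specify the decay function, so the half-life parameterisation below is an assumption for illustration.

```python
import math

def temporal_term_weights(activities, now, half_life_days=30.0):
    """Build a user-profile term vector where older activities count less.

    activities:     list of (timestamp_in_days, token_list) pairs.
    now:            current time in days (same scale as timestamps).
    half_life_days: age at which an activity's contribution halves (assumed).
    """
    lam = math.log(2) / half_life_days
    weights = {}
    for ts, tokens in activities:
        decay = math.exp(-lam * (now - ts))   # 1.0 for fresh, -> 0 for old
        for tok in tokens:
            weights[tok] = weights.get(tok, 0.0) + decay
    return weights
```

    A term mentioned only in old tweets thus fades from the profile, while a term mentioned today dominates — which is exactly the dynamics the reranking experiment tests.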

    Fitting the integrated Spectral Energy Distributions of Galaxies

    Full text link
    Fitting the spectral energy distributions (SEDs) of galaxies is an almost universally used technique that has matured significantly in the last decade. Model predictions and fitting procedures have improved significantly over this time, attempting to keep up with the vastly increased volume and quality of available data. We review here the field of SED fitting, describing the modelling of ultraviolet to infrared galaxy SEDs, the creation of multiwavelength data sets, and the methods used to fit model SEDs to observed galaxy data sets. We touch upon the achievements and challenges in the major ingredients of SED fitting, with a special emphasis on describing the interplay between the quality of the available data, the quality of the available models, and the best fitting technique to use in order to obtain a realistic measurement as well as realistic uncertainties. We conclude that SED fitting can be used effectively to derive a range of physical properties of galaxies, such as redshift, stellar masses, star formation rates, dust masses, and metallicities, with care taken not to over-interpret the available data. Yet there still exist many issues such as estimating the age of the oldest stars in a galaxy, finer details of dust properties and dust-star geometry, and the influences of poorly understood, luminous stellar types and phases. The challenge for the coming years will be to improve both the models and the observational data sets to resolve these uncertainties. The present review will be made available on an interactive, moderated web page (sedfitting.org), where the community can access and change the text. The intention is to expand the text and keep it up to date over the coming years. Comment: 54 pages, 26 figures, Accepted for publication in Astrophysics & Space Science
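    At its simplest, fitting a model SED to observed photometry is a chi-squared minimisation over a grid of model templates, with a free linear normalisation (e.g. stellar mass) solved analytically for each template. The sketch below shows only this bare-bones version; real SED-fitting codes reviewed in the paper add priors, uncertainty estimation, and far richer model grids.

```python
import numpy as np

def fit_sed(fluxes, errors, model_grid):
    """Pick the best-fitting template by chi-squared over a model grid.

    fluxes, errors: observed band fluxes and their 1-sigma errors (arrays).
    model_grid:     dict name -> model fluxes in the same bands.
    Returns (best_name, chi2, scale) where `scale` is the analytic
    normalisation minimising chi^2 = sum(((f - a*m) / sigma)^2).
    """
    best = (None, np.inf, 0.0)
    for name, model in model_grid.items():
        # d(chi^2)/da = 0  =>  a = sum(f*m/s^2) / sum(m^2/s^2)
        a = np.sum(fluxes * model / errors**2) / np.sum(model**2 / errors**2)
        chi2 = np.sum(((fluxes - a * model) / errors) ** 2)
        if chi2 < best[1]:
            best = (name, chi2, a)
    return best
```

    The interplay the review emphasises shows up even here: the derived normalisation and the chi-squared landscape are only as trustworthy as the photometric errors and the template grid fed in.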

    Study of doubly strange systems using stored antiprotons

    Get PDF
    Bound nuclear systems with two units of strangeness are still poorly known despite their importance for many strong interaction phenomena. Stored antiproton beams in the GeV range represent an unparalleled factory for various hyperon-antihyperon pairs. Their outstandingly large production probability in antiproton collisions will open the floodgates for a series of new studies of systems containing two or even more units of strangeness at the P‟ANDA experiment at FAIR. For the first time, high resolution Îł-spectroscopy of doubly strange ΛΛ-hypernuclei will be performed, thus complementing measurements of ground state decays of ΛΛ-hypernuclei at J-PARC or possible decays of particle-unstable hypernuclei in heavy ion reactions. High resolution spectroscopy of multistrange Ξ−-atoms will be feasible and even the production of Ω−-atoms will be within reach. The latter might open the door to the |S|=3 world in strangeness nuclear physics through the study of the hadronic Ω−-nucleus interaction. For the first time it will be possible to study the behavior of Ξ‟+ in nuclear systems under well controlled conditions.

    Measurement of the azimuthal anisotropy of ΄(1S) and ΄(2S) mesons in PbPb collisions at √s_NN = 5.02 TeV

    Get PDF
    The second-order Fourier coefficients (v2) characterizing the azimuthal distributions of ΄(1S) and ΄(2S) mesons produced in PbPb collisions at √s_NN = 5.02 TeV are studied. The ΄ mesons are reconstructed in their dimuon decay channel, as measured by the CMS detector. The collected data set corresponds to an integrated luminosity of 1.7 nb⁻Âč. The scalar product method is used to extract the v2 coefficients of the azimuthal distributions. Results are reported for the rapidity range |y| < 2.4, in the transverse momentum interval 0 < pT < 50 GeV/c, and in three centrality ranges of 10–30%, 30–50% and 50–90%. In contrast to the J/ψ mesons, the measured v2 values for the ΄ mesons are found to be consistent with zero.
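    For intuition, v2 measures the cos 2φ modulation of particle yields around the event's symmetry plane. The toy below computes v2 relative to a known event-plane angle; the actual analysis uses the scalar product method with Q-vectors and resolution corrections, which this deliberately omits.

```python
import math

def v2_event_plane(phis, psi2):
    """Toy elliptic-flow estimate: v2 = <cos 2(phi - Psi2)>.

    phis: azimuthal angles of the reconstructed particles (radians).
    psi2: second-order event-plane angle (assumed known here).
    """
    return sum(math.cos(2 * (phi - psi2)) for phi in phis) / len(phis)
```

    A v2 consistent with zero, as found here for the ΄ mesons, means the dimuon yield shows no preferred alignment with the collision's elliptic geometry.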

    Measurement of prompt Dâ° and D‟â° meson azimuthal anisotropy and search for strong electric fields in PbPb collisions at √s_NN = 5.02 TeV

    Get PDF
    The strong Coulomb field created in ultrarelativistic heavy ion collisions is expected to produce a rapidity-dependent difference (Δv2) in the second Fourier coefficient of the azimuthal distribution (elliptic flow, v2) between Dâ° (ƫc) and D‟â° (uc̄) mesons. Motivated by the search for evidence of this field, the CMS detector at the LHC is used to perform the first measurement of Δv2. The rapidity-averaged value is found to be ⟹Δv2⟩ = 0.001 ± 0.001 (stat) ± 0.003 (syst) in PbPb collisions at √s_NN = 5.02 TeV. In addition, the influence of the collision geometry is explored by measuring the Dâ° and D‟â° meson v2 and triangular flow coefficient (v3) as functions of rapidity, transverse momentum (pT), and event centrality (a measure of the overlap of the two Pb nuclei). A clear centrality dependence of prompt Dâ° meson v2 values is observed, while v3 is largely independent of centrality. These trends are consistent with expectations of flow driven by the initial-state geometry.

    Performance of the CMS Level-1 trigger in proton-proton collisions at √s = 13 TeV

    Get PDF
    At the start of Run 2 in 2015, the LHC delivered proton-proton collisions at a center-of-mass energy of 13 TeV. During Run 2 (years 2015–2018) the LHC eventually reached a luminosity of 2.1 × 10³⁎ cm⁻ÂČ s⁻Âč, almost three times that reached during Run 1 (2009–2013) and a factor of two larger than the LHC design value, leading to events with up to a mean of about 50 simultaneous inelastic proton-proton collisions per bunch crossing (pileup). The CMS Level-1 trigger was upgraded prior to 2016 to improve the selection of physics events in the challenging conditions posed by the second run of the LHC. This paper describes the performance of the CMS Level-1 trigger upgrade during the data taking period of 2016–2018. The upgraded trigger implements pattern recognition and boosted decision tree regression techniques for muon reconstruction, includes pileup subtraction for jets and energy sums, and incorporates pileup-dependent isolation requirements for electrons and tau leptons. In addition, the new trigger calculates high-level quantities such as the invariant mass of pairs of reconstructed particles. The upgrade reduces the trigger rate from background processes and improves the trigger efficiency for a wide variety of physics signals.

    Studies of charm and beauty hadron long-range correlations in pp and pPb collisions at LHC energies

    Get PDF
    • 
