
    HypTrails: A Bayesian Approach for Comparing Hypotheses About Human Trails on the Web

    When users interact with the Web today, they leave sequential digital trails on a massive scale. Examples of such human trails include Web navigation, sequences of online restaurant reviews, or online music playlists. Understanding the factors that drive the production of these trails can be useful for, e.g., improving underlying network structures, predicting user clicks, or enhancing recommendations. In this work, we present a general approach called HypTrails for comparing a set of hypotheses about human trails on the Web, where hypotheses represent beliefs about transitions between states. Our approach utilizes Markov chain models with Bayesian inference. The main idea is to incorporate hypotheses as informative Dirichlet priors and to leverage the sensitivity of Bayes factors to the prior for comparing hypotheses with each other. For eliciting Dirichlet priors from hypotheses, we present an adaptation of the so-called (trial) roulette method. We demonstrate the general mechanics and applicability of HypTrails by performing experiments with (i) synthetic trails for which we control the mechanisms that produced them and (ii) empirical trails stemming from different domains, including website navigation, business reviews, and online music plays. Our work expands the repertoire of methods available for studying human trails on the Web. Comment: Published in the proceedings of WWW'1
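    The comparison the abstract describes can be sketched numerically: under a first-order Markov chain with row-wise Dirichlet priors, the marginal likelihood of observed transition counts has a closed form, and a hypothesis whose prior concentrates mass on the transitions actually taken receives higher evidence. A minimal sketch (function names and toy data are ours, and HypTrails itself additionally elicits the priors via the trial roulette method, which is not shown here):

```python
import numpy as np
from scipy.special import gammaln

def log_evidence(counts, prior):
    """Log marginal likelihood of transition counts under row-wise
    Dirichlet priors: sum over rows of log B(prior + counts) - log B(prior),
    where B is the multivariate beta function."""
    post = prior + counts
    return (np.sum(gammaln(post)) - np.sum(gammaln(post.sum(axis=1)))
            - np.sum(gammaln(prior)) + np.sum(gammaln(prior.sum(axis=1))))

# toy trails over 2 states: transitions mostly 0 -> 1 and 1 -> 0
counts = np.array([[1.0, 9.0],
                   [8.0, 2.0]])

uniform = np.ones((2, 2))                     # "no structure" hypothesis
swap = 1.0 + 4.0 * np.array([[0.0, 1.0],
                             [1.0, 0.0]])     # belief: states alternate

# the hypothesis matching the data receives higher evidence
print(log_evidence(counts, swap) > log_evidence(counts, uniform))  # → True
```

Expressing each hypothesis as pseudo-counts added to a common base prior is what makes the Bayes-factor comparison sensitive to the hypotheses rather than to the amount of data alone.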

    Enabling quantitative data analysis through e-infrastructures

    This paper discusses how quantitative data analysis in the social sciences can engage with and exploit an e-Infrastructure. We highlight how a number of activities which are central to quantitative data analysis, referred to as ‘data management’, can benefit from e-Infrastructure support. We conclude by discussing how these issues are relevant to the DAMES (Data Management through e-Social Science) research Node, an ongoing project that aims to develop e-Infrastructural resources for quantitative data analysis in the social sciences.
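    A concrete example of the 'data management' activities in question is linking survey records to a standard classification. A minimal illustration with hypothetical data (the variables and the use of pandas are our choices, not anything the paper prescribes):

```python
import pandas as pd

# hypothetical survey file and occupation lookup table
survey = pd.DataFrame({"id": [1, 2, 3],
                       "occ_code": ["A1", "B2", "A1"]})
lookup = pd.DataFrame({"occ_code": ["A1", "B2"],
                       "occ_label": ["teacher", "nurse"]})

# left-join keeps every survey record, attaching the standard label
linked = survey.merge(lookup, on="occ_code", how="left")
print(linked["occ_label"].tolist())  # → ['teacher', 'nurse', 'teacher']
```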

    Sharing interoperable workflow provenance: A review of best practices and their practical application in CWLProv

    Background: The automation of data analysis in the form of scientific workflows has become a widely adopted practice in many fields of research. Computationally driven data-intensive experiments using workflows enable Automation, Scaling, Adaption and Provenance support (ASAP). However, there are still several challenges associated with the effective sharing, publication and reproducibility of such workflows due to the incomplete capture of provenance and lack of interoperability between different technical (software) platforms. Results: Based on best practice recommendations identified from the literature on workflow design, sharing and publishing, we define a hierarchical provenance framework to achieve uniformity in the provenance and support comprehensive and fully re-executable workflows equipped with domain-specific information. To realise this framework, we present CWLProv, a standards-based format to represent any workflow-based computational analysis to produce workflow output artefacts that satisfy the various levels of provenance. We utilise open source community-driven standards: interoperable workflow definitions in Common Workflow Language (CWL), structured provenance representation using the W3C PROV model, and resource aggregation and sharing as workflow-centric Research Objects (RO) generated along with the final outputs of a given workflow enactment. We demonstrate the utility of this approach through a practical implementation of CWLProv and evaluation using real-life genomic workflows developed by independent groups. Conclusions: The underlying principles of the standards utilised by CWLProv enable semantically rich and executable Research Objects that capture computational workflows with retrospective provenance such that any platform supporting CWL will be able to understand the analysis, re-use the methods for partial re-runs, or reproduce the analysis to validate the published findings. Submitted to GigaScience (GIGA-D-18-00483
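    For context, a CWL tool description is a small YAML document; CWLProv captures provenance around the enactment of definitions like this one. The fragment below is a deliberately trivial illustration of the format, not taken from the genomic workflows evaluated in the paper:

```yaml
cwlVersion: v1.0
class: CommandLineTool
baseCommand: echo
inputs:
  message:
    type: string
    inputBinding:
      position: 1
outputs:
  out:
    type: stdout
stdout: output.txt
```

Because the definition declares its inputs, outputs, and command explicitly, an enactment engine can record which concrete values and files flowed through each step, which is the retrospective provenance CWLProv serialises with W3C PROV.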

    BN domains included into carbon nanotubes: role of interface

    We present a density functional theory study of the shape and arrangement of small BN domains embedded into single-walled carbon nanotubes. We show a strong tendency toward the formation of BN hexagons upon the simultaneous inclusion of B and N atoms within the walls of carbon nanotubes. The work emphasizes the importance of a correct description of the BN-C frontier. We suggest that the BN-C interface will be formed preferentially with the participation of N-C bonds. Thus, we propose a new way of stabilizing small BN inclusions through the formation of nitrogen-terminated borders. A comparison between the obtained results and the available experimental data on the formation of BN plackets within single-walled carbon nanotubes is presented. The mirror situation, the inclusion of carbon plackets within single-walled BN nanotubes, is considered within the proposed formalism. Finally, we show that the inclusion of small BN plackets inside the CNTs strongly affects the electronic character of the initial systems, opening a band gap. The nitrogen excess in the BN plackets introduces donor states in the band gap and might thus offer a promising route to n-doping single-walled carbon nanotubes.

    An Effective-Medium Tight-Binding Model for Silicon

    A new method for calculating the total energy of Si systems is presented. The method is based on the effective-medium theory concept of a reference system. Instead of calculating the energy of an atom in the system of interest, a reference system is introduced where the local surroundings are similar. The energy of the reference system can be calculated self-consistently once and for all, while the energy difference to the reference system can be obtained approximately. We propose to calculate it using the tight-binding LMTO scheme with the Atomic-Sphere Approximation (ASA) for the potential, and by using the ASA with charge-conserving spheres we are able to treat open systems without introducing empty spheres. All steps in the calculational method are {\em ab initio} in the sense that all quantities entering are calculated from first principles without any fitting to experiment. A complete and detailed description of the method is given together with test calculations of the energies of phonons, elastic constants, different structures, surfaces and surface reconstructions. We compare the results to calculations using an empirical tight-binding scheme. Comment: 26 pages (11 uuencoded Postscript figures appended), LaTeX, CAMP-090594-
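    As a generic illustration of the tight-binding ingredient, the simplest possible case is the single-band dispersion of a 1D chain with nearest-neighbour hopping. This toy model is ours for illustration only; it is not the paper's LMTO-ASA scheme:

```python
import numpy as np

def tb_dispersion(k, eps0=0.0, t=1.0, a=1.0):
    """Nearest-neighbour tight-binding band for a 1D chain:
    E(k) = eps0 - 2 t cos(k a), with on-site energy eps0 and hopping t."""
    return eps0 - 2.0 * t * np.cos(k * a)

# sample the first Brillouin zone
k = np.linspace(-np.pi, np.pi, 201)
E = tb_dispersion(k)

# the single band spans a bandwidth of 4 t
print(E.max() - E.min())  # → 4.0
```

The same idea, matrix elements between localized orbitals, underlies the tight-binding LMTO representation, where the basis and the potential are obtained from first principles rather than fitted.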

    Discovery of the Optical Transient of the Gamma Ray Burst 990308

    The optical transient of the faint Gamma Ray Burst 990308 was detected by the QUEST camera on the Venezuelan 1-m Schmidt telescope starting 3.28 hours after the burst. Our photometry gives $V = 18.32 \pm 0.07$, $R = 18.14 \pm 0.06$, $B = 18.65 \pm 0.23$, and $R = 18.22 \pm 0.05$ for times ranging from 3.28 to 3.47 hours after the burst. The colors correspond to a spectral slope of close to $f_{\nu} \propto \nu^{1/3}$. Within the standard synchrotron fireball model, this requires that the external medium be less dense than $10^{4}\,\mathrm{cm}^{-3}$, the electrons contain $> 20\%$ of the shock energy, and the magnetic field energy be less than 24% of the energy in the electrons for normal interstellar or circumstellar densities. We also report upper limits of $V > 12.0$ at 132 s (with LOTIS), $V > 13.4$ from 132-1029 s (with LOTIS), $V > 15.3$ at 28.2 min (with Super-LOTIS), and an 8.5 GHz flux of $< 114\,\mu$Jy at 110 days (with the Very Large Array). WIYN 3.5-m and Keck 10-m telescopes reveal this location to be empty of any host galaxy to $R > 25.7$ and $K > 23.3$. The lack of a host galaxy likely implies that it is either substantially subluminous or more distant than a redshift of $\sim 1.2$. Comment: ApJ Lett submitted, 5 pages, 2 figures, no space for 12 coauthor
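    The quoted spectral slope comes from comparing fluxes at two frequencies under a power-law assumption, $f_{\nu} \propto \nu^{\alpha}$. A minimal sketch with synthetic numbers (not the paper's calibrated photometry, which requires band zero points):

```python
import math

def spectral_index(f1, nu1, f2, nu2):
    """Power-law index alpha assuming f_nu ∝ nu^alpha between two
    flux measurements (f1 at frequency nu1, f2 at nu2)."""
    return math.log(f1 / f2) / math.log(nu1 / nu2)

# synthetic check: fluxes drawn from an exact nu^(1/3) law recover alpha = 1/3
alpha = spectral_index(1.0, 1.0, 2.0 ** (1.0 / 3.0), 2.0)
print(round(alpha, 4))  # → 0.3333
```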

    On the shortness of vectors to be found by the Ideal-SVP quantum algorithm

    The hardness of finding short vectors in ideals of cyclotomic number fields (hereafter, Ideal-SVP) can serve as a worst-case assumption for numerous efficient cryptosystems, via the average-case problems Ring-SIS and Ring-LWE. For a while, it could be assumed that the Ideal-SVP problem was as hard a
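    To make "finding short vectors" concrete in the simplest possible setting, here is classical Lagrange-Gauss reduction for a rank-2 lattice. This toy is far removed from cyclotomic ideal lattices (where dimensions are large and the algebraic structure is the whole point), but it illustrates the underlying problem; the names and example basis are ours:

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

def gauss_reduce(u, v):
    """Lagrange-Gauss reduction of a 2D lattice basis; returns a reduced
    basis whose first vector is a shortest nonzero lattice vector."""
    while True:
        if dot(u, u) > dot(v, v):
            u, v = v, u                      # keep u the shorter vector
        m = round(dot(u, v) / dot(u, u))     # nearest-integer projection
        if m == 0:
            return u, v
        v = (v[0] - m * u[0], v[1] - m * u[1])

# basis of determinant 1, so the lattice is all of Z^2 and the
# shortest nonzero vector has squared length 1
u, v = gauss_reduce((5, 8), (8, 13))
print(dot(u, u))  # → 1
```

In dimension 2 this is exact and fast; the cryptographic question is how much the ideal structure helps quantum algorithms in high dimensions, where exact SVP is otherwise intractable.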