    Interfacing GHz-bandwidth heralded single photons with a room-temperature Raman quantum memory

    Photonics is a promising platform for quantum technologies. However, photon sources and two-photon gates currently only operate probabilistically. Large-scale photonic processing will therefore be impossible without a multiplexing strategy to actively select successful events. High time-bandwidth-product quantum memories - devices that store and retrieve single photons on demand - provide an efficient remedy via active synchronisation. Here we interface a GHz-bandwidth heralded single-photon source and a room-temperature Raman memory with a time-bandwidth product exceeding 1000. We store heralded single photons and observe a clear influence of the input photon statistics on the retrieved light, which agrees with our theoretical model. The preservation of the stored field's statistics is limited by four-wave-mixing noise, which we identify as the key remaining challenge in the development of practical memories for scalable photonic information processing.

    From Frequency to Meaning: Vector Space Models of Semantics

    Computers understand very little of the meaning of human language. This profoundly limits our ability to give instructions to computers, the ability of computers to explain their actions to us, and the ability of computers to analyse and process text. Vector space models (VSMs) of semantics are beginning to address these limits. This paper surveys the use of VSMs for semantic processing of text. We organize the literature on VSMs according to the structure of the matrix in a VSM. There are currently three broad classes of VSMs, based on term-document, word-context, and pair-pattern matrices, yielding three classes of applications. We survey a broad range of applications in these three categories and we take a detailed look at a specific open source project in each category. Our goal in this survey is to show the breadth of applications of VSMs for semantics, to provide a new perspective on VSMs for those who are already familiar with the area, and to provide pointers into the literature for those who are less familiar with the field.
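    The term-document matrix mentioned above is the simplest of the three matrix types. The following is a minimal sketch of how such a matrix supports similarity comparisons; the toy corpus and helper names are invented for illustration, not taken from the surveyed systems.

```python
from collections import Counter
import math

# Toy corpus (invented for illustration): each document is a bag of words.
docs = {
    "d1": "quantum memory stores photons",
    "d2": "photon memory quantum light",
    "d3": "vector space models of semantics",
}

# Term-document matrix: one term-count vector per document, over a shared vocabulary.
counts = {name: Counter(text.split()) for name, text in docs.items()}
vocab = sorted(set().union(*counts.values()))
matrix = {name: [counts[name][t] for t in vocab] for name in docs}

def cosine(u, v):
    """Cosine of the angle between two term vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Documents sharing vocabulary are closer in the vector space.
print(cosine(matrix["d1"], matrix["d2"]))  # shared terms -> 0.5
print(cosine(matrix["d1"], matrix["d3"]))  # no shared terms -> 0.0
```

    Word-context and pair-pattern matrices follow the same pattern with different rows and columns (terms x contexts, word pairs x joining patterns).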

    General cost analysis for scholarly communication in Germany : results of the "Houghton Report" for Germany

    Management Summary: Conducted within the project "Economic Implications of New Models for Information Supply for Science and Research in Germany", the Houghton Report for Germany provides a general cost and benefit analysis for scholarly communication in Germany, comparing different scenarios according to their specific costs and explicitly including the German National License Program (NLP). Based on the scholarly lifecycle process model outlined by Björk (2007), the study compared the following scenarios according to their accounted costs:
    - traditional subscription publishing;
    - Open Access publishing ('Gold Open Access'): refers primarily to journal publishing where access is free of charge to readers, while the authors or funding organisations pay for publication;
    - Open Access self-archiving: authors deposit their work in online Open Access institutional or subject-based repositories, making it freely available to anyone with Internet access; further divided into (i) 'Green Open Access' self-archiving operating in parallel with subscription publishing, and (ii) the 'overlay services' model, in which self-archiving provides the foundation for overlay services (e.g. peer review, branding and quality-control services);
    - the NLP.
    Within all scenarios, five core activity elements (fund research and research communication; perform research and communicate the results; publish scientific and scholarly works; facilitate dissemination, retrieval and preservation; study publications and apply the knowledge) were modelled and priced, together with all the activities they comprise. Modelling the impacts of an increase in accessibility and efficiency resulting from more Open Access on returns to R&D over a 20-year period and then comparing costs and benefits, we find that the benefits of Open Access publishing models are likely to substantially outweigh the costs and that, while smaller, the benefits of the German NLP also exceed its costs.
    This analysis of the potential benefits of more Open Access to research findings suggests that different publishing models can make a material difference to the benefits realised, as well as the costs faced. It seems likely that more Open Access would have substantial net benefits in the longer term and, while net benefits may be lower during a transitional period, they are likely to be positive both for 'author-pays' Open Access publishing and the 'overlay journals' alternative ('Gold Open Access'), and for parallel subscription publishing and self-archiving ('Green Open Access'). The NLP returns substantial benefits and savings at a modest cost, yielding one of the highest benefit/cost ratios available from unilateral national policies during a transitional period (second only to 'Green Open Access' self-archiving). Whether 'Green Open Access' self-archiving in parallel with subscriptions is a sustainable model over the longer term is debatable, and what impact the NLP may have on the take-up of Open Access alternatives is also an important consideration, as is the potential for developments in Open Access or other scholarly publishing business models to significantly change the relative cost-benefit of the NLP over time. The results are comparable to those of previous studies from the UK and the Netherlands. Green Open Access in parallel with the traditional model yields the best benefit/cost ratio. Besides its benefit/cost ratio, the case for the NLP rests on its enforceability. The true costs of toll-access publishing (besides the 'buyback' of information) include the denial of access to research and knowledge to society at large.
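    The comparison the report performs - discounting each scenario's costs and benefits over a 20-year horizon and forming a benefit/cost ratio - can be sketched as follows. All figures and the discount rate below are placeholder numbers, not values from the Houghton Report.

```python
# Minimal sketch of a discounted benefit/cost comparison across publishing
# scenarios; annual costs, benefits, and the discount rate are invented.

def npv(cashflows, rate):
    """Net present value of a stream of annual cashflows."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows, start=1))

HORIZON = 20   # years, matching the report's modelling window
RATE = 0.035   # illustrative discount rate

# (annual cost, annual benefit) in arbitrary currency units -- invented.
scenarios = {
    "subscription": (100.0, 100.0),
    "gold_oa": (90.0, 120.0),
    "green_oa": (85.0, 130.0),
    "nlp": (95.0, 115.0),
}

for name, (cost, benefit) in scenarios.items():
    pv_cost = npv([cost] * HORIZON, RATE)
    pv_benefit = npv([benefit] * HORIZON, RATE)
    print(f"{name}: benefit/cost = {pv_benefit / pv_cost:.2f}")
```

    With constant annuities the ratio reduces to benefit/cost per year; the report's actual model varies flows over time and across the five activity elements.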

    Towards optical intensity interferometry for high angular resolution stellar astrophysics

    Most neighboring stars are still detected as point sources and are beyond the angular resolution reach of current observatories. Methods to improve our understanding of stars at high angular resolution are investigated. Air Cherenkov telescopes (ACTs), primarily used for gamma-ray astronomy, enable us to increase our understanding of the circumstellar environment of a particular system. When used as optical intensity interferometers, future ACT arrays will allow us to detect stars as extended objects and image their surfaces at high angular resolution. Optical stellar intensity interferometry (SII) with ACT arrays, composed of nearly 100 telescopes, will provide means to measure fundamental stellar parameters and also open the possibility of model-independent imaging. A data analysis algorithm is developed and permits the reconstruction of high angular resolution images from simulated SII data. The capabilities and limitations of future ACT arrays used for high angular resolution imaging are investigated via Monte-Carlo simulations. Simple stellar objects as well as stellar surfaces with localized hot or cool regions can be accurately imaged. Finally, experimental efforts to measure intensity correlations are expounded. The functionality of analog and digital correlators is demonstrated. Intensity correlations have been measured for a simulated star emitting pseudo-thermal light, resulting in angular diameter measurements. The StarBase observatory, consisting of a pair of 3 m telescopes separated by 23 m, is described. Comment: PhD dissertation.

    Access and usability issues of scholarly electronic publications

    This chapter looks at the various access and usability issues related to scholarly information resources. It first looks at the various channels through which a user can get access to scholarly electronic publications. It then discusses the issues and studies surrounding usability. Some important parameters for measuring the usability of information access systems have been identified. Finally the chapter looks at the major problems facing the users in getting access to scholarly information through today's hybrid libraries, and mentions some possible measures to resolve these problems.

    Stellar Intensity Interferometry: Prospects for sub-milliarcsecond optical imaging

    Using kilometric arrays of air Cherenkov telescopes, intensity interferometry may increase the spatial resolution in optical astronomy by an order of magnitude, enabling images of rapidly rotating stars with structures in their circumstellar disks and winds, or mapping out patterns of nonradial pulsations across stellar surfaces. Intensity interferometry (pioneered by Hanbury Brown and Twiss) connects telescopes only electronically, and is practically insensitive to atmospheric turbulence and optical imperfections, permitting observations over long baselines and through large airmasses, also at short optical wavelengths. The required large telescopes with very fast detectors are becoming available as arrays of air Cherenkov telescopes, distributed over a few square km. Digital signal handling enables very many baselines to be synthesized, while stars are tracked with electronic time delays, thus synthesizing an optical interferometer in software. Simulated observations indicate limiting magnitudes around m(v)=8, reaching resolutions ~30 microarcsec in the violet. The signal-to-noise ratio favors high-temperature sources and emission-line structures, and is independent of the optical passband, be it a single spectral line or the broad spectral continuum. Intensity interferometry provides the modulus (but not phase) of any spatial frequency component of the source image; for this reason image reconstruction requires phase retrieval techniques, feasible if sufficient coverage of the interferometric (u,v)-plane is available. Experiments are in progress; test telescopes have been erected, and trials in connecting large Cherenkov telescopes have been carried out. This paper reviews this interferometric method in view of the new possibilities offered by arrays of air Cherenkov telescopes, and outlines observational programs that should become realistic in the rather near future. Comment: New Astronomy Reviews, in press; 101 pages, 11 figures, 185 references.
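    The statement that intensity interferometry yields the modulus but not the phase of the visibility can be made concrete with the Hanbury Brown-Twiss relation for chaotic light, g2(b) = 1 + |V(b)|^2. The sketch below evaluates |V|^2 for a uniform stellar disk at several baselines; the star size, wavelength, and helper names are illustrative assumptions, not parameters from the paper.

```python
import math

# HBT relation for thermal light: g2(b) = 1 + |V(b)|**2, where V(b) is the
# complex visibility at baseline b. Only the modulus survives, which is why
# image reconstruction needs phase retrieval.

def j1(x):
    """Bessel function J1 via its power series (adequate for small |x|)."""
    total, term = 0.0, x / 2.0
    for k in range(30):
        total += term
        term *= -(x / 2.0) ** 2 / ((k + 1) * (k + 2))
    return total

def squared_visibility(baseline_m, diameter_rad, wavelength_m):
    """|V|^2 for a uniform stellar disk observed at a given baseline."""
    x = math.pi * baseline_m * diameter_rad / wavelength_m
    if x == 0.0:
        return 1.0
    return (2.0 * j1(x) / x) ** 2

# A ~0.4 milliarcsecond star observed at 400 nm (violet) -- invented values.
theta = 0.4e-3 / 206265.0   # angular diameter in radians
lam = 400e-9                # wavelength in metres
for b in (0.0, 50.0, 100.0, 200.0):   # baselines in metres
    g2 = 1.0 + squared_visibility(b, theta, lam)
    print(f"baseline {b:5.0f} m: g2 - 1 = {g2 - 1:.3f}")
```

    The correlation excess g2 - 1 falls with baseline as the disk becomes resolved, which is how the angular diameter is measured from intensity correlations alone.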

    Digital Library Evaluation: Toward an Evolution of Concepts

    Published or submitted for publication.

    High-speed noise-free optical quantum memory

    Quantum networks promise to revolutionise computing, simulation, and communication. Light is the ideal information carrier for quantum networks, as its properties are not degraded by noise in ambient conditions, and it can support large bandwidths enabling fast operations and a large information capacity. Quantum memories, devices that store, manipulate, and release quantum light on demand, have been identified as critical components of photonic quantum networks, because they facilitate scalability. However, any noise introduced by the memory can render the device classical by destroying the quantum character of the light. Here we introduce an intrinsically noise-free memory protocol based on two-photon off-resonant cascaded absorption (ORCA). We consequently demonstrate for the first time successful storage of GHz-bandwidth heralded single photons in a warm atomic vapour with no added noise; confirmed by the unaltered photon statistics upon recall. Our ORCA memory platform meets the stringent noise requirements for quantum memories whilst offering technical simplicity and high-speed operation, and therefore is immediately applicable to low-latency quantum networks.
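    The "unaltered photon statistics" criterion is conventionally quantified by the heralded second-order correlation g2(0): an ideal single photon gives g2(0) = 0, any value below 0.5 is nonclassical, and added noise pulls it toward 1. The sketch below computes g2(0) from coincidence counts in the standard way; all count values are invented, not data from the experiment.

```python
# Heralded g2(0) from raw counts: N_h heralds, N_h1 and N_h2 two-fold
# coincidences between the herald and each output detector, N_h12 three-fold
# coincidences. All numbers below are illustrative placeholders.

def heralded_g2(n_heralds, n_h1, n_h2, n_h12):
    """Heralded second-order correlation g2(0) from coincidence counts."""
    return (n_heralds * n_h12) / (n_h1 * n_h2)

# Invented counts: input photon vs photon retrieved from the memory.
g2_in = heralded_g2(n_heralds=1_000_000, n_h1=20_000, n_h2=21_000, n_h12=42)
g2_out = heralded_g2(n_heralds=1_000_000, n_h1=4_000, n_h2=4_200, n_h12=2)
print(f"input  g2(0) = {g2_in:.2f}")
print(f"output g2(0) = {g2_out:.2f}")
```

    A noise-free memory leaves g2(0) essentially unchanged between input and output, whereas four-wave-mixing noise (as in Raman memories) raises the output value.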