
    Crowdsourcing Linked Data on listening experiences through reuse and enhancement of library data

    Research has approached the practice of musical reception in a multitude of ways, such as the analysis of professional critique, sales figures and psychological processes activated by the act of listening. Studies in the Humanities, on the other hand, have been hindered by the lack of structured evidence of actual experiences of listening as reported by the listeners themselves, a concern that has been voiced since the early Web era. It was, however, assumed that such evidence existed, albeit in purely textual form, but could not be leveraged until it was digitised and aggregated. The Listening Experience Database (LED) responds to this research need by providing a centralised hub for evidence of listening in the literature. Not only does LED support search and reuse across nearly 10,000 records, but it also provides machine-readable structured data of the knowledge around the contexts of listening. To take advantage of the mass of formal knowledge that already exists on the Web concerning these contexts, the entire framework adopts Linked Data principles and technologies. This also allows LED to directly reuse open data from the British Library for the source documentation that is already published. Reused data are re-published as open data with enhancements obtained by expanding on the model of the original data, such as the partitioning of published books and collections into individual stand-alone documents. The database was populated through crowdsourcing and seamlessly incorporates data reuse from the very early data entry phases. As the sources of the evidence often contain vague, fragmentary or uncertain information, facilities were put in place to generate structured data out of such fuzziness. Alongside elaborating on these functionalities, this article provides insights into the most recent features of the latest instalment of the dataset and portal, such as the interlinking with the MusicBrainz database, the relaxation of geographical input constraints through text mining, and the plotting of key locations in an interactive geographical browser.
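
    As a concrete illustration of the Linked Data approach described above, the sketch below builds a single listening-experience record with rdflib, reusing an external document URI and an artist link in the way the abstract describes for British Library and MusicBrainz data. All namespaces, URIs and property names are hypothetical placeholders, not the actual LED vocabulary.

        # Minimal sketch of a listening-experience record as Linked Data.
        # Every namespace, URI and property name below is an illustrative
        # assumption, NOT the real LED vocabulary or British Library identifier.
        from rdflib import Graph, Literal, Namespace, URIRef

        LED = Namespace("http://example.org/led/vocab#")            # hypothetical vocabulary
        source = URIRef("http://example.org/bl/book/123")           # stand-in for a reused BL record
        artist = URIRef("http://example.org/musicbrainz/artist/1")  # stand-in for a MusicBrainz link

        g = Graph()
        exp = URIRef("http://example.org/led/experience/1")
        g.add((exp, LED.reportedIn, source))            # evidence found in a published source
        g.add((exp, LED.listenedTo, artist))            # interlinking with an external database
        g.add((exp, LED.date, Literal("c. 1850")))      # vague dates kept as free-form literals
        g.add((exp, LED.place, Literal("London")))      # place names later resolved by text mining

        print(g.serialize(format="turtle"))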

    Attacks on quantum key distribution protocols that employ non-ITS authentication

    We demonstrate how adversaries with unbounded computing resources can break Quantum Key Distribution (QKD) protocols which employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not Information-Theoretically Secure (ITS), since for each message the eavesdropper has intercepted she is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced it was shown to prevent straightforward Man-In-The-Middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains, with high probability, a message that has small Hamming distance to any other given message. Based on this fact we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols we describe every single action taken by the adversary. For all protocols the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods which allow colliding messages to be calculated, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures, increasing the eavesdropper's demand for computational power, and also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level. Comment: 34 pages.
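
    The core of the attack idea, namely finding a colliding message that is also close in Hamming distance to the message the adversary wants to inject, can be illustrated with a toy search. The deliberately weak 12-bit truncated hash below is a stand-in for a publicly known hash whose collisions also collide under the full tag; it is not the authentication code analysed in the paper.

        # Toy near-collision search: look for a message within small Hamming
        # distance of the adversary's preferred message that hashes to the same
        # value as the intercepted legitimate message.  The 12-bit hash is a
        # deliberately weak illustration, not the code attacked in the paper.
        import hashlib
        from itertools import combinations

        def public_hash(msg: bytes, bits: int = 12) -> int:
            return int.from_bytes(hashlib.sha256(msg).digest(), "big") >> (256 - bits)

        def flip(msg: bytes, positions) -> bytes:
            out = bytearray(msg)
            for p in positions:
                out[p // 8] ^= 1 << (p % 8)
            return bytes(out)

        m_legit = b"basis string announced by Alice"   # intercepted message
        m_adv   = b"basis string chosen by the spy"    # message Eve wants to send
        target = public_hash(m_legit)

        n_bits = len(m_adv) * 8
        found = None
        for d in range(3):                             # try Hamming distance 0, 1, 2
            for pos in combinations(range(n_bits), d):
                cand = flip(m_adv, pos)
                if public_hash(cand) == target:
                    found = (d, cand)
                    break
            if found:
                break

        if found:
            print(f"near-collision at Hamming distance {found[0]}: {found[1]!r}")
        else:
            print("no near-collision found within distance 2")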

    Estimates for practical quantum cryptography

    In this article I present a protocol for quantum cryptography which is secure against attacks on individual signals. It is based on the Bennett-Brassard protocol of 1984 (BB84). The security proof is complete as far as the use of single photons as signal states is concerned. Emphasis is given to the practicability of the resulting protocol. For each run of the quantum key distribution, the security statement gives the probability of successful key generation and the probability that an eavesdropper's knowledge, measured as a change in Shannon entropy, is below a specified maximal value. Comment: Authentication scheme corrected. Other improvements of presentation.
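
    For orientation, the sketch below evaluates the kind of quantities the abstract refers to: the binary Shannon entropy of an observed error rate and the familiar asymptotic key-fraction estimate 1 - 2h(e). This is purely illustrative and is not the finite-statistics security statement derived in the article.

        # Binary Shannon entropy and a standard asymptotic BB84 key-fraction
        # estimate, shown only to illustrate the quantities involved; the
        # article itself derives a per-run, finite-size security statement.
        import math

        def binary_entropy(e: float) -> float:
            if e <= 0.0 or e >= 1.0:
                return 0.0
            return -e * math.log2(e) - (1 - e) * math.log2(1 - e)

        def asymptotic_key_fraction(e: float) -> float:
            # one h(e) for error correction, one h(e) for privacy amplification
            return max(0.0, 1.0 - 2.0 * binary_entropy(e))

        for qber in (0.01, 0.05, 0.11):
            print(f"QBER {qber:.2f}: h(e) = {binary_entropy(qber):.3f}, "
                  f"key fraction ~ {asymptotic_key_fraction(qber):.3f}")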

    Quantum Nonlocality without Entanglement

    We exhibit an orthogonal set of product states of two three-state particles that nevertheless cannot be reliably distinguished by a pair of separated observers ignorant of which of the states has been presented to them, even if the observers are allowed any sequence of local operations and classical communication between the separate observers. It is proved that there is a finite gap between the mutual information obtainable by a joint measurement on these states and a measurement in which only local actions are permitted. This result implies the existence of separable superoperators that cannot be implemented locally. A set of states is found involving three two-state particles which also appear to be nonmeasurable locally. These and other multipartite states are classified according to the entropy and entanglement costs of preparing and measuring them by local operations. Comment: 27 pages, LaTeX, 6 PS figures. To be submitted to Phys. Rev. A. Version 2: 30 pages, many small revisions and extensions, author added. Version 3: Proof in Appendix D corrected, many small changes; final version for Phys. Rev. A. Version 4: Report of Popescu conjecture modified.
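
    The two-qutrit construction referred to above is usually presented as nine "domino" product states. The short check below builds such a set and verifies numerically that it forms an orthonormal basis, which is the easy part of the result; the LOCC indistinguishability proved in the paper is, of course, not something a few lines of numerics can establish.

        # Build nine product states of two three-level systems (the "domino"
        # construction) and verify that they form an orthonormal basis.
        import numpy as np

        e = np.eye(3)                                       # |0>, |1>, |2>
        plus  = lambda i, j: (e[i] + e[j]) / np.sqrt(2)     # (|i> + |j>)/sqrt(2)
        minus = lambda i, j: (e[i] - e[j]) / np.sqrt(2)     # (|i> - |j>)/sqrt(2)

        states = [
            np.kron(e[1], e[1]),
            np.kron(e[0], plus(0, 1)), np.kron(e[0], minus(0, 1)),
            np.kron(e[2], plus(1, 2)), np.kron(e[2], minus(1, 2)),
            np.kron(plus(1, 2), e[0]), np.kron(minus(1, 2), e[0]),
            np.kron(plus(0, 1), e[2]), np.kron(minus(0, 1), e[2]),
        ]

        gram = np.array([[np.dot(a, b) for b in states] for a in states])
        assert np.allclose(gram, np.eye(9))
        print("nine product states on two qutrits form an orthonormal basis")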

    Quantum encryption with certified deletion

    Given a ciphertext, is it possible to prove the deletion of the underlying plaintext? Since classical ciphertexts can be copied, clearly such a feat is impossible using classical information alone. In stark contrast to this, we show that quantum encodings enable certified deletion. More precisely, we show that it is possible to encrypt classical data into a quantum ciphertext such that the recipient of the ciphertext can produce a classical string which proves to the originator that the recipient has relinquished any chance of recovering the plaintext should the decryption key be revealed. Our scheme is feasible with current quantum technology: the honest parties only require quantum devices for single-qubit preparation and measurements; the scheme is also robust against noise in these devices. Furthermore, we provide an analysis that is suitable in the finite-key regime. Comment: 28 pages, 1 figure. Some technical details modified.
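
    A heavily simplified, classical simulation of the single-qubit intuition is sketched below: data are hidden at computational-basis positions of a BB84-type encoding, and the deletion certificate is the outcome of measuring every position in the Hadamard basis. The parameters and the acceptance rule are illustrative assumptions rather than the exact scheme or its noise-robust verification.

        # Toy, noiseless simulation of the certified-deletion intuition.  This
        # is an illustrative, assumption-laden sketch, not the paper's scheme.
        import secrets

        n = 32
        theta = [secrets.randbelow(2) for _ in range(n)]  # 0 = computational, 1 = Hadamard basis
        bits  = [secrets.randbelow(2) for _ in range(n)]  # bits encoded into the BB84-type states

        # Honest recipient, asked to delete, measures every position in the
        # Hadamard basis: Hadamard-encoded positions reproduce their bit,
        # computational-basis positions give a uniformly random outcome.
        certificate = [bits[i] if theta[i] == 1 else secrets.randbelow(2) for i in range(n)]

        # The verifier, who knows theta and bits, accepts if the certificate is
        # correct at every Hadamard position.  The computational positions,
        # which carry the mask for the plaintext, are now lost to the recipient.
        accept = all(certificate[i] == bits[i] for i in range(n) if theta[i] == 1)
        print("deletion certificate accepted:", accept)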

    Daylight quantum key distribution over 1.6 km

    Quantum key distribution (QKD) has been demonstrated over a point-to-point ~1.6 km atmospheric optical path in full daylight. This record transmission distance brings QKD a step closer to surface-to-satellite and other long-distance applications. Comment: 4 pages, 2 figures, 1 table. Submitted to PRL on 14 January 2000 for publication consideration.

    Data visualization in yield component analysis: an expert study

    Even though data visualization is a common analytical tool in numerous disciplines, it has rarely been used in agricultural sciences, particularly in agronomy. In this paper, we discuss a study on employing data visualization to analyze a multiplicative model. This model is often used by agronomists, for example in so-called yield component analysis. The multiplicative model in agronomy is normally analyzed by statistical or related methods. In practice, unfortunately, the usefulness of these methods is limited: they help to answer only a few questions and do not allow a comprehensive view of the phenomena studied. We believe that data visualization could be used for such comprehensive analysis and presentation of the multiplicative model. To that end, we conducted an expert survey. It showed that visualization methods could indeed be useful for analysis and presentation of the multiplicative model.
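
    The multiplicative model in question expresses yield as a product of yield components; the sketch below evaluates such a model with hypothetical component values and takes logarithms, which turns the product into a sum and makes the components easy to chart as additive contributions. The component names and numbers are illustrative, not data from the study.

        # Multiplicative yield model with hypothetical component values, plus
        # the log transform that makes the components additive for plotting.
        import math

        components = {                  # hypothetical values, per square metre
            "plants_per_m2": 220.0,
            "ears_per_plant": 1.8,
            "grains_per_ear": 28.0,
            "grain_weight_g": 0.045,
        }

        yield_g_per_m2 = math.prod(components.values())
        log_contributions = {name: math.log(value) for name, value in components.items()}

        print(f"predicted yield: {yield_g_per_m2:.1f} g/m^2")
        for name, contribution in log_contributions.items():
            print(f"  log contribution of {name}: {contribution:+.3f}")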

    Tailoring Adjuvant Endocrine Therapy for Postmenopausal Breast Cancer: A CYP2D6 Multiple-Genotype-Based Modeling Analysis and Validation

    Purpose: Previous studies have suggested that postmenopausal women with breast cancer who present with wild-type CYP2D6 may actually have similar or superior recurrence-free survival outcomes when given tamoxifen in place of aromatase inhibitors (AIs). The present study established a CYP2D6 multiple-genotype-based model to determine the optimal endocrine therapy for patients harboring wild-type CYP2D6. Methods: We created a Markov model to determine whether tamoxifen or AIs maximized 5-year disease-free survival (DFS) for extensive-metabolizer (EM) patients, using annual hazard ratio (HR) data from the BIG 1-98 trial. We then replicated the model by evaluating 9-year event-free survival (EFS) using HR data from the ATAC trial. In addition, we employed two-way sensitivity analyses to explore the impact of the decreased-metabolizer (DM) hazard ratio and DM frequency on survival by studying a range of estimates. Results: The 5-year DFS of tamoxifen-treated EM patients was 83.3%, which is similar to that of genotypically unselected patients who received an AI (83.7%). In the validation study, we further demonstrated that the 9-year EFS of tamoxifen-treated EM patients was 81.4%, which is higher than that of genotypically unselected patients receiving tamoxifen (78.4%) and similar to that of patients receiving an AI (83.2%). Two-way sensitivity analyses demonstrated the robustness of the results.
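
    The kind of Markov cohort calculation the Methods describe can be sketched as a two-state model in which an annual baseline event risk is scaled by a hazard ratio and compounded over five years. The baseline risks and HR values below are hypothetical placeholders, not the BIG 1-98 or ATAC figures used in the study.

        # Two-state (disease-free vs. event) Markov sketch: scale annual baseline
        # event risks by a hazard ratio and compound to a 5-year DFS.  All
        # numbers are hypothetical placeholders, not trial data.
        def five_year_dfs(annual_event_risks, hazard_ratio):
            surviving = 1.0
            for p0 in annual_event_risks:
                p = 1.0 - (1.0 - p0) ** hazard_ratio   # annual risk under the given HR
                surviving *= 1.0 - p
            return surviving

        baseline = [0.03, 0.04, 0.04, 0.035, 0.03]     # hypothetical annual risks
        for hr in (1.0, 0.8):
            print(f"5-year DFS at HR {hr:.1f}: {five_year_dfs(baseline, hr):.3f}")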