
    Sequential Complexity as a Descriptor for Musical Similarity

    We propose string compressibility as a descriptor of temporal structure in audio, for the purpose of determining musical similarity. Our descriptors are based on computing track-wise compression rates of quantised audio features, using multiple temporal resolutions and quantisation granularities. To verify that our descriptors capture musically relevant information, we incorporate our descriptors into similarity rating prediction and song year prediction tasks. We base our evaluation on a dataset of 15,500 track excerpts of Western popular music, for which we obtain 7,800 web-sourced pairwise similarity ratings. To assess the agreement among similarity ratings, we perform an evaluation under controlled conditions, obtaining a rank correlation of 0.33 between intersected sets of ratings. Combined with bag-of-features descriptors, we obtain performance gains of 31.1% and 10.9% for similarity rating prediction and song year prediction. For both tasks, analysis of selected descriptors reveals that representing features at multiple time scales benefits prediction accuracy. Comment: 13 pages, 9 figures, 8 tables. Accepted version.
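    To make the descriptor concrete, here is a minimal sketch (not the authors' code) of a track-wise compression-rate descriptor, using zlib as a stand-in compressor; the feature sequence, quantisation levels, and hop sizes below are invented for illustration:

```python
import zlib

import numpy as np

def compression_rate(values, n_levels=16, hop=1):
    """Compression rate of a quantised, temporally downsampled feature sequence.

    values: 1-D array of per-frame feature values (hypothetical input).
    n_levels: quantisation granularity (number of discrete symbols).
    hop: temporal resolution (keep every hop-th frame).
    """
    x = np.asarray(values, dtype=float)[::hop]
    lo, hi = x.min(), x.max()
    if hi == lo:
        q = np.zeros(len(x), dtype=np.uint8)
    else:
        # Uniform quantisation onto n_levels symbols.
        q = np.minimum((n_levels * (x - lo) / (hi - lo)).astype(np.uint8), n_levels - 1)
    raw = q.tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

# Descriptor: compression rates across several time scales and granularities.
rng = np.random.default_rng(0)
feature = np.cumsum(rng.standard_normal(5000))  # stand-in for one track's feature curve
descriptor = [compression_rate(feature, n_levels=k, hop=h)
              for k in (4, 16, 64) for h in (1, 2, 4)]
print(descriptor)
```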

    Identifying Cover Songs Using Information-Theoretic Measures of Similarity

    This work is licensed under a Creative Commons Attribution 3.0 License. For more information, see http://creativecommons.org/licenses/by/3.0/. This paper investigates methods for quantifying similarity between audio signals, specifically for the task of cover song detection. We consider an information-theoretic approach, where we compute pairwise measures of predictability between time series. We compare discrete-valued approaches operating on quantized audio features to continuous-valued approaches. In the discrete case, we propose a method for computing the normalized compression distance, in which we account for correlation between time series. In the continuous case, we propose to compute information-based measures of similarity as statistics of the prediction error between time series. We evaluate our methods on two cover song identification tasks, using a dataset comprising 300 Jazz standards and using the Million Song Dataset. For both datasets, we observe that continuous-valued approaches outperform discrete-valued approaches. We consider approaches to estimating the normalized compression distance (NCD) based on string compression and prediction, where we observe that our proposed normalized compression distance with alignment (NCDA) improves average performance over NCD for sequential compression algorithms. Finally, we demonstrate that continuous-valued distances may be combined to improve performance with respect to baseline approaches. Using a large-scale filter-and-refine approach, we demonstrate state-of-the-art performance for cover song identification using the Million Song Dataset. The work of P. Foster was supported by an Engineering and Physical Sciences Research Council Doctoral Training Account studentship.
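    The standard normalized compression distance is NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(.) is the compressed length. The sketch below (not the authors' code) implements this with zlib; the alignment step in ncda, a minimum over circular shifts, is only an illustrative guess at how "alignment" might be realised, since the paper's exact procedure is not reproduced here:

```python
import zlib

def C(b: bytes) -> int:
    """Compressed length, the stand-in for the compressor C(.)."""
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance NCD(x, y)."""
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

def ncda(x: bytes, y: bytes, max_shift: int = 32) -> float:
    """Alignment-aware variant: minimum NCD over circular shifts of y (illustrative only)."""
    return min(ncd(x, y[s:] + y[:s]) for s in range(max_shift))

# Toy quantised chroma-like sequences; b is a transposed copy of a.
a = bytes(i % 12 for i in range(400))
b = bytes((i + 5) % 12 for i in range(400))
print(ncd(a, b), ncda(a, b))
```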

    Identification of Cover Songs Using Information Theoretic Measures of Similarity

    13 pages, 5 figures, 4 tables. v3: Accepted version.

    Dipole studies on organometallic compounds

    Not available.

    A Graphical Language for Proof Strategies

    Complex automated proof strategies are often difficult to extract, visualise, modify, and debug. Traditional tactic languages, often based on stack-based goal propagation, make it easy to write proofs that obscure the flow of goals between tactics and are fragile to minor changes in input, proof structure, or the tactics themselves. Here, we address this by introducing a graphical language called PSGraph for writing proof strategies. Strategies are constructed visually by "wiring together" collections of tactics and evaluated by propagating goal nodes through the diagram via graph rewriting. Tactic nodes can have many output wires, and use a filtering procedure based on goal-types (predicates describing the features of a goal) to decide where best to send newly-generated sub-goals. In addition to making the flow of goal information explicit, the graphical language can fulfil the role of many tacticals using visual idioms like branching, merging, and feedback loops. We argue that this language enables the development of more robust proof strategies and provide several examples, along with a prototype implementation in Isabelle.
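    As a rough illustration of the evaluation model, not the PSGraph implementation itself, the toy sketch below treats tactics as functions from a goal to subgoals, wires as goal-type predicates, and evaluation as propagation of goals through the graph; all names and the string representation of goals are invented:

```python
from dataclasses import dataclass, field
from typing import Callable

Goal = str  # in this toy model a goal is just a string

@dataclass
class Wire:
    goal_type: Callable[[Goal], bool]  # predicate deciding whether a goal may travel this wire
    target: "Node"

@dataclass
class Node:
    name: str
    tactic: Callable[[Goal], list]     # a tactic maps one goal to a list of subgoals
    out: list = field(default_factory=list)

def evaluate(entry: Node, goal: Goal) -> list:
    """Propagate a goal through the strategy graph; return subgoals no wire accepts."""
    pending, open_goals = [(entry, goal)], []
    while pending:
        node, g = pending.pop()
        for sub in node.tactic(g):
            wire = next((w for w in node.out if w.goal_type(sub)), None)
            if wire is None:
                open_goals.append(sub)            # stuck: no goal-type matches
            else:
                pending.append((wire.target, sub))
    return open_goals

# A tiny strategy with a feedback loop: 'split' peels one conjunct at a time,
# sending remaining conjunctions back to itself and atoms on to 'assume'.
assume = Node("assume", lambda g: [])             # discharges whatever it receives
split = Node("split", lambda g: g.split(" & ", 1))
split.out = [Wire(lambda g: " & " in g, split),
             Wire(lambda g: True, assume)]
print(evaluate(split, "a & b & c"))               # -> [] (all subgoals discharged)
```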

    Shear horizontal (SH) ultrasound wave propagation around smooth corners

    Shear horizontal (SH) ultrasound guided waves are being used in an increasing number of non-destructive testing (NDT) applications. One advantage SH waves have over some other wave types is their ability to propagate around curved surfaces with little energy loss; to understand the geometries around which they could propagate, the wave reflection must be quantified. A 0.83 mm thick aluminium sheet was placed in a bending machine, and a shallow bend was introduced. Periodically-poled magnet (PPM) electromagnetic acoustic transducers (EMATs), for emission and reception of SH waves, were placed on the same side of the bend, so that reflected waves were received. Additional bending of the sheet demonstrated a clear relationship between bend angle and the reflected signal. Models suggest that the reflection is a linear superposition of the reflections from each bend segment, such that sharp turns lead to a larger peak-to-peak amplitude, in part due to increased phase coherence.
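    A minimal sketch of the linear-superposition picture (not the authors' model; reflection magnitudes, frequency, and wave speed are assumed values) sums the per-segment reflections with their round-trip phases, showing why a sharp bend, with its nearly aligned phases, yields a larger net reflection than a gentle one:

```python
import numpy as np

def total_reflection(segment_r, positions, frequency, wave_speed):
    """Coherent (linear superposition) sum of reflections from each bend segment.

    segment_r: reflection coefficient magnitude of each small bend segment
    positions: distance of each segment from the transducer (m)
    frequency: wave frequency (Hz); wave_speed: SH wave speed (m/s)
    """
    k = 2 * np.pi * frequency / wave_speed
    # Each segment contributes with its round-trip phase 2*k*d.
    return np.sum(np.asarray(segment_r) * np.exp(2j * k * np.asarray(positions)))

r = np.full(10, 0.02)                    # equal per-segment reflection magnitudes
sharp = np.linspace(0.100, 0.101, 10)    # segments packed into 1 mm: phases nearly aligned
gentle = np.linspace(0.100, 0.140, 10)   # the same segments spread over 40 mm
f, c = 0.25e6, 3100.0                    # ~250 kHz, nominal SH0 speed in aluminium
print(abs(total_reflection(r, sharp, f, c)),   # larger peak-to-peak reflection
      abs(total_reflection(r, gentle, f, c)))  # partial cancellation
```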

    Ultrasonic metal sheet thickness measurement without prior wave speed calibration

    Conventional ultrasonic measurement of sample thickness from one side requires the bulk-wave reverberation time and a calibrated wave speed. This speed changes with temperature, stress, and microstructure, limiting thickness measurement accuracy, and often only one side of a sample is accessible, making in-situ calibration impossible. Non-contact ultrasound can generate multiple shear horizontal guided wave modes on one side of a metal plate. Measuring the propagation times of each mode at different transducer separations allows sheet thickness to be calculated to better than 1% accuracy for sheets of at least 1.5 mm thickness, without any calibration.
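    One plausible reading of the method, based on standard SH-mode dispersion relations rather than the paper itself: a linear fit of transducer separation against arrival time gives each mode's group velocity without knowing the trigger delay; the non-dispersive SH0 mode then supplies the shear speed in situ, and inverting the SH1 dispersion relation yields the thickness. All values below are assumed for the synthetic check:

```python
import numpy as np

def group_velocity(separations, arrival_times):
    """Slope of separation vs arrival time: group velocity, independent of trigger delay."""
    slope, _ = np.polyfit(arrival_times, separations, 1)
    return slope

def thickness_from_sh_modes(cg0, cg1, f):
    """Invert the SH1 dispersion relation for plate thickness d.

    SH0 is non-dispersive, so its group velocity cg0 equals the bulk shear
    speed c_s (the in-situ calibration). For mode n,
        c_g = c_s * sqrt(1 - (n * c_s / (2 * f * d))**2),
    so with n = 1:  d = c_s / (2 * f * sqrt(1 - (cg1 / c_s)**2)).
    """
    cs = cg0
    return cs / (2 * f * np.sqrt(1 - (cg1 / cs) ** 2))

# Synthetic round trip: 2 mm aluminium plate, assumed c_s = 3100 m/s, f = 2 MHz
# (above the SH1 cut-off frequency c_s / (2 * d) = 775 kHz).
cs, d, f = 3100.0, 2e-3, 2e6
cg1 = cs * np.sqrt(1 - (cs / (2 * f * d)) ** 2)
print(thickness_from_sh_modes(cs, cg1, f))  # recovers ~2.0e-3 m
```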

    High-precision radiocarbon dating of the construction phase of Oakbank Crannog, Loch Tay, Perthshire

    Many of the Loch Tay crannogs were built in the Early Iron Age, and so calibration of the radiocarbon ages produces very broad calendar age ranges due to the well-documented Hallstatt plateau in the calibration curve. However, the large oak timbers that were used in the construction of some of the crannogs potentially provide a means of improving the precision of the dating, through subdividing them into decadal or subdecadal increments, dating them to high precision, and wiggle-matching the resulting data to the master ¹⁴C calibration curve. We obtained a sample from 1 oak timber from Oakbank Crannog comprising 70 rings (Sample OB06 WMS 1, T103), including sapwood that was complete to the bark edge. The timber is situated on the northeast edge of the main living area of the crannog and, as a large and strong oak pile, would have been a useful support in more than 1 phase of occupation and may be related to the earliest construction phase of the site. It was sectioned into 5-yr increments and dated to a precision of approximately ±8–16 ¹⁴C yr (1σ). The wiggle-match predicts that the last ring dated was formed around 500 BC (maximum range of 520–465 BC), which should be taken as indicative of the likely time of construction of Oakbank Crannog. This is a considerable improvement on the estimates based on single ¹⁴C ages made on oak samples, which typically encompassed the period from around 800–400 BC.
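    A minimal sketch of the wiggle-matching step (function names and data layout invented; real analyses use calibration software against a curve such as IntCal): slide the sequence of dated increments, whose relative spacing is fixed by the ring counts, along the calibration curve and keep the calendar position that minimises chi-square:

```python
import numpy as np

def wiggle_match(ring_offsets, c14_ages, c14_errs, cal_years, cal_c14, cal_errs):
    """Best calendar year for the final ring by sliding the sequence along the curve.

    ring_offsets: calendar-year offsets of each dated increment behind the final ring
    c14_ages, c14_errs: measured 14C ages and 1-sigma errors per increment
    cal_years, cal_c14, cal_errs: calibration curve (cal_years must be increasing)
    """
    offsets = np.asarray(ring_offsets, dtype=float)
    ages = np.asarray(c14_ages, dtype=float)
    errs = np.asarray(c14_errs, dtype=float)
    best_year, best_chi2 = None, np.inf
    for year in cal_years:
        t = year - offsets                        # calendar year of each increment
        curve = np.interp(t, cal_years, cal_c14)  # curve's 14C age at those years
        sigma2 = errs**2 + np.interp(t, cal_years, cal_errs)**2
        chi2 = np.sum((ages - curve) ** 2 / sigma2)
        if chi2 < best_chi2:
            best_year, best_chi2 = year, chi2
    return best_year, best_chi2
```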

    Speckle-visibility spectroscopy: A tool to study time-varying dynamics

    We describe a multispeckle dynamic light scattering technique capable of resolving the motion of scattering sites in cases where this motion changes systematically with time. The method is based on the visibility of the speckle pattern formed by the scattered light, as detected by a single exposure of a digital camera. Whereas previous multispeckle methods rely on correlations between images, here the connection with scattering site dynamics is made more simply, in terms of the variance of intensity among the pixels of the camera for the specified exposure duration. The essence is that the speckle pattern is more visible, i.e. the variance of detected intensity levels is greater, when the dynamics of the scattering site motion is slow compared to the exposure time of the camera. The theory for analyzing the moments of the spatial intensity distribution in terms of the electric field autocorrelation is presented. It is demonstrated for two well-understood samples, a colloidal suspension of Brownian particles and a coarsening foam, where the dynamics can be treated as stationary. However, the method is particularly appropriate for samples in which the dynamics vary with time, either slowly or rapidly, limited only by the exposure-time fidelity of the camera. Potential applications range from soft glassy materials, to granular avalanches, to flowmetry of living tissue. Comment: review, theory and experiment.
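    A minimal sketch of the core measurement (not the authors' code): the speckle visibility of a single exposure is the normalised second moment of the pixel intensities, and the contrast K² = var(I)/⟨I⟩² approaches 1 for static, fully developed speckle and falls as motion blurs the pattern within the exposure:

```python
import numpy as np

def speckle_visibility(image):
    """Visibility of one exposure: V2 = <I^2>/<I>^2, contrast K^2 = V2 - 1 = var(I)/<I>^2."""
    I = np.asarray(image, dtype=float)
    v2 = np.mean(I**2) / np.mean(I) ** 2
    return v2, v2 - 1.0

# Static, fully developed speckle has exponential intensity statistics, so K^2 -> 1;
# motion during the exposure blurs the pattern and drives K^2 toward 0.
static = np.random.default_rng(1).exponential(scale=100.0, size=(512, 512))
print(speckle_visibility(static))  # contrast close to 1
```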