
    A Discrete Fourier Kernel and Fraenkel's Tiling Conjecture

    Full text link
    The set B_{p,r}^q := \{\floor{nq/p+r} \colon n \in Z\} (with integers p, q, r) is a Beatty set with density p/q. We derive a formula for the Fourier transform \hat{B}_{p,r}^q(j) := \sum_{n=1}^p e^{-2 \pi i j \floor{nq/p+r} / q}. A. S. Fraenkel conjectured that there is essentially one way to partition the integers into m > 2 Beatty sets with distinct densities. We conjecture a generalization of this, and use Fourier methods to prove several special cases of our generalized conjecture. Comment: 24 pages, 6 figures (now with minor revisions and clarifications)
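As a concrete illustration (a minimal sketch, not code from the paper), one period of the Beatty set and the Fourier sum defined in the abstract can be computed directly:

```python
# Sketch: enumerate one period of the Beatty set
# B_{p,r}^q = { floor(n*q/p + r) : n in Z } and evaluate its Fourier transform
# hat{B}(j) = sum_{n=1}^{p} exp(-2*pi*i*j*floor(n*q/p + r) / q).
import cmath
import math

def beatty_period(p, q, r):
    """One period (n = 1..p) of the Beatty set with density p/q."""
    return [math.floor(n * q / p + r) for n in range(1, p + 1)]

def beatty_fourier(p, q, r, j):
    """hat{B}_{p,r}^q(j) as defined in the abstract."""
    return sum(cmath.exp(-2j * cmath.pi * j * math.floor(n * q / p + r) / q)
               for n in range(1, p + 1))

print(beatty_period(3, 7, 0))        # [2, 4, 7]
print(beatty_fourier(3, 7, 0, 0))    # (3+0j): at j = 0 every term is 1, so the sum is p
```

At j = 0 the sum trivially equals p; the paper's contribution is a closed form for general j.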

    An Investigation of Factors Influencing Algorithm Selection for High Dimensional Continuous Optimisation Problems

    Get PDF
    The problem of algorithm selection is of great importance to the optimisation community, with a number of publications present in the body of knowledge. This importance stems from the consequences of the No-Free-Lunch Theorem, which states that there cannot exist a single algorithm capable of solving all possible problems. However, despite this importance, the algorithm selection problem has yet to gain widespread attention. In particular, little to no work in this area has been carried out with a focus on large-scale optimisation; a field quickly gaining momentum in line with advancements in, and the influence of, big data processing. As such, it is not yet clear what factors, if any, influence the selection of algorithms for very high-dimensional problems (> 1000 dimensions), and it is entirely possible that algorithms that do not work well in lower dimensions may in fact work well in much higher dimensional spaces, and vice versa. This work therefore aims to begin addressing this knowledge gap by investigating some of these influencing factors for some common metaheuristic variants. To this end, typical parameters native to several metaheuristic algorithms are first tuned using the state-of-the-art automatic parameter tuner SMAC. Tuning produces separate parameter configurations of each metaheuristic for each of a set of continuous benchmark functions; specifically, for every algorithm-function pairing, configurations are found for each dimensionality of the function from a geometrically increasing scale (from 2 to 1500 dimensions). This tuning is highly computationally expensive, necessitating the use of SMAC. Using these sets of parameter configurations, a vast amount of performance data relating to the large-scale optimisation of our benchmark suite by each metaheuristic was subsequently generated.
From the generated data and its analysis, several behaviours presented by the metaheuristics as applied to large-scale optimisation have been identified and discussed. Further, this thesis provides a concise review of the relevant literature for other researchers looking to progress in this area, in addition to the large volume of data produced on the large-scale optimisation of our benchmark suite by the applied set of common metaheuristics. All work presented in this thesis was funded by EPSRC grant EP/J017515/1 through the DAASE project.
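A geometrically increasing dimensionality scale of the kind described above might look like the following (the exact step count and values used in the thesis are an assumption made for illustration):

```python
# Hypothetical sketch of a geometric dimensionality scale from 2 to 1500,
# of the kind used to tune each algorithm-function pairing per dimensionality.
import numpy as np

# Roughly geometric steps, rounded to integer dimensionalities.
dims = np.unique(np.geomspace(2, 1500, num=10).round().astype(int))
print(dims.tolist())
```

Each of these dimensionalities would then get its own SMAC-tuned configuration per algorithm-function pairing, which is what makes the tuning phase so expensive.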

    The effect of mark enhancement techniques on the subsequent detection of saliva

    Get PDF
    There appears to be a limited but growing body of research on the sequential analysis/treatment of multiple types of evidence. The development of an integrated forensic approach is necessary to maximise evidence recovery and to ensure that a particular treatment is not detrimental to other types of evidence. This study aims to assess the effect of latent and blood mark enhancement techniques (e.g. fluorescence, ninhydrin, acid violet 17, black iron-oxide powder suspension) on the subsequent detection of saliva. Saliva detection was performed by means of a presumptive test (Phadebas®) in addition to analysis by a rapid stain identification (RSID) kit test and confirmatory DNA testing. Additional variables included a saliva depletion series and a number of different substrates with varying porosities, as well as different ageing periods. Examination and photography under white light and fluorescence were carried out prior to and after chemical enhancement. All enhancement techniques (except Bluestar® Forensic Magnum luminol) employed in this study resulted in improved visualisation of the saliva stains, although the inherent fluorescence of saliva was sometimes blocked after chemical treatment. The use of protein stains was, in general, detrimental to the detection of saliva. Positive results were less pronounced after the use of black iron-oxide powder suspension, cyanoacrylate fuming followed by BY40, and ninhydrin when compared to the respective positive controls. The application of Bluestar® Forensic Magnum luminol and black magnetic powder proved to be the least detrimental, with no significant difference between the test results and the positive controls. The use of non-destructive fluorescence examination provided good visualisation; however, only the first few marks in the depletion were observed. Of the samples selected for DNA analysis, only depletion 1 samples contained sufficient DNA quantity for further processing using standard methodology.
The 28 day delay between sample deposition and collection resulted in a 5-fold reduction in the amount of usable DNA. When sufficient DNA quantities were recovered, enhancement techniques did not have a detrimental effect on the ability to generate DNA profiles. This study aims to contribute to a strategy for maximising evidence recovery and efficiency for the detection of latent marks and saliva. The results demonstrate that most of the enhancement techniques employed in this study were not detrimental to the subsequent detection of saliva by means of presumptive, confirmatory and DNA tests.

    On Perturbations of Unitary Minimal Models by Boundary Condition Changing Operators

    Get PDF
    In this note we consider boundary perturbations in the A-Series unitary minimal models by phi_{r,r+2} fields on superpositions of boundaries. In particular, we consider perturbations by boundary condition changing operators. Within conformal perturbation theory we explicitly map out the space of perturbative renormalisation group flows for the example phi_{1,3} and find that this sheds light on more general phi_{r,r+2} perturbations. Finally, we find a simple diagrammatic representation for the space of flows from a single Cardy boundary condition. Comment: 27 pages, 10 figures

    Labour and Employment Surveys of the Department of Labour

    Get PDF
    The Department of Labour has 5 major data collections the results of which are available to the public, viz. (1) the Quarterly Employment Survey (Q.E.S.); (2) the Job Vacancy Survey (J.V.S.); (3) Monthly Employment Operations (M.E.O.); (4) Immigration Permit Information (I.P.I.); and (5) Apprenticeship Statistics. The Department also maintains a computerized mailing list, for the use of Q.E.S. and J.V.S., the Central Address Register of Business (CARB). This paper does two things. First, it sets out details of each of these collections, and of CARB; secondly, it outlines some new developments which the Department has planned for the near future.

    A Renormalisation group for TCSA

    Full text link
    We discuss the errors introduced by level truncation in the study of boundary renormalisation group flows by the Truncated Conformal Space Approach (TCSA). We show that the TCSA results can have the qualitative form of a sequence of RG flows between different conformal boundary conditions. In the case of a perturbation by the field phi(13), we propose a renormalisation group equation for the coupling constant which predicts a fixed point at a finite value of the TCSA coupling constant, and we compare the predictions with data obtained using TBA equations. Comment: 11 pages, 7 figures, talk presented by G Watts at the workshop "Integrable Models and Applications: from Strings to Condensed Matter", Santiago de Compostela, Spain, 12-16 September 200

    Empirically Validating Conformal Prediction on Modern Vision Architectures Under Distribution Shift and Long-tailed Data

    Full text link
    Conformal prediction has emerged as a rigorous means of providing deep learning models with reliable uncertainty estimates and safety guarantees. Yet, its performance is known to degrade under distribution shift and long-tailed class distributions, which are often present in real-world applications. Here, we characterize the performance of several post-hoc and training-based conformal prediction methods under these settings, providing the first empirical evaluation on large-scale datasets and models. We show that across numerous conformal methods and neural network families, performance degrades greatly under distribution shift, violating safety guarantees. Similarly, we show that in long-tailed settings the guarantees are frequently violated on many classes. Understanding the limitations of these methods is necessary for deployment in real-world and safety-critical applications.
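For readers unfamiliar with the procedure being stress-tested, the following is a minimal sketch of split conformal prediction for classification, using synthetic data and the simple 1 − softmax nonconformity score (the paper's actual methods, models, and benchmarks are not reproduced here):

```python
# Hedged sketch of split conformal prediction for classification.
# Data below is synthetic; real use would take calibration scores from a model.
import numpy as np

def conformal_quantile(cal_scores, alpha):
    """Finite-sample-corrected (1 - alpha) quantile of calibration scores."""
    n = len(cal_scores)
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(cal_scores, min(q_level, 1.0), method="higher")

def prediction_sets(probs, qhat):
    """Include every class whose nonconformity score 1 - p is <= qhat."""
    return [np.where(1 - p <= qhat)[0] for p in probs]

rng = np.random.default_rng(0)
n_cal, n_classes = 500, 10
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)
cal_scores = 1 - cal_probs[np.arange(n_cal), cal_labels]

alpha = 0.1  # target 90% marginal coverage
qhat = conformal_quantile(cal_scores, alpha)
sets = prediction_sets(rng.dirichlet(np.ones(n_classes), size=5), qhat)
print([len(s) for s in sets])
```

The coverage guarantee behind `qhat` assumes the calibration and test data are exchangeable; the abstract's point is precisely that distribution shift and long-tailed classes break that assumption in practice.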