
    Leisure studies education: Historical trends and pedagogical futures in the United Kingdom and beyond

    This paper is an attempt to stimulate debate about the decline of leisure studies and the rise of courses and subject fields defined by sport, events, and tourism management. It is argued that although this decline has happened, there are two possible futures for a re-purposed leisure studies that would ensure its survival.

    On the chiral and deconfinement phase transitions in parity-conserving QED_3 at finite temperature

    We present some results about the interplay between the chiral and deconfinement phase transitions in parity-conserving QED_3 (with N flavours of massless 4-component fermions) at finite temperature. Following Grignani et al. (Phys. Rev. D 53, 7157 (1996); Nucl. Phys. B 473, 143 (1996)), confinement is discussed in terms of an effective sine-Gordon theory for the timelike component of the gauge field A_0. But whereas in the references above the fermion mass m is a Lagrangian parameter, we consider the m=0 case and ask whether an effective sine-Gordon theory can again be derived with m replaced by the dynamically generated mass Sigma which appears below T_{ch}, the critical temperature for the chiral phase transition. The fermion and gauge sectors are strongly interdependent, but as a first approximation we decouple them by taking Sigma to be a constant, depending only on the constant part of the gauge field. We argue that the existence of a low-temperature confining phase may be associated with the generation of Sigma; and that, analogously, the vanishing of Sigma for T > T_{ch} drives the system to its deconfining phase. The effect of the gauge field dynamics on mass generation is also indicated.
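
    For concreteness, a sine-Gordon effective theory of the kind described has the schematic form below. This is a generic sketch only: the coefficients a and b are placeholders rather than values derived in the paper, and the Sigma-dependence of the cosine term expresses the abstract's proposed mechanism, not a quoted result.

```latex
% Schematic: sine-Gordon effective theory for the timelike component A_0,
% with the Lagrangian mass m replaced by the dynamically generated mass
% \Sigma(T). a, b are placeholder coefficients (assumed, not from the paper).
\mathcal{L}_{\mathrm{eff}}[A_0] =
  \tfrac{1}{2}\,(\partial_i A_0)^2
  + a\,\Sigma(T)\,\cos\!\bigl(b\,A_0/T\bigr),
\qquad
\Sigma(T) = 0 \quad \text{for } T > T_{\mathrm{ch}}
```

    On this reading, the cosine (confining) term switches off above T_{ch}, which is the proposed link between chiral symmetry restoration and deconfinement.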

    Bayesian filtering unifies adaptive and non-adaptive neural network optimization methods

    We formulate the problem of neural network optimization as Bayesian filtering, where the observations are the backpropagated gradients. While neural network optimization has previously been studied using natural gradient methods which are closely related to Bayesian inference, they were unable to recover standard optimizers such as Adam and RMSprop with a root-mean-square gradient normalizer, instead getting a mean-square normalizer. To recover the root-mean-square normalizer, we find it necessary to account for the temporal dynamics of all the other parameters as they are being optimized. The resulting optimizer, AdaBayes, adaptively transitions between SGD-like and Adam-like behaviour, automatically recovers AdamW, a state-of-the-art variant of Adam with decoupled weight decay, and has generalisation performance competitive with SGD.
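
    For reference, the root-mean-square normalizer at issue is the square root in the standard Adam update. A minimal sketch of standard Adam (not the AdaBayes algorithm itself, whose update is not given in the abstract):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One standard Adam update, highlighting the RMS gradient normalizer."""
    m = b1 * m + (1 - b1) * grad        # EMA of gradients (first moment)
    v = b2 * v + (1 - b2) * grad**2     # EMA of squared gradients (second moment)
    m_hat = m / (1 - b1**t)             # bias corrections for EMA warm-up
    v_hat = v / (1 - b2**t)
    # sqrt(v_hat) is the root-mean-square normalizer; a filtering derivation
    # that ignored the other parameters' dynamics would yield v_hat itself
    # (a mean-square normalizer), per the abstract's observation.
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```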

    Tensor Monte Carlo: particle methods for the GPU era

    Multi-sample, importance-weighted variational autoencoders (IWAE) give tighter bounds and more accurate uncertainty estimates than variational autoencoders (VAE) trained with a standard single-sample objective. However, IWAEs scale poorly: as the latent dimensionality grows, they require exponentially many samples to retain the benefits of importance weighting. While sequential Monte Carlo (SMC) can address this problem, it is prohibitively slow because the resampling step imposes sequential structure which cannot be parallelised, and moreover, resampling is non-differentiable, which is problematic when learning approximate posteriors. To address these issues, we developed tensor Monte Carlo (TMC), which gives exponentially many importance samples by separately drawing K samples for each of the n latent variables, then averaging over all K^n possible combinations. While the sum over exponentially many terms might seem to be intractable, in many cases it can be computed efficiently as a series of tensor inner-products. We show that TMC is superior to IWAE on a generative model with multiple stochastic layers trained on the MNIST handwritten digit database, and we show that TMC can be combined with standard variance reduction techniques.
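
    As an illustration of the combinatorial averaging, here is a minimal sketch of a TMC-style bound for a hypothetical two-layer chain p(z1) p(z2|z1) p(x|z2) with K samples per latent. It materialises all K^2 weights for clarity; the paper's point is that for longer chains the sum factorises into tensor inner-products, avoiding the exponential cost.

```python
import numpy as np
from scipy.special import logsumexp

def tmc_bound_two_layer(logp_z1, logp_z2_given_z1, logp_x_given_z2,
                        logq_z1, logq_z2):
    """TMC-style bound averaging over all K^2 sample combinations.

    logp_z1, logq_z1, logq_z2, logp_x_given_z2: shape (K,)
    logp_z2_given_z1: shape (K, K), entry [i, j] = log p(z2_j | z1_i)
    """
    K = logp_z1.shape[0]
    # log importance weight for every (z1_i, z2_j) combination
    logw = (logp_z1[:, None] - logq_z1[:, None]     # prior / proposal for z1
            + logp_z2_given_z1 - logq_z2[None, :]   # conditional / proposal for z2
            + logp_x_given_z2[None, :])             # likelihood term
    # log of the average weight over all K^2 combinations
    return logsumexp(logw) - 2 * np.log(K)
```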

    Compositional data analysis of geological variability and process: a case study

    Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability and the possible processes associated with compositional data sets from many disciplines. In this paper, we concentrate on geochemical data. First, we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of sub-compositional discrimination and specific perturbational change. Then we develop through standard methodology, such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, such as the singular value decomposition of a compositional data set. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major oxide and trace element compositions of metamorphosed limestones from the Grampian Highlands of Scotland. Finally, we discuss some unresolved problems in the statistical analysis of compositional processes.
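
    As a sketch of the staying-in-the-simplex machinery, the following applies a standard Aitchison-style centred log-ratio transform and singular value decomposition to toy data (not the paper's 209-specimen data set):

```python
import numpy as np

def clr(X):
    """Centred log-ratio transform: maps D-part compositions to R^D."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.dirichlet(np.ones(5), size=100)   # toy 5-part compositional data set

Z = clr(X)
Z -= Z.mean(axis=0)                       # centre across specimens
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
# s**2 / (s**2).sum() gives the share of compositional variability captured
# by each component; rows of Vt are candidate (clr-space) process directions.
```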

    A Review of State-of-the-Art Large-Sized Foam Cutting Rapid Prototyping and Manufacturing Technologies

    Purpose – Current additive rapid prototyping (RP) technologies fail to efficiently produce objects greater than 0.5 m³ due to restrictions in build size, build time and cost. A need exists to develop RP and manufacturing technologies capable of producing large objects in a rapid manner directly from computer-aided design data. Foam cutting RP is a relatively new technology capable of producing large complex objects using inexpensive materials. The purpose of this paper is to describe nine such technologies that have been developed or are currently being developed at institutions around the world. The relative merits of each system are discussed. Recommendations are given with the aim of enhancing the performance of existing and future foam cutting RP systems. Design/methodology/approach – The review is based on an extensive literature review covering academic publications, company documents and website information. Findings – The paper provides insights into the different machine configurations and cutting strategies. The most successful machines and cutting strategies are identified. Research limitations/implications – Most of the foam cutting RP systems described have not been developed to the commercial level, thus a benchmark study directly comparing the nine systems was not possible. Originality/value – This paper provides the first overview of foam cutting RP technology, a field which is over a decade old. The information contained in this paper will help improve future developments in foam cutting RP systems.

    Tourism Impact of Wind Farms: Submitted to Renewables Inquiry, Scottish Government


    Using the Finite Element Method to Determine the Temperature Distributions in Hot-wire Cutting

    Hot-wire cutting is a common material removal process used to shape and sculpt plastic foam materials, such as expanded polystyrene (EPS). Due to their low cost and sculptability, plastic foams are popular materials for large-sized (> 1 m³) prototypes and bespoke visual artefacts. Recent developments in robotic foam sculpting machines have greatly increased the ability of hot tools to sculpt complex geometrical surfaces, bringing the subject into the realm of subtractive rapid prototyping/manufacturing. Nevertheless, foam-cut objects are not being exploited to their full potential due to the common perception that hot-wires are a low-accuracy cutting tool. If greater accuracy for hot-wires can be obtained, it could provide a low-cost method of producing high-value functional engineering parts. Polystyrene patterns for lost foam casting are one such possibility. A nonlinear transient thermal finite element model was developed with the purpose of predicting the kerf width of hot-wire cut foams. Accurate prediction of the kerf width during cutting will allow the tool paths to be corrected off-line at the tool definition stage of the CAM process. Finite element analysis software (ANSYS) was used to simulate hot-wire plastic foam cutting. The material property models were compiled from experimental data and commonly accepted values found in the literature. The simulations showed good agreement with the experimental data and thus the model is thought to be reliable. The simulations provide an effective method of predicting kerf widths under steady-state cutting conditions. Limitations and further developments to the model are described.
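
    To make the modelling task concrete, here is a much-simplified 1D explicit finite-difference stand-in for the transient heat problem, with typical EPS property values and a softening temperature assumed for illustration (the paper itself uses a nonlinear transient finite element model in ANSYS, not this simplification):

```python
import numpy as np

# 1D transient heat conduction away from a hot wire, explicit scheme.
# Property values are typical of EPS, assumed for illustration.
k, rho, cp = 0.035, 20.0, 1300.0        # W/(m.K), kg/m^3, J/(kg.K)
alpha = k / (rho * cp)                  # thermal diffusivity, m^2/s
dx, dt = 1e-4, 1e-3                     # grid spacing (m), time step (s)
assert alpha * dt / dx**2 <= 0.5        # explicit-scheme stability limit

T = np.full(200, 25.0)                  # foam initially at ambient, deg C
for _ in range(5000):                   # 5 s of simulated cutting time
    T[0] = 300.0                        # wire-temperature boundary condition
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

# Cells hotter than the EPS softening point (~100 deg C, assumed) give a
# crude estimate of the half kerf width on one side of the wire.
half_kerf = dx * np.count_nonzero(T > 100.0)
```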