256 research outputs found

    A study of the social and physical environment in catering kitchens and the role of the chef in promoting positive health and safety behaviour

    This is the account of a mixed-methods study of chefs and their kitchens, undertaken to identify the nature of their workplace and how it affects their ability to manage health and safety in the kitchen. It included extended periods of observation, monitoring of physical parameters, analysis of records of reported accidents, and a series of reflexive interviews. The findings were integrated and then fed back in a smaller number of second interviews to test whether they fitted with the chefs' understanding of their world. Major factors identified included survival in a market environment, the status of the chef (and the kitchen) within organisations, the marked autocracy of chefs, and an increasing tempo building up to service time with commensurate heat, noise, and activity. In particular, a threshold shift in risk tolerance was identified during this crescendo. The factors, their interplay, and their implications for health and safety in the catering kitchen are discussed.

    1,3-allylic strain as a strategic diversification element for constructing libraries of substituted 2-arylpiperidines

    Flipping diversity: Minimization of 1,3-allylic strain (A^{1,3} strain) is a recurring element in the design of a stereochemically and spatially diverse collection of 2-arylpiperidines. A^{1,3} strain guides the regioselective addition of nucleophiles, and N-substituents leverage A^{1,3} strain to direct each stereoisomer to two different conformer populations, thus doubling the number of library members.

    Mersenne Primes, Polygonal Anomalies and String Theory Classification

    It is pointed out that the Mersenne primes $M_p = 2^p - 1$ and the associated perfect numbers ${\cal M}_p = 2^{p-1} M_p$ play a significant role in string theory; this observation may suggest a classification of consistent string theories. Comment: 10 pages, LaTeX.
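
    The numerical fact quoted above is easy to check directly: whenever $2^p - 1$ is prime, $2^{p-1}(2^p - 1)$ is an even perfect number. The short Python sketch below verifies this for small exponents; it is illustrative only and is not taken from the paper, which concerns the string-theoretic role of these numbers.

        # Minimal check of the definitions quoted above: a Mersenne prime is
        # M_p = 2**p - 1 for prime p, and the associated perfect number is
        # 2**(p - 1) * M_p.  Illustrative only, not code from the paper.

        def is_prime(n: int) -> bool:
            """Trial-division primality test; adequate for small exponents."""
            if n < 2:
                return False
            d = 2
            while d * d <= n:
                if n % d == 0:
                    return False
                d += 1
            return True

        def mersenne_perfect_pairs(limit: int):
            """Yield (p, M_p, perfect number) for primes p <= limit with M_p prime."""
            for p in range(2, limit + 1):
                if is_prime(p) and is_prime(2**p - 1):
                    m = 2**p - 1
                    yield p, m, 2**(p - 1) * m

        for p, m, perfect in mersenne_perfect_pairs(7):
            # Sanity check: a perfect number equals the sum of its proper divisors.
            assert perfect == sum(d for d in range(1, perfect) if perfect % d == 0)
            print(p, m, perfect)   # (2, 3, 6), (3, 7, 28), (5, 31, 496), (7, 127, 8128)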

    Enhanced Symmetries in Multiparameter Flux Vacua

    We give a construction of type IIB flux vacua with discrete R-symmetries and vanishing superpotential for hypersurfaces in weighted projective space with any number of moduli. We find that the existence of such vacua for a given space depends on properties of the modular group, and for Fermat models can be determined solely by the weights of the projective space. The periods of the geometry do not in general have arithmetic properties, but live in a vector space whose properties are vital to the construction. Comment: 32 pages, LaTeX. v2: references added.

    Use of dispersion modelling for Environmental Impact Assessment of biological air pollution from composting: Progress, problems and prospects

    With the increase in composting as a sustainable waste management option, biological air pollution (bioaerosols) from composting facilities has become a cause of increasing concern due to its potential health impacts. Estimating community exposure to bioaerosols is problematic due to limitations in current monitoring methods. Atmospheric dispersion modelling can be used to estimate exposure concentrations; however, several issues arise from the lack of appropriate bioaerosol data to use as inputs into models and from the complexity of the emission sources at composting facilities. This paper analyses current progress in using dispersion models for bioaerosols, examines the remaining problems, and provides recommendations for future prospects in this area. A key finding is the urgent need for guidance for model users to ensure consistent bioaerosol modelling practices.
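
    For orientation only (the paper reviews modelling practice rather than prescribing a particular model), the Python sketch below shows the kind of calculation an atmospheric dispersion model performs: a steady-state Gaussian plume estimate of downwind concentration from a single point source. The emission rate q stands in for the bioaerosol source-term data whose scarcity the paper highlights, and the dispersion-coefficient constants are placeholder assumptions, not values from the paper.

        import math

        def gaussian_plume(q, u, x, y, z, h, a=0.08, b=1e-4, c=0.06, d=1.5e-3):
            """Steady-state Gaussian plume concentration at a receptor (q-units per m^3).

            q -- emission rate (e.g. CFU/s for a bioaerosol source); u -- wind speed (m/s);
            x, y, z -- downwind, crosswind and vertical receptor coordinates (m);
            h -- effective release height (m); a, b, c, d -- placeholder dispersion constants.
            """
            # Plume spread grows with downwind distance; these simple expressions stand in
            # for the stability-class parameterisations a real dispersion model would use.
            sigma_y = a * x / math.sqrt(1.0 + b * x)
            sigma_z = c * x / math.sqrt(1.0 + d * x)
            crosswind = math.exp(-y**2 / (2.0 * sigma_y**2))
            vertical = (math.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                        + math.exp(-(z + h)**2 / (2.0 * sigma_z**2)))  # ground reflection
            return q / (2.0 * math.pi * u * sigma_y * sigma_z) * crosswind * vertical

        # Example: receptor 500 m downwind, at breathing height, from a 10 m high source.
        print(gaussian_plume(q=1e6, u=3.0, x=500.0, y=0.0, z=1.5, h=10.0))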

    The valence-fluctuating ground state of plutonium

    A central issue in materials science is to obtain an understanding of the electronic correlations that control complex materials. Such electronic correlations frequently arise because of the competition between localized and itinerant electronic degrees of freedom. Although the respective limits of well-localized or entirely itinerant ground states are well understood, the intermediate regime that controls the functional properties of complex materials continues to challenge theoretical understanding. We have used neutron spectroscopy to investigate plutonium, a prototypical material at the brink between bonding and nonbonding configurations. Our study reveals that the ground state of plutonium is governed by valence fluctuations, that is, a quantum mechanical superposition of localized and itinerant electronic configurations, as recently predicted by dynamical mean-field theory. Our results not only resolve the long-standing controversy between experiment and theory on plutonium’s magnetism but also suggest an improved understanding of the effects of such electronic dichotomy in complex materials.

    Consumption caught in the cash nexus.

    During the last thirty years, ‘consumption’ has become a major topic in the study of contemporary culture within anthropology, psychology and sociology. For many authors it has become central to understanding the nature of material culture in the modern world, but this paper argues that the concept is, in British writing at least, too concerned with its economic origins in the selling and buying of consumer goods or commodities. It is argued that understanding material culture as determined through the monetary exchange for things - the cash nexus - leads to an inadequate sociological understanding of the social relations with objects. The work of Jean Baudrillard is used both to critique the concept of consumption, insofar as it leads to a focus on advertising, choice, money and shopping, and to point to a more sociologically adequate approach to material culture that explores objects in a system of models and series, ‘atmosphere’, functionality, biography, interaction and mediation.

    Low Complexity Regularization of Linear Inverse Problems

    Inverse problems and regularization theory form a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This has proved efficient in many problems routinely encountered in imaging sciences, statistics and machine learning. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low complexity. These priors encompass, as popular examples, sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models that can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward an understanding of the theoretical properties of the so-regularized solutions. It covers a large spectrum including: (i) recovery guarantees and stability to noise, both in terms of $\ell^2$-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solving the corresponding large-scale regularized optimization problem.
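
    As a concrete illustration of the forward-backward proximal splitting scheme named above, applied to the simplest prior in the list (sparsity, via the ℓ1 norm), the Python sketch below minimises 0.5·||y − Ax||² + λ||x||₁ by alternating a gradient step on the smooth data-fidelity term with a soft-thresholding (proximal) step. It is a generic textbook instance, not code from the chapter; the step size, regularisation weight and problem dimensions are illustrative assumptions.

        import numpy as np

        def soft_threshold(v, t):
            """Proximal operator of t*||.||_1 (entrywise soft-thresholding)."""
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        def forward_backward_l1(A, y, lam, n_iter=500):
            """Forward-backward (ISTA) iterations for min_x 0.5*||y - A x||^2 + lam*||x||_1."""
            step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the gradient's Lipschitz constant
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (A @ x - y)                         # forward (gradient) step
                x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
            return x

        # Illustrative underdetermined problem with a sparse ground truth.
        rng = np.random.default_rng(0)
        n, p, k = 40, 100, 5
        A = rng.standard_normal((n, p)) / np.sqrt(n)
        x_true = np.zeros(p)
        x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
        y = A @ x_true + 0.01 * rng.standard_normal(n)
        x_hat = forward_backward_l1(A, y, lam=0.02)
        print("nonzero entries of x_hat:", np.nonzero(x_hat)[0])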

    Late style and speaking out: J A Symonds's In the Key of Blue

    This article examines In the Key of Blue (1893)—an essay collection by John Addington Symonds—as a case study in queer public utterance during the early 1890s. Viewed through the critical lens of late style, as theorised by Edward Said, the evolution of this project, from compilation through to reader reception, reveals Symonds's determination to “speak out” on the subject of homosexuality. Paradoxically, In the Key of Blue was thus both a timely and an untimely work: it belonged to a brief period of increased visibility and expressiveness when dealing with male same-sex desire, spearheaded by a younger generation of Decadent writers, but it also cut against the grain of nineteenth-century social taboo and legal repression. Symonds's essay collection brought together new and previously unpublished work with examples of his writing for the periodical press. These new combinations, appearing together for the first time, served to facilitate new readings and new inferences, bringing homosexual themes to the fore. This article traces the dialogic structure of In the Key of Blue and its strategies for articulating homosexual desire, and examines the responses of reviewers, from the hostile to the celebratory.