    Autoxidation of lipids in parchment

    Historic parchment is a macromolecular material whose complexity stems from its natural origin, the inhomogeneity of the skin structure, an unknown environmental history and potential localised degradation. Most research into its stability has so far focussed on thermal and structural methods of analysis. Using gas chromatographic analysis of the atmosphere surrounding parchment during oxidation, we provide experimental evidence of the production of volatile aldehydes, which can be products of lipid autoxidation. Oxidation of parchment samples with different aldehyde emissions was additionally followed in situ using chemiluminometry, and the same techniques were used to evaluate the oxidation of differently delipidised parchment. It was shown that the production of peroxides and the emission of aldehydes from the material decrease with lower lipid content. Building on this evidence, we conclude that the presence of lipids (either initially present in the skin or resulting from conservation interventions) leads to oxidative degradation of collagen, and that non-destructive analysis of the emission of volatiles could be used as a quick tool for evaluating parchment stability.

    The spillover effects of monitoring:A field experiment

    Published Online: March 13, 2015. We provide field-experimental evidence of the effects of monitoring in a context where productivity is multidimensional and only one dimension is monitored and incentivized. We hire students to do a job for us; the job consists of identifying euro coins. We study the direct effects of monitoring and penalizing mistakes on work quality and evaluate spillovers on unmonitored dimensions of productivity (punctuality and theft). We find that monitoring improves work quality only if incentives are harsh, but substantially reduces punctuality irrespective of the associated incentives. Monitoring does not affect theft, with 10% of participants stealing overall. Our findings support a reciprocity mechanism whereby workers retaliate for being distrusted.

    Analysis of airborne Doppler lidar, Doppler radar and tall tower measurements of atmospheric flows in quiescent and stormy weather

    The first experiment to combine airborne Doppler lidar and ground-based dual Doppler radar measurements of wind to detail lower-tropospheric flows in quiescent and stormy weather was conducted in central Oklahoma during four days in June-July 1981. Data from these unique remote sensing instruments, coupled with data from conventional in-situ facilities (a 500-m meteorological tower, rawinsondes, and surface-based sensors), were analyzed to enhance understanding of wind, waves and turbulence. The purposes of the study were to: (1) compare winds mapped by ground-based dual Doppler radars, airborne Doppler lidar, and anemometers on a tower; (2) compare measured atmospheric boundary layer flow with flows predicted by theoretical models; (3) investigate the kinematic structure of air mass boundaries that precede the development of severe storms; and (4) study the kinematic structure of thunderstorm phenomena (downdrafts, gust fronts, etc.) that produce wind shear and turbulence hazardous to aircraft operations. The report consists of three parts: Part 1, Intercomparison of Wind Data from Airborne Lidar, Ground-Based Radars and Instrumented 444 m Tower; Part 2, The Structure of the Convective Atmospheric Boundary Layer as Revealed by Lidar and Doppler Radars; and Part 3, Doppler Lidar Observations in Thunderstorm Environments.
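
    Purpose (1) above is essentially an intercomparison of co-located wind records. As a rough illustration of what such a comparison involves (not the report's actual procedure), the sketch below computes simple agreement statistics between a tower-anemometer record and a remotely sensed one; all names, the synthetic data, and the choice of metrics are illustrative assumptions.

```python
import numpy as np

def wind_comparison_stats(u_ref, v_ref, u_test, v_test):
    """Bias, RMS difference, and component correlations between two
    co-located horizontal wind records (e.g., tower anemometer vs.
    remotely sensed winds), sampled at matched times."""
    du, dv = u_test - u_ref, v_test - v_ref
    bias_speed = np.hypot(u_test, v_test).mean() - np.hypot(u_ref, v_ref).mean()
    rms_vector = np.sqrt(np.mean(du**2 + dv**2))   # RMS vector difference
    r_u = np.corrcoef(u_ref, u_test)[0, 1]
    r_v = np.corrcoef(v_ref, v_test)[0, 1]
    return bias_speed, rms_vector, r_u, r_v

# Toy usage with synthetic, noisy "remote" winds against a tower record.
rng = np.random.default_rng(1)
t = np.arange(500)
u_twr = 5 + np.sin(t / 50); v_twr = 2 + 0.5 * np.cos(t / 70)
u_lid = u_twr + rng.normal(0, 0.4, t.size)
v_lid = v_twr + rng.normal(0, 0.4, t.size)
print(wind_comparison_stats(u_twr, v_twr, u_lid, v_lid))
```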

    An Algorithmic Approach to Quantum Field Theory

    The lattice formulation provides a way to regularize, define and compute the path integral of a quantum field theory. In this paper we review the theoretical foundations and the most basic algorithms required to implement a typical lattice computation, including the Metropolis algorithm, Gibbs sampling, and the Minimal Residual and Stabilized Biconjugate Gradient inverters. The main emphasis is on gauge theories with fermions, such as QCD. We also provide examples of typical results from lattice QCD computations for quantities of phenomenological interest. Comment: 44 pages, to be published in IJMP.
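
    Of the algorithms named above, the Metropolis update is the simplest to sketch. Below is a minimal, self-contained example for a 2D lattice scalar phi^4 theory rather than the gauge theories with fermions the review emphasizes; the lattice size, couplings, and proposal width are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 16                 # lattice size (L x L), illustrative choice
m2, lam = 0.25, 1.0    # bare mass squared and quartic coupling (assumed)
delta = 0.5            # proposal width for the Metropolis update
phi = np.zeros((L, L))

def local_action(phi, x, y, val):
    """Action terms involving site (x, y) with trial field value `val`:
    nearest-neighbour kinetic coupling plus the on-site potential
    (neighbour-only terms cancel in the difference and are dropped)."""
    nbrs = (phi[(x + 1) % L, y] + phi[(x - 1) % L, y] +
            phi[x, (y + 1) % L] + phi[x, (y - 1) % L])
    return -val * nbrs + (2.0 + 0.5 * m2) * val**2 + lam * val**4

def metropolis_sweep(phi):
    """One sweep of single-site Metropolis updates; returns acceptance rate."""
    accepted = 0
    for x in range(L):
        for y in range(L):
            old = phi[x, y]
            new = old + delta * rng.uniform(-1.0, 1.0)
            dS = local_action(phi, x, y, new) - local_action(phi, x, y, old)
            # Accept with probability min(1, exp(-dS))
            if dS <= 0 or rng.random() < np.exp(-dS):
                phi[x, y] = new
                accepted += 1
    return accepted / (L * L)

for sweep in range(100):
    acc = metropolis_sweep(phi)
print("final acceptance rate:", acc)
```

    In practice one tunes delta for a reasonable acceptance rate and discards early sweeps as thermalization before measuring observables.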

    Old and New Fields on Super Riemann Surfaces

    The "new fields" or "superconformal functions" on N=1 super Riemann surfaces introduced recently by Rogers and Langer are shown to coincide with the Abelian differentials (plus constants), viewed as a subset of the functions on the associated N=2 super Riemann surface. We confirm that, as originally defined, they do not form a super vector space. Comment: 9 pages, LaTeX. Published version: minor changes for clarity, two new references.

    Interpolating Action for Strings and Membranes - a Study of Symmetries in the Constrained Hamiltonian Approach

    A master action for bosonic strings and membranes, interpolating between the Nambu-Goto and Polyakov formalisms, is discussed. The role of the gauge symmetries vis-à-vis the reparametrization symmetries of the various actions is analyzed by a constrained Hamiltonian approach. This analysis reveals the difference between strings and higher branes, which is essentially tied to a degree-of-freedom count. The cosmological term for membranes follows naturally in this scheme. The connection of our approach with the Arnowitt-Deser-Misner representation in general relativity is illuminated. Comment: LaTeX, 23 pages; discussion on ADM representation included and new references added.
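
    For orientation, the two endpoints of the interpolation are the standard bosonic-string actions, with string tension T, target-space fields X^mu and, in the Polyakov case, an independent worldsheet metric g_{ab}; eliminating g_{ab} by its equation of motion recovers the Nambu-Goto form:

```latex
S_{\mathrm{NG}} = -T \int d^2\sigma \,
  \sqrt{-\det\!\left(\partial_a X^\mu \, \partial_b X_\mu\right)},
\qquad
S_{\mathrm{P}} = -\frac{T}{2} \int d^2\sigma \,
  \sqrt{-g}\, g^{ab}\, \partial_a X^\mu \, \partial_b X_\mu .
```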

    Fast Optimal Transport Averaging of Neuroimaging Data

    Knowing how the human brain is anatomically and functionally organized at the level of a group of healthy individuals or patients is the primary goal of neuroimaging research. Yet computing an average of brain imaging data defined over a voxel grid or a triangulation remains a challenge. Data are large, the geometry of the brain is complex, and between-subject variability leads to spatially or temporally non-overlapping effects of interest. To address the problem of variability, data are commonly smoothed before group linear averaging. In this work we build on ideas originally introduced by Kantorovich to propose a new algorithm that can efficiently average non-normalized data defined over arbitrary discrete domains using transportation metrics. We show how Kantorovich means can be linked to Wasserstein barycenters in order to take advantage of an entropic smoothing approach. This leads to a smooth convex optimization problem and an algorithm with strong convergence guarantees. We illustrate the versatility of this tool and its empirical behavior on functional neuroimaging data: functional MRI and magnetoencephalography (MEG) source estimates, defined on voxel grids and triangulations of the folded cortical surface. Comment: Information Processing in Medical Imaging (IPMI), Jun 2015, Isle of Skye, United Kingdom. Springer, 2015.
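
    The entropic smoothing referenced above builds on the standard entropy-regularized Wasserstein barycenter computed by Sinkhorn-style iterations (in the spirit of Benamou et al.'s iterative Bregman projections). The sketch below shows that standard construction for normalized histograms on a common grid; it is not the paper's extension to non-normalized data, and all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def sinkhorn_barycenter(hists, cost, eps=1e-2, weights=None, n_iter=200):
    """Entropic-regularized Wasserstein barycenter of probability histograms.
    hists: (k, n) array of k normalized histograms on a common n-point grid.
    cost:  (n, n) pairwise ground-cost matrix.
    eps:   entropic regularization strength."""
    k, n = hists.shape
    w = np.full(k, 1.0 / k) if weights is None else weights
    K = np.exp(-cost / eps)              # Gibbs kernel
    v = np.ones((k, n))
    for _ in range(n_iter):
        u = hists / (K @ v.T).T          # scale to match the input marginals
        # Geometric mean of the pushed-forward marginals is the barycenter.
        b = np.exp(np.sum(w[:, None] * np.log((K.T @ u.T).T), axis=0))
        v = b[None, :] / (K.T @ u.T).T
    return b

# Toy usage on a 1D grid: average two shifted bumps.
x = np.linspace(0, 1, 100)
C = (x[:, None] - x[None, :]) ** 2
h1 = np.exp(-(x - 0.3) ** 2 / 0.005); h1 /= h1.sum()
h2 = np.exp(-(x - 0.7) ** 2 / 0.005); h2 /= h2.sum()
bar = sinkhorn_barycenter(np.stack([h1, h2]), C)
print("barycenter mass:", bar.sum())
```

    Smaller eps approximates the unregularized barycenter more closely but converges more slowly and is less numerically stable; that trade-off is the usual motivation for the entropic smoothing.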