
    An Instrument for the Determination of Photographic Film Drying Rates

    The way in which a photographic material dries is commonly represented as a plot of rate of vaporization versus time. Generating these curves has historically been a lengthy, error-prone process, and the aim of this project was to simplify it greatly. A photographic drier with a thermal sensing element was designed, constructed, and tested. The drier produces a heated air flow perpendicular to the film plane, and a thermistor probe placed in contact with the base of the film senses the film temperature during drying. A rate-of-vaporization curve was obtained by determining the percent moisture vaporized from the film at discrete time intervals, weighing the film in its semi-dry and dry states. These data were correlated with the temperature data obtained from the apparatus. The two methods agreed well during the constant-rate period; however, no correlation was found between them during the falling-rate period.
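    As a rough illustration of the gravimetric data reduction described above, the sketch below converts a series of film weighings into a percent-moisture curve and a finite-difference rate of vaporization. The function name, sample data, and central-difference scheme are assumptions made for illustration, not the apparatus's actual procedure.

```python
# Minimal sketch (not from the paper): converting discrete film weighings
# into a percent-moisture and rate-of-vaporization curve. The data and the
# finite-difference scheme are illustrative assumptions.
import numpy as np

def vaporization_rate(times_s, weights_g, dry_weight_g):
    """Return percent moisture remaining and its rate of loss over time."""
    times = np.asarray(times_s, dtype=float)
    weights = np.asarray(weights_g, dtype=float)
    moisture = weights - dry_weight_g              # grams of water still in the film
    pct_moisture = 100.0 * moisture / moisture[0]  # percent of initial moisture
    rate = -np.gradient(pct_moisture, times)       # percent moisture lost per second
    return pct_moisture, rate

# Example: a film weighed every 30 s during drying
pct, rate = vaporization_rate(
    times_s=[0, 30, 60, 90, 120],
    weights_g=[5.20, 4.90, 4.60, 4.45, 4.40],
    dry_weight_g=4.35,
)
```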

    Additively Manufactured K-Band Septum Polarizers: A Comparative Study

    A septum waveguide polarizer with an integrated circular-to-square waveguide transition has been manufactured using three different additive manufacturing (AM) processes. Polymer and AlSi10Mg printing processes from a specialized RF manufacturer and a selective laser melting (SLM) process from a university research group are evaluated. Measurements confirm that the polarizer design is well suited for AM.

    Heavy-flavor dynamics in nucleus-nucleus collisions: from RHIC to LHC

    The stochastic dynamics of c and b quarks in the fireball created in nucleus-nucleus collisions at RHIC and the LHC is studied employing a relativistic Langevin equation, based on a picture of multiple uncorrelated random collisions with the medium. Heavy-quark transport coefficients are evaluated within a pQCD approach, with a proper HTL resummation of medium effects for soft scatterings. The Langevin equation is embedded in a multi-step setup developed to study heavy-flavor observables in pp and AA collisions, starting from an NLO pQCD calculation of initial heavy-quark yields, complemented in the nuclear case by shadowing corrections, k_T-broadening, and nuclear-geometry effects. Then, for AA collisions only, the Langevin equation is solved numerically in a background medium described by relativistic hydrodynamics. Finally, the propagated heavy quarks are made to hadronize and decay into electrons. Results for the nuclear modification factor R_AA of heavy-flavor hadrons, and of electrons from their semi-leptonic decays, are provided for both RHIC and LHC beam energies. Comment: 4 pages, 2 figures (3 eps files); submitted for publication in the proceedings of "Quark Matter 2011", 23-28 May 2011, Annecy (France).
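    For readers unfamiliar with the approach, the sketch below shows a bare-bones Langevin update for a single heavy-quark momentum component: a deterministic drag term plus a Gaussian random kick. The constant drag and momentum-diffusion coefficients are placeholders standing in for, but not reproducing, the HTL-resummed pQCD transport coefficients used in the paper.

```python
# Illustrative sketch only: one momentum component of a heavy quark evolving
# in a static medium. eta_d (drag) and kappa (momentum diffusion) are toy
# constants, not the transport coefficients evaluated in the paper.
import numpy as np

def langevin_trajectory(p0_gev, eta_d, kappa, dt, n_steps, rng=None):
    """Evolve one momentum component p (GeV) with drag plus Gaussian noise."""
    rng = rng or np.random.default_rng()
    p = np.empty(n_steps + 1)
    p[0] = p0_gev
    for i in range(n_steps):
        kick = rng.normal(0.0, np.sqrt(kappa * dt))   # random kick from the medium
        p[i + 1] = p[i] - eta_d * p[i] * dt + kick    # deterministic drag + kick
    return p

# Toy numbers: 5 GeV charm quark, drag 0.2 (fm/c)^-1, kappa 0.05 GeV^2/fm, dt 0.02 fm/c
traj = langevin_trajectory(p0_gev=5.0, eta_d=0.2, kappa=0.05, dt=0.02, n_steps=500)
```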

    Symmetry-breaking transitions in networks of nonlinear circuit elements

    We investigate a nonlinear circuit consisting of N tunnel diodes in series, which shows close similarities to a semiconductor superlattice or to a neural network. Each tunnel diode is modeled by a three-variable FitzHugh-Nagumo-like system. The tunnel diodes are coupled globally through a load resistor. We find complex bifurcation scenarios with symmetry-breaking transitions that generate multiple fixed points off the synchronization manifold. We show that multiply degenerate zero-eigenvalue bifurcations occur, which lead to multistable current branches, and that these bifurcations are also degenerate with a Hopf bifurcation. These predicted scenarios of multiple branches and degenerate bifurcations are also found experimentally. Comment: 32 pages, 11 figures, 7 movies available as ancillary files.
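    The sketch below illustrates the general structure of globally coupled FitzHugh-Nagumo-like units, with a mean-field term standing in for the shared load resistor. It uses the standard two-variable FitzHugh-Nagumo equations with generic parameters, not the paper's three-variable tunnel-diode model.

```python
# Illustrative sketch only: N FitzHugh-Nagumo units coupled globally through a
# mean-field term that plays the role of a shared load resistor. Equations and
# parameters are generic choices, not the paper's tunnel-diode model.
import numpy as np
from scipy.integrate import solve_ivp

N, EPS, A, COUPLING = 10, 0.05, 1.05, 0.3   # A > 1: each uncoupled unit is excitable

def rhs(t, y):
    v, w = y[:N], y[N:]
    mean_v = v.mean()                        # global (load-resistor-like) coupling signal
    dv = (v - v**3 / 3 - w + COUPLING * (mean_v - v)) / EPS
    dw = v + A
    return np.concatenate([dv, dw])

rng = np.random.default_rng(0)
y0 = rng.normal(0.0, 0.1, 2 * N)             # small random initial conditions
sol = solve_ivp(rhs, (0.0, 200.0), y0, max_step=0.1)
# sol.y[:N] holds the N "voltage" traces; sol.y[N:] the recovery variables
```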

    Overview of experimental results in PbPb collisions at sqrt{s_NN} = 2.76 TeV by the CMS Collaboration

    The CMS experiment at the LHC is a general-purpose apparatus with a set of large-acceptance and high-granularity detectors for hadrons, electrons, photons, and muons, providing unique capabilities for both proton-proton and ion-ion collisions. The data collected during the November 2010 PbPb run at sqrt{s_NN} = 2.76 TeV were analyzed, and multiple measurements of the properties of the hot and dense matter were obtained. Global event properties, detailed studies of jet production and jet properties, isolated photons, quarkonia, and weak bosons were measured and compared to pp data and Monte Carlo simulations. Comment: 8 pages, 10 figures, proceedings for Quark Matter 2011, Annecy, France, May 23-28, 2011.

    Heterogeneous Delays in Neural Networks

    We investigate heterogeneous coupling delays in complex networks of excitable elements described by the FitzHugh-Nagumo model. The effects of discrete delays as well as of unimodal and bimodal continuous delay distributions are studied, with a focus on different topologies, i.e., regular, small-world, and random networks. In the case of two discrete delay times, resonance effects play a major role: depending on the ratio of the delay times, various characteristic spiking scenarios, such as coherent or asynchronous spiking, arise. For continuous delay distributions, different dynamical patterns emerge depending on the width of the distribution. For small distribution widths we find highly synchronized spiking, while for intermediate widths only spiking with a low degree of synchrony persists, associated with traveling disruptions, partial amplitude death, or subnetwork synchronization, depending sensitively on the network topology. If the inhomogeneity of the coupling delays becomes too large, global amplitude death is induced.
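    As a minimal numerical illustration of delayed coupling in such a network, the sketch below integrates FitzHugh-Nagumo units on a ring with two alternating discrete coupling delays, using an explicit Euler scheme and a history buffer. The ring topology, parameter values, and delay assignment are assumptions made for illustration, not the networks studied here.

```python
# Illustrative sketch only: excitable FitzHugh-Nagumo units on a ring with two
# discrete coupling delays, integrated by explicit Euler with a history buffer.
import numpy as np

N, EPS, A, C = 20, 0.05, 1.3, 0.5
DT, STEPS = 0.005, 20000
TAU = np.where(np.arange(N) % 2 == 0, 3.0, 6.0)   # bimodal set of delays (time units)
LAG = (TAU / DT).astype(int)                       # delays measured in time steps
HIST = LAG.max() + 1                               # history length needed

left = np.roll(np.arange(N), 1)                    # ring: each unit listens to its left neighbour
v = np.zeros((STEPS + HIST, N))                    # activator history buffer
w = np.zeros(N)                                    # inhibitor (no delayed access needed)
v[:HIST] = 0.1 * np.random.default_rng(1).normal(size=N)  # small random constant history

for t in range(HIST, STEPS + HIST):
    delayed = v[t - 1 - LAG, left]                 # neighbour state one delay in the past
    dv = (v[t - 1] - v[t - 1] ** 3 / 3 - w + C * (delayed - v[t - 1])) / EPS
    dw = v[t - 1] + A
    v[t] = v[t - 1] + DT * dv
    w = w + DT * dw
```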

    The emergence of international food safety standards and guidelines: understanding the current landscape through a historical approach

    Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an International Codex Alimentarius (or 'food code'), which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with the task of developing microbial hygiene standards, although it found itself embroiled in a debate with the WHO over the form these standards should take. The WHO was increasingly relying upon the input of biometricians, and especially of the International Commission on Microbiological Specifications for Foods (ICMSF), which had developed statistical sampling plans for determining microbial counts in final end products. The CCFH, however, was initially more focused on a qualitative approach that looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications, which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history that could shed light upon current debates and efforts in international food safety management systems and approaches.

    One Hundred Years of Philosophy of Science: The View from Munich

    These days, a number of philosophers of science indulge in lamenting a crisis of their discipline. They complain about its loss of relevance and bemoan the marginalization of their discipline in the philosophical community and in the wider academia (Hardcastle and Richardson). The Munich take on the philosophy of science does not succumb to this temptation. According to it, philosophy of science is alive and well. In Carlos Ulises Moulines’s Die Entwicklung der modernen Wissenschaftstheorie: Eine historische Einführung, the word “crisis” is used only in reference to the 1940s, when classical logical positivism encountered some difficulties in dealing with problems concerning verification, the analytic/synthetic distinction, and similar conundrums. For Moulines, “crisis” is not a word that applies to contemporary philosophy of science.

    Unraveling biogeochemical phosphorus dynamics in hyperarid Mars‐analogue soils using stable oxygen isotopes in phosphate

    With annual precipitation of less than 20 mm and extreme UV intensity, the Atacama Desert in northern Chile has long been utilized as an analogue for recent Mars. In these hyperarid environments, water and biomass are extremely limited, and it is therefore difficult to generate a full picture of biogeochemical phosphate‐water dynamics. To address this problem, we sampled soils from five Atacama study sites and conducted three main analyses: stable oxygen isotopes in phosphate, enzyme pathway predictions, and cell culture experiments. We found that high sedimentation rates decrease the relative size of the organic phosphorus pool, which appears to hinder extremophiles. Phosphoenzyme and pathway prediction analyses imply that inorganic pyrophosphatase is the most likely catalytic agent to cycle P in these environments, and that this process will rapidly overtake other P utilization strategies. In these soils, the biogenic δ18O signatures of the soil phosphate (δ18OPO4) can slowly overprint lithogenic δ18OPO4 values over a timescale of tens to hundreds of millions of years when annual precipitation exceeds 10 mm. The δ18OPO4 of calcium‐bound phosphate minerals seems to preserve the δ18O signature of the water used for biogeochemical P cycling, pointing toward sporadic rainfall and gypsum hydration water as key moisture sources. Where precipitation is less than 2 mm, biological cycling is restricted and bedrock δ18OPO4 values are preserved. This study demonstrates the utility of δ18OPO4 values as indicators of biogeochemical cycling and hydrodynamics in an extremely dry Mars‐analogue environment.