35,794 research outputs found

    Fragility of iron-based glasses

    The viscosity of various iron-based bulk-glass-forming liquids is measured around the glass transition, and the associated fragility is calculated. Fragility is found to vary broadly between compositions, from a low value of ~43, which indicates fairly “strong” liquid behavior, to ~65, well within the region of “fragile” behavior. Despite the strong covalent bonding identified in the structure of this class of metal/metalloid glasses, their liquid fragility can be remarkably high, exceeding even that of the very fragile palladium and platinum bulk-glass formers. An inverse correlation between glass-forming ability and fragility is identified, suggesting that iron-based glasses are effectively “kinetically” stabilized.
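As an illustrative aside (not part of the abstract), the fragility quoted here is the standard Angell index m = d(log₁₀η)/d(T_g/T) evaluated at T = T_g, which has a closed form for a Vogel–Fulcher–Tammann (VFT) viscosity fit. The parameter values below are hypothetical:

```python
import math

def vft_viscosity(T, eta0, D, T0):
    """Vogel-Fulcher-Tammann viscosity eta0*exp(D*T0/(T-T0)), in Pa*s."""
    return eta0 * math.exp(D * T0 / (T - T0))

def fragility(Tg, D, T0):
    """Angell fragility m = d(log10 eta)/d(Tg/T) at T = Tg,
    evaluated analytically for the VFT form."""
    return D * T0 * Tg / (math.log(10) * (Tg - T0) ** 2)
```

A small VFT "strength" parameter D (with T0 close to T_g) yields a large m, i.e., a fragile liquid, consistent with the inverse trend the abstract describes.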

    Measuring situation awareness in complex systems: Comparison of measures study

    Situation Awareness (SA) is a distinct critical commodity for teams working in complex industrial systems, and its measurement is a key provision in system, procedural, and training design efforts. This article describes a study undertaken to compare three different SA measures (a freeze-probe recall approach, a post-trial subjective rating approach, and a critical-incident interview technique) when used to assess participant SA during a military planning task. The results indicate that only the freeze-probe recall method produced a statistically significant correlation with performance on the planning task, and that there was no significant correlation between the three methods, which suggests that they were effectively measuring different things during the trials. In conclusion, the findings, whilst raising doubts over the validity of post-trial subjective rating and interview-based approaches, offer validation evidence for the use of freeze-probe recall approaches to measure SA. The findings are subsequently discussed with regard to their implications for the future measurement of SA in complex collaborative systems.

    Diagnosing numerical Cherenkov instabilities in relativistic plasma simulations based on general meshes

    Numerical Cherenkov radiation (NCR), or numerical Cherenkov instability, is a detrimental effect frequently found in electromagnetic particle-in-cell (EM-PIC) simulations involving relativistic plasma beams. NCR is caused by spurious coupling between electromagnetic-field modes and multiple beam resonances. This coupling may result from the slowdown of poorly resolved waves due to numerical (grid) dispersion and from aliasing mechanisms. NCR has been studied in the past for finite-difference-based EM-PIC algorithms on regular (structured) meshes with rectangular elements. In this work, we extend the analysis of NCR to finite-element-based EM-PIC algorithms implemented on unstructured meshes. The influence of different mesh element shapes and mesh layouts on NCR is studied. Analytic predictions are compared against results from finite-element-based EM-PIC simulations of relativistic plasma beams on various mesh types. Comment: 31 pages, 20 figures
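The "slowdown of poorly resolved waves" can be made concrete with the textbook 1D Yee leap-frog dispersion relation sin(ωΔt/2) = (cΔt/Δx)·sin(kΔx/2). The sketch below (a generic finite-difference grid, not the paper's finite-element scheme) shows the numerical phase velocity dropping below c at short wavelengths, which is what lets a v ≈ c beam resonate with spurious light modes:

```python
import numpy as np

c, dx = 1.0, 1.0
dt = 0.5 * dx / c                       # Courant number S = c*dt/dx = 0.5

k = np.linspace(1e-6, np.pi / dx, 200)  # resolvable wavenumbers on the grid
# 1D Yee leap-frog dispersion: sin(w*dt/2) = S * sin(k*dx/2)
w = (2.0 / dt) * np.arcsin((c * dt / dx) * np.sin(k * dx / 2.0))
v_phase = w / k                         # numerical phase velocity

# v_phase -> c for well-resolved modes, but falls below c near the
# grid Nyquist wavenumber -- these are the slow waves a relativistic
# beam can outrun, seeding the numerical Cherenkov resonance.
```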

    Chaos assisted adiabatic passage

    We study the exact dynamics underlying stimulated Raman adiabatic passage (STIRAP) for a particle in a multi-level anharmonic system (the infinite square well) driven by two sequential laser pulses, each with a constant carrier frequency. In phase-space regions where the laser pulses create chaos, the particle can be transferred coherently into energy states different from those predicted by traditional STIRAP. It appears that a transition to chaos can provide a new tool to control the outcome of STIRAP.

    An Improved Approximate Consensus Algorithm in the Presence of Mobile Faults

    This paper explores the problem of reaching approximate consensus in synchronous point-to-point networks, where each pair of nodes is able to communicate with each other directly and reliably. We consider the mobile Byzantine fault model proposed by Garay '94 -- in this model, an omniscient adversary can corrupt up to f nodes in each round, and at the beginning of each round, faults may "move" in the system (i.e., different sets of nodes may become faulty in different rounds). Recent work by Bonomi et al. '16 proposed a simple iterative approximate consensus algorithm which requires at least 4f+1 nodes. This paper proposes a novel technique of using "confession" (a mechanism to allow others to ignore past behavior) and a variant of reliable broadcast to improve the fault-tolerance level. In particular, we present an approximate consensus algorithm that requires only ⌈7f/2⌉ + 1 nodes, a ⌊f/2⌋ improvement over the state-of-the-art algorithms. Moreover, we also show that the proposed algorithm is optimal within a family of round-based algorithms.
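For orientation, here is a minimal sketch of one round of the classic iterative approximate consensus template (trim the f extremes, average the rest). This is the textbook baseline that such algorithms build on, not the paper's confession-based algorithm:

```python
def trimmed_mean_round(received, f):
    """One round of the classic iterative approximate consensus
    template: sort the received values, discard the f lowest and
    f highest (clipping any Byzantine extremes), and adopt the
    average of the remainder as the new local state."""
    assert len(received) > 2 * f, "need more than 2f values"
    s = sorted(received)
    core = s[f:len(s) - f]
    return sum(core) / len(core)
```

Repeating such rounds shrinks the spread of correct nodes' states geometrically, converging to values within any desired epsilon of each other.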

    Azimuthal distributions of radial momentum and velocity in relativistic heavy ion collisions

    Azimuthal distributions of radial (transverse) momentum, mean radial momentum, and mean radial velocity of final-state particles are suggested for relativistic heavy-ion collisions. Using the AMPT transport model with string melting, these distributions for Au + Au collisions at 200 GeV are presented and studied. It is demonstrated that the distribution of total radial momentum is more sensitive to the anisotropic expansion, as the anisotropies of final-state particles and their associated transverse momenta are both counted in the measure. The mean radial velocity distribution is compared with the radial flow velocity. The thermal motion contributes an isotropic constant to the mean radial velocity.
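A hedged sketch of how such an azimuthal distribution could be accumulated from a list of final-state particle momenta; the function name and binning below are illustrative, not taken from the paper:

```python
import numpy as np

def radial_momentum_vs_phi(px, py, nbins=16):
    """Azimuthal distribution of total radial (transverse) momentum:
    sum of p_T = sqrt(px^2 + py^2) over all particles falling in
    each azimuthal bin phi = atan2(py, px)."""
    px, py = np.asarray(px, float), np.asarray(py, float)
    pt = np.hypot(px, py)
    phi = np.arctan2(py, px)
    sums, edges = np.histogram(phi, bins=nbins,
                               range=(-np.pi, np.pi), weights=pt)
    return sums, edges
```

Anisotropy in the resulting sums reflects both an uneven particle count per bin and unequal per-particle transverse momenta, which is why the total-momentum distribution is the more sensitive measure.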

    Effect of venting range hood flow rate on size-resolved ultrafine particle concentrations from gas stove cooking

    Cooking is the main source of ultrafine particles (UFP) in homes. This study investigated the effect of venting range hood flow rate on size-resolved UFP concentrations from gas stove cooking. The same cooking protocol was conducted 60 times using three venting range hoods operated at six flow rates in twin research houses. Size-resolved particle (10–420 nm) concentrations were monitored using a NanoScan scanning mobility particle sizer (SMPS) from 15 min before cooking to 3 h after the cooking had stopped. Cooking increased the background total UFP number concentrations to 1.3 × 10³ particles/cm³ on average, with a mean exposure-relevant source strength of 1.8 × 10¹² particles/min. Total particle peak reductions ranged from 25% at the lowest fan flow rate of 36 L/s to 98% at the highest rate of 146 L/s. During the operation of a venting range hood, particle removal by deposition was less significant compared to the increased air exchange rate driven by exhaust ventilation. Exposure to total particles due to cooking varied from 0.9 to 5.8 × 10⁴ particles/cm³·h over the 3 h after cooking ended. Compared to the 36 L/s range hood, higher flow rates of 120 and 146 L/s reduced the first-hour post-cooking exposure by 76% and 85%, respectively. © 2018 Crown Copyright. Published with license by Taylor & Francis Group, LLC.
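The interplay between air exchange and deposition described above can be illustrated with a single-zone well-mixed box model; the decay rates and concentrations below are hypothetical, not measurements from the study:

```python
import math

def post_cooking_exposure(C0, aer, k_dep, hours=3.0):
    """Time-integrated exposure (particles/cm^3 * h) after cooking
    stops, for a well-mixed single zone: dC/dt = -(aer + k_dep)*C,
    where aer is the air exchange rate (1/h, raised by range hood
    exhaust) and k_dep is the deposition loss rate (1/h).
    Integrating gives C0/(aer+k_dep) * (1 - exp(-(aer+k_dep)*t))."""
    loss = aer + k_dep
    return C0 / loss * (1.0 - math.exp(-loss * hours))
```

Raising the air exchange rate (a higher hood flow) shrinks the integral far more than plausible deposition rates do, mirroring the study's finding that exhaust ventilation dominates particle removal.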

    Quantifying the origin of metallic glass formation

    The waiting time to form a crystal in a unit volume of homogeneous undercooled liquid exhibits a pronounced minimum τ_X* at a ‘nose temperature’ T* located between the glass transition temperature, T_g, and the crystal melting temperature, T_L. Turnbull argued that τ_X* should increase rapidly with the dimensionless ratio t_rg = T_g/T_L. Angell introduced a dimensionless ‘fragility parameter’, m, to characterize the fall of atomic mobility with temperature above T_g. Both t_rg and m are widely thought to play a significant role in determining τ_X*. Here we survey and assess reported data for T_L, T_g, t_rg, m and τ_X* for a broad range of metallic glasses with widely varying τ_X*. By analysing this database, we derive a simple empirical expression for τ_X*(t_rg, m) that depends exponentially on t_rg and m, with two fitting parameters. A statistical analysis shows that knowledge of t_rg and m alone is sufficient to predict τ_X* to within the estimated experimental errors. Surprisingly, the liquid/crystal interfacial free energy does not appear in this expression for τ_X*.
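As a hypothetical illustration of an expression that "depends exponentially on t_rg and m", one could fit ln τ_X* linearly in the two quantities; the functional form and all data below are invented for this sketch and are not the paper's actual fit:

```python
import numpy as np

# All numbers below are invented for illustration only.
t_rg = np.array([0.50, 0.55, 0.60, 0.65, 0.70])   # T_g / T_L
m    = np.array([70.0, 60.0, 50.0, 45.0, 40.0])   # fragility
tau  = np.array([1e-3, 1e-1, 1e1, 1e3, 1e5])      # tau_X* in seconds

# Fit ln(tau_X*) = a * t_rg + b * m + c, so that tau_X* is
# exponential in both t_rg and m.
A = np.column_stack([t_rg, m, np.ones_like(t_rg)])
coef, *_ = np.linalg.lstsq(A, np.log(tau), rcond=None)

def predict_tau(trg, mm):
    """Predicted crystallization nose time (s) for given t_rg and m."""
    return float(np.exp(coef[0] * trg + coef[1] * mm + coef[2]))
```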

    How Many Cooks Spoil the Soup?

    In this work, we study the following basic question: "How much parallelism does a distributed task permit?" Our definition of parallelism (or symmetry) here is not in terms of speed, but in terms of identical roles that processes have at the same time in the execution. We initiate this study in population protocols, a very simple model that not only allows for a straightforward definition of what a role is, but also encapsulates the challenge of isolating the properties that are due to the protocol from those that are due to the adversary scheduler, who controls the interactions between the processes. We (i) give a partial characterization of the set of predicates on input assignments that can be stably computed with maximum symmetry, i.e., Θ(N_min), where N_min is the minimum multiplicity of a state in the initial configuration, and (ii) turn our attention to the remaining predicates and prove a strong impossibility result for the parity predicate: the inherent symmetry of any protocol that stably computes it is upper-bounded by a constant that depends on the size of the protocol. Comment: 19 pages
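To make the model concrete, here is a minimal population-protocol simulation under a uniformly random pairwise scheduler. It stably computes the simple predicate "does some agent start in state 1" and is not one of the paper's constructions:

```python
import random

def run_or_protocol(init_states, steps=10_000, seed=0):
    """Minimal population protocol under a uniformly random pairwise
    scheduler. The single transition rule (1, 0) -> (1, 1) spreads a
    set flag epidemically, stably computing the predicate
    'some agent started in state 1'."""
    rng = random.Random(seed)
    s = list(init_states)
    for _ in range(steps):
        i, j = rng.sample(range(len(s)), 2)  # scheduler picks a pair
        if s[i] == 1 and s[j] == 0:
            s[j] = 1                         # responder adopts the flag
    return s
```

In the paper's sense of symmetry, many agents hold identical states (roles) at once throughout such an execution, which is exactly the quantity the Θ(N_min) characterization bounds.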