    Perturbation theory of the space-time non-commutative real scalar field theories

    The perturbative framework of space-time non-commutative real scalar field theory is formulated on the basis of the unitary S-matrix. Unitarity of the S-matrix is checked explicitly, order by order, in the Heisenberg picture of the Lagrangian formalism of second-quantized operators, with emphasis on the so-called minimal realization of the time-ordering step function and on the importance of ⋆-time ordering. The Feynman rules are established and presented for the φ⁴ scalar field theory. It is shown that the divergence structure of the space-time non-commutative theory is the same as that of the space-space non-commutative theory, while there is no UV-IR mixing problem in the space-time non-commutative theory.
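
    For orientation, the non-commutative structure referenced here is conventionally encoded in the Moyal ⋆-product; a standard form is sketched below (the paper's own conventions and signs may differ):

        % Moyal star product with constant noncommutativity parameter \theta^{\mu\nu};
        % space-time noncommutativity means \theta^{0i} \neq 0.
        (f \star g)(x) = f(x)\,
          \exp\!\left(\frac{i}{2}\,\theta^{\mu\nu}\,
            \overleftarrow{\partial}_{\mu}\overrightarrow{\partial}_{\nu}\right) g(x),
        \qquad [x^{\mu}, x^{\nu}]_{\star} = i\,\theta^{\mu\nu}

    With this product the quartic interaction becomes (λ/4!) φ⋆φ⋆φ⋆φ, and θ^{0i} ≠ 0 is precisely what makes the time ordering in the S-matrix delicate.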

    Search for Millicharged Particles at SLAC

    Particles with electric charge q < 10^(-3) e and masses in the range 1-100 MeV/c^2 are not excluded by present experiments. An experiment uniquely suited to the production and detection of such "millicharged" particles has been carried out at SLAC. This experiment is sensitive to the infrequent excitation and ionization of matter expected from the passage of such a particle. Analysis of the data rules out a region of mass and charge, establishing, for example, a 95%-confidence upper limit on electric charge of 4.1×10^(-5) e for millicharged particles of mass 1 MeV/c^2 and 5.8×10^(-4) e for mass 100 MeV/c^2.

    Effects of gluteal kinesio-taping on performance with respect to fatigue in rugby players

    Kinesio-tape® has been suggested to increase blood circulation and lymph flow and might influence a muscle's ability to maintain strength during fatigue. The aim of this study was therefore to investigate the influence of gluteal Kinesio-tape® on lower-limb muscle strength in non-fatigued and fatigued conditions. A total of 10 male rugby union players performed 20-m sprint and vertical jump tests before and after a rugby-specific fatigue protocol. The 20-m sprint time was collected using light gates (SMARTSPEED). A 9-camera motion analysis system (VICON, 100 Hz) and a force plate (Kistler, 1000 Hz) measured the kinematics and kinetics during a countermovement jump and a drop jump. The effects of tape and fatigue on jump height, maximal vertical ground reaction force, reactive strength index, and lower-limb joint work were analysed via a two-way analysis of variance. The fatigue protocol resulted in significantly slower sprint times, lower jump heights, and alterations in joint work. No statistical differences were found between the taped and un-taped conditions in either the non-fatigued or the fatigued state, nor in the interaction with fatigue. Taping the gluteal muscles therefore does not influence lower-limb explosive strength after fatigue in healthy rugby players.
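
    A minimal sketch of the tape × fatigue analysis described above, assuming a long-format table with hypothetical file and column names (jump_tests.csv, jump_height, tape, fatigue); the study's actual data layout is not specified:

        # Two-way ANOVA (tape x fatigue) on jump height, mirroring the
        # analysis described above. File and column names are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        df = pd.read_csv("jump_tests.csv")  # one row per player, trial, condition

        # Main effects of tape and fatigue plus their interaction term.
        model = smf.ols("jump_height ~ C(tape) * C(fatigue)", data=df).fit()
        print(anova_lm(model, typ=2))

    The same formula can be refit with each of the other outcomes (ground reaction force, reactive strength index, joint work) as the response.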

    Computational modeling of beam-customization devices for heavy-charged-particle radiotherapy

    A model for beam customization with collimators and a range-compensating filter, based on the phase-space theory of beam transport, is presented for dose-distribution calculation in treatment planning of radiotherapy with protons and heavier ions. Independent handling of pencil beams in conventional pencil-beam algorithms causes an unphysical collimator-height dependence in the middle of large fields; this is resolved by a framework comprising generation, transport, collimation, regeneration, range-compensation, and edge-sharpening processes applied to a matrix of pencil beams. The model was verified to be consistent with measurement and analytic estimation at the submillimeter level in the penumbrae of individual collimators, using a combinationally collimated carbon-ion beam. The model computation is fast and accurate, and it is readily applicable to pencil-beam algorithms in treatment planning, with the capability of combinational collimation to make the best use of the beam-customization devices.
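
    As a toy illustration of the collimation and regeneration steps named above (not the paper's algorithm; the geometry and all beam parameters below are invented), one can clip a matrix of Gaussian pencil beams with an aperture and then re-fit the transmitted fluence with a fresh, narrower beam basis:

        # Toy sketch: collimate a matrix of Gaussian pencil beams, then
        # "regenerate" the transmitted fluence as new, finer pencil beams so
        # that downstream transport can again proceed beam-by-beam.
        import numpy as np

        x = np.linspace(-30.0, 30.0, 601)        # lateral grid [mm]
        centers = np.arange(-25.0, 25.1, 2.5)    # pencil-beam spacing [mm]
        sigma = 4.0                              # beam sigma at collimator [mm]

        def gaussian(x, mu, s):
            return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

        # Total fluence of the beam matrix, hard-collimated to |x| < 10 mm.
        fluence = sum(gaussian(x, c, sigma) for c in centers)
        transmitted = fluence * (np.abs(x) < 10.0)

        # Regeneration: least-squares weights for a narrower beam basis.
        new_centers = np.arange(-12.0, 12.1, 1.0)
        basis = np.stack([gaussian(x, c, 1.5) for c in new_centers], axis=1)
        weights, *_ = np.linalg.lstsq(basis, transmitted, rcond=None)
        print("regenerated beam weights:", np.round(weights, 3))

    Re-expressing the clipped fluence in a pencil-beam basis keeps downstream transport beam-by-beam while still capturing the collimator edge.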

    The SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, in which codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed, a significant advance in this science. The automated process developed to attain this achievement provides for a future in which testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but beyond that the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous rupture (hereafter called "spontaneous rupture") solutions. For these types of numerical simulations, rather than prescribing the slip function at each location on the fault(s), only the friction constitutive properties and initial stress conditions are prescribed. The subsequent stresses and fault slip spontaneously evolve over time as part of the elastodynamic solution. Therefore, spontaneous rupture computer simulations of earthquakes allow us to include everything that we know, or think that we know, about earthquake dynamics and to test these ideas against earthquake observations.
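
    For concreteness, a common friction constitutive choice in such spontaneous-rupture benchmarks is linear slip weakening; the sketch below uses illustrative parameter values rather than those of any specific SCEC/USGS test problem:

        # Linear slip-weakening friction: fault strength drops from a static
        # to a dynamic level over a critical slip distance Dc. Parameter
        # values here are illustrative only.
        import numpy as np

        def fault_strength(slip, sigma_n, mu_s=0.677, mu_d=0.525, Dc=0.4):
            """Shear strength [Pa] at effective normal stress sigma_n [Pa]."""
            mu = np.where(slip < Dc, mu_s - (mu_s - mu_d) * slip / Dc, mu_d)
            return mu * sigma_n

        slip = np.linspace(0.0, 1.0, 5)             # cumulative slip [m]
        print(fault_strength(slip, sigma_n=120e6))  # strength weakens with slip

    Rupture then propagates spontaneously wherever the elastodynamic shear stress reaches this strength, which is the kind of behavior the verification exercise compares across codes.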

    Random field sampling for a simplified model of melt-blowing considering turbulent velocity fluctuations

    In melt-blowing, very thin liquid fiber jets are spun by high-velocity air streams. In the literature there is a clear, unresolved discrepancy between the measured and the computed jet attenuation. In this paper we verify numerically that the turbulent velocity fluctuations, which exert a random aerodynamic drag on the fiber jets and which have been neglected so far, are the crucial effect needed to close this gap. For this purpose, we model the velocity fluctuations as vector Gaussian random fields on top of a k-epsilon turbulence description and develop an efficient sampling procedure. By taking advantage of the special covariance structure, the sampling effort is linear in the discretization, which makes the realization feasible.
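
    A hedged sketch of the linear-cost idea: for a stationary Gaussian process with exponential covariance, the Markov (AR(1)) structure allows sequential sampling in O(n). The paper's vector-valued fields and k-epsilon-based covariances are more elaborate; all names and parameters below are illustrative:

        # Sample a stationary Gaussian process with covariance
        # C(t) = s^2 * exp(-|t| / L) using its exact AR(1) recursion,
        # so the cost is linear in the number of sample points.
        import numpy as np

        def sample_exponential_gp(n, dt, s=1.0, L=1.0, seed=None):
            rng = np.random.default_rng(seed)
            a = np.exp(-dt / L)                    # one-step correlation
            x = np.empty(n)
            x[0] = s * rng.standard_normal()
            for i in range(1, n):                  # O(n): one update per point
                x[i] = a * x[i - 1] + s * np.sqrt(1.0 - a * a) * rng.standard_normal()
            return x

        u_fluct = sample_exponential_gp(10_000, dt=1e-3, s=0.3, L=0.05, seed=42)
        print(u_fluct[:5])

    A general covariance would require an O(n^3) Cholesky factorization; exploiting the structure is what keeps long fiber trajectories tractable.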

    Pressure study of the noncentrosymmetric 5d-electron superconductors CaMSi3 (M= Ir, Pt)

    We report a hydrostatic pressure study of the Rashba-type noncentrosymmetric superconductors CaMSi3 (M = Ir, Pt). The temperature dependence of the resistivity of both compounds is well described by the conventional Bloch-Grüneisen formalism at each pressure, suggesting that electron-phonon scattering is dominant in these compounds. The superconducting critical temperature Tc decreases with pressure at ∼0.2 K/GPa from 0.41 GPa up to 2 GPa for both compounds. This behavior of Tc can be explained by a modest decrease of the density of states within conventional BCS theory.
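
    For reference, the Bloch-Grüneisen resistivity invoked above is conventionally written as follows (a standard form; the paper's fit parameters and conventions may differ):

        % Phonon-limited resistivity; \Theta_D is the Debye temperature,
        % \rho_0 the residual resistivity, and A a material constant.
        \rho(T) = \rho_0
          + A \left(\frac{T}{\Theta_D}\right)^{5}
            \int_{0}^{\Theta_D/T} \frac{x^{5}}{(e^{x}-1)(1-e^{-x})}\, dx

    At low temperatures this gives the characteristic ρ ∝ T^5 behavior, while at high temperatures it crosses over to ρ ∝ T.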

    A human biomonitoring (HBM) Global Registry Framework: Further advancement of HBM research following the FAIR principles.

    Data generated by the rapidly evolving human biomonitoring (HBM) programmes are providing invaluable opportunities to support and advance regulatory risk assessment and management of chemicals in occupational and environmental health domains. However, heterogeneity across studies, in terms of design, terminology, biomarker nomenclature, and data formats, limits our capacity to compare and integrate data sets retrospectively (reuse). Registration of studies is common for clinical trials; for HBM studies, however, the study designs and resulting data collections cannot be traced easily. We argue that an HBM Global Registry Framework (HBM GRF) could be the solution to several of the challenges hampering the (re)use of HBM (meta)data. The aim is to develop a global, host-independent HBM registry framework based on the use of harmonised open-access protocol templates, covering the design and conduct of an HBM study through to the use and possible reuse of the resulting HBM (meta)data. This framework should apply the FAIR (Findable, Accessible, Interoperable and Reusable) principles as a core data-management strategy to enable the (re)use of HBM (meta)data to its full potential throughout the data value chain. Moreover, we believe that implementation of the FAIR principles is a fundamental enabler for digital transformation within environmental health. The HBM GRF would encompass internationally harmonised and agreed open-access templates for HBM study protocols, together with structured web-based functionalities to deposit, find, and access harmonised protocols of HBM studies. Registration of HBM studies using the HBM GRF is anticipated to increase the FAIRness of the resulting (meta)data. Harmonisation of existing data sets could also be performed retrospectively. As a consequence, the data-wrangling activities needed to make data ready for analysis would be minimised. In addition, this framework would enable the (inter)national HBM community to trace new HBM studies already in the planning phase and their results once finalised. The HBM GRF could also serve as a platform enhancing communication between scientists, risk assessors, and risk managers/policy makers. The planned European Partnership for the Assessment of Risks from Chemicals (PARC) will work along these lines, building on the experience obtained in previous joint European initiatives, and could therefore provide a first demonstration of the essential functionalities in the development of the HBM GRF.
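
    Purely as an illustration of the kind of harmonised, machine-actionable study record such a registry could hold (every field name below is hypothetical, not taken from any HBM GRF or PARC specification):

        # Hypothetical registry record for an HBM study; field names and
        # values are invented for illustration only.
        from dataclasses import dataclass, field

        @dataclass
        class HBMStudyRecord:
            study_id: str                  # globally unique, resolvable identifier
            title: str
            protocol_url: str              # link to the open-access protocol
            biomarkers: list[str] = field(default_factory=list)  # harmonised names
            matrices: list[str] = field(default_factory=list)    # e.g. blood, urine
            license: str = "CC-BY-4.0"     # explicit reuse terms (the "R" in FAIR)

        record = HBMStudyRecord(
            study_id="hbm-grf:0001",
            title="Example cohort study",
            protocol_url="https://example.org/protocols/0001",
            biomarkers=["cadmium", "bisphenol A"],
            matrices=["urine"],
        )
        print(record)

    Findability and interoperability would come from the identifier and the harmonised vocabularies; accessibility and reusability from the open protocol link and the explicit license.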