
    Subduction Duration and Slab Dip

    The dip angles of slabs are among the clearest characteristics of subduction zones, but the factors that control them remain obscure. Here, slab dip angles and subduction parameters, including subduction duration, the nature of the overriding plate, slab age, and convergence rate, are determined for 153 present-day transects along subduction zones. We present a comprehensive tabulation of subduction duration based on isotopic ages of arc initiation and on stratigraphic, structural, plate tectonic, and seismic indicators of subduction initiation, giving two ages for each subduction zone: a long‐term age and a reinitiation age. Using cross correlation and multivariate regression, we find that (1) subduction duration is the primary parameter controlling slab dip, with slabs tending to dip more shallowly at subduction zones that have existed longer; (2) the long‐term age explains variation in shallow dip better than the reinitiation age; (3) the nature of the overriding plate could influence the shallow dip angle, with slabs below continents tending to dip more shallowly; (4) slab age contributes to slab dip, with younger slabs having steeper shallow dips; and (5) the relations between slab dip and subduction parameters are depth dependent, with the ability of subduction duration and overriding plate nature to explain the observed variation decreasing with depth. The analysis emphasizes the importance of subduction history and the long‐term regional state of a subduction zone in determining slab dip, and is consistent with mechanical models of subduction.
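
    As a rough illustration of the multivariate-regression step described above, the sketch below fits shallow slab dip against subduction duration, slab age, convergence rate, and overriding-plate nature. The data, variable names, and coefficients are synthetic placeholders assumed for this sketch, not the paper's dataset or code.

    # Minimal sketch: regressing slab dip on subduction parameters across
    # transects. All data here are synthetic stand-ins.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 153  # one row per subduction-zone transect, as in the study

    # Hypothetical predictors (names and units are assumptions):
    duration = rng.uniform(5, 200, n)      # subduction duration, Myr
    slab_age = rng.uniform(10, 180, n)     # age of subducting lithosphere, Myr
    v_conv = rng.uniform(20, 100, n)       # convergence rate, mm/yr
    continental = rng.integers(0, 2, n)    # overriding plate: 1 continent, 0 ocean

    # Synthetic response standing in for measured shallow dip (degrees)
    dip = (45 - 0.08 * duration - 0.05 * slab_age - 5 * continental
           + 0.02 * v_conv + rng.normal(0, 5, n))

    X = np.column_stack([duration, slab_age, v_conv, continental])
    model = LinearRegression().fit(X, dip)
    for name, coef in zip(["duration", "slab_age", "v_conv", "continental"],
                          model.coef_):
        print(f"{name:12s} {coef:+.3f} deg per unit")
    print(f"R^2 = {model.score(X, dip):.2f}")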

    Functional inspiratory muscle training (IMT) improves load carriage performance greater than traditional IMT techniques

    The addition of external thoracic loads is common in occupational groups such as the military. Positioning load on the thorax poses a unique challenge to breathing mechanics and causes respiratory muscle fatigue (RMF) following exercise. IMT techniques improve exercise performance and attenuate RMF in both healthy and athletic populations. However, in occupational groups, despite increasing inspiratory muscle strength and performance, IMT has so far failed to attenuate RMF, potentially limiting its performance benefit. It has been suggested that functional inspiratory muscle training (IMTF) may elicit performance adaptations beyond those of traditional IMT techniques, because it targets the inspiratory muscles throughout the length-tension range adopted during exercise.

    Automatic classification of field-collected dinoflagellates by artificial neural network

    Automatic taxonomic categorisation of 23 species of dinoflagellates was demonstrated using field-collected specimens. These dinoflagellates have been responsible for the majority of toxic and noxious phytoplankton blooms in the coastal waters of the European Union in recent years and have a severe impact on the aquaculture industry. The performance of human 'expert' ecologists/taxonomists in identifying these species was compared to that of two artificial neural network classifiers (multilayer perceptron and radial basis function networks) and two other statistical techniques, k-Nearest Neighbour and Quadratic Discriminant Analysis. The neural network classifiers outperformed the classical statistical techniques: over extended trials, the human experts averaged 85%, while the radial basis function network achieved a best performance of 83%, the multilayer perceptron 66%, k-Nearest Neighbour 60%, and Quadratic Discriminant Analysis 56%.
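
    A compact sketch of this kind of classifier comparison, using scikit-learn. The features are synthetic stand-ins for the study's specimen measurements, and the paper's radial basis function network has no off-the-shelf scikit-learn equivalent, so it is omitted here; only the other three classifier families are compared.

    # Hedged sketch: comparing classifier families on 23-class feature data.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    # Synthetic stand-in for morphometric features of 23 dinoflagellate species
    X, y = make_classification(n_samples=2000, n_features=30, n_informative=20,
                               n_classes=23, n_clusters_per_class=1,
                               random_state=0)

    classifiers = {
        "Multilayer perceptron": MLPClassifier(hidden_layer_sizes=(64,),
                                               max_iter=1000, random_state=0),
        "k-Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
        "Quadratic Discriminant Analysis": QuadraticDiscriminantAnalysis(),
    }

    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
        print(f"{name:32s} mean accuracy = {scores.mean():.2f}")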

    Treating and Preventing Influenza in Aged Care Facilities: A Cluster Randomised Controlled Trial


    Testing Reactive Probabilistic Processes

    We define a testing equivalence in the spirit of De Nicola and Hennessy for reactive probabilistic processes, i.e. for processes where the internal nondeterminism is due to random behaviour. We characterize the testing equivalence in terms of ready-traces. From the characterization it follows that the equivalence is insensitive to the exact moment at which an internal probabilistic choice occurs, a property inherited from the original testing equivalence of De Nicola and Hennessy. We also show decidability of the testing equivalence for finite systems for which the complete model may not be known.
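
    To make the objects concrete, here is a minimal sketch of a reactive probabilistic transition system and the probability it assigns to a ready-trace (a sequence alternating ready sets and actions). The example system and representation are assumptions of this sketch, not the paper's formalism.

    # delta[state][action] = list of (next_state, probability); an action is
    # enabled in a state iff it appears as a key. Example system (assumed):
    delta = {
        "s0": {"a": [("s1", 0.5), ("s2", 0.5)]},
        "s1": {"b": [("s3", 1.0)]},
        "s2": {"c": [("s3", 1.0)]},
        "s3": {},
    }

    def ready(state):
        """Ready set: the actions enabled in `state`."""
        return frozenset(delta[state])

    def ready_trace_prob(state, trace):
        """Probability of observing `trace`, a list alternating ready sets
        and actions, starting from `state`."""
        if not trace:
            return 1.0
        rset, action, rest = trace[0], trace[1], trace[2:]
        if ready(state) != rset or action not in delta[state]:
            return 0.0
        return sum(p * ready_trace_prob(s2, rest)
                   for s2, p in delta[state][action])

    # The ready-trace  {a} a {b} b  has probability 0.5: the internal
    # probabilistic choice after `a` reaches s1 (ready set {b}) half the time.
    print(ready_trace_prob("s0", [frozenset("a"), "a", frozenset("b"), "b"]))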

    Dark matter and non-Newtonian gravity from General Relativity coupled to a fluid of strings

    An exact solution of Einstein's field equations for a point mass surrounded by a static, spherically symmetric fluid of strings is presented. The solution is singular at the origin. Near the string-cloud limit there is a $1/r$ correction to Newton's force law. It is noted that at large distances and small accelerations, this law coincides with the phenomenological force law introduced by Milgrom to explain the flat rotation curves of galaxies without introducing dark matter. When interpreted in the context of a cosmological model with a string fluid, the new solution naturally explains why the critical acceleration of Milgrom is of the same order of magnitude as the Hubble parameter.
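
    The correspondence claimed above can be written schematically; the coefficient matching below is an illustrative sketch, and the exact expressions follow from the solution in the paper.

    \begin{align*}
      g(r) &= \frac{GM}{r^{2}} + \frac{\alpha}{r}
        && \text{point mass plus the $1/r$ string-fluid correction} \\
      g_{\mathrm{M}}(r) &= \sqrt{g_{N}(r)\,a_{0}} = \frac{\sqrt{GM\,a_{0}}}{r}
        && \text{Milgrom's law in the regime } g \ll a_{0} \\
      \alpha &\sim \sqrt{GM\,a_{0}}
        \;\Rightarrow\; v^{2}(r) = g(r)\,r \to \sqrt{GM\,a_{0}}
        && \text{flat rotation curves at large } r \\
      a_{0} &\sim c\,H_{0}
        && \text{the coincidence the string-fluid model explains}
    \end{align*}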

    Real time THz imaging - opportunities and challenges for skin cancer detection

    It was first suggested twenty years ago that terahertz (THz) imaging has the potential to detect skin cancer. Since then, THz instrumentation has improved significantly: real-time broadband THz imaging is now possible, and robust protocols for measuring living subjects have been developed. Here we discuss the progress that has been made and highlight the remaining challenges in applying THz imaging to skin cancer detection.

    Primordial black hole constraints in cosmologies with early matter domination

    Moduli fields, a natural prediction of supergravity and superstring-inspired supersymmetry theories, may lead to a prolonged period of matter domination in the early Universe. This can be observationally viable provided the moduli decay early enough to avoid harming nucleosynthesis. If primordial black holes form, they would be expected to do so before or during this matter-dominated era. We examine the extent to which the standard primordial black hole constraints are weakened in such a cosmology. Permitted mass fractions of black holes at formation are of order $10^{-8}$, rather than the usual $10^{-20}$ or so. If the black holes form from density perturbations with a power-law spectrum, its spectral index is limited to $n \lesssim 1.3$, rather than the $n \lesssim 1.25$ obtained in the standard cosmology.
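
    To see schematically why a vastly weaker bound on the formation mass fraction moves the spectral-index limit only slightly, consider the standard Press-Schechter-style estimate. This is a sketch under common conventions, not the paper's full matter-domination calculation.

    \begin{align*}
      \beta(M) &\sim \operatorname{erfc}\!\left(\frac{\delta_{c}}{\sqrt{2}\,\sigma(M)}\right),
      \qquad \sigma(M) \propto M^{(1-n)/4}, \\
      \sigma_{\max}(M) &= \frac{\delta_{c}}{\sqrt{2}\,\operatorname{erfc}^{-1}\!\beta_{\max}},
    \end{align*}

    Because $\beta$ depends exponentially on $\delta_{c}/\sigma$, raising $\beta_{\max}$ from $\sim 10^{-20}$ to $\sim 10^{-8}$ increases the allowed $\sigma(M)$, and hence the allowed $n$, only modestly, which is why the limit relaxes only from $n \lesssim 1.25$ to $n \lesssim 1.3$.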

    Do we really need Confidence Intervals in the new statistics?

    This paper compares the use of confidence intervals (CIs) and a sensitivity analysis called the number needed to disturb (NNTD) in the analysis of research findings expressed as 'effect' sizes. Using 1,000 simulations of randomised trials with up to 1,000 cases in each, the paper shows that the two approaches give very similar outcomes, and each is highly predictable from the other. CIs are supposed to be a measure of likelihood or uncertainty in the results, showing a range of possible effect sizes that could have been produced by random sampling variation alone. NNTD is supposed to be a measure of the robustness of the effect size to any variation, including that produced by missing data. Given that they are largely equivalent and interchangeable under the conditions tested here, the paper suggests that both are really measures of robustness. It concludes that NNTD is to be preferred because it requires far fewer assumptions, is more tolerant of missing data, is easier to explain, and directly addresses the key question of whether the underlying effect size is zero or not.
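
    A minimal sketch of the kind of simulation described above, comparing a 95% CI for a standardized effect size with an NNTD-style robustness count. The NNTD operationalization used here (effect size multiplied by the number of cases per arm) is an assumption of this sketch; the paper's exact definition and trial setup may differ.

    # Hedged sketch: 1,000 simulated two-arm trials; compare CI vs NNTD.
    import numpy as np

    rng = np.random.default_rng(1)
    true_d, results = 0.3, []

    for _ in range(1000):                      # 1,000 simulated trials
        n = rng.integers(20, 1001)             # up to 1,000 cases per arm
        control = rng.normal(0.0, 1.0, n)
        treat = rng.normal(true_d, 1.0, n)
        sp = np.sqrt((control.var(ddof=1) + treat.var(ddof=1)) / 2)
        d = (treat.mean() - control.mean()) / sp     # Cohen's d
        se = np.sqrt(2 / n + d**2 / (4 * n))         # approx. standard error of d
        ci_excludes_zero = (d - 1.96 * se) > 0
        nntd = abs(d) * n                            # assumed NNTD formula
        results.append((ci_excludes_zero, nntd))

    ci_sig = np.array([r[0] for r in results])
    nntd_vals = np.array([r[1] for r in results])
    # The two criteria flag broadly the same trials as 'secure' findings:
    print("CI excludes 0 in", ci_sig.mean(), "of trials")
    print("median NNTD overall:", round(np.median(nntd_vals), 1),
          "| when CI excludes 0:", round(np.median(nntd_vals[ci_sig]), 1))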

    Helium Photodisintegration and Nucleosynthesis: Implications for Topological Defects, High Energy Cosmic Rays, and Massive Black Holes

    We consider the production of $^3$He and $^2$H by $^4$He photodisintegration initiated by non-thermal energy releases during early cosmic epochs. We find that this process cannot be the predominant source of primordial $^2$H, since it would result in anomalously high $^3$He/D ratios, in conflict with standard chemical evolution assumptions. We apply this fact to constrain topological defect models of highest-energy cosmic ray (HECR) production. Such models have been proposed as possible sources of ultrahigh-energy particles and gamma rays with energies above $10^{20}$ eV. The constraints on these models derived from $^4$He photodisintegration are compared to corresponding limits from spectral distortions of the cosmic microwave background radiation (CMBR) and from the observed diffuse gamma-ray background. It is shown that for reasonable primary particle injection spectra, superconducting cosmic strings, unlike ordinary strings or annihilating monopoles, cannot produce the HECR flux at the present epoch without violating at least the $^4$He-photodisintegration bound. The constraint from the diffuse gamma-ray background rules out the dominant production of HECR by the decay of Grand Unification particles in models with cosmological evolution, assuming standard fragmentation functions. Constraints on massive black hole induced photodisintegration are also discussed.
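
    The logic of the $^3$He/D argument above can be stated symbolically; the yields $Y_3$, $Y_2$ and the observational bound are left generic here, and the quantitative inputs are in the paper.

    \begin{align*}
      &{}^{4}\mathrm{He} + \gamma \to {}^{3}\mathrm{He} + n
        \ (\text{or } {}^{3}\mathrm{H} + p),
      \qquad {}^{4}\mathrm{He} + \gamma \to {}^{2}\mathrm{H} + \ldots, \\
      &R \equiv \frac{Y_{3}}{Y_{2}} > 1
        \quad \text{(the mass-3 channels dominate)}, \\
      &\text{so if a fraction } f \text{ of primordial D were photoproduced,}
      \quad \frac{{}^{3}\mathrm{He}}{\mathrm{D}} \gtrsim f\,R,
    \end{align*}

    which for $f \to 1$ exceeds the $^3$He/D ratio allowed by standard chemical evolution; hence photodisintegration cannot dominate deuterium production.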
    • 

    corecore