    THE RELATIONSHIP BETWEEN PHYSICAL FACTORS TO AGILITY PERFORMANCE IN COLLEGIATE TENNIS PLAYERS

    Ian McKinley and Dr. Kimitake Sato, Department of Exercise and Sport Sciences, College of Education, East Tennessee State University, Johnson City, Tennessee. Tennis players change direction numerous times within a match, making agility an important skill for them to possess. The purpose of this study was to investigate the significance of physical factors as they relate to agility performance in collegiate tennis players. The physical factors examined were anthropometric measurements; isometric peak force, rate of force development, and force scale; countermovement jump performance; and squat jump performance. The participants were seventeen (Male: N = 8, Female: N = 9) NCAA Division I collegiate tennis players. Anthropometric measurements included height, body mass, and body fat percentage. Strength was measured by an isometric mid-thigh pull, and power was measured by vertical jumps. Significance was set at 0.05 for statistical analysis. Correlation analysis showed that the rate of force development from the isometric mid-thigh pull was significantly related to agility performance (p = 0.033). In conclusion, anthropometric measurements and vertical jumps have little relationship to agility performance, but the rate of force development in the isometric mid-thigh pull has a statistically significant relationship to agility performance in tennis players, indicating that agility movement is influenced by how quickly force can be developed against the ground.
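
    A correlation analysis of this kind is straightforward to reproduce with standard statistical tooling. The sketch below is a minimal illustration using made-up rate-of-force-development and agility-time values, not the study's data.

```python
# Minimal sketch of a Pearson correlation between isometric mid-thigh pull
# rate of force development (RFD) and an agility test time.  The numbers are
# hypothetical placeholders, not the study's measurements.
from scipy import stats

rfd_n_per_s = [5200, 6100, 4800, 7300, 6900, 5500, 6400, 7100]
agility_time_s = [2.45, 2.31, 2.52, 2.18, 2.22, 2.40, 2.28, 2.20]

r, p = stats.pearsonr(rfd_n_per_s, agility_time_s)
print(f"r = {r:.3f}, p = {p:.4f}")   # compare p against the 0.05 alpha level
```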

    Tracer Applications of Radiocaesium in a Coastal Marine Environment

    In this study radiocaesium is confirmed as a versatile tracer and its use is demonstrated in the investigation of a wide range of processes occurring in the coastal marine environment. By utilising an analogue model, radiocaesium transport from Windscale to the North Channel may be characterised by a 'residence half-time' of ~12 months compounded with a 'lag time' of ~6 months. This transport is, however, shown to be variable - in 1977 a greatly increased Atlantic influx to the Irish Sea through the St. George's Channel was evident, resulting in an accelerated radiocaesium transport rate northwards through the North Channel. By further use of this model, the waters of the Clyde Sea Area may be shown to have a 'residence half-time' of ~3 months, with the lag between the North Channel site and the Clyde being ~1 month. Additionally, ~40% of the northwards water flux from the Irish Sea may be shown to pass through the Clyde Sea Area. Transport from the North Channel to the Minch is also shown to change considerably between 1976 (advection rate 5 km/day) and 1977 (advection rate ~1.5 km/day), associated with a marked widening of the coastal water plume in the Hebridean Sea Area during this period.
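
    As a rough, illustrative reading of how a 'residence half-time' and a 'lag time' combine in an analogue model of this kind, the sketch below propagates a single month's discharge downstream using the approximate figures quoted above; it is a toy reconstruction under stated assumptions, not the model fitted in the thesis.

```python
import math

# Toy analogue-model response: a pulse of activity released at the source is
# assumed to arrive after the 'lag time' and then decline with first-order
# loss set by the 'residence half-time'.  Parameter values are the rough
# figures quoted in the abstract, not fitted values.
def pulse_response(months, half_time_months, lag_months):
    k = math.log(2) / half_time_months          # first-order loss rate, 1/month
    return [0.0 if t < lag_months else math.exp(-k * (t - lag_months))
            for t in range(months)]

windscale_to_north_channel = pulse_response(36, half_time_months=12.0, lag_months=6)
north_channel_to_clyde = pulse_response(36, half_time_months=3.0, lag_months=1)

print([round(x, 2) for x in windscale_to_north_channel[:12]])
```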

    Synthesis and mass spectrometry of phenylboronate esters of polyhydric alcohols

    1. Phenylboronates of acyclic diols, triols and polyhydric alcohols have been synthesised. 2. Mass spectrometry of phenylboronates of acyclic diols has revealed that the following four processes occur as a result of ionisation by electron impact: (a) elimination of exocyclic groups by cleavage of C-C bonds; (b) skeletal rearrangement giving rise to highly unsaturated hydrocarbon ions containing 7-10 carbon atoms; (c) elimination of oxo molecules; (d) a double elimination exclusive to six-membered phenylboronate rings, which provides a means of detecting this structural unit in compounds of hitherto unknown structure. 3. The mass spectra of phenylboronates of triols, tetritols, pentitols and hexitols were interpreted on the basis of the processes outlined above, and structures assigned accordingly. 4. The hitherto unsuccessful methylation of hydroxyl groups in phenylboronates has been achieved using diazomethane and boron trifluoride etherate as reagents. 5. Methylation of phenylboronates followed by hydrolysis of the ester, acetylation, and analysis of the product by gas-liquid chromatography combined with mass spectrometry revealed that the phenylboronates of triols are, in fact, mixtures of different boronate ring modifications. The compositions of such mixtures have been compared and rationalised in terms of the structures.

    Microbiology in nuclear waste disposal: interfaces and reaction fronts

    It is now generally acknowledged that microbial populations will be present within nuclear waste repositories and that the consequences of such activity on repository performance must be assessed. Various modelling approaches - based either on mass balance/thermodynamics or on kinetics - have been developed to provide scoping estimates of the possible development of these populations. Past work has focused on particular areas of the repository which can be considered relatively homogeneous and hence can be represented by some kind of 'box' or 'mixing tank'. In reality, however, waste repositories include a range of engineering materials (steel, concrete, etc.) which are emplaced at depth in a rock formation. Strong chemical gradients - of the type which may be exploited by lithoautotrophic microbial populations - are likely to be found at the contacts between different materials and at the interface between the engineered structures and the host rock. Over the long timescales considered, solute transport processes will cause the locations of strong chemical gradients to move, forming reaction fronts. The high-pH plume resulting from the leaching of cement/concrete in some repository types is a particularly important example of such a reaction front. Redox fronts, which may occur in different areas of all kinds of repositories, also play an important role and would be locations where microbial activity is likely to be significant. In this paper, the key microbial processes expected at (or around) interfaces and fronts will be discussed, with particular emphasis on the development of quantitative models. The applicability of the models used will be tested by considering similar fronts which can be found in natural systems.
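
    To make the 'mixing tank' idea concrete, the sketch below shows one possible minimal kinetic box model: a single well-mixed compartment in which microbial biomass grows on a limiting substrate with Monod kinetics and decays at a first-order rate. It is an assumed illustration of how such a scoping calculation might be set up, not a model taken from any repository assessment.

```python
# Minimal 'mixing tank' kinetic sketch: one well-mixed box, Monod growth on a
# single limiting substrate, first-order biomass decay.  All parameter values
# are illustrative placeholders, not assessment data.
def simulate_box(biomass, substrate, mu_max=0.1, k_s=1.0, yield_coeff=0.5,
                 decay=0.01, dt=0.1, steps=10_000):
    for _ in range(steps):
        mu = mu_max * substrate / (k_s + substrate)   # Monod specific growth rate
        growth = mu * biomass
        biomass += (growth - decay * biomass) * dt
        substrate = max(substrate - (growth / yield_coeff) * dt, 0.0)
    return biomass, substrate

final_biomass, final_substrate = simulate_box(biomass=0.01, substrate=10.0)
print(f"biomass = {final_biomass:.3f}, substrate remaining = {final_substrate:.3f}")
```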

    The Active CryoCubeSat Project: Testing and Preliminary Results

    The Center for Space Engineering at Utah State University and NASA's Jet Propulsion Laboratory have jointly developed an active thermal control technology to better manage thermal loads and enable cryogenic instrumentation for CubeSats. The Active CryoCubeSat (ACCS) project utilizes a two-stage active thermal control architecture, with the first stage consisting of a single-phase mechanically pumped fluid loop which circulates coolant between a cold plate rejection heat exchanger and a deployed radiator. The second stage relies upon a miniature tactical cryocooler, which provides sub-110 K thermal management. This research details the experimental setup for a ground-based prototype demonstration that was tested in a relevant thermal vacuum environment. The preliminary results, which include the input power required by the system, the rejection and environmental temperatures, and the total thermal dissipation capability of the ACCS system, are presented along with a basic analysis and a discussion of the results.
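
    The first-stage behaviour described above is governed by two simple balances: the heat picked up by the pumped coolant loop and the heat the deployed radiator can reject. The sketch below states both relations with placeholder numbers; none of the values are ACCS test results.

```python
# Back-of-envelope relations for a pumped-fluid-loop / radiator stage like the
# one described above.  All numeric values are illustrative placeholders, not
# ACCS test data.
STEFAN_BOLTZMANN = 5.670e-8        # W m^-2 K^-4

def loop_heat_pickup(m_dot_kg_s, cp_j_kg_k, t_out_k, t_in_k):
    """Heat carried by the coolant loop: Q = m_dot * cp * (T_out - T_in)."""
    return m_dot_kg_s * cp_j_kg_k * (t_out_k - t_in_k)

def radiator_rejection(area_m2, emissivity, t_radiator_k, t_environment_k):
    """Net radiative rejection from a flat radiator to its surroundings."""
    return emissivity * STEFAN_BOLTZMANN * area_m2 * (t_radiator_k**4 - t_environment_k**4)

q_loop = loop_heat_pickup(m_dot_kg_s=0.002, cp_j_kg_k=4180.0, t_out_k=305.0, t_in_k=295.0)
q_rad = radiator_rejection(area_m2=0.1, emissivity=0.9, t_radiator_k=300.0, t_environment_k=200.0)
print(f"loop pickup ~ {q_loop:.0f} W, radiator rejection ~ {q_rad:.0f} W")
```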

    Emulation and History Matching using the hmer Package

    Modelling complex real-world situations such as infectious diseases, geological phenomena, and biological processes can present a dilemma: the computer model (referred to as a simulator) needs to be complex enough to capture the dynamics of the system, but each increase in complexity increases the evaluation time of such a simulation, making it difficult to obtain an informative description of parameter choices that would be consistent with observed reality. While methods for identifying acceptable matches to real-world observations exist, for example optimisation or Markov chain Monte Carlo methods, they may result in non-robust inferences or may be infeasible for computationally intensive simulators. The techniques of emulation and history matching can make such determinations feasible, efficiently identifying regions of parameter space that produce acceptable matches to data while also providing valuable information about the simulator's structure, but the mathematical considerations required to perform emulation can present a barrier for makers and users of such simulators compared to other methods. The hmer package provides an accessible framework for using history matching and emulation on simulator data, leveraging the computational efficiency of the approach while enabling users to easily match to, visualise, and robustly predict from their complex simulators. Comment: 40 pages, 11 figures; submitted to Journal of Statistical Software; author order corrected.
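
    The central quantity behind emulation-driven history matching is the implausibility measure, which compares an emulator's prediction against an observation relative to the combined uncertainties. The sketch below is a plain restatement of that standard formula; it deliberately does not use the hmer API, and all numbers are hypothetical.

```python
import math

# Generic single-output implausibility used in history matching:
#   I(x) = |z - E[f(x)]| / sqrt(Var_emulator + Var_observation + Var_discrepancy)
# A candidate input is usually retained ("non-implausible") when I(x) falls
# below a cutoff of about 3.  This is a plain restatement of the standard
# formula, not code from the hmer package.
def implausibility(obs, obs_var, emulator_mean, emulator_var, discrepancy_var=0.0):
    return abs(obs - emulator_mean) / math.sqrt(emulator_var + obs_var + discrepancy_var)

i = implausibility(obs=0.25, obs_var=0.01**2, emulator_mean=0.31, emulator_var=0.02**2)
print(f"I = {i:.2f}, retained: {i < 3.0}")
```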

    Bayesian history matching of complex infectious disease models using emulation: A tutorial and a case study on HIV in Uganda

    Advances in scientific computing have allowed the development of complex models that are being routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real-world data is greatly hindered both by large numbers of input and output parameters and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study where we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22-input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was times smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator as well as to account for correlations between outputs.
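
    The iterative, wave-by-wave space reduction described here can be illustrated with a toy example. The sketch below uses a one-input stand-in 'simulator' and a hypothetical target; it assumes the emulator uncertainty tightens in later waves, as happens when emulators are retrained on runs concentrated in the remaining space, and reflects nothing of the actual 22-input HIV simulator or its calibration data.

```python
import math
import random

# Toy wave-by-wave history matching on a one-input stand-in "simulator".
# Everything here (the function, the target, the uncertainties) is hypothetical.
random.seed(0)

def simulator(x):
    return math.sin(3 * x) + 0.5 * x                  # cheap stand-in model

TARGET, TARGET_SD = 1.2, 0.05                         # hypothetical observation

def implausibility(x, emulator_sd):
    return abs(TARGET - simulator(x)) / math.sqrt(TARGET_SD**2 + emulator_sd**2)

lo, hi = 0.0, 3.0                                     # initial input range
# Emulator uncertainty is assumed to shrink in later waves, tightening the
# non-implausible region each time.
for wave, emulator_sd in enumerate([0.3, 0.1, 0.03], start=1):
    candidates = [random.uniform(lo, hi) for _ in range(5000)]
    kept = [x for x in candidates if implausibility(x, emulator_sd) < 3.0]
    lo, hi = min(kept), max(kept)                     # bounding interval of kept points
    print(f"wave {wave}: kept {100 * len(kept) / len(candidates):.1f}% of candidates, "
          f"range [{lo:.2f}, {hi:.2f}]")
```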