208 research outputs found

    The effect of frequency on psychophysical responses to lifting.

    One of the main goals of ergonomics is to establish task requirements that prevent injury. This is of particular importance in manual materials handling (MMH), as manual lifting represents a major cause of injury to workers and a significant cost to industry. In spite of established guidelines, there is evidence that the majority of injuries are caused by overexertion (Ayoub, 1992). In support of Oborne (1987), Charteris and Scott (1993) have argued that frequency is one of several task-related variables that influence the demands of a lifting task. Clearly, all other factors being equal, as the pace of the work is increased, the load should decrease. Fifteen male volunteers (average age 21.3 years) took part in this study, which investigated psychophysical responses to a lifting task at three frequencies - 5, 10 and 15 lifts per minute - with a total task duration of 15 minutes. Heart rate (HR) and Ratings of Perceived Exertion (RPE) were recorded, and the tasks were analysed using LIFTRISK and the NIOSH method. There were significant differences between successive frequencies for all data recorded. As expected, HR and RPE increased with lifting frequency. There was no significant correlation between HR and RPE for any of the conditions, although correlations were higher at low work intensities. Central RPE values were somewhat lower than Overall values; this finding lends support to the contention that Central factors do not play as important a role in perceptions of exertion as originally thought (Pandolf, 1982; Olivier and Scott, 1993). In conclusion, this study investigated the effect of frequency on psychophysical responses for a task with inherently high-risk characteristics. As lift frequency increased, the correlation between HR and RPE decreased. With the low subject numbers precluding a firm conclusion, this study tentatively proposes that caution be exercised when using RPE for self-determination of task limitations in an MMH task.
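
    The NIOSH analysis mentioned above is based on the revised NIOSH lifting equation, which reduces the recommended weight limit as lift frequency rises. A minimal Python sketch follows; the task geometry defaults are illustrative values, not taken from the study, and the frequency multipliers shown are the commonly published values for the three study frequencies in the shortest (up to 1 hour) duration column of the NIOSH tables:

```python
# Revised NIOSH lifting equation (Waters et al., 1993):
#   RWL = LC * HM * VM * DM * AM * FM * CM
# The frequency multiplier FM falls as lifts/min rise, so the
# recommended load drops with pace -- the trade-off studied above.

LC = 23.0  # load constant, kg

# Frequency multipliers for work durations up to 1 h (illustrative
# subset of the published table; 5, 10 and 15 lifts/min match the study).
FM_TABLE = {5: 0.80, 10: 0.45, 15: 0.28}

def rwl(freq, H=40.0, V=75.0, D=40.0, A=0.0, CM=1.0):
    """Recommended weight limit (kg) for a lift at `freq` lifts/min.

    H: horizontal distance (cm), V: vertical origin height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees),
    CM: coupling multiplier. Geometry defaults are hypothetical.
    """
    HM = 25.0 / H                     # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)  # vertical multiplier
    DM = 0.82 + 4.5 / D               # distance multiplier
    AM = 1.0 - 0.0032 * A             # asymmetry multiplier
    FM = FM_TABLE[freq]               # frequency multiplier
    return LC * HM * VM * DM * AM * FM * CM

for f in (5, 10, 15):
    print(f"{f:2d} lifts/min -> RWL = {rwl(f):.1f} kg")
```

    All other factors being equal, the limit at 15 lifts/min comes out at roughly a third of the limit at 5 lifts/min, mirroring the load-versus-pace trade-off the abstract describes.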

    Arm versus leg stressed differentiated ratings of perceived exertion.

    Although Borg's RPE scale is a universally accepted assessment of an individual's subjective rating of the physical demands of the task at hand, the differentiated scale does not appear to have been put to much practical use in working situations, particularly manual materials-handling tasks. The object of this project was to demonstrate the importance of differentiated RPE scores. A homogeneous sample of well-conditioned male marathon runners participated in two contrasting working conditions: in one, the emphasis was on the lower limbs (treadmill running), whereas the other was a predominantly upper-limb activity (arm-cranking ergometer). Subjects were required to work for 20 minutes, with heart rate and differentiated RPE scores collected every 2 minutes. The same working heart rate was recorded in both sessions, and yet there were significant increases in all RPE ratings in the arm-cranking condition. It is evident that, although the cardiovascular output was similar in both conditions, the subjects perceived the unfamiliar arm-cranking task to be more demanding. While recognising the value of RPE scores in assessing the worker's perception of the demands of working tasks, the results of this study strongly support the use of local, central and overall ratings of perceived demand in order to maximise the benefits of RPE scores when acquiring the total perception of the work demands.

    Generation of abstract programming interfaces from syntax definitions

    Using the ATerm library to represent tree-like data structures has become popular, especially amongst developers of, for example, lexical scanners, parsers, rewrite engines and model checkers. Practical experience with the ATerm library in the ASF+SDF Meta-Environment has shown that the development and maintenance of tools that access ATerms using handcrafted code is involved and very error-prone. Both the make-and-match paradigm and the direct manipulation of ATerms suffer from the fact that the programmer uses knowledge about the underlying structure (a.k.a. the signature) of the ATerm that represents the data type being accessed. Hard-wiring this knowledge in various tools makes it difficult to maintain the tools when the data structure changes. By lifting the data definition and its mapping to an ATerm representation away from the level of tool implementation, it becomes possible to generate a library of access functions on the data type, which can be used by all the tools that need access to it. The tools no longer manipulate the ATerm representation directly, but rather invoke methods described by the API of the library, thus abstracting from the fact that ATerms are used to implement the data type. This paper describes how an API and its implementation can be generated from a syntax definition of the data type. In particular, we describe how a grammar (in SDF) can be used to generate a library of access functions that manipulate the parse trees of terms over this syntax. Application of this technique in the ASF+SDF Meta-Environment has resulted in a spectacular elimination of 47% of the handwritten code, thus greatly improving both the maintainability of the tools and their flexibility with respect to changes in the parse tree format.
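
    The idea of generating accessors from a signature can be illustrated in miniature. The sketch below is hypothetical Python (the real tools emit code in C against the ATerm library), but it shows the same shift: client code calls generated make/match/get functions rather than hard-wiring knowledge of the term structure:

```python
# Hypothetical miniature of API generation from a data-type signature.
# Terms are plain tuples: ('plus', lhs, rhs). Instead of every tool
# pattern-matching on that shape directly, we generate an access API
# from the signature, so a change of representation touches one place.

SIGNATURE = {
    "plus": ["lhs", "rhs"],   # Expr ::= plus(Expr, Expr)
    "num":  ["value"],        # Expr ::= num(Int)
}

def generate_api(signature):
    """Generate make_X, is_X and get_<field> functions per constructor."""
    api = {}
    for name, fields in signature.items():
        def make(*args, _n=name, _f=fields):
            assert len(args) == len(_f), f"{_n} expects {len(_f)} args"
            return (_n,) + args
        def match(term, _n=name):
            return isinstance(term, tuple) and len(term) > 0 and term[0] == _n
        api[f"make_{name}"] = make
        api[f"is_{name}"] = match
        for i, field in enumerate(fields, start=1):
            api[f"get_{field}"] = lambda term, _i=i: term[_i]
    return api

api = generate_api(SIGNATURE)
t = api["make_plus"](api["make_num"](1), api["make_num"](2))
print(api["is_plus"](t))   # structural check goes through the API
print(api["get_lhs"](t))   # field access without knowing the tuple layout
```

    If the tuple representation later changed (say, to a class hierarchy), only `generate_api` would need updating; every tool built on the generated functions would keep working, which is the maintainability argument of the paper.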

    Compiling language definitions: the ASF+SDF compiler

    The ASF+SDF Meta-Environment is an interactive language development environment whose main application areas are definition of domain-specific languages, generation of program analysis and transformation tools, production of software renovation tools, and general specification and prototyping. It uses conditional rewrite rules to define the dynamic semantics and other tool-oriented aspects of languages, so the effectiveness of the generated tools is critically dependent on the quality of the rewrite rule implementation. The ASF+SDF rewrite rule compiler generates C code, thus taking advantage of C's portability and the sophisticated optimization capabilities of current C compilers as well as avoiding potential abstract machine interface bottlenecks. It can handle large (10 000+ rule) language definitions and uses an efficient run-time storage scheme capable of handling large (1 000 000+ node) terms. Term storage uses maximal subterm sharing (hash-consing), which turns out to be more effective in the case of ASF+SDF than in Lisp or SML. Extensive benchmarking has shown the time and space performance of the generated code to be as good as or better than that of the best current rewrite rule and functional language compilers.

    Nicotiana glauca poisoning in ostriches (Struthio camelus)

    Putative Nicotiana glauca (wild tobacco) poisoning was diagnosed in a flock of ostriches near Oudtshoorn, South Africa. Post mortem examinations (n = 7) were performed on ostriches (Struthio camelus) that had died. Suspicious leaf remnants (weighing 80–770 g), packed in a layer on top of other plant material, were carefully separated from the proventricular content and submitted for chemical determination of anabasine, the major toxic principle contained in this plant. A standard solid phase extraction method was used, followed by an optimised liquid chromatography-mass spectrometry procedure. Anabasine was detected in the leaf remnants (114–177 μg/g dry weight) removed from the proventriculus of the ostriches that succumbed, as well as in control N. glauca leaves (193 μg/g dry weight). The analytical methods used in this study revealed the presence of anabasine in the suspicious leaf remnants, indicating that the birds had been exposed to N. glauca and had died of this poisoning. http://www.journals.co.za/ej/ejour_savet.htm

    The evolution of mobile bed tests: a step towards the future of coastal engineering

    Coastal Engineering still presents significant levels of uncertainty, much larger for sediment transport and morphodynamics than for the driving hydrodynamics. Because of this, there is still a need for experimental research that addresses the water and sediment fluxes occurring at multiple scales in the near shore, some of which still lack universally accepted equations or closure sub-models. Large-scale mobile bed tests offer the possibility of obtaining undistorted results under controlled conditions, examining sediment transport and the associated bed evolution under a variety of wave and mean water level conditions. The present limitations of conventional observation equipment preclude a clear advance in knowledge or model calibration. However, new developments in opto-acoustic equipment should allow such an advance to take place, provided the new experimental equipment becomes more robust in parallel with a protocol for deployment and data processing. This paper presents the experimental approach to erosive and accretive beach dynamics, with emphasis on the accretive experiments. These accretive tests still present further uncertainties and sometimes cannot be explained with the present state of the art. Following this, there is a presentation of the novel development of an acoustic bed form and suspended sediment imager, able to monitor bed forms, near-bed sediment transport and their corresponding dynamics. The next section deals with an acoustic high-resolution concentration and velocity profiler that is able to infer even the elusive bed level, together with the near-bed concentrated sediment transport and the details of fluxes on the stoss and lee sides of moving bed forms. This is followed by a discussion of the merits of novel optic techniques, using structured and unstructured light sources. There are also some remarks on new approaches, illustrated by the use of ferro-fluids to obtain directly the shear stresses acting on a wall, even in the presence of “some” sediment. The paper ends with some conclusions on the use of such mobile bed tests in present and future Coastal Engineering.

    The subaru coronagraphic extreme AO (SCExAO) system: Wavefront control and detection of exoplanets with coherent light modulation in the focal plane

    The Subaru Coronagraphic Extreme-AO (SCExAO) system is designed for high-contrast coronagraphic imaging at small angular separations, and is scheduled to see first light on the Subaru Telescope in early 2011. The wavefront control architecture for SCExAO is optimized for scattered-light control and calibration at small angular separations, and is described in this paper. Key subsystems of the SCExAO wavefront control architecture have been successfully demonstrated; we report results from these tests and discuss their role in the SCExAO system. Among these subsystems, we discuss a technique which can calibrate and remove the static and slow speckles that traditionally limit high-contrast detections. A visible-light laboratory prototype at the Subaru Telescope recently demonstrated speckle halo reduction to 2e-7 contrast within 2 λ/D, and removal of static coherent speckles to 3e-9 contrast.
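
    The coherent modulation idea can be sketched numerically. Assuming the deformable mirror can add a known complex probe p to a focal-plane speckle field E, the intensity difference between +p and -p probes is linear in E, so two probes in quadrature recover the complex speckle amplitude, which can then be nulled. This is an idealized, noiseless, single-speckle sketch of such pairwise probing, not the instrument's actual control loop:

```python
import numpy as np

# Pairwise-probe estimation of a complex speckle amplitude E.
# I(+-p) = |E +- p|^2, so I(+p) - I(-p) = 4 * Re(E * conj(p)).
# Two probes in quadrature (p and 1j*p) give Re(E) and Im(E).

def intensity(E, probe):
    """Detector intensity for speckle field E plus a DM probe."""
    return np.abs(E + probe) ** 2

def estimate_speckle(E_true, p=0.5):
    """Recover E from four probed intensities (ideal, noiseless)."""
    re = (intensity(E_true, p) - intensity(E_true, -p)) / (4 * p)
    im = (intensity(E_true, 1j * p) - intensity(E_true, -1j * p)) / (4 * p)
    return re + 1j * im

E = 0.3 - 0.2j                  # unknown speckle field (made-up value)
E_hat = estimate_speckle(E)
print(np.isclose(E_hat, E))     # the anti-speckle to apply is -E_hat
```

    Because the estimate uses intensity differences of coherently modulated frames, incoherent light (e.g. from a planet) drops out, which is what lets such schemes distinguish speckles from real sources.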

    The Subaru Coronagraphic Extreme AO (SCExAO) system: Implementation and performance of the Coronagraphic Low Order WaveFront Sensor

    The Subaru Coronagraphic Extreme AO project (SCExAO) is a high-performance coronagraph designed to deliver high contrast at small angular separations. For the detection of structures near the diffraction limit, accurate control of low-order wavefront aberrations - tip-tilt and focus - is essential, as these aberrations create light leaks that are a source of confusion in the final science image. To address this major difficulty, we have equipped SCExAO with a specially designed Coronagraphic Low Order WaveFront Sensor (CLOWFS), using defocused images of a reflective ring located in the focal plane, that can track tip-tilt errors as small as 10^-3 λ/D. CLOWFS was originally designed to drive actuators in a closed loop. Here, we show that it can also be used in post-processing to efficiently subtract the tip-tilt-induced coronagraphic leaks in the final science image.
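
    The post-processing use of CLOWFS amounts to a linear fit: if calibration frames record the coronagraphic leak pattern produced by small tip and tilt offsets, those frames form a basis that can be least-squares fitted to the science image and subtracted. A hypothetical numpy sketch with synthetic leak modes (not real SCExAO calibration data):

```python
import numpy as np

# Post-processing subtraction of tip-tilt coronagraphic leaks:
# fit calibration leak modes to the science frame by least squares,
# then subtract the fitted leak, leaving the astrophysical signal.

rng = np.random.default_rng(0)
npix = 64

# Synthetic calibration: leak images for unit tip and unit tilt.
tip_mode = rng.normal(size=(npix, npix))
tilt_mode = rng.normal(size=(npix, npix))

# Synthetic science frame: a leak combination plus a faint source.
science = 0.7 * tip_mode - 0.3 * tilt_mode
science[40, 40] += 5.0          # the "planet"

# Least-squares fit of the leak modes to the science frame.
A = np.column_stack([tip_mode.ravel(), tilt_mode.ravel()])
coeffs, *_ = np.linalg.lstsq(A, science.ravel(), rcond=None)
residual = science - (A @ coeffs).reshape(npix, npix)

print(coeffs.round(2))          # close to the injected [0.7, -0.3]
print(residual[40, 40] > 4.0)   # the point source survives subtraction
```

    The fit absorbs the leak while barely touching the localized source, which is why leak subtraction in post-processing can recover structures otherwise lost in tip-tilt residuals.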

    The Subaru coronagraphic extreme AO (SCExAO) system: Visible imaging mode

    The Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) system is an instrument designed to be inserted between the Subaru AO188 system and the infrared HiCIAO camera in order to greatly improve the contrast in the very close (less than 0.5") neighbourhood of stars. Alongside the infrared coronagraphic path, a visible scientific path based on an EMCCD camera has been implemented. Benefiting from both adaptive optics (AO) correction and new data processing techniques, it is a powerful tool for high angular resolution imaging and opens numerous new science opportunities. We propose here a new image processing algorithm based on the selection of the best signal for each spatial frequency. A factor of 2 to 3 improvement in Strehl ratio is obtained compared to the AO long exposure, depending on the image processing algorithm used and the seeing conditions. The system is able to deliver diffraction-limited images at 650 nm (17 mas FWHM). We also demonstrate that this approach offers significantly better results than the classical select, shift and add approach (lucky imaging).
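
    The "best signal for each spatial frequency" selection can be sketched with numpy: Fourier-transform each short exposure, keep for every spatial frequency the component from the frame where its amplitude is highest, and transform back. This is a simplified, hypothetical version of such Fourier-domain frame selection, not the instrument's actual pipeline:

```python
import numpy as np

# Fourier-domain frame selection: for each spatial frequency, keep the
# complex component from the short-exposure frame with the largest
# amplitude at that frequency, then transform back. (We take the real
# part, since the selected spectrum is not exactly Hermitian.)

def select_best_frequencies(frames):
    """frames: (nframes, ny, nx) stack of short exposures."""
    spectra = np.fft.fft2(frames, axes=(-2, -1))
    best = np.argmax(np.abs(spectra), axis=0)       # (ny, nx) frame indices
    ny, nx = best.shape
    picked = spectra[best, np.arange(ny)[:, None], np.arange(nx)]
    return np.fft.ifft2(picked).real                # recombined image

rng = np.random.default_rng(1)
frames = rng.normal(size=(10, 32, 32))              # stand-in exposures
image = select_best_frequencies(frames)
print(image.shape)                                  # (32, 32)
```

    Unlike classical lucky imaging, which keeps or rejects whole frames, this scheme salvages whichever spatial frequencies happened to be well transferred in each exposure, which is why it can outperform select-shift-and-add.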