
    Development of a low profile laser Doppler probe for monitoring perfusion at the patient – mattress interface

    The clinical importance of pressure ulcers is reviewed, confirming the need for continuous monitoring of skin blood perfusion at the patient–mattress interface. The design of a low profile (H ≈ 1 mm) laser Doppler probe is then described, together with the experimental setup used for evaluation. The results show that the performance of the new sensor does not vary significantly from that of currently available probes over a wide range of operating parameters. The authors conclude that the sensor design provides a low cost perfusion monitoring solution with potential to significantly reduce the risk of bed sores in hospital patients.

    Derivatisation of Polyphenols

    Polyphenols, such as tannins, offer potential as a bio-derived chemical feedstock. Their present utilisation is limited mainly to leather tanning and wood panel adhesives. However, appropriate derivatisation may alter both the chemical and physical properties and thereby allow further utilisation of polyphenols. Derivatisation of polyphenols was achieved by esterification and etherification of the phenol groups. Esterification was achieved by alcoholysis of acid chlorides and transesterification with vinyl esters, while etherification was achieved by the ring opening of propylene oxide. The polyphenols used were resorcinol, catechin, Pinus radiata bark tannin, and Schinopsis lorentzii tannin. The products were characterised using a range of techniques including NMR (1H, 13C and 2D NMR in both the solution and solid state), ESI-MS, GPC, DSC, TGA, and rheology. The preparation of polyphenolic esters by alcoholysis provided model compounds to establish the key chemical, spectroscopic, and physical features. A range of simple polyphenol esters such as resorcinol dilaurate and catechin pentalaurate were prepared using lauroyl chloride. Furthermore, tannin lauroyl esters were prepared with varying degrees of substitution. A transesterification method was developed for the preparation of polyphenol esters. Ester interchange occurred effectively in the presence of base catalyst in aqueous solution or dimethyl sulfoxide with short or long chain vinyl esters. This included the first report of the base-catalysed transesterification of flavonoids by vinyl esters to give products such as catechin mono- and di-laurate. Transesterification occurred preferentially at the B-ring as shown by NMR spectroscopy. Subsequently, this transesterification procedure was used to prepare tannin esters. The chemical and physical properties of polyphenol esters were assessed using thermal, antioxidant, and UV/VIS light absorption analysis. 
Thermal analysis indicated melt/flow properties for some of the polyphenol esters. In some cases, the thermal stability was also shown to increase upon esterification. The antioxidant activity was shown to decrease upon transesterification of pine bark tannin with vinyl laurate, while the UV/VIS absorption was shown to increase. These properties may lend the products to applications as polymer additives or pharmaceuticals. Polyphenol ethers were prepared by the Williamson ether synthesis and the ring opening of propylene oxide. However, the Williamson ether synthesis, a common route to prepare ethers, proved unsuitable for flavonoids. Catechin and tannin hydroxypropyl ether derivatives of varying substitution were prepared by the ring-opening of propylene oxide in the presence of triethylamine. Upon hydroxypropylation the thermal properties of the polyphenol were altered. For example, catechin hydroxypropyl ethers showed a glass transition that was dependent upon the molar substitution, while rheology showed melt behaviour for several of the tannin hydroxypropyl ethers.

    Early Christian Perceptions of Sacred Spaces

    Previous studies of early Christian beliefs have portrayed the community as highly anti-materialist and anti-social. It was argued that Christians rejected the category of “sacred space” and exhibited only secular and functional behavior regarding place. Beginning in the late 1970s, a growing body of scientific literature has questioned the veracity of these claims. Reviewing the material culture record of the first four centuries of the Christian community (architecture, objects, art), this article proposes that Christians were far more culturally homogeneous in late antiquity, and accepted in large part the material mediation of the divine.

    Theorizing in Unfamiliar Contexts: New Directions in Translation Studies

    This thesis attempts to offer a reconceptualization of translation analysis. It argues that there is a growing interest in examining translations produced outside the discipline's historical field of focus. However, the tools of analysis employed may not have sufficient flexibility to examine translation if it is conceived more broadly. Advocating the use of abductive logic, the thesis infers translators' probable understandings of their own actions, and compares these with the reasoning provided by contemporary theories. It finds that it may not be possible to rely on common theories to analyse the work of translators who conceptualize their actions in radically different ways from those traditionally found in translation literature. The thesis exemplifies this issue through the dual examination of Geoffrey Chaucer's use of translation in the Canterbury Tales and that of Japanese storytellers in classical Kamigata rakugo. It compares the findings of the discipline's most pervasive theories with those gained through an abductive analysis of the same texts, finding that the results produced by the theories are invariably problematic. The thesis demonstrates that understandings of translation practice have changed over time and vary substantially across cultures. Therefore, an individual theory is unlikely to be able to rationalize particular practices or features of translations irrespective of the cultural context in which they are found. Abductive logic aims to describe translations in particular, rather than translation in general. It can be used to infer factors that may have influenced translators' understandings of the roles their texts will take, and hence their aims in translating. Many theories tend to be underpinned by inductive logic, which essentially restricts textual analysis to the application of pre-defined labels of translation phenomena. Abductive logic forms hypotheses based on the context in question, going far beyond this kind of textual categorization.

    A tool for facilitating OCR postediting in historical documents

    Optical character recognition (OCR) for historical documents is a complex procedure subject to a unique set of material issues, including inconsistencies in typefaces and low-quality scanning. Consequently, even the most sophisticated OCR engines produce errors. This paper reports on a tool built for postediting the output of Tesseract, more specifically for correcting common errors in digitized historical documents. The proposed tool suggests alternatives for word forms not found in a specified vocabulary. The assumed error is replaced in the post-edition by a presumably correct alternative, chosen on the basis of the scores of a Language Model (LM). The tool is tested on a chapter of the book An Essay Towards Regulating the Trade and Employing the Poor of this Kingdom. As demonstrated below, the tool is successful in correcting a number of common errors. Though sometimes unreliable, it is also transparent and open to human intervention.
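The correction pipeline the abstract describes can be sketched in a few lines. This is a minimal illustration only: the vocabulary, frequencies, and sample tokens below are invented, and a simple unigram frequency score stands in for the trained Language Model used by the actual tool.

```python
# Sketch of dictionary-based OCR postediting: flag tokens absent from a
# vocabulary, propose in-vocabulary alternatives within one edit, and keep
# the candidate the (toy, unigram) language model scores highest.

def edits1(word):
    """All strings within one edit (delete/replace/insert) of `word`."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + replaces + inserts)

def postedit(tokens, vocab, freq):
    """Replace out-of-vocabulary tokens by the best-scoring in-vocabulary edit."""
    out = []
    for tok in tokens:
        if tok in vocab:
            out.append(tok)
            continue
        candidates = edits1(tok) & vocab
        # Fall back to the original token when no candidate is found.
        out.append(max(candidates, key=lambda w: freq.get(w, 0)) if candidates else tok)
    return out

vocab = {"the", "trade", "poor", "of", "this", "kingdom", "employing"}
freq = {"the": 100, "trade": 12, "poor": 9, "kingdom": 4}
print(postedit(["tne", "trade", "of", "thc", "kingdom"], vocab, freq))
# -> ['the', 'trade', 'of', 'the', 'kingdom']
```

In the real tool a left-and-right sentence context scored by the LM decides between candidates; a unigram frequency table is used here purely to keep the sketch self-contained.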

    Multiple segmentations of Thai sentences for neural machine translation

    Thai is a low-resource language, so data is often not available in sufficient quantities to train a Neural Machine Translation (NMT) model that performs to a high level of quality. In addition, the Thai script does not use white space to delimit the boundaries between words, which adds complexity when building sequence-to-sequence models. In this work, we explore how to augment a set of English–Thai parallel data by replicating sentence pairs with different word segmentation methods on the Thai side, as training data for NMT model training. Using different numbers of merge operations in Byte Pair Encoding, different segmentations of Thai sentences can be obtained. The experiments show that combining these datasets improves performance for NMT models trained with a dataset that has been split using a supervised splitting tool.
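The augmentation idea can be sketched as follows. The segmenters here are toy stand-ins (character-level splitting and greedy longest-match against a small word list, applied to a romanized placeholder string) for the BPE models with different merge operations used in the paper.

```python
# Sketch of segmentation-based data augmentation: each (source, target)
# pair is replicated once per segmentation of the target side, so the
# training corpus contains the same sentence at several granularities.

def char_segment(sentence):
    """Character-level segmentation (the finest granularity)."""
    return " ".join(sentence)

def dict_segment(sentence, lexicon):
    """Greedy longest-match against a word list, a crude proxy for a
    supervised word segmenter."""
    out, i = [], 0
    while i < len(sentence):
        match = next((w for w in sorted(lexicon, key=len, reverse=True)
                      if sentence.startswith(w, i)), sentence[i])
        out.append(match)
        i += len(match)
    return " ".join(out)

def augment(pairs, segmenters):
    """Replicate each (en, th) pair once per segmentation of the Thai side."""
    return [(en, seg(th)) for en, th in pairs for seg in segmenters]

pairs = [("hello", "sawatdee")]      # romanized placeholder, not real Thai text
lexicon = {"sawat", "dee"}
segmenters = [char_segment, lambda s: dict_segment(s, lexicon)]
augmented = augment(pairs, segmenters)
print(augmented)
# -> [('hello', 's a w a t d e e'), ('hello', 'sawat dee')]
```

In the paper the list of segmenters would instead be BPE models trained with different merge counts, each producing a different subword split of the same Thai sentence.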

    Development and in-vitro evaluation of a potentially implantable fibre-optic glucose sensor probe

    Type I diabetics need regular injections of insulin to survive. Insulin allows the cells of the body to extract glucose from the blood supply to use as fuel. Without insulin the cells turn to other backup fuel sources; this can cause side effects that are quickly fatal, or a gradual wasting of the body's tissues. The use of insulin, however, is not danger free, as an incorrect dosage can quickly cause the glucose circulating in the blood to drop to a dangerously low level. Without glucose circulating in the blood supply the brain quickly runs out of fuel, causing coma and death. Because of this, a means to constantly monitor blood glucose levels has been sought for the last two decades. With such a device, diabetics could judge the correct amount of insulin to inject and be warned of low blood glucose levels. However, to date no reliable portable system has been produced. Recent developments in fibre optic biosensor technology suggested a possible route to achieve this goal. The work in this thesis presents the development and testing of such a sensor. The sensor presented in this thesis is based around a commercial fibre optic blood gas sensor, the Paratrend 7. The oxygen-sensing element of this device was modified into a glucose sensor using polymer membranes incorporating the enzymes glucose oxidase and catalase. The research was aimed at building a glucose sensor that could be developed into a working blood glucose sensor in the minimum amount of time if the research proved successful. For this reason the Paratrend 7 sensor system was chosen to provide a clinically tested sensor core around which the glucose sensor could be built. The initial experiment, which used a Paratrend 7 sensor coated in polyHEMA and glucose oxidase, produced a sensor of diameter 700 ”m with a range of 0 to 4 mM/l of glucose and a 90% response time of <100 seconds in a solution with a 15% oxygen tension.
The sensor design was then developed to incorporate the enzyme catalase, to protect the glucose oxidase, and an outer diffusion-limiting polyHEMA membrane. This produced a sensor with a range of 0 to 6 mM/l and a response time of <100 seconds. The method of coating the sensors was then improved, through a series of stages, until an optimised dip-coating technique was developed. This technique produced sensors with ranges (in 7.5 kPa oxygen tension solutions) between 0 to 3 mM/l and 0 to 10 mM/l, response times of <100 seconds in some cases, and diameters of 300 ”m. By using a partial polyurethane outer coat the range of the sensors was increased from 0 to 4 mM/l up to 0 to 24 mM/l, in one case, with 90% response times in the 100 to 500 second range. The sensors were then sterilised using gamma radiation and their performance before and after sterilisation examined. The gamma sterilisation was found to cause a reduction in the range of the sensors, for example from 0 to 24 mM/l down to 0 to 14 mM/l in one case. The effect of 24-hour operation in a 5 mM/l solution of glucose, and of storage for up to three months, was then investigated. Both processes were found to reduce the operational range of the sensors: 0 to 20 mM/l reduced to 0 to 15 mM/l, in one case, for 24-hour operation, and from 0 to 15 mM/l reduced to 0 to 11 mM/l, in one case, for a storage time of three months. The use of the enzymes glucose oxidase and catalase together in a fibre optic sensor has not been previously reported in the literature as far as can be ascertained. The comparison of sensor performance before and after gamma sterilisation also appears to be unique, as does the gamma sterilisation of a fibre optic glucose sensor.

    Are Photogrammetry and 3D Scanning a real alternative to 3D modelling for Virtual Heritage applications?

    Photogrammetry is promoted as a quick way to create realistic 3D models, while handheld 3D scanners are advertised for projects that require greater accuracy. Both technologies are increasingly being targeted at Virtual Heritage applications. This paper presents the initial findings of a project that compares both methods for creating assets to be used in game engines to make interactive presentations. A range of test objects was chosen, from small artefacts of about 10 cm up to ground features of about 8 m in length. Hardware included the Faro Freestyle handheld scanner, while photographs were taken using a Nikon digital camera. An iPhone was also found to take adequate images and had an advantage in confined spaces. The time taken to capture data was equivalent for both methods. More photographs and data points improve the accuracy of models. Bright sunshine was a problem for both methods: the 3D scanner was unable to pick up any data, while hard shadows in photographs produced artefacts in the resulting model. Processing software included Scene, Meshlab and ReCap. The scanner software was quicker to process, but stitching together multiple scans can lead to inaccuracies. The polygon count of the resulting models is too high for use in UE4, so further manipulation was required using Maya and ZBrush. The creation of normal maps can help preserve detail, but the accuracy of textures is diminished. Modifying the models to enable their use within interactive game engines still requires a high degree of 3D modelling expertise.

    The effect of exercise on venous gas emboli and decompression sickness in human subjects at 4.3 psia

    The contribution of upper body exercise to altitude decompression sickness while at 4.3 psia after 3.5 or 4.0 hours of 100% oxygen prebreathing at 14.7 psia was determined by comparing the incidence and patterns of venous gas emboli (VGE), and the incidence of Type 1 decompression sickness (DCS), in 43 exercising male subjects and 9 less active male Doppler Technicians (DTs). Each subject exercised for 4 minutes at each of 3 exercise stations while at 4.3 psia. An additional 4 minutes were spent monitoring for VGE by the DT while the subject was supine on an examination cot. In the combined 3.5 and 4.0 hour oxygen prebreathe data, 13 subjects complained of Type 1 DCS compared to 9 complaints from DTs. VGE were detected in 28 subjects compared to 14 detections from DTs. A chi-square analysis of proportions showed no statistically significant difference in the incidence of Type 1 DCS or VGE between the two groups; however, the average times to detect VGE and to report Type 1 DCS symptoms were statistically different. It was concluded that 4 to 6 hours of upper body exercise at metabolic rates simulating EVA metabolic rates hastens the initial detection of VGE and the time to report Type 1 DCS symptoms as compared to DTs.
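A chi-square analysis of proportions on a 2×2 incidence table can be sketched as below. The counts used are illustrative placeholders, not the study's data (the abstract mixes subject and event counts), and the statistic is compared against the 5% critical value for one degree of freedom.

```python
# Sketch of a Pearson chi-square test of proportions for a 2x2 table
# [[a, b], [c, d]] (group x outcome), as used to compare DCS/VGE incidence
# between two groups. Counts below are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Expected count for each cell is (row total * column total) / n.
    expected = [((a + b) * (a + c)) / n, ((a + b) * (b + d)) / n,
                ((c + d) * (a + c)) / n, ((c + d) * (b + d)) / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical incidence: 13 of 43 exercisers vs 2 of 9 controls with DCS.
stat = chi_square_2x2(13, 30, 2, 7)
# Compare against the 5% critical value for 1 degree of freedom (3.841):
# below that threshold, the difference in proportions is not significant.
print(round(stat, 3), stat < 3.841)
```

With larger tables or small expected counts one would reach for a library routine (or Fisher's exact test) instead, but the hand-rolled statistic shows what the analysis in the abstract computes.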