31 research outputs found

    The application of non-linear curve fitting routines to the analysis of mid-infrared images obtained from single polymeric microparticles

    Get PDF
    For the first time, we report a series of time-resolved images of a single PLGA microparticle undergoing hydrolysis at 70 °C, obtained using attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopic imaging. A novel, partially supervised non-linear curve fitting (NLCF) tool was developed to identify and fit peaks to the infrared spectrum obtained from each pixel within the 64 × 64 array. The output from the NLCF was evaluated against a traditional peak height (PH) data analysis approach and against multivariate curve resolution alternating least squares (MCR-ALS) analysis for the same images, in order to understand the limitations and advantages of the NLCF methodology. NLCF delivered consistent spatial resolution enhancement, as defined using the step-edge approach, on dry microparticle images when compared with images derived from both PH measurements and MCR-ALS. NLCF also improved both the S/N and the sharpness of images obtained during an evolving experiment, providing better insight into the extent of hydration layers and changes in particle dimensions during hydrolysis. The NLCF approach enabled the calculation of hydrolysis rate constants for both the glycolic (kG) and lactic (kL) acid segments of the PLGA copolymer; this represents a real advantage over MCR-ALS, which could not distinguish between the two segments because of collinearity within the data. NLCF also made it possible to calculate hydrolysis rate constants from a single pixel, unlike the peak height approach, which suffered from poor S/N at each pixel. These findings show the potential value of applying NLCF to the study of real-time chemical processes at the micron scale, assisting in the understanding of the mechanisms of chemical processes occurring within microparticles and enhancing the value of mid-IR ATR analysis.
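    As a sketch of the general shape of such a method: a per-pixel fit of this kind can be framed as a constrained least-squares fit of band models to each spectrum, with a first-order rate constant recovered from fitted band areas over time. The Python/SciPy sketch below is illustrative only; the band centres, bounds, and two-band model are assumptions, not the authors' NLCF implementation.

        # Illustrative per-pixel fit in SciPy: two Gaussian bands fitted to
        # one spectrum, with a first-order rate constant recovered from band
        # areas over time. Band centres, bounds, and the two-band model are
        # assumptions, not the authors' NLCF implementation.
        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, amp, centre, width):
            return amp * np.exp(-0.5 * ((x - centre) / width) ** 2)

        def two_band_model(x, a1, c1, w1, a2, c2, w2):
            return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

        def fit_pixel(wavenumbers, absorbance):
            # "Partially supervised": initial guesses are fixed band positions,
            # and bounds confine the optimiser to a sensible spectral window.
            p0 = [0.5, 1750.0, 15.0, 0.5, 1170.0, 20.0]
            lower = [0.0, 1740.0, 5.0, 0.0, 1160.0, 5.0]
            upper = [2.0, 1760.0, 40.0, 2.0, 1180.0, 40.0]
            popt, _ = curve_fit(two_band_model, wavenumbers, absorbance,
                                p0=p0, bounds=(lower, upper))
            return popt

        def rate_constant(times_h, band_areas):
            # First-order hydrolysis: ln(A_t) = ln(A_0) - k * t, so k is the
            # negative slope of a linear fit of ln(area) against time.
            slope, _intercept = np.polyfit(times_h, np.log(band_areas), 1)
            return -slope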

    In-Datacenter Performance Analysis of a Tensor Processing Unit

    Full text link
    Many architects believe that major improvements in cost-energy-performance must now come from domain-specific hardware. This paper evaluates a custom ASIC---called a Tensor Processing Unit (TPU)---deployed in datacenters since 2015 that accelerates the inference phase of neural networks (NN). The heart of the TPU is a 65,536 8-bit MAC matrix multiply unit that offers a peak throughput of 92 TeraOps/second (TOPS) and a large (28 MiB) software-managed on-chip memory. The TPU's deterministic execution model is a better match to the 99th-percentile response-time requirement of our NN applications than are the time-varying optimizations of CPUs and GPUs (caches, out-of-order execution, multithreading, multiprocessing, prefetching, ...) that help average throughput more than guaranteed latency. The lack of such features helps explain why, despite having myriad MACs and a big memory, the TPU is relatively small and low power. We compare the TPU to a server-class Intel Haswell CPU and an Nvidia K80 GPU, which are contemporaries deployed in the same datacenters. Our workload, written in the high-level TensorFlow framework, uses production NN applications (MLPs, CNNs, and LSTMs) that represent 95% of our datacenters' NN inference demand. Despite low utilization for some applications, the TPU is on average about 15X-30X faster than its contemporary GPU or CPU, with TOPS/Watt about 30X-80X higher. Moreover, using the GPU's GDDR5 memory in the TPU would triple achieved TOPS and raise TOPS/Watt to nearly 70X the GPU and 200X the CPU. Comment: 17 pages, 11 figures, 8 tables. To appear at the 44th International Symposium on Computer Architecture (ISCA), Toronto, Canada, June 24-28, 2017.
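    The headline throughput figure follows directly from the architecture: a back-of-envelope check, using only the paper's stated 256 × 256 MAC array and 700 MHz clock, reproduces the quoted 92 TOPS peak.

        # Back-of-envelope check of the quoted 92 TOPS peak, using only
        # figures from the paper: a 256 x 256 array of 8-bit MACs (65,536
        # units), two ops per MAC (multiply + add), at the TPU's 700 MHz clock.
        macs = 256 * 256        # 65,536 MAC units
        ops_per_mac = 2         # multiply and accumulate
        clock_hz = 700e6        # 700 MHz
        peak_tops = macs * ops_per_mac * clock_hz / 1e12
        print(f"peak throughput: {peak_tops:.1f} TOPS")   # ~91.8 TOPS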

    Three principles for the progress of immersive technologies in healthcare training and education

    Get PDF

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Get PDF
    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013, p < 0.001), with the probability of an operation lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may enable organisations to better organise theatre lists and deliver greater efficiencies in care.
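    The derivation-and-validation pattern described here (fit a logistic model, round its coefficients into an additive score, check discrimination by ROC on an external cohort) can be sketched as below. The column names, the "long_op" outcome label, and the integer-scaling rule are hypothetical stand-ins, not the CholeS variables.

        # Sketch of the derivation/validation pattern described above: fit a
        # logistic model for long (> 90 min) operations, round the coefficients
        # into an additive integer score, then check discrimination on an
        # external cohort with ROC AUC. Column names and scaling are invented.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        PREDICTORS = ["asa", "age_band", "prev_admissions", "bmi_band",
                      "gb_wall_thickness", "cbd_diameter"]

        def derive_score(derivation: pd.DataFrame) -> pd.Series:
            model = LogisticRegression(max_iter=1000)
            model.fit(derivation[PREDICTORS], derivation["long_op"])
            # Scale the log-odds coefficients to small integers for bedside use.
            weights = np.round(model.coef_[0] / np.abs(model.coef_[0]).min())
            return pd.Series(weights, index=PREDICTORS)

        def validate_score(external: pd.DataFrame, weights: pd.Series) -> float:
            score = external[PREDICTORS].mul(weights, axis=1).sum(axis=1)
            # The abstract reports an AUC of 0.708 on the external cohort.
            return roc_auc_score(external["long_op"], score)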

    Impact of Fiber Treatment on the Oil Absorption Characteristics of Plant Fibers

    No full text
    Most plant fibers are good sorbents of oil; however, synthetic sorbents have a much higher sorption capacity (SC) than plant fibers. This study evaluated the effect of fiber treatments, specifically hot-water treatment and mercerization, on the absorption characteristics of selected plant fibers. Five common plant fibers—corn residues, soybean residues, cotton burr and stem (CBS), cattail, and oak—were evaluated for their absorption characteristics in crude oil, motor oil, deionized (DO) water, and an 80:20 mix of DO water. The fiber treatments included ground fiber (control), hot-water treatment at 80 °C for 4 h and at 125 °C for 4 h, mercerization at room temperature for 48 h, and mercerization at 300 °C for 1 h. The absorption capacity (AC) varied with fiber type, absorption medium, and fiber treatment. Mercerization at 300 °C increased the water absorption capacity of soybean residue to as much as 8 g/g. Mercerization at room temperature and hot-water treatment at 125 °C increased the crude oil absorption capacity. After certain treatments, the crude oil absorption capacity of CBS and corn fibers increased to over 5 g/g, and the motor oil absorption capacity of cattail, corn, and soybean also increased to 4–5 g/g.
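    The g/g figures reported here are gravimetric absorption capacities, i.e. mass of medium absorbed per unit mass of dry fiber. A minimal sketch of that calculation, with illustrative numbers rather than data from the study:

        # Gravimetric absorption capacity in g/g, as reported above: mass of
        # medium absorbed per gram of dry fibre. The example numbers are
        # illustrative, not data from the study.
        def absorption_capacity(dry_mass_g: float, wet_mass_g: float) -> float:
            """AC = (m_wet - m_dry) / m_dry, in grams per gram of dry fibre."""
            if dry_mass_g <= 0:
                raise ValueError("dry mass must be positive")
            return (wet_mass_g - dry_mass_g) / dry_mass_g

        # Example: 2.0 g of fibre weighing 12.4 g after draining has absorbed
        # (12.4 - 2.0) / 2.0 = 5.2 g of oil per gram of fibre.
        print(absorption_capacity(2.0, 12.4))   # 5.2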

    The application of attenuated total reflectance Fourier transform infrared spectroscopy to monitor the concentration and state of water in solutions of a thermally responsive cellulose ether during gelation

    No full text
    This paper reports the use of ATR-FTIR spectroscopy with PLS data analysis to probe the thermal gelation behaviour of aqueous solutions of the cellulose ether hydroxypropyl methylcellulose (HPMC). Spectroscopic changes in the ν(CO) region of the infrared spectra (collected using ATR) were shown to mark the onset of gelation, and information about the temperature of gelation and the effect of the gel structure on the water hydrogen-bonding network was elucidated. The use of PLS data analysis to quantify the water concentration within the gel at the ATR interface is highlighted. The dominance of intermolecular H-bonding over intramolecular H-bonding within the cellulose ether in solution was also observed. The ATR-FTIR data were in good agreement with rheological and DSC measurements conducted on the same systems. A discussion of the changes in shape of the ν(OH) band of the water within the gel is provided, and these changes are interpreted in terms of modifications of the hydrogen-bond strength of associated water during syneresis.
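    The PLS quantification step described here follows the standard chemometric pattern: regress known water concentrations onto calibration spectra, then predict the water content at the ATR interface from new spectra. The scikit-learn sketch below uses synthetic array shapes; the component count and concentration range are assumptions, not values from the paper.

        # Illustrative PLS calibration in scikit-learn: regress known water
        # concentrations onto calibration spectra, then predict the water
        # content of the gel at the ATR interface from a new spectrum.
        # Array shapes, component count, and concentration range are
        # assumptions for the sketch, not values from the paper.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((30, 800))            # stand-in calibration spectra
        y = rng.uniform(70.0, 99.0, 30)      # known water concentrations (wt%)

        pls = PLSRegression(n_components=5)  # choose n_components by CV
        print(cross_val_score(pls, X, y, cv=5, scoring="r2"))
        pls.fit(X, y)
        water_wt_pct = pls.predict(X[:1])    # prediction for one new spectrum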

    Weed seed spread and its prevention: The role of roadside wash down

    No full text
    Vehicles are one of the major vectors of long-distance weed seed spread. Viable seed removed from vehicles at roadside wash down facilities was studied at five locations in central Queensland, Australia over a 3-year period. Seeds from 145 plant species, belonging to 34 different families, were identified in the sludge samples obtained from the wet particulate matter collection pits of the wash down facilities. Annual forbs were the most common group (50% of species), typically with small or very small seeds.

    Transient Region Coverage in the Propulsion IVHM Technology Experiment

    No full text
    Over the last several years, researchers at NASA Glenn and Ames Research Centers have developed a real-time fault detection and isolation system for propulsion subsystems of future space vehicles. The Propulsion IVHM Technology Experiment (PITEX), as it is called, follows the model-based diagnostic methodology and employs Livingstone, developed at NASA Ames, as its reasoning engine. The system has been tested on flight-like hardware through a series of nominal and fault scenarios. These scenarios were developed using a highly detailed simulation of the X-34 flight demonstrator main propulsion system and include realistic failures involving valves, regulators, microswitches, and sensors. This paper focuses on one of the recent research and development efforts under PITEX: providing more complete transient region coverage. It describes the development of the transient monitors, the corresponding modeling methodology, and the interface software responsible for coordinating the flow of information between the quantitative monitors and the qualitative, discrete representation used by Livingstone.
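    The monitor-to-reasoner interface described here can be pictured as a quantitative envelope check that emits qualitative tokens for a discrete, model-based reasoner such as Livingstone. Everything in the sketch below (sensor names, thresholds, the exponential envelope) is invented for illustration; it is not the PITEX implementation.

        # Sketch of the monitor-to-reasoner interface: a quantitative monitor
        # checks a sensor against an expected transient envelope and emits a
        # qualitative token for a discrete, model-based reasoner such as
        # Livingstone. Sensor names, thresholds, and the envelope are invented.
        import math
        from dataclasses import dataclass
        from enum import Enum
        from typing import Callable

        class Qualitative(Enum):
            NOMINAL = "nominal"
            LOW = "low"        # below the expected transient envelope
            HIGH = "high"      # above the expected transient envelope

        @dataclass
        class TransientMonitor:
            sensor_id: str
            lower: Callable[[float], float]   # envelope vs. time into transient
            upper: Callable[[float], float]

            def observe(self, t_s: float, value: float):
                if value < self.lower(t_s):
                    return (self.sensor_id, Qualitative.LOW)
                if value > self.upper(t_s):
                    return (self.sensor_id, Qualitative.HIGH)
                return (self.sensor_id, Qualitative.NOMINAL)

        # Hypothetical example: tank pressure decaying toward 300 kPa after a
        # valve opens; out-of-band readings become discrete symptoms.
        monitor = TransientMonitor(
            "feed_tank_pressure",
            lower=lambda t: 300 + 0.8 * 400 * math.exp(-t / 2.0),
            upper=lambda t: 300 + 1.2 * 400 * math.exp(-t / 2.0),
        )
        print(monitor.observe(1.0, 520.0))   # within envelope -> NOMINAL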