19 research outputs found

    Spinal implants - the problems of debris

    Wear debris is known to incite a variety of biological responses when released from a joint replacement device. One such response is osteolysis: the pathological destruction of bone. Osteolysis is the major cause of failure in joint replacements; the loss of bone around a joint replacement may cause aseptic loosening of the implant and reduce the options for revision surgery. The intervertebral disc may be replaced with a joint replacement device, often a ball-and-socket joint using a metal-on-polymer material combination. Ultra-high molecular weight polyethylene (UHMWPE), inherited from hip and knee implants, is a common choice in lumbar disc replacements. The wear debris from a Charité implant, tested in vitro, was characterised using computer vision techniques and machine learning. It was found that this UHMWPE-on-metal implant produces debris that is particularly prone to eliciting an immune reaction that could lead to osteolysis. To counter the release of wear debris into periprosthetic tissue, where it can do harm, laser-sintered polyetherketoneketone (PEKK) was wear tested in an attempt to capture wear debris in the surface voids formed by the manufacturing process. Despite literature suggesting this could work, the wear tests showed that sintered PEKK is unsuitable as a bearing material.
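
    As a hedged illustration of how such debris might be characterised from micrographs, the sketch below computes two descriptors commonly used in wear-particle analysis, equivalent circle diameter and aspect ratio, from a binarised image. The thresholding step, pixel scale and function names are assumptions for illustration; the thesis's actual computer-vision and machine-learning pipeline is not reproduced here.

        import numpy as np
        from scipy import ndimage

        def characterise_particles(binary_image, microns_per_pixel=0.1):
            """Return (equivalent circle diameter in um, aspect ratio) per particle."""
            labels, n = ndimage.label(binary_image)   # connected-component labelling
            descriptors = []
            for i in range(1, n + 1):
                ys, xs = np.nonzero(labels == i)
                area_px = ys.size
                # Equivalent circle diameter: diameter of a circle with the same area.
                ecd_um = 2.0 * np.sqrt(area_px / np.pi) * microns_per_pixel
                # Aspect ratio from the eigenvalues of the pixel-coordinate covariance.
                if area_px > 2:
                    evals = np.sort(np.linalg.eigvalsh(np.cov(np.vstack([xs, ys]))))
                    aspect = float(np.sqrt(evals[1] / max(evals[0], 1e-9)))
                else:
                    aspect = 1.0
                descriptors.append((ecd_um, aspect))
            return descriptors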

    Empirical studies in end-user computer-generated music composition systems

    Computer music researchers dream of the perfect algorithm, in which the generated music is indistinguishable from, or even superior to, that composed by the world’s most talented composers. However, this aim remains far from fulfilled. This thesis pursues a different direction, proposing instead that computer-generated music techniques can be used as tools to support human composers, acting as a catalyst for human creativity rather than a replacement. Computer-generated music remains a challenge. Techniques and systems are abundant, yet there has been little exploration of how these might be useful for end-users looking to compose with generative and algorithmic music techniques. User interfaces for computer-generated music systems are often inaccessible to non-programmers, as they frequently neglect the established composition workflows and design paradigms familiar to composers in the digital age. For this research, the Interactive Generative Music Environment (IGME) was developed for studying interaction and composition, building on the foundations established in modern music sequencing software whilst integrating various computer-generated music techniques. Three original studies are presented, based on participatory design principles and evaluated with a mixed-methods approach that involved studying end-users engaged with the IGME software. Two studies were group sessions in which 54 participants spent an hour with IGME, either in a controlled (lab) environment or remotely as part of a conference workshop. The third study gave users more time with the software, with interactions studied and analysed using screen recording technologies. In total, over 80 hours of interaction data were captured. It was found that users need to understand several threshold concepts before engaging with computer-generated music, and must have the necessary skills to debug musical problems within the generative output; the ability to do this requires pre-existing knowledge of music theory. The studies support the conclusion that computer-generated music is used more as a catalyst for composition than as a replacement for it. A range of recommendations and requirements for building computer-generated music systems is presented, summarising the contributions to knowledge, along with signposts for future work.

    Real-time data flow models and congestion management for wire and wireless IP networks

    Includes abstract. Includes bibliographical references (leaves 103-111). In video streaming, network congestion compromises the video throughput, impairs perceptual quality, and may interrupt the display. Congestion control may take the form of rate adjustment, through mechanisms that attempt to minimize the probability of congestion by adjusting the rate of the streaming video to match the available capacity of the network. This can be achieved either by adapting the quantization parameter of the video encoder or by varying the rate through a scalable video technique. This thesis proposes a congestion control protocol for streaming video in which interaction between the video source and the receiver is essential to monitor the network state. The protocol adjusts the video transmission rate at the encoder whenever a change in the network conditions is observed and reported back to the sender.
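
    As a rough sketch of the kind of feedback-driven rate adjustment described above, the fragment below backs the target encoder rate off when the receiver reports packet loss and probes upward otherwise. The feedback field, thresholds and increase/decrease constants are illustrative assumptions, not the thesis's actual protocol.

        class AdaptiveVideoSender:
            """Sender-side rate control driven by receiver feedback (illustrative)."""

            def __init__(self, rate_kbps=1000.0, min_kbps=200.0, max_kbps=4000.0):
                self.rate_kbps = rate_kbps
                self.min_kbps = min_kbps
                self.max_kbps = max_kbps

            def on_receiver_report(self, loss_fraction):
                """Adjust the target encoder rate when the receiver reports network state."""
                if loss_fraction > 0.02:       # congestion observed: multiplicative back-off
                    self.rate_kbps *= 0.85
                else:                          # headroom available: additive probe upward
                    self.rate_kbps += 50.0
                self.rate_kbps = max(self.min_kbps, min(self.max_kbps, self.rate_kbps))
                # In practice the target would be mapped to the encoder's quantization
                # parameter or to the number of scalable layers transmitted.
                return self.rate_kbps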

    Digital encoding of black and white facsimile signals

    As the costs of digital signal processing and memory hardware decrease each year relative to those of transmission, it is increasingly economical to apply sophisticated source encoding techniques to reduce the transmission time for facsimile documents. With this intent, information-lossy encoding schemes have been investigated in which the encoder is divided into two stages: firstly, preprocessing, which removes redundant information from the original documents, and secondly, actual encoding of the preprocessed documents. [Continues.]
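
    For a sense of the redundancy such encoders exploit, the minimal sketch below run-length codes a single black-and-white scan line, collapsing runs of identical pixels into (value, length) pairs. This is a generic illustration only and does not reproduce the thesis's preprocessing or lossy encoding stages.

        def run_lengths(scan_line):
            """Encode a sequence of 0/1 pixels as (pixel_value, run_length) pairs."""
            runs = []
            prev, count = scan_line[0], 1
            for pixel in scan_line[1:]:
                if pixel == prev:
                    count += 1
                else:
                    runs.append((prev, count))
                    prev, count = pixel, 1
            runs.append((prev, count))
            return runs

        # A mostly-white line with a single black mark collapses to three runs.
        print(run_lengths([0] * 20 + [1] * 5 + [0] * 15))   # [(0, 20), (1, 5), (0, 15)]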

    Distributed Optimisation in Wireless Sensor Networks: A Hierarchical Learning Approach

    Ph.D. (Doctor of Philosophy)

    Self-organising maps: statistical analysis, treatment and applications

    This thesis presents some substantial theoretical analyses and optimal treatments of Kohonen's self-organising map (SOM) algorithm, and explores the practical application potential of the algorithm for vector quantisation, pattern classification, and image processing. It consists of two major parts. In the first part, the SOM algorithm is investigated and analysed from a statistical viewpoint. The proof of its universal convergence for any dimensionality is obtained using a novel and extended form of the Central Limit Theorem. Its feature space is shown to be an approximate multivariate Gaussian process, which will eventually converge and form a mapping that minimises the mean-square distortion between the feature and input spaces. The diminishing effect of the initial states, and the implicit effects of the learning rate and neighbourhood function on its convergence and ordering, are analysed and discussed. Distinct and meaningful definitions of its ordering, and associated measures, are presented in relation to the map's fault-tolerance. The SOM algorithm is further enhanced by incorporating a proposed constraint, or Bayesian modification, in order to achieve optimal vector quantisation or pattern classification. The second part of the thesis addresses the task of unsupervised texture-image segmentation by means of SOM networks and model-based descriptions. A brief review of texture analysis in terms of definitions, perceptions, and approaches is given, and Markov random field (MRF) model-based approaches are discussed in detail. Arising from this, a hierarchical self-organised segmentation structure, consisting of a local MRF parameter estimator, a SOM network, and a simple voting layer, is proposed and is shown, by theoretical analysis and practical experiment, to achieve a maximum likelihood or maximum a posteriori segmentation. A fast, simple, but efficient boundary relaxation algorithm is proposed as a post-processor to further refine the resulting segmentation. The class-number validation problem in fully unsupervised segmentation is approached by a classical, simple, and on-line minimum mean-square-error method. Experimental results indicate that this method is very efficient for texture segmentation problems. The thesis concludes with some suggestions for further work on SOM neural networks.
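
    For readers unfamiliar with the algorithm being analysed, the sketch below implements a standard Kohonen SOM training loop: find the best-matching unit, then pull it and its grid neighbours towards the input, with the learning rate and neighbourhood width shrinking over time. The linear decay schedules and Gaussian neighbourhood are common textbook choices, assumed here for illustration rather than taken from the thesis.

        import numpy as np

        def train_som(data, grid_w=10, grid_h=10, epochs=20, lr0=0.5, sigma0=3.0):
            """Train a grid_h x grid_w map on data of shape (n_samples, dim)."""
            rng = np.random.default_rng(0)
            weights = rng.random((grid_h, grid_w, data.shape[1]))
            # Grid coordinates of every neuron, used by the neighbourhood function.
            coords = np.dstack(np.meshgrid(np.arange(grid_w), np.arange(grid_h)))
            n_steps, t = epochs * len(data), 0
            for _ in range(epochs):
                for x in data:
                    lr = lr0 * (1.0 - t / n_steps)              # decaying learning rate
                    sigma = sigma0 * (1.0 - t / n_steps) + 0.5  # shrinking neighbourhood
                    # Best-matching unit: neuron whose weight vector is closest to x.
                    dists = np.linalg.norm(weights - x, axis=2)
                    bmu = np.unravel_index(np.argmin(dists), dists.shape)
                    # Gaussian neighbourhood centred on the BMU, in grid space.
                    grid_d2 = np.sum((coords - coords[bmu]) ** 2, axis=2)
                    h = np.exp(-grid_d2 / (2.0 * sigma ** 2))
                    weights += lr * h[..., None] * (x - weights)
                    t += 1
            return weights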

    Damage detection and damage evolution monitoring of composite materials for naval applications using acoustic emission testing

    Maritime transport has profound importance for the world economy. Vessels of all sizes constantly transport large numbers of passengers and goods across the sea, often under adverse operational conditions. Vessels need to exhibit high levels of reliability, availability, maintainability and safety (RAMS). At the same time, their performance needs to be optimised, ensuring the lowest possible fuel consumption with the maximum operational capacity and range, without compromising RAMS. Sweating of naval assets and profitability should be maximised for the operator, ensuring investment in future projects and supporting the growth of maritime transport and the world economy as a whole. Vessels have traditionally been manufactured using naval steel grades such as AH, DH and EH. Smaller leisure and special-purpose vessels, such as patrol boats, have been built using fibre-reinforced composite (FRC) materials. This trend is gradually penetrating the market for larger commercial vessels, including freight and cruise ships. However, it is still early days, and further investigation of optimum FRC manufacturing techniques and mechanical properties, together with an in-depth understanding of the damage mechanics, is required before such materials can become more commonplace. This project has investigated different glass FRCs produced using different manufacturing techniques. Glass fibres are preferred due to their lower cost in comparison with carbon fibres; the use of carbon FRCs in maritime applications is limited to the fabrication of racing and high-performance speedboats. Samples manufactured under laboratory conditions have been compared with those manufactured by a shipyard, and the in-house samples were found to have generally superior performance. Steel-to-composite joints of different designs have also been assessed, and the effect of design features such as drilled holes and bolts on the mechanical performance of the manufactured samples has been evaluated. The damage mechanisms involved during damage propagation, and the features causing damage initiation, have been considered. Damage initiation and subsequent evolution have been monitored using acoustic emission (AE). Various signal processing approaches, both manual and automatic, have been employed for optimum semi-quantitative evaluation of the AE data obtained. It has been shown that AE could be applied effectively for structural health monitoring of naval structures in the field, and several factors and parameters that need to be considered during acquisition and analysis have been determined. The key results of the study, together with the mechanical testing and characterisation of the samples employed, are presented in summarised form within this thesis.
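
    As a hedged sketch of the kind of AE processing involved, the fragment below performs simple threshold-based hit detection on a waveform and extracts basic per-hit features (peak amplitude, duration, counts above threshold, energy). The threshold, dead time and feature definitions are assumed for illustration and are not those used in the study.

        import numpy as np

        def extract_ae_hits(signal, fs, threshold, dead_time=1e-3):
            """Return a list of per-hit feature dictionaries from a 1-D AE waveform."""
            hits = []
            above = np.abs(signal) > threshold
            dead_samples = int(dead_time * fs)
            i, n = 0, len(signal)
            while i < n:
                if above[i]:
                    start, last_crossing = i, i
                    # A hit ends once the signal stays below threshold for dead_time.
                    while i < n and i - last_crossing < dead_samples:
                        if above[i]:
                            last_crossing = i
                        i += 1
                    segment = signal[start:last_crossing + 1].astype(float)
                    hits.append({
                        "start_s": start / fs,
                        "duration_s": len(segment) / fs,
                        "peak_amplitude": float(np.max(np.abs(segment))),
                        # Samples above threshold, a simple proxy for ringdown counts.
                        "counts": int(np.sum(np.abs(segment) > threshold)),
                        "energy": float(np.sum(segment ** 2) / fs),
                    })
                else:
                    i += 1
            return hits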