Spinal implants - the problems of debris
Wear debris are known to incite a variety of biological responses when released from a joint replacement device. One such response is osteolysis, the pathological destruction of bone. Osteolysis is the major cause of failure in joint replacements. The loss of bone around a joint replacement may cause aseptic loosening of the implant and reduce options for revision surgery. The intervertebral disc may be replaced with a joint replacement device. Often, this is done with a ball-on-socket joint using a metal-on-polymer material combination. Ultra-high molecular weight polyethylene (UHMWPE), inherited from hip and knee implants, is a common choice in lumbar disc replacements.
The wear debris from a Charité implant, tested in vitro, was characterised using computer vision techniques and machine learning. It was found that this UHMWPE-on-metal implant produces debris that is particularly prone to eliciting an immune reaction that could lead to osteolysis.
To counter the release of wear debris into periprosthetic tissue, where it can do harm, laser-sintered polyetherketoneketone (PEKK) was wear tested in an attempt to capture wear debris in the surface voids formed by the manufacturing process. Despite literature suggesting this could work, wear tests showed that sintered PEKK is unsuitable as a bearing material.
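Wear-particle characterisation of the kind described above typically rests on simple shape descriptors. As a hedged sketch (the measurements, units and decision thresholds below are hypothetical illustrations, not values from the study), circularity and aspect ratio can be computed per particle and used to flag the elongated, fibrillar debris most associated with adverse tissue reactions:

```python
import math

def circularity(area, perimeter):
    # 4*pi*A / P^2: equals 1.0 for a perfect circle, lower for elongated debris
    return 4 * math.pi * area / perimeter ** 2

def aspect_ratio(major_axis, minor_axis):
    # Ratio of fitted-ellipse axes; high values indicate fibrillar particles
    return major_axis / minor_axis

# Hypothetical particle measurements (area in um^2, lengths in um)
particles = [
    {"area": 3.1, "perimeter": 6.3, "major": 2.0, "minor": 1.9},   # near-round granule
    {"area": 4.0, "perimeter": 14.0, "major": 6.5, "minor": 0.8},  # elongated fibril
]

for p in particles:
    c = circularity(p["area"], p["perimeter"])
    ar = aspect_ratio(p["major"], p["minor"])
    # Illustrative thresholds only; real classifiers are trained on labelled data
    label = "fibrillar" if ar > 3 or c < 0.4 else "granular"
    print(f"circularity={c:.2f} aspect_ratio={ar:.2f} -> {label}")
```

In a full pipeline, descriptors such as these would be extracted from segmented microscopy images and fed to a trained classifier rather than fixed thresholds.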
Empirical studies in end-user computer-generated music composition systems
Computer music researchers dream of the perfect algorithm, in which the music generated is indistinguishable from, or even superior to, that composed by the world's most talented composers. However, the fulfilment of this aim remains ambitious. This thesis pursues a different direction, proposing instead that computer-generated music techniques can be used as tools to support human composers, acting as a catalyst for human creativity rather than a replacement.

Computer-generated music remains a challenge. Techniques and systems are abundant, yet there has been little exploration of how these might be useful for end-users looking to compose with generative and algorithmic music techniques. User interfaces for computer-generated music systems are often inaccessible to non-programmers, as they frequently neglect the established composition workflows and design paradigms that are familiar to composers in the digital age. For this research, the Interactive Generative Music Environment (IGME) was developed for studying interaction and composition, building on the foundations established in modern music sequencing software whilst integrating various computer-generated music techniques.

Three original studies are presented, based on participatory design principles and evaluated with a mixed-methods approach that involved studying end-users engaged with the IGME software. Two studies were group sessions in which 54 participants spent an hour with IGME, either in a controlled (lab) environment or remotely as part of a conference workshop. The third study gave users more time with the software, with interactions studied and analysed using screen recording technologies. In total, over 80 hours of interaction data were captured.

It was discovered that users need to understand several threshold concepts before engaging with computer-generated music, and must have the necessary skills to debug musical problems within the generative output. The ability to do this requires pre-existing knowledge of music theory. The studies support the conclusion that computer-generated music is used more as a catalyst for composition than as a replacement for it. A range of recommendations and requirements for building computer-generated music systems is presented, summarising the contributions to knowledge along with signposts for future work.
Real-time data flow models and congestion management for wired and wireless IP networks
In video streaming, network congestion compromises video throughput, impairs perceptual quality and may interrupt the display. Congestion control may take the form of rate adjustment, through mechanisms that attempt to minimize the probability of congestion by adjusting the rate of the streaming video to match the available capacity of the network. This can be achieved either by adapting the quantization parameter of the video encoder or by varying the rate through a scalable video technique. This thesis proposes a congestion control protocol for streaming video in which interaction between the video source and the receiver is essential to monitor the network state. The protocol adjusts the video transmission rate at the encoder whenever a change in the network conditions is observed and reported back to the sender.
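As a rough sketch of sender-side rate adjustment driven by receiver feedback (the loss threshold, step sizes and rate bounds below are illustrative assumptions, not the protocol's actual parameters), an AIMD-style rule might look like:

```python
def adapt_rate(current_kbps, loss_rate, min_kbps=200, max_kbps=4000):
    """AIMD-style sender rate adaptation driven by receiver reports.

    Illustrative only: loss_rate is assumed to arrive in periodic
    receiver feedback; the encoder target bitrate (or quantization
    parameter) would then be set from the returned rate.
    """
    if loss_rate > 0.02:           # congestion signalled: multiplicative decrease
        new_rate = current_kbps * 0.75
    else:                          # headroom available: additive increase
        new_rate = current_kbps + 64
    return max(min_kbps, min(max_kbps, new_rate))

rate = 1000.0
for loss in [0.0, 0.0, 0.05, 0.0]:   # simulated feedback reports
    rate = adapt_rate(rate, loss)
    print(f"loss={loss:.2f} -> rate={rate:.0f} kbps")
```

The multiplicative decrease reacts quickly to congestion while the additive increase probes cautiously for spare capacity, which is the standard trade-off such schemes make.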
Digital encoding of black and white facsimile signals
As the costs of digital signal processing and memory hardware are decreasing each year compared to those of transmission, it is increasingly economical to apply sophisticated source encoding techniques to reduce the transmission time for facsimile documents. With this intent, information-lossy encoding schemes have been investigated in which the encoder is divided into two stages: firstly, preprocessing, which removes redundant information from the original documents, and secondly, actual encoding of the preprocessed documents. [Continues.]
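The encoding stage in facsimile systems conventionally builds on run-length representations of binary scan lines. As a minimal illustration (not the thesis's actual scheme), the alternating white/black run lengths that codes such as the ITU-T T.4 modified Huffman code operate on can be extracted like this:

```python
def run_lengths(scanline):
    """Encode a binary scan line as alternating white/black run lengths.

    By fax convention the first run is white, so a leading zero-length
    run is emitted when the line starts with a black pixel.
    """
    runs = []
    current = 0            # 0 = white, 1 = black
    count = 0
    for pixel in scanline:
        if pixel == current:
            count += 1
        else:
            runs.append(count)
            current = pixel
            count = 1
    runs.append(count)
    return runs

line = [0, 0, 0, 1, 1, 0, 0, 0, 0, 1]
print(run_lengths(line))   # [3, 2, 4, 1]
```

A real encoder would then map each run length to a variable-length codeword, assigning short codes to the most frequent run lengths.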
Distributed Optimisation in Wireless Sensor Networks: A Hierarchical Learning Approach
Self-organising maps: statistical analysis, treatment and applications
This thesis presents some substantial theoretical analyses and optimal treatments of Kohonen's self-organising map (SOM) algorithm, and explores the practical application potential of the algorithm for vector quantisation, pattern classification, and image processing. It consists of two major parts.

In the first part, the SOM algorithm is investigated and analysed from a statistical viewpoint. The proof of its universal convergence for any dimensionality is obtained using a novel and extended form of the Central Limit Theorem. Its feature space is shown to be an approximate multivariate Gaussian process, which will eventually converge and form a mapping that minimises the mean-square distortion between the feature and input spaces. The diminishing effect of the initial states, and the implicit effects of the learning rate and neighbourhood function on its convergence and ordering, are analysed and discussed. Distinct and meaningful definitions, and associated measures, of its ordering are presented in relation to the map's fault-tolerance. The SOM algorithm is further enhanced by incorporating a proposed constraint, or Bayesian modification, in order to achieve optimal vector quantisation or pattern classification.

The second part of this thesis addresses the task of unsupervised texture-image segmentation by means of SOM networks and model-based descriptions. A brief review of texture analysis in terms of definitions, perceptions, and approaches is given. Markov random field (MRF) model-based approaches are discussed in detail. Arising from this, a hierarchical self-organised segmentation structure, which consists of a local MRF parameter estimator, a SOM network, and a simple voting layer, is proposed and is shown, by theoretical analysis and practical experiment, to achieve a maximum likelihood or maximum a posteriori segmentation. A fast, simple, but efficient boundary relaxation algorithm is proposed as a post-processor to further refine the resulting segmentation. The class-number validation problem in a fully unsupervised segmentation is approached by a classical, simple, and on-line minimum mean-square-error method. Experimental results indicate that this method is very efficient for texture segmentation problems. The thesis concludes with some suggestions for further work on SOM neural networks.
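The update rule at the heart of these convergence analyses can be sketched minimally. The following 1-D toy implementation (the linear decay schedules and parameter values are illustrative assumptions, not those analysed in the thesis) shows the winner-take-most update with a decaying learning rate and a shrinking Gaussian neighbourhood:

```python
import numpy as np

def som_step(weights, x, t, n_steps, lr0=0.5, sigma0=None):
    """One training step of a 1-D Kohonen SOM (illustrative sketch).

    weights: (n_units, dim) codebook; x: input sample; t: current step.
    Both the learning rate and the neighbourhood width decay over time,
    which is the behaviour whose statistical convergence is at issue.
    """
    n_units = weights.shape[0]
    if sigma0 is None:
        sigma0 = n_units / 2.0
    frac = t / n_steps
    lr = lr0 * (1.0 - frac)                      # decaying learning rate
    sigma = max(sigma0 * (1.0 - frac), 0.5)      # shrinking neighbourhood width
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
    dist = np.abs(np.arange(n_units) - bmu)      # grid distance to the BMU
    h = np.exp(-dist ** 2 / (2 * sigma ** 2))    # Gaussian neighbourhood function
    weights += lr * h[:, None] * (x - weights)   # move units towards the input
    return weights

rng = np.random.default_rng(0)
w = rng.random((10, 2))                          # random initial codebook
for t in range(1000):
    w = som_step(w, rng.random(2), t, 1000)      # train on uniform inputs
```

Because each update is a convex combination of the old weight and the input, the codebook remains inside the convex hull of the data, while the shrinking neighbourhood drives the transition from global ordering to local fine-tuning.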
Object-oriented analysis and design of computational intelligence systems
Machine learning from data, neuro-fuzzy information processing, approximate reasoning and genetic and evolutionary computation are all aspects of computational intelligence (also called soft computing methods). Soft computing methods differ from conventional computing in that they are tolerant of imprecision, uncertainty and partial truths. These characteristics can be exploited to achieve tractability, robustness and low solution costs when the solution to a complex (in machine terms) problem is required. The principal constituents of soft computing include: Neural Networks, Fuzzy Logic and Probabilistic Reasoning Systems. Genetic Algorithms (GAs), Evolutionary Algorithms, Chaos Theory, Complexity Theory and parts of Learning Theory all come under Probabilistic Reasoning Systems. Hybrid systems can be designed incorporating two or more aspects of soft computing, and these are more powerful than any of the components used in a stand-alone fashion. A unified framework is needed to implement and manipulate such systems. Such a framework will allow for easy visualisation of the underlying concepts and easy modification of the resulting computer models. In this thesis, an investigation of the major aspects of computational intelligence has been carried out. The main emphasis has been placed on developing an object-oriented framework for architecting computational intelligence systems. Object models for Neural Networks, Fuzzy Logic Systems and Evolutionary Computation systems have been developed. Software has been written in C++ to realise sample implementations of the various systems. Finally, practical applications and the results of using the Neural Networks, Fuzzy Logic systems and Genetic Algorithms developed in solving real-world problems are presented. A consistent notation based on the Object Modelling Technique (OMT) is used throughout the thesis to describe the software architectures from which the computer implementation models have been derived.
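To make one of the constituents concrete, a generational genetic algorithm reduces to a small loop of selection, crossover and mutation. The sketch below (in Python for illustration; the thesis's own implementations are object-oriented C++ classes modelled with OMT, and all parameter values here are arbitrary assumptions) solves the standard OneMax problem of maximising the number of set bits:

```python
import random

def evolve(fitness, n_bits=16, pop_size=30, generations=60,
           p_cross=0.7, p_mut=0.01, seed=42):
    """Minimal generational GA over bit strings: tournament selection,
    one-point crossover, bit-flip mutation. Illustrative sketch only."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        # Binary tournament: pick two at random, keep the fitter one
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament()[:], tournament()[:]
            if rng.random() < p_cross:
                cut = rng.randint(1, n_bits - 1)   # one-point crossover
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):
                    if rng.random() < p_mut:       # bit-flip mutation
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# OneMax fitness is simply the number of ones in the string
best = evolve(sum)
print(sum(best), "/ 16 bits set")
```

An object-oriented framework of the kind the thesis develops would factor the selection, crossover and mutation operators into separate classes so that each can be swapped or hybridised independently.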
Damage detection and damage evolution monitoring of composite materials for naval applications using acoustic emission testing
Maritime transport has profound importance for the world economy. Vessels of all sizes constantly transport large numbers of passengers and goods across the sea, often under adverse operational conditions. Vessels need to exhibit high levels of reliability, availability, maintainability and safety (RAMS). At the same time, however, their performance needs to be optimised, ensuring the lowest possible fuel consumption with the maximum operational capacity and range, without compromising RAMS. Sweating of naval assets and profitability should be maximised for the operator, ensuring investment in future projects and supporting the growth of maritime transport and the world economy as a whole.
Vessels have traditionally been manufactured using naval steel grades such as AH, DH and EH. Smaller leisure and specialised-purpose vessels, such as patrol boats, have been built using fibre-reinforced composite (FRC) materials. This trend is gradually penetrating the market of larger commercial vessels, including freight and cruise ships. However, these are still early days, and further investigation of the optimum FRC manufacturing techniques and mechanical properties, together with an in-depth understanding of the damage mechanics, is required before such materials can become more commonplace.
This project has investigated different glass FRCs using different manufacturing techniques. Glass fibres are preferred due to their lower cost in comparison with carbon fibres; the use of carbon FRCs in maritime applications is limited to the fabrication of racing and high-performance speedboat vessels. Samples manufactured under laboratory conditions have been compared with those manufactured by a shipyard. It has been seen that the in-house samples had generally superior performance. Steel-to-composite joints have also been assessed, including different designs. The effect of different design features, such as drilled holes and bolts, on the mechanical performance of the manufactured samples has also been evaluated.
The damage mechanisms involved during damage propagation, and the features causing damage initiation, have been considered. Damage initiation and subsequent evolution have been monitored using acoustic emission (AE). Various signal processing approaches have been employed (manual and automatic) for optimum evaluation of the AE data obtained in a semi-quantitative manner. It has been shown that AE could be applied effectively for structural health monitoring of naval structures in the field. Several factors and parameters that need to be considered during acquisition and analysis have been successfully determined. The key results of the study, together with the mechanical testing and characterisation of the samples employed, are presented in summarised form within the present thesis.
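As a loose illustration of the automatic AE processing involved (the thesis's actual pipelines are more sophisticated; the threshold, dead time and synthetic waveform below are assumptions for demonstration), a basic threshold-crossing hit detector might look like:

```python
import numpy as np

def detect_hits(signal, threshold, dead_time=50):
    """Simple threshold-crossing AE hit detector (illustrative sketch).

    Returns (start_index, peak_amplitude) per hit; samples within
    dead_time of a crossing are lumped into the same hit, mimicking
    the hit-definition time used by AE acquisition systems.
    """
    hits = []
    i = 0
    n = len(signal)
    while i < n:
        if abs(signal[i]) >= threshold:
            end = min(i + dead_time, n)
            hits.append((i, float(np.max(np.abs(signal[i:end])))))
            i = end                     # skip the rest of this hit
        else:
            i += 1
    return hits

# Synthetic waveform: quiet background noise with two emission bursts
rng = np.random.default_rng(1)
sig = 0.01 * rng.standard_normal(1000)
sig[200:220] += 0.5
sig[600:615] += 0.8
hits = detect_hits(sig, threshold=0.1)
print(hits)
```

Real AE analysis would go on to extract per-hit features such as rise time, duration, counts and energy, which then feed the manual or automatic classification of damage mechanisms.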
The mobile information access experience - A user perspective
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.

Mobile technologies, such as mobile phones, smartphones and palmtop computers, are in an upwards trend, and the earliest models of such devices are already available to end-users to communicate and access multimedia content on the move. As a logical outcome of this development in mobile technologies and devices, content provider companies have already started investing in and piloting mobile multimedia content distribution and broadcasting technologies. Nevertheless, no matter how cutting-edge the technology is and no matter how stylish the mobile devices are, the ultimate success of wireless communication technologies and devices is directly associated with user adoption and embrace of this new equipment and technology.

In this perspective, since multimedia content, mobile or not, is ultimately produced for the education and/or enjoyment of viewers, the user's perspective concerning presentation quality is surely of equal importance to objective Quality of Service (QoS) technical parameters in defining distributed multimedia quality. In order to comprehensively understand user experiences whilst accessing information using mobile devices and technologies, we investigate user-mobile device interaction and look into the surrounding issues in a uniform manner by combining multiple aspects: the user's initial device experience (Out-of-Box Experience), mobile information access in a real-world context, device impact on user information access, and the impact of perceptually tailored multimedia content on user information assimilation and satisfaction.

Accordingly, an extensive experimental investigation has been undertaken to see how user experiences varied based on device familiarity, device type, real-world context and variable locations. The findings have shown that the overall perception, and effectively the user information access experience, is affected and improved when multimedia content is tailored according to user device type and context. This highlights that the future of mobile computing necessitates two-faceted research, combining both a user and a technical perspective.