2,607 research outputs found

    Automated NDT inspection for large and complex geometries of composite materials

    Large components with complex geometries, made of composite materials, have become very common in modern structures. To cope with future demand projections, it is necessary to overcome the current non-destructive testing (NDT) bottlenecks encountered during the inspection phase of manufacture. This thesis investigates several aspects of the introduction of automation within the inspection process of complex parts. The use of six-axis robots for product inspection and non-destructive testing systems is the central investigation of this thesis. The challenges embraced by the research include the development of a novel control approach for robotic manipulators and of novel path-planning strategies. The integration of robot manipulators and NDT data acquisition instruments is optimized. An effective and reliable way to encode the NDT data through the interpolated robot feedback positions is implemented. The viability of the new external control method is evaluated experimentally. The observed maximum position and orientation errors are within 2 mm and 1 degree respectively, over an operating envelope of 3 m³. A new software toolbox (RoboNDT), aimed at NDT technicians, has been developed during this work. RoboNDT is intended to transform the robot path-planning problem into an easy step of the inspection process. The software incorporates the novel path-planning algorithms developed during this research and is shaped to overcome practical limitations of current off-line path-planning (OLP) software. The software has been experimentally validated using scans of real high-value aerospace components. RoboNDT delivers tool-path errors that are lower than those given by commercial off-line path-planning software. For example, the variability of the standoff is within 10 mm for the tool-paths created with the commercial software and within 4.5 mm for the RoboNDT tool-paths, over a scanned area of 1.6 m². The output of this research was used to support a 3-year industrial project, called IntACom and led by TWI on behalf of major aerospace sponsors. The result is a demonstrator system, currently in use at TWI Technology Centre, which is capable of inspecting complex geometries with high throughput. The IntACom system can scan real components 2.8 times faster than traditional 3-DoF scanners deploying phased-array inspection and 6.7 times faster than commercial gantry systems deploying traditional single-element inspection.
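
    As a rough illustration of the pose-encoding step described above, the sketch below tags each NDT sample with a tool-centre-point position linearly interpolated between the robot's timestamped feedback poses. It is a minimal sketch, assuming the controller streams positions at a lower rate than the ultrasound acquisition; the function and variable names are hypothetical and are not taken from RoboNDT.

```python
import numpy as np

def interpolate_poses(feedback_times, feedback_xyz, sample_times):
    """Linearly interpolate robot TCP positions at the NDT sample timestamps.

    feedback_times : (N,) array of robot feedback timestamps [s]
    feedback_xyz   : (N, 3) array of TCP positions [mm]
    sample_times   : (M,) array of NDT acquisition timestamps [s]
    Returns an (M, 3) array of interpolated positions, one per NDT sample.
    """
    xyz = np.empty((len(sample_times), 3))
    for axis in range(3):
        xyz[:, axis] = np.interp(sample_times, feedback_times, feedback_xyz[:, axis])
    return xyz

if __name__ == "__main__":
    # Hypothetical data: 10 Hz robot feedback, 100 Hz ultrasound acquisition.
    t_fb = np.linspace(0.0, 1.0, 11)
    p_fb = np.column_stack([t_fb * 100.0, np.zeros(11), np.full(11, 50.0)])
    t_scan = np.linspace(0.0, 1.0, 101)
    encoded = interpolate_poses(t_fb, p_fb, t_scan)
    print(encoded[:3])   # positions assigned to the first three NDT samples
```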

    FPGA-Based Portable Ultrasound Scanning System with Automatic Kidney Detection

    While bedside diagnosis using portable ultrasound scanning (PUS) offers comfortable diagnosis with various clinical advantages, ultrasound scanners in general suffer from a poor signal-to-noise ratio, and physicians who operate the device at the point of care may not be adequately trained to perform high-level diagnosis. Such scenarios can be addressed by incorporating ambient intelligence in PUS. In this paper, we propose an architecture for a PUS system whose abilities include automated kidney detection in real time. Automated kidney detection is performed by training the Viola–Jones algorithm with a good set of kidney data consisting of diversified shapes and sizes. It is observed that the kidney detection algorithm delivers very good performance in terms of detection accuracy. The proposed PUS with the kidney detection algorithm is implemented on a single Xilinx Kintex-7 FPGA, integrated with a Raspberry Pi ARM processor running at 900 MHz.
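
    The paper's trained classifier is not published, so the following is only a sketch of how a Viola–Jones detector of this kind is typically run on a B-mode frame, here using OpenCV's CascadeClassifier with a hypothetical cascade file (kidney_cascade.xml) standing in for a cascade trained on kidney images of varied shapes and sizes.

```python
import cv2

# Hypothetical cascade: a Viola-Jones classifier trained offline on B-mode
# kidney images (not provided here).
CASCADE_PATH = "kidney_cascade.xml"

def detect_kidney(frame_gray, cascade):
    """Run the Viola-Jones detector on one grayscale ultrasound frame."""
    regions = cascade.detectMultiScale(
        frame_gray,
        scaleFactor=1.1,   # pyramid step between image scales
        minNeighbors=5,    # overlapping detections required to accept a hit
        minSize=(60, 60),  # ignore implausibly small candidates
    )
    return regions         # list of (x, y, w, h) bounding boxes

if __name__ == "__main__":
    cascade = cv2.CascadeClassifier(CASCADE_PATH)
    frame = cv2.imread("bmode_frame.png", cv2.IMREAD_GRAYSCALE)  # example frame
    for (x, y, w, h) in detect_kidney(frame, cascade):
        cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)     # mark detection
    cv2.imwrite("bmode_detected.png", frame)
```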

    Computational ultrasound tissue characterisation for brain tumour resection

    In brain tumour resection, it is vital to know where critical neurovascular structures and tumours are located to minimise surgical injuries and cancer recurrence. The aim of this thesis was to improve intraoperative guidance during brain tumour resection by integrating both ultrasound standard imaging and elastography in the surgical workflow. Brain tumour resection requires surgeons to identify the tumour boundaries to preserve healthy brain tissue and prevent cancer recurrence. This thesis proposes to use ultrasound elastography in combination with conventional ultrasound B-mode imaging to better characterise tumour tissue during surgery. Ultrasound elastography comprises a set of techniques that measure tissue stiffness, which is a known biomarker of brain tumours. The objectives of the research reported in this thesis are to implement novel learning-based methods for ultrasound elastography and to integrate them in an image-guided intervention framework. Accurate and real-time intraoperative estimation of tissue elasticity can guide towards better delineation of brain tumours and improve the outcome of neurosurgery. We first investigated current challenges in quasi-static elastography, which evaluates tissue deformation (strain) by estimating the displacement between successive ultrasound frames, acquired before and after applying manual compression. Recent approaches in ultrasound elastography have demonstrated that convolutional neural networks can capture ultrasound high-frequency content and produce accurate strain estimates. We proposed a new unsupervised deep learning method for strain prediction, where the training of the network is driven by a regularised cost function, composed of a similarity metric and a regularisation term that preserves displacement continuity by directly optimising the strain smoothness. We further improved the accuracy of our method by proposing a recurrent network architecture with convolutional long short-term memory decoder blocks to improve displacement estimation and spatio-temporal continuity between time-series ultrasound frames. We then demonstrate initial results towards extending our ultrasound displacement estimation method to shear wave elastography, which provides a quantitative estimation of tissue stiffness. Furthermore, this thesis describes the development of an open-source image-guided intervention platform, specifically designed to combine intra-operative ultrasound imaging with a neuronavigation system and perform real-time ultrasound tissue characterisation. The integration was conducted using commercial hardware and validated on an anatomical phantom. Finally, preliminary results on the feasibility and safety of the use of a novel intraoperative ultrasound probe designed for pituitary surgery are presented. Prior to the clinical assessment of our image-guided platform, the ability of the ultrasound probe to be used alongside standard surgical equipment was demonstrated in 5 pituitary cases.
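
    To make the structure of such a regularised cost concrete, the sketch below combines a normalised cross-correlation similarity term with a penalty on the spatial variation of axial strain (the axial gradient of the displacement field). It is an illustrative PyTorch sketch under assumed tensor shapes, not the exact loss used in the thesis; the network and the warping step are omitted, and all names are hypothetical.

```python
import torch

def ncc(a, b, eps=1e-8):
    """Global normalised cross-correlation between image batches of shape (B, 1, H, W)."""
    a = a - a.mean(dim=(2, 3), keepdim=True)
    b = b - b.mean(dim=(2, 3), keepdim=True)
    num = (a * b).sum(dim=(2, 3))
    den = torch.sqrt((a ** 2).sum(dim=(2, 3)) * (b ** 2).sum(dim=(2, 3)) + eps)
    return (num / den).mean()

def strain_smoothness(disp_axial):
    """Penalise spatial variation of axial strain (finite differences of displacement)."""
    strain = disp_axial[:, :, 1:, :] - disp_axial[:, :, :-1, :]   # axial strain estimate
    d_ax = strain[:, :, 1:, :] - strain[:, :, :-1, :]             # axial variation
    d_lat = strain[:, :, :, 1:] - strain[:, :, :, :-1]            # lateral variation
    return d_ax.abs().mean() + d_lat.abs().mean()

def elastography_loss(warped_pre, post, disp_axial, weight=0.1):
    """Similarity term plus strain-smoothness regularisation (weight is an assumption)."""
    return -ncc(warped_pre, post) + weight * strain_smoothness(disp_axial)
```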

    An Inertial-Optical Tracking System for Quantitative, Freehand, 3D Ultrasound

    Three dimensional (3D) ultrasound has become an increasingly popular medical imaging tool over the last decade. It offers significant advantages over Two Dimensional (2D) ultrasound, such as improved accuracy, the ability to display image planes that are physically impossible with 2D ultrasound, and reduced dependence on the skill of the sonographer. Among 3D medical imaging techniques, ultrasound is the only one portable enough to be used by first responders, on the battlefield, and in rural areas. There are three basic methods of acquiring 3D ultrasound images. In the first method, a 2D array transducer is used to capture a 3D volume directly, using electronic beam steering. This method is mainly used for echocardiography. In the second method, a linear array transducer is mechanically actuated, giving a slower and less expensive alternative to the 2D array. The third method uses a linear array transducer that is moved by hand. This method is known as freehand 3D ultrasound. Whether using a 2D array or a mechanically actuated linear array transducer, the position and orientation of each image are known ahead of time. This is not the case for freehand scanning. To reconstruct a 3D volume from a series of 2D ultrasound images, assumptions must be made about the position and orientation of each image, or a mechanism for detecting the position and orientation of each image must be employed. The most widely used method for freehand 3D imaging relies on the assumption that the probe moves along a straight path with constant orientation and speed. This method requires considerable skill on the part of the sonographer. Another technique uses features within the images themselves to form an estimate of each image's relative location. However, these techniques are not well accepted for diagnostic use because they are not always reliable. The final method for acquiring position and orientation information is to use a six Degree-of-Freedom (6 DoF) tracking system. Commercially available 6 DoF tracking systems use magnetic fields, ultrasonic ranging, or optical tracking to measure the position and orientation of a target. Although accurate, all of these systems have fundamental limitations in that they are relatively expensive and they all require sensors or transmitters to be placed in fixed locations to provide a fixed frame of reference. The goal of the work presented here is to create a probe tracking system for freehand 3D ultrasound that does not rely on any fixed frame of reference. This system tracks the ultrasound probe using only sensors integrated into the probe itself. The advantages of such a system are that it requires no setup before it can be used, it is more portable because no extra equipment is required, it is immune from environmental interference, and it is less expensive than external tracking systems. An ideal tracking system for freehand 3D ultrasound would track in all 6 DoF. However, current sensor technology limits this system to five. Linear transducer motion along the skin surface is tracked optically and transducer orientation is tracked using MEMS gyroscopes. An optical tracking system was developed around an optical mouse sensor to provide linear position information by tracking the skin surface. Two versions were evaluated. One included an optical fiber bundle and the other did not.
The purpose of the optical fiber is to allow the system to integrate more easily into existing probes by allowing the sensor and electronics to be mounted away from the scanning end of the probe. Each version was optimized to track features on the skin surface while providing adequate Depth Of Field (DOF) to accept variation in the height of the skin surface. Orientation information is acquired using a 3-axis MEMS gyroscope. The sensor was thoroughly characterized to quantify performance in terms of accuracy and drift. This data provided a basis for estimating the achievable 3D reconstruction accuracy of the complete system. Electrical and mechanical components were designed to attach the sensor to the ultrasound probe in such a way as to simulate its being embedded in the probe itself. An embedded system was developed to perform the processing necessary to translate the sensor data into probe position and orientation estimates in real time. The system utilizes a MicroBlaze soft-core microprocessor and a set of peripheral devices implemented in a Xilinx Spartan-3E field-programmable gate array. The Xilinx Microkernel real-time operating system performs essential system management tasks and provides a stable software platform for implementation of the inertial tracking algorithm. Stradwin 3D ultrasound software was used to provide a user interface and perform the actual 3D volume reconstruction. Stradwin retrieves 2D ultrasound images from the Terason t3000 portable ultrasound system and communicates with the tracking system to gather position and orientation data. The 3D reconstruction is generated and displayed on the screen of the PC in real time. Stradwin also provides essential system features such as storage and retrieval of data, 3D data interaction, reslicing, manual 3D segmentation, and volume calculation for segmented regions. The 3D reconstruction performance of the system was evaluated by freehand scanning a cylindrical inclusion in a CIRS model 044 ultrasound phantom. Five different motion profiles were used and each profile was repeated 10 times. This entire test regimen was performed twice, once with the optical tracking system using the optical fiber bundle, and once with the optical tracking system without the optical fiber bundle. 3D reconstructions were performed with and without the position and orientation data to provide a basis for comparison. Volume error and surface error were used as the performance metrics. Volume error ranged from 1.3% to 5.3% with tracking information versus 15.6% to 21.9% without for the version of the system without the optical fiber bundle. Volume error ranged from 3.7% to 7.6% with tracking information versus 8.7% to 13.7% without for the version of the system with the optical fiber bundle. Surface error ranged from 0.319 mm RMS to 0.462 mm RMS with tracking information versus 0.678 mm RMS to 1.261 mm RMS without for the version of the system without the optical fiber bundle. Surface error ranged from 0.326 mm RMS to 0.774 mm RMS with tracking information versus 0.538 mm RMS to 1.657 mm RMS without for the version of the system with the optical fiber bundle. The prototype tracking system successfully demonstrated that accurate 3D ultrasound volumes can be generated from 2D freehand data using only sensors integrated into the ultrasound probe. One serious shortcoming of this system is that it only tracks 5 of the 6 degrees of freedom required to perform complete 3D reconstructions.
The optical system provides information about linear movement, but because it tracks a surface, it cannot measure vertical displacement. Overcoming this limitation is the most obvious candidate for future research using this system. The overall tracking platform, meaning the embedded tracking computer and the PC software developed and integrated in this work, is ready to take advantage of vertical displacement data, should a method be developed for sensing it.
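
    The arithmetic behind the 5-DoF fusion can be illustrated as follows: gyroscope rates are integrated into an orientation estimate, and the optical sensor's in-plane displacement is rotated into the world frame, with no vertical component available. This is a simplified Python sketch of such an update, not the MicroBlaze firmware; the variable names and the first-order integration scheme are assumptions.

```python
import numpy as np

def rot_from_gyro(omega, dt, R):
    """Integrate a body-frame angular rate (rad/s) over dt with a small-angle update."""
    wx, wy, wz = omega * dt
    skew = np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])
    return R @ (np.eye(3) + skew)            # first-order approximation of the exponential map

def update_pose(R, p, omega, mouse_dxy, dt):
    """One tracking update: MEMS gyro rates plus optical surface displacement.

    The optical sensor reports in-plane motion (dx, dy) in the probe frame;
    out-of-plane (vertical) translation is unobserved, matching the 5-DoF limit.
    """
    R_new = rot_from_gyro(omega, dt, R)
    d_body = np.array([mouse_dxy[0], mouse_dxy[1], 0.0])   # no vertical component
    p_new = p + R_new @ d_body
    return R_new, p_new

if __name__ == "__main__":
    R, p = np.eye(3), np.zeros(3)
    R, p = update_pose(R, p, omega=np.array([0.0, 0.0, 0.1]),
                       mouse_dxy=(0.5, 0.0), dt=0.01)
    print(R, p)
```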

    Automated and Standardized Tools for Realistic, Generic Musculoskeletal Model Development

    Human movement is an instinctive yet challenging task that involves complex interactions between the neuromusculoskeletal system and the surrounding environment. One key obstacle to understanding human locomotion is the limited availability and validity of experimental data and computational models. Measurements describing the relationships between the nervous and musculoskeletal systems and their dynamics are highly variable. Likewise, computational models, and musculoskeletal models in particular, are vitally dependent on these measurements to define model behavior and mechanics. These measurements are often sparse and disparate because of unsystematic data collection with variable methodologies and reporting conventions. To date, there is no framework to concatenate and manage musculoskeletal data (muscle moment arms and lengths). These morphological measurements need to be assembled so that they can be managed, compared, and analyzed to develop comprehensive musculoskeletal models. Such a framework would enable researchers to select and update the posture-dependent relationships necessary to describe musculoskeletal dynamics, which are essential for simulation of muscle and joint torques in movement. As with all simulations, these models require rigorous validation to ensure their accuracy. This is particularly important for musculoskeletal models that represent high-dimensional, posture-dependent relationships developed from limited and variable datasets. Here, I developed a computational workflow to collect and manage moment arm datasets from the published literature for the development of a human lower-limb musculoskeletal model. The moment arm relationships from multiple datasets were then used to create complete moment arm descriptions for all major leg muscles and were validated within a generic musculoskeletal model. These developments advance musculoskeletal modeling by providing standardized software and workflows for managing high-dimensional, posture-dependent morphological data and for creating realistic and robust musculoskeletal models.
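
    As an illustration of the kind of posture-dependent relationship such a framework manages, the sketch below fits a polynomial to pooled (joint angle, moment arm) measurements and evaluates it at a new posture. The data values, polynomial degree, and function names are hypothetical and are not taken from the thesis workflow.

```python
import numpy as np

def fit_moment_arm(joint_angles_deg, moment_arms_mm, degree=3):
    """Fit a polynomial describing a muscle's moment arm as a function of joint angle.

    Measurements pooled from several published datasets can be concatenated
    before fitting, giving one posture-dependent description per muscle-joint pair.
    """
    coeffs = np.polyfit(joint_angles_deg, moment_arms_mm, degree)
    return np.poly1d(coeffs)

if __name__ == "__main__":
    # Hypothetical pooled measurements (degrees, millimetres) for one muscle.
    angles = np.array([-20, 0, 20, 40, 60, 80, 100])
    arms = np.array([38.0, 42.5, 45.0, 44.0, 40.5, 35.0, 28.5])
    moment_arm = fit_moment_arm(angles, arms)
    print(moment_arm(30.0))   # predicted moment arm at a 30-degree posture
```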

    Kinematics and Robot Design II (KaRD2019) and III (KaRD2020)

    This volume collects papers published in two Special Issues, “Kinematics and Robot Design II, KaRD2019” (https://www.mdpi.com/journal/robotics/special_issues/KRD2019) and “Kinematics and Robot Design III, KaRD2020” (https://www.mdpi.com/journal/robotics/special_issues/KaRD2020), which are the second and third issues of the KaRD Special Issue series hosted by the open access journal Robotics. The KaRD series is an open environment where researchers present their works and discuss all topics focused on the many aspects that involve kinematics in the design of robotic/automatic systems. It aims at being an established reference for researchers in the field, as other serial international conferences/publications are. Even though the KaRD series publishes one Special Issue per year, all the received papers are peer-reviewed as soon as they are submitted and, if accepted, they are immediately published in MDPI Robotics. Kinematics is so intimately related to the design of robotic/automatic systems that the admitted topics of the KaRD series practically cover all the subjects normally present in well-established international conferences on “mechanisms and robotics”. KaRD2019 and KaRD2020 together received 22 papers and, after the peer-review process, accepted 17 of them. The accepted papers cover problems related to theoretical and computational kinematics, biomedical engineering, and other design and applicative aspects.

    SONAR SYSTEM SOFTWARE AND HARDWARE DEVELOPMENT: FISH FINDER APPLICATION

    A study of how to develop a sonar system is presented in this thesis. A typical sonar system has two sub-circuits, namely a transmit path and a receive path. On the transmit path, HV (high-voltage) pulses are generated and transmitted into the water using a transducer. On the receive path, the echo of the transmitted signal is received. This received signal is then amplified and converted into digital data using an ADC. The digital data is then sent to a PC for signal processing. After signal processing, the results are shown in a GUI. Software technologies used in this study include the Fish Finder GUI, a Visual C++ GUI and back-end code, the AFE5809 GUI, and High Speed Data Converter Pro (by Texas Instruments). Matlab was used to develop the Fish Finder GUI, similar to commercial fish finders. The Fish Finder GUI also performs noise filtering as well as calculating the depth and distance to the detected objects. Visual C++ was used to develop a software tool, called the Control GUI. This software receives the data from the TSW1400 and saves the data as a .CSV file. It also connects all of the software programs together to work as a complete fish finder system. The developed sonar system was tested in 1 meter of water, as well as in a 10-meter shore-side test. First, an HV pulse was generated and transmitted into the water. After that, the echo of the pulse was captured. Then, the echo was converted into digital data. Finally, the digital data results were successfully shown in the GUIs.
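
    As an illustration of the depth calculation mentioned above, the sketch below converts the delay of the first strong echo in a digitised A-scan into a range, using a nominal speed of sound in water of about 1500 m/s. It is a simplified Python/SciPy sketch, not the Matlab code behind the Fish Finder GUI; the sampling rate, threshold, and synthetic echo are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

SPEED_OF_SOUND_WATER = 1500.0   # m/s, nominal; varies with temperature and salinity

def depth_from_echo(rx_samples, fs, threshold_ratio=0.5):
    """Estimate target depth from one received, digitised echo record.

    rx_samples : receive-path signal after the ADC
    fs         : sampling rate in Hz
    Returns the depth in metres of the first echo whose envelope crosses the threshold.
    """
    envelope = np.abs(hilbert(rx_samples))          # demodulate the echo
    threshold = threshold_ratio * envelope.max()
    idx = np.argmax(envelope > threshold)           # first sample over threshold
    t_round_trip = idx / fs
    return SPEED_OF_SOUND_WATER * t_round_trip / 2  # halve the two-way travel time

if __name__ == "__main__":
    fs = 1_000_000                                  # 1 MS/s, assumed ADC rate
    t = np.arange(0, 0.002, 1 / fs)
    # Synthetic 200 kHz echo burst arriving about 1.33 ms after transmission (~1 m target).
    echo = np.sin(2 * np.pi * 200e3 * t) * (np.abs(t - 0.00133) < 20e-6)
    print(f"Estimated depth: {depth_from_echo(echo, fs):.2f} m")
```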