
    A spatial impedance controller for robotic manipulation

    Mechanical impedance is the dynamic generalization of stiffness and, by definition, determines interactive behavior. Although the argument for explicitly controlling impedance is strong, impedance control has had only a modest impact on robotic manipulator control practice. This is due in part to the difficulty of selecting suitable impedances for given tasks. A spatial impedance controller is presented that simplifies impedance selection. Impedance is characterized using "spatially affine" families of compliance and damping, which are described by nonspatial and spatial parameters. Nonspatial parameters are selected independently of the configuration of the object with which the robot must interact. Spatial parameters depend on object configuration, but transform in an intuitive, well-defined way. Control laws corresponding to these compliance and damping families are derived assuming a commonly used robot model. The compliance control law was implemented in simulation and on a real robot, but this paper emphasizes the underlying theory.
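    The abstract does not reproduce the paper's formulation, but the separation it describes can be illustrated with a standard example: principal stiffnesses chosen once (the nonspatial part) and re-expressed in the world frame whenever the object frame rotates (the spatial part). The function names, numbers, and the congruence-transform rule below are assumptions for illustration only, not the paper's control law.

```python
# Illustrative only: a diagonal stiffness chosen once ("nonspatial"),
# re-expressed in the world frame when the object frame rotates ("spatial").
import numpy as np

def rotation_z(theta):
    """Rotation about the z-axis by theta radians (object orientation)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Nonspatial parameters: principal stiffnesses (N/m), chosen independently
# of where the object sits.
K_principal = np.diag([500.0, 500.0, 2000.0])

# Spatial parameter: the object's orientation. The world-frame stiffness
# transforms congruently, K_world = R K R^T.
R = rotation_z(np.deg2rad(30.0))
K_world = R @ K_principal @ R.T

# Restoring force for a small 1 cm end-effector displacement along x.
force = -K_world @ np.array([0.01, 0.0, 0.0])
print(force)
```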

    Mechatronics of systems with undetermined configurations

    This work is submitted for the award of a PhD by published works. It deals with some of the efforts of the author over the last ten years in the field of Mechatronics. Mechatronics is a field that emerged in Japan in the late 1970s; it consists of a synthesis of computers and electronics to improve mechanical systems. To control any mechanical event, three fundamental elements must be brought together: the sensors used to observe the process; the control software, including the control algorithm; and the actuator that provides the stimulus to achieve the end result. Simulation, which plays such an important part in the mechatronics process, is used in both continuous and discrete forms. The author has spent considerable time developing skills in all these areas. The author was certainly the first at Middlesex to appreciate the new developments in Mechatronics and their significance for manufacturing, and was one of the first mechanical engineers to recognise the significance of the new transputer chip. This was applied to the LQG optimal control of a cinefilm copying process, where a 300% improvement in operating speed was achieved, together with tension control. To make more efficient use of robots they have to be made both faster and cheaper. The author measured extremely low natural frequencies of vibration in existing robots, ranging from 3 to 25 Hz, which limits their speed of response. The vibration data was some of the earliest available in this field, certainly in the UK. Several schemes have been devised to control the flexible robot and maintain the required precision. Actuator technology is one area where mechatronic systems have been the subject of intense development. At Middlesex we have improved on the Aexator pneumatic muscle actuator, enabling it to be used with a precision of about 2 mm. New control challenges have since been undertaken in the fields of machine tool chatter and the prevention of slip, where a variety of novel and traditional control algorithms have been investigated to find the best approach to these problems.

    Parallel algorithms for three dimensional electrical impedance tomography

    This thesis is concerned with Electrical Impedance Tomography (EIT), an imaging technique in which pictures of the electrical impedance within a volume are formed from current and voltage measurements made on the surface of the volume. The focus of the thesis is the mathematical and numerical aspects of reconstructing the impedance image from the measured data (the reconstruction problem). The reconstruction problem is mathematically difficult and most reconstruction algorithms are computationally intensive. Many of the potential applications of EIT in medical diagnosis and industrial process control depend upon rapid reconstruction of images. The aim of this investigation is to find algorithms and numerical techniques that lead to fast reconstruction while respecting the real mathematical difficulties involved. A general framework for Newton-based reconstruction algorithms is developed which describes a large number of the reconstruction algorithms used by other investigators. Optimal experiments are defined in terms of current drive and voltage measurement patterns, and it is shown that adaptive current reconstruction algorithms are a special case of their use. This leads to a new reconstruction algorithm using optimal experiments which is considerably faster than other methods of the Newton type. A tomograph is tested to measure the magnitude of the major sources of error in the data used for image reconstruction. An investigation into the numerical stability of reconstruction algorithms identifies the resulting uncertainty in the impedance image. A new data collection strategy and a numerical forward model are developed which minimise the effects of what were previously the major sources of error. A reconstruction program is written for a range of Multiple Instruction Multiple Data (MIMD) distributed-memory parallel computers. These machines promise high computational power at low cost and so look promising as components in medical tomographs. The performance of several reconstruction algorithms on these computers is analysed in detail.
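    As a rough sketch of the Newton-type framework referred to above, the fragment below performs one regularised Gauss-Newton update of a conductivity image from boundary voltage data. The forward model, Jacobian, measured voltages and regularisation weight are stand-in assumptions, not the thesis's actual algorithm or data.

```python
# One regularised Gauss-Newton update of a conductivity image; forward(),
# jacobian(), v_meas and lam are stand-in assumptions, not the thesis's
# forward model or data.
import numpy as np

def gauss_newton_step(sigma, forward, jacobian, v_meas, lam=1e-2):
    """Return the updated conductivity vector after one Newton-type step."""
    residual = v_meas - forward(sigma)            # misfit between data and model
    J = jacobian(sigma)                           # sensitivity of voltages to sigma
    H = J.T @ J + lam * np.eye(sigma.size)        # regularised normal matrix
    delta = np.linalg.solve(H, J.T @ residual)    # image update direction
    return sigma + delta
```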

    Dynamic Power Management for Neuromorphic Many-Core Systems

    This work presents a dynamic power management architecture for neuromorphic many-core systems such as SpiNNaker. A fast dynamic voltage and frequency scaling (DVFS) technique is presented which allows the processing elements (PE) to change their supply voltage and clock frequency individually and autonomously within less than 100 ns. This is employed by the neuromorphic simulation software flow, which defines the performance level (PL) of each PE based on its actual workload within each simulation cycle. A test chip in 28 nm SLP CMOS technology has been implemented. It includes 4 PEs which can be scaled from 0.7 V to 1.0 V with frequencies from 125 MHz to 500 MHz at three distinct PLs. Measurement of three neuromorphic benchmarks shows that the total PE power consumption can be reduced by 75%, with 80% baseline power reduction and a 50% reduction of energy per neuron and synapse computation, all while maintaining temporary peak system performance to achieve biological real-time operation of the system. A numerical model of this power management scheme is derived which allows DVFS architecture exploration for neuromorphics. The proposed technique is to be used for the second-generation SpiNNaker neuromorphic many-core system.
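    A minimal sketch of the kind of per-cycle performance-level selection described above is given below; the three voltage/frequency pairs echo the figures quoted in the abstract, but the threshold rule and per-spike cycle cost are illustrative assumptions, not the published SpiNNaker software flow.

```python
# Hypothetical per-cycle performance-level selection. The three
# (voltage, frequency) pairs echo the figures quoted above; the
# threshold rule and per-spike cycle cost are illustrative assumptions.
PERFORMANCE_LEVELS = [   # (supply voltage in V, clock frequency in MHz)
    (0.7, 125),
    (0.85, 250),
    (1.0, 500),
]

CYCLES_PER_SPIKE = 200   # assumed processing cost per spike event

def select_performance_level(spikes_to_process, cycle_budget_us=1000):
    """Pick the lowest PL that still finishes the workload within the
    biological real-time simulation cycle (1 ms by default)."""
    for voltage, freq_mhz in PERFORMANCE_LEVELS:
        cycles_needed = spikes_to_process * CYCLES_PER_SPIKE
        time_needed_us = cycles_needed / freq_mhz   # MHz = cycles per microsecond
        if time_needed_us <= cycle_budget_us:
            return voltage, freq_mhz
    return PERFORMANCE_LEVELS[-1]   # fall back to the fastest level
```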

    15 years of experience with mechatronics research and education

    This paper describes experiences with mechatronic research projects and several educational structures at the University of Twente since 1989. Education took place in a two-year Mechatronic Designer programme, in specialisations within Electrical and Mechanical Engineering, and in an (international) MSc programme. There are two-week mechatronic projects in the BSc curricula of EE and ME. Many of the PhD and MSc projects were done in projects sponsored by industry or by application-oriented research programmes. Research topics included modelling and simulation, (learning) control, embedded systems and mechatronic design.

    Current status and new horizons in Monte Carlo simulation of X-ray CT scanners

    With the advent of powerful computers and parallel processing, including Grid technology, the use of Monte Carlo (MC) techniques for radiation transport simulation has become the most popular method for modeling radiological imaging systems, and particularly X-ray computed tomography (CT). The stochastic nature of the processes involved, such as X-ray photon generation, interaction with matter and detection, makes MC the ideal tool for accurate modeling. MC calculations can be used to assess the impact of different physical design parameters on overall scanner performance, clinical image quality and absorbed dose in CT examinations, which can be difficult or even impossible to estimate by experimental measurement and theoretical analysis. Simulations can also be used to develop and assess correction methods and reconstruction algorithms aimed at improving image quality and quantitative procedures. This paper focuses mainly on recent developments and future trends in X-ray CT MC modeling tools and their areas of application. An overview of existing programs and their useful features is given, together with recent developments in the design of computational anthropomorphic models of the human anatomy. It should be noted that, due to limited space, the references contained herein are for illustrative purposes and are not inclusive; no implication that those chosen are better than others not mentioned is intended.
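    As a toy illustration of the photon-transport core that such MC codes share (not taken from any specific program surveyed in the paper), the fragment below samples free path lengths from the exponential attenuation law and makes a crude absorb-or-scatter decision; the coefficients are placeholders.

```python
# Toy photon-transport loop: exponential free-path sampling followed by a
# crude absorb-or-scatter decision. The coefficients are placeholders, not
# values from any of the codes reviewed.
import math
import random

MU_TOTAL = 0.2          # total linear attenuation coefficient (1/cm)
P_PHOTOELECTRIC = 0.3   # probability that an interaction is an absorption

def transport_one_photon():
    """Track one photon until absorption; return the sampled path lengths."""
    paths = []
    while True:
        # Free path from p(s) = mu * exp(-mu * s); 1 - random() avoids log(0).
        s = -math.log(1.0 - random.random()) / MU_TOTAL
        paths.append(s)
        if random.random() < P_PHOTOELECTRIC:
            return paths            # photon absorbed
        # Otherwise treat the interaction as a scatter and keep tracking.
```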

    Analysis of Some Textured Images by Transputer

    Texture, as a visual perception, can be easily seen by eye and often described without much difficulty. However, textural recognition and measurement by machine is a very different issue and has only recently been developed. In this thesis, a whole set of new algorithms has been developed to analyse textured images, with particular reference to the requirements of soil microstructural applications. The new technology of parallel processing is used to implement and improve the complicated computations.
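    The thesis's own algorithms are not described in this abstract; as a generic example of what machine texture measurement can involve, the sketch below computes a grey-level co-occurrence matrix and a single contrast feature from it. This is a standard texture tool used purely for illustration, not necessarily one of the algorithms developed in the thesis.

```python
# Generic texture measurement for illustration: a grey-level co-occurrence
# matrix over horizontal neighbours and its contrast feature.
import numpy as np

def glcm_contrast(image, levels=8):
    """Contrast feature of the horizontal-neighbour co-occurrence matrix
    for a 2-D image with values in [0, 1]."""
    quantised = (image * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for row in quantised:                      # count horizontal neighbour pairs
        for a, b in zip(row[:-1], row[1:]):
            glcm[a, b] += 1
    glcm /= glcm.sum()                         # normalise to joint probabilities
    i, j = np.indices(glcm.shape)
    return float(np.sum(glcm * (i - j) ** 2))  # larger = stronger local variation
```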

    Swarms on Continuous Data

    While it is extremely important, many Exploratory Data Analysis (EDA) systems are unable to perform classification and visualization on a continuous basis, or to self-organize new data items into the older ones (or even into new labels if necessary), which can be crucial in KDD - Knowledge Discovery, Retrieval and Data Mining systems (interactive and online forms of Web applications are just one example). This disadvantage is also present in more recent approaches using Self-Organizing Maps. In the present work, exploiting past successes with recently proposed Stigmergic Ant Systems, a robust online classifier is presented which produces class decisions on a continuous data stream, allowing for continuous mappings. Results show that increasingly better results are achieved, as demonstrated by other authors in different areas. KEYWORDS: Swarm Intelligence, Ant Systems, Stigmergy, Data-Mining, Exploratory Data Analysis, Image Retrieval, Continuous Classification. Comment: 6 pages, 3 figures, at http://alfa.ist.utl.pt/~cvrm/staff/vramos/ref_45.htm
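    The paper's classifier is not specified in this abstract; the sketch below only illustrates the general stigmergic idea it builds on: each incoming item reinforces a pheromone trace on the class it is assigned to while all traces evaporate, so the mapping keeps adapting to the stream. The similarity measure, constants and update rules are assumptions for illustration, not the authors' algorithm.

```python
# Generic stigmergic update for illustration, not the authors' classifier:
# the winning class prototype is reinforced with "pheromone" and drifts
# towards the item, while all pheromone traces evaporate each step.
import numpy as np

EVAPORATION = 0.05   # fraction of pheromone lost per item
DEPOSIT = 1.0        # reinforcement added to the winning class

def classify_online(item, prototypes, pheromone):
    """Assign one streamed item to a class and update the shared traces."""
    similarity = -np.linalg.norm(prototypes - item, axis=1)     # closer = better
    scores = similarity + np.log1p(pheromone)                   # stigmergic bias
    winner = int(np.argmax(scores))
    pheromone *= (1.0 - EVAPORATION)                            # evaporation
    pheromone[winner] += DEPOSIT                                # reinforcement
    prototypes[winner] = 0.9 * prototypes[winner] + 0.1 * item  # adapt to stream
    return winner
```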

    Towards Solving the Dopamine G Protein Coupled Receptor Modelling Problem

    The overall aim of this work has been to furnish a model of the dopamine (DA) receptor D2. There are currently two sub-groups within the DA family of G protein coupled receptors (GPCRs): the D1 sub-group (which includes D1 and D5) and the D2 sub-group (which includes D2, D3 and D4). Organon (UK) Ltd. supplied a disk containing the PDB atomic co-ordinates of the integral membrane protein bacteriorhodopsin (bRh; Henderson et al., 1975 and 1990) to use as a template to model D2, the aim being to generate a model of D2 by simply mutating the side-chain residues of bRh, on the assumption that bRh had homology with members of the supergene class of GPCRs. However, using the GCG Wisconsin GAP algorithm (Devereux et al., 1984), no significant homology was detected between the primary structures of any member of the DA family of GPCRs and bRh. Given the original brief to carry out homology modelling using bRh as a template (see appendix 1), I felt obliged to carry out further alignments using a shuffling technique and a standard statistical test to check for significant structural homology. The results clearly showed that there is no significant structural homology, on the basis of sequence similarity, between bRh and any member of the DA family of GPCRs. Indeed, the statistical analysis clearly demonstrated that while there is significant structural homology between every catecholamine-binding GPCR, there is no structural homology whatsoever between any catecholamine-binding GPCR and bRh. Hydropathy analysis is frequently used to identify the location of putative transmembrane segments (ptms). However, it is difficult to predict the end positions of each ptms. To this end a novel alignment algorithm (DH Scan) was coded to exploit transparallel supercomputer technology, providing a basis for identifying likely helix end points and for pinpointing areas of local homology between GPCRs. DH Scan clearly demonstrated characteristic transmembrane homology between DA GPCRs of different subtype. Two further homology algorithms were coded (IH Scan and RH Scan) which provided evidence of internal homology. In particular, IH Scan independently revealed a repeat region in the third intracellular loop (iIII) of D4, and RH Scan revealed palindrome-like short stretches of amino acids which were found to be particularly well represented in predicted α-helices in each DA receptor subtype. In addition, the profile network prediction algorithm (PHD; Rost et al., 1994) predicted a short alpha-helix, at greater than 80% probability, at each end of the third intracellular loop and between the carboxy-terminal end of transmembrane helix VII and a conserved Cys residue in the fourth intracellular loop. Fourier analysis of catecholamine-binding GPCR primary structures in the form of a multiple-sequence file suggested that the consensus view that only those residues facing the protein interior are conserved is not entirely correct. In particular, transmembrane helices II and III do not exhibit the residue conservancy characteristic of an amphipathic helix. It is proposed that these two helices undergo a form of helix interface shear to assist agonist binding to an Asp residue on helix II. These data, in combination with information from a number of papers concerning the helix interface shear mechanism and molecular dynamics studies of proline-containing α-helices, suggested a physically plausible binding mechanism for agonists.
While it was evident that homology modelling could not be scientifically justified, the combinatorial approach to protein modelling might be successfully applied to the transmembrane region of the D2 receptor. The probable arrangement of helices in the transmembrane region of GPCRs (Baldwin, 1993), which was based on a careful analysis of a low-resolution projection map of rhodopsin (Gebhard et al., 1993), was used as a guide to model the transmembrane region of D2. The backbone torsion angles of a helix with a middle Pro residue (Sankararamakrishnan et al., 1991) were used to model transmembrane helix V. Dopamine was successfully docked to the putative binding pocket of D2. Using this model as a template, models of D3 and D4 were produced. A separate model of D1 was then produced and this in turn was used as a template to model D5.
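A sketch of the kind of shuffling test mentioned above is given below: the score of the real alignment is compared with the distribution of scores obtained after repeatedly shuffling one sequence, yielding a z-score for the observed similarity. The alignment_score callable stands in for the GCG GAP algorithm, which is not reproduced here.

```python
# The alignment_score callable stands in for the GCG GAP algorithm; the
# shuffling test itself is the standard procedure described above.
import random
import statistics

def shuffle_significance(seq_a, seq_b, alignment_score, n_shuffles=100):
    """Z-score of the real alignment score against scores of shuffled seq_b."""
    real = alignment_score(seq_a, seq_b)
    shuffled_scores = []
    for _ in range(n_shuffles):
        residues = list(seq_b)
        random.shuffle(residues)                 # destroy order, keep composition
        shuffled_scores.append(alignment_score(seq_a, "".join(residues)))
    mean = statistics.mean(shuffled_scores)
    sd = statistics.stdev(shuffled_scores)
    return (real - mean) / sd                    # large z suggests real homology
```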