General Rotation Invariance Learning for Point Clouds via Weight-Feature Alignment
Compared to 2D images, 3D point clouds are much more sensitive to rotations.
We expect point features that describe a given pattern to remain invariant
under rotation. Many recent state-of-the-art (SOTA) works are dedicated to
rotation-invariant learning for 3D point clouds. However, current
rotation-invariant methods generalize poorly to point clouds of open scenes
because they rely on the global distribution, i.e., the global scene and
backgrounds. Considering that the output activation is a function of the
pattern and its orientation, we need to eliminate the effect of the
orientation. In this paper, inspired by the idea that the network weights can be
considered a set of points distributed in the same 3D space as the input
points, we propose Weight-Feature Alignment (WFA) to construct a local
Invariant Reference Frame (IRF) via aligning the features with the principal
axes of the network weights. Our WFA algorithm provides a general solution for
the point clouds of all scenes. WFA ensures that the response activity is a
necessary and sufficient indicator of the degree of pattern matching. In
practice, we perform experiments on the point clouds of
both single objects and open large-range scenes. The results suggest that our
method nearly bridges the gap between rotation-invariant learning and normal
methods. Comment: 4 figures
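The core idea of aligning to a canonical frame can be sketched in a few lines. WFA builds its frame from the principal axes of the network weights; the hedged sketch below instead uses the classic variant that derives the frame from the input patch itself (the function names and the sign convention are illustrative, not the paper's code), which makes the rotation invariance easy to check:

```python
import numpy as np

def canonical_frame(points):
    """Principal axes of a point set with a rotation-equivariant sign
    convention (third moment of each projection made non-negative)."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes = eigvecs[:, np.argsort(eigvals)[::-1]]  # columns, largest variance first
    proj = centered @ axes
    for i in range(axes.shape[1]):
        if np.sum(proj[:, i] ** 3) < 0:           # fix each axis's sign
            axes[:, i] *= -1
    return axes

def aligned_response(weights, patch):
    """Correlate a local patch with a set of weight points after rotating
    the patch into its canonical frame, so the response depends on the
    pattern rather than on its orientation."""
    centered = patch - patch.mean(axis=0)
    aligned = centered @ canonical_frame(patch)
    return float(np.sum(aligned @ weights.T))
```

Rotating the patch by any rotation leaves `aligned_response` unchanged up to floating-point error, which is the property the paper's weight-derived frame also targets.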
Event-Driven Technologies for Reactive Motion Planning: Neuromorphic Stereo Vision and Robot Path Planning and Their Application on Parallel Hardware
Robotics is increasingly becoming a key factor in technological progress. Despite impressive advances in recent decades, mammalian brains still outperform even the most powerful machines in vision and motion planning. Industrial robots are very fast and precise, but their planning algorithms are not capable enough for highly dynamic environments such as those required for human-robot collaboration (HRC). Without fast and adaptive motion planning, safe HRC cannot be guaranteed. Neuromorphic technologies, including visual sensors and hardware chips, operate asynchronously and thus process spatio-temporal information very efficiently. Event-based visual sensors in particular already outperform conventional, synchronous cameras in many applications. Event-based methods therefore have great potential to enable faster and more energy-efficient motion control algorithms for HRC. This work presents an approach to flexible, reactive motion control of a robot arm. Exteroception is achieved through event-based stereo vision, and path planning is implemented in a neural representation of the configuration space. The multi-view 3D reconstruction is evaluated through a qualitative analysis in simulation and transferred to a stereo system of event-based cameras. A demonstrator with an industrial robot is used to evaluate reactive, collision-free online planning; it is also used for a comparative study of sampling-based planners. This is complemented by a benchmark of parallel hardware solutions, with robotic path planning chosen as the test scenario. The results show that the proposed neural solutions are an effective way to realize robot control for dynamic scenarios. This work lays a foundation for neural solutions in adaptive manufacturing processes, including collaboration with humans, without sacrificing speed or safety. It thus paves the way for the integration of brain-inspired hardware and algorithms into industrial robotics and HRC.
Detailed population balance modelling of industrial titania synthesis
This thesis presents an efficient and robust detailed population balance framework for simulating aerosol synthesis of structured particles using a stochastic method. This is developed in the context of the industrial titania (TiO2) process to enable extensive numerical characterisation of the pigmentary product.
A reactor network model is used to provide a modular treatment of the reactor and account for key features, including multiple reactant injections, and tubular reaction and cooling zones. This approach simplifies the flow field in order to focus computational effort on resolving particle structure using a high-dimensional particle model and its modularity offers flexibility to investigate different configurations. Initial results are presented using a pre-defined temperature profile in the network, and the particulate product is characterised by its property distributions. Numerical performance is studied, highlighting the high computational cost of simulating strong phase-coupling, fast process rates, and broad particle size distributions.
A novel hybrid particle model is developed to address these challenges. The hybrid particle model employs a univariate description of small particles and switches to a detailed particle model to resolve morphology of more complicated, aggregate particles. New simulation algorithms are presented to manage interactions between particles of each type. The hybrid model is shown to improve efficiency (resolution versus computational cost) and robustness (sensitivity to numerical parameters), while generating the same solutions and convergence behaviour as earlier models.
The reactor model is extended, utilising the superior numerical performance of the new hybrid particle model to enable inclusion of a system energy balance for more accurate study of a broad range of process conditions, and a more sophisticated particle model to resolve particle geometry. These contributions facilitate the study of particle structure and its sensitivity to reactor design and operational choices, providing insight into how operation affects characteristics of the particles and allowing direct comparison with experimental images of the pigmentary product. This research was supported by the National Research Foundation, Prime Minister's Office, Singapore under its Campus for Research Excellence and Technological Enterprise (CREATE) programme, and by Venator.
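As a rough illustration of the type-switching idea behind the hybrid particle model (the switch criterion and the threshold value here are invented for the sketch, not taken from the thesis), a coagulation step that promotes particles from a cheap univariate type to a detailed type might look like:

```python
class SimpleParticle:
    """Univariate type: tracked only by its total volume (cheap)."""
    def __init__(self, volume):
        self.volume = volume

class DetailedParticle:
    """Detailed type: resolves morphology as a list of primary volumes."""
    def __init__(self, primaries):
        self.primaries = list(primaries)

    @property
    def volume(self):
        return sum(self.primaries)

def coagulate(p, q, switch_volume=10.0):
    """Coagulation with a type switch: stay univariate while the result is
    small, promote to the detailed type once an aggregate exceeds the
    threshold (threshold value is illustrative only)."""
    total = p.volume + q.volume
    if (isinstance(p, SimpleParticle) and isinstance(q, SimpleParticle)
            and total < switch_volume):
        return SimpleParticle(total)
    primaries = []
    for particle in (p, q):
        if isinstance(particle, DetailedParticle):
            primaries.extend(particle.primaries)
        else:
            primaries.append(particle.volume)
    return DetailedParticle(primaries)
```

Keeping small particles univariate is what buys the efficiency: the detailed bookkeeping is paid for only by the aggregates whose morphology actually matters.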
Multi-Scale Force Transmission to and Within the Nucleus.
The mechanical state of cells, controlled primarily by cytoskeletal (CSK) networks
(actin, microtubules and intermediate filaments), is a critical component of maintaining
healthy function. Forces transmitted through the cytoskeleton influence
the organisation and state of nuclear material, leading to changes in gene expression.
This thesis aims to increase our understanding of the role of the CSK
networks, specifically the intermediate filament keratin, and their interplay in integrating
mechanical forces. We primarily use immunofluorescence imaging of the
CSK networks and the nucleus, supported by Atomic Force Microscopy. We work
in human epidermal keratinocytes (HEKs), as they are rich in keratin, whose role
in cytoskeletal force transmission is under-studied.
Since drugs to disrupt keratin are scarce, we first established that Withaferin-A, a
compound previously used to disrupt vimentin intermediate filaments, can disrupt
keratin at non-cytotoxic doses, impacting cell mechanics and migration.
Following from this, Withaferin-A was used alongside established cyto-modulatory
drugs to disrupt CSK networks, quantifying a range of properties describing their
organisation. These data were fitted against nuclear parameters, revealing
opposing effects of keratin and tubulin on the nuclear state of HEKs, with
keratin protecting the nucleus from mechanical force.
Finally, machine and deep learning techniques were used to expand the mathematical
modelling of data. By training networks to predict nuclear location from
only CSK images, a causative relationship between CSK organisation and nuclear
location can be derived. In addition, we develop new models to rapidly analyse
Atomic Force Microscopy curves and generate synthetic cell images.
These results demonstrate the important role of keratin in protecting the nucleus
from mechanical force and that deep learning techniques can be used in the study
of cell mechanics to gain new insights.
An Energy-Efficient and Reliable Data Transmission Scheme for Transmitter-based Energy Harvesting Networks
Energy harvesting technology has been studied to overcome the limited power resources of sensor networks. This paper proposes a new data transmission period control and reliable data transmission algorithm for energy harvesting based sensor networks. Although previous studies have proposed communication protocols for energy harvesting based sensor networks, further discussion is still needed. The proposed algorithm dynamically controls the data transmission period and the number of data transmissions based on environmental information. Through this, energy consumption is reduced and transmission reliability is improved. Simulation results show that the proposed algorithm is more efficient than EnOcean, a previous energy harvesting based communication standard, in terms of transmission success rate and residual energy. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2012R1A1A3012227).
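A minimal sketch of energy-aware period control, assuming the node knows its residual energy, its harvesting rate, and the per-transmission energy cost (the rule and all parameter names are illustrative, not the paper's algorithm):

```python
def next_period(residual_j, harvest_rate_w, tx_cost_j,
                min_period_s=1.0, max_period_s=60.0):
    """Choose the next transmission period so that harvested energy
    covers the cost of one transmission, clamped to sane bounds.
    Units: joules for energy, watts for harvest rate, seconds for time."""
    if harvest_rate_w <= 0:
        return max_period_s          # nothing coming in: slow down fully
    # time to harvest one transmission's worth of energy, discounted by
    # whatever is already stored in the buffer
    needed_j = max(tx_cost_j - residual_j, 0.0)
    period = needed_j / harvest_rate_w
    return min(max(period, min_period_s), max_period_s)
```

A rule of this shape shortens the period when the buffer is full (improving reliability through more frequent, repeatable transmissions) and lengthens it when energy is scarce, which is the trade-off the paper's dynamic period control addresses.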
Retinal Fundus Image Analysis for Diagnosis of Glaucoma: A Comprehensive Survey
© 2016 IEEE. The rapid development of digital imaging and computer vision has increased the potential for using image processing technologies in ophthalmology. Image processing systems are used in standard clinical practice alongside the development of medical diagnostic systems. Retinal images provide vital information about the health of the sensory part of the visual system. Retinal diseases such as glaucoma, diabetic retinopathy, age-related macular degeneration, Stargardt's disease, and retinopathy of prematurity can lead to blindness and manifest as artifacts in the retinal image. An automated system can offer standardized large-scale screening at a lower cost, which may reduce human error, provide services to remote areas, and remain free from observer bias and fatigue. Treatment for retinal diseases is available; the challenge lies in finding a cost-effective approach with high sensitivity and specificity that can be applied to large populations in a timely manner to identify those who are at risk in the early stages of the disease. The progression of glaucoma is very often silent in the early stages. The number of people affected has been increasing, and patients are seldom aware of the disease, which can delay treatment. A review of how computer-aided approaches may be applied in the diagnosis and staging of glaucoma is presented here. The current status of computer technology is reviewed, covering localization and segmentation of the optic nerve head, pixel-level glaucomatous changes, diagnosis using 3-D data sets, and artificial neural networks for detecting the progression of glaucoma.
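One of the simplest computer-aided glaucoma indicators derived from optic nerve head segmentation is the cup-to-disc ratio. A minimal sketch, assuming the cup and disc have already been segmented and only their pixel areas are passed in (the area-derived equivalent-diameter form shown here is one common convention; the survey covers others):

```python
import math

def cup_to_disc_ratio(cup_area_px, disc_area_px):
    """Cup-to-disc ratio from segmented areas, via the diameters of
    circles with equivalent area (simplifies to sqrt(cup/disc))."""
    if disc_area_px <= 0:
        raise ValueError("disc area must be positive")
    cup_diameter = 2.0 * math.sqrt(cup_area_px / math.pi)
    disc_diameter = 2.0 * math.sqrt(disc_area_px / math.pi)
    return cup_diameter / disc_diameter
```

A ratio above roughly 0.6 is commonly treated as glaucoma-suspect, though clinical thresholds vary and the ratio is only one feature among those the survey reviews.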
Cementitious Permeable Pavement as a Passive Unit Operation and Process for Stormwater Quality and Quantity Control
To address the problems caused by impervious pavements, cementitious permeable pavement (CPP) functions as a passive unit operation and process for stormwater quality and quantity control through infiltration, evaporation, filtration, absorption and reaction mechanisms. CPP pore characteristics were examined through pore connectivity analysis using X-Ray Tomography (XRT). The influence of image resolution on image analysis results was evaluated, as were relationships between parameters of pore characteristics. Factors that significantly influence fluid flow in CPP media include effective porosity, pore connectivity and pore size distribution. A modified Kozeny-Carman model employing effective porosity, specific surface area based on effective pores (SSA)pe, and weighted tortuosity (Le/L)w was developed and demonstrated applicable for CPP hydraulic conductivity estimation. Both the k-total porosity relationship and the k-effective porosity relationship were developed with a power law model. Filtration of CPP subject to different particle loadings for a constant particle size gradation was investigated experimentally. Removal efficiencies for both total particles and for each size fraction were examined. A power law model was developed for the relationship between suspended solid concentration (SSC) and turbidity. CPP clogging potential was evaluated by measuring the temporal hydraulic conductivity, k(t), as well as the particles strained on the CPP surface. Two CPP cleaning methods, vacuuming and sonicating followed by backwashing, were evaluated and found capable of recovering up to 96% of k0. A method for scheduling CPP maintenance was presented. Three groups of CPP specimens were used to evaluate the pH and alkalinity elevation and phosphorus removal functions of CPP. The removal efficiencies of total phosphorus (TP), total dissolved phosphorus (TDP) and total particulate phosphorus (TPP) were evaluated through experimental measurements.
Factors that influence CPP strength and porosity, including water-to-cement ratio (w/c), aggregate-to-cement ratio (a/c), aggregate gradation and the degree of compaction, were evaluated through six mix designs with different design parameters. Based on test results, an optimized mix design was recommended; a CPP structure with fc′ > 25 MPa (3500 psi), fs > 2.76 MPa (400 psi), total porosity > 20%, and permeability k > 0.3 cm/s is desirable.
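The textbook Kozeny-Carman form that the modified model builds on can be sketched as follows (the shape constant `c` and the exact way the thesis weights tortuosity and fits its coefficients are not reproduced here; this is only the generic form, with illustrative parameter names):

```python
def kozeny_carman_k(phi_e, ssa_pe, tortuosity_w, c=5.0):
    """Generic Kozeny-Carman hydraulic conductivity estimate.

    phi_e        -- effective porosity, dimensionless, in (0, 1)
    ssa_pe       -- specific surface area based on effective pores (1/m)
    tortuosity_w -- weighted tortuosity Le/L, dimensionless (>= 1)
    c            -- empirical shape constant (textbook value, not fitted)
    """
    if not 0.0 < phi_e < 1.0:
        raise ValueError("effective porosity must lie in (0, 1)")
    return phi_e ** 3 / (c * tortuosity_w ** 2 * ssa_pe ** 2
                         * (1.0 - phi_e) ** 2)
```

The cubic dependence on effective porosity is why the thesis's k-effective porosity relationship follows a power law: small changes in the connected pore fraction dominate the conductivity estimate.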
Computational Physics: An Introduction to Monte Carlo Simulations of Matrix Field Theory
This book is divided into two parts. In the first part we give an elementary
introduction to computational physics consisting of 21 simulations which
originated from a formal course of lectures and laboratory simulations
delivered since 2010 to physics students at Annaba University. The second part
is much more advanced and deals with the problem of how to set up working Monte
Carlo simulations of matrix field theories which involve finite dimensional
matrix regularizations of noncommutative and fuzzy field theories, fuzzy spaces
and matrix geometry. The study of matrix field theory in its own right has also
become very important to the proper understanding of all noncommutative, fuzzy
and matrix phenomena. The second part, which consists of 9 simulations, was
delivered informally to doctoral students who are working on various problems
in matrix field theory. Sample codes as well as sample key solutions are also
provided for convenience and completeness. An appendix containing an executive
Arabic summary of the first part is added at the end of the book. Comment: 350 pages, v2: slight change in title
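For a flavour of what such a simulation involves, here is a minimal Metropolis sampler for a toy Hermitian one-matrix model with action S = Tr M² + g Tr M⁴, a stand-in chosen for brevity; it is not code from the book, and the step size and coupling are illustrative:

```python
import numpy as np

def action(M, g=1.0):
    """Quartic one-matrix model action S = Tr(M^2) + g Tr(M^4)."""
    M2 = M @ M
    return np.trace(M2).real + g * np.trace(M2 @ M2).real

def metropolis_sweep(M, rng, step=0.1, g=1.0):
    """One Metropolis sweep over the independent entries of Hermitian M:
    propose a Gaussian shift, accept with probability exp(-dS)."""
    n = M.shape[0]
    for i in range(n):
        for j in range(i, n):
            old = action(M, g)
            backup = M.copy()
            if i == j:
                M[i, i] += rng.normal(0.0, step)      # diagonal stays real
            else:
                d = rng.normal(0.0, step) + 1j * rng.normal(0.0, step)
                M[i, j] += d                          # keep M Hermitian
                M[j, i] += d.conjugate()
            if rng.random() >= np.exp(min(0.0, old - action(M, g))):
                M[:] = backup                         # reject: restore state
    return M
```

Recomputing the full action per proposal is wasteful (a production code would use the local change in the trace), but the structure of proposal, accept/reject, and Hermiticity preservation is the same in the book's larger matrix simulations.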