
Automatic landmark extraction from a class of hands using growing neural gas

By Anastassia Angelopoulou, Jose Garcia Rodriguez and Alexandra Psarrou


We present a new method for automatically building statistical shape models from a set of training examples, in particular from a class of hands. In this method, landmark extraction is achieved using a self-organising neural network, the Growing Neural Gas (GNG), which preserves the topology of any input space. Using GNG, the topological relations of a given set of deformable shapes can be learned. We describe how shape models can be built automatically by posing the correspondence problem in terms of the behaviour of self-organising networks, which are capable of adapting their topology to an input manifold and, owing to their dynamic character, of readapting it to the shape of the objects. Results are given for a training set of hand outlines, showing that the proposed method produces accurate models.
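To illustrate the kind of network the abstract refers to, the following is a minimal sketch of the standard Growing Neural Gas algorithm (Fritzke, 1995), not the authors' exact implementation. All parameter names and default values here are illustrative assumptions; node removal on isolation is omitted for brevity.

```python
import numpy as np

def growing_neural_gas(data, max_nodes=50, iters=3000, eps_b=0.05, eps_n=0.006,
                       age_max=50, lam=100, alpha=0.5, d=0.995, seed=0):
    """Minimal Growing Neural Gas sketch: nodes adapt to the input
    distribution; edges between winners capture its topology.
    Parameter values are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    nodes = [rng.random(2), rng.random(2)]   # start with two random units
    errors = [0.0, 0.0]                      # accumulated error per unit
    edges = {}                               # (i, j) with i < j -> edge age

    def key(a, b):
        return (a, b) if a < b else (b, a)

    for t in range(1, iters + 1):
        x = data[rng.integers(len(data))]    # draw a random input sample
        dists = [np.sum((w - x) ** 2) for w in nodes]
        s1, s2 = np.argsort(dists)[:2]       # two nearest units
        errors[s1] += dists[s1]
        # move the winner and its topological neighbours toward x
        nodes[s1] = nodes[s1] + eps_b * (x - nodes[s1])
        for (a, b) in list(edges):
            if s1 in (a, b):
                n = b if a == s1 else a
                nodes[n] = nodes[n] + eps_n * (x - nodes[n])
                edges[(a, b)] += 1           # age edges incident to winner
        edges[key(s1, s2)] = 0               # create/refresh winner-runner edge
        edges = {e: age for e, age in edges.items() if age <= age_max}
        # every lam steps, insert a unit near the highest-error region
        if t % lam == 0 and len(nodes) < max_nodes:
            q = int(np.argmax(errors))
            nbrs = [b if a == q else a for (a, b) in edges if q in (a, b)]
            if nbrs:
                f = max(nbrs, key=lambda n: errors[n])
                r = len(nodes)
                nodes.append(0.5 * (nodes[q] + nodes[f]))
                errors[q] *= alpha
                errors[f] *= alpha
                errors.append(errors[q])
                edges.pop(key(q, f), None)
                edges[key(q, r)] = 0
                edges[key(f, r)] = 0
        errors = [e * d for e in errors]     # global error decay
    return np.array(nodes), list(edges)
```

Run on points sampled from a closed contour (as with the hand outlines described above), the surviving nodes and edges form a graph that traces the shape, which is what makes the network usable as a source of landmark correspondences.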

Topics: UOW3
Publisher: IAPR MVA Conference Committee
Provided by: WestminsterResearch
