
    i3PosNet: Instrument Pose Estimation from X-Ray in temporal bone surgery

    Purpose: Accurate estimation of the position and orientation (pose) of surgical instruments is crucial for delicate minimally invasive temporal bone surgery. Current techniques either lack accuracy, suffer from line-of-sight constraints (conventional tracking systems), or expose the patient to prohibitive ionizing radiation (intra-operative CT). A possible solution is to capture the instrument with a C-arm at irregular intervals and recover the pose from the image. Methods: i3PosNet infers the position and orientation of instruments from images using a pose estimation network. The framework considers localized patches and outputs pseudo-landmarks; the pose is then reconstructed from these pseudo-landmarks by geometric considerations. Results: We show that i3PosNet reaches errors of less than 0.05 mm. It outperforms conventional image-registration-based approaches, reducing average and maximum errors by at least two thirds. i3PosNet trained on synthetic images generalizes to real X-rays without any further adaptation. Conclusion: The translation of deep-learning-based methods to surgical applications is difficult because large representative datasets for training and testing are not available. This work empirically shows sub-millimeter pose estimation trained solely on synthetic training data. Comment: Accepted at International Journal of Computer Assisted Radiology and Surgery, pending publication.
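The abstract does not spell out the exact pseudo-landmark geometry, but the general idea of recovering an in-plane pose from predicted landmarks can be sketched as below. The function name, the two-landmark layout (tip plus a point along the instrument axis), and the `scale_mm_per_px` conversion are illustrative assumptions, not the paper's actual reconstruction:

```python
import numpy as np

def pose_from_pseudo_landmarks(tip, axis_pt, scale_mm_per_px=0.1):
    """Sketch: recover a 2D instrument pose from two predicted pseudo-landmarks.

    tip      -- (x, y) predicted instrument tip, in patch pixel coordinates
    axis_pt  -- (x, y) a second landmark lying along the instrument axis
    Returns the tip position in mm and the in-plane angle in degrees.
    """
    tip = np.asarray(tip, dtype=float)
    axis_pt = np.asarray(axis_pt, dtype=float)
    # Orientation follows from the direction between the two landmarks.
    direction = axis_pt - tip
    angle_deg = np.degrees(np.arctan2(direction[1], direction[0]))
    # Position follows from the tip landmark, scaled to physical units.
    position_mm = tip * scale_mm_per_px
    return position_mm, angle_deg

pos, ang = pose_from_pseudo_landmarks((120.0, 80.0), (150.0, 80.0))
# direction is purely +x here, so the in-plane angle is 0 degrees
```

A full 5-DOF reconstruction would additionally estimate the out-of-plane (projection) angle from further landmarks; this sketch shows only the in-plane part.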

    Marker detection evaluation by phantom and cadaver experiments for C-arm pose estimation pattern

    C-arm fluoroscopy is used for guidance during several clinical exams, e.g. in bronchoscopy to locate the bronchoscope inside the airways. Unfortunately, these images provide only 2D information. However, if the C-arm pose is known, it can be used to overlay the intra-interventional fluoroscopy images with 3D visualizations of the airways acquired from pre-interventional CT images. Thus, the physician's view is enhanced and localization of the instrument at the correct position inside the bronchial tree is facilitated. We present a novel method for C-arm pose estimation introducing a marker-based pattern, which is placed on the patient table. The steel markers form a pattern that allows the C-arm pose to be deduced using the projective invariant cross-ratio. Simulations show that the C-arm pose estimation is reliable and accurate for translations inside an imaging area of 30 cm x 50 cm and rotations up to 30°. Mean error values are 0.33 mm in 3D space and 0.48 px in the 2D imaging plane. First tests on C-arm images resulted in similarly compelling accuracy values and high reliability in an imaging area of 30 cm x 42.5 cm. Even in the presence of interfering structures, tested both with anatomy phantoms and a turkey cadaver, success rates over 90% and fully satisfactory execution times below 4 s for 1024 px x 1024 px images were achieved.
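The cross-ratio mentioned above is the classical projective invariant of four collinear points: it keeps its value under any projective transformation of the line, which is what lets marker identities survive the X-ray projection. A minimal sketch (the coordinate values are illustrative, not the paper's marker layout):

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio of four collinear points given as 1D coordinates
    along their common line: (AC * BD) / (BC * AD)."""
    return ((c - a) * (d - b)) / ((c - b) * (d - a))

# Four collinear points and their image under a projective map of the
# line, x -> (3x + 1) / (x + 2); the cross-ratio is unchanged.
pts = [0.0, 1.0, 3.0, 6.0]
proj = [(3 * x + 1) / (x + 2) for x in pts]

cross_ratio(*pts)   # 1.25
cross_ratio(*proj)  # also 1.25
```

Because the invariant survives projection, a detected quadruple of markers in the fluoroscopy image can be matched to its known counterpart on the table pattern, from which the C-arm pose follows.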