One of the fundamental elements of both traditional and certain deep learning
medical image registration algorithms is measuring the similarity/dissimilarity
between two images. In this work, we propose an analytical solution for
measuring the similarity between images from two different medical imaging
modalities based on the Hessians of their intensities. First, assuming a functional dependence
between the intensities of two perfectly corresponding patches, we investigate
how their Hessians relate to each other. Second, we derive a closed-form
expression to quantify the deviation from this relationship, given arbitrary
pairs of image patches. We propose a geometrical interpretation of the new
similarity metric and an efficient implementation for registration. We
demonstrate the robustness of the metric to intensity nonuniformities using
synthetic bias fields. By integrating the new metric into an affine registration
framework, we evaluate its performance for MRI and ultrasound registration in
the context of image-guided neurosurgery, using target registration error and
computation time.
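The abstract does not spell out the closed-form metric itself, but the underlying relationship it builds on is the chain rule: if two perfectly corresponding patches satisfy an intensity mapping I = f(J), then their Hessians obey H_I = f''(J) ∇J∇Jᵀ + f'(J) H_J. The sketch below is only an illustration of that identity with finite-difference Hessians on a synthetic image; the mapping f and the image J are arbitrary choices, not from the paper.

```python
import numpy as np

def hessian_2d(img):
    """Per-pixel 2x2 Hessian of a 2-D image via central finite differences."""
    gy, gx = np.gradient(img)
    gyy, gyx = np.gradient(gy)   # d2/dy2, d2/(dx dy)
    gxy, gxx = np.gradient(gx)   # d2/(dy dx), d2/dx2
    H = np.empty(img.shape + (2, 2))
    H[..., 0, 0] = gyy
    H[..., 0, 1] = gyx
    H[..., 1, 0] = gxy
    H[..., 1, 1] = gxx
    return H

# Synthetic "modality" J and a smooth, arbitrary intensity mapping f
y, x = np.mgrid[0:64, 0:64] / 64.0
J = np.sin(3 * x) * np.cos(2 * y)
f = lambda t: t**3 + 2.0 * t          # f'(t) = 3t^2 + 2, f''(t) = 6t
I = f(J)

# Chain-rule prediction: H_I = f''(J) * grad(J) grad(J)^T + f'(J) * H_J
gy, gx = np.gradient(J)
g = np.stack([gy, gx], axis=-1)
outer = g[..., :, None] * g[..., None, :]
H_pred = ((6 * J)[..., None, None] * outer
          + (3 * J**2 + 2)[..., None, None] * hessian_2d(J))
H_obs = hessian_2d(I)

# Maximum disagreement away from image borders (edge effects of np.gradient)
err = np.abs(H_obs - H_pred)[4:-4, 4:-4].max()
print(err)  # small: numerical Hessians match the chain-rule prediction
```

A similarity metric in this spirit would then quantify, for a pair of patches, how far the observed Hessians deviate from the best-fitting relationship of this form.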