Hierarchical Fusion Based Deep Learning Framework for Lung Nodule Classification

Abstract

Lung cancer is the leading cause of cancer mortality in both men and women. Computer-aided detection (CAD) and diagnosis systems can play an important role in assisting physicians with cancer treatment. This dissertation proposes a CAD framework that utilizes a hierarchical fusion based deep learning model for detecting nodules in stacks of 2D images. In the proposed hierarchical approach, a decision is made at each level individually, employing the decisions from the previous level. Further, individual decisions are computed for several perspectives of a volume of interest (VOI). This study explores three different approaches to obtaining decisions in a hierarchical fashion. The first model utilizes the raw images. The second model uses a single type of feature image capturing salient content. The last model employs multiple types of feature images. All models learn their parameters through supervised learning. In addition, this dissertation proposes a new Trilateral Filter to extract the salient content of 2D images. This new filter adds a second, anisotropic Laplacian kernel alongside the Bilateral Filter's range kernel. The proposed CAD frameworks are tested on lung CT scans from the LIDC/IDRI database. The experimental results show that the proposed multi-perspective hierarchical fusion approach significantly improves classification performance.
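To illustrate the idea behind the Trilateral Filter, the sketch below extends a bilateral filter with a third kernel computed on Laplacian (second-derivative) responses, so pixels are averaged only when they are close in space, in intensity, and in local curvature. This is an interpretive sketch, not the dissertation's implementation: the parameter names, the Gaussian form of the third kernel, and the use of a simple isotropic 5-point Laplacian (where the dissertation specifies an anisotropic one) are all assumptions.

```python
import numpy as np

def trilateral_filter(img, radius=2, sigma_s=1.5, sigma_r=0.1, sigma_l=0.1):
    """Sketch of a trilateral filter: the bilateral spatial and range
    kernels plus a third kernel on Laplacian responses (assumed form)."""
    # Discrete 5-point (isotropic) Laplacian; a simplification of the
    # anisotropic kernel described in the abstract.
    lap = -4.0 * img
    lap[:-1, :] += img[1:, :]; lap[1:, :] += img[:-1, :]
    lap[:, :-1] += img[:, 1:]; lap[:, 1:] += img[:, :-1]

    pad = radius
    ip = np.pad(img, pad, mode='edge')   # image, edge-padded
    lp = np.pad(lap, pad, mode='edge')   # Laplacian response, edge-padded
    out = np.zeros_like(img, dtype=float)

    # Fixed spatial Gaussian over the (2*radius+1)^2 window
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))

    h, w = img.shape
    for i in range(h):
        for j in range(w):
            patch = ip[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            lpatch = lp[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            # Product of the three kernels: spatial, range, Laplacian
            wgt = (spatial
                   * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
                   * np.exp(-(lpatch - lap[i, j])**2 / (2 * sigma_l**2)))
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out
```

Because the third kernel penalizes differences in curvature as well as intensity, the filter smooths homogeneous regions while leaving sharp structures such as nodule boundaries comparatively untouched, which is the sense in which it extracts "salient content."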
