this paper. To authenticate a particular volume data set, the following steps are performed:

1. Affine transformation parameter recovery: Since one of the benign manipulations may be an affine transformation of the volume, the transformation parameters are computed first.
2. Matching: The content features of the transformed volume are compared with the content features of the original data set (obtained from the digital signature after decryption with the owner's public key). A match value between the original features and the transformed volume's features is computed. If this match value exceeds a certain threshold, the volume is certified as genuine; otherwise it is considered untrustworthy.

We conducted experiments on two volume data sets, SKULL (## # ## # ##) and TOMATO (## # ### # ###) (Figure 2). For key-voxel selection, we applied windowed low-pass filtering five times with window size 9 and threshold 1.5, yielding 25 key voxels for SKULL and 124 for TOMATO. The resulting signature sizes are 8KB and 19KB, respectively.

Five experiments were performed with these two volume data sets. The first three examine signature robustness under global manipulations such as low-pass filtering, sharpening, and lossy compression, while the last two consider local manipulations such as cropping and localized modification. One result, for added Gaussian noise, is shown in Figure 3: even with a noise SNR as low as 14.70 dB, the volume was still authenticated. Other experimental results are also promising.
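The threshold-based matching step can be sketched as follows. This is a minimal illustration, not the paper's actual feature comparison: the feature vectors, the use of normalized correlation as the match value, and the function name `authenticate` are all assumptions made for the example; the paper only specifies that a match value is computed and compared against a threshold.

```python
import numpy as np

def authenticate(original_features: np.ndarray,
                 test_features: np.ndarray,
                 threshold: float) -> bool:
    """Compare content features of the (affine-aligned) test volume against
    the original features recovered from the decrypted digital signature.

    The match value here is a normalized correlation between the two feature
    vectors -- a hypothetical choice; the paper does not specify the metric.
    """
    a = original_features - original_features.mean()
    b = test_features - test_features.mean()
    match_value = float(np.dot(a, b) /
                        (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    # Certify the volume as genuine only if the match exceeds the threshold;
    # otherwise it is considered untrustworthy.
    return match_value > threshold
```

For identical feature vectors the match value approaches 1 and the volume passes; strongly dissimilar features fall below the threshold and are rejected.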