Vector quantization (VQ) techniques are widely used in similarity search for
data compression, fast distance computation, and related tasks. Originally designed for
Euclidean distance, existing VQ techniques (e.g., PQ, AQ) explicitly or
implicitly minimize the quantization error. In this paper, we present a new
angle to analyze the quantization error, which decomposes the quantization
error into norm error and direction error. We show that quantization errors in
norm have much higher influence on inner products than quantization errors in
direction, and small quantization error does not necessarily lead to good
performance in maximum inner product search (MIPS). Based on this observation,
we propose norm-explicit quantization (NEQ) --- a general paradigm that
improves existing VQ techniques for MIPS. NEQ quantizes the norms of items in a
dataset explicitly to reduce errors in norm, which is crucial for MIPS. For the
direction vectors, NEQ can simply reuse an existing VQ technique to quantize
them without modification. We conducted extensive experiments on a variety of
datasets and parameter configurations. The experimental results show that NEQ
improves the performance of various VQ techniques for MIPS, including PQ, OPQ,
RQ, and AQ.
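The norm-direction decomposition described above can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's implementation: the dataset, the number of norm codewords, and the quantile-bin norm quantizer are all assumptions, and the direction vectors are left unquantized here to isolate the effect of norm quantization (in NEQ they would be passed to an existing VQ technique such as PQ).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))  # toy dataset (hypothetical)

# Decompose each item: x = ||x|| * (x / ||x||), i.e. a norm and a unit direction.
norms = np.linalg.norm(X, axis=1)
directions = X / norms[:, None]

# Quantize norms explicitly with a small scalar codebook.
# Quantile binning is a stand-in for whatever norm quantizer NEQ actually uses.
n_bins = 32
edges = np.quantile(norms, np.linspace(0.0, 1.0, n_bins + 1))
codes = np.clip(np.searchsorted(edges, norms) - 1, 0, n_bins - 1)
centers = np.array([norms[codes == b].mean() if np.any(codes == b) else 0.0
                    for b in range(n_bins)])
q_norms = centers[codes]  # reconstructed (quantized) norms

# The direction vectors would be fed, unmodified, to any existing VQ technique
# (PQ, OPQ, RQ, AQ); here we keep them exact to isolate the norm error.
query = rng.normal(size=8)

# Inner products factor as <q, x> = ||x|| * <q, x/||x||>,
# so the approximation only replaces ||x|| with its quantized value.
approx_ip = q_norms * (directions @ query)
exact_ip = X @ query
```

Because the inner product scales linearly with the item's norm, even a coarse 32-word norm codebook keeps the inner-product error small, which is the intuition behind quantizing norms explicitly.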