An inequality concerning Kullback's I-divergence is applied to obtain a necessary condition for the possibility of encoding the symbols of the alphabet of a discrete memoryless source of entropy H by sequences of symbols from another alphabet of size D in such a way that the average code length is close to the optimum H/log D. The same idea is also applied to the problem of maximizing the entropy per second when the symbols have unequal lengths.
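As an illustrative sketch (not part of the abstract itself): the optimum H/log D referred to above is the lower bound of the noiseless coding theorem, and Huffman coding attains it exactly for a dyadic distribution. A minimal Python example for D = 2, using a hypothetical small source distribution:

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given distribution."""
    # Heap entries: (probability, tie-breaking counter, merged symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol inside them.
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

# Hypothetical dyadic source distribution (all probabilities powers of 1/2).
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))

# For D = 2, log D = 1 bit, so the optimum is H itself; a dyadic
# distribution lets the Huffman code meet it with equality.
print(f"H = {H:.3f} bits, average code length = {avg:.3f} bits")
```

For non-dyadic distributions the average length exceeds H/log D, but any uniquely decodable code remains bounded below by it; the abstract's necessary condition concerns how closely this optimum can be approached.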