Toward infinite-scale 3D city synthesis, we propose a novel framework,
InfiniCity, which constructs and renders an arbitrarily large,
3D-grounded environment from random noise. InfiniCity decomposes the seemingly
impractical task into three feasible modules, taking advantage of both 2D and
3D data. First, an infinite-pixel image synthesis module generates
arbitrary-scale 2D maps from the bird's-eye view. Next, an octree-based voxel
completion module lifts the generated 2D map to 3D octrees. Finally, a
voxel-based neural rendering module texturizes the voxels and renders 2D
images. InfiniCity can thus synthesize arbitrary-scale and traversable 3D city
environments, and allows flexible and interactive editing by users. We
quantitatively and qualitatively demonstrate the efficacy of the proposed
framework. Project page: https://hubert0527.github.io/infinicity