Remote sensing hyperspectral images (HSI) are quite often locally low rank, in the sense that the spectral vectors acquired from a given spatial neighborhood belong to a low-dimensional subspace/manifold. This property has recently been exploited for the fusion of low spatial resolution HSI with high spatial resolution multispectral images (MSI) in order to obtain super-resolution HSI. Most approaches adopt an unmixing or a matrix factorization perspective. The derived methods have led to state-of-the-art results when the spectral information lies in a low-dimensional subspace/manifold. However, if the subspace/manifold dimensionality spanned by the complete data set is large, the performance of these methods decreases, mainly because the underlying sparse regression is severely ill-posed. In this paper, we propose a local approach to cope with this difficulty. Fundamentally, we exploit the fact that real-world HSI are locally low rank to partition the image into patches and solve the data fusion problem independently for each patch. In this way, the subspace/manifold dimensionality within each patch is low enough to obtain useful super-resolution. We explore two alternatives to define the local regions: sliding windows and binary partition trees. The effectiveness of the proposed approach is illustrated with synthetic and semi-real data.
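The patch-wise, locally low-rank fusion idea described above can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's actual algorithm: the subspace dimension `p`, the spectral response matrix `R`, the SVD-based subspace estimate, and the plain least-squares coefficient recovery are all assumptions made for the example.

```python
import numpy as np

def fuse_patch(hsi_lr, msi_hr, R, p=3):
    """Fuse one spatial patch under a locally low-rank assumption.

    hsi_lr : (bands, n_lr)      low-resolution HSI pixels of the patch
    msi_hr : (msi_bands, n_hr)  high-resolution MSI pixels of the patch
    R      : (msi_bands, bands) assumed spectral response of the MSI sensor
    p      : assumed local subspace dimension
    Returns a (bands, n_hr) super-resolved HSI patch.
    """
    # Estimate the local spectral subspace from the low-res HSI pixels.
    U, _, _ = np.linalg.svd(hsi_lr, full_matrices=False)
    E = U[:, :p]                           # local spectral basis (bands x p)
    # Recover per-pixel coefficients from the MSI: msi_hr ~= (R @ E) @ Z.
    Z = np.linalg.pinv(R @ E) @ msi_hr
    return E @ Z                           # reconstructed high-res spectra

# Synthetic sanity check with exactly low-rank spectra (idealized data).
rng = np.random.default_rng(0)
bands, msi_bands, p, n_hr = 50, 6, 3, 16
E_true = rng.standard_normal((bands, p))
Z_true = rng.standard_normal((p, n_hr))
X_true = E_true @ Z_true                   # ground-truth high-res patch
R = rng.standard_normal((msi_bands, bands))
msi_hr = R @ X_true                        # simulated MSI observation
hsi_lr = X_true[:, ::4]                    # spatially subsampled HSI
X_hat = fuse_patch(hsi_lr, msi_hr, R, p=p)
err = np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true)
```

On this idealized noiseless data the per-patch reconstruction is essentially exact, because within the patch the spectra lie in a subspace of dimension small enough for the coefficient regression to be well-posed; this is precisely the ill-posedness that a global subspace of large dimension would reintroduce.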