Let F and G be linear recurrences over a number field K, and
let R be a finitely generated subring of K.
Furthermore, let N be the set of positive integers n such that
G(n)≠0 and F(n)/G(n)∈R. Under mild hypotheses,
Corvaja and Zannier proved that N has zero asymptotic density. We
prove that #(N∩[1,x]) ≪ x·(log log x / log x)^h
for all x≥3, where h is a positive integer that can be computed in
terms of F and G. Assuming the Hardy-Littlewood k-tuple conjecture, our
result is optimal except for the factor log log x.
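
For concreteness, one illustrative instance (our choice of data, not taken from the statement above) is K = Q, R = Z, F(n) = 3^n−1, G(n) = 2^n−1, so that N collects the n for which (3^n−1)/(2^n−1) is an integer; in that notation the bound reads as sketched below.

% Illustrative sketch only: the recurrences F(n) = 3^n - 1 and G(n) = 2^n - 1
% and the choices K = Q, R = Z are assumed for illustration, not given in the abstract.
\[
  \mathcal{N} = \{\, n \ge 1 : 2^n - 1 \ne 0,\ \tfrac{3^n - 1}{2^n - 1} \in \mathbb{Z} \,\},
  \qquad
  \#\bigl(\mathcal{N} \cap [1,x]\bigr) \ll x \left(\frac{\log\log x}{\log x}\right)^{h}
  \quad (x \ge 3).
\]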