Some prominent discretisation methods such as finite elements provide a way to approximate a function of $d$ variables from the $n$ values it takes at the nodes $x_i$ of the corresponding mesh. The accuracy is $n^{-s_a/d}$ in $L^2$-norm, where $s_a$ is the order of the underlying method. When the data are measured or computed with systematic experimental noise, some statistical regularisation may be desirable, using a smoothing method of order $s_r$ (such as the number of vanishing moments of a kernel). This idea underlies the use of regularised discretisation methods, whose approximation properties are the subject of this paper. We decipher the interplay of $s_a$ and $s_r$ for reconstructing a smooth function on a regular bounded domain from $n$ measurements with noise of order $\sigma$. We establish that in certain regimes where the noise $\sigma$ is small relative to $n$ and $s_a>s_r$, statistical smoothing is not necessarily the best option, and {\it not regularising} is more beneficial than {\it statistically regularising}. We precisely quantify this phenomenon and show that the gain can reach a multiplicative order of $n^{(s_a-s_r)/(2s_r+d)}$. We illustrate our estimates by numerical experiments conducted in dimension $d=1$ with $P_1$ and $P_2$ finite elements.
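The noiseless rate $n^{-s_a/d}$ can be checked numerically in the simplest setting of the abstract: $d=1$ and $P_1$ (piecewise-linear) interpolation, for which $s_a=2$. The sketch below is illustrative only and is not the paper's experiment; the test function and node counts are arbitrary choices.

```python
import numpy as np

def p1_interpolation_error(f, n):
    """Estimate the L2([0,1]) error of the P1 (piecewise-linear)
    interpolant of f built on n equally spaced nodes, using a fine
    uniform grid as quadrature points."""
    nodes = np.linspace(0.0, 1.0, n)
    fine = np.linspace(0.0, 1.0, 100 * n)
    interp = np.interp(fine, nodes, f(nodes))  # P1 interpolant of f
    # Discrete L2 norm of the error on the uniform fine grid
    return np.sqrt(np.mean((f(fine) - interp) ** 2))

f = lambda x: np.sin(2.0 * np.pi * x)  # a smooth test function
errors = [p1_interpolation_error(f, n) for n in (16, 32, 64, 128)]
# Observed convergence orders between successive node counts;
# for P1 in d = 1 they should approach s_a = 2.
rates = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
print(rates)
```

Doubling $n$ should roughly quarter the error, consistent with the $n^{-s_a/d}=n^{-2}$ rate; with noisy node values this clean decay saturates at the noise level $\sigma$, which is where the smoothing-versus-no-smoothing trade-off studied in the paper enters.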