Hyperdiffusion is used in atmospheric general circulation models (GCMs) to account for turbulent dissipation at subgrid scales, and its intensity affects the efficiency of poleward heat transport by the atmospheric circulation. We perform sensitivity simulations with a dynamical-core GCM to investigate the effects of different hyperdiffusion intensities and model resolutions on the simulated equator-pole temperature gradient. Using entropy production as a measure of baroclinic activity, we show that entropy production exhibits a maximum as a function of hyperdiffusion intensity. Compared with the climate at the state of maximum entropy production, every other simulated climate at a given resolution exhibits an increased equator-pole temperature gradient. We then demonstrate that maximum entropy production can be used to tune low-resolution models to closely resemble the simulated climate of a high-resolution simulation. We conclude that tuning a GCM to a state of maximum entropy production is an efficient way to make low-resolution climate system models adequately simulate the equator-pole temperature gradient.
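The existence of a maximum in entropy production can be illustrated with a minimal two-box (equator/pole) energy-balance sketch, in the spirit of classic maximum-entropy-production toy models. All parameter values below (linearized outgoing-longwave coefficients, absorbed solar fluxes) are illustrative assumptions, not values from the simulations described here; the transport intensity `H` stands in for the diffusivity being tuned.

```python
import numpy as np

# Illustrative parameters (assumed, Budyko-style linearized OLR: A + B*T)
A, B = 203.3, 2.09            # [W m^-2], [W m^-2 K^-1], T in deg C
F_eq, F_pole = 330.0, 170.0   # absorbed solar flux in each box [W m^-2]

def box_temperatures(H):
    """Steady-state box temperatures [K] for poleward heat flux H [W m^-2]."""
    T_eq = (F_eq - H - A) / B + 273.15
    T_pole = (F_pole + H - A) / B + 273.15
    return T_eq, T_pole

def entropy_production(H):
    """Entropy production of the meridional transport [W m^-2 K^-1]."""
    T_eq, T_pole = box_temperatures(H)
    return H * (1.0 / T_pole - 1.0 / T_eq)

# Sweep the transport intensity (a stand-in for hyperdiffusion strength):
# sigma vanishes at H = 0 (no transport) and again when the boxes
# equilibrate, so a maximum must lie in between.
H_grid = np.linspace(0.0, 80.0, 801)
sigma = entropy_production(H_grid)
H_mep = H_grid[np.argmax(sigma)]

T_eq, T_pole = box_temperatures(H_mep)
print(f"MEP transport: {H_mep:.1f} W/m^2, "
      f"equator-pole gradient: {T_eq - T_pole:.1f} K")
```

The sweep over `H` mirrors the tuning procedure in spirit: the equator-pole temperature gradient decreases monotonically with transport, while entropy production peaks at an intermediate intensity, which is what makes the maximum usable as a tuning target.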