We present Erato, a framework designed to facilitate the automated evaluation
of poetry, including the output of poetry generation systems. Erato evaluates
poems along a diverse set of features, and we offer a brief overview of its
capabilities and its potential for extension. Using Erato, we compare and
contrast human-authored poetry with automatically generated poetry,
demonstrating its effectiveness in identifying key differences. Our
implementation code and software are freely available under the GNU GPLv3
license.