Recent developments in mmWave technology allow the detection and classification of dynamic arm gestures. However, achieving high accuracy and generalization requires a large number of samples for training a machine learning model. Furthermore, capturing the variability within a gesture class requires the participation of many subjects and the execution of many gestures at different arm speeds. In the case of macro-gestures, the position of the subject within the field of view of the device must also vary. This entails a significant amount of time and effort, which must be repeated whenever the sensor hardware or the modulation parameters are modified. To reduce the required manual effort, we developed a synthetic data generator capable of simulating seven arm gestures using Blender, an open-source 3D creation suite. We used it to generate 600 artificial samples with varying execution speed and relative position of the simulated subject, and trained a machine learning model on them. We tested the model on a real dataset recorded from ten subjects with an experimental sensor. The test set yielded 84.2% accuracy, indicating that synthetic data generation can contribute significantly to the pre-training of a model.