Federated Learning (FL) has recently gained traction as a way to train
machine learning models on distributed datasets. FL was designed for
parametric models, namely Deep Neural Networks (DNNs), and has thus shown
promise on image and text tasks. However, FL for tabular data has received
little attention. Tree-Based Models (TBMs) are widely considered to perform
better on tabular data, and they are starting to see FL integrations. In this
study, we benchmark federated TBMs and DNNs for horizontal FL, with varying
data partitions, on 10 well-known tabular datasets. Our novel benchmark results
indicate that current federated boosted TBMs perform better than federated
DNNs in different data partitions. Furthermore, a federated XGBoost outperforms
all other models. Lastly, we find that federated TBMs perform better than
federated parametric models, even when the number of clients is increased
significantly.

Comment: 8 pages, 6 figures, 6 tables, FMEC 2023 (best paper