Improving BERT Pretraining with Syntactic Supervision

Published on arXiv, 2021

Recommended citation: Tziafas, G., Kogkalidis, K., Wijnholds, G., & Moortgat, M. (2021). "Improving BERT Pretraining with Syntactic Supervision." arXiv preprint. [paper]

We train a BERT model from scratch for Dutch, incorporating a supertagging objective during pretraining to induce a syntactic bias. Initial experiments suggest equal or improved performance on a number of downstream tasks, despite pretraining on a small amount of data.
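
To make the idea concrete, here is a minimal sketch of what a joint MLM-plus-supertagging objective could look like, assuming a standard BERT-style encoder and an auxiliary token-level supertag classifier. All names (`BertWithSupertagging`, `alpha`, the encoder interface) are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class BertWithSupertagging(nn.Module):
    """Hypothetical wrapper adding a supertagging head to a BERT-style encoder."""

    def __init__(self, encoder, hidden_size, mlm_vocab_size,
                 supertag_vocab_size, alpha=0.5):
        super().__init__()
        self.encoder = encoder  # any encoder: (ids, mask) -> (batch, seq, hidden)
        self.mlm_head = nn.Linear(hidden_size, mlm_vocab_size)
        self.supertag_head = nn.Linear(hidden_size, supertag_vocab_size)
        self.alpha = alpha  # illustrative weight balancing the two losses
        # -100 marks positions without a label (unmasked tokens / padding)
        self.loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, input_ids, attention_mask, mlm_labels, supertag_labels):
        hidden = self.encoder(input_ids, attention_mask)
        mlm_logits = self.mlm_head(hidden)
        tag_logits = self.supertag_head(hidden)
        # Standard masked-language-modelling loss over masked positions
        mlm_loss = self.loss_fn(
            mlm_logits.view(-1, mlm_logits.size(-1)), mlm_labels.view(-1))
        # Token-level supertag classification, injecting the syntactic signal
        tag_loss = self.loss_fn(
            tag_logits.view(-1, tag_logits.size(-1)), supertag_labels.view(-1))
        return mlm_loss + self.alpha * tag_loss
```

In this sketch the syntactic supervision enters simply as an additional cross-entropy term on per-token supertag labels; how the two losses are actually combined and scheduled in the paper may differ.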