Rangachari Anand, Kishan Mehrotra, et al.
IEEE Transactions on Neural Networks
We introduce SPUD (Semantically Perturbed Universal Dependencies), a framework for creating nonce treebanks for the multilingual Universal Dependencies (UD) corpora. SPUD data satisfies syntactic argument structure, provides syntactic annotations, and ensures grammaticality via language-specific rules. We create nonce data in Arabic, English, French, German, and Russian, and demonstrate two use cases of SPUD treebanks. First, we investigate the effect of nonce data on word co-occurrence statistics, as measured by perplexity scores of autoregressive (ALM) and masked language models (MLM). We find that ALM scores are significantly more affected by nonce data than MLM scores. Second, we show how nonce data affects the performance of syntactic dependency probes. We replicate the findings of Müller-Eberstein et al. (2022) on nonce test data and show that performance declines for both MLMs and ALMs with respect to the original test data. However, most of the performance is retained, suggesting that the probe indeed learns syntax independently of semantics. Code at https://github.com/davidarps/spud
Marcel Nawrath, Agnieszka Wiktoria Nowak, et al.
NAACL 2024
Prince Kumar, Srikanth Tamilselvam, et al.
NAACL 2024
Dzung Phan, Vinicius Lima
INFORMS 2023