Scaling Constructional Language Processing: Techniques for Managing Large Search Spaces

Constructionist approaches to language use form-meaning pairings, called constructions, to capture all the linguistic knowledge necessary for comprehending and producing natural language expressions. Language processing then consists in combining the constructions of a grammar in such a way that they solve a given comprehension or production problem. Finding an adequate sequence of constructions constitutes a search problem that is combinatorial in nature and becomes intractable as grammars grow in size. In this micro project, we explore a neural methodology for learning heuristics that substantially optimise the search processes involved in constructional language processing.
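To make the idea concrete, the search described above can be sketched as a best-first search over partial analyses, where a learned scoring function decides which construction application to expand next. The sketch below is purely illustrative and assumes a toy grammar and a hand-written stand-in for the neural heuristic; it is not the authors' actual implementation.

```python
import heapq
from itertools import count

def best_first_search(initial_state, constructions, score, is_solution,
                      max_expansions=1000):
    """Expand the most promising partial analysis first.

    `score(state, cxn)` stands in for the learned neural heuristic:
    higher scores mean the construction application is judged more
    likely to lead to a solution, so fewer nodes are expanded.
    """
    tiebreak = count()  # ensures states themselves are never compared
    frontier = [(0.0, next(tiebreak), initial_state, [])]
    expansions = 0
    while frontier and expansions < max_expansions:
        _, _, state, path = heapq.heappop(frontier)
        if is_solution(state):
            return path
        expansions += 1
        for cxn in constructions:
            new_state = cxn["apply"](state)
            if new_state is None:  # construction not applicable here
                continue
            s = score(new_state, cxn)
            heapq.heappush(frontier,
                           (-s, next(tiebreak), new_state, path + [cxn["name"]]))
    return None

# Toy example: "constructions" each contribute one token towards a
# target phrase; the stand-in heuristic counts matching positions.
target = ("the", "red", "cube")
constructions = [
    {"name": f"add-{tok}",
     "apply": (lambda t: (lambda st: st + (t,) if len(st) < 3 else None))(tok)}
    for tok in ("the", "red", "cube", "sphere")
]
score = lambda state, cxn: sum(a == b for a, b in zip(state, target))
path = best_first_search((), constructions, score, lambda st: st == target)
# path is the sequence of construction applications that builds the target
```

With a good heuristic, the frontier stays small because unpromising construction applications are never expanded; with a poor one, the same procedure degenerates towards exhaustive search.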


We have validated the methodology in a case study on the CLEVR benchmark dataset, showing that our novel methodology outperforms state-of-the-art techniques in terms of both search space size and computation time, most markedly in the production direction. These results constitute a crucial contribution towards the development of scalable constructional language processing systems, thereby overcoming the major efficiency obstacle that hinders current efforts to learn large-scale construction grammars.


Van Eecke, P., Nevens, J., & Beuls, K. (Submitted). Neural Heuristics for Scaling Constructional Language Processing.

Paul Van Eecke, Jens Nevens and Katrien Beuls (VUB-EHAI)



This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 951846