2024
Contribution to conference (Restricted)

Distilled neural networks for efficient learning to rank (Extended Abstract)

Nardini F. M., Rulli C., Trani S., Venturini R.

Distillation, Efficiency, Neural Networks, Pruning, Learning to Rank, Matrix Multiplication

Recent studies in Learning to Rank (LtR) have shown that a neural network can be effectively distilled from an ensemble of regression trees. This fully enables the use of neural-based ranking models in the query processors of modern Web search engines. Nevertheless, ensembles of regression trees still outperform neural models in terms of both efficiency and effectiveness on CPU. In this paper, we propose a framework to design and train neural networks that outperform ensembles of regression trees. After distilling the networks from tree-based models, we exploit an efficiency-oriented pruning technique that sparsifies the most computationally intensive layers of the model. Moreover, we develop inference-time predictors that help devise neural network architectures matching desired efficiency requirements. Comprehensive experiments on two public learning-to-rank datasets show that the neural networks produced with our novel approach are competitive with tree-based ensembles in terms of the effectiveness-efficiency trade-off, providing up to a 4x inference-time speed-up without degrading ranking quality.
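The abstract outlines two steps: distilling the scores of a tree ensemble into a compact network, then sparsifying the network's heaviest layers for efficiency. The following is a minimal, hypothetical sketch of that pipeline using scikit-learn and PyTorch; the synthetic data, the architecture, and the plain L1 magnitude pruning are placeholders standing in for the paper's actual datasets, models, and pruning technique (the inference-time predictors are not shown).

# Illustrative sketch only, not the authors' implementation.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 136)).astype(np.float32)  # placeholder feature matrix
y = rng.normal(size=5000).astype(np.float32)         # placeholder relevance labels

# Teacher: an ensemble of regression trees.
teacher = GradientBoostingRegressor(n_estimators=100).fit(X, y)
soft_targets = teacher.predict(X).astype(np.float32)  # scores to distill

# Student: a small feed-forward scorer.
student = nn.Sequential(
    nn.Linear(136, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 1),
)

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
Xt = torch.from_numpy(X)
Tt = torch.from_numpy(soft_targets).unsqueeze(1)

# Distillation: regress the student onto the teacher's scores.
for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(student(Xt), Tt)
    loss.backward()
    opt.step()

# Efficiency-oriented step: sparsify the widest (most expensive) layer.
# Plain magnitude pruning stands in for the paper's pruning technique.
prune.l1_unstructured(student[2], name="weight", amount=0.9)
prune.remove(student[2], "weight")  # make the sparsity permanent

In this sketch the sparsified layer still runs as a dense matrix multiplication; realizing the speed-up the paper reports would additionally require a sparse inference kernel.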

Source: Proceedings of the IEEE International Conference on Data Engineering (ICDE), pp. 5693-5694. Utrecht, Netherlands, 13-16 May 2024

Publisher: IEEE Computer Society


BibTeX entry
@inproceedings{oai:iris.cnr.it:20.500.14243/525362,
	title = {Distilled neural networks for efficient learning to rank (Extended Abstract)},
	author = {Nardini F. M. and Rulli C. and Trani S. and Venturini R.},
	publisher = {IEEE Computer Society},
	doi = {10.1109/icde60146.2024.00478},
	booktitle = {Proceedings of the IEEE International Conference on Data Engineering (ICDE)},
	pages = {5693--5694},
	address = {Utrecht, Netherlands},
	year = {2024}
}