Paper Title

SERank: Optimize Sequencewise Learning to Rank Using Squeeze-and-Excitation Network

Paper Authors

RuiXing Wang, Kuan Fang, RiKang Zhou, Zhan Shen, LiWen Fan

Paper Abstract

Learning-to-rank (LTR) is a family of supervised machine learning algorithms that aim to generate an optimal ranking order over a list of items. Many ranking models have been studied during the past decades, and most of them treat each query-document pair independently during training and inference. Recently, a few methods have been proposed that focus on mining information across the list of ranking candidates for further improvement, such as learning a multivariate scoring function or learning contextual embeddings. However, these methods usually increase the computational cost of online inference considerably, especially when the candidate set is large, as it is in real-world web search systems. Moreover, few studies focus on novel model structures for leveraging information across ranking candidates. In this work, we propose an effective and efficient method named SERank, a Sequencewise ranking model that uses a Squeeze-and-Excitation network to take advantage of cross-document information. We evaluate the proposed method on several public benchmark datasets, as well as on click logs collected from Zhihu, a commercial question-answering search engine. In addition, we conduct online A/B testing on the Zhihu search engine to further validate the approach. Results on both the offline datasets and the online A/B test show that our method yields a significant improvement.
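The abstract does not spell out SERank's exact architecture, but the core idea it names, a Squeeze-and-Excitation network applied across the candidate list, can be sketched. Below is a minimal, hypothetical PyTorch illustration (the class name SEBlockForRanking and parameters feature_dim and reduction are our own, not from the paper): the squeeze step pools each feature over all candidates of a query, and the excitation step produces per-feature gates shared by every candidate, injecting cross-document information at low inference cost:

import torch
import torch.nn as nn

class SEBlockForRanking(nn.Module):
    """Hypothetical Squeeze-and-Excitation block over a ranking list.

    Input x has shape (batch, list_size, feature_dim): one feature
    vector per candidate document of a query. This is a sketch of the
    general SE mechanism adapted to rank lists, not the paper's code.
    """

    def __init__(self, feature_dim: int, reduction: int = 4):
        super().__init__()
        # Bottleneck MLP, as in standard SE blocks.
        self.fc1 = nn.Linear(feature_dim, feature_dim // reduction)
        self.fc2 = nn.Linear(feature_dim // reduction, feature_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squeeze: summarize each feature across all candidates in the list.
        pooled = x.mean(dim=1)                      # (batch, feature_dim)
        # Excitation: the bottleneck MLP yields one gate per feature.
        gate = torch.sigmoid(self.fc2(torch.relu(self.fc1(pooled))))
        # Recalibrate every candidate's features with the shared gates.
        return x * gate.unsqueeze(1)                # broadcast over list dim

if __name__ == "__main__":
    se = SEBlockForRanking(feature_dim=32)
    docs = torch.randn(2, 50, 32)  # 2 queries, 50 candidates, 32 features
    print(se(docs).shape)          # torch.Size([2, 50, 32])

Because the gates are computed once per query and merely rescale existing features, such a block adds cross-document context without the per-pair attention cost that the abstract says makes other listwise methods expensive at inference time.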
