Paper Title

Leaving No One Behind: A Multi-Scenario Multi-Task Meta Learning Approach for Advertiser Modeling

Authors

Qianqian Zhang, Xinru Liao, Quan Liu, Jian Xu, Bo Zheng

Abstract


Advertisers play an essential role in many e-commerce platforms like Taobao and Amazon. Fulfilling their marketing needs and supporting their business growth is critical to the long-term prosperity of platform economies. However, compared with extensive studies on user modeling such as click-through rate prediction, far less attention has been paid to advertisers, especially in terms of understanding their diverse demands and performance. Different from user modeling, advertiser modeling generally involves many kinds of tasks (e.g., predictions of advertisers' expenditure, active-rate, or total impressions of promoted products). In addition, major e-commerce platforms often provide multiple marketing scenarios (e.g., Sponsored Search, Display Ads, Live Streaming Ads), while advertisers' behavior tends to be dispersed among many of them. This raises the necessity of multi-task and multi-scenario consideration in comprehensive advertiser modeling, which faces the following challenges: First, one model per scenario or per task simply does not scale; Second, it is particularly hard to model new or minor scenarios with limited data samples; Third, inter-scenario correlations are complicated and may vary given different tasks. To tackle these challenges, we propose a multi-scenario multi-task meta learning approach (M2M) which simultaneously predicts multiple tasks in multiple advertising scenarios.
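To make the abstract's core idea concrete, below is a minimal NumPy sketch of the general pattern the abstract describes: a shared backbone produces one representation, a learned scenario embedding serves as "meta knowledge", and a meta unit (here, a simple hypernetwork) generates scenario-specific weights for each task head. All dimensions, names, and the hypernetwork form are illustrative assumptions, not the paper's actual M2M architecture; the sketch shows only an untrained forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SCENARIOS, N_TASKS = 3, 2   # assumed toy sizes, not from the paper
D_IN, D_EMB, D_HID = 8, 4, 8

# Shared backbone: one set of weights used by every scenario and task.
W_shared = rng.normal(size=(D_IN, D_HID))

# Scenario embeddings act as the "meta knowledge" describing each scenario.
scenario_emb = rng.normal(size=(N_SCENARIOS, D_EMB))

# Meta unit (hypothetical hypernetwork): maps a scenario embedding to the
# weights of each task head, so minor scenarios reuse shared parameters.
W_meta = rng.normal(size=(N_TASKS, D_EMB, D_HID))

def relu(x):
    return np.maximum(x, 0.0)

def predict(x, scenario_id):
    """Predict all tasks for feature vector x under one scenario."""
    h = relu(x @ W_shared)              # shared representation
    e = scenario_emb[scenario_id]       # scenario meta knowledge
    preds = []
    for t in range(N_TASKS):
        # Task-t head weights are *generated* from the scenario embedding,
        # rather than stored separately per (scenario, task) pair.
        w_head = W_meta[t].T @ e        # shape (D_HID,)
        preds.append(float(h @ w_head))
    return preds

x = rng.normal(size=D_IN)
print(predict(x, scenario_id=0))  # one prediction per task
```

The key design point this illustrates is parameter generation: instead of `N_SCENARIOS * N_TASKS` independent models, the scenario embedding conditions a small generator, so a new or data-poor scenario only needs its embedding learned while the backbone and meta unit stay shared.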
