1. Abstract:

The paper proposes an Attentional FM (AFM): an attention network combined with a factorization machine, which learns a weight for each feature interaction via attention. Clearly, not all second-order feature interactions are equally important, and how to make the model learn these importances automatically from data is the central problem this paper solves.

As an example, the authors consider the sentence "US continues taking a leading role on foreign payment transparency". Apart from "foreign payment transparency", the rest of the sentence is clearly unrelated to finance news, so feature interactions involving those other words can be regarded as noise for topic prediction.

2. FM (Factorization Machines)

[Figure: the FM prediction formula]
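Since the original image is lost, here is the standard second-order FM predictor as stated in the AFM paper, where \mathbf{v}_i \in \mathbb{R}^k is the embedding of feature i:

\hat{y}_{FM}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j

Note that the interaction weight for every pair (i, j) is fixed to the inner product of their embeddings; FM has no way to downweight unimportant pairs, which is exactly what AFM adds.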

3. Attention mechanism

AFM model architecture:

[Figures: the AFM model architecture and its defining equations]
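The original images here cannot be recovered; the equations below are reconstructed from the AFM paper (Xiao et al., IJCAI 2017). After embedding the non-zero features, the pair-wise interaction layer forms one element-wise product per feature pair:

f_{PI}(\mathcal{E}) = \{ (\mathbf{v}_i \odot \mathbf{v}_j) \, x_i x_j \}_{(i,j) \in \mathcal{R}_x}

Attention-based pooling then scores each interaction with a one-layer MLP (attention factor t, with \mathbf{W} \in \mathbb{R}^{t \times k}, \mathbf{b}, \mathbf{h} \in \mathbb{R}^t) and normalizes the scores with a softmax:

a'_{ij} = \mathbf{h}^{\top} \, \mathrm{ReLU}\big( \mathbf{W} (\mathbf{v}_i \odot \mathbf{v}_j) x_i x_j + \mathbf{b} \big)

a_{ij} = \frac{\exp(a'_{ij})}{\sum_{(i,j) \in \mathcal{R}_x} \exp(a'_{ij})}

The full AFM predictor replaces FM's uniform sum of interactions with this attention-weighted sum, projected by \mathbf{p} \in \mathbb{R}^k:

\hat{y}_{AFM}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \mathbf{p}^{\top} \sum_{i=1}^{n} \sum_{j=i+1}^{n} a_{ij} \, (\mathbf{v}_i \odot \mathbf{v}_j) \, x_i x_j

Fixing \mathbf{p} to the all-ones vector and every a_{ij} to 1 recovers plain FM, which makes the relationship between the two models explicit.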

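To make the forward pass concrete, below is a minimal NumPy sketch of AFM scoring a single sample. It is not the authors' implementation: the function and parameter names (afm_forward, W_att, b_att, and so on) and the toy dimensions are illustrative, and training-time details from the paper (dropout on the interaction layer, L2 regularization on the attention network) are omitted.

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def afm_forward(x, w0, w, V, W_att, b_att, h, p):
    """Score one sample with AFM (illustrative sketch, not the paper's code).

    x     : (n,)   raw feature vector (sparse in practice)
    w0    : ()     global bias
    w     : (n,)   first-order weights
    V     : (n, k) feature embeddings
    W_att : (t, k) attention-network weight
    b_att : (t,)   attention-network bias
    h     : (t,)   attention projection vector
    p     : (k,)   output projection of the pooled interactions
    """
    nz = np.nonzero(x)[0]                      # only non-zero features interact
    # pair-wise interaction layer: (v_i * v_j) x_i x_j for all i < j
    pairs = [(V[i] * V[j]) * (x[i] * x[j])
             for ii, i in enumerate(nz) for j in nz[ii + 1:]]
    E = np.stack(pairs)                        # (num_pairs, k)
    # attention-based pooling: one-layer MLP score, softmax-normalized
    scores = relu(E @ W_att.T + b_att) @ h     # (num_pairs,)
    a = softmax(scores)                        # attention weights, sum to 1
    pooled = a @ E                             # (k,) weighted sum of interactions
    return w0 + w @ x + p @ pooled

# toy usage with random parameters
rng = np.random.default_rng(0)
n, k, t = 6, 4, 8
x = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
y = afm_forward(x, 0.1, rng.normal(size=n), rng.normal(size=(n, k)),
                rng.normal(size=(t, k)), rng.normal(size=t),
                rng.normal(size=t), rng.normal(size=k))
print(y)

With the three non-zero features above, the sketch forms three pair-wise interactions, weights them by attention, and adds the bias and first-order terms, mirroring the \hat{y}_{AFM} equation line by line.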