Article

Gate Attentional Factorization Machines: An Efficient Neural Network Considering Both Accuracy and Speed

Journal

APPLIED SCIENCES-BASEL
Volume 11, Issue 20, Pages -

Publisher

MDPI
DOI: 10.3390/app11209546

Keywords

gate; speed; accuracy; attentional factorization machines; controllable

Funding

  1. National Natural Science Foundation of China [61971268]
  2. MOE (Ministry of Education in China) Project of Humanities and Social Sciences [19YJC760049]


As recommendation system models grow more complex, balancing accuracy and speed has become an urgent issue. Deep neural networks improve accuracy, while the Gate Attention Factorization Machine model performs well on both speed and accuracy.
Today, to handle the growing volume of user and item data and to better mine the latent relationships within it, the models used by recommendation systems have become increasingly complex. In this situation, ensuring both the prediction accuracy and the running speed of a recommendation system has become an urgent problem. Deep neural networks address the accuracy problem well: more network layers and more advanced feature-crossing schemes improve how fully the data are exploited. However, once accuracy is guaranteed, little attention is paid to speed; effort tends to go into better machine efficiency rather than the speed of the model itself. Conversely, models with a speed advantage, such as PNN, are slightly inferior in accuracy. In this paper, the Gate Attention Factorization Machine (GAFM) model, designed around the two factors of accuracy and speed, is proposed; a gate structure is used to control the trade-off between speed and accuracy. Extensive experiments on datasets from various application scenarios show that GAFM outperforms existing factorization machines in both speed and accuracy.
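To make the idea concrete, the following is a minimal sketch of how an attentional factorization machine score can be combined with a gate unit. The exact GAFM architecture is not specified in the abstract, so the function names, shapes, and the placement of the sigmoid gate here are illustrative assumptions, not the paper's actual implementation: pairwise feature interactions are pooled with AFM-style attention weights, and a learned gate then rescales the pooled interaction vector before the final projection.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def gated_afm_score(embeddings, attn_W, attn_b, attn_h, gate_w, gate_b, p):
    """Illustrative gated attentional FM scorer (not the paper's exact model).

    embeddings: (m, k) embedding vectors of the m active features.
    attn_W (k, t), attn_b (t,), attn_h (t,): AFM-style attention network.
    gate_w (k,), gate_b: parameters of the hypothetical gate unit.
    p (k,): final projection vector producing the scalar score.
    """
    m, k = embeddings.shape
    # All pairwise element-wise products v_i * v_j, i < j  -> (P, k)
    pairs = np.array([embeddings[i] * embeddings[j]
                      for i in range(m) for j in range(i + 1, m)])
    # Attention scores over interactions: ReLU MLP, then softmax
    hidden = np.maximum(pairs @ attn_W + attn_b, 0.0)   # (P, t)
    scores = hidden @ attn_h                            # (P,)
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    pooled = alpha @ pairs                              # (k,) weighted sum
    # Gate: a sigmoid unit that rescales the pooled interaction vector
    g = sigmoid(pooled @ gate_w + gate_b)
    return float((g * pooled) @ p)
```

A cheap gate like this adds only O(k) work on top of the attention pooling, which is one way a model can expose a knob between speed and accuracy: a near-zero gate effectively prunes the interaction term.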

