I stumbled upon this paper, which focuses on predicting a response variable from a sparse dataset. In a standard regression problem, we want to find a weight vector that maps a feature vector to a response value:

$$\hat{y}(x) = w_0 + \sum_{i=1}^{n} w_i x_i$$

This linear equation assumes that the observations are independent. That assumption may not hold for collaborative filtering or ranking problems, where the observations are correlated. Furthermore, an SVM cannot find a reliable hyperplane under data sparsity, so we need a model that is reliable in this setting. The Factorization Machine (FM) is a general predictor, and the paper shows that many prediction problems are equivalent to an FM.
FM models all pairwise feature interactions in the dataset. Because every interaction involving feature $i$ shares the same embedding $v_i$, each embedding is effectively estimated from many observations. If we model pairwise interactions among $n$ features, there are $\binom{n}{2} = \frac{n(n-1)}{2}$ interactions. The FM model equation is:

$$\hat{y}(x) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle v_i, v_j \rangle \, x_i x_j$$
The first two terms are the standard linear regression formulation. The third term models the pairwise interactions: the dot product $\langle v_i, v_j \rangle$ of the feature embedding vectors is the interaction weight between feature $i$ and feature $j$. This formulation implies that the features are not treated as independent.
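To make the model equation concrete, here is a minimal sketch of a naive FM prediction in NumPy. The function and variable names (`fm_predict_naive`, `w0`, `w`, `V`) are mine, not from the paper, and the loop deliberately mirrors the double sum in the equation rather than being efficient:

```python
import numpy as np

def fm_predict_naive(x, w0, w, V):
    """Naive FM prediction: bias + linear terms + all pairwise interactions.
    x: (n,) feature vector; w0: scalar bias; w: (n,) linear weights;
    V: (n, k) feature embedding matrix, so V[i] @ V[j] is the weight of
    the interaction between feature i and feature j."""
    n = len(x)
    y = w0 + w @ x
    for i in range(n):
        for j in range(i + 1, n):
            y += (V[i] @ V[j]) * x[i] * x[j]
    return y
```

This costs $O(kn^2)$ per prediction; as discussed below, the interaction term can be computed much more cheaply.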
The author shows that a 2-way FM is equivalent to an SVM with a degree-2 polynomial kernel. The only difference is that the interaction weight matrix $W$ in the SVM is a dense $n \times n$ matrix, whereas under the FM framework it is a low-rank matrix $W = VV^\top$ with $V \in \mathbb{R}^{n \times k}$. This leads to fewer parameters to estimate: $nk$ v.s. $\frac{n(n-1)}{2}$. If $k \ll n$, then the number of interaction parameters is almost linear in $n$.
Some people pointed out that the computation of FM is $O(kn^2)$, which is not the case. First, the interaction term can be rewritten as

$$\sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle v_i, v_j \rangle \, x_i x_j = \frac{1}{2} \sum_{f=1}^{k} \left[ \left( \sum_{i=1}^{n} v_{i,f}\, x_i \right)^2 - \sum_{i=1}^{n} v_{i,f}^2\, x_i^2 \right],$$

which costs only $O(kn)$. Second, if the input matrix is sparse, each inner sum only needs to run over the non-zero entries of $x$: if $N_z$ is the number of non-zero entries, then the computation of the interaction term is $O(k N_z)$. As long as $N_z \ll n$, FM's computation is almost linear.
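The two formulations can be checked against each other numerically. Below is a sketch (function names are mine) comparing the naive $O(kn^2)$ pairwise sum with the $O(kn)$ reformulation:

```python
import numpy as np

def interaction_naive(x, V):
    """O(k n^2): explicit sum of <v_i, v_j> x_i x_j over all pairs i < j."""
    n = len(x)
    return sum((V[i] @ V[j]) * x[i] * x[j]
               for i in range(n) for j in range(i + 1, n))

def interaction_linear(x, V):
    """O(k n) reformulation:
    0.5 * sum_f [(sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2].
    With a sparse x, both inner sums only touch non-zero entries."""
    s = V.T @ x                  # (k,) per-factor weighted sums
    s2 = (V ** 2).T @ (x ** 2)   # (k,) per-factor sums of squares
    return 0.5 * float(np.sum(s ** 2 - s2))
```

On the same inputs the two functions agree up to floating-point error, while the second touches each feature only once per factor.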
Nevertheless, the input features for FM still need some feature engineering: the choice of features is important, and feature normalization is necessary.
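As an illustration of the kind of feature construction involved, a collaborative-filtering observation is typically encoded as a sparse vector by concatenating one-hot user and item indicators. This is a hypothetical sketch (the function name and dimensions are mine, not from the paper):

```python
import numpy as np

def encode_user_item(user_id, item_id, n_users, n_items):
    """One-hot encode a (user, item) pair as a single sparse FM input.
    Slots [0, n_users) indicate the user; slots [n_users, n_users + n_items)
    indicate the item. All but two entries are zero, which is exactly the
    sparse setting FM is designed for."""
    x = np.zeros(n_users + n_items)
    x[user_id] = 1.0
    x[n_users + item_id] = 1.0
    return x
```

Additional dense features (e.g. timestamps or rating histories) would be appended to this vector, which is where normalization becomes important.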
Rendle, Steffen. “Factorization machines.” Data Mining (ICDM), 2010 IEEE 10th International Conference on. IEEE, 2010.