Tree-based models and deep neural networks are two effective families of classification methods in machine learning. While tree-based models are robust across data domains, deep neural networks have advantages in handling high-dimensional data. Adding a differentiable neural decision forest to a neural network can help exploit the benefits of both models. Meanwhile, traditional decision trees have diverged into a bagging variant (i.e., random forest) and a boosting variant (i.e., gradient boosting decision tree). In this work, we aim to harness the advantages of both bagging and boosting by applying gradient boosting to a neural decision forest. We propose a gradient boosting module that learns the residual with a neural decision forest and treats this residual as part of the final prediction. In addition, we design a structure that learns the parameters of the neural decision forest and the gradient boosting module in consecutive steps, and that extends to multiple gradient boosting modules trained in an end-to-end manner. Extensive experiments on several public datasets demonstrate that our model is competitive in performance and efficiency with a series of baseline methods across various machine learning tasks.
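To make the described architecture concrete, here is a minimal PyTorch sketch of the idea, not the paper's implementation: the class names, the single-linear-layer decision nodes, the leaf-logit parameterization, and the shrinkage value are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralDecisionForest(nn.Module):
    """Differentiable decision forest: soft (sigmoid) routing through a
    full binary tree to leaf nodes that hold class logits."""
    def __init__(self, in_dim, n_classes, n_trees=5, depth=3):
        super().__init__()
        self.n_trees, self.depth = n_trees, depth
        n_inner = 2 ** depth - 1                    # decision nodes per tree
        self.decisions = nn.Linear(in_dim, n_trees * n_inner)
        self.leaves = nn.Parameter(torch.randn(n_trees, 2 ** depth, n_classes))

    def forward(self, x):
        B = x.size(0)
        d = torch.sigmoid(self.decisions(x)).view(B, self.n_trees, -1)
        mu = x.new_ones(B, self.n_trees, 1)         # routing probabilities
        idx = 0
        for level in range(self.depth):
            n = 2 ** level
            p = d[:, :, idx:idx + n]                # go-left probability
            # Each node's probability mass splits between its two children.
            mu = torch.stack((mu * p, mu * (1 - p)), dim=-1).reshape(B, self.n_trees, -1)
            idx += n
        # Expected leaf logits under the routing, averaged over trees.
        return torch.einsum('btl,tlc->bc', mu, self.leaves) / self.n_trees

class BoostedNDF(nn.Module):
    """Base neural decision forest plus boosting modules that add shrunken
    residual corrections to the prediction, all trained end to end."""
    def __init__(self, in_dim, n_classes, n_boost=2, shrinkage=0.1):
        super().__init__()
        self.base = NeuralDecisionForest(in_dim, n_classes)
        self.boosters = nn.ModuleList(
            NeuralDecisionForest(in_dim, n_classes) for _ in range(n_boost))
        self.shrinkage = shrinkage                  # assumed boosting step size

    def forward(self, x):
        pred = self.base(x)
        for booster in self.boosters:
            # Each module contributes an additive correction in logit space;
            # joint training drives it to fit the remaining residual.
            pred = pred + self.shrinkage * booster(x)
        return pred

# Usage: all components receive gradients from one loss (end-to-end training).
model = BoostedNDF(in_dim=20, n_classes=3)
x, y = torch.randn(8, 20), torch.randint(0, 3, (8,))
loss = F.cross_entropy(model(x), y)
loss.backward()
```

Accumulating corrections in logit space mirrors how gradient boosting sums weak learners; the consecutive-step parameter learning the abstract mentions could plausibly be approximated by freezing earlier modules while fitting each new booster, though the paper's exact procedure may differ.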