An exploration of non-NN deep models based on non-differentiable modules
Professor Zhi-Hua Zhou, Nanjing University
The term "deep learning" is often regarded as a synonym for "deep neural networks" (DNNs). In this talk, we argue that the essence of deep learning lies in the combination of layer-by-layer processing, in-model feature transformation, and sufficient model complexity, and that it is not crucial whether deep models are realized by neural networks. To verify this conjecture, we will show that it is possible to construct non-NN style deep models that rely on neither backpropagation training nor gradient-based adjustment. We advocate the exploration of non-NN deep models, because neural-network-based deep models have already been studied for many years, while it is well known that no single model can always be the best.
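As an illustration only (the abstract names no specific model), the three ingredients above can be combined without gradients: build a cascade in which each layer is a small ensemble of decision stumps trained on bootstrap samples, and append each layer's class-probability output to the feature vector passed to the next layer. The sketch below is a toy assumption of mine, not the speaker's method, and every name in it (`train_cascade`, `layer_proba`, etc.) is hypothetical.

```python
import random

def train_stump(X, y):
    """Exhaustively pick the (feature, threshold) split with fewest errors."""
    best = None
    for f in range(len(X[0])):
        vals = sorted(set(x[f] for x in X))
        for lo, hi in zip(vals, vals[1:]):
            t = (lo + hi) / 2
            left = [yi for x, yi in zip(X, y) if x[f] <= t]
            right = [yi for x, yi in zip(X, y) if x[f] > t]
            pl = round(sum(left) / len(left))    # majority label, left side
            pr = round(sum(right) / len(right))  # majority label, right side
            err = sum(v != pl for v in left) + sum(v != pr for v in right)
            if best is None or err < best[0]:
                best = (err, f, t, pl, pr)
    if best is None:                  # every feature is constant
        maj = round(sum(y) / len(y))
        return 0, X[0][0], maj, maj
    return best[1:]

def stump_predict(stump, x):
    f, t, pl, pr = stump
    return pl if x[f] <= t else pr

def layer_proba(stumps, x):
    # fraction of stumps voting class 1: a crude probability estimate
    return sum(stump_predict(s, x) for s in stumps) / len(stumps)

def train_cascade(X, y, n_layers=3, n_stumps=5, seed=42):
    """Train layer by layer; no backpropagation, no gradient adjustment."""
    rng = random.Random(seed)
    layers, aug = [], [list(x) for x in X]
    n = len(X)
    for _ in range(n_layers):
        stumps = []
        for _ in range(n_stumps):
            idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
            stumps.append(train_stump([aug[i] for i in idx],
                                      [y[i] for i in idx]))
        layers.append(stumps)
        # in-model feature transformation: append this layer's output
        aug = [x + [layer_proba(stumps, x)] for x in aug]
    return layers

def cascade_predict(layers, x):
    x = list(x)
    for stumps in layers:
        p = layer_proba(stumps, x)
        x = x + [p]                   # same augmentation as at training time
    return 1 if p >= 0.5 else 0
```

The cascade exhibits all three ingredients: the layers give layer-by-layer processing, the appended probability column is an in-model feature transformation, and depth and ensemble size control model complexity, yet nothing in it is differentiable.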