What exactly are neural networks? What types of them are you familiar with?
For more context: I know what a neural network is and how backpropagation works. I know that a DNN must have multiple hidden layers. However, 10 years ago in class I learned that having several hidden layers or just one (not counting the input and output layers) was equivalent in terms of the functions a neural network is able to represent, and that having more layers only made the network harder to analyse without any gain in performance. Obviously, that is not the case anymore.
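For reference, the equivalence result I have in mind is, I believe, the universal approximation theorem; my paraphrase of it (the exact conditions may be off) is:

```latex
% Universal approximation (roughly): for any continuous
% f : K \to \mathbb{R} on a compact set K \subset \mathbb{R}^n,
% a fixed sigmoidal activation \sigma, and any \varepsilon > 0,
% there exist N and weights a_i, b_i \in \mathbb{R},
% w_i \in \mathbb{R}^n such that
\left| f(x) - \sum_{i=1}^{N} a_i \,\sigma\!\left(w_i^{\top} x + b_i\right) \right|
  < \varepsilon
\quad \text{for all } x \in K.
```

In other words, a single hidden layer (with enough units) already suffices to approximate any continuous function on a compact domain, which is why I was taught that extra layers added nothing in principle.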
I suppose, perhaps wrongly, that the differences lie in the training algorithms and their properties rather than in the structure itself, and therefore I would really appreciate it if the answer could highlight the reasons that made the move to DNNs possible.