So what is a neural network? Mathematically, we can define a neural network as a differentiable function that maps one kind of variable to another. In classification problems, for example, we map vectors to vectors (of class scores), while in regression problems we map vectors to scalars. Recurrent nets add sequences to the mix, and so we end up with a range of architectures suited to different applications.
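To make the "differentiable function" view concrete, here is a minimal sketch in NumPy of a one-hidden-layer network mapping a vector to a scalar, as in the regression case above. All names (`mlp`, `W1`, `b1`, and so on) are illustrative choices, not anything defined in the text, and the architecture is just the simplest example of such a function.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, w2, b2):
    """A differentiable function f: vector -> scalar.

    f(x) = w2 . tanh(W1 x + b1) + b2
    Every operation (matrix product, tanh, addition) is differentiable,
    so the whole composition is too -- which is what lets us train it
    with gradient-based methods.
    """
    h = np.tanh(W1 @ x + b1)   # hidden layer activations
    return float(w2 @ h + b2)  # scalar output (regression)

# Illustrative random parameters: 3-dimensional input, 4 hidden units.
W1 = rng.normal(size=(4, 3))
b1 = rng.normal(size=4)
w2 = rng.normal(size=4)
b2 = rng.normal()

x = np.array([1.0, -2.0, 0.5])
y = mlp(x, W1, b1, w2, b2)  # a single scalar
```

Swapping the scalar output for a vector of class scores (a matrix in place of `w2`) gives the classification case; feeding in one vector per time step and carrying a hidden state between steps gives the recurrent case.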