So let me just go and show you an example of a neural network, which was for many years, you know, the most effective learning algorithm before support vector machines were invented. Here's Yann LeCun's video, and, well, there's actually audio on this too, but I'll just tell you what's happening. What you're seeing is a trained neural network, and this display where my mouse pointer is pointing, this big three there, is the input to the neural network.
So you're showing the neural network this image, and it's trying to recognize what this is. The final answer output by the neural network is this number up here, right below where it says LeNet-5, and the neural network correctly recognizes this image as a three. And if you look to the left of this image, what's interesting is that the display on the left portion is actually showing the intermediate computations of the neural network. In other words, it's showing you what the hidden layers of the neural network are computing.
And so, for example, if you look at this one, the third image down from the top, this seems to be computing, you know, certain edges of the digit, right? It's picking up edges on the right-hand side or the bottom of the input image, or something like that, okay? So let me just play this video, and you can see some of the inputs and outputs of the neural network, and those are very different fonts. There's this robustness to noise. All right. Multiple digits, that's kind of cool. All right.
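To make that idea of a hidden unit computing edges a bit more concrete, here's a minimal sketch. This is not LeCun's LeNet-5 itself; the tiny image and the filter weights below are made up for illustration, and in a real trained network those weights would be learned from data rather than hand-set.

```python
import numpy as np

# A tiny 6x6 "image" (made up for illustration): dark on the left, bright on
# the right, so there is a vertical edge down the middle.
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# A hand-picked 3x3 filter that responds to dark-to-bright vertical edges.
# In a trained network these weights would be learned, not hand-set.
vertical_edge_filter = np.array([[-1.0, 0.0, 1.0],
                                 [-1.0, 0.0, 1.0],
                                 [-1.0, 0.0, 1.0]])

def conv2d_valid(img, filt):
    """Slide the filter over the image (no padding), taking a dot product at each position."""
    fh, fw = filt.shape
    h = img.shape[0] - fh + 1
    w = img.shape[1] - fw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i+fh, j:j+fw] * filt)
    return out

# The hidden unit's activation map: large values exactly where the vertical edge is.
activation = np.tanh(conv2d_valid(image, vertical_edge_filter))
print(np.round(activation, 2))
```

The printed activation map lights up along the edge and stays near zero elsewhere, which is roughly the kind of intermediate picture the panels on the left of the display are showing for the filters the network actually learned.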
So, just for fun, let me show you one more video, which was, let's see, this is another video, from the series The Machine That Changed the World, which was produced by WGBH Television in co-production with the British Broadcasting Corporation, and it was aired on PBS a few years ago, I think. I want to show you a video describing the NETtalk neural network, which was developed by the researcher Terry Sejnowski. NETtalk was actually one of the major milestones in the history of neural networks, and this specific application is getting a neural network to read text.
So, in other words, can you show a piece of English text to a computer and have the computer read it, sort of, verbally produce sounds that correspond to the reading of the text? And it turns out that in the history of AI and the history of machine learning, this video created a lot of excitement about neural networks and about machine learning. Part of the reason was that Terry Sejnowski had the foresight to use, in his video, a child-like voice talking about visiting your grandmother's house and so on.
You'll see it in a second, and so this really created the impression of the neural network being like a young child learning how to speak, talking about going to your grandmother's and so on. So this actually helped generate a lot of excitement about neural networks, both within and outside academia, sort of, early in the history of neural networks. I'm just gonna show you the video.