• The first thing we'll do is show you a number of case studies of effective convolutional neural networks.


  • So why look at case studies? In the last section, we learned about the basic building blocks of conv nets, such as convolutional layers, pooling layers, and fully connected layers.


  • It turns out a lot of the past few years of computer vision research has been on how to put together these basic building blocks to form effective convolutional neural networks.
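
  To make "putting these building blocks together" concrete, here is a minimal sketch of a tiny conv net that stacks convolutional, pooling, and fully connected layers, written in PyTorch. The layer sizes and channel counts are arbitrary assumptions for illustration, not any particular published architecture.

```python
import torch
import torch.nn as nn

# A tiny conv net: conv -> pool -> conv -> pool -> fully connected.
# All sizes below are arbitrary choices for illustration.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolutional layer
    nn.ReLU(),
    nn.MaxPool2d(2),                               # pooling layer
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),                     # fully connected layer
)

x = torch.randn(1, 3, 32, 32)   # one 32x32 RGB image
print(model(x).shape)           # torch.Size([1, 10])
```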


  • And one of the best ways for you to get intuition yourself is to see some of these examples.


  • I think that just as many of you may have learned to write code by reading other people's code, a good way to get intuition on how to build conv nets is to read or see examples of effective conv nets.


  • And it turns out that a convolutional neural network architecture that works well on one computer vision task often works well on other tasks as well, such as maybe your own task.


  • So if someone else has trained a neural network that is very good at recognizing cats, dogs, and people, but you have a different computer vision task, like maybe you're trying to build a self-driving car, you might well be able to take someone else's neural network architecture and apply it to your problem.


  • And finally, after the next few sections, you'll be able to read some of the research papers from the computer vision literature. You don't have to do this, but I hope you'll find it satisfying to be able to read some of these seminal computer vision research papers and see yourself able to understand them.


  • First, we'll look at a few classic networks: the LeNet-5 network, which dates from the late 1990s; AlexNet, which is often cited; and the VGG network. These are examples of pretty effective neural networks.


  • And some of those ideas laid the foundation for modern computer vision.


  • And you'll see ideas from these papers that will probably be useful for your own work as well. Then I want to show you ResNet, or the residual network; you might have heard that neural networks are getting deeper and deeper.


  • The ResNet authors trained a very, very deep 152-layer neural network, and it uses some very interesting tricks and ideas for how to do that effectively.
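
  As a preview of the kind of trick ResNet uses, here is a minimal sketch of a residual block with a skip connection, written in PyTorch. The channel counts and layer choices are illustrative assumptions, not the exact configuration from the ResNet paper.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A minimal residual block: output = ReLU(F(x) + x).

    Channel counts and kernel sizes here are illustrative only.
    """
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        shortcut = x                                 # the skip connection
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + shortcut                         # add the input back in
        return self.relu(out)

# Example: pass a batch of 8 feature maps with 64 channels through the block.
x = torch.randn(8, 64, 32, 32)
print(ResidualBlock(64)(x).shape)  # torch.Size([8, 64, 32, 32])
```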


  • And then finally you'll also see a case study of the Inception neural network.


  • After seeing these neural networks, I think you'll have much better intuition about how to build effective convolutional neural networks. And even if you don't end up working on computer vision yourself, I think you'll find that many of the ideas from these examples, such as ResNet and the Inception network, are cross-fertilizing and making their way into other disciplines.


  • So even if you don't end up building computer vision applications yourself, I think you'll find some of these ideas very interesting and helpful for your work.