Overview

Introduction

Neural networks approximate complex nonlinear functions by stacking layers of linear transformations interleaved with nonlinear activations. The backpropagation algorithm computes the gradient of the loss with respect to every parameter in a single backward pass, making gradient-based training efficient and enabling deep learning breakthroughs.
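The layered structure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the course materials: a two-layer network computing linear → ReLU → linear, with hypothetical weight names `W1`, `b1`, `W2`, `b2`.

```python
import numpy as np

def relu(z):
    # Elementwise nonlinearity: max(0, z)
    return np.maximum(0.0, z)

def forward(x, W1, b1, W2, b2):
    """Two-layer feed-forward network: linear -> ReLU -> linear."""
    h = relu(W1 @ x + b1)   # hidden representation
    return W2 @ h + b2      # output (logits)

# Randomly initialized parameters for a 2 -> 4 -> 1 network
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 2)), np.zeros(4)
W2, b2 = rng.standard_normal((1, 4)), np.zeros(1)

y = forward(np.array([0.5, -1.0]), W1, b1, W2, b2)
```

Stacking more such linear/activation pairs deepens the network; without the nonlinearity, the composition would collapse to a single linear map.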

Knowledge Points

  • Introduction to the perceptron algorithm
  • Structure of a feed-forward neural network
  • Activation functions: ReLU, softmax
  • The backpropagation algorithm
  • Implementing a simple neural network from scratch (e.g., on MNIST or XOR)
  • Deriving the gradient of softmax + cross-entropy
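Several of the points above can be combined in one small worked example. The sketch below (an illustration under assumed hyperparameters, not the course's reference implementation) trains a 2→8→2 network from scratch on XOR with a ReLU hidden layer, a softmax output, and cross-entropy loss. It uses the identity that the gradient of softmax + cross-entropy with respect to the logits is simply `P - Y`.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset: 4 inputs, one-hot targets over 2 classes
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.eye(2)[[0, 1, 1, 0]]

# 2 -> 8 -> 2 network; He-style scaling for the ReLU layer
W1 = rng.standard_normal((2, 8)) * np.sqrt(2 / 2)
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 2)) * np.sqrt(2 / 8)
b2 = np.zeros(2)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.5                                   # assumed learning rate
for _ in range(2000):
    # Forward pass
    H = np.maximum(0.0, X @ W1 + b1)       # ReLU hidden activations
    P = softmax(H @ W2 + b2)               # predicted class probabilities

    # Backward pass (backpropagation).
    # For softmax + cross-entropy, dL/dlogits = P - Y (averaged over the batch).
    dZ2 = (P - Y) / len(X)
    dW2, db2 = H.T @ dZ2, dZ2.sum(axis=0)
    dH = dZ2 @ W2.T
    dZ1 = dH * (H > 0)                     # ReLU passes gradient only where H > 0
    dW1, db1 = X.T @ dZ1, dZ1.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Final predictions
H = np.maximum(0.0, X @ W1 + b1)
pred = (H @ W2 + b2).argmax(axis=1)
```

The same loop structure carries over to MNIST by swapping in the larger dataset, widening the layers, and using mini-batches instead of the full four-point batch.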