Deep Learning — ResNet, Transfer Learning & Contrastive Learning
Summary
Modern deep learning on CIFAR-10 (60,000 32x32 color images, 10 classes). Topics: training ResNet18 from scratch versus transfer learning with pretrained ImageNet weights (freezing the convolutional layers and replacing the final fully-connected layer); data augmentation with random crops and horizontal flips; SGD with learning-rate scheduling; Supervised Contrastive Learning (the SupCon loss of Khosla et al., NeurIPS 2020), which learns representations in which same-class images cluster together; and evaluation via confusion matrices.
Materials
Deep Learning — CNNs, Transfer Learning & Contrastive Learning
How CNNs see images, why pretrained features transfer across tasks, and how contrastive learning teaches networks to organize representations.
Deep Learning Notebook — CIFAR-10 with ResNet, Transfer Learning & SupCon
Train ResNet18 from scratch on CIFAR-10, apply transfer learning with frozen ImageNet weights, implement Supervised Contrastive Learning, and analyze with confusion matrices.
Includes notebook
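The SupCon objective mentioned above pulls each anchor toward every same-class sample in the batch while pushing away the rest. A minimal sketch of the loss, assuming L2-normalized embeddings and a batch-level label comparison (the temperature value is an illustrative assumption):

```python
import torch
import torch.nn.functional as F

def supcon_loss(features, labels, temperature=0.1):
    """Supervised Contrastive (SupCon) loss in the spirit of Khosla et al. 2020.

    features: (N, D) embedding batch; labels: (N,) integer class ids.
    For each anchor, positives are all other batch samples with the same label.
    """
    features = F.normalize(features, dim=1)            # unit-length embeddings
    sim = features @ features.T / temperature          # pairwise similarities
    n = features.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)             # exclude self-similarity
    # Positives: same label as the anchor, excluding the anchor itself.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    # Log-softmax over each anchor's row, then average over its positives.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(1)
    loss_per_anchor = -(log_prob * pos_mask).sum(1) / pos_counts.clamp(min=1)
    # Anchors with no positive in the batch are dropped from the mean.
    return loss_per_anchor[pos_counts > 0].mean()
```

When embeddings of the same class collapse toward a shared direction, the positive similarities dominate each row's softmax and the loss approaches zero, which is exactly the clustering behavior the summary describes.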