Implement classic papers from scratch
Pick any paper below and PaperNova generates a guided workbook: an outline, per-exercise explanations from beginner to advanced, and a downloadable Jupyter notebook you can run locally.
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Alexey Dosovitskiy, Lucas Beyer, Alexander Kolesnikov, +3 more
Vision Transformer (ViT). Applies a pure Transformer to image patches and matches CNNs on ImageNet at scale — the paper that unified vision and language architectures.
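ViT's core move is tokenisation: slice the image into fixed-size patches and flatten each one into a vector, so a standard Transformer can treat them like words. A minimal sketch of that step, using plain Python lists for a single-channel image (function and variable names here are illustrative, not from the paper's code):

```python
def patchify(image, patch):
    """Split a 2-D image (list of rows) into flattened patch x patch
    vectors in row-major order -- ViT's tokenisation step."""
    h, w = len(image), len(image[0])
    assert h % patch == 0 and w % patch == 0, "image must tile evenly"
    patches = []
    for top in range(0, h, patch):
        for left in range(0, w, patch):
            # Flatten one patch into a single token vector.
            vec = [image[r][c]
                   for r in range(top, top + patch)
                   for c in range(left, left + patch)]
            patches.append(vec)
    return patches

# A 4x4 "image" split into 2x2 patches yields 4 tokens of length 4.
img = [[0,  1,  2,  3],
       [4,  5,  6,  7],
       [8,  9, 10, 11],
       [12, 13, 14, 15]]
tokens = patchify(img, 2)
```

In the paper each flattened patch is then linearly projected and given a position embedding; with 16x16 patches a 224x224 image becomes a sequence of 196 tokens.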
Deep Residual Learning for Image Recognition
Kaiming He, Xiangyu Zhang, Shaoqing Ren, +1 more
ResNet. Residual connections made it possible to train networks with hundreds of layers and became standard plumbing for nearly every deep architecture since.
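The residual idea fits in one line: instead of learning a mapping H(x) directly, a block learns a correction F(x) and outputs F(x) + x. A toy sketch on plain vectors (in the real ResNet, F is a stack of convolutions, batch norm, and ReLU):

```python
def residual_block(x, transform):
    """Apply a learned transform and add the input back: y = F(x) + x."""
    fx = transform(x)
    return [a + b for a, b in zip(fx, x)]

# If F collapses to zero, the block is the identity -- which is why
# adding more residual blocks cannot make the network strictly worse,
# and very deep stacks stay trainable.
out = residual_block([1.0, 2.0, 3.0], lambda v: [0.0] * len(v))
```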
ImageNet Classification with Deep Convolutional Neural Networks
Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
AlexNet. The paper that kicked off the modern deep-learning era in computer vision by winning ImageNet 2012 with a convolutional neural network trained on GPUs.
You Only Look Once: Unified, Real-Time Object Detection
Joseph Redmon, Santosh Divvala, Ross Girshick, +1 more
YOLO. Single-shot object detection that frames detection as a regression problem — fast, end-to-end, and foundational for real-time vision systems ever since.
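"Detection as regression" means each ground-truth box is encoded as numbers for one grid cell to predict. A simplified sketch of that target encoding for a single box centre (ignoring width, height, and confidence terms; names are illustrative):

```python
def encode_box_centre(cx, cy, grid=7):
    """Map a normalised box centre (cx, cy in [0, 1)) to the responsible
    grid cell and cell-relative offsets -- YOLO-style regression targets."""
    col = int(cx * grid)          # which cell column "owns" the box
    row = int(cy * grid)          # which cell row "owns" the box
    x_off = cx * grid - col       # offset within the cell, in [0, 1)
    y_off = cy * grid - row
    return row, col, x_off, y_off

# A box centred in the middle of the image on a 7x7 grid lands in
# cell (3, 3), halfway across it in both directions.
row, col, x_off, y_off = encode_box_centre(0.5, 0.5)
```

The network then regresses these offsets (plus box size and class scores) for every cell in one forward pass, which is what makes the detector single-shot.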
U-Net: Convolutional Networks for Biomedical Image Segmentation
Olaf Ronneberger, Philipp Fischer, Thomas Brox
U-Net. Encoder-decoder with skip connections that became the default architecture for medical imaging and any dense-prediction task on small datasets.
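The skip connection itself is just channel-wise concatenation: features saved on the way down the encoder are stitched onto the upsampled decoder features at the same resolution, so fine spatial detail survives the bottleneck. A minimal sketch with channels as lists of 2-D maps (real implementations concatenate tensors along the channel axis):

```python
def skip_concat(encoder_channels, decoder_channels):
    """U-Net skip connection: concatenate encoder feature channels with
    the upsampled decoder channels at matching spatial resolution."""
    rows = len(encoder_channels[0])
    assert all(len(c) == rows for c in decoder_channels), "spatial sizes must match"
    return encoder_channels + decoder_channels

# One encoder channel + one decoder channel -> two channels for the
# next decoder convolution to mix.
enc = [[[1, 1], [1, 1]]]
dec = [[[2, 2], [2, 2]]]
merged = skip_concat(enc, dec)
```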
Why implement classic papers?
Reading a paper and implementing it are two very different skills. PaperNova's workbook tool bridges that gap: Gemini turns the paper into a sequence of small, self-contained exercises — from a warm-up reimplementation of the core idea up to advanced extensions — then assembles them into a Jupyter notebook you can run, edit, and extend.
Prefer to work from your own paper? Upload a PDF and get the same guided workbook tailored to it.