
A Self-Driving Car in a Weekend

Cafe Bazaar Internal Hackathon  ·  2019


Cafe Bazaar ran internal hackathons, and the brief for this one was open-ended enough that the team decided to try something ambitious: build a self-driving car. A small RC car, fitted with a camera, trained on track footage, steered by a neural network. We had a weekend. We did it.

How we built it

The approach was end-to-end learning: drive the car around the track manually while recording camera footage, then train a model to predict steering angles from image frames. We used a pre-trained ResNet to extract visual features, then a small MLP on top to produce a steering command. Simple in principle; the full pipeline was buildable in the time we had.

The harder part was data collection. A human driving carefully around a track produces a narrow distribution — mostly "drive straight" with occasional gentle corrections. A model trained only on that fails immediately once anything goes slightly off, because it has never seen a recovery. We ended up collecting a lot more data specifically in the failure states: the car drifting toward the edge, or coming out of a corner wide. That distribution shift between training data and deployment is a recurring lesson in ML projects well beyond this one.
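One simple way to counter that skew, beyond recording extra recovery footage, is to oversample the rare turning frames during training. The sketch below is illustrative, not our pipeline: the synthetic angle distribution and the weight floor of 0.05 are assumptions chosen to make the effect visible.

```python
import torch
from torch.utils.data import WeightedRandomSampler

# Hypothetical recorded steering angles: 90% "drive straight" frames,
# 10% frames with real turning, mimicking careful human driving.
angles = torch.cat([torch.zeros(900), torch.linspace(-1.0, 1.0, 100)])

# Weight each frame by steering magnitude so corrective maneuvers are
# drawn more often; a small floor keeps straight frames from vanishing.
weights = angles.abs() + 0.05
sampler = WeightedRandomSampler(weights, num_samples=len(weights),
                                replacement=True)

# Indices drawn for one epoch; a DataLoader would take sampler=sampler.
batch = list(sampler)
frac_turning = (angles[batch].abs() > 0.1).float().mean()
print(frac_turning)  # far above the raw ~9% of turning frames
```

The same idea applies to the recovery clips: frames recorded near the track edge carry the most information per sample, so the sampler should see them disproportionately often.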

What happened

It worked. The car drove itself around the track for several miles, imperfectly but unmistakably without human intervention. We have video. It's a naive system — very far from anything that would work on a real road — but for a weekend, it was a satisfying result. The experience also made end-to-end learning feel concrete in a way that reading about it doesn't. You can watch exactly where the model is confused. The failure modes are visible and spatial.

Tags: Python · PyTorch · ResNet · Computer Vision · End-to-end Learning