Doing a master's was one of the more interesting things I've done. Part of what made it interesting was unexpected: growing up, I genuinely hated math. The way it was taught in school — all pressure, no curiosity — made me avoid it. But in my master's, for the first time, I actually enjoyed it. The courses were about ideas rather than procedures, and that made a real difference. My major was in Algorithm and Computation Theory, which is about as close to pure theoretical computer science as you can get, and I'm glad I went that direction.
For my thesis, I worked on spiking neural networks (SNNs) — partly because that was my supervisor's area, but also because SNNs are more conceptually interesting than immediately useful, which, for a thesis, I think is the right framing.
What spiking networks are
Conventional neural networks pass continuous floating-point activations between layers. Spiking networks don't — they pass discrete events in time, the way biological neurons actually fire. Each neuron accumulates input until it crosses a threshold, fires a spike, and resets. The temporal dynamics are part of the representation, not incidental.
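The accumulate-threshold-fire-reset loop above is easy to see in code. This is a minimal leaky integrate-and-fire sketch, not the model from the thesis; the parameter values (`tau`, `v_thresh`, the constant input) are illustrative choices:

```python
def lif_step(v, i_in, tau=0.9, v_thresh=1.0):
    """One timestep of a leaky integrate-and-fire neuron:
    leak, integrate input, fire if the threshold is crossed, reset."""
    v = tau * v + i_in          # leaky integration of the input current
    spike = 1 if v >= v_thresh else 0
    if spike:
        v = 0.0                 # hard reset after firing
    return v, spike

# Drive the neuron with a constant input and record its spike train.
v, spikes = 0.0, []
for t in range(10):
    v, s = lif_step(v, 0.4)
    spikes.append(s)
# spikes == [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Note that the output is a binary event train in time, not a continuous activation — the spike *timing* carries the information.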
This makes SNNs a natural match for Dynamic Vision Sensors (DVS cameras), which produce discrete events rather than frames: each pixel fires independently, asynchronously, when its local brightness changes. For gesture recognition, this captures motion in a way that conventional cameras don't.
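Concretely, a DVS emits a sparse stream of per-pixel events rather than frames. The representation below is a generic sketch (the tuple layout and values are illustrative, not any particular camera's format), along with one common preprocessing step: binning events into fixed time windows so they can drive SNN timesteps:

```python
from collections import defaultdict

# Each event: (timestamp_us, x, y, polarity), where polarity is +1 for a
# local brightness increase and -1 for a decrease at that pixel.
events = [
    (100, 3, 1, +1),   # pixel (3, 1) got brighter at t=100 us
    (250, 3, 1, -1),   # the same pixel got darker shortly after
    (260, 7, 2, +1),
]

def bin_events(events, window_us=200):
    """Group events into fixed time windows (hypothetical window size)."""
    bins = defaultdict(list)
    for t, x, y, p in events:
        bins[t // window_us].append((x, y, p))
    return dict(bins)

binned = bin_events(events)
# binned == {0: [(3, 1, 1)], 1: [(3, 1, -1), (7, 2, 1)]}
```

Static parts of the scene produce no events at all, which is why motion is captured so directly.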
Building the tools from scratch
Standard ML tooling — PyTorch, Keras — doesn't natively support spiking networks. The non-differentiable spike function requires surrogate gradient methods for training, and the temporal dynamics require careful state handling across timesteps. I implemented the core training and simulation tools from scratch. Time-consuming, but it meant I understood what was actually happening at each layer — which is not always true when a framework abstracts everything for you.
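To make the surrogate-gradient idea concrete, here is an illustrative single-neuron sketch, not the thesis code: the forward pass uses the non-differentiable Heaviside step, and the backward pass substitutes the derivative of a fast sigmoid, `1 / (1 + |u|)^2`, so a gradient signal can flow through the spike. All names, the threshold, and the learning rate are assumptions for the example:

```python
V_THRESH = 1.0

def spike_forward(v):
    # Non-differentiable Heaviside step: 1 if the membrane crossed threshold.
    return 1.0 if v >= V_THRESH else 0.0

def spike_surrogate_grad(v, scale=1.0):
    # Surrogate derivative used only in the backward pass (fast-sigmoid form).
    u = scale * (v - V_THRESH)
    return scale / (1.0 + abs(u)) ** 2

# One manual gradient step for a single weight w on input x, target spike = 1.
w, x, lr = 0.5, 1.0, 0.1
v = w * x                                        # membrane potential
s = spike_forward(v)                             # s == 0.0: neuron stays silent
grad = (s - 1.0) * spike_surrogate_grad(v) * x   # dL/dw via the surrogate
w -= lr * grad                                   # weight moves toward firing
```

The true gradient of the step function is zero almost everywhere, so without the surrogate the weight would never update; in a framework this substitution lives inside a custom autograd function, which is exactly the kind of machinery that has to be hand-built.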
The results on gesture recognition datasets were competitive with conventional CNN approaches, with notably lower neural activation density. Whether that translates to real energy efficiency on neuromorphic hardware is a question the thesis raised but didn't fully answer.
Research has a different rhythm than engineering work. The feedback loops are longer, the definitions of success are fuzzier, and you spend a lot of time unsure whether you're on the right track. I found both things to be true: I loved it while I was doing it, and I was also glad to go back to building things afterward.