
Hot Wheels and a Lesson in Cold Starts
TL;DR: I built a classifieds app for Hot Wheels collectors. Validated the problem, ran pilots, launched — and it died quietly from lack of critical mass. This is about doing everything “right” and ...

Review of all the books I read in 2025.

And What We Actually Know About Knowledge

TL;DR: Yoshua Bengio’s 2003 paper “A Neural Probabilistic Language Model” is the genesis of modern NLP. Before this paper, language models were statistical counting machines. After it, they became ...
Review of all the books I read in 2024.
Talk Trois Log

TL;DR: RNNs were the first serious attempt at giving neural networks memory. The idea was elegant — feed the past into the present. But they collapsed under their own weight, literally. Gradients v...
Review of all the books I read in 2023.
TL;DR: Before normalization, training deep networks was like trying to stack cards in a hurricane—one small change would topple everything. BatchNorm (2015) changed the game by stabilizing training...

TL;DR: Dropout started as a simple trick to prevent overfitting—randomly turn off neurons during training. But it evolved into something profound: a gateway to understanding uncertainty in deep lea...