About me
I’m Swayam Bhanded, a research engineer at Luma Labs AI based in India. My work focuses on making diffusion models more efficient to train and run.
Research
I’m the author of Speedrunning ImageNet Diffusion (arXiv 2512.12386), which introduces SR-DiT (Speedrun Diffusion Transformer) — a framework that combines several training techniques to achieve a ~360x training speedup on ImageNet-256. It reaches 3.14 FID with only a 140M-parameter model, rivaling models 5x its size while training for significantly fewer iterations.
You can see other projects on my GitHub.
I also post thoughts about research on X.
Blockchain
I previously worked at Fuel Labs, where I was a contributor to the Sway smart contract language.
Other
I made this AI animated version of Bad Apple a while ago.
It didn’t get many views on YouTube or X, but it gained millions of views when reposted on Chinese social media.
I speak Japanese and am learning Chinese.
