← Home

Thursday

Shifting focus from the hardware to the software, the most general idea was an automated livestreaming system that could manage multiple cameras, automate ingestion, and produce structured documentaries or clips.

Similar to how Cursor builds on VS Code, the best path forward seemed to be leveraging existing open-source infrastructure rather than rebuilding from scratch, since our previous platform was too tailored to one use case. I started experimenting with OBS (Open Broadcaster Software) but quickly realized it wasn’t productive. Any “Cursor for video” faces the same challenge: in a B2C model, the input and output cases are so varied that meaningful automation is nearly impossible. Models aren’t there yet. One user might want a summary of their dog’s day; another might want clips of clouds.

So in the evening, after building out my personal fork of OBS, I realized the next step was finding a high-visibility demonstration with almost no budget. That would mean controlling where the cameras were placed, what they would see, and what the output should be. It hit me! I’m part of The Residency, a network of hacker houses. What if one became the first AI-directed reality show? A 24/7 livestream for a week where an AI would automatically choose camera angles and narrate in real time.
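The “AI chooses camera angles” idea boils down to a director loop: score every feed for how interesting it is right now, then cut to the winner. The sketch below is purely illustrative; `CameraFeed`, `score_feed`, and the weights are all hypothetical names and numbers, not anything from the actual system.

```python
from dataclasses import dataclass

@dataclass
class CameraFeed:
    name: str
    motion: float  # 0..1, how much is happening in frame (hypothetical signal)
    faces: int     # number of faces detected (hypothetical signal)

def score_feed(feed: CameraFeed) -> float:
    """Toy interest score: favor motion and visible people."""
    return 0.6 * feed.motion + 0.4 * min(feed.faces, 3) / 3

def pick_camera(feeds: list[CameraFeed]) -> CameraFeed:
    """The 'AI director' step: cut to the most interesting feed."""
    return max(feeds, key=score_feed)

feeds = [
    CameraFeed("kitchen", motion=0.8, faces=2),
    CameraFeed("lounge", motion=0.2, faces=0),
    CameraFeed("desk", motion=0.5, faces=1),
]
print(pick_camera(feeds).name)  # kitchen
```

In a real build, the scores would come from a vision model rather than hand-set numbers, and the chosen feed would be pushed to the stream (e.g. by switching scenes through OBS’s WebSocket API), but the cut-to-the-best-feed loop is the core of the concept.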