AIOS:ThinkTank robot car with mecanum wheels, Raspberry Pi, camera, and ultrasonic sensors

The Prompt Is the Program

⚡ TL;DR
- Built a robot car controlled entirely by an LLM (Qwen3.5-27B on an RTX 5090) — no traditional control code
- The AI sees through a camera, reads ultrasonic sensors, and issues direct motor commands every ~1.3 seconds
- It follows me through the house autonomously for 15+ minute sessions — and has crashed into a plant pot twice
- Key insight: qualitative reasoning goes to the AI, quantitative enforcement goes to traditional code
- The operating system is optimized for a user that isn't there anymore

Operating systems were built for humans. What happens when the user is an AI? I built a robot car to find out. ...
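The "qualitative to the AI, quantitative to the code" split can be sketched as a thin safety wrapper around the LLM's output. This is an illustrative sketch, not the post's actual implementation — all names (`enforce_limits`, `MIN_CLEARANCE_CM`, the command dict shape) are assumptions:

```python
# Hypothetical sketch of the split described above: the LLM picks the
# maneuver (qualitative), plain code enforces hard limits (quantitative).
MIN_CLEARANCE_CM = 20   # hard stop distance, enforced in code, never by the LLM
MAX_SPEED = 0.5         # cap on normalized motor power

def enforce_limits(llm_command: dict, ultrasonic_cm: float) -> dict:
    """Take the LLM's qualitative choice, apply quantitative safety rules."""
    # Clamp whatever speed the model asked for into the allowed range.
    speed = max(-MAX_SPEED, min(MAX_SPEED, llm_command.get("speed", 0.0)))
    # Quantitative override: refuse forward motion when too close to an obstacle.
    if ultrasonic_cm < MIN_CLEARANCE_CM and speed > 0:
        speed = 0.0
    return {"direction": llm_command.get("direction", "stop"), "speed": speed}

print(enforce_limits({"direction": "forward", "speed": 0.9}, ultrasonic_cm=15))
# → {'direction': 'forward', 'speed': 0.0}
```

The point of the pattern is that the model never has the final word on anything measurable: it can ask for "forward, fast," but the wrapper decides what "fast" means and whether "forward" is currently safe.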

March 26, 2026 · 10 min · Brian Hengen

I Let Two AIs Talk to Each Other for 500 Turns. It Got Weird.

⚡ TL;DR
- Two AIs asked to "help me get rich" spiraled into inventing the Infinite Radiance Cosmic Leitmotif Convergence Harmonizer
- Two AIs writing a story introduced themselves as "Codex Alpha" and "Lyric Weaver" (adorable)
- Two AIs in Socratic mode spent 100+ turns philosophizing about chickens crossing roads
- The actual experiment: getting them to build a text adventure game. It almost worked.

Last weekend I built a tool that lets two local LLMs talk to each other. I had serious research goals—testing whether specialized coding models could collaborate with general-purpose models to build working software. ...

December 17, 2025 · 10 min · Brian Hengen

Subscribe to New Posts

Get notified when I publish new AI experiments and research findings.