AIOS:ThinkTank robot car with mecanum wheels, Raspberry Pi, camera, and ultrasonic sensors

The Prompt Is the Program

⚡ TL;DR

- Built a robot car controlled entirely by an LLM (Qwen3.5-27B on an RTX 5090) — no traditional control code
- The AI sees through a camera, reads ultrasonic sensors, and issues direct motor commands every ~1.3 seconds
- It follows me through the house autonomously for 15+ minute sessions — and has crashed into a plant pot twice
- Key insight: qualitative reasoning goes to the AI, quantitative enforcement goes to traditional code

The operating system is optimized for a user that isn't there anymore. Operating systems were built for humans. What happens when the user is an AI? I built a robot car to find out. ...
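The "qualitative to the AI, quantitative to code" split could look roughly like this in practice. This is a minimal sketch, not the post's actual implementation; every name, speed limit, and distance threshold here is a hypothetical placeholder:

```python
# Hypothetical guardrail layer: the LLM proposes a motor command,
# plain code enforces hard numeric limits before it reaches the motors.
# All constants below are illustrative, not from the post.

MAX_SPEED = 0.5        # m/s, hard cap enforced in code, never by the LLM
STOP_DISTANCE_CM = 20  # ultrasonic cutoff for an emergency stop

def enforce_limits(proposed_speed: float, proposed_turn: float,
                   ultrasonic_cm: float) -> tuple[float, float]:
    """Clamp whatever the LLM asked for to safe, quantitative bounds."""
    if ultrasonic_cm < STOP_DISTANCE_CM:
        return 0.0, 0.0  # obstacle too close: override the LLM and stop
    speed = max(-MAX_SPEED, min(MAX_SPEED, proposed_speed))
    turn = max(-1.0, min(1.0, proposed_turn))
    return speed, turn

# The LLM says "go fast forward" while a wall is 15 cm away:
print(enforce_limits(2.0, 0.3, ultrasonic_cm=15))  # (0.0, 0.0)
print(enforce_limits(2.0, 0.3, ultrasonic_cm=80))  # (0.5, 0.3)
```

The point of the pattern is that the LLM never holds safety-critical numbers in its head: it reasons about what to do, and a few lines of deterministic code decide how far that is allowed to go.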

March 26, 2026 · 10 min · Brian Hengen

Fine-Tuning Qwen Models: From Theory to Practice

Imagine an AI that coordinates your entire cooking process—faster, smarter, and without ChatGPT’s API costs. With my RTX 5090 workstation humming, I’m answering: Can a specialized language model outcook ChatGPT in the kitchen? Over the past few days, I’ve been fine-tuning Qwen’s 32B and 14B parameter models to create ChefBot, an experimental Specialized Language Model (SLM). Think ChatGPT for recipes, but tuned for cooking data and running without per-token API costs. ...

September 29, 2025 · 3 min · Brian Hengen
