<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Gemma on Brian Hengen's AI Experiments</title><link>https://brianhengen.us/tags/gemma/</link><description>Recent content in Gemma on Brian Hengen's AI Experiments</description><generator>Hugo -- 0.150.0</generator><language>en-us</language><lastBuildDate>Sat, 25 Apr 2026 16:34:49 -0400</lastBuildDate><atom:link href="https://brianhengen.us/tags/gemma/index.xml" rel="self" type="application/rss+xml"/><item><title>Robot Car: Origin Story (How Our Little Hero Got His Wheels)</title><link>https://brianhengen.us/posts/02-robot-car-origin-story/</link><pubDate>Sat, 25 Apr 2026 00:00:00 +0000</pubDate><guid>https://brianhengen.us/posts/02-robot-car-origin-story/</guid><description>I have a working robot car in my office. I&amp;#39;ve never built one. I didn&amp;#39;t read the manual. Here&amp;#39;s what the build looked like when the AI had the knowledge and I had the hands.</description><content:encoded><![CDATA[<p><em>There&rsquo;s a working robot car sitting in my office. I&rsquo;ve never built one, and I didn&rsquo;t read the manual. Here&rsquo;s what the build actually looked like when the AI had the knowledge and I had the hands.</em></p>

<div class="tldr-box">
    <div class="tldr-header">
        <span class="tldr-icon">⚡</span>
        <span class="tldr-title">TL;DR</span>
    </div>
    <div class="tldr-content">
        <ul>
<li>I built the physical robot car over two sessions — chassis assembly, motor wiring, first movement</li>
<li>I have zero robotics experience. I didn&rsquo;t read any manuals. Claude Desktop &amp; Claude Code spec&rsquo;d the parts, guided the build, and caught the problems.</li>
<li>Fried two L298N motor controllers learning which terminal is an input and which is an output. Saw a spark. Smelled the smoke.</li>
<li>The PM/dev-team metaphor from post #1 holds — but the collaboration is richer than &ldquo;I direct, you do.&rdquo; The people who&rsquo;ll thrive with AI aren&rsquo;t necessarily the deepest technical experts. They&rsquo;re the curious and creative ones.</li>
<li>Also: we&rsquo;ve already cycled through three vision models (Llama → Qwen → Gemma) and three Claudes (Opus 4.5 → 4.6 → 4.7) in three months. Don&rsquo;t get locked in.</li>
</ul>

    </div>
</div>

<hr>
<h2 id="where-we-left-off">Where We Left Off</h2>
<p>In the <a href="https://brianhengen.us/posts/01-the-prompt-is-the-program/">first post</a>, I made the case that operating systems built for humans aren&rsquo;t optimized for AI users, and I described AIOS:ThinkTank — a robot car driven entirely by a local language model — as the experiment testing that idea.</p>
<p>What I glossed over: how the car actually got built.</p>
<p>I&rsquo;m not a roboticist. I&rsquo;ve never assembled a chassis, wired a motor controller, or flashed a Raspberry Pi for a hardware project. Before this build, I couldn&rsquo;t have told you what an L298N was. I did not, at any point, read a manual or even watch YouTube videos about doing a build like this.</p>
<p>There&rsquo;s now a working robot car in my office. If AIOS:ThinkTank ends up being the little hero of a bigger story — the one where we figure out what an AI-native operating system actually looks like — this is his origin story.</p>
<p>That gap — between &ldquo;never done this&rdquo; and &ldquo;working hardware&rdquo; — is what this post is about.</p>
<hr>
<h2 id="naming-the-thing">Naming the Thing</h2>
<p>Before touching hardware, we needed a name. After some back and forth with Claude, we landed on <strong>AIOS:ThinkTank</strong>. AIOS for the thesis (AI Operating System). ThinkTank for the vehicle — it thinks, it&rsquo;s vaguely tank-like with its mecanum wheels. The colon gives it a namespace feel, in case there&rsquo;s ever an AIOS:Drone or AIOS:Arm.</p>
<p>Then came a small decision that felt bigger than it was. Setting up the Pi, I needed a username and password. The whole point of this project is that the AI is the user of this machine, not me. So I let Claude pick:</p>
<ul>
<li><strong>Username:</strong> <code>aios</code></li>
<li><strong>Password:</strong> <code>IAmTheOperatingSystem</code></li>
</ul>
<p>A small thing. But every time I log into that Pi, I&rsquo;m reminded the machine isn&rsquo;t meant for me.</p>
<p>GitHub repo went up the same session: <a href="https://github.com/bjhengen/aios-thinktank">github.com/bjhengen/aios-thinktank</a>. Public from day one. If I was going to test a thesis, I figured I should do it in the open.</p>
<hr>
<h2 id="the-voltage-problem">The Voltage Problem</h2>
<p>The chassis kit arrived. Metal frame, four TT motors, four mecanum wheels, a battery holder, some L298N motor driver boards. Ten minutes into unboxing, Claude asked me to read off the specs on the motors and the battery pack.</p>
<ul>
<li>Battery pack: 6× AA = <strong>9V nominal</strong></li>
<li>TT motors: rated <strong>3–6V</strong></li>
</ul>
<p>I would not have caught that. I would have assembled everything, connected the battery, and burned out four motors in approximately one second.</p>
<p><img alt="Parts laid out, early build" loading="lazy" src="/posts/02-robot-car-origin-story/20260111_150120.jpg"></p>
<p>Claude caught it from a photo and a spec sheet. That became the pattern: when I hit a problem, I didn&rsquo;t Google it or pore through manuals — I&rsquo;d take a picture from my phone, sometimes three, and upload them to Claude. It&rsquo;d review, think for a beat, and come back with next steps. The build had its own debugging loop.</p>
<p>The fix was two-part: cap the PWM duty cycle at 80% in software (9V × 80% ≈ 7.2V, and the L298N&rsquo;s ~1.4V drop brings that down near the motors&rsquo; 6V rating), and order a 4× AA battery holder for a clean hardware fix later. Five dollars and a software cap saved the build.</p>
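<p>In code terms, the cap is just arithmetic on the duty cycle. A minimal sketch of that math (the names and the ~1.4V drop figure are illustrative, not pulled from the project&rsquo;s actual code):</p>

```python
# Sketch of the software duty-cycle cap. Constants are assumptions:
# a 6x AA pack at 9V nominal and the L298N's typical ~1.4V drop.
MAX_DUTY = 80          # percent: hard software ceiling
PACK_VOLTAGE = 9.0     # volts, 6x AA nominal
L298N_DROP = 1.4       # volts lost across the H-bridge (datasheet typical)

def effective_motor_voltage(duty_percent: float) -> float:
    """Approximate average voltage the motors actually see."""
    duty = min(duty_percent, MAX_DUTY) / 100.0
    return PACK_VOLTAGE * duty - L298N_DROP
```

<p>Even a &ldquo;full throttle&rdquo; request gets clamped to 80%, so the motors see roughly 9.0 × 0.8 − 1.4 ≈ 5.8V, just under their 6V rating.</p>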
<p>The part that struck me: the kit <em>shipped</em> with a 6-cell holder and 6V motors. Someone at the manufacturer didn&rsquo;t think that through. And if I&rsquo;d been building alone, I wouldn&rsquo;t have either.</p>
<hr>
<h2 id="the-missing-wires">The Missing Wires</h2>
<p>Next snag, same session. The motors had bare solder tabs. No wires. The chassis kit apparently assumes you&rsquo;ll solder your own leads on.</p>
<p>I didn&rsquo;t own a soldering iron. I have jumper wires for pin headers, but nothing that&rsquo;ll grip a flat solder tab.</p>
<p>Claude&rsquo;s read on it was pragmatic: don&rsquo;t buy specialty connectors and a soldering kit for one task. For $7, you can get replacement TT motors with wires pre-attached, same form factor. Swap them in when they arrive and skip the problem.</p>
<p>That&rsquo;s the call I would have made too — <em>if I&rsquo;d known the option existed</em>. Instead I would have spent an afternoon researching solder alternatives and wired the whole thing worse than the $7 replacement would have come out of the box.</p>
<p>Session one ended with a GitHub repo, a Pi booted and talking to WiFi, a camera tested, a chassis assembled, and two parts ordered to fix problems we&rsquo;d caught before they mattered. The boring parts, done right.</p>
<hr>
<h2 id="the-painful-lesson">The Painful Lesson</h2>
<p>Session two was supposed to be the easy one. Motors arrive, battery holder arrives, wire everything up, spin the wheels.</p>
<p>The L298N H-Bridge motor driver has three power-related terminals: <strong>12V</strong> (input, for motor power), <strong>GND</strong> (common ground), and <strong>5V</strong> (a <em>regulated output</em> from the onboard regulator, meant for powering downstream devices like an Arduino).</p>
<p>I connected the battery positive to the 5V terminal.</p>
<p><img alt="The crime scene" loading="lazy" src="/posts/02-robot-car-origin-story/20260124_163439.jpg"></p>
<p>What happened next, in order:</p>
<ol>
<li>Spark.</li>
<li>Small electric shock through the wire I was still holding.</li>
<li>The board&rsquo;s LED blinked once, bright, and died.</li>
<li>Faint smell of burnt electronics.</li>
</ol>
<p>The 5V terminal is an <em>output</em>. Feeding external power into it backfeeds the voltage regulator and fries it instantly. I had, in a single motion, learned what that terminal was for and also made it stop being for anything.</p>
<p>I checked with Claude. Yep — battery positive goes to <strong>12V</strong>, not 5V. The board was permanently dead.</p>
<p>Fortunately, the kit came with two L298Ns and we only needed two. So I wired up the second one.</p>
<p>I also connected <em>that one</em> wrong, the same way.</p>
<p>Two boards. Same mistake. Two minutes apart. Thankfully, Amazon Prime exists.</p>
<p>Claude&rsquo;s feedback, in retrospect, was correct. It had told me &ldquo;12V terminal, not 5V&rdquo; both times. Both times I looked at the board, saw the labels close together, and connected them anyway. The AI had the knowledge. I had the hands. The hands made a decision the knowledge disagreed with.</p>
<p>Which brings me to the thing I keep thinking about.</p>
<hr>
<h2 id="whos-the-pm-now">Who&rsquo;s the PM Now?</h2>
<p>In the first post, I described working with Claude Code as being the PM of an AI dev team. I define requirements, review output, iterate. Claude writes the code.</p>
<p>That metaphor still fits — but the build modifies it in a way that&rsquo;s worth calling out.</p>
<p>I spent years as a PM for complex enterprise software. I wrote product feature guides, defined requirements, worked through architecture trade-offs with engineering. I didn&rsquo;t write the code. I didn&rsquo;t need to. I knew what the product needed to do and why, and a team of engineers knew how to make the product do it. We shipped a lot of good software that way.</p>
<p>This project is that same division of labor, but stretched further than it used to go. For the software, I define what the car needs to do and Claude implements it. For the hardware build, Claude knew what an L298N was and I didn&rsquo;t, so Claude directed and I was the hands. In both cases, I&rsquo;m bringing the <em>what</em> and the <em>why</em>; the collaboration brings the <em>how.</em></p>
<p>What&rsquo;s different now is how far that pattern reaches. Even just a year or two ago, a PM with an idea for a robot car would have needed to hire a team, fund it, manage it, wait months. I described a thing I wanted to try, and eight weeks later I had a working prototype sitting on my desk. The barrier to &ldquo;I have an idea, let me try it&rdquo; has collapsed.</p>
<p>Which is why I keep telling people: the ones who are going to thrive as AI gets more capable aren&rsquo;t necessarily the deepest technical experts. They&rsquo;re the curious ones. The ones who poke at things, ask what else is possible, try the experiment they couldn&rsquo;t have run six months ago. Technical depth still matters. But creative range — the ability to see a problem, imagine a solution, and actually try it — is the thing that scales now in a way it never did before. This applies to software and hardware projects, but the pattern holds well beyond them. Artists, musicians, writers, designers — all now have tools that extend creative range in ways never before possible.</p>
<p>I started out with no real knowledge about how to build a robot car. I had an idea and a reason to try. That turns out to be enough.</p>
<p>(To be fair: the project was my idea. I&rsquo;m holding on to that one.)</p>
<hr>
<h2 id="dont-get-married-to-a-model">Don&rsquo;t Get Married to a Model</h2>
<p>One more thing worth saying, because it&rsquo;s come up in a lot of customer conversations recently.</p>
<p>When I started this project, the vision model was Llama 3.2 Vision. We moved to Qwen2-VL, then Qwen3.5-27B. We benchmarked that against Qwen3.6-35B (Q5_K_M), and we&rsquo;re currently running <strong>Gemma 4 26B A4B (Q6_K)</strong> for the driving loop. Now that Qwen3.6-27B (dense) has shipped, I&rsquo;ll be testing that next. That&rsquo;s three vision model families in three months, plus a handful of serious bakeoffs between them — each swap justified by actual evaluation, not because the new one was trending.</p>
<p>On the dev-team side, the Claude model has cycled too: Opus 4.5 when we started, then 4.6, now 4.7. Same project, different models, noticeable step-ups each time.</p>
<p>The pattern matters. If we&rsquo;d locked our architecture around Llama 3.2 Vision — built all our prompts, our sanitizers, our telemetry around its specific quirks — the switch to Qwen would have been a rebuild instead of a swap. Same on the Claude side. The interfaces stay stable; the models behind them are swappable.</p>
<p>The lesson I&rsquo;ve been giving customers is the same one this project keeps teaching me: <strong>don&rsquo;t chase the hype, and don&rsquo;t get married to an LLM or even a model family.</strong> The best model today is not the best model in six months. Build systems where the model is a <em>component</em>, not the foundation. Build an evaluation harness so you can actually tell when a swap is an upgrade and when it&rsquo;s just new. Be ready to test and be ready to move.</p>
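<p>As a sketch of what &ldquo;model as a component&rdquo; can look like in practice (illustrative names only, not the repo&rsquo;s actual interface): keep the driving loop talking to a narrow protocol, and let each model live behind its own adapter.</p>

```python
# Illustrative sketch: the driving loop depends on a small protocol,
# so swapping Llama -> Qwen -> Gemma means writing one new adapter.
# All names here are hypothetical, not the project's real code.
from typing import Protocol

class VisionModel(Protocol):
    name: str
    def describe(self, jpeg_bytes: bytes) -> str: ...

class GemmaBackend:
    """One concrete adapter; a real one would call the inference server."""
    name = "gemma"
    def describe(self, jpeg_bytes: bytes) -> str:
        return f"[{self.name}] saw {len(jpeg_bytes)} bytes"

def driving_step(model: VisionModel, frame: bytes) -> str:
    # The loop only knows the protocol; any backend slots in.
    return model.describe(frame)
```

<p>The prompts, sanitizers, and telemetry live behind the adapter boundary, so an evaluation harness can run the same frames through two backends and compare.</p>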
<p>A few months ago, Llama 3.2 Vision was the obvious choice. Today it&rsquo;s Gemma 4. In a week, it may be something else. The hardware will still work. The interfaces will still hold. That&rsquo;s the point.</p>
<hr>
<h2 id="first-movement">First Movement</h2>
<p>Fresh L298N boards wired correctly. Motors swapped in. Battery holder cycled to 4× AA. Pi booted, SSH&rsquo;d in from the workstation, dependencies installed (<code>python3-picamera2</code>, <code>python3-rpi.gpio</code>, a stack of stuff <code>apt</code> pulled in behind them).</p>
<p>The motor test:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">ssh thinktank <span class="s2">&#34;cd ~/robotcar &amp;&amp; python3 -m pi.car_hardware --test-motors&#34;</span>
</span></span></code></pre></div><div class="highlight"><pre tabindex="0" class="chroma"><code class="language-fallback" data-lang="fallback"><span class="line"><span class="cl">2026-01-24 18:40:27 - pi.motor_controller - INFO - Starting motor test sequence...
</span></span><span class="line"><span class="cl">2026-01-24 18:40:27 - pi.motor_controller - INFO - Testing: Forward
</span></span><span class="line"><span class="cl">2026-01-24 18:40:28 - pi.motor_controller - WARNING - EMERGENCY STOP
</span></span><span class="line"><span class="cl">2026-01-24 18:40:28 - pi.motor_controller - INFO - Testing: Backward
</span></span><span class="line"><span class="cl">2026-01-24 18:40:29 - pi.motor_controller - WARNING - EMERGENCY STOP
</span></span><span class="line"><span class="cl">2026-01-24 18:40:30 - pi.motor_controller - INFO - Testing: Rotate Left
</span></span><span class="line"><span class="cl">2026-01-24 18:40:31 - pi.motor_controller - WARNING - EMERGENCY STOP
</span></span><span class="line"><span class="cl">2026-01-24 18:40:31 - pi.motor_controller - INFO - Testing: Rotate Right
</span></span><span class="line"><span class="cl">2026-01-24 18:40:32 - pi.motor_controller - WARNING - EMERGENCY STOP
</span></span><span class="line"><span class="cl">2026-01-24 18:40:33 - pi.motor_controller - INFO - Motor test complete
</span></span></code></pre></div><p>All four wheels spinning. Forward, backward, rotate left, rotate right. Every &ldquo;EMERGENCY STOP&rdquo; in that log is the safety timer doing exactly what it&rsquo;s supposed to — cutting power between test moves so the car doesn&rsquo;t launch itself off the desk.</p>
<p>Then the camera test, 640×480 at 10 frames per second, JPEG sizes between 12 and 29KB. Real captures, not simulated data. The AI would have eyes.</p>
<p>There&rsquo;s something slightly ridiculous about watching four wheels spin for the first time. They&rsquo;re just motors responding to GPIO signals. Nothing fancy. But it meant the wiring was right, the code worked, and the thing I&rsquo;d been describing in prompts for weeks was now a physical object that moved when told to move.</p>
<hr>
<h2 id="whats-next">What&rsquo;s Next</h2>
<p>The hardware layer is done. The Pi captures frames. The motors respond to commands. SSH from the workstation works without a password. Everything underneath the AI is in place.</p>
<p>The next post is the one I&rsquo;ve been waiting to write: the first time the AI actually drove the car. Camera frames went up to the server. Motor commands came back. The car moved because a language model looked at a picture and decided it should.</p>
<p>But before any of that could happen, I had to build the thing. With no experience. Without reading the manual. Guided by an AI that knew more about motor controllers than I did, which turned out to be a surprisingly comfortable way to build hardware.</p>
<p>Working robot car. Zero robotics background. Two dead motor controllers. One good story. And a little hero now sitting on my desk, ready for the next chapter.</p>
<p>On to the drive.</p>
<hr>
<p><em>This is part of the AIOS:ThinkTank build series. Follow along on <a href="https://github.com/bjhengen/aios-thinktank">GitHub</a> or connect on <a href="https://linkedin.com/in/brian-hengen">LinkedIn</a> for shorter-form updates and video clips.</em></p>
<h2 id="about-the-author">About the Author</h2>
<p>Brian Hengen is a Vice President at Oracle, leading technical sales engineering teams. The views and opinions expressed in this blog are his own and do not necessarily reflect those of Oracle.</p>

<div class="author-card">
    <div class="author-avatar">
        <img src="/images/profile.jpg" alt="Brian Hengen">
    </div>
    <div class="author-info">
        <div class="author-name">Brian Hengen</div>
        <div class="author-title">VP at Oracle • AI Researcher • Builder</div>
        <div class="author-bio">
            Building and testing specialized language models on an RTX 5090. 
            Exploring what happens when smaller, focused AI beats larger general-purpose models.
        </div>
        <div class="author-links">
            <a href="https://github.com/bjhengen" target="_blank" rel="noopener">GitHub</a>
            <a href="https://linkedin.com/in/brian-hengen" target="_blank" rel="noopener">LinkedIn</a>
            <a href="https://x.com/bhengen" target="_blank" rel="noopener">X/Twitter</a>
        </div>
    </div>
</div>

]]></content:encoded></item></channel></rss>