Categories
Raspberry Pi Tutorials

Using Linux to create a Cellular WAN Gateway

Today our home Internet went out. It rarely happens, but when it does it’s typically brief – maybe a couple hours max, and I can hobble along with my Mac connected directly to my phone until it comes back. Today’s outage was a lot longer, though, and after a few hours I really wanted to get the entire house back online so all of my Internet-dependent devices could get back to normal.

My hope initially was that my MikroTik router would allow me to connect up my iPhone via USB and bridge the hotspot to the rest of my network, but I couldn’t make that work. The router identified that the phone was connected, but it wasn’t treating it like a USB Ethernet interface. I could’ve devoted one of its radios to connect wirelessly to the phone, but I really didn’t want to mess with my router’s configuration knowing that this would be a temporary setup.

Using Linux to do this (in my case a Raspberry Pi) was super simple. Here’s what I had to do to share the iPhone’s Internet connection with my entire network. Afterwards everything worked as it should – wireless and wired devices could all reach the Internet.

On the phone

Specs: iPhone 15 Pro on AT&T

Just make sure Personal Hotspot is enabled (you need a cellular plan that allows tethering), and connect it to a USB port on the Pi. You should get a prompt on both devices to trust the other one. This is likely the same experience if you have an Android device.

On the RPi

Specs: RPi 5 running Bookworm + Gnome

With the iPhone connected, tethering was already working! It showed up as an interface named eth1 and traffic was successfully routed. Kudos to the folks who made this so easy to get going.

The next step was to turn the RPi into a gateway that we could plug into our router’s WAN Ethernet port.

Here are some assumptions I’ll use for the commands below:

  • eth0 is the Raspberry Pi’s Ethernet interface
  • eth0 will have an IP of 192.168.88.1/24
  • eth0 will serve a DHCP range of 192.168.88.10 - 192.168.88.100
  • eth1 is the iPhone’s USB Ethernet interface

With root permissions we need to do the following:

  1. Enable packet forwarding:
    • sysctl -w net.ipv4.ip_forward=1
  2. Set eth0’s IP to 192.168.88.1 with a netmask of 255.255.255.0.
    • You can use the GUI for this, or NetworkManager (nmcli), or /etc/dhcpcd.conf on older releases – whatever’s best for your distro.
  3. Install dnsmasq:
    • apt install dnsmasq
  4. Configure the DHCP range for eth0 by editing /etc/dnsmasq.conf:
    • interface=eth0
      dhcp-range=192.168.88.10,192.168.88.100,255.255.255.0,24h
  5. Restart dnsmasq:
    • systemctl restart dnsmasq
  6. Set some iptables rules:
    • iptables -t nat -A POSTROUTING -o eth1 -j MASQUERADE
    • iptables -A FORWARD -i eth0 -o eth1 -j ACCEPT
    • iptables -A FORWARD -i eth1 -o eth0 -m state --state RELATED,ESTABLISHED -j ACCEPT
  7. Confirm iptables look good:
    • iptables -t nat -L -n -v

Note: The packet-forwarding sysctl and these iptables rules won’t survive a reboot, so you’ll want to persist them (e.g. via /etc/sysctl.conf and iptables-save) if you need that. Also keep in mind you’ll likely want to disable dnsmasq and revert eth0’s network configuration when things get back to normal.
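Before editing anything, it can help to sanity-check the addressing plan from the assumptions above. A quick sketch using Python’s stdlib ipaddress module (the helper here is purely illustrative, not part of the setup):

```python
import ipaddress

# The addressing plan from the assumptions above
gateway = ipaddress.ip_interface("192.168.88.1/24")
dhcp_start = ipaddress.ip_address("192.168.88.10")
dhcp_end = ipaddress.ip_address("192.168.88.100")

def pool_is_sane(gw, start, end):
    """True if the DHCP pool sits inside the gateway's subnet,
    is ordered correctly, and doesn't contain the gateway itself."""
    net = gw.network
    return (start in net and end in net
            and start <= end
            and not (start <= gw.ip <= end))

assert pool_is_sane(gateway, dhcp_start, dhcp_end)
print(int(dhcp_end) - int(dhcp_start) + 1, "addresses in the pool")  # → 91 addresses in the pool
```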

On the router

If your router is configured to get its WAN IP via DHCP there’s nothing to do here – you just need to make sure that the Raspberry Pi’s Ethernet cable is connected to the appropriate WAN port.

If your router isn’t configured to use DHCP, just give it an IP in the range above, a gateway of 192.168.88.1, and some standard DNS servers like 8.8.8.8 and 8.8.4.4.

That’s it

Everything wired or wireless on the network just works – it doesn’t know (or care) where the WAN connection is coming from. A great solution for those days when you need a failover! 🛜

Categories
Coding Video

STEM Day 2026

Dash robot

I had the privilege of talking to first graders at the local elementary school about coding and robots for STEM Day! This was extra special because one of them happens to be my kiddo. 😀 I used Dash robots (graciously provided by the engineering teacher) to demonstrate how we communicate with computers and instruct them – one command at a time – to accomplish tasks.

Dash drove around, played a xylophone, drew on paper, played sounds, catapulted a ball, and more. The Wonder app’s drag-and-drop programming interface was a great fit for this age range, and I think they had as much fun as I did.

Dash playing the Xylophone
Dash drawing

Categories
Arduino Projects

Using a NeoTrellis to monitor CI build states

I’ve had an Adafruit NeoTrellis M4 Express for a while and it’s a lot of fun to play with. It’s got NeoPixel illuminated buttons, audio capabilities, a powerful SAMD51 CPU, and a lot of potential.

I’m on the Apps Infrastructure team at Automattic, so I thought it would be cool if I could make it represent the states of our Buildkite builds on the board. Green would mean the build passed, red would mean failed, pulsating yellow if running, etc. Pressing the button would open the build directly in my browser.
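The state-to-color mapping itself is simple. Here’s a hypothetical Python sketch of the idea (the project’s actual code lives in the linked repo):

```python
# Hypothetical mapping from Buildkite build states to NeoPixel colors.
# The real implementation is in the repo; this just sketches the idea.
STATE_COLORS = {
    "passed":  (0, 255, 0),    # solid green
    "failed":  (255, 0, 0),    # solid red
    "running": (255, 255, 0),  # yellow, pulsed in practice
}

def color_for(state: str) -> tuple:
    # Fall back to a dim blue for states we don't recognize
    return STATE_COLORS.get(state, (0, 0, 64))

print(color_for("failed"))  # → (255, 0, 0)
```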

So that’s exactly what was built! Thanks to AI tooling I was also able to add some goodies like simulators for the Trellis as well as incoming Buildkite data.

Builds can be filtered in a menu bar app by queue or pipeline, as well as states (e.g. just show me failing builds).

Check out the repo here!

Categories
AI Gaming Projects

DarwinDOOM: A weekend hack boosted by AI

Years ago I slapped together a way to run DOOM on Apple devices by wrapping doomgeneric with SpriteKit. The task was relatively straightforward, and I shared some technical details in the original post. What was most exciting to me about the weekend project was witnessing a game that took every bit of my ’90s home computer now running effortlessly on my wrist. It would’ve been even cooler if it had sound on the Watch, but I learned that wasn’t going to come easily and needed more of my time and attention than I had to spare. A common theme with having so many side projects.

Then AI coding tools arrived and they’re the side project game changer. For this particular case I wasn’t in it for the academic exercise – I just wanted to point and get what I wanted, so I did just that.

Presenting DarwinDOOM, formerly known as AppleGenericDoom!

Now with:

  • Sound on the Apple Watch
  • Keyboard and touch controls
  • A signed and notarized Mac app that you can run instantly
  • The ability to run DOOM in your macOS Dock
  • The ability to control DOOM Guy’s face with your face via your camera
  • An iOS / iPadOS app that also supports a connected keyboard
  • MIDI music (except on the Watch, for now)
  • All Apple frameworks, no SDL dependencies
  • Nifty icons

and more!

What a time for software and side projects. 😎

Categories
Gadgets

Digital microscope fun

For ~$50 you can find digital “microscopes” on Amazon that save images to SD cards. Here’s an iPhone OLED screen displaying the letter “T”.

All the colors you see on your phone, computer, TV, etc. are just some additive combo of red, green, and blue. In these newer “diamond” sub-pixel layouts the green sub-pixels are physically smaller – that works because our eyes pick up green the best (it’s in the middle of the visible electromagnetic spectrum). ROYGBIV 4lyfe. 🌈
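One way to put a number on how much green dominates perceived brightness: the standard Rec. 709 luma coefficients weight green far above red and blue.

```python
# Rec. 709 luma coefficients: perceived brightness is a weighted sum
# of R, G, and B, and green carries by far the most weight.
LUMA = {"r": 0.2126, "g": 0.7152, "b": 0.0722}

def relative_luminance(r, g, b):
    """Approximate perceived brightness of a linear RGB triple in [0, 1]."""
    return LUMA["r"] * r + LUMA["g"] * g + LUMA["b"] * b

# Pure green looks brighter than pure red and pure blue combined
print(relative_luminance(0, 1, 0) > relative_luminance(1, 0, 1))  # → True
```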

Categories
AI SwiftUI

ChatGPT SwiftUI Christmas Tree 2025

We’re back again for 2025! Here’s the familiar prompt supplied to ChatGPT now running the 5.1 model:

Draw a Christmas tree using SwiftUI

Here’s the code it produced:
import SwiftUI

struct ChristmasTreeView: View {
    var body: some View {
        ZStack {
            Color.black.opacity(0.9)
                .ignoresSafeArea()

            VStack(spacing: 0) {
                // Star
                StarShape(points: 5)
                    .fill(Color.yellow)
                    .frame(width: 40, height: 40)
                    .shadow(radius: 5)
                    .padding(.bottom, 8)

                // Tree (3 triangles)
                TreeLayer(color: .green, scale: 1.0)
                TreeLayer(color: .green, scale: 1.2)
                TreeLayer(color: .green, scale: 1.4)

                // Trunk
                Rectangle()
                    .fill(Color.brown)
                    .frame(width: 30, height: 50)
                    .padding(.top, -10)
            }
            .overlay(OrnamentsView())
        }
    }
}

// MARK: - Tree layer (triangle)

struct TreeLayer: View {
    let color: Color
    let scale: CGFloat

    var body: some View {
        Triangle()
            .fill(
                LinearGradient(
                    gradient: Gradient(colors: [color.opacity(0.8), color]),
                    startPoint: .top,
                    endPoint: .bottom
                )
            )
            .frame(width: 160 * scale, height: 100 * scale)
            .shadow(radius: 5)
            .offset(y: -20 * (scale - 1))  // stack layers nicely
    }
}

struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: rect.midX, y: rect.minY))      // top
        path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))   // bottom left
        path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))   // bottom right
        path.closeSubpath()
        return path
    }
}

// MARK: - Star shape

struct StarShape: Shape {
    var points: Int

    func path(in rect: CGRect) -> Path {
        guard points >= 2 else { return Path() }

        let center = CGPoint(x: rect.midX, y: rect.midY)
        let radius = min(rect.width, rect.height) / 2
        let innerRadius = radius * 0.4

        var path = Path()
        let angleStep = .pi * 2 / Double(points * 2)

        var angle = -Double.pi / 2 // start pointing up
        var firstPoint = true

        for i in 0 ..< points * 2 {
            let r = (i % 2 == 0) ? radius : innerRadius
            let x = center.x + CGFloat(cos(angle)) * r
            let y = center.y + CGFloat(sin(angle)) * r

            if firstPoint {
                path.move(to: CGPoint(x: x, y: y))
                firstPoint = false
            } else {
                path.addLine(to: CGPoint(x: x, y: y))
            }

            angle += angleStep
        }

        path.closeSubpath()
        return path
    }
}

// MARK: - Ornaments overlay

struct OrnamentsView: View {
    var body: some View {
        ZStack {
            ornament(x: -50, y: -40, color: .red)
            ornament(x: 40,  y: -20, color: .yellow)
            ornament(x: -30, y: 10,  color: .blue)
            ornament(x: 50,  y: 30,  color: .pink)
            ornament(x: 0,   y: -5,  color: .orange)
        }
    }

    private func ornament(x: CGFloat, y: CGFloat, color: Color) -> some View {
        Circle()
            .fill(color)
            .frame(width: 14, height: 14)
            .shadow(radius: 3)
            .offset(x: x, y: y)
    }
}

// MARK: - Preview

struct ChristmasTreeView_Previews: PreviewProvider {
    static var previews: some View {
        ChristmasTreeView()
    }
}

Have we already hit peak tree!? Was it decorated as lazily as possible because the model knows how I’d do it?

Visually things are similar, but maybe a little worse:

  • The tree’s body looks more like three stacked triangles than one contiguous shape.
  • Ornament distribution is clustered to the middle with one ornament attached to air.
  • We still have a floating star.
  • It tried to get fancy with some opacity and shadow tricks, but they fall a bit flat.
  • Once again we have a new background color.

That said, this isn’t too terrible knowing that I didn’t give the model the opportunity to analyze its own visual output and retry.

Code-wise:

  • ✅ It extracted more reusable views so the main body view is easier to understand.
  • ✅ Made good use of MARK annotations to increase readability.
  • ✅ Removed some global “magic numbers” (e.g. offsets) to enable adaptability to other canvas sizes.
  • 🤔 Made the generation of the StarShape path a lot more complicated. It traded simplicity for “flexibility” with ultimately the same output.

Happy Holidays! 🎄🤖

Categories
AI Arduino Projects Video

NeoPixel Christmas Tree 2025

I’ve hit the ground running this holiday season with two updates (so far!) to the NeoPixel Christmas Tree project.

Vibe coded goodies

I’ve been using an LLM to add some experimental goodies to the project such as:

  • A Python script that analyzes WAV files to synchronize light sequences to the beat
  • A local model rendered in Three.js to enable rapid prototyping

https://github.com/twstokes/neopixel-tree/pull/15
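The actual analysis script lives in the PR above. As a toy illustration of the core idea – peak-picking on short-window RMS energy to find onsets – something like this (a simplified sketch, not the project’s code):

```python
import math

def beat_times(samples, rate, window=1024, threshold=2.0):
    """Crude onset detector: flag a window as a 'beat' when its RMS
    energy jumps past `threshold` times a running average."""
    times, avg = [], 1e-9
    for i in range(0, len(samples) - window, window):
        chunk = samples[i:i + window]
        rms = math.sqrt(sum(s * s for s in chunk) / window)
        if rms > threshold * avg:
            times.append(i / rate)
        avg = 0.9 * avg + 0.1 * rms  # slow-moving energy baseline
    return times

# Two loud bursts in otherwise silent audio, at 0.2 s and 0.6 s
sig = [0.0] * 1000
sig[200:300] = [1.0] * 100
sig[600:700] = [1.0] * 100
print(beat_times(sig, rate=1000, window=100))  # → [0.2, 0.6]
```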

ESP32 upgrade

At the end of 2024 I experienced stability issues when updating many LEDs rapidly, e.g. a looping rainbow effect with a really low delay set. After some indeterminate amount of time the tree would crash and restart itself. I initially assumed it was bad C code on my part, but failed to spot anything and eventually put the tree away sometime in January of 2025 because it just got awkward. Now that we’re back to a socially acceptable time to have a Christmas tree out I’ve had a chance to do some deeper digging.

I added a new command to have the ESP return the reason it reset, which was really helpful in confirming my suspicion that the ESP8266 just didn’t have enough oomph to drive pixels and run WiFi concurrently:

Fatal exception:4 flag:1 (Hardware Watchdog) epc1:0x40103341 epc2:0x00000000 epc3:0x00000000 excvaddr:0x00000000 depc:0x00000000

Long-ish story short I upgraded to an ESP32 with two cores (note: number of cores is model-dependent) and this provides the necessary headroom to bit bang LEDs and run WiFi stably. It required some re-soldering of the Perma-Proto board (since the pin layouts are different) as well as some code changes, but overall it was a relatively quick and straightforward upgrade. The tree hasn’t crashed since!

https://github.com/twstokes/neopixel-tree/pull/14

Categories
AI Fun Linux Nerd Sniped Projects

Snooping USB with GPT

I mentioned in my previous post about the Juicy Crumb DockLite G4 that there were two shortcomings with the product:

  1. Their software to control the brightness of the display was macOS only.
  2. The display doesn’t support sleep, so you’re required to press a physical button to turn off the backlight.

Thanks to some USB packet snooping and GPT, both of these problems were solved quickly. 😎

I wanted Linux software control

What started this exploration was that I wanted to control the display from Linux because it’s currently connected to a Raspberry Pi, not a Mac.

Step 0: Check for an open source repo

Negative. I didn’t spot anything, so it’ll take a little more grunt work. I did run their helper app through a disassembler and spotted functions for getting and setting the backlight via IOKit, which were all expected. It wasn’t necessary to go this far; it just confirmed that we had some USB packet snooping to do.

Step 1: Get lsusb output

Running lsusb with the display connected to the RPi showed:

Bus 001 Device 009: ID 4a43:1d8a Juicy Crumb Systems DockLite

So I grabbed the output of lsusb -v -d 4a43:1d8a to feed it to the LLM later to provide more details about its USB specs.

Step 2: Capture USB packets on the Mac

The double-edged sword of Apple’s security is that it introduces all sorts of challenges for devs who need to get to lower level functionality, like capturing raw data from interfaces.

To successfully capture USB packets on a Mac I had to:

  1. Disable SIP
  2. Bring up the USB controller interface (in my case there were a couple, so through trial-and-error I found the right one)
  3. Tell Wireshark to capture on that interface
  4. Change the brightness using the Juicy-Link app to generate packets
  5. Save the capture to a file

Step 3: Feed the LLM

With the data from steps 1 and 2 I:

  • Fed them to ChatGPT
  • Mentioned that the capture was from a utility setting the brightness
  • Asked for an equivalent Python script

Here’s what it produced.

Step 4: Run the script in Linux

After installing the pyusb module I could run python set-brightness.py 700 and it worked from Linux! The range from lowest backlight brightness to highest is around 700 – 5000.

The extra bonus (which I’m most excited about) is that sending a value of 0 will truly turn off the backlight! This isn’t even (currently) possible with their software. (Obligatory “do this at your own risk”)
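The generated script itself is linked above. To give a rough idea of the shape such a pyusb script takes – the vendor/product IDs below are the ones from lsusb, but the control-transfer fields are placeholders I made up, not the DockLite’s actual protocol:

```python
def clamp_brightness(value):
    """Clamp a requested backlight value into the observed usable range;
    0 is special-cased to mean 'backlight fully off'."""
    return 0 if value == 0 else max(700, min(5000, value))

def set_brightness(value):
    # pyusb imported lazily so clamp_brightness works without libusb/hardware
    import usb.core
    dev = usb.core.find(idVendor=0x4A43, idProduct=0x1D8A)  # IDs from lsusb
    if dev is None:
        raise RuntimeError("DockLite not found")
    # NOTE: these control-transfer fields are illustrative placeholders -
    # the real values come out of the Wireshark capture, not this sketch.
    dev.ctrl_transfer(0x21, 0x09, 0x0200, 0,
                      clamp_brightness(value).to_bytes(2, "little"))

print(clamp_brightness(9999))  # → 5000
```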

Now I can hook into something like D-Bus signals to potentially toggle the display depending on the desktop’s locked state.

Success

✅ The display’s brightness can be set from any OS, not just macOS.

✅ The display’s backlight can be turned off via software.

Thanks GPT! 🤖

Categories
AI Apple Fun

Apple Watch DOOM + Audio

It took a couple of LLM prompts plus a few simple syntax tweaks on my side, and now DOOM on the Apple Watch has audio. 🤯 It was almost as easy as typing iddqd. To get around the lack of SDL support, AVFoundation is used. Check out the GitHub commits.

Categories
DOS Fun Video

Vibe coding like it’s 1999: Flappy Cow

This weekend I vibe coded a tribute to ’90s Gateways, 3Dfx, and DOS with DJGPP, Glide, and Allegro. I originally set up my build environment in DOSBox-X, but then moved to cross-compilation from Linux.

From there I dropped the game into 86Box, which I already had configured with an emulated Voodoo3 3000, for quicker testing.

Finally, I ran it on my Pentium 3 machine with Windows 98 and a real Voodoo3 3000. ChatGPT gets most of the credit for the C code that runs the game, I get credit for the cow voiceover work. 🐄