I first want to acknowledge that I did the thing that I try to never do: I showed off a snazzy project, left some hints here and there of how it worked, said I would follow up with full details… and never did. That’s lame.
I’ve had multiple people reach out for more info and I’m glad they did, since that’s pushed me to finally get some repos public and this belated follow-up written. Apologies!
To jump straight to it, I’ve published these two repos:
Let’s first go over the hardware involved. The most important piece, of course, is the Alfa-Zeta XY5.
In my case, the 14×28 board was made up of two 7×28 panels connected together via RJ-11.
The panels are pricey, but they can be thought of as “hardware easy-mode”. Alfa-Zeta has done the hard work of building the controller that drives the hardware, and all we have to do is supply power and an RS-485 signal that abides by their protocol.
If you purchase a panel from them there are two important documents to request:
The main manual that describes the specs, features, and things like the DIP switch settings.
The protocol for sending commands to the controllers (which is really simple).
These can easily be found by searching around, but if you own a panel the company should supply them. Most of the protocol can also be deduced by looking at open source code.
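To give a sense of just how simple it is, here's a rough sketch (in Swift, to match the rest of this project) of assembling a single panel frame. The specific byte values and the one-byte-per-column layout are my recollection of what open source drivers do, not something copied from the official document, so treat them as assumptions and verify against the manual.

```swift
import Foundation

// Rough sketch of one Alfa-Zeta frame. The 0x80 start byte, 0x83 "data +
// refresh" command, and 0x8F end byte are assumptions based on open source
// drivers; check the official protocol document before relying on them.
func makeFrame(panelAddress: UInt8, columns: [UInt8]) -> Data {
    precondition(columns.count == 28, "one byte per column, 7 dots each")
    var frame = Data([0x80, 0x83, panelAddress]) // start, command, panel address
    frame.append(contentsOf: columns)            // 28 column bytes
    frame.append(0x8F)                           // end byte
    return frame
}

// Example: flip every dot on panel 0 on, then write `frame` to the RS-485 adapter.
let frame = makeFrame(panelAddress: 0x00, columns: Array(repeating: 0x7F, count: 28))
```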
At the moment there isn’t much to it – you can either compile the firmware to run in a mode that writes data from UDP packets to the board, or you can draw “locally” using Adafruit GFX methods.
See the README in the repo above for more details.
Semi-interestingly, I used Adafruit GFX again, this time via swift-gfx-wrapper, to draw to the board over UDP. It’s hacky and experimental, but that’s part of the fun.
See the README in the repo above for more details.
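To give a feel for the UDP side, here's a minimal sketch of a sender using Apple's Network framework. The host, port, and packet layout are placeholders; the real format is described in the repo's README.

```swift
import Network

// Minimal UDP sender sketch; host, port, and frame layout are placeholders.
final class FlipDotUDPClient {
    private let connection: NWConnection

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        connection.start(queue: .global())
    }

    func send(frame: Data) {
        connection.send(content: frame, completion: .contentProcessed { error in
            if let error { print("send failed: \(error)") }
        })
    }
}
```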
Since doomgeneric exposes its framebuffer, I wrap it in an SKTexture that gets applied to a node in the SpriteKit scene; the scene is subclassed to override the update method and call doomgeneric_Tick(). Objective-C handles the interop between C and Swift and fulfills most of the functions listed here. SwiftUI ultimately outputs the scene.
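Here's a minimal sketch of what that loop can look like. It assumes the bridging layer exposes doomgeneric's doomgeneric_Tick() and DG_ScreenBuffer globals to Swift and that the engine is running at its default 640×400 resolution; the actual project code differs.

```swift
import SpriteKit

// Minimal render-loop sketch: tick the engine, then wrap the RGBA
// framebuffer in a texture and swap it onto the screen node.
final class DoomScene: SKScene {
    private let screenNode = SKSpriteNode()
    private let resolution = CGSize(width: 640, height: 400) // doomgeneric default

    override func didMove(to view: SKView) {
        screenNode.size = size
        screenNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
        addChild(screenNode)
    }

    override func update(_ currentTime: TimeInterval) {
        doomgeneric_Tick() // advance the engine one frame

        let byteCount = Int(resolution.width * resolution.height) * 4
        let pixels = Data(bytes: DG_ScreenBuffer, count: byteCount)
        screenNode.texture = SKTexture(data: pixels, size: resolution)
    }
}
```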
Very few tweaks needed to be made in doomgeneric itself.
They were basically:
Conditional compilation for a few calls that watchOS didn’t support (and we didn’t need).
Tweaking the 32-bit color bit offsets.
Handling a crash related to passing in arguments.
On watchOS we pass the absolute path of the WAD file in the main bundle to the engine (sketched after this list).
Adjusting some SDL2 includes so headers could be found.
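For the WAD path item above, the setup is roughly the following sketch. The WAD file name is a placeholder, and doomgeneric_Create(argc, argv) is the engine's usual entry point as bridged to Swift.

```swift
import Foundation

// Sketch of handing the bundled WAD to the engine on watchOS.
// "doom1.wad" is a placeholder file name.
func startDoom() {
    guard let wadPath = Bundle.main.path(forResource: "doom1", ofType: "wad") else {
        fatalError("WAD not found in the app bundle")
    }
    let args = ["doomgeneric", "-iwad", wadPath]
    var argv = args.map { strdup($0) }
    doomgeneric_Create(Int32(argv.count), &argv)
}
```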
I haven’t spent as much time on my Commodore 64 as my other retrocomputers (which can seem modern in comparison), but my explorations over time are trending towards older hardware. I can only assume that my final stop will be an abacus.
I have three C64s all passed down from my dad. One had been devoted to a home alarm system (of course we still have the schematics), but by the time I came around it was only used for playing half-working totally not bootlegged games.
A sampling of some favorite software from my childhood:
Impossible Mission (one of those bootlegs that hilariously kinda worked)
Retrocomputing plans
I hope to be able to fully restore at least one of these machines this year. The one pictured above powers on and is fully functional, but some flakiness at startup tells me that it’s overdue for a recap.
One not-so-smart thing I did when I unpacked all of this equipment was powering it up with the original C64 power supply. That’s a risky move and likely to damage the C64 with bad power, so I’ve since replaced it with a new modern one (see the parts list).
I’m not interested in (nor do I have the space for) using these machines the “pure” way, with a CRT and 1541 drives, although I have both. Maybe down the road that would be fun, but for now I’m utilizing modern gadgets from the wonderful C64 aftermarket community.
When the app is told to make it snow, it adds a full-screen, non-interactive window on each display; inside each window is a SpriteKit view whose scene contains the snow emitters.
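In sketch form, that looks something like this; it assumes a borderless window at screen-saver level per display and an .sks particle file named “Snow”, both of which are placeholders rather than the app's exact setup.

```swift
import AppKit
import SpriteKit

// Minimal overlay sketch: one click-through window per screen with a snow emitter.
func makeItSnow() -> [NSWindow] {
    NSScreen.screens.map { screen in
        let window = NSWindow(contentRect: screen.frame,
                              styleMask: .borderless,
                              backing: .buffered,
                              defer: false)
        window.level = .screenSaver          // float above normal windows
        window.ignoresMouseEvents = true     // stay non-interactive
        window.isOpaque = false
        window.backgroundColor = .clear

        let skView = SKView(frame: CGRect(origin: .zero, size: screen.frame.size))
        skView.allowsTransparency = true
        let scene = SKScene(size: screen.frame.size)
        scene.backgroundColor = .clear
        if let snow = SKEmitterNode(fileNamed: "Snow") {
            // Emit from the top edge so flakes fall across the whole screen.
            snow.position = CGPoint(x: scene.size.width / 2, y: scene.size.height)
            snow.particlePositionRange = CGVector(dx: scene.size.width, dy: 0)
            scene.addChild(snow)
        }
        skView.presentScene(scene)
        window.contentView = skView
        window.orderFrontRegardless()
        return window
    }
}
```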
Thanks to Whisper and this awesome port, the tree is responding to spoken words. 🗣🎄
Since the tree itself only has a low-powered MCU, we need another machine to act as a listener.
The architecture is:
A machine in my office runs the Whisper model and listens for words.
If certain keywords are detected, it looks up a corresponding command to run (e.g. do a theater chase sequence in green).
It sends that command to the tree over the network.
For now I’m running it from iOS and macOS, so I wrote the current implementation in Swift. The code is still in “hack” status, but it’s working well!
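As a sketch, the keyword-matching step boils down to something like this; the keyword table and command strings here are made up for illustration, not the real mapping in the listener code.

```swift
import Foundation

// Made-up keyword-to-command table for illustration.
let commands: [String: String] = [
    "green": "chase green",
    "rainbow": "rainbow",
    "lights off": "off"
]

// Scan a transcript from the speech model and fire any matching commands.
func handle(transcript: String, send: (String) -> Void) {
    let heard = transcript.lowercased()
    for (keyword, command) in commands where heard.contains(keyword) {
        send(command) // send the matching command to the tree over the network
    }
}
```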
Now it’s time to test it when talking to coworkers at Automattic.
Thanks to this repo you can fire up Windows 3.11 painlessly and even connect to the Internet! Lots of goodies are already installed so it’s a fun virtual trip.
There were some graphical glitches (restarting or dropping down into DOS and back helped) and it crashed when running Netscape, but Microsoft Bob seemed to work fine!
This weekend the NeoPixel tree got many much-needed updates!
Though I have more ideas to implement, the basics of what I wanted, like sending commands remotely, are complete.
What we can currently do:
Set the brightness
Change the color
Turn the pixels off
Run some built-in sequences, like a nice rainbow
Set repeating color patterns
Set individual pixels
I also threw together a really quick iOS app to set the color with SwiftUI’s built-in ColorPicker view. Thanks to the Rover project (another one that’s been neglected), I had some UDP client code I could borrow to speed up development.
Changing the color of the tree with a SwiftUI ColorPicker view.
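Here's a rough sketch of that picker wired to a tiny UDP sender. The host, port, and the “color r g b” text command are made up for illustration; the real client is the code borrowed from Rover.

```swift
import SwiftUI
import UIKit
import Network

// Tiny UDP sender stand-in for the borrowed Rover client; host/port are placeholders.
final class TreeClient: ObservableObject {
    private let connection = NWConnection(host: "tree.local", port: 8888, using: .udp)
    init() { connection.start(queue: .global()) }

    func send(_ command: String) {
        connection.send(content: Data(command.utf8),
                        completion: .contentProcessed { _ in })
    }
}

struct TreeColorView: View {
    @State private var color = Color.green
    @StateObject private var client = TreeClient()

    var body: some View {
        ColorPicker("Tree color", selection: $color)
            .onChange(of: color) { newColor in
                // Convert the picked color to 0–255 RGB (assumes an RGB color space)
                // and send it as a made-up text command.
                let rgb = UIColor(newColor).cgColor.components ?? []
                guard rgb.count >= 3 else { return }
                client.send(String(format: "color %d %d %d",
                                   Int(rgb[0] * 255), Int(rgb[1] * 255), Int(rgb[2] * 255)))
            }
    }
}
```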