Since restoring a couple of Mac OS 9 machines and playing with them, I’ve noticed some nice touches in various places.
One is that when you have the Platinum Sounds enabled and drag a window, the sound effect pans in stereo to follow the window’s horizontal position.
My iPhone recording probably doesn’t capture it very well, but just imagine the sound getting louder from whichever speaker the window is closest to. Pretty cool!
[Photo gallery: Wired up and running · With the project box wrapped up like a present · The housing for the NodeMCU · Two sets of power strands come from the NeoPixels into the box; both + and – are black to stay concealed · Wiring up the barrel connector · NodeMCU is attached to headers · Soldering up the inline power · Soldering quick disconnects on the ring]
This part expands on that project and tackles the question: ‘How do I know the state of my garage door if I’m not at home?’
Because the code does nothing more than emulate a single button press on the remote control, you would normally have to look at the door yourself to see whether you were closing, stopping, or opening it. That can be a problem when you’re nowhere near it.
I wanted a solution that didn’t involve installing new hardware, such as a switch to detect the door’s position. Instead, I decided to use what I already had in the garage: a camera. Namely, this one: the Foscam FI8910W.
The idea is to use the camera to grab an image, pipe that image into OpenCV to detect known objects, and then declare the door open or closed based on the results.
I whipped up a couple of shapes in Photoshop to stick on the inside of my door:
Shapes taped to the inside of the garage door
I then cropped out the shapes from the above picture to make templates for OpenCV to match.
The basic algorithm is this (a code sketch follows the list):
Get the latest image from the camera
Look for our templates with OpenCV
If all objects (templates) were detected, the door is closed – otherwise it’s open
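Here’s a minimal sketch of those three steps in Python with OpenCV. The snapshot URL, template file names, and match threshold are illustrative assumptions, not the project’s actual configuration:

```python
import urllib.request

import cv2
import numpy as np

# All of these values are assumptions for the sketch, not the real config.
SNAPSHOT_URL = "http://192.168.1.50/snapshot.cgi?user=admin&pwd=secret"
TEMPLATE_PATHS = ["pentagon.png", "triangle.png"]  # crops from the door photo
MATCH_THRESHOLD = 0.8  # minimum normalized correlation to count as a match

def fetch_frame(url):
    """Step 1: grab the latest JPEG from the camera, decoded to grayscale."""
    raw = urllib.request.urlopen(url).read()
    return cv2.imdecode(np.frombuffer(raw, dtype=np.uint8), cv2.IMREAD_GRAYSCALE)

def find_template(frame, template_path):
    """Step 2: return the best-match (x, y) for a template, or None if the
    match score falls below the threshold."""
    template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    return max_loc if max_val >= MATCH_THRESHOLD else None

# Step 3: the door is closed only if every shape was found.
frame = fetch_frame(SNAPSHOT_URL)
locations = [find_template(frame, path) for path in TEMPLATE_PATHS]
print("closed" if all(loc is not None for loc in locations) else "open")
```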
Shapes successfully detected
To help make step 3 more accurate, I added a horizontal threshold value which is defined in the configuration file. Basically, we use this to guard against false positives – if the objects we detect are horizontally aligned (i.e. at roughly the same height in the frame), we can be pretty certain we have the right ones.
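Continuing the sketch above, the alignment check could look like the following; the constant’s name and value are my assumptions, not the project’s actual config key:

```python
HORIZONTAL_THRESHOLD = 15  # assumed: max y-spread, in pixels, to count as aligned

def horizontally_aligned(locations, threshold=HORIZONTAL_THRESHOLD):
    """True if all matched shapes sit on roughly the same horizontal line,
    i.e. their y-coordinates differ by no more than `threshold` pixels."""
    ys = [y for (_x, y) in locations]
    return max(ys) - min(ys) <= threshold
```

With this guard in place, the door is declared closed only when every template matched and the matches line up horizontally.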
I was happy to find that the shapes worked well in low-light conditions. This may be because my garage isn’t very deep, so the camera’s IR range is sufficient, combined with the high contrast of black shapes on white paper.
Currently I have some experimental code in the project for detecting state changes. This will not only provide more information (e.g. the door is opening because we detect the pentagon has gone up x pixels), but is also useful for events (e.g. when the alarm system is on, let me know if the door has any state change).
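As a rough illustration of how that could work (my sketch, not the project’s actual code): track one shape’s y-coordinate across consecutive frames and classify the motion from the change.

```python
def classify_motion(prev_y, curr_y, min_delta=5):
    """Classify door motion from the change in the pentagon's y-coordinate
    between two frames. Image y grows downward, so a smaller y means the
    shape (and the door panel it's taped to) has moved up."""
    if prev_y is None or curr_y is None:
        return "unknown"   # shape wasn't detected in one of the frames
    if prev_y - curr_y > min_delta:
        return "opening"   # shape moved up in the frame
    if curr_y - prev_y > min_delta:
        return "closing"   # shape moved down in the frame
    return "idle"
```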
I’ve tested running this on the Raspberry Pi and it works fine, though it can be a good bit slower than a full-blown machine. I have a Raspberry Pi 2 on order and it’ll be interesting to see the difference. Since the code doesn’t need anything specific to the Raspberry Pi, you may prefer to run it on a faster box to get more readings in the short time span it takes for a door to open or close.
I’ve created a video to demo the script in action!
It’s not a serious competition until you’ve put a microcontroller inside your gingerbread house. Since we were going for the Charlie Brown theme, I ripped apart and adapted a musical card that played ‘Linus and Lucy’. I also rigged up some LEDs to blink with the music. Video below the pictures.
I played around tonight with making my iPod touch an auxiliary display. I thought it might be neat to have random real-time public tweets cycle through on it, so I made the following. As you can see, there’s nothing too smooth about it yet – no AJAXiness implemented, as this was purely a proof of concept.