Electroleaf: Nanoleaf-inspired wall decoration / lamp. Powered by an Arduino and controlled over Bluetooth from a custom Android app

After seeing the 3D print designs from Akyelle on Thingiverse I wanted to see if I could make a version which could be controlled from a smartphone over Bluetooth.

I already had code for controlling addressable WS2812B LEDs / NeoPixels from previous lamp projects, but in this case that wasn’t enough. I also needed code for controlling the LEDs over Bluetooth (via an HC-05 Bluetooth module), as well as some info on how to make an app that could talk to it. In the end I used HowToMechatronics’ excellent guide on Bluetooth and Arduino apps for that part and combined it with the lamp code to create the result below

The modular lamp parts are printed on a Prusa i3 MK3s in black and white PLA. Each LED is soldered together with its neighbor – laborious work, as each has 6 connectors (5V, GND, and signal in/out on each end). Currently 6 panels with 9 LEDs each means 324 soldering points just for this small-size version 🙂

Wiring diagram

Breadboard schematic showing the wiring and components

Links to resources

A note on programming the Arduino: Please disconnect the TX and RX leads from the Arduino board when programming it, or the Arduino sketch upload will fail

Android app

The app is made almost exactly according to the HowToMechatronics guide, but with some minor changes. If the code for the Arduino is used without any major modifications, it’s possible to simply download and use the app as-is from here: link

However, I’d highly recommend creating your own using MIT App Inventor as it’s pretty easy when following the guide. A couple of screenshots of the process are included below for reference but I won’t duplicate the work done in the HowtoMechatronics guide.

The UI layout

The app logic

That’s it 🙂 Have fun creating your own lamps. As someone said: “Why buy something when you can build it for three times the cost?”

Japanese input in Openbox

It can be a challenge to get Japanese input working in the Openbox window manager on Debian. Personally I forget how to do it after each OS reinstall, so here’s a quick guide:

Add Japanese locale (ja_JP.UTF-8 UTF-8):

sudo dpkg-reconfigure locales

Install ibus and ibus-anthy:

sudo apt update
sudo apt install ibus ibus-anthy -y

Add environment variables and ibus daemon to Openbox startup:

cat >> ~/.config/openbox/autostart.sh << 'EOF'
export GTK_IM_MODULE=ibus
export XMODIFIERS=@im=ibus
export QT_IM_MODULE=ibus
ibus-daemon -d -x
EOF

Add Japanese with Anthy to the ibus settings (“Input Method” -> “Add”):

ibus-setup

Restart Openbox (log out and back in again)

Switch between input methods using the keyboard shortcut (super/win + spacebar):

3D printed caterpillar tracks for the Tamiya Konghead 6×6 (C6-01)

Inspired by old Valmet 901 / 911 forest harvesters, this is an attempt to create 3D printed caterpillar tracks which envelop the regular tires of a Konghead RC truck

Test drive of caterpillar tracks for the Tamiya Konghead

The tracks were created in Tinkercad based on an existing design: https://www.tinkercad.com/things/1rrQgXL5KSb-copy-of-copy-of-starting-point-for-tracks

The updated version made for the Konghead can be downloaded from Thingiverse here: https://www.thingiverse.com/thing:4714521

Instructions to create the tracks

  • Download the STL from Thingiverse here: https://www.thingiverse.com/thing:4714521
  • Print 24 copies of the link design per track
  • Use straightened-out paperclips to hold them together (holes for the paperclips are part of the design).
  • Melt some filament to close up the holes on either side to keep the paperclip locked in place.
  • Note: Leave one link un-sealed so the tracks can be removed if needed.
Details of the build and a test drive on YouTube…

Happy Holidays!

The holiday season means different things to different people. In this case it means pointing the light ring, which has been aimed toward me during so many Zoom calls, towards something else. May I offer a nice Calabi-Yau manifold in these trying times? Happy holidays everyone!

Upgraded targeting system: Voice control

To make it more interesting, I thought it’d be cool to restrict the machine learning model to a single type of item and add support for voice control to switch between types. While I was at it, I added the audio from the Portal turrets, because why not?