A wireless LED system

Mapping the LEDs

After the controller was set up, I could start thinking about the mapping and about converting the video content for the LED strips. TouchDesigner seemed to be a great tool for that task. The group hired a content producer, who also came up with the design of the strips on the drums and jackets. The design of the jackets was a repeating pattern, while the drums had three different patterns that alternated. I could make use of those repeating patterns and create the 20 tables that hold the coordinates of each pixel with a few lines of Python code, roughly as sketched below.
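A minimal sketch of how those tables could be filled, assuming hypothetical operator names, strip lengths and pattern sizes (the real coordinates followed the repeating jacket and drum designs):

		# Fill one Table DAT per strip with the x/y video coordinates of its
		# LEDs. Strip count, LED count and pattern sizes are assumptions.
		for strip in range(20):
			table = op('coords{}'.format(strip + 1))  # coords1 .. coords20
			table.clear()
			for led in range(144):            # assumed LEDs per strip
				x = led % 60                  # the pattern repeats every 60 px
				y = strip * 3 + led // 60     # wrap long strips to extra rows
				table.appendRow([x, y])

clear() and appendRow() are the standard Table DAT calls in TouchDesigner, so the whole mapping stays scriptable.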

Creating the Table DATs
Black and white mask for content creation.

I converted these Table DATs to CHOPs to feed them into a GLSL TOP as texture buffers. Inside the pixel shader, I checked if the current UV coordinate (multiplied by the resolution) was present in one of the tables and made that pixel white if that was the case. This shader created a black and white mask of my pixel mapping. I gave an image of this mask to the content designer as a template for her videos. When I got the videos, I could overlay the mask myself to see if the contents matched my mapping. As it turned out, they were shifted one pixel to the left, which could easily be fixed within TouchDesigner.
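For illustration, an equivalent mask can also be built offline in plain Python with NumPy instead of in the shader (the GLSL version just does the same lookup per fragment); the resolution here is an assumption:

		import numpy as np

		width, height = 1920, 1080                    # assumed content resolution
		mask = np.zeros((height, width), dtype=np.uint8)
		for strip in range(20):
			table = op('coords{}'.format(strip + 1))  # the tables from above
			for row in range(table.numRows):
				x = int(table[row, 0].val)
				y = int(table[row, 1].val)
				mask[y, x] = 255                      # white where an LED samples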

The next task was to create 20 text files that would hold the video data for each LED strip. I created a Python extension that stepped through the video frame by frame and sampled every pixel that was specified in one of those tables. This process was repeated for each of the 20 tables.
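The frame stepping could look roughly like this sketch, assuming a Movie File In TOP named moviefilein1 with its play mode set to 'Specify Index'; the frame count and the SampleFrame helper are assumptions (the actual sampling is shown in the snippets below):

		movie = op('moviefilein1')         # play mode set to 'Specify Index'
		num_frames = 3000                  # assumed: length of the 50 fps video
		for frame in range(num_frames):
			movie.par.index = frame        # step the video manually
			movie.cook(force=True)         # force the new frame to load
			for strip in range(20):
				lookup = op('coords{}'.format(strip + 1))
				self.SampleFrame(lookup)   # hypothetical helper, see below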

The examples of the Adafruit DotStar library show how to implement a gamma correction for these LEDs, so that the mid-range colors look right. It is done by creating a lookup array that can later be used to get corrected color values. The following code snippets show how to create and use the lookup array.

		# Build the gamma lookup table once (taken from the DotStar examples).
		self.gamma = bytearray(256)
		for i in range(256):
			self.gamma[i] = int(pow(float(i) / 255.0, 2.7) * 255.0 + 0.5)

		# Sample every mapped pixel of the current frame and gamma-correct it.
		# The y coordinate is flipped because TOPs have their origin at the
		# bottom left, while the table coordinates start at the top left.
		for row in range(lookup.numRows):
			sample = self.out.sample(x=lookup[row, 0], y=height-1-lookup[row, 1])

			r = self.gamma[int(sample[0]*255)]
			g = self.gamma[int(sample[1]*255)]
			b = self.gamma[int(sample[2]*255)]

			pixs.append([r, g, b])

The color range that the DotStar library works with is 0-255, which is why I needed to multiply the sampled color values by 255. Also, the order of colors that those LEDs expect is BGR, and before each new pixel there has to be a start marker with the value 255. I formatted each value as hex and ended each frame with a line break.

	def SaveFrame(self, pixs):
		for pix in pixs:
			# Start marker (0xff), then the color channels in BGR order.
			self.Frames.append('{:02x}'.format(255))
			self.Frames.append('{:02x}'.format(int(pix[2])))
			self.Frames.append('{:02x}'.format(int(pix[1])))
			self.Frames.append('{:02x}'.format(int(pix[0])))

		# One line per video frame.
		self.Frames.append('\n')

Those hex values were saved to 20 text files that I distributed to the Raspberry Pis with FileZilla. (I put them in a directory called frames.)
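On the player side, one line of such a file can be turned back into the bytes the LEDs expect. A minimal sketch, assuming a hypothetical file name and that the hex pairs of each line were written without separators:

		# Each line holds one video frame: 0xff, blue, green, red per LED.
		with open('frames/strip01.txt') as f:
			frames = [bytearray.fromhex(line.strip()) for line in f]

Each of those bytearrays can then go out to the DotStars unchanged once the matching timecode arrives.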

Sending the timecode

Ableton timecode
Max for Live plugin

The last thing I needed to do was to build a Max for Live plugin that would send the timecode. This was fairly easy: I just had to broadcast an OSC message with the address /leds/fone and a value rising from 0 at the start of the audio file to 1 at the end. I extracted the audio from the video file I got, just to make sure the timing would match perfectly. The timecode was sent every 5 milliseconds, which was four times more often than necessary, since the video was rendered at 50 fps. This should make sure that every frame was played, even if some network packets got lost.
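On the receiving side, that float maps directly to an index into the frames list from above. A sketch of the handler using the pyOSC library; the port and the show_frame helper are assumptions:

		from OSC import OSCServer   # pyOSC

		def handle_timecode(addr, tags, args, source):
			# args[0] rises from 0.0 to 1.0 over the whole track.
			index = min(int(args[0] * len(frames)), len(frames) - 1)
			show_frame(frames[index])           # hypothetical: pushes the bytes out

		server = OSCServer(('0.0.0.0', 7000))   # assumed port
		server.addMsgHandler('/leds/fone', handle_timecode)
		server.serve_forever()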

At first I used a “home” WIFI router with very good reviews, but it would sometimes stutter for no good reason. So I switched to a Vigor2925 business router, which worked fine.

Remote Control

As a bonus, I implemented two remote control options, so the group could start or stop the playback without walking away from their drums. The first one was a simple TouchOSC layout that would send OSC to the Max for Live plugin, which would then start or stop certain scenes in Ableton. The second one was a Bluetooth remote that would connect to one of the LED controllers.

To get it working, I needed to re-enable Bluetooth on the controller and connect it to the remote with the following commands:

  • sudo bluetoothctl
  • power on
  • agent on
  • default-agent
  • pair 20:76:50:06:54:29
  • trust 20:76:50:06:54:29
  • connect 20:76:50:06:54:29

Then I installed evdev and its dependencies with “sudo apt-get install python-dev”, “sudo apt-get install python-pip” and “sudo pip install evdev”.
In a second Python script, I listened for key events on the Bluetooth remote and sent corresponding OSC messages to the Max for Live plugin.

		from evdev import InputDevice, ecodes
		from OSC import OSCClient, OSCMessage          # pyOSC

		controller = InputDevice('/dev/input/event0')  # assumed device path
		client = OSCClient()
		client.connect(('192.168.1.100', 9000))        # assumed OSC target

		for event in controller.read_loop():
			if event.type == ecodes.EV_KEY:
				# 0 = key up, 1 = key down (2 would be autorepeat)
				if event.value == 0 or event.value == 1:
					if event.code == 105 or event.code == 165:    # KEY_LEFT / KEY_PREVIOUSSONG
						client.send( OSCMessage("/controller/left", [float(event.value)] ) )
					elif event.code == 106 or event.code == 163:  # KEY_RIGHT / KEY_NEXTSONG
						client.send( OSCMessage("/controller/right", [float(event.value)] ) )
					elif event.code == 103 or event.code == 115:  # KEY_UP / KEY_VOLUMEUP
						client.send( OSCMessage("/controller/up", [float(event.value)] ) )
					elif event.code == 108 or event.code == 114:  # KEY_DOWN / KEY_VOLUMEDOWN
						client.send( OSCMessage("/controller/down", [float(event.value)] ) )
					elif event.code == 28 or event.code == 164:   # KEY_ENTER / KEY_PLAYPAUSE
						client.send( OSCMessage("/controller/play", [float(event.value)] ) )
					elif event.code == 1 or event.code == 113:    # KEY_ESC / KEY_MUTE
						client.send( OSCMessage("/controller/mute", [float(event.value)] ) )
					elif event.code == 172:                       # KEY_HOMEPAGE
						client.send( OSCMessage("/controller/record", [float(event.value)] ) )

With some error handling and recursion (sketched below) I made sure that the controller would reconnect if it lost the remote or the OSC target. I added the execution of controller.py to the launcher.sh script and I was done.
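The reconnect logic boils down to a wrapper that catches the failure and calls itself after a short pause; the device path, delay and error types in this sketch are simplified assumptions:

		import time
		from evdev import InputDevice

		def run():
			try:
				controller = InputDevice('/dev/input/event0')  # assumed path
				read_events(controller)   # hypothetical: the read loop shown above
			except (OSError, IOError):
				# Remote out of range or OSC target gone: wait and retry.
				time.sleep(5)             # assumed retry delay
				run()

		run()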

Conclusion

Like with every new project, there was a bit of trial and error involved, but in the end the approach was straightforward and I am pleased with the result. The combination of Raspberry Pi and DotStar LEDs was easy to work with, and the built-in WIFI, SD card storage and the possibility to script in Python made prototyping very fast. Using a timecode in the form of a floating point index proved to be very flexible: it works with different frame rates, and it is easy to change the tempo of the playback and even cut and rearrange it while keeping the LEDs in sync. With the use of OSC, a lot of tools like TouchOSC become compatible, and Ableton is of course a great software for audio playback and show control.

Most importantly, it will be easy to implement new features like new video content, different types of LED fixtures and lighting control, or to reuse these techniques for other projects.

Controller, Powerbank and LEDs