When I stumbled upon the robotic installation at the Ars Electronica Festival in autumn 2015, I knew I had to get in touch with someone in charge of these mechanical tool arms. Chance was on my side when I connected with Johannes Braumann from Robots in Architecture that day. As it turned out, he gives lectures at the Kunstuniversität Linz and is head of their robotics laboratory.

A couple of weeks later we met for the first time in the laboratory and chatted about a possible collaboration. So at the end of November I took my pixelstick with me and we mounted it directly on the big KUKA robot in the basement. First we had to find a choreography that works with a 2-meter-long stick on the robot's arm, as the basement doesn't provide much space. The big robot is permanently installed in the laboratory and cannot be moved. Right at the first test I saw its potential: exact animation paths combined with specifically designed graphic layouts on the pixelstick blew my mind.

But we were relatively limited to certain time slots, and within the following two months we knew we had to develop a finished, working version of the installation for the upcoming exhibition at the Ars Electronica Center called "creative robotics". So we switched over to its smaller brother, the KUKA KR-16. This robot does nearly the same, but due to its lower weight it is mobile. When it came to a functional mobile version of the installation, the task got more and more technical. As this robot is smaller, we decided to use only half of the original pixelstick, so the picture size was reduced to 100 pixels; as I knew from earlier tests, the half version works just as well as soon as it gets the power signal.


So first of all Johannes created a customized 3D-printed mount for the KUKA platform, which we could swap in for the original pixelstick one so that the shortened pixelstick would fit exactly on the robot arm. Johannes also developed a direct power supply coming from the robot, which additionally triggers the fire button of the pixelstick each time the robot restarts its choreography.

Meanwhile I was working on the graphics and, in a second step, on the synchronization process between graphic lengths, pixelstick speed settings and the animation path of the robot, which I calculated with Excel spreadsheets and a simple stopwatch. The choreography loop was 3:28 minutes long, and I had to find out which speed in the pixelstick settings was needed to play back the final picture size of 3000 x 100 pixels in time. By dividing 3000 (pixels) by 208 (seconds) I could determine the "visible pixels per second" on the stick and adapt this to the motion-path sequences of the robot.
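The calculation above can be sketched in a few lines of Python; the constants are the figures from the text (a 208-second loop and a 3000 x 100 pixel graphic), everything else is just arithmetic:

```python
# Timing sketch for syncing pixelstick playback with the robot loop.
# Numbers taken from the text: a 3:28 min choreography and a final
# picture size of 3000 x 100 pixels on the half pixelstick.

LOOP_SECONDS = 3 * 60 + 28       # 3:28 min choreography loop = 208 s
PICTURE_WIDTH_PX = 3000          # horizontal length of the graphic
PICTURE_HEIGHT_PX = 100          # half pixelstick = 100 LEDs

# How many columns of the graphic the stick must display per second
# so the whole picture plays back exactly once per robot loop.
pixels_per_second = PICTURE_WIDTH_PX / LOOP_SECONDS
print(f"{pixels_per_second:.2f} visible pixels per second")
```

Running this gives roughly 14.42 pixels per second, which is the rate the pixelstick speed setting had to match against the robot's motion-path sequences.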

When developing the graphics, the curious thing was having to rethink the forms in terms of simplicity: for example, a simple 2D ladder combined with a spiral move of the robot arm creates a full 3D DNA helix.
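The ladder-to-helix effect comes straight from the geometry of the move: a point on the stick tracing a circle while travelling steadily along one axis draws a helix in the long exposure. A minimal parametric sketch, with purely illustrative radius, pitch and turn values (none of these numbers are from the actual choreography):

```python
import math

def helix_point(t, radius=0.5, pitch=0.2, turns=3):
    """Position of a single stick pixel at parameter t in [0, 1]:
    circular motion in x/y plus steady travel along z."""
    angle = 2 * math.pi * turns * t
    x = radius * math.cos(angle)
    y = radius * math.sin(angle)
    z = pitch * turns * t          # constant advance per turn
    return (x, y, z)

# Sample the path; a flat 2D "ladder" graphic shown along these
# positions appears as a 3D double-helix shape in the exposure.
path = [helix_point(i / 100) for i in range(101)]
```

The two rails of the ladder simply become two such helices offset along the stick, which is why the flat graphic reads as a DNA strand in the final photo.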

Besides the graphic development and choreography, I was working on the live-feed capture of the stick's movement on film. From my ongoing research and tests I was aware that none of the existing light-painting apps worked smoothly enough to come close to the photo results. First I worked with the LPL Mercury app, but due to issues with sharpness, camera control and the limitations of its fade-out timer, I had to find something else.

But I have to mention one very interesting side effect of the tests with the Mercury app: the output while the studio working lights were switched on. The robot started to erase itself within the exposure, and that style looked fantastic, something like a negative light painting that erases itself.

I definitely know that I will have to work on that style again.


Making it customized


That's why I asked my friend Gerrald van der Kolk from the Netherlands to build a smooth Max patch with various options to control fading time, coloring, saturation, contrast and brightness, as well as various input/output choices, so that at least a Full HD signal at 50 fps could run on the laptop.

The beta version of the brand-new software was finished right in time, one day before the opening, so I was able to test the workflow in the Deep Space the night before. The final show worked out great, and the feedback was damn good and motivated me to continue this research as soon as the four-week exhibition at the AEC is over. As said, it has only just begun.
