In the 80s, my Pascal programming experience gave me the extraordinary professional opportunity to work in TV broadcasting and for an effects company, creating real-time 3D effects with a Quantel Mirage and broadcast graphics with a Quantel Paintbox.
The Quantel Mirage was a million-DM computer that consumed 4 kW of electricity and stood in a refrigerated machine room. Programming was done in Pascal via a tiny terminal: extremely tedious and very time-consuming. I was one of the few operators in Germany at the time, perhaps even the only one. Thanks to my programming skills, I was able not only to use the effects shipped by the manufacturer, but also to develop new effects and then apply them to TV shows and trailers. The Quantel Mirage was groundbreaking in its computing power: it could map a live video signal onto any three-dimensional body in real time and move and manipulate it in 3D space.
Here are pictures of me with the Quantel Mirage, as well as with the programming terminal from HP.
In addition to several TV shows such as “Solid Gold” on ZDF, I created many effects for “PKS”, the pioneering show of the then-new SAT 1.
The accolade, and certainly a highlight of my “VJ” work at the time, were the effects for the QUEEN music videos for the songs “Living On My Own” and “One Vision”.
Freddie Mercury personally congratulated me on the effects:
Far more sober is this effects demo, which I created on the Mirage in 1985 for the editors in-house.
The background music is also entirely by me (early 80s).
In the mid-1990s, I became interested in the emerging field of real-time animation on graphics supercomputers such as the Silicon Graphics Onyx. One of the first performance groups for real-time graphics at music events (or perhaps the very first; today you would simply say Visual Jockey group), bearing the odd-sounding name “RasterMasters” and hailing from California, USA, happened to be touring Germany at the time. (“Raster” was a technical term, as opposed to “vector”, for what was then a new type of screen rendering.)
One member of the four-piece “band” had dropped out after an accident, and the group's management was therefore urgently looking in Europe for someone with the necessary skills and artistic background. They approached me, and I gladly jumped in at the deep end, since the other group members had their hands full with the problems of their highly complex technology and their own software. My graphics workstation (a Silicon Graphics Onyx) happened to be free and equipped with its own real-time software and hardware, so I was able to shape the live events that followed, quite innovatively, I think.
In contrast to the more psychedelic, Californian 2D image collages of the other group members (Ron Fisher of Silicon Graphics Los Angeles and David Tristram of the NASA Ames Research Center, together with Grammy Award-winning video artist Maggie Hoppe from San Francisco), I delivered exclusively real-time 3D elements, up to and including singing real-time heads with virtual muscles, which I controlled live at the events with a self-developed real-time face tracker. In my opinion, this dramatically enhanced the imagery of the other group members and thus the audience experience.
For example, we played the Westfalenhalle in Dortmund, with 50,000 visitors, both in 1995 for Mayday 9 and on 30 April 1996 for the legendary techno event Mayday X.
A few hours of material were picked up live by Viva and cut into the live broadcast. The material is really hard to find; here is a dancing real-time 3D figure on screen during the WESTBAM performance:
The group, scattered as it was across two continents, disbanded due to management and financial problems. This cleared the way for me to perform with my own team at the Loveparade in Berlin, at Ernst-Reuter-Platz, on a fixed stage and free of 2D restrictions. Two of our specially trained dancers used our self-built data suits to control live-dancing real-time avatars on stage. Many other events at home and abroad followed, and I must admit this was a very exhausting, but also extremely innovative and very beautiful time.
New visuals from 2015:
I’m experimenting with After Effects. Everything has to be rendered here, but I no longer have the problems I had with real-time output from Processing. (Processing is awesome, but too slow for real time on my computers. I tried everything to export the movies, but no solution ended up in sync with the original audio/MIDI file.)
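The sync problem above comes down to timestamping: an offline renderer derives each frame's presentation time from its index at a fixed frame rate, so the frames can always be muxed back in sync with the audio, whereas a real-time capture that drops or delays frames drifts away from the soundtrack. A minimal Python sketch of that idea, purely illustrative and not part of any toolchain mentioned here:

```python
# Illustrative sketch: deterministic frame timing for offline rendering.
# Each frame's timestamp comes from its index and the frame rate, never
# from the wall clock, so rendered frames always line up with the audio.
import math
from fractions import Fraction

def frame_schedule(audio_seconds: float, fps: int) -> list[Fraction]:
    """Exact presentation time of every frame needed to cover an
    audio track of the given length.

    Using Fraction(i, fps) instead of accumulating float deltas avoids
    rounding drift: frame 90000 at 30 fps lands at exactly 3000 s."""
    n_frames = math.ceil(audio_seconds * fps)
    return [Fraction(i, fps) for i in range(n_frames)]

# A 10-second audio clip at 30 fps needs 300 frames; the last frame
# is shown at 299/30 s, just before the audio ends.
times = frame_schedule(10.0, 30)
print(len(times), times[-1])
```

A real-time sketch, by contrast, stamps frames with whatever time it happens to reach them, which is exactly why the exported movie and the original audio/MIDI file never quite agree.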
To learn how to use After Effects, I started creating videos based on templates from other users. The music, as always, is by me.
Here are three results: