This Christmas at LOVE we decided to take our 2D mural projection mapping a step further. LOVE commissioned Ian Stevenson to come up with a new Christmas-themed mural to cover our 5×3 metre wall by reception. My job was to develop a system that could projection map animations back onto the wall, and let people create, send and play back personalised greetings on the wall via a live webcam stream. This was pretty tricky!
Here’s some more about it…
Projector
There were several show stoppers I had to overcome at the start. Firstly, where do I find a projector with a short enough throw to cover a 5×3 metre wall, that will stay on 24/7 without overheating, and that is bright and crisp enough to animate, light up and show personalised messages on the wall – bearing in mind users would also need to view these online, day and night (we have clients in China, remember!). This took many phone calls, and professionals saying it couldn’t be done for under 4 grand. Well, we proved them wrong 😉 We were projecting on top of a print, so we only needed it bright enough to light up what was already on the wall. I found a pretty decent one for the job in the end. I’m rather knowledgeable about portable projectors now!
Webcam & Live Stream
Another potential show stopper was the webcam. We were running OS X, and the only compatible cameras were Logitech, so we bought the best one, only to find that the drivers weren’t fully compatible with Macs. We had to have control over auto white balance and autofocus, otherwise the camera wouldn’t focus on the projection. So we ended up installing Windows 7 via Boot Camp to gain complete control. Job’s a good’un. Those were the two main issues; on top of that, we had to stream at a high enough bitrate that the text would still be legible after encoding for the live stream.
Software
I developed the openFrameworks application on top of the quad warp projection mapping tool we had built previously, adding functionality for fading between several videos; connecting to the node.js backend; queuing messages (prioritising clients over users); and elegantly displaying messages on the projection. Nice text in openFrameworks was a bit of a nightmare to begin with: I had to build in my own text-alignment and word-wrapping functionality.
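The real thing was C++ on top of openFrameworks (where ofTrueTypeFont::getStringBoundingBox gives you pixel widths), but the wrapping itself is just a greedy fit. A minimal sketch in TypeScript, with measureWidth standing in for the font-metrics call:

```typescript
// Greedy word wrap: pack as many words per line as the pixel width allows.
// measureWidth is a stand-in for the real font-metrics call.
function wrapText(
  text: string,
  maxWidth: number,
  measureWidth: (s: string) => number
): string[] {
  const lines: string[] = [];
  let current = "";
  for (const word of text.trim().split(/\s+/)) {
    const candidate = current ? current + " " + word : word;
    if (measureWidth(candidate) <= maxWidth || current === "") {
      current = candidate; // still fits (or a lone word is simply too long)
    } else {
      lines.push(current); // line full: commit it and start a new one
      current = word;
    }
  }
  if (current) lines.push(current);
  return lines;
}

// Centre alignment: offset each line by half the leftover width.
function centreOffsets(
  lines: string[],
  maxWidth: number,
  measureWidth: (s: string) => number
): number[] {
  return lines.map(line => (maxWidth - measureWidth(line)) / 2);
}
```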
The node.js backend handles all of the message creation and hosts the HTML/JS. I hash the messages so they appear as gibberish in the query string when someone is sent a link; otherwise it would spoil the fun if the recipient could read the message before seeing it play back!
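I don’t have the original backend code to hand, but the idea boils down to something like this sketch using node’s built-in crypto module (the store and function names here are illustrative, not the real API):

```typescript
import { createHash } from "crypto";

// In-memory message store keyed by an opaque hash, so the link
// gives nothing away. (Names are illustrative, not the real backend.)
const messages = new Map<string, string>();

function saveMessage(text: string): string {
  // Hash the message (plus a timestamp so identical texts get unique ids).
  const id = createHash("sha1")
    .update(text + Date.now())
    .digest("hex")
    .slice(0, 12);
  messages.set(id, text);
  return id; // e.g. /watch?m=3f2a9c1b04de – gibberish in the query string
}

function loadMessage(id: string): string | undefined {
  return messages.get(id);
}
```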
We are using a Flash front-end, which uses ExternalInterface calls and callbacks to trigger and receive events from Socket.IO.
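Since Socket.IO lives in the page’s JavaScript, that means a thin glue layer between the socket and the SWF. Roughly like this (a TypeScript sketch; “onSocketEvent” and “sendToServer” are hypothetical names for whatever the SWF actually registers via ExternalInterface.addCallback):

```typescript
// Sketch of the JavaScript glue between Socket.IO and the SWF.
declare const io: { connect(url: string): SocketLike };
interface SocketLike {
  on(event: string, cb: (data: unknown) => void): void;
  emit(event: string, data: unknown): void;
}

// Callbacks registered from AS3 show up as methods on the embed element.
const swf = document.getElementById("app") as HTMLElement & {
  onSocketEvent?: (event: string, json: string) => void;
};

const socket = io.connect("http://localhost:8080");

// Server -> Flash: forward incoming Socket.IO events into the SWF.
socket.on("message", data => {
  swf.onSocketEvent?.("message", JSON.stringify(data));
});

// Flash -> server: AS3 calls ExternalInterface.call("sendToServer", ...).
(window as any).sendToServer = (event: string, json: string) => {
  socket.emit(event, JSON.parse(json));
};
```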
It’s about time I gave this a go. So with the help of a talented motion graphics guy and openFrameworks, we got a nice demo going on the LOVE mural. We want to make this a permanent installation, and add some interactivity later on.
At LOVE I spent the last few weeks developing an art installation for Salford University at MediaCityUK, where people can add their brush strokes to a huge digital canvas using their smartphones and tablets. This involved three massive screens, three rack-mounted Macs, a laptop, Socket.IO, node.js, a mobile web app and four AIR apps (using Stage3D & the Starling Framework).
The big screens were all running on separate machines, so I was tossing shapes about over the network, which made it even more exciting to build. I used Stage3D and the Starling Framework to leverage GPU-accelerated graphics, giving me a clean 60fps on a 1920×1080 viewport.
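The relay idea is simple: phones emit brush strokes to the node server, which rebroadcasts them to every screen machine. A minimal sketch using today’s socket.io API (v4; the original was a 2011-era version, and the event and field names here are made up):

```typescript
import { createServer } from "http";
import { Server } from "socket.io";

// Minimal relay: phones emit brush strokes, every other client gets them.
const httpServer = createServer();
const io = new Server(httpServer);

interface Stroke {
  x: number;      // normalised 0..1, so each screen maps it to its own viewport
  y: number;
  size: number;
  colour: number; // packed RGB
}

io.on("connection", socket => {
  socket.on("stroke", (stroke: Stroke) => {
    // Rebroadcast to all other connected clients (the screen machines).
    socket.broadcast.emit("stroke", stroke);
  });
});

httpServer.listen(8080);
```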
– Create your scene in Cinema4D
– Make sure all of the normals are correct (to prevent having to double-side everything)
– Select all of the objects to bake
– Replace items after bake
– Export as *.3ds
– Open Blender
– Import *.3ds
– Export as Wavefront (.obj)
– Get AlteredQualia’s convert_obj_three.py script
– Make sure Python is installed
– Run python convert_obj_three.py -i myscene.obj -o myscene.js
– Make sure your texture is in the same folder as myscene.js when you load it in THREE.js (see the loading sketch after this list).
(Still to work out: how to import full scenes (with lights & cameras), and use the scene loader in THREE.js)
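For the final step, loading the converted file looks roughly like this. JSONLoader’s API changed a lot between THREE.js releases, so treat this as a sketch rather than gospel:

```typescript
// Loading the converted model with THREE.js's (legacy) JSONLoader.
declare const THREE: any;
declare const scene: any; // assumes a THREE.Scene already exists

const loader = new THREE.JSONLoader();
loader.load("myscene.js", (geometry: any, materials: any[]) => {
  // Texture paths inside myscene.js resolve relative to the model file,
  // which is why the texture needs to sit in the same folder.
  const mesh = new THREE.Mesh(geometry, materials && materials[0]);
  scene.add(mesh);
});
```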
Baking in Cinema4D
– Make sure all lights have ‘ambient illumination’ checked (maybe ‘show illumination’ too).
– Select the objects to bake.
– Select Render > Bake Object.
– Check ‘Illumination’ and ‘Single Texture’ (also ‘Replace Objects’ for the final version).
– Apply the baked JPEG/PNG in Cinema4D as the colour texture of the material, not the luminosity.
Import to Away3D 4 (Dark Texture Issue – Solution)
– Open the image in Photoshop and re-save it, then import the texture. This fixes the issue.
Here’s a little something we put together for SCEE this Christmas. They have been having a free gift giveaway. No catch! Pure free gifty goodness! There were some amazing prizes to win, and a big juicy present to throw about to give you a clue to what’s inside!
Hurry, it ends in two days!!
Oh well! It’s all finished now. According to our stats, 311,377 people unwrapped 474,436 layers!! Not bad going.
After seeing Surya Buchwald (http://www.mmmlabs.com) demoing Ableton Live to Flash at Flash on the Beach 2011, I had to find out how it was done. So I got in touch and was led to a link to download their ‘Music Event Description System’. It is a ‘Max for Live’ add-on that enables Ableton Live to output OSC messages. I then receive this data in Flosc, which converts the OSC messages from UDP to XML so the data can be used in AS3 via an XMLSocket. The rest is magic, and a couple of MovieClips 🙂
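Flosc speaks Flash’s XMLSocket protocol, where each XML-encoded OSC packet arrives over TCP terminated by a null byte. The real project consumed this in AS3, but the receiving end looks roughly like this node sketch (the port number is an assumption):

```typescript
import { connect } from "net";

// Connect to Flosc's TCP side and split the stream on null bytes,
// as the XMLSocket protocol requires.
const socket = connect({ host: "localhost", port: 3000 });

let buffer = "";
socket.on("data", chunk => {
  buffer += chunk.toString("utf8");
  let end: number;
  while ((end = buffer.indexOf("\0")) !== -1) {
    const xml = buffer.slice(0, end); // one complete OSC packet as XML
    buffer = buffer.slice(end + 1);
    handleOscXml(xml);
  }
});

function handleOscXml(xml: string): void {
  // Parse out the OSC address/arguments here and drive the visuals.
  console.log("OSC message:", xml);
}
```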
This installation was set up by DOROTHY and LOVE to mark what would have been John Lennon’s 70th birthday and the 30th anniversary of his death in 1980. It was then screened on Ocean’s media screen outside Liverpool Lime Street Station. It uses software similar to our Two Fingers to War installation, with the additional challenge of capturing the content to be displayed on the big screen.
Here’s the result from one Sunday. I haven’t had much time recently, so I’ve bundled a load of different things I’ve been wanting to experiment with into one. It’s slightly random, but was fun to make.
To put it simply, this is a motor which is being controlled remotely by my phone over the Internet.
After hours of decoding and cross-referencing hexadecimal numbers against an ASCII table, and studying the LED board’s 20-odd-page communication protocol – I was able to create a Processing library to communicate with our LED board over RS-232. Sound easy? You should have seen the document! Working that out was the hardest part. Once I got past this bit, I used a Java Twitter library to retrieve the latest tweets to send to the LED board.
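I can’t reproduce the board’s actual protocol here (that’s what the 20-odd pages were for), but these serial protocols tend to share a shape: framing bytes, an ASCII payload, and a checksum. A purely hypothetical illustration of building such a frame – every constant below is made up, and the real library did this in Processing/Java:

```typescript
// Hypothetical RS-232 frame: [STX, address, payload..., checksum, ETX].
const STX = 0x02; // start-of-frame marker (illustrative)
const ETX = 0x03; // end-of-frame marker (illustrative)

function buildFrame(address: number, text: string): Buffer {
  const payload = Buffer.from(text, "ascii");
  // Simple additive checksum over address + payload (illustrative only).
  let checksum = address;
  for (const byte of payload) checksum = (checksum + byte) & 0xff;
  return Buffer.concat([
    Buffer.from([STX, address]),
    payload,
    Buffer.from([checksum, ETX]),
  ]);
}

// The frame would then be written to the serial port at the board's baud rate.
console.log(buildFrame(0x01, "HELLO TWITTER").toString("hex"));
```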
We streamed at 20 fps so you can see the text scrolling smoothly. Oh, and it can do crazy blinking and sparkling transitions, but the messages are usually too long for them to work!