Scott Leggo onstage at TEDxCanberra 2012 (Photo credit: Adam Thomas)
When thinking about the staging and projection technology for TEDxCanberra 2012, we knew we had a much more impressive theatre and stage thanks to the move from the National Library of Australia (capacity 300) to the Canberra Theatre Centre (capacity 600). Rather than working on a small lecture-theatre-style stage (raised about 20cm in front of 300 seats on a gentle incline), we would be in a ‘proper theatre’ with a much larger stage space (about 10m wide by 8m deep, raised 120cm, facing about 600 seats in a mix of stalls and galleries).
Interior of the National Library theatre in 2010 (Photo credit: TEDxCanberra)
Interior of the Canberra Theatre (Photo credit: Paul Hagon)
The other big change was the interior height of the new venue. Combined with the full theatre rigging and lighting system, it meant there would be opportunities to hang things above the stage, and probably to move them in and out during the show if we needed to.
Given that we had only a small budget, but lots of time – what would we do? Should we hang lots of interesting stuff in the air above the speakers? Bicycles, bookshelves, patterns & shapes? We had the option to do something like this – but I was also hoping to avoid buying, storing, moving, installing and removing lots of material. On top of that, while we are a creative bunch, our team doesn't have any artists or sculptors who might have produced something with a look to fit our theme for the year – ‘Optimistic Challenge’.
About six months before our event, we saw the impressive staging at the TEDx Organisers event in Doha, and I immediately thought this was an approach we could scale down and use at TEDxCanberra. It appealed for a couple of reasons:
- we would not need to buy, store, move or install lots of stage decoration
- while we don’t have any physical artists in the team, we do have a number of very talented digital designers who could produce media
- we are pretty comfortable using computers and learning new software
To scale things down to something we could manage, we decided to go with a ‘double wide’ rectangular screen. This wouldn't be as visually impressive as the screen used in Doha, but it would be quite different from the typical screens used at events, and that would be enough to make it special and interesting.
So I started reading a lot about projector systems, beginning with the terms ‘edge blending’ and ‘Hippotizer’ – the brand of ‘media server’ used at the Doha event. It's worth noting at this point that while we do have a budget for staging the show, we aim to do as much as we can through donations and volunteering rather than bringing in paid professionals. A couple of quick calls later, it was starting to look like renting a professional media server would be way out of our budget – not just for the days of the show, but for the weeks beforehand that we would want it to build the show and learn how to operate the system.
The next step was to look at software that could run on a PC of our own. Like the Doha example, we wanted a system that could incorporate video of the person speaking while they were on stage, along with a slide deck (PowerPoint/Keynote) and some nice visual theming. Rather than using physical props and stage decorations, this would be the space we used to dress up the stage and produce a great-looking show.
First steps – what is edge blending?
Edge blending is when you use two or more projectors to create a bigger projected image with no visible ‘joins’. This lets you project onto screens that are different from the typical 4:3 or 16:9 shapes, and lets you put high resolution images onto large screens. Edge blending two HD projectors would, I thought at first, give a canvas size of 3840×1080 (2×1920). Once the edge blending overlap is taken into account, this reduces to something like 3440×1080.
Edge blending is interesting – once it's working, the media server computer outputs two video signals, each 1920×1080, but when they are projected onto the screen the two images overlap. If they overlap by 400px, the final canvas size is 3440×1080 – 400px narrower than it would have been if you just put the two projected images end to end.
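The arithmetic is simple enough to sketch. Here it is as a few lines of Processing, using the example figures from above:

```processing
// Effective canvas width after edge blending: each seam between
// adjacent projectors 'eats' one overlap's worth of pixels.
int blendedWidth(int projectorWidth, int projectorCount, int overlapPx) {
  return projectorWidth * projectorCount - overlapPx * (projectorCount - 1);
}

void setup() {
  // Two 1920x1080 projectors overlapping by 400px at the single seam:
  println(blendedWidth(1920, 2, 400));  // 3440 -> a 3440x1080 canvas
  // With no overlap you would get the naive end-to-end figure:
  println(blendedWidth(1920, 2, 0));    // 3840
}
```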
Hunting around for some suitable media server software, I was looking for something with edge blending in the feature set and found Arkaos Grand VJ. We used this right up until two weeks out from the show, when we realised it wouldn't do one of the things we needed – but more on that later. Grand VJ allows a number of media files and video sources to be mixed together – including footage from webcams or, in our case, video capture cards.
My first plan was to use two Blackmagic Intensity Pro cards installed in the PC. Each of these can take an HDMI input, which could be overlaid on a nice-looking background and projected onto our big screen. Software like this would allow us to create ‘scenes’ that we could switch between for different parts of the program. For example, during a presentation we would show a video feed of the speaker on one part of the screen, and the slide deck on the other.
For video playback, we could place the image in the middle of the screen. Each presenter scene could also be preloaded with the speaker's name, their slide deck and maybe a photo.
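To make the ‘scenes’ idea concrete, here is a toy Processing mock-up – the coordinates and labels are invented for illustration, not taken from our actual show setup:

```processing
// Toy mock-up of two 'scenes' on the double-wide canvas, at quarter scale.
// Press any key to flip between the presenter scene and the playback scene.
boolean presenterScene = true;

void setup() {
  size(860, 270);  // quarter-scale stand-in for a 3440x1080 canvas
}

void draw() {
  background(20);
  stroke(255);
  noFill();
  if (presenterScene) {
    rect(40, 40, 300, 190);   // live camera feed of the speaker
    rect(520, 40, 300, 190);  // the speaker's slide deck
    fill(255);
    text("speaker name + event logo", 360, 150);  // preloaded overlay
  } else {
    rect(280, 35, 300, 200);  // TED talk video, centred for playback
  }
}

void keyPressed() {
  presenterScene = !presenterScene;  // operators would switch scenes like this
}
```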
Here is the first in the series of videos that I made to show my team how the idea was progressing.
Feedback from the team was good – the concept was worth pursuing. You can see in this video that the Grand VJ software randomly inserts large DEMO text on the screen from time to time – other than this it's pretty much fully functional, which is nice.
The next step was to add the capability to take a video signal from a laptop carrying the speaker's slide deck, so that it could be incorporated into the projection. I ordered a Blackmagic Intensity Pro PCIe card from Videoguys to test that it would be compatible with my hackintosh and to do some further testing. The card worked fine, and I was able to take an HDMI signal from a couple of sources. Here is a test video showing the video signal from an Xbox 360.
The Intensity Pro card does a great job of capturing via HDMI, but it is very fussy about the incoming signal. It has to be configured to exactly match the source – 720p at 59.94Hz for the Xbox 360, 1080p at 30Hz for an iPad 3 and 1080i at 59.94Hz for a Canon 7D. If these settings aren't right, you don't get a video signal at all. I also demoed how we might display the TED talk videos that are part of the TEDx program.
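The exact-match requirement boils down to keeping a lookup like the one below handy – this is just a plain table for illustration, not any real Blackmagic API:

```processing
import java.util.HashMap;

// The Intensity Pro gives no picture at all unless its capture mode
// exactly matches the source. A plain lookup of the modes quoted above
// (illustration only - not a real Blackmagic API):
HashMap<String, String> captureModes = new HashMap<String, String>();

void setup() {
  captureModes.put("Xbox 360", "720p @ 59.94Hz");
  captureModes.put("iPad 3",   "1080p @ 30Hz");
  captureModes.put("Canon 7D", "1080i @ 59.94Hz");
  println("Set the card to: " + captureModes.get("Canon 7D"));
}
```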
A change of direction
The more time we spent with Grand VJ, the less confident we were about it. Whenever we closed the program it would give us an error message, and there were some actions that would reliably cause it to crash. The clincher was discovering that it couldn't make use of two Blackmagic cards simultaneously. We then considered a number of other programs and decided to go with VDMX, a very flexible media server and show control program. VDMX is a very open system and can take inputs from a number of places – not just the two Blackmagic cards, but also other software running on the same machine.
A design feature that we hoped to include in the show was a slowly moving animated background featuring the circle motif visible throughout our event's visual design. We used VDMX's capability to take input from another program to feed in the continuous circle animation generated in Processing (a programming environment for creating animations and visualisations).
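The original sketch isn't reproduced here, but a minimal stand-in for the idea looks something like the following – circles drifting slowly across a double-wide canvas. (On a Mac, a frame-sharing tool such as Syphon is the usual way to pipe a Processing sketch into VDMX, though I'm glossing over that wiring here.)

```processing
// A minimal stand-in for the slow circle animation: faint rings drifting
// across a double-wide canvas. VDMX picked up frames like these as a
// live input layer under everything else.
int N = 12;
float[] x = new float[N];
float[] y = new float[N];
float[] r = new float[N];
float[] dx = new float[N];

void setup() {
  size(860, 270);  // quarter-scale stand-in for the 3440x1080 canvas
  for (int i = 0; i < N; i++) {
    x[i] = random(width);
    y[i] = random(height);
    r[i] = random(20, 90);
    dx[i] = random(0.05, 0.3);  // very slow drift
  }
  noFill();
  stroke(255, 60);  // faint white rings
}

void draw() {
  background(160, 30, 30);  // deep red, echoing the lit cyclorama
  for (int i = 0; i < N; i++) {
    x[i] += dx[i];
    if (x[i] - r[i] > width) x[i] = -r[i];  // wrap around the right edge
    ellipse(x[i], y[i], r[i] * 2, r[i] * 2);
  }
}
```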
The VDMX settings used to produce the edge blending
The projected image was made up of a number of layers – anyone who has worked with Photoshop or similar will recognise the idea. An image is created by laying layers one on top of the other; most layers have transparent areas that allow the layers below to be seen. The slow circle animation was our lowest layer. On top of this we overlaid a transparent image with the name of the speaker and the event logo, then the speaker's slide deck, and finally the live video of the speaker on stage. VDMX allows for some sophisticated uses and preset configurations, and we made use of its edge blending capabilities. We were hoping to do more with presets and a control interface (possibly OSC), but time was not on our side. We had also hoped to do some very impressive things with Processing – realtime data visualisation based on attendees and other ‘on the day’ inputs – but we will need to try that next time.
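Expressed as a Processing draw loop, the layer stack looks roughly like this – placeholder shapes stand in for the real media sources, and the compositing runs bottom layer first:

```processing
PGraphics lowerThird;  // pre-rendered transparent overlay (name + logo)

void setup() {
  size(860, 270);  // quarter-scale stand-in for the 3440x1080 canvas
  lowerThird = createGraphics(860, 270);  // starts fully transparent
  lowerThird.beginDraw();
  lowerThird.fill(255);
  lowerThird.text("Speaker Name  |  TEDxCanberra", 330, 255);
  lowerThird.endDraw();
}

void draw() {
  // Layer 1 (bottom): the slow circle animation, filling the canvas.
  background(160, 30, 30);
  noFill();
  stroke(255, 60);
  ellipse(200, 140, 160, 160);

  // Layer 2: transparent overlay with the speaker's name and event logo.
  image(lowerThird, 0, 0);

  // Layer 3: the speaker's slide deck (placeholder rectangle).
  noStroke();
  fill(240);
  rect(520, 40, 300, 190);

  // Layer 4 (top): live video of the speaker (placeholder rectangle).
  fill(60);
  rect(40, 40, 300, 190);
}
```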
Brian Schmidt on stage at TEDxCanberra 2012 (photo: Adam Thomas)
Keeping with the no-budget theme, we decided to make our own screen. Our excellent roadie/technical operator constructed it from timber and MDF panels painted white (with a hint of grey, apparently). The screen was built in six sections that were bolted together during bump-in, the day before the show – these were then flown (hung) from the theatre's rigging system so that they floated in the air near the back of the stage. Behind this was the large white cyclorama regularly used at the theatre, which we lit with red lights for effect.
As mentioned above in the section on edge blending, the early thinking was that the screen should be double the width of a widescreen projection. Widescreen is normally 16:9, so we thought 32:9 would be the ratio of width to height. As it turns out, this isn't quite the case: when the two projector images overlap for edge blending, you lose some of the width. While we would be sending the projectors two 1920×1080 signals, the usable canvas would be something like 3440×1080 (roughly 3.2:1) rather than 3840×1080 (32:9). It can be a bit hard to get your head around – here is a diagram.
Diagram showing how the two projector images were blended together
The screen was great – and is something we hope to use again or lend to other events. The only thing I would try to improve next time is the straightness of the screen: it had a slight bow, which made it difficult to aim the projectors at. This could easily have been fixed by adding some straight reinforcing to the rear of the screen.
How it happened on the day
These ideas were all pretty good – but in the end we did something slightly different – though it achieved more or less the same look.
The advice from the company that hired the projectors to us was that there would be a delay of a few frames between the speaker on stage and the image of the speaker projected onto the screen. This is caused by each part of the signal chain – camera -> switching gear -> projector, and anything else in between – needing a few milliseconds to do its thing. A long delay can be noticeable and distracting for the audience.
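To put some rough numbers on it: at 59.94Hz one frame lasts about 16.7ms, so a few frames of delay adds up quickly. The per-stage frame counts below are illustrative guesses, not measurements:

```processing
// Rough latency budget for the live camera path.
// Per-stage frame counts are illustrative guesses, not measurements.
void setup() {
  float frameMs = 1000.0 / 59.94;   // one frame at 59.94Hz ~ 16.7ms
  int cameraFrames    = 1;          // camera processing
  int switcherFrames  = 1;          // switching gear
  int projectorFrames = 2;          // projector scaling/processing
  int total = cameraFrames + switcherFrames + projectorFrames;
  println(total + " frames ~ " + nf(total * frameMs, 1, 1) + " ms end to end");
}
```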
In testing, the Blackmagic card and processing through VDMX seemed to introduce a delay of a few frames – probably a little more than would have been the case with a dedicated hardware switching console. The projectors we hired had the capability to do picture-in-picture (PIP), so in the end we used this to put the footage of the speakers into the projection, and the Blackmagic cards weren't used for it. This added some extra complexity for our operators, as it meant switching the PIP on and off as required. If we had used VDMX, these changes could have been automated – but the frame delay might have been noticeable.
Using the PIP function on the projectors also meant there was one less thing relying on the media server computer, which was a positive. A single Blackmagic card was used to capture the speaker's slide deck from a MacBook Pro running Keynote, carried as an HDMI signal from the MacBook to the card. We used the monitor output on the Blackmagic card to feed an HDMI splitter, and from there the throwback screen on stage and the video streaming system. If you want to see the finished product, take a look at this talk (Brian is pretty great too).
Coming up in a future post – How we did the video shoot / live stream / edit