Not So Silent Night 2013

It’s that time of the year!  2013 is ending with a bang here at Calvary.  Lots of big events leading up to Christmas!

One of these December events is Oceans Edge School of Worship’s Not So Silent Night.  This is their annual Christmas concert and it’s the first big concert of the school year.  It’s the students’ time to apply what they have learned so far and push themselves.

For us on staff it’s a chance to try some new and cool things on the tech side.  We pretty much get free rein with all the toys we have at our disposal, ha ha.  This year I got to try something I’ve been wanting to do for a long time: using a projector as a moving light.

We happened to have a spare projector that’s pretty bright, 17,000 lumens!  We positioned it upstage and shot it forward over the stage and over most of the audience.  With lots of haze in the air the results were some pretty awesome beams that could look very laser-like when we wanted that effect.

Since it was a projector (and not a gobo in a light) we could put out any content we wanted.  Anything white on a black background looked like nice, sharp beams cutting through the air.  The results were really cool and couldn’t be done with a moving light or even a laser.  It also let us project words that were actually readable in the air.

The projector provided some very high-tech looks.  To contrast that we used a lot more conventionals than we normally do.  We had a flown row and a ground-based row of pars which we used as audience blinders.  We also had 10 lekos with gobos spread throughout the room to put out some nice, warm beams.

Between the usual intelligents we have, the projector, and all the additional conventional lights, we had lots of options.  I like to create variety to keep things from getting repetitive.  Even if something is only used on one song, I think it’s worth the effort.

We also had fun with other effects: lots of haze (of course), plus low-lying fog from a pair of Martin Glaciators and confetti cannons.  Our rigging system had a lot of cues, bringing lights up and down for different looks, plus some flown backdrop elements the students created which flew in and out.

Another cool element was some “screens” made out of recycled pallet wood.  This gave a stylized look to all of the projections.  Not everything looked great on the texture, but once we found the right content it was a cool look.  We purposely left some gaps between the panels so that a light could be put behind the screens and shine through.  This created beams through the cracks and gave us more options.

Once again we made Ableton Live our master control for all click, loops, ProPresenter lyrics, lighting, and projection.  One machine ran Ableton, another ran the two side screens with ProPresenter, a third computer ran QLab for the stage projection that faced the audience, and lastly our ETC Ion controlled the lights.

A combination of MIDI notes and MIDI Show Control was used for all of these commands.  Some devices were hardwired and some were sent commands over our WiFi network using Apple’s built-in MIDI over Ethernet.  An iPad running QLab’s remote app was our backup control for the projector on stage.  Everything worked great!
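If you’re curious what one of those MIDI Show Control messages actually looks like on the wire, here’s a rough Python sketch using the mido library.  This isn’t what we ran for the show, just an illustration; the port name and device ID are placeholders, so swap in whatever your console is set to listen for.

```python
# A minimal sketch, assuming the console listens for MSC on device ID 1
# and that a MIDI interface named "USB MIDI Interface" is connected.
# (Both names are assumptions -- check your own setup.)
import mido

MSC_DEVICE_ID = 0x01          # console's MSC device ID (assumption)
CMD_FORMAT_LIGHTING = 0x01    # MSC command format: Lighting (General)
CMD_GO = 0x01                 # MSC command: GO

def msc_go(cue_number: str) -> mido.Message:
    """Build an MSC GO message for the given cue number (e.g. '12' or '3.5')."""
    payload = [0x7F, MSC_DEVICE_ID, 0x02, CMD_FORMAT_LIGHTING, CMD_GO]
    payload += [ord(c) for c in cue_number]     # cue number is sent as ASCII
    return mido.Message('sysex', data=payload)  # mido adds the F0/F7 framing

with mido.open_output('USB MIDI Interface') as port:  # assumed port name
    port.send(msc_go('12'))                            # fire lighting cue 12
```

The same message can go out a hardwired interface or a network MIDI port; only the port name changes.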

This seems like a lot of extra work, but in the end it’s actually less work.  Start one machine and several follow in perfect harmony.  It also let us hit cues that just aren’t practical when everything is human controlled.  We had lots of accents on specific notes that hit perfectly every time with this system.

We had two sold-out nights in our theatre, and it was a lot of work but a lot of fun.  Now it’s time to get ready for our main Christmas outreach service in the BB&T Center!  Load-in starts tomorrow; I’ll be sure to post some info on that as well!

Check out Oceans Edge’s Instagram for some more pictures.  I’ll post some more pictures and videos as soon as I can.

TouchOSC App For Video Control

I’ve been looking into MIDI controllers lately for controlling lights and video.  I didn’t really want to purchase one, though, before I was sure it would do what I want and be worth the money.  I searched for iPad apps that offer MIDI control and found a really cool one called TouchOSC.  It’s only $5 and so far does everything I want it to do.

I put together a short video showing how to set it up and control ProVideoPlayer as an example.  My goal is to get it to control the Green Hippo media servers.  Once I get that working I’ll post another video.  Enjoy!
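In the meantime, if you want to see exactly which MIDI messages TouchOSC is sending before you start mapping things in ProVideoPlayer, a few lines of Python with the mido library will print every incoming message.  The port name below is an assumption (the TouchOSC Bridge app normally shows up as its own MIDI port), so list your ports first.

```python
# Quick way to inspect what TouchOSC sends when you tap a control,
# before mapping it in ProVideoPlayer. The port name is an assumption;
# run get_input_names() first to find the real one on your machine.
import mido

print(mido.get_input_names())   # find the exact name of the TouchOSC port

with mido.open_input('TouchOSC Bridge') as port:  # assumed port name
    for msg in port:            # blocks, printing each incoming message
        print(msg)              # e.g. note_on channel=0 note=60 velocity=127
```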

 

Ableton Live Controlling Lighting, Video, and Lyrics

Hey guys, we just got done with a week of rehearsals and shows for Oceans Edge’s Not So Silent Night.  Everything went great!  In this show we tried out some new ideas that we hadn’t done in a show before.  The biggest one was some pretty heavy automation thanks to Ableton Live and MIDI.

We ended up with Ableton Live sending MIDI commands to our lighting console for lighting cues, to another machine running ProVideoPlayer for videos on our stage screen, and to yet another machine running ProPresenter for lyrics, which acted as the master for two more machines running ProPresenter in slave mode connected to our side screens.  We didn’t have video cabling to those areas, so we connected to them wirelessly.

In the end Ableton Live on one machine was triggering a grand total of five other machines running different programs and performing different tasks, all through MIDI and MIDI Show Control.  Pretty cool stuff!  This gave us the precision of automated cues, but unlike timecode we could easily change the order of cues, repeat cues, skip cues, or change the tempo, all things that timecode is too rigid to do well and simply.
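Ableton handles all of that routing internally with its own MIDI tracks, but if it helps to picture the fan-out, here’s a rough Python sketch of the same idea using mido.  Every port name and note number here is made up for illustration; the real mappings live in each program’s MIDI settings.

```python
# Rough stand-in for the fan-out Ableton was doing: one set of cues,
# three destinations. Port names and note numbers are assumptions --
# substitute a hardwired interface for the console and network
# sessions for the other Macs.
import mido

lighting = mido.open_output('USB MIDI Interface')  # hardwired to the console
video    = mido.open_output('Network Session 1')   # Mac running ProVideoPlayer
lyrics   = mido.open_output('Network Session 2')   # Mac running ProPresenter

# One "cue" in the song: advance lyrics, roll a video, fire a lighting look.
lyrics.send(mido.Message('note_on', note=60, velocity=127))   # mapped to "next slide"
video.send(mido.Message('note_on', note=36, velocity=127))    # mapped to a PVP clip
lighting.send(mido.Message('note_on', note=1, velocity=127))  # or an MSC GO instead

for port in (lighting, video, lyrics):
    port.close()
```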

This involved some testing and extra work on the front end, but it resulted in a better show that was very easy to run.  We only ended up with about 100 lighting cues, and only about 5-10 of those were manually triggered.  If Ableton Live wasn’t triggering most of the lighting, it would have been at least 175-200 cues.  This is because we used Ableton Live to repeat cues (for easier programming) and to fire presets saved to our submasters, which could then be used as individual lighting cues or looks.

Just like you can hit the bump buttons to make the submasters go, Ableton Live can do the same thing through MIDI Show Control commands.  So one song that would have been 50-100 cues was simply 23 presets triggered remotely in different arrangements.  This even allowed us to divide up the programming between several people.  I was able to focus on lighting looks and programming the lighting console while other people carefully placed cues into Ableton Live to trigger the lights.
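For the curious, the MSC spec has a “Set” command that carries a control number and a value, which is one way a remote bump like this can be sent.  Here’s a rough Python sketch with mido; how (or whether) those control numbers map to submasters is a console-side show control setting, so treat the numbers as placeholders rather than anything ETC-specific.

```python
# A minimal sketch of the MSC "SET" command, which carries a 14-bit
# control number and a 14-bit value. The mapping of control numbers to
# submasters is configured on the console, so these numbers are placeholders.
import mido

def msc_set(control: int, value: int, device_id: int = 0x01) -> mido.Message:
    """Build an MSC SET message (command 0x06, Lighting format)."""
    data = [0x7F, device_id, 0x02, 0x01, 0x06,
            control & 0x7F, (control >> 7) & 0x7F,  # control number, LSB first
            value & 0x7F, (value >> 7) & 0x7F]      # control value, LSB first
    return mido.Message('sysex', data=data)

with mido.open_output('USB MIDI Interface') as port:  # assumed port name
    port.send(msc_set(control=1, value=16383))  # full (16383 = 14-bit max)
    port.send(msc_set(control=1, value=0))      # back out
```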

Connectivity was pretty simple as well.  In fact, only the lighting console itself had a physical MIDI cable plugged into it.  The rest of the machines received MIDI commands over the network using Apple’s Audio MIDI Setup, which is built into the OS.  We have used this a lot and it has proved to be very reliable, provided that you have a good network connection and not a lot of network congestion.  We created our own network just for these machines to make sure everything worked as fast as possible.  Everything in the lighting booth was hardwired together, and the two remote machines connected over the wireless-N WiFi network.  This worked very well.

My buddy Will Doggett and I put together this short video where he walks through the setup.  Ignore the messy lighting booth, ha ha.

ProPresenter Communications Module

Renewed Vision, the makers of ProPresenter, announced a new add-on communications module.  It adds Art-Net support for DMX over Ethernet, all kinds of video switcher protocols so playback can be controlled from a switcher, and, last but not least, MIDI control over the network.
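Of those features, the Art-Net one is the easiest to picture on the wire.  ProPresenter takes care of the output itself, but just to show how simple the protocol underneath is, here’s a rough Python sketch that builds a single ArtDmx packet and sends it over UDP.  The node IP and universe are made-up values for illustration.

```python
# A minimal sketch of the Art-Net side: one ArtDmx packet pushing DMX
# levels onto the network. Target IP and universe are assumptions.
import socket
import struct

def artdmx_packet(universe: int, levels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet (Art-Net's DMX-over-Ethernet opcode)."""
    header = b"Art-Net\x00"
    header += struct.pack('<H', 0x5000)       # OpCode: ArtDmx (little-endian)
    header += struct.pack('>H', 14)           # protocol version 14
    header += bytes([sequence, 0])            # sequence, physical port
    header += struct.pack('<H', universe)     # SubUni + Net (little-endian)
    header += struct.pack('>H', len(levels))  # DMX data length (big-endian)
    return header + levels

levels = bytes([255] + [0] * 511)             # channel 1 full, rest out
packet = artdmx_packet(universe=0, levels=levels)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ('10.0.0.50', 6454))      # assumed node IP; 6454 is Art-Net's port
sock.close()
```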

The MIDI one caught my eye.  That’s how we have been controlling ProVideoPlayer for video playback from Ableton Live during nights of worship.  My friend Will Doggett from LoopsInWorship.com put together a short video and some resources for MIDI from Ableton, check it out!

Ableton Live Lighting Control

So today we got some time to experiment with Ableton Live controlling our lighting console.  Basically the goal was to find a way to have Ableton control the lighting cues and trigger everything.  This way we have everything in sync and it’s all automated.

Today was time well spent.  We figured out all kinds of cool stuff that has really opened the door wide to all sorts of possibilities.  Basically we can now have Ableton be the center of control for the entire show: lights, media servers, lyrics, anything that can see MIDI!

The best part is that since it’s through Ableton, we still have the creative freedom to change tempos or repeat sections and everything will follow along.  So everything can be programmed but we can still be flexible, pretty rad!  This is a lot better than the other way around, where Ableton chases another source and the performer can’t change anything on the fly, locked into a timecode that can’t speed up, slow down, or jump around.

For now here are just a couple of teaser videos.  Ableton was on one laptop outputting MIDI commands to our ETC Ion console.  The Ion was then outputting DMX over Art-Net to another laptop running Light Converse visualization software.  We’ll make another video explaining things in more detail at some point.  For now this shows a couple of lighting looks mapped to MIDI notes that could be played live or played from the timeline.  You can just imagine where this could lead with some more time to plan out looks!
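The Ion can map incoming MIDI notes to cues on its own, so none of this needed extra code, but if you want to experiment with the idea without a console in the room, here’s a rough Python sketch of the same thing as a little note-to-cue translator.  The port names and the note-to-cue table are placeholders.

```python
# Listens for note_on messages (e.g. from Ableton over a network MIDI
# session) and translates them into MSC GO messages for a console.
# Port names and the NOTE_TO_CUE table are assumptions for illustration.
import mido

NOTE_TO_CUE = {60: '1', 62: '2', 64: '3.5'}   # assumed mapping: note -> cue

def msc_go(cue: str, device_id: int = 0x01) -> mido.Message:
    data = [0x7F, device_id, 0x02, 0x01, 0x01] + [ord(c) for c in cue]
    return mido.Message('sysex', data=data)

with mido.open_input('Network Session 1') as ableton, \
     mido.open_output('USB MIDI Interface') as console:
    for msg in ableton:
        if msg.type == 'note_on' and msg.velocity > 0 and msg.note in NOTE_TO_CUE:
            console.send(msc_go(NOTE_TO_CUE[msg.note]))
```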

QLab for Audio, Lighting, and Video Control

So if you saw the other post about QLab you’ll know that we love it for track editing and playback of audio for shows. Recently we were playing around with some of the other features.

We were trying different options for controlling our lighting console from Ableton Live. We managed to get Ableton to output MIDI timecode (through an intermediary program) and have our ETC Ion console chase it. I’ll make another post about how we got that to work.

After some playing around with that setup we opened up QLab. QLab has native support for all kinds of timing and MIDI options. Using QLab we were able to simultaneously send timecode to Ableton to track and play audio, and send MIDI Show Control to the lighting console to “go” on the cues. Basically we found that it would be pretty easy to set QLab up as the center of control for everything. Hit one “go” button and trigger Ableton, lighting, the built-in audio playback, built-in video playback, just about anything!
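As a side note, newer versions of QLab can also take that “go” remotely over OSC (it listens on port 53000 by default), which makes it easy to script. Here’s a rough Python sketch using the python-osc library; the IP address is a placeholder for whatever machine is running the workspace.

```python
# Remotely pressing QLab's GO button over OSC. Assumes a QLab version
# with OSC control enabled and its default port 53000; the IP is a placeholder.
from pythonosc.udp_client import SimpleUDPClient

qlab = SimpleUDPClient('10.0.0.20', 53000)  # assumed QLab machine IP
qlab.send_message('/go', [])                # same as pressing GO
qlab.send_message('/cue/10/start', [])      # or start a specific cue number
```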

Mainly we were testing synced playback from Ableton, which is actually pretty easy. Either have the lighting console and Ableton both chase timecode, or have Ableton chase timecode and the lighting console follow MIDI Show Control. Both options mean we can have our lighting cues precisely mapped out and repeatable, all with the touch of a button.

And since you can set up multiple devices, we were able to send MIDI timecode internally to Ableton and externally to our lighting console at the same time, at different timecodes if we wanted to. This means if we need to offset the timing to one or the other, it’s pretty simple. So if you programmed a bunch of cues to a certain timecode range, but then had to change it in Ableton for some reason, it wouldn’t be a big deal; just offset the times.

I can’t wait for the next show where we need this kind of precision. By linking Ableton directly to the lighting console, or controlling both Ableton and the lighting console from QLab, we’ll have all the control we need! One “go” button and everything will sync up perfectly, pretty cool!

For more info on all of these products check out the manufacturers’ sites.

Ableton Live

ETC ION

Figure 53’s QLab