Software Documentation


Timecode Documentation

Last updated: July 12, 2023

3 Synchronizing events to music in a production you control

Synchronizing events to music in a production you control is essentially an automated equivalent of the manual task of “press play on the soundtrack player and press start on the firing system at exactly the same time.”  The automated equivalent combines a timecode signal as one channel of the soundtrack with music as the other channel or channels, so the timecode signal and the music play in perfect synchronization whenever the soundtrack is played.  While the music channels are routed to speakers for people to hear, the timecode channel is routed into the firing system, which is configured to fire the script events according to the times in the timecode.


Creating a soundtrack with timecode

The mechanics of synchronizing events involve three resources: the script, the timecode recording, and the music recording.  All synchronization is based on the relationship between these resources.

The script file contains a list of events, or triggers, at specific times.  Imagine an event as, “Trigger module 1, pin 1 at 00:04:01:00 (0 hours, 4 minutes, 1 second, 0 frames)”.  The timecode is an audio signal that hardware devices like firing systems or your computer can decode into a sequence of times, represented as HH:MM:SS:FF and streaming in sequentially at some frame rate.  Imagine timecode as a fast-talking clock reader, barking out: “0 hours, 0 minutes, 0 seconds, 0 frames, <short pause>, 0 hours, 0 minutes, 0 seconds, 1 frame, <short pause>, 0 hours, 0 minutes, 0 seconds, 2 frames, etc.”  Lastly, the music is the third resource, and it is just a regular digital song.
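
To make the time arithmetic concrete, the short sketch below (a standalone illustration, not Finale 3D’s internal representation, and assuming a 25 fps frame rate) shows how an HH:MM:SS:FF timecode value corresponds to a plain number of seconds:

```python
# Minimal sketch: mapping HH:MM:SS:FF to seconds.  The 25 fps frame rate is
# an assumption for illustration; timecode is also commonly generated at
# 24, 29.97, or 30 frames per second.

FRAME_RATE = 25  # frames per second (assumed)

def timecode_to_seconds(hh: int, mm: int, ss: int, ff: int) -> float:
    """Convert an HH:MM:SS:FF timecode value to decimal seconds."""
    return hh * 3600 + mm * 60 + ss + ff / FRAME_RATE

# The example event above: "Trigger module 1, pin 1 at 00:04:01:00"
print(timecode_to_seconds(0, 4, 1, 0))   # 241.0 seconds into the show
```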

Combining a timecode signal on one channel of the soundtrack with music on the other channels creates a correlation between the HH:MM:SS:FF times in the timecode and points in the music.  The timecode for the beginning of the song could begin at 00:00:00:00, or it could begin at some other timecode time, such as 07:00:00:00.  Whatever the timecode range is on the soundtrack, that is the range that correlates with the music.
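
As a hypothetical illustration of that correlation (again assuming a 25 fps frame rate, and using the 07:00:00:00 example above), the position in the music is simply the incoming timecode time minus the timecode time at which the soundtrack begins:

```python
# Sketch of the correlation between the timecode range on the soundtrack and
# positions in the music, for a soundtrack whose timecode begins at 07:00:00:00.

FRAME_RATE = 25   # assumed frame rate

def tc_seconds(hh, mm, ss, ff):
    return hh * 3600 + mm * 60 + ss + ff / FRAME_RATE

TIMECODE_START = tc_seconds(7, 0, 0, 0)          # soundtrack timecode begins here

def music_position(incoming_tc_seconds):
    """Seconds into the music corresponding to an incoming timecode time."""
    return incoming_tc_seconds - TIMECODE_START

print(music_position(tc_seconds(7, 4, 1, 0)))    # 241.0 seconds into the music
```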

When a firing system is configured to operate by timecode, it plays the script events whose times correspond to the timecode signal streaming in.  Some older firing systems require the script event times to match the timecode times exactly, but most firing systems today have an internal clock that locks onto the incoming stream of timecode times and progresses along with it continuously.

If the timecode signal drops out, the firing system typically stops after losing the lock; if the timecode jumps discontinuously, the firing system typically recognizes the jump after a short delay and jumps along with it.  Thus, by playing the script events using its internal clock, the firing system keeps pace with the incoming timecode in speed of play, starting, stopping, and jumping, yet is resilient to minor glitches in the timecode signal and doesn’t skip script events that fall in between the times in the timecode signal.
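
The sketch below is a conceptual model of this behavior, not the firmware of any particular firing system; the lock, jump, and dropout thresholds are illustrative assumptions:

```python
# Conceptual model of a firing system chasing incoming timecode with an
# internal clock.  Threshold values are illustrative assumptions.

JUMP_THRESHOLD = 0.5     # seconds of disagreement treated as a jump, not jitter
DROPOUT_TIMEOUT = 1.0    # seconds without timecode before giving up the lock

class TimecodeChaser:
    def __init__(self):
        self.clock = None                  # internal show clock, in seconds
        self.last_frame_wallclock = None   # when the last timecode frame arrived

    def on_timecode_frame(self, tc_seconds, wallclock):
        """Called each time a timecode frame is decoded from the audio signal."""
        if self.clock is None or abs(tc_seconds - self.clock) > JUMP_THRESHOLD:
            self.clock = tc_seconds        # first lock, or a discontinuous jump
        self.last_frame_wallclock = wallclock

    def tick(self, wallclock, dt):
        """Called continuously; advances the internal clock between frames."""
        if self.clock is None:
            return None                    # not locked; fire nothing
        if wallclock - self.last_frame_wallclock > DROPOUT_TIMEOUT:
            self.clock = None              # signal dropped out; stop
            return None
        self.clock += dt                   # free-run between timecode frames
        return self.clock                  # fire script events up to this time
```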

Whereas the timecode signal is represented as HH:MM:SS:FF in units of frames, the script events can be represented in decimal seconds, milliseconds, or any other time representation.  The representations of script events and timecode times don’t need to use the same syntax because they all correspond to real points in time on a timeline.
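
For example, a script event at 241.0 seconds and a timecode time of 00:04:01:00 (at an assumed 25 fps frame rate) are just two syntaxes for the same point on the timeline:

```python
# Sketch: converting a decimal-second script time to HH:MM:SS:FF notation,
# assuming a 25 fps frame rate.

FRAME_RATE = 25

def seconds_to_timecode(t: float) -> str:
    total_frames = round(t * FRAME_RATE)
    ff = total_frames % FRAME_RATE
    total_seconds = total_frames // FRAME_RATE
    hh, rem = divmod(total_seconds, 3600)
    mm, ss = divmod(rem, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(seconds_to_timecode(241.0))   # "00:04:01:00"
```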

Returning to the objective of synchronizing events to music in a production you control, you can use timecode as an automated equivalent to the manual task of “press play on the soundtrack player and press start on the firing system at exactly the same time” by doing the following:

1) Export your script from Finale 3D.

2) Export a soundtrack from Finale 3D with timecode on one of the channels.

3) Configure your firing system to play by timecode.

4) Route the timecode channel from your soundtrack playing device to your firing system, using a splitter for the audio cable, and route the other channel or channels to your speakers.

5) Play the soundtrack to initiate the show!


Adding pre-roll time

In real world productions, it is nice to have a delay at the beginning of the soundtrack before the music and pyro events kick in but during which the timecode is playing.  One reason for the delay is that firing systems require a brief period of time to lock onto the incoming timecode signal.  If your show began with an event at time 00:00:00:00, the firing system might skip it if it hasn’t locked onto the timecode signal yet.  Another reason for the delay is that it gives you a chance to confirm that the timecode signal is correctly routed to the firing system, and that the firing system is properly configured to receive it.

Most firing systems will display the timecode times as they are being received.  If you start playing the soundtrack and don’t see the timecode times displayed on the firing system, you know something is wrong.  Having a delay in the soundtrack before the music and events kick in gives you a chance to scramble and fix the problem without delaying the show.

How long a delay should you have?  10 seconds, 30 seconds, a minute, or even 10 minutes are all common in practice.  Bear in mind that if the show is set to begin at 9pm and you have a one minute delay, you need to start playing the soundtrack at exactly 8:59pm.
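
The start-time arithmetic is simple but worth getting exactly right; the snippet below just restates the 9pm example (the date shown is arbitrary):

```python
# With a one-minute pre-roll, the soundtrack must start playing one minute
# before the scheduled show time.
from datetime import datetime, timedelta

show_start = datetime(2024, 7, 4, 21, 0, 0)   # show scheduled for 9:00 pm (date is arbitrary)
pre_roll = timedelta(minutes=1)               # one-minute delay before the music begins

playback_start = show_start - pre_roll
print(playback_start.strftime("%I:%M %p"))    # 08:59 PM
```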

In Finale 3D, you can add a delay by unlocking the song file (“Music > Lock songs on timeline”) and dragging its dotted line on the timeline to the right, moving the song to start at the appropriate delay.  Add the same delay to the events to keep them in sync with the song.  Press control-A to select all the events, then do “Script > Time adjustments > Shift times” to move them over.  When you export the soundtrack with “File > Export > Export soundtrack…”, the exported timecode will begin at zero without the music, and later, after the delay, the music will begin.  The event times in the exported script will include the proper delay because they are at the proper positions on the timeline in coordination with the music.
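
The “Shift times” command performs this adjustment inside Finale 3D; the sketch below only illustrates the underlying arithmetic on a hypothetical list of event times in seconds, not Finale 3D’s script format:

```python
# Shifting every event later by the pre-roll delay keeps the events aligned
# with the music after the song has been moved on the timeline.

PRE_ROLL = 30.0   # seconds of pre-roll delay (assumed)

event_times = [0.0, 4.5, 12.25, 241.0]           # hypothetical event times in seconds
shifted = [t + PRE_ROLL for t in event_times]    # every event moves later by the delay

print(shifted)   # [30.0, 34.5, 42.25, 271.0]
```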

It is not necessary to set the “Firing system export offset” or “Timecode export offset” to add the delay when you’ve added the delay by shifting everything on the timeline.  The offsets are used for delays in other timecode workflows described in “Synchronizing events to music in a production controlled by others,” not for the delay at the beginning of a show controlled by you.


Adjusting for latency

Every firing system or timecode decoding device inherently has some internal latency for processing the timecode streaming into it.  Decoding hardware includes algorithms to compensate, or even overcompensate, for the latency.  Thus even if the timecode and the music in the exported soundtrack correspond exactly to the times of the pyro events in the exported script, the pyro events may be slightly late or early in reality, in comparison to the music.

To adjust for this latency, Finale 3D has an optional field in the “Show > Set show information…” dialog called “Firing system export offset”.  This field is also used for some timecode workflows described in “Synchronizing events to music in a production controlled by others,” but for shows controlled by you this field is used exclusively for making small adjustments, positive or negative, to the script times in order to make them align with the music in reality, taking into consideration all the latencies in the firing system and audio systems.

The best way to determine the firing system export offset is to run a test on your actual equipment with an LED or e-match on a firing system pin corresponding to a recognizable point in the music.  Start the soundtrack playing at least a few seconds before that point in the music, to give the firing system a chance to lock onto the timecode signal and settle in, and observe whether the LED or e-match fires exactly on time or a little early or late.  Choose a firing system export offset that provides the right degree of compensation.
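
As a hypothetical example of turning the test result into an offset value (the sign convention here is an assumption, so verify it with a second test on your own equipment):

```python
# If the test firing is measurably late relative to the music, a negative
# export offset shifts the script times earlier to compensate (and vice versa).

music_cue_time = 241.00       # seconds: the recognizable point in the music
observed_fire_time = 241.12   # seconds: when the LED / e-match actually fired

latency_error = observed_fire_time - music_cue_time   # +0.12 s late
export_offset = -latency_error                        # fire earlier by that amount

print(f"Suggested firing system export offset: {export_offset:+.2f} seconds")
```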