last updated 16 Mar 2008
NEIL SADWELKAR

Non-linear editing
for newcomers


If you're completely new to editing on a computer, this might serve as a crash course. I have assumed that you have some familiarity with computers.

Non-linear basics
This is for those who are familiar with editing but are new to non-linear editing, and a little fuzzy about what exactly it means to take clips and make sequences and projects.

First, a look into the near past.

Linear Editing
Supposing you had a tape in which you had shots in random order, like a typical "rush" tape. To make a meaningful film out of it, you would normally play the tape in a player, look at shots, select portions you want and then record (transfer) them to another tape in a recorder.  This manner of editing is called "linear" editing or "tape-to-tape" editing.

Linear editing was the method of choice until "non-linear" editing became possible and popular. One of the fundamental problems with "linear" editing was that after one finished the edit, if one had to remove even one shot, or add a shot to the edit, one had to re-edit everything from that point forward. There was no way to physically cut the tape to delete unwanted stuff or insert new stuff.

Here's how non-linear editing solves this problem ...

One first transfers all the shot material or "rushes" onto a computer hard disk. The hard disk is a "random access" device. Meaning, it can play your shots one after another in any order, regardless of the order they were in on your tape. So your rush tape remains as it is, and the shots play in the meaningful or edited order from the hard disk.
The process of "non-linear" editing doesn't actually cut the shots to the correct length or reorder them on the disk. It doesn't need to, because the disk is capable of playing one shot and then almost instantly switching to the next, and the next, and so on.

At any time, if you wish to "extend" a shot or "trim" another, you can do it and see the change instantly. This is also why "non-linear" editing is called "non-destructive" editing.

So, coming back to FCP. Here's how FCP does non-linear editing.

You first "capture" your shots from tape to hard disk. "Capture" is the transfer of your shots from tape to the hard disk that's sitting inside your Mac. You capture using the "Log and Capture" mode. Capture is also referred to as "grab" or digitise" or "dump to disk".

Now, as you capture a shot to the hard disk, a QuickTime movie file is created on your hard disk. You don't actually see this file in FCP. You see a "clip" in the browser. The "clip" is not the actual shot or QuickTime movie, but only a graphical representation of it, a sort of shortcut or alias.
The "clip" in your browser contains information about which tape the shot came from, what its start and end timecodes are, the audio and video format, and so on. This clip in the browser window takes up very little space on your hard disk.

But the "invisible" QuickTime movie on your hard disk is what contains the actual video and audio. And this is what takes up space. Over 1000 MB for every 5 mins. worth of DV tape.

So, if you delete the QuickTime movie on the hard disk (not while running FCP, please), the "clip" in the browser will play as "Media Offline". But because the clip contains all the info about where it came from, you can automatically recapture it from your original tape. So deleting QuickTime movies from the hard disk is the only way to make space on your hard disks. Don't bother deleting clips in the browser.

Conversely, if you delete a clip in the browser, the QuickTime movie it refers to will remain and hog disk space. This QuickTime movie is now an "orphan" file.

So, never delete clips without also deleting the QuickTime movies they refer to.

The process of non-linear editing in FCP consists of taking clips from the browser and placing them in the right order in the "timeline", woven together into a "sequence". 
Here too, when you take a clip and place it into the timeline, the original QuickTime movie on your hard disk is not moved. It stays exactly where it was. 

Even if you mark only part of a clip to be used in the timeline, the original movie doesn't get cut. Also, if you use a clip 10 times in a sequence, there's only one of it on your disk.

So all those little bars that represent shots in the timeline are only "pointers" to clips in the browser, which are in turn "pointers" to QuickTime movies sitting on your hard disk.

A sequence in FCP is only a "playing order list" of QuickTime movies.
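
In the same hypothetical Python terms as the clip sketch above, a sequence could be pictured like this (again, illustrative names only, not FCP's real data structures):

    # A minimal sketch of a sequence as a "playing order list".
    # Each entry says: play this clip, from this point to that point.
    sequence = [
        {"clip": "Shot 14 - wide",  "play_from": "01:12:08:00", "play_to": "01:12:11:15"},
        {"clip": "Shot 02 - close", "play_from": "00:04:20:10", "play_to": "00:04:23:00"},
        {"clip": "Shot 14 - wide",  "play_from": "01:12:15:00", "play_to": "01:12:18:05"},  # same clip again
    ]
    # "Shot 14 - wide" appears twice in the list, yet there is still only
    # one QuickTime movie for it on disk. The list just says where to play from.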

All these clips sitting inside bins, and sequences made of parts of clips, together make up an FCP "project". The project itself takes up very little space on disk. But it represents all the hours of effort you put into making your film. 
For example, you may capture 6 hrs. worth of footage that takes up 72000 MB or 72 GB of space on your hard disks, and edit it into a 35 min film with all kinds of effects, graphics and what have you. But your project might take up just 5 MB of space, because all it contains is numbers. That is, one or more "playing order lists".
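
The arithmetic behind those numbers, spelled out as a rough sketch using the "1 GB per 5 minutes of DV" rule of thumb mentioned above:

    # 6 hours of rushes, at roughly 1 GB (1000 MB) per 5 minutes of DV
    footage_minutes = 6 * 60               # 360 minutes of footage
    footage_gb      = footage_minutes / 5  # 360 / 5 = 72 GB of QuickTime movies
    project_mb      = 5                    # the project file: just a few MB of lists and numbers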

So after editing this masterpiece, if you want to start on another, just get rid of all the QuickTime movies. But keep the project, and every single bin and every single clip. And keep it for as long as you (or your disk) live.

Also, if you have a film half done, but need to start another, and need space for it, don't bother making EDLs or batch lists or anything else. Just keep your project and get rid of all the QuickTime movies. Later, when you'd like to come back to this half-done job, all you need to do is recapture the "off-line" media from the clips. And all your sequences become "alive" and "online".

 

 

What exactly is a render? Rendering?

In a non-linear system, the video material (shots, clips, whatever) exists as files on a hard disk. One can edit these around and keep playing back the edit from anywhere to anywhere.

But suppose one were to add a dissolve between two shots, shot 1 and shot 2. Now, the editing system has to play back shot 1 till the dissolve starts, then create the dissolve and play it, and then play back shot 2 from the dissolve onwards. Most edit systems (FCP included) can't create this dissolve and play it back "on the fly". So they need to render it. Meaning they need to compute this dissolve and save it as a separate file.

This file is called a "render" file, and the creation of a render file is called "rendering".

After rendering, the edit system can then play back shot 1 till the dissolve, then the dissolve itself, then shot 2 after the dissolve, in a seamless fashion, so the user gets the illusion of one shot dissolving into another.
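
For the curious, a dissolve is conceptually just a weighted mix of the two shots, worked out frame by frame and saved. Here is a much-simplified Python sketch of the idea; it is not FCP's actual render code, and it pretends a frame is just a list of pixel values.

    # Very simplified picture of rendering a dissolve. Real systems work on
    # compressed video and do far more, but the idea is the same.
    def blend(frame_a, frame_b, mix):
        # weighted average of two frames: mix = 0.0 is all of frame A, 1.0 is all of frame B
        return [(1 - mix) * a + mix * b for a, b in zip(frame_a, frame_b)]

    def render_dissolve(shot1_frames, shot2_frames):
        # compute every frame of the overlap; saving this list to disk is,
        # in essence, what a "render file" contains
        total = len(shot1_frames)
        rendered = []
        for i in range(total):
            mix = i / float(total - 1) if total > 1 else 1.0
            rendered.append(blend(shot1_frames[i], shot2_frames[i], mix))
        return rendered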

Similarly, other effects like colour correction, resize, crop, wipe and push also need to be rendered by most systems before you can see them.

This rendering takes time, depending on the kind of system you're using. Some systems use the main system processor to render. Others have special hardware cards that do the rendering. Software (like FCP) which uses the computer's main processor for the rendering will be able to render faster and faster as processor speed increases. In fact, a day will come when the main processor will render as fast as or faster than required. Then everything will always be "real-time".

So what is "real-time" ?

Some systems have additional hardware in the form of cards that sit inside the computer and process effects as fast as required for playback, as the effect happens. Meaning, if you add a 2 sec dissolve between two shots, this hardware can process this dissolve in less than 2 secs. So, effectively, it can "play" the dissolve as it happens.

This amounts to creating or processing an effect in real time. And such hardware is real-time hardware.
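
To put a number on it, here is a rough sketch, assuming PAL's 25 frames per second (the frame rate is my assumption, purely for illustration):

    # Rough arithmetic for a 2-second dissolve at PAL's 25 frames per second
    frame_rate        = 25
    dissolve_seconds  = 2
    frames_to_compute = frame_rate * dissolve_seconds   # 50 frames
    # To count as "real-time", the hardware must produce those 50 frames
    # within the 2 seconds they take to play - i.e. at 25 frames per second or faster.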

Avid Media Composer systems of the nineties had a Targa 2000 card installed, which was specially modified to take almost any effect present in the system and create it in real time. So one could place this effect on any clip and just play it. No rendering. This was in 1995, and the Mac was a 9500/132 with just 56 MB RAM. I still have one and it's still real time. The hard disks in these systems, incidentally, were 3600 rpm narrow SCSI.

Today, Matrox makes a card called RT-Mac that can make some selected effects in FCP play out in real-time. Promax used to sell an RT-Max, DH-Max, RT Lite range of cards that can show FCP effects in real-time. Aurora also sells their Igniter range of cards with some real-time effects. And the Targa Cinewave made by Pinnacle Systems has many effects that play out in real time, even with uncompressed Beta or DigiBeta video. In fact, at this time, the Cinewave has the most real-time effects. AJA has the Kona card with some real-time effects, as do the Digital Voodoo cards from Digital Voodoo. You can check out intros to these products on my "What other gear - for FCP?" page on this site.

What do they mean when they say "off-line" and "online" editing?

These terms date back to when editing system hard disks measured in MBs. So one had to capture all the video to disk in a very low resolution form to be able to fit it on those small disks. This edit was called the "off-line" edit, meaning editing with a copy of the original material. After editing this low-resolution picture, the whole edit had to be repeated with the original full quality material on the tape.

So the edit in the non-linear system was exported out as a small text file that contained the IN and OUT points of every shot in the edit. This is called an Edit Decision List or EDL. This EDL file was then fed into an "online editor", which was a machine that could control two or more decks. This machine would work with the original tapes, cue up all the shots used in the edit one by one, and transfer them to another tape in the correct order. It would do the entire edit again without thinking - just by the numbers. This is called on-line editing.
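
For reference, an EDL really is just plain text. A single event in the common CMX 3600 style looks roughly like this (the reel name and timecodes here are invented for illustration):

    001  TAPE003  V  C  01:12:08:00 01:12:11:15 00:00:00:00 00:00:03:15

Reading left to right: event number, source reel, track (V for video), transition (C for cut), then the source IN and OUT timecodes, and the record IN and OUT timecodes.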

Today, hard disks have become massive, and it is no longer necessary to work in low resolution. Especially in DV, where there is no such thing. FCP captures DV footage via the FireWire port of a Mac G4 computer "as is". It uses up 1 GB of hard disk space for every 5 mins of DV. And with 20 GB hard disks costing under Rs. 6000/-, there is no need to capture this footage in a low resolution to save disk space.

Also, today there are online systems that don't use the tape-to-tape method of online editing. Smoke, Fire, Flame, Quantel, Jaleo and so on are online edit systems that take in an EDL, and capture all the material from the tape to a hard disk in a lossless uncompressed form. They even recreate some effects used in the edit automatically.

FCP can also work in this online uncompressed manner if equipped with special hardware like the Igniter Uncompressed, Targa Cinewave, Digital Voodoo, or AJA KONA. More on these in my "What other gear - for FCP?" page.

 
And if you want to check out what different Macs do with FCP, check out my "What kind of system is needed, or, will my Mac work?" page.
 


 

 

If there's something that you didn't quite understand, or if you'd like to see something on this page, or if you want to be informed when this page changes, or even if you want to just say thanks to me, do mail me.