WHAT IS THIS COURSE ABOUT?

How to edit a professional video sequence.

 

COURSE LAYOUT

The course begins with an introduction to editing, proceeds to outline the steps involved, provides an analysis of professional techniques, discusses image and audio editing, and concludes with issues relating to file management.

 

TARGET AUDIENCE

This course teaches the foundations of professional video editing. It is a guide for beginners to learn professional editing methods and techniques, and it benefits anyone who wants to produce high-quality video sequences with an editing program.

 

Curriculum

1 Professional Video Editing – Introduction

  • Introduction

Welcome to the “Professional Video Editing Course”. I’m Dorin Cernat, a technical trainer at Skills Gap Trainer Communications. I put together this course because I know there are a lot of people who are interested in the field of video editing and curious about the techniques used to produce high-quality video sequences. The course discusses what professional video editing is all about. It begins with an introduction to editing, proceeds to outline the steps involved, provides an analysis of professional techniques, discusses image and audio editing, and concludes with issues relating to file management. The course also includes information on editing technique, editing theory, and the art of editing.

 

TABLE OF CONTENTS

  • PART 1 – INTRODUCTION TO EDITING
  • PART 2 – EDITING STEPS
  • PART 3 – EDITING BASICS
  • PART 4 – MAKING CUTS
  • PART 5 – CHOOSING SHOTS
  • PART 6 – IMAGE CONTROL
  • PART 7 – EDITING AUDIO
  • PART 8 – FILE MANAGEMENT

 

PART 1 – INTRODUCTION TO EDITING

Editing begins after production ends; it is part of the post-production process. After the footage is shot, the editor should have a collection of media clips on the hard drive. An editing process is then required to trim or discard the poor clips and to select the clips with the best technical quality and performances. The editor will need to organize and arrange the best and most appropriate shots into logical sequences. The editing process takes in numerous input clips and produces a single output, often in story form. The output becomes a piece that communicates ideas effectively and showcases the best work of the production team and cast.

Much of editing education refers to the technical aspects of editing. Editing requires technical skills, such as an understanding of the editing program’s features: masks, colour correction, cuts, transitions, audio effects, video effects, and compression. However, knowing the technical aspects of editing is insufficient to produce truly great work. One needs to be more than a technician; one needs to be an artist as well. Therefore, a professional editor needs to understand editing concepts and theory: both the technique and the art. Editing theory is what this guide is about.

 

2 Professional Video Editing – Editing

PART 2 – EDITING STEPS

Editing isn’t simply about working with clips in the editing program timeline. Editing encompasses a series of steps that must be performed to bring the project to completion. The steps are:

Acquire Footage – Get the audio and video clips and import them into the editing program.

Organize Footage – The visual and audio elements need to be organized into bins and folders.

Identify Best Shots – After viewing all of the shots, the best shots are identified.

Rough Cut – A preliminary version of the edited sequence. Some shots are placeholders.

Fine Cut – The final version of the edited sequence, tweaked and finalized.

Finishing – The step where image enhancement will take place.

Mastering – Rendering the sequence to a high quality media file, from which copies can be made.

Distribution – Outputting the finished sequence to Blu-ray Disc or to the web.
 

PART 3 – EDITING BASICS

ORGANIZE FOOTAGE

The editor needs to keep the footage in the database organized. In addition, the editor needs to approach their task with an organized process. There are various reasons for this. Let’s analyze.

Depending on the complexity of the video project, the act of editing will vary in difficulty. If there are fewer clips to edit, then editing a sequence will be easier. However, if the production team provides the editor with many clips, then editing a workable sequence will be more difficult. This leads us to the idea that editing difficulty is partly a function of the quantity of source clips. When an editor assembles a sequence, he or she must compare and contrast the clips in order to select the best and most fitting piece. With a large database of clips, the number of comparisons that must be made increases dramatically. By being organized with the footage and with the editing approach, the editor can ensure that all needed comparisons are made and that potentially good combinations of shots are not overlooked.
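To make that growth concrete (a rough illustration rather than a formal rule): if every candidate clip for a slot is weighed against every other candidate, n clips imply roughly n × (n − 1) / 2 pairwise comparisons, so 10 clips give 45 comparisons while 50 clips give 1,225. This is why an organized approach matters more and more as the clip library grows.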

Every clip has both pros and cons, and choosing among many different clips can be challenging. Different combinations of clips will lead to different psychological effects on the viewer. The difficulty an editor faces is both choosing the best clips and choosing the most effective sequences. Often, these decisions come from artistic and theoretical knowledge rather than technical knowledge. That is why the editor is more an artist than a technician. Learning just the features and functionality of editing programs is insufficient. To become a professional, the editor must practice theory-driven shot selection, shot arrangement, and fine-tuning.
 

TRANSITIONS

The way an editor uses transitions can be a tell for the audience as to whether that editor is a professional or not. When it comes to the type of transitions to use, the simple transitions turn out to look the most professional. A fade from black can be used at the beginning, and a fade to black can be used at the end. A black fade can also be used to create the feeling of time passing within a sequence. A fade to white can be used as well, and this often represents the passage of time. Lastly, the crossfade is a little less professional than a simple cut, but it can still be used within a professional sequence. In a crossfade, the current clip fades out and blends over the next clip as the new clip fades in.
 

FINE TUNING

Editing requires many clips to be selected for deletion. It also requires adjustments to the beginning and end points of the video clips that are accepted. Editing involves knowing when to make cuts, when to move cuts, and when to shorten or lengthen clips that have been cut. Editing is about fine-tuning the sequence through subtle cut modifications. The cuts must be frame-specific and made with precision.
 

TRANSITION TO A NEW SCENE

Many new editors edit video sequences with slow pacing. If the editor loves the scenery that was filmed too much, they may decide not to delete any transitional footage and instead incorporate too much of it into the final cut. In addition, the production team may like to film characters walking into and out of the different sets, as it often looks very cinematic. There are often many takes and many angles filmed of the critical transitional movements between scenes. Just because the shots are available doesn’t mean that they all need to be used in a sequence. Showing too much sitting, standing, walking, or driving is likely to slow the pacing of the sequence enough to turn the presentation into a boring piece of media. In order to show a change in location from one place to another, the editor needs to pick the best shot of the character leaving the scene, then a shot or two of the new geographical location, and then a shot of the character entering the new location. The audience will assume that the character has changed locations without actually being shown the travel. The selection of transitional movement should be done with restraint and good sense.
 

THE ROUGH CUT

The editor should not scan through the footage and edit different portions of the sequence at random. The editor needs to adopt a systematic and organized approach to evaluating and editing the various scenes. Editing a sequence is best done through a multiple-pass system. Instead of making final edits as you go, you leave some decisions for later, with the idea that you will come back to tweak that portion of the sequence. A circular process, which first lays down a rough cut, provides the advantage that new information becomes available on subsequent passes. Information such as what each edit feeds into is of critical importance. Final editing decisions can only be made after the sequence has been viewed a few times and after numerous tweaks and revisions have been made.
 

SCREEN DIRECTION

The world established on camera, the physical space represented, must be constructed in a logical way so as to conform to real-world directions: up, down, left, right, forward, and back. If a clip shows a character travelling left and out of frame, the next clip must show that character entering from frame right. There must be a consistent and continuous representation of space as the sequence progresses.
 

180-DEGREE RULE

When shooting a scene, the film crew usually starts off with a wide shot of the talent, so as to have a baseline to cut to in the event that there are no other shots for a particular section. The talent face each other, and they are filmed from the side. Their eye lines align along an imaginary axis, or imaginary line, which needs to be acknowledged and visualized by the crew. The crew can and should film from only one side of this axis, the side from which they set up their wide shot. This idea of filming from only one side of the axis is referred to as the 180-degree rule. If the crew were to film from the other side of the axis, and some crews do take shots from the other side, the footage could potentially be edited in a way that does not maintain screen direction. The footage would not cut together well. What is screen left and what is screen right would flip directions and confuse the audience. If the editor receives footage that crosses the 180-degree axis, then he or she must be careful to edit the sequence from shots taken from only one side of the axis.

30-DEGREE RULE

Besides the “180-degree rule”, there is also the “30-degree rule”. Whereas the 180-degree rule is “inclusive” in that it states that one should film everything from one side of the imaginary axis, the 30-degree rule is “exclusive” in that it recommends not showing two sequential shots filmed within 30 degrees of one another. Simply put, when cutting from one shot to another, the two camera positions should be spaced at least 30 degrees apart. Otherwise, the viewer will notice a “jump cut” or glitch effect at the cut point. After a shot is made within the 180 degrees, the crew should move the camera at least 30 degrees to set up the next shot. If they do not, the two shots will look too similar in the mind of the viewer, and this will create an observable effect. If the sequence does require two similar shots filmed within 30 degrees of one another, one possible way to mask the potential jump effect is to add a shot between the two similar takes, if it makes sense to do so.

 

3 Professional Video Editing – Making Cuts

  • Making Cuts

PART 4 – MAKING CUTS

THE CUT

The cut is the most widely used and most professional of the transition types. This is because it reinforces the ‘invisible editing’ philosophy. Editing with simple cuts leads to a sequence where the editing itself is not apparent to the viewer. To support invisible editing, the cut must be made at the right time. The cut must be made to a shot that brings new visual material or new information. Consider what would happen if two shots were edited into a sequence and both presented similar visual information. In the mind of the viewer, the shots would appear to jump, and a visual discontinuity or glitch would be apparent. But by cutting from one shot to the next with different visual information, the viewer’s mind is forced to search the frame for new details, and this mental engagement reduces the jump effect. Making cuts to shots with new information is not just a glitch-avoidance mechanism, far from it. The purpose of every new shot is to add new information to the sequence that was not previously there. Every shot must present new information. The editor must figure out whether all the necessary information has been shown, what information needs to be shown next, and what motivating element needs to be focused on.
 

JUMP CUTS AND MOTION

The production team must be careful to plan and film in such a way as to not lock the editor into putting together footage that exhibits “jump effects”. The production crew will provide footage of shots that are in motion and shots that are static. Sometimes the motion shots will begin and end on a static frame, and sometimes they will not. When cutting from a static shot to a motion shot or vice versa, the potential exists for a visual crash, or jump effect, to occur. This will not always be the case, but it is important to consider the effect that may occur. The production team should provide pans, tilts, or dolly shots that are in motion but that begin and end on a static frame. This will ensure that the editor has the greatest amount of choice and the ability to remove any jump effect that may occur. In addition, the motion of a shot will have a particular direction. If the shot is panning towards the left, it is motivated towards the left. If the editor then cuts to another moving shot, but one that is panning right, then the motivation is not continued. This will result in a discontinuity or a visual crash. The cut point must maintain its invisible editing effect; therefore, it shouldn’t be placed between opposing motions.

EDITING DIALOGUE

When editing dialogue, it can be tempting to edit the sequence to show the characters only when they speak, and not when they react to speech. Editing according to dialogue, or giving screen time to the person who is currently speaking, is normal. However, if the scene has a lot of dialogue, it can become predictable and boring. From time to time, it can be good practice to surprise the audience by showing them the character who is listening. Showing the audience that character’s reaction as they listen to the dialogue can break the flow of the editing, make it more dynamic, and add new depth to the scene.
 

PAUSES AND OTHER SOUNDS

The editor’s job is to put together a video sequence that looks and feels professional. In order to accomplish this, the editor chooses to remove takes that are either technically flawed or in which the actors have not performed their best. Let’s look at an example of how the editor can proceed to resolve acting issues.

For example: in editing video sequences, it becomes apparent that the actors will occasionally pause between sentences or insert a few “ums” and “ahs”. There is no general editing rule that must apply, but there are two guidelines to take note of. Generally, in documentary editing, the goal is to make the presenter sound as legitimate and as professional as possible. This means that the goal of the editor is to polish the speech. This can be done by removing pauses, or by removing unprofessional sounds such as “ums” and “ahs”. These types of sounds could be indicative of poor public speaking skills or of having a tough time remembering the subject matter.

However, during other types of editing such as film editing, what to edit out is not immediately apparent. The film actors practice for many hours. Many times, their pauses and vocalizations are well rehearsed, calculated and presented for emotional effect. In this instance, the pauses may need to be left in order to maintain the intended emotional effect.

 

4 Professional Video Editing – Choosing Shots

PART 5 – CHOOSING SHOTS

SELECTING SHOTS

The editor’s main responsibility is to select the best shots. But what does “best” refer to? Is it the best performance? Is it the shot with the best technical quality for audio and video? Is it the shot which highlights the conflict in the story? Is it the shot that just happens to fit the editing sequence in terms of composition? Or is it a combination of all these factors? Usually, editors will weigh the pros and cons of the different priority areas and pick the shot that satisfies most of the criteria. They may choose the shot that offers the best performance. To come to a decision, many editors will have to rely on their feelings and on their instincts. At other times, they will not know the answer and may choose to guess or to pick a shot at random to fill in the sequence. Great editors understand the meaning of the shots and use their refined instinct to convey the right message to the viewer. Great editing is about understanding the factors of shot selection.
 

WIDE SHOTS

Wide shots are usually used at the beginning of new scenes. These types of shots are very useful in this role because they establish several key things. Using wide shots, the filmmakers are able to show, in a single shot: the environment, the lighting (mood), and the characters and their positions relative to each other and to objects. In addition, wide shots can be used at different points within a scene. For example, if there is movement within the scene, that may provide an opportunity to cut back out to a wide shot so as to update the audience with a new “mental map” of where everything is in the scene relative to everything else.

MEDIUM SHOTS

The medium shot is the most used shot. The medium shot is more selective than the wide shot. It allows the audience to focus on the actions and dialogue of one individual, while cutting out what is not relevant for that moment in time. While it has the benefit of being close enough to the actor’s face and upper body that emotions and body language can be seen, it is also far enough away that little bits of the background action still connect the viewer with the environment.
 

CLOSE-UP SHOTS

Close-up shots are very intense shots. From the cinematic perspective, they look great. From the directing perspective, they are very intimate shots which convey a lot of emotion. The key to close-up shots is timing, or knowing when to use them. They can’t be used too early in the scene’s development or they will not have the necessary effect when they are needed later on. The best time to begin using them is when the drama of the scene intensifies and the scene approaches its climax.
 

SIMILAR SHOTS

When editing a dialogue sequence between actors, the type of shot used on one actor should usually be used on the other. For example: in a two-person dialogue scene, if a medium shot is used on one actor, then a medium shot should be used on the actor to whom they are speaking. By shooting similar shots from both sides, the shots tend to adopt similar properties, such as focal length, frame composition, lighting, and distance. These similar properties make the edit flow more seamlessly and feel connected and continuous. The audience expects to see similar shots cut together, as opposed to dissimilar shots.
 

2ND VISUAL REFERENCE

For most edits, the editor will be able to put together a sequence which flows visually and is seamless. However, some shots will appear to “jump” and not look as seamless as the rest. This can happen when there is more than one prominent object in focus. The characters on screen are a reference point. The audience will pay attention to a character, their reference point, and then they will pay attention to the next character or reference point as it is presented. However, when there is another prominent object in the scene, it can become a second reference point. The mind begins to track both reference points. Now imagine that within a shot there is a second object, say a large flower screen left while the actor is screen right. As you cut to the other actor, perhaps the large flower is now screen right and the new actor is screen left. Under these conditions, the flower has “changed positions” within the frame, so it has appeared to jump. In this circumstance, perhaps cutting to a closer shot without the flower is more appropriate.
 

3 CHARACTER EDITING

Some of the scenes provided to the editor by the production crew will be of three characters having a conversation. In this case, the editor cannot simply edit a two-shot of two of the characters speaking and then cut to a matching two-shot of another pair of characters. When cutting from one two-shot to another two-shot in a three-character scene, one of the characters appears to “jump” positions on screen. Imagine filming a two-shot of character 1, who is screen left, along with character 2, who is screen right. The editor then cuts to another two-shot in which character 2 is now screen left and character 3 is screen right. In this scenario, character 2 has jumped from screen right to screen left. The editor will need to pay attention to this scenario, and perhaps cut from a two-shot of characters 1 and 2 to a single shot of character 3. If this solution doesn’t work, then the editor could instead cut from a two-shot of characters 1 and 2 to a wide shot of all three.

 

5 Professional Video Editing – Image Control

PART 6 – IMAGE CONTROL

COLOUR WORK

Enhancing video clips by correcting colour or adding visual effects can be a lot of fun and a very satisfying process. However, editors who choose to do this early on are less efficient than editors who leave the work until after the edit is complete. This is because it is inefficient to do image enhancement work on all the footage that has been shot, as opposed to doing it only on the footage that will actually be used. It is best to leave colour work towards the end of the post-production process.

IMAGE CONTROL WITH SCOPES  

It is important for editors to use scopes as part of the colour enhancement process. Human perception of colour can vary depending on lighting conditions. Editors cannot fully rely on their eyes to consistently and reliably set the appropriate colour levels in their work. Instead, editors use scopes, which offer a graphical representation of the necessary colour information. Modern editing programs have these scopes built into their feature set.

WAVEFORM MONITOR

The waveform monitor allows the editor to check, and if necessary modify, the footage so that it is broadcast safe. On the monitor, a value further up represents a brighter image and a value further down represents a darker image. A value of 0 represents black, and a value of 100 represents white. It is important to ensure that the image does not go below 0 or above 100.
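As a rough illustration of what the waveform monitor measures, the sketch below (not taken from any particular editing program) computes the luma of a frame on the 0–100 scale and flags anything outside the broadcast-safe range. The frame data is synthetic, and the BT.709-style luma weights are an assumption chosen for the example.

```python
# A minimal sketch of the waveform-monitor idea: measure the luma (brightness)
# of a frame on a 0-100 scale and flag values outside the broadcast-safe range.
# The frame here is synthetic random data; in practice it would come from a
# decoded video frame (assumption for illustration only).
import numpy as np

frame = np.random.rand(1080, 1920, 3)          # RGB values in the 0.0-1.0 range

# Approximate luma using BT.709-style weights for R, G and B.
luma = (0.2126 * frame[..., 0] +
        0.7152 * frame[..., 1] +
        0.0722 * frame[..., 2]) * 100          # rescale to the 0-100 scale

print(f"Darkest value:   {luma.min():.1f}")
print(f"Brightest value: {luma.max():.1f}")

if luma.min() < 0 or luma.max() > 100:
    print("Warning: image is not broadcast safe, adjust levels.")
else:
    print("Levels fall within the 0-100 broadcast-safe range.")
```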

VECTORSCOPE

The vectorscope allows the editor to check the balance of colour, or colour cast, within an image. It shows the saturation of a hue for a given clip. On the outer boundary of the vectorscope there are six colours: red, blue, green, yellow, cyan, and magenta. The distance from center to the edge represents the level of saturation.
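For readers who want to see the underlying idea, here is a minimal sketch, assuming BT.601-style chroma coefficients, of how a vectorscope reading could be approximated: each pixel’s chroma components give its hue direction, and the distance from the scope’s center gives its saturation.

```python
# A rough sketch of what a vectorscope measures: for each pixel, the chroma
# components (Cb, Cr) give a direction (hue) and a distance from the centre
# (saturation). BT.601-style coefficients are used here as an approximation.
import numpy as np

frame = np.random.rand(1080, 1920, 3)          # synthetic RGB frame, 0.0-1.0

r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
cb = -0.169 * r - 0.331 * g + 0.500 * b        # blue-difference chroma
cr =  0.500 * r - 0.419 * g - 0.081 * b        # red-difference chroma

saturation = np.sqrt(cb ** 2 + cr ** 2)        # distance from the scope's centre

print(f"Average saturation: {saturation.mean():.3f}")
print(f"Peak saturation:    {saturation.max():.3f}")
```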

HISTOGRAM

The histogram shows where in the range particular lightness or colour values occur. The histogram has a series of spikes. If there are more spikes on the left side of the histogram, the image will be darker. If there are more spikes in the middle, the image has little contrast. If there are more spikes on the right side, the image is very bright. It is best to have a histogram that is distributed throughout the range and has a hill in the middle. Of course, if the scene is not lit properly and the editor must do colour work to adjust and balance it, there is a risk that the adjustments needed to balance the image will squeeze the histogram values toward either the low end or the upper end of the range. This could destroy the smooth tonal range and instead lead to a harsher digital image with less information.
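As a hedged illustration of how a brightness histogram is built, the sketch below bins the luma values of a synthetic frame and summarises the shadows, midtones, and highlights. Real pixel data would come from a decoded clip, and the thirds-of-the-range split is simply a convenient assumption for the example.

```python
# A minimal sketch of how a brightness histogram is built: count how many
# pixels fall into each luma bin and summarise shadows, midtones and highlights.
# The frame is synthetic; real pixels would come from a decoded clip.
import numpy as np

frame = np.random.rand(1080, 1920, 3)                      # RGB, 0.0-1.0
luma = frame @ np.array([0.2126, 0.7152, 0.0722])          # approximate brightness

counts, edges = np.histogram(luma, bins=256, range=(0.0, 1.0))

shadows    = counts[:85].sum()      # roughly the left third of the range
midtones   = counts[85:171].sum()   # the middle third
highlights = counts[171:].sum()     # the right third

total = counts.sum()
print(f"Shadows:    {shadows / total:.1%}")
print(f"Midtones:   {midtones / total:.1%}")
print(f"Highlights: {highlights / total:.1%}")
```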

 

6 Professional Video Editing – Editing Audio

PART 7 – EDITING AUDIO

Even though audio is highly important, new media producers often overlook its importance. Many times, it is only a secondary consideration. The reality is that audio is more important than video. One can watch bad video with good-quality audio, but one cannot watch cinematic footage with incomprehensible audio. Getting the audio recording right is critical to creating a watchable media presentation. This is often out of the editor’s hands, but one would hope that the editor is provided with media that has good-quality audio. Without good audio, the final sequence is not watchable, and the project is likely to fail.
 

AUDIO TYPES

On the editing timeline, video is shown on one track. Below the video there will be several layers of audio tracks. The reason editors don’t stack video tracks on top of one another is that video is opaque, and a track beneath another track will not be visible. Unlike video, audio is layered into multiple tracks. Each track will contain a different type of audio. As part of mixing the audio tracks, editors will specify the volume levels of the individual tracks and of the individual clips within the tracks.
Some examples of audio types on the different audio layers are:


Ambience Sounds

Ambient sound refers to the sound of the environment, or the sound backdrop on location. It is the atmosphere. It includes sounds such as cars driving by, pedestrians walking by, the wind rustling, and a fan spinning. Even when recording audio within an empty and “noise free” room, there is still sound. This “atmosphere” or “room tone” must be recorded and layered underneath all of the recordings that happen on set, so as to create a consistent feeling across cuts as the sequence progresses. At least a minute of room tone needs to be recorded for every location. The atmosphere or ambient sound track will provide a feeling of reality. Otherwise, pauses in sound will occur in the sequence and the soundtrack will feel as if it were recorded in an artificial studio.

Dialogue

Dialogue is recorded on set. If it is not recorded properly, then the project will be at a high risk of failing. Fixing recording problems with dialogue is not usually possible.

Effects Sounds 

Effects sounds are sounds that are artificially added in a studio. These could include things such as footsteps, punching, or glass breaking.
Music

Music is added during post-production to support the mood, emotion, and pacing of the sequence.
 

AUDIO METERS

In the timeline, there should be several layers of audio. As these layers are stacked on top of one another, the sound levels will add up. The resultant level may be too “hot”. The editor needs to monitor the audio track levels by looking at the audio meters. If the meters change from green to yellow, that means the audio clip is nearing peak levels. If the meters change from yellow to red, this means the sound is exceeding the peak levels. The editor must adjust the audio volumes for the individual tracks and monitor the resultant overall mix level throughout the sequence, so as to ensure the audio track does not distort and clip.
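To make the “levels add up” point concrete, here is a minimal sketch with two synthetic tracks: each one peaks safely below full scale on its own, yet the summed mix exceeds 0 dBFS and would clip. The tone frequencies and levels are arbitrary assumptions for illustration, not values taken from any real project.

```python
# A minimal sketch of why stacked audio tracks can run "hot": when tracks are
# summed, their peaks add up, and the mix can clip even if each track is safe.
# The tracks are synthetic sine tones; real samples would come from the timeline.
import numpy as np

sample_rate = 48000
t = np.linspace(0, 1.0, sample_rate, endpoint=False)

dialogue = 0.6 * np.sin(2 * np.pi * 220 * t)     # each track peaks below 1.0
music    = 0.5 * np.sin(2 * np.pi * 220 * t)     # same phase, so peaks align

def peak_dbfs(samples):
    """Peak level relative to full scale (0 dBFS = clipping threshold)."""
    return 20 * np.log10(np.max(np.abs(samples)))

mix = dialogue + music                           # the stacked timeline output

print(f"Dialogue peak: {peak_dbfs(dialogue):6.2f} dBFS")
print(f"Music peak:    {peak_dbfs(music):6.2f} dBFS")
print(f"Mix peak:      {peak_dbfs(mix):6.2f} dBFS")

if np.max(np.abs(mix)) > 1.0:
    print("The mix exceeds 0 dBFS and will clip; lower the track volumes.")
```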

 

7 Professional Video Editing – File Management

PART 8 – FILE MANAGEMENT

TRANSCODING

The transcoding process allows the editor to convert the highly compressed DSLR capture format into a less compressed intermediate format for editing. The reason an editor would want to do this is that if the colour work is done on an intermediate format, the resultant image quality will be higher than if it is done within the native capture format. In addition, because intermediate formats are less compressed, a lower-performance editing workstation can be used for editing. However, although intermediate formats can be edited on cheaper computers, the disk space and disk throughput requirements will be greater. From a file management point of view, it is much easier for the editor to use a very high performance computer and edit the native, highly compressed capture format directly.
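As a hedged example of this workflow, the sketch below shells out to the ffmpeg command-line tool to convert a hypothetical capture file into a ProRes 422 HQ intermediate with uncompressed audio. It assumes ffmpeg is installed and that ProRes is an acceptable intermediate format for your editing program; the file names are placeholders.

```python
# A hedged sketch of transcoding a compressed DSLR clip into a less compressed
# intermediate format, here ProRes 422 HQ in a QuickTime container. It assumes
# the ffmpeg command-line tool is installed; the file names are placeholders.
import subprocess

source = "A001_dslr_clip.mp4"          # hypothetical highly compressed capture file
intermediate = "A001_dslr_clip.mov"    # hypothetical intermediate file for editing

subprocess.run([
    "ffmpeg",
    "-i", source,                      # input: the native capture format
    "-c:v", "prores_ks",               # video codec: ProRes (ffmpeg's prores_ks encoder)
    "-profile:v", "3",                 # profile 3 = ProRes 422 HQ
    "-c:a", "pcm_s16le",               # audio: uncompressed 16-bit PCM
    intermediate,
], check=True)
```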


BACKUPS

Editors don’t just create work; they also archive work for long-term storage and for distribution. A good backup workflow involves using two portable hard drives to back up the contents of the camera’s memory card. The contents of the memory card are then also copied to the computer’s editing drive. Finally, the computer’s output sequence is archived to long-term storage such as Blu-ray and two archive hard drives. Yes, media storage costs can add up. It is important to remember that hard drives fail and discs scratch and corrupt. Multiple copies and redundancies ensure that high-quality versions of the project continue to exist long after they are shot.
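A minimal sketch of this backup idea, with placeholder drive paths, is shown below: the camera card is copied to two backup drives and each copy is verified with a checksum so that silent corruption is caught early. The paths and the choice of checksum are assumptions for illustration, not a prescribed tool.

```python
# A minimal sketch of the backup idea: copy the camera card to two backup
# drives and verify each copy with a checksum so silent corruption is caught.
# All paths are placeholders; the workflow itself follows the text above.
import hashlib
import shutil
from pathlib import Path

card = Path("/Volumes/CAMERA_CARD")                      # hypothetical memory card
backups = [Path("/Volumes/BACKUP_A"), Path("/Volumes/BACKUP_B")]

def sha256_of(path):
    """Return the SHA-256 checksum of a single file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

for drive in backups:
    destination = drive / card.name
    shutil.copytree(card, destination, dirs_exist_ok=True)   # copy the whole card
    for original in card.rglob("*"):                          # verify every file
        if original.is_file():
            copy = destination / original.relative_to(card)
            if sha256_of(original) != sha256_of(copy):
                print(f"Checksum mismatch: {copy}")
```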

 


 
