Sunday 7 October 2012

Visual Components

Footage:

In film-making and video production, footage is the raw, unedited material as originally filmed by a movie camera or recorded by a video camera, which usually must be edited to create a motion picture, video clip, television show or similar completed work. More loosely, footage can also refer to all sequences used in film and video editing, such as special effects and archive material.


Establishing shot
An establishing shot in film-making and television production sets up, or establishes the context for a scene by showing the relationship between its important figures and objects. It is generally a long- or extreme-long shot at the beginning of a scene indicating where, and sometimes when, the remainder of the scene takes place.

Establishing shots were more common during the classical era of film-making than they are now. Today's filmmakers tend to skip the establishing shot in order to move the scene along more quickly. In addition, scenes in mysteries and the like often wish to obscure the setting and its participants and thus avoid clarifying them with an establishing shot.

Establishing shots are used as follows:

Indicate location: Establishing shots may use famous landmarks to indicate the city where the action is taking place or has moved to, such as the Empire State Building or the Statue of Liberty to identify New York, the London Eye or Big Ben to identify London, the Sydney Opera House to identify Sydney, the Eiffel Tower to identify Paris, or the Las Vegas Strip to identify Las Vegas.

Time: Sometimes the viewer is guided in his understanding of the action. For example, an exterior shot of a building at night followed by an interior shot of people talking implies that the conversation is taking place at night inside that building; the conversation may in fact have been filmed on a studio set far from the apparent location, because of budget, permits or time limitations.

Relationship: An establishing shot might be a long shot of a room that shows all the characters from a particular scene. For example, a scene about a murder in a college lecture hall might begin with a shot that shows the entire room, including the lecturing professor and the students taking notes. A close-up shot can also be used at the beginning of a scene to establish the setting (such as, for the lecture hall scene, a shot of a pencil writing notes).

Establish a concept: An establishing shot may also establish a concept, rather than a location. For example, opening with a martial arts drill visually establishes the theme of martial arts. A shot of rain falling could be an establishing shot, followed by an increasingly detailed look at the rain, culminating with individual raindrops falling. The filmmaker is colluding with the audience to provide a shorthand learned through a common cinematic cultural background.

A good example of an establishing shot is in The Shawshank Redemption (below): you see the whole prison, so you know where the film will be taking place, right before you see the main character (Andy) roll in on a bus.



B-roll
B-roll is the supplemental or alternate footage inter-cut with the main shot in an interview or documentary. The term "B-roll" is now limited to secondary footage that adds meaning to a sequence or disguises the elimination of unwanted content. This technique of using the cutaway is common to hide zooms in documentary films: the visuals may cut away to B roll footage of what the person is talking about while the A camera zooms in, and then cut back after the zoom is complete. The cutaway to B roll footage can also be used to hide verbal or physical tics that the editor and/or director finds distracting: with the audio separate from the video, the filmmakers are free to excise "uh"s, sniffs, coughs, and so forth. Similarly, a contextually irrelevant part of a sentence or anecdote can be removed to construct a more effective, succinct delivery. This can also be used to change the meaning of the speaker to fit the view of the producer. In fiction film, the technique can be used to indicate simultaneous action or flashbacks, usually increasing tension or revealing information.

"B roll" also refers to footage provided free of charge to broadcast news organizations as a means of gaining free publicity. For example, an automobile maker might shoot a video of its assembly line, hoping that segments will be used in stories about the new model year. "B roll" sometimes makes its way into stock footage libraries.



Visual effects:

Many movies are made from books, but the plot line and characters are often changed to make the story more "movie worthy". Watch the trailer below: the flashes of black and white contrasting with slices of colour emphasize the graphic nature of the film and the medium in which it was originally created.


Miller chose to colour only certain items in each scene, emphasizing the "feel" of the scene. Colour draws the eye, so each coloured item holds significant importance. For example, the character "The Yellow Bastard" is always coloured in the scene because the yellow is both disturbing and an eye catcher. The red dress of the woman who is murdered emphasizes her sexuality and her importance. The appearance of random colour is unsettling, another definite intention of Miller.

The influence of graphic writing is not merely seen in the use of colour, but also in the movement on the screen. Certain scenes are blocked to convey the feel of the graphic novel script. It is even believed that Rodriguez and Miller planned each shot according to the panels from the original graphic novel.


The film's similarity to the comic series is due to the involvement of the author. The film was written, produced and directed by Frank Miller and Robert Rodriguez. Miller's influence is undeniable as the images from the comic and random colours he chooses directly mirror those in the movie.

Even down to the detail of the outfits, Miller made sure the movie reflected the comic. The colours in the movie are specifically placed: the flashing of police lights, the red dress, the yellow skin. The colours are intense, and even more so against a background of black and white.





“Transformers” is a very commercial but very successful action movie. It is the result of another series of comics, television series and children's toys, finally culminating in two action-packed films in 2007 and 2009.

The ability to film explosions, car crashes, etc. is challenging because of the nature of the material. An explosion may not have the colours or height the director wants, and a car may not crash in a dramatic way, but these techniques are common in action movies and continue to be improved upon. To me, the most impressive visuals occur in the appearance of the robots themselves.

Part of what has been both challenging and thrilling in movies through the generations is the technical ability to make "aliens" seem real. This challenge proved to be particularly difficult with the robot/car transformation.


The director Michael Bay had specific ideas on the design and action of the Transformers. The models were so specific that the motion of a wrist required 17 visible parts. This complexity and attention to detail is what made the Transformers look life-like. Bay reported, "The visual effects were so complex it took a staggering 38 hours for Industrial Light and Magic to render just one frame of movement".

For the movie "300", post-production was handled by Montreal's Meteor Studios and Hybride Technologies, which filled in the bluescreen footage with more than 1,500 visual effects shots. Visual effects supervisor Chris Watts and production designer Jim Bissell created a process dubbed "The Crush", which allowed the Meteor artists to manipulate the colours by increasing the contrast of light and dark. Certain sequences were desaturated and tinted to establish different moods. Ghislain St-Pierre, who led the team of artists, described the effect: "Everything looks realistic, but it has a kind of a gritty illustrative feel." Various computer programs, including Maya, RenderMan and RealFlow, were used to create the "spraying blood." The post-production lasted for a year and was handled by a total of ten special effects companies.


Movie Titles:

Have you ever thought of what makes you remember a certain movie or TV show? Of course, it’s the story being told, you’ll say, but what about movies such as Goldfinger, Seven and Snatch? What’s the first thing that comes to mind? We are pretty sure their opening title sequences stick out for many of you.

Film titles can be great fun. In them we see the bond between the art of film-making and graphic design and perhaps visual culture as a whole. They have always served a greater purpose than themselves: to move the overarching story forward.

Breakthrough ideas in titling, such as timing the typography to interact with metaphorical imagery or to create its own world, were largely innovations that came from outsiders to the Hollywood studio system. Figures such as Saul Bass, Pablo Ferro, Maurice Binder and Richard Williams arrived on the scene in the 1950s, at a time when the studios were starting to flounder in their fight with TV. At that time, independent filmmakers made commercial headway by doing things differently, spreading utterly fresh ideas about the possibilities of title sequences. This is the era in which the discipline of film title sequence design was actually born.

Maurice Binder worked on the title designs of 14 films about Agent 007, including the first, “Dr. No” (1962). Binder created the famous gun-barrel sequence, which became a signature for the Bond series.

Experimentation on the fringes, where title sequences really thrive, has led to all kinds of innovation in what a title can be and how it can serve the story and the director’s intent. Perceptive directors like Otto Preminger, Alfred Hitchcock, Blake Edwards and Stanley Donen embraced these innovators and gave them free rein to surprise audiences from the opening shots. The Bond films, the Pink Panther series, Barbarella: the sequences for such films became enticing and often sexy popular amusements. By the mid-1960s the top title designers were celebrities in their own right, people who could be relied on to deal with the messy business of credits with playful panache.

It could be argued that typography lost importance in this era of title design. The imagery behind the credits received a lot more attention. Still, the interplay of typography and images was by no means ignored. Popular trends of the 1950s included three-dimensional lettering and embedding type in physical artifacts such as embroidery and signage. In contrast, Saul Bass often approached the lettering of a main title as he would a logo, making it function as the core element in a full marketing campaign. While the variety of solutions increased considerably, their anchor was always the relationship of on-screen typography to the movie itself.

The power of minimalism is shown in the opening sequence for Ridley Scott’s “Alien” (1979). Credit for this design goes to Richard Greenberg, with creative direction from Stephen Frankfurt.
Every sphere of contemporary life, and especially the film business, has been affected by computers. For designers, creating film titles meant participating in the apprenticeship tradition, learning by doing on the job; that continued unabated into the mid-1990s. At that time, dynamic openers by Kyle Cooper and others showed what the next generation of design-educated, film-literate, tech-savvy creatives could do. That apprenticeship tradition has largely been overshadowed by the rise of popular technology, the Internet-enabled archiving of everything and the plethora of schools that propagate countless design disciplines. Most significantly, we see designers working like filmmakers and filmmakers working like designers.

The revolutionary title sequence for “Se7en” (1995) by Kyle Cooper was named by New York Times Magazine as “one of the most important design innovations of the 1990s”:

Throughout the history of cinema, film titles have evolved with the film industry, as well as with social trends and fashion movements. But the measure of a title design’s quality is the same now as it was in the silent era. Whatever function they perform, titles remain an essential part of film.

Granted, in recent years the business of film titling has been terribly strained by the control of producers over commissions and their persistence in demanding speculative work as the price of admission. Creatively speaking, though, as film-making consolidates into the most powerful international cultural phenomenon of the 21st century, ingenuity in titling is a certainty. As designers have always known, the opening moments can make a deeply satisfying contribution to any film.

Logos:

What attracts customers? Obviously the quality of a product does, but visual images contribute a great deal. It is not only the image provided by the packaging that counts but the whole corporate identity of the company.

There are now many products and services on the market which are similar in content though produced by different companies. It is vital, therefore, for a company to distinguish itself from its competitors by having a strong company image which is immediately recognisable.

Logos are part of this image. They are symbols which often include a name or initials to identify a company. The logo establishes a visual identity for the company, just as different groups of young people express their identity through hairstyles and clothes. All groups from all cultures and throughout the ages have used colours and symbols to show their identity.

In different cultures, different colours carry different meanings. Some colours may be connected with coldness in one culture and with warmth in another; some colours represent life in one culture but death in another. International companies have, therefore, to make sure that their logos will not be misunderstood or misinterpreted in different countries.


The logos of large international companies are instantly recognisable throughout the world. One of the most famous logos is that of Coca Cola. The design of the words "Coca Cola" has not changed since 1886, although the surrounding design has been changed from time to time.

Many companies have, over the years, renewed their logos to fit in with contemporary design and to present more powerful images. Company logos can be emotive and can inspire loyalty by influencing the subconscious. Some logos incorporate an idea of the product; the steering wheel in the Mercedes logo, for example, and the aeroplane tail of Alitalia.



Logos are used on letterheads, packaging and brochures as well as on the product itself. They may also appear in newspapers or on television as part of an advertising campaign.
Companies need to have a strong corporate identity. The logo helps to promote this image and to fix it in the minds of the consumers. Logos, therefore, need to be original and to have impact and style.


The film production industry is a highly professional and creative industry. It is also an iconic industry rich with symbolism. When creating logo designs for film-related businesses, there are numerous immediately recognizable and simple film-related symbols that designers can draw on directly such as film reels, camera lenses, director’s chairs and tripod stands. Whether the business is a large scale production company working the Hollywood scene or a corporate film production company or a freelance producer working on small scale events and functions, most will want a design that is both professional and creative.

Movie Company Logos


A production logo is a special form of a logo used by movie studios and television production companies to brand what they produce. Production logos are usually seen at the beginning of a theatrical movie (an opening logo), or at the end of a television program or TV movie (a closing logo). Several production logos have become famous over the years, such as the 20th Century Fox Tower, MGM's Leo the Lion, and Columbia's Torch Lady. Logos for smaller companies are sometimes called vanity logos.


Above is an example of a production movie logo: every time a movie is produced by 20th Century Fox, this short clip plays, revealing the studio's logo and name; for a TV show such as The Simpsons, the logo is shown at the end. In 1935, Twentieth Century Pictures and Fox Film Company (back then mainly a theatre-chain company) merged to create Twentieth Century-Fox Film Corporation.

The original Twentieth Century Pictures logo was created in 1933 by famed landscape artist Emil Kosa, Jr. 
After the merger, Kosa simply replaced "Pictures, Inc." with "Fox" to make the current logo. Besides this logo, Kosa was also famous for his matte painting of the Statue of Liberty ruin at the end of the Planet of the Apes (1968) movie, and others. Perhaps just as famous as the logo is the "20th Century Fanfare", composed by Alfred Newman, then musical director for United Artists.

At first glance, the whole logo looks like a superior one: a low camera angle means we are made to look up to it. This could be a technique employed to make the studio look better than its competition, such as DreamWorks. The colour gold is dominant in this logo, which connotes royalty, again emphasizing the idea of them being the best. The stairs leading up to the text also give the idea of a temple that should almost be worshiped as one of the top film companies. In the animated version of the logo, the blue floodlights move around it as if protecting something precious. All of these features make it look impressive and of important stature to the audience. The clouds also appear and move at speed near the end; this could signify haste and movement on the company's part, and sends a positive message to the audience. And not forgetting the music that plays as the logo appears: it is powerful, using a number of drums and horn instruments, all of them loud and graceful. The music hits you straight away and creates excitement for what is about to be played.




DreamWorks Animation has a history of making 3D films such as Shrek, and more recent ones such as Kung Fu Panda. The first thing you notice about their logo is the colour they have included in it. When shown at the beginning of each movie trailer, the audience of ages 4 to 15 is more likely to be attracted to it. The silhouette of a young boy sat on the edge of the moon with a fishing line appeals to the younger audience as well, and the fact that he is shown fishing in the clouds suggests that he is going to be discovering people's dreams up there and turning them into a film. This gives the company a positive image, implying that they are going to be creating amazing films and giving the audience what they want. The text is also in the centre of the screen, making it bold and attention-grabbing for the viewers. The blue sky shown behind it connotes freedom, and gives the overall logo a carefree, relaxed atmosphere.

In 1994, director Steven Spielberg, Disney studio chairman Jeffrey Katzenberg, and record producer David Geffen got together to found a new studio called DreamWorks.

Spielberg wanted the logo for DreamWorks to be reminiscent of Hollywood's golden age. The logo was to be a computer generated image of a man on the moon, fishing, but Visual Effects Supervisor Dennis Muren of Industrial Light and Magic, who has worked on many of Spielberg's films, suggested that a hand-painted logo might look better. Muren asked his friend, artist Robert Hunt to paint it.

Hunt also sent along an alternative version of the logo, which included a young boy on a crescent moon, fishing. Spielberg liked this version better, and the rest is history. Oh, and that boy? It was Hunt's son, William.

The DreamWorks logo that you see in the movies was made at ILM from paintings by Robert Hunt, in collaboration with Kaleidoscope Films, Dave Carson (director) and Clint Goldman (producer).

Fake Logos in Movies:

We all know a great deal of effort goes into creating movies and games, but one of the more challenging aspects is making things look believable, and by that I mean replicating reality. Among those aspects are the logos, which are designed to match the time and style of the movie.

Take, for instance, the show “LOST”: they can’t just take an existing airline and create a conspiracy out of it, so instead they make up a fictional company with a logo that complements it. The same goes for the wonderful movie “UP”, where the little boy earns a badge saying “Wilderness Explorer”.

A popular logo in a movie is the Ghostbusters logo. Having this image appear on the car and the suits of the Ghostbusters helps you associate it with them, so every time you see that logo you will think of the Ghostbusters, whether it be in the film, on merchandise or just on the street.






The logo also appears in the movie title "Ghostbusters", where it replaces the "O": being a circular shape, it can stand in for the letter.


Another fake logo that has been used in movies and has even become popular on clothing is the Superman symbol, the "S" on his chest. The Superman logo, also informally known as the S shield, is the iconic emblem for the fictional DC Comics superhero Superman. As a representation of the first superhero, it served as a template for character design decades after Superman's first appearance. The tradition of wearing a representative symbol on the chest was mimicked by many subsequent superheroes, including Batman, Spider-Man, the Fantastic Four, Green Lantern, the Flash, and many others.


Still Images:

A still image in drama is where the cast is performing and pauses in a dramatic scene for a few seconds, then carries on with the performance. Still images and freeze frames are both forms of tableau. With a freeze-frame, the action in a play or scene is frozen, as in a photograph or video frame. Still images, on the other hand, require individuals or groups to invent body shapes or postures, rather than freeze existing action.


While this is not usually a problem for fiction films (unless it’s a stylistic choice), most documentaries find that the only visual media available for some parts of the film are still photos. The most common method that comes to mind to add movement to still photos is the Ken Burns effect.

The Ken Burns effect is a type of panning and zooming effect used in video production from still imagery.
The name derives from extensive use of the technique by American documentarian Ken Burns. The technique predates his use of it, but his name has become associated with the effect in much the same way as Alfred Hitchcock is associated with the Hitchcock zoom.

The feature enables a widely used technique of embedding still photographs in motion pictures, displayed with slow zooming and panning effects, and fading transitions between frames. The technique is principally used in historical documentaries where film or video material is not available. Action is given to still photographs by slowly zooming in on subjects of interest and panning from one subject to another. For example, in a photograph of a baseball team, one might slowly pan across the faces of the players and come to a rest on the player the narrator is discussing.

The effect can be used as a transition between clips as well. For example, to segue from one person in the story to another, a clip might open with a close-up of one person in a photo, and then zoom out so that another person in the photo becomes visible. The zooming and panning across photographs gives the feeling of motion, and keeps the viewer visually entertained.
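To make the technique concrete, here is a rough sketch of how a Ken Burns style pan and zoom could be generated from a still photo in Python using the Pillow imaging library; the file name, crop boxes and frame count are made-up illustration values, not anything from a real project.

```python
# A minimal sketch of a Ken Burns style pan-and-zoom on a still image.
# Assumes Pillow is installed; "team_photo.jpg" and the crop coordinates
# below are placeholders chosen for illustration only.
from PIL import Image

def ken_burns_frames(path, start_box, end_box, num_frames, out_size=(1280, 720)):
    """Yield frames that pan/zoom from start_box to end_box.

    Each box is (left, top, right, bottom) in source-image pixels.
    """
    img = Image.open(path)
    for i in range(num_frames):
        t = i / max(num_frames - 1, 1)           # 0.0 -> 1.0 over the move
        box = tuple(s + (e - s) * t for s, e in zip(start_box, end_box))
        # Crop the interpolated window and scale it to the output frame size.
        yield img.resize(out_size, resample=Image.LANCZOS, box=box)

# Example: start on the whole team photo, end tight on one player's face.
for n, frame in enumerate(ken_burns_frames(
        "team_photo.jpg",
        start_box=(0, 0, 4000, 2250),
        end_box=(2600, 900, 3240, 1260),
        num_frames=120)):                        # 5 seconds at 24 fps
    frame.save(f"frame_{n:04d}.png")
```

The resulting numbered frames could then be assembled into a clip by any editing tool; easing the value of t instead of moving linearly would give a gentler start and stop to the move.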





While the Ken Burns effect is obviously effective and the staple of many PBS films, with the advent of After Effects and other tools, there are many more ways to dynamically include stills in movies.

THE 2.5TH DIMENSION
Using Photoshop and After Effects, you can achieve amazing results by removing elements from still photos and compositing them in a 3D space. These moves can range from a simple pan or dolly with a little depth to full-blown camera fly-throughs of entire composited scenes. Here are 2 examples of how this is used.





STILLS WITH SOUND DESIGN
Through interviews and/or really good sound design of the scene in the photo, you can add life to the stills. This is a technique that would work really well if you have access to a bunch of raw negatives. Or you could even shoot a film in this style with a still camera and audio recorder. The entire Stories from the Gulf series was done this way. For Marwencol, the images themselves are a part of the story told in the film.


THE CONTACT SHEET
Back before digital, when you developed a roll of film you’d lay out all the negatives on an 8×10 piece of photo paper and create a contact sheet to figure out what negatives to print more of. I think contact sheets are beautiful in themselves, and Johnny Cash at Folsom Prison incorporates them into this film about Cash’s prison concert in a very effective way.




STILLS IN THE SCENE
American Greed and some of those other true-crime shows on American TV do a great job of using stills and documents from the story and creating a dramatically lit scene with them, which they then shoot the crap out of and get tons of b-roll. They’ll put pictures in frames or tape them to the wall and constantly go back to them for people they couldn't get on camera.





FAST CUT TRANSITION. KEN BURNS ON STEROIDS
Who says you have to hold on a still for a few seconds? Got tons? Throw them all in, 1 frame each. The film below uses them to act as transitions and inter-cut with archival footage, giving it a really dynamic feel. We can process images a lot faster than we think.





Production Techniques

In post-production, sound designers take the raw footage from principal photography (the actual filming of individual scenes, without any special effects or musical background) and turn it into a finished motion picture, adding the sound effects and musical backgrounds to create an emotional effect, whether it be dramatic or comical.


In film and TV, the audio portion of a project is recorded separately from the video. Unlike your home video camera, the film or video cameras used in professional productions don't have built-in microphones. Instead, all dialogue is recorded with either a boom or a tiny wireless lavalier mic that can be hidden in an actor's clothing. Most other audio, like ambient background noise and music, is added in post-production.

Post-production refers to all the editing, assembling and finalizing of a project once all the scenes have been shot. Audio post-production begins once the editors have assembled a locked cut of the project. A locked cut of a film contains all of the visual elements, selected takes, special effects, transitions and graphics that'll appear in the film's final cut.

With the locked cut in hand, the audio post-production staff can start spotting the film for sound. Different members of the post production team look for different things:

  1. The dialogue editor examines every line of spoken dialogue, listening for badly recorded lines (too quiet, too loud or garbled, etc.) or times when an actor's voice is out of sync with his lips.
  2. Sound effects designers look for places where they'll need to add ambient background noise (honking cars in a city, tweeting birds in the country), and "hard effects" like explosions, doors slamming and gun shots.
  3. Foley artists look for places to fill in details like footsteps across a wood floor, a faucet running, the sound of a plastic cup being placed on a marble counter top, etc.
  4. The music editor looks for inspiration to either commission original music or buy licenses for existing song use.
  5. The composer, if he's already hired, looks for places where original music would add to the on-screen moment.
If the dialogue editor needs to replace or re-record unusable pieces of dialogue, he'll ask the actors to come in for an automated dialogue replacement (ADR) session. Here, the actors and editors synchronize the newly recorded dialogue with the lip movements on the screen and mix the audio smoothly into the existing recording.

Foley artists, named after the pioneering audio and effects man Jack Foley, use an eclectic bag of tricks to reproduce common sounds (a wooden chair for a creaky floor, cellophane for a crackling fire, a pile of audio tape for a field of grass, etc.).

Sound designers and effects editors spend much of their time collecting libraries of ambient natural sounds. They record the sound of Monday morning traffic and save it as a digital file for later use. They record washing machines running, children playing and crowds cheering. You can also buy ready-made libraries with all of these sounds. But some of the best sound designers like to create entirely original effects.

Film and TV editing is an entirely digital world. No one sits around splicing film stock anymore. Even if a project is shot on film, it'll be digitized for editing and laid back onto film for distribution. The same is true for audio post production. The nice thing about digital audio editing technology is that there's a product and system for every budget and skill level.

For the home studio, everything can be done on a single computer without fancy control panels or consoles. You can buy a basic version of Pro Tools, Adobe Audition or a similar digital audio workstation and do all your recording, editing, mixing and exporting using the software's built-in functionality. Pro Tools doubles as a MIDI (Musical Instrument Digital Interface) sequencer, so you can even record a soundtrack straight into the software using a MIDI controller or live instruments.

Professional audio post production studios add another level of control by using large digital editing consoles. All of the knobs and faders on the console control specific elements within a DAW like Pro Tools or Nuendo. For many editors, it's faster and easier to manipulate knobs and faders by hand than to constantly be reaching for the mouse and keyboard.

Here are some features of DAW software for audio post production work:

  1. It handles an unlimited number of separate tracks for the same project. This is especially advantageous in mixing a big project with different Foley recordings, sound effects, dialogue, background noise, music, etc. All sounds can be loaded into a sequence.
  2. Tracks the audio to a built-in video feed. This is critical for timing the placement of effects and music.
  3. Allows for tons of different automated presets. Each separate audio recording session requires different levels on each track to create a balanced recording. DAW software makes it so you only have to set those levels once. Once they're saved as presets, you can just click a button and return to the desired settings. This works with the large consoles as well: click a button and all of the knobs and faders will return to where they were two Wednesdays ago.
  4. Cleans up bad recordings. Maybe a plane flew overhead when your hero was saying his big line, or the air conditioning unit in the grocery store was buzzing too loud. DAW software includes special filters and tools for cleaning up clicks, pops, hums, buzzes and all other undesirable background noise.
  5. Endless plug-in options. Plug-ins are small software add-ons that allow for additional tools and functionality. They can be special effects plug-ins, virtual instruments for scoring a movie, or emulators that reproduce the sound of classic analog instruments and equipment.
  6. Graphic interfaces for placing sound recordings in the 5.1 surround sound spectrum. By moving a cursor back and to the right, you can make it sound like a train is approaching from behind the audience.
Sequencing is putting all the parts of a song together. In this case, for sound designers working in post-production, sequencing means putting all the sound effects, Foley effects, background noise and music together and timing them with what is going on in the footage. By sequencing these sounds you move them around within the realm of the footage to arrange them the way you want them to come across.

Synthesis and Sampling:

First, I will begin with the synthesizer. Synths come as either a hardware device or a software plug-in in your DAW; these days people usually just go for the software, as it does not take up any space and is easily accessible. Synths are used in music to create and manipulate sounds, and they can also be used to create sound effects for TV and movies, as you can create any type of sound you want with a synthesizer if you know how to work it.

A synthesizer consists of three main sections:
Oscillator - this is what creates the sound; here you have a choice of which sound wave to use.
Filter - this is where you can alter the frequencies of the sound and experiment with creating weird effects.
Amplifier - this controls the volume of the sound.

There is also the envelope (ADSR) section, which you can use to further shape the sound, as well as the effects section, where you can add a chorus or delay effect to create the sound you want.
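As a rough illustration of that oscillator, filter and amplifier (ADSR envelope) chain, here is a minimal Python sketch using NumPy and SciPy; the waveform, cutoff frequency and envelope times are arbitrary example values, not settings from any particular synth.

```python
# Minimal oscillator -> filter -> amplifier chain, sketching the three
# synth sections described above. All parameter values are illustrative.
import numpy as np
from scipy.signal import butter, lfilter
from scipy.io import wavfile

SR = 44100                                    # sample rate in Hz
t = np.linspace(0, 1.0, SR, endpoint=False)   # one second of time values

# 1. Oscillator: choose a waveform (here a naive sawtooth) at a given pitch.
freq = 110.0
osc = 2.0 * (t * freq - np.floor(0.5 + t * freq))     # ranges -1..1

# 2. Filter: a low-pass filter tames the harsher high frequencies.
b, a = butter(2, 2000 / (SR / 2), btype="low")
filtered = lfilter(b, a, osc)

# 3. Amplifier with a simple ADSR envelope (attack, decay, sustain, release).
def adsr(n, a=0.05, d=0.1, s=0.6, r=0.3):
    an, dn, rn = int(a * SR), int(d * SR), int(r * SR)
    sn = n - an - dn - rn
    return np.concatenate([
        np.linspace(0, 1, an),            # attack: fade in
        np.linspace(1, s, dn),            # decay: drop to the sustain level
        np.full(sn, s),                   # sustain: hold
        np.linspace(s, 0, rn),            # release: fade out
    ])

out = filtered * adsr(len(filtered))
wavfile.write("synth_note.wav", SR, (out * 32767).astype(np.int16))
```

Swapping the sawtooth for a sine or square wave, or changing the filter cutoff and envelope times, is exactly the kind of experimentation the filter and envelope sections are there for.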


Sampling, the art of triggering a sound clip to a backing beat or tempo, can be implemented in many ways during the songwriting process. It can be used to create drum kits or digital instruments or insert pieces of another recording into your song, or it can be used to destroy a clip altogether for the sake of creating an original noise.


Synthesizers are almost always used in Sci-Fi and horror films because they can produce otherworldly sounds. But for straightforward emotion, horns are used too. These are associated with pageantry, the military, and the hunt, so they are used to suggest heroism. Movies featuring death-defying heroes such as Star Wars and RoboCop use a lot of horns.




Sound sampling is a way of converting real sounds into a form that a computer can store and replay. Natural sound is in analogue form. Analogue means that something is continually changing, or to put it another way, that it has no definite value. Sound waves are a subject on their own, but you should know that sound has a frequency. This frequency dictates the pitch of the sound we hear and is measured in Hertz (Hz). If the sound oscillates 50 times a second, then its frequency is 50 Hz, and so on. The higher the frequency, the higher the pitch of the sound.
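To illustrate the sampling idea, here is a tiny Python sketch that "samples" a 50 Hz tone, i.e. stores one second of a continuously varying wave as a list of numbers; the sample rate chosen is just an example.

```python
# Sketch: "sampling" a 50 Hz tone, i.e. storing an analogue wave as numbers.
import numpy as np

SAMPLE_RATE = 8000                 # measurements taken per second (example value)
FREQ = 50.0                        # pitch of the tone in Hz

n = np.arange(SAMPLE_RATE)         # one second's worth of sample indices
samples = np.sin(2 * np.pi * FREQ * n / SAMPLE_RATE)

# Each entry in `samples` is one snapshot of the continuously varying wave;
# doubling FREQ would double the pitch heard when the samples are played back.
print(samples[:5])
```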

A software sampler is a piece of software which allows a computer to emulate the functionality of a sampler.
In the same way that a sampler has much in common with a synthesizer, software samplers are in many ways similar to software synthesizers and there is a great deal of overlap between the two; but whereas a software synthesizer generates sounds algorithmically from mathematically described tones or short-term waveforms, a software sampler always reproduces samples, often much longer than a second, as the first step of its algorithm.


You can combine synths and samples together to create ambient backdrops which can be used in films:
http://audio.tutsplus.com/tutorials/production/how-to-combine-synths-and-samples-to-create-ambient-backdrops/

Equalizing:


An electronic device or piece of software that alters sound waves is known as a signal processor. One very common signal processor is an audio equalizer. An audio equalizer raises and lowers the strength of a sound wave. The goal of equalization (EQ) is to help achieve a good mix of sound that allows all instruments and vocals to sound good together.

Equalization can target part of a sound based on the frequency of the sound wave. For example, if the bass drum is drowning out the cymbals in an audio mix, an audio equalizer can make the cymbals sound louder. In this case the sound engineer will choose to raise the strength, or gain, of the high frequencies that make up the cymbals' sound. The engineer may also choose to decrease the gain of the very low frequencies in the bass drum track.

Removing sound is another equalization goal. A bass drum microphone may also pick up and record sounds from the cymbals. The problem of recording unwanted sounds is known as bleeding or leakage. To get a cleaner bass drum track, an engineer can use an audio equalizer to lower the high frequencies on the bass drum track. This effectively removes the cymbal leakage.
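As a sketch of that kind of EQ move in software, the example below cuts the high frequencies on a (hypothetical) bass drum recording with a simple low-pass filter using SciPy; the file name and the 500 Hz cutoff are illustrative assumptions, not settings from any specific mixer or plug-in.

```python
# Sketch: removing cymbal bleed from a bass-drum track by cutting the highs.
# SciPy is assumed; "kick_with_bleed.wav" and the cutoff are placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

sr, kick = wavfile.read("kick_with_bleed.wav")
kick = kick.astype(np.float64)

# Keep everything below ~500 Hz and roll off the cymbal frequencies above it.
sos = butter(4, 500 / (sr / 2), btype="low", output="sos")
cleaned = sosfiltfilt(sos, kick, axis=0)

wavfile.write("kick_cleaned.wav", sr, cleaned.astype(np.int16))
```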

An audio equalizer can be part of an audio mixer, a stand-alone piece of electronic hardware, or a software application. Audio equalizers inside a mixer usually have controls for three bands of frequencies including high, mid-range, and low. These equalizers make it easy to use EQ during the recording process.


Several varieties of stand-alone audio equalizers can be used that target sounds based on different characteristics. A sound is generally made up of a range of frequencies known as the bandwidth. The centre frequency is in the middle of the bandwidth. A peaking, or parametric, equalizer includes controls that can affect a sound wave’s gain, bandwidth, and centre frequency.

A graphic equalizer usually includes several controls, or sliders, to manipulate several frequency ranges. These equalizers also illustrate sounds levels with a row of lights for each frequency range. These lights make it easier for an engineer to see which frequencies need to be adjusted to get a good sound mix.
Specialty software applications, often called plug-ins, that perform EQ are also widely available. Usually, the EQ software works with, or plugs in to, a larger sound recording application. The engineer can use an audio equalizer plug-in on a certain track, part of track, or all of the tracks in a recorded song.

All signal processing adds noise, or unwanted sound, to an audio track. For this reason, engineers may want to limit the amount of equalization needed during the mixing process. In place of EQ, an engineer can try to achieve a better mix of sound during the recording process by using different microphones, moving microphones, or recording various instruments during separate recording sessions.



In film sound, the sound designer matches sound to the look of the film. A sad movie has mood lighting, and the sound will be designed to match it in emotional tone. Its dialogue is EQ'd less crisply, with a lower-frequency boost.

In a happy comedy, lower frequencies are rolled off, and it's EQ'd and mixed to be "brighter."
Film sound is "sweetened" by manipulating room tone, premixing audio levels, and carefully considering dialog, music, and effects for their proper audio EQ.

Film sound expects post-production sweetening, which makes film audio sound so different from audio for video. Video sound can be sweetened, but Indies use it pretty much as it is recorded.

EQ can also alter the frequencies of the human voice to make them sound like they are on the phone which would be good in a scene where a character is on the phone and you hear the voice of the person on the other end. (Example of this in previous post in a scene from the Matrix).



Use EQ to replace missing bass or treble (by using the high and low shelving controls), reduce excessive bass or treble, boost room ambience (high frequency shelf), improve tone quality (using all the controls), and help a track stand out in the mix (by using the parametric controls). An instrument's sound is made up of a fundamental frequency (the musical note) and harmonics, even when playing only a single note, and it is these harmonics that give the note its unique character. If you use EQ to boost the fundamental frequency, you simply make the instrument louder, and don't bring it out in the mix. It should be noted that a particular frequency on the EQ (say 440 Hz) corresponds directly to a musical note on the scale (in the case of 440 Hz, to the A above middle C, hence the expression A-440 tuning reference). Boosting the harmonic frequencies, on the other hand, boosts the instrument's tone qualities, and can therefore give it its own space in the mix. Below are listed useful frequencies for several instruments:

  1. Voice: presence (5 kHz), sibilance (7.5 - 10 kHz), boominess (200 - 240 Hz), fullness (120 Hz)
  2. Electric Guitar: fullness (240 Hz), bite (2.5 kHz), air / sizzle (8 kHz)
  3. Bass Guitar: bottom (60 - 80 Hz), attack (700 - 1000 Hz), string noise (2.5 kHz)
  4. Snare Drum: fatness (240 Hz), crispness (5 kHz)
  5. Kick Drum: bottom (60 - 80 Hz), slap (4 kHz)
  6. Hi Hat & Cymbals: sizzle (7.5 - 10 kHz), clank (200 Hz)
  7. Toms: attack (5 kHz), fullness (120 - 240 Hz)
  8. Acoustic Guitar: harshness / bite (2 kHz), boominess (120 - 200 Hz), cut (7 - 10 kHz)
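To make the fundamental-versus-harmonics point above concrete, here is a tiny worked example: the harmonics of A-440 are whole-number multiples of the fundamental, and it is these upper frequencies you would nudge to change an instrument's tone rather than simply its loudness.

```python
# The fundamental of A-440 and its first few harmonics.
fundamental = 440.0                          # A above middle C, in Hz
harmonics = [fundamental * n for n in range(1, 6)]
print(harmonics)                             # [440.0, 880.0, 1320.0, 1760.0, 2200.0]
```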




The thing to remember about EQ is not to get carried away: be specific and use it only when you need it, where you need it. If you get the mic placement correct and use good pre-amps on a good-sounding instrument, you shouldn't need much.

The key to mixing audio is to make it sound exactly how you want it to sound and to make the recording even better. You do this by adding processors such as a compressor, which reduces the dynamic range so that nothing is too loud or too quiet. For a sound effect, though, you might want something to start at a low volume and then increase; in that case you would not want a compressor, but would add in a fader or filter instead. It all depends on what kind of sound you are going for. Another thing to use is a noise gate, if you want to eliminate any background noise in a sound below a certain threshold.
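Here is a very simplified sketch of what a noise gate does, muting anything below a chosen threshold; a real gate smooths the on/off transitions with an envelope so it doesn't click, and the threshold value here is just an example.

```python
# Sketch of a simple noise gate: samples quieter than the threshold are muted,
# which removes low-level background noise. A real gate would smooth the
# transitions with an envelope follower to avoid clicks.
import numpy as np

def noise_gate(samples, threshold=0.02):
    """Zero out any sample whose absolute level falls below the threshold."""
    gated = samples.copy()
    gated[np.abs(gated) < threshold] = 0.0
    return gated

audio = np.array([0.001, -0.005, 0.3, -0.4, 0.01, 0.25])
print(noise_gate(audio))    # quiet samples become 0.0, loud ones pass through
```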

Saturday 6 October 2012

Sound Designers

Sound Designers (previously known as Sound Effects Editors or Special Effects (SFX) Editors) are responsible for providing any required sounds to accompany screen action. Most Sound Designers are experienced Supervising Sound Editors who carry out a managerial role, steering the work of the entire sound post-production process, combined with the specialist role of creating the sound concept for films. As well as creating the sounds for giant explosions or car crashes, sound design is also the art of creating subtle sounds that enrich the language and feeling of a film.

Sound effects are added after filming, during the editing process, to give the film its sonic identity, e.g., location, period, or a particular mood. Creating, manipulating and positioning these sound effects are the responsibilities of Sound Designers. They may be employed by Audio Post Production Houses, or work on a freelance basis and dry-hire a room close to the picture Editor providing their own Digital Audio Workstations. They are also likely to own their own recording equipment, e.g., DAT recorders or direct to hard-disc recorders and various microphones. Sound Designers work long hours to meet a demanding schedule of deadlines.

Benjamin "Ben" Burtt, Jr.


Ben Burtt is an American sound designer who has worked on various films including: the Star Wars and Indiana Jones film series, Invasion of the Body Snatchers (1978), E.T. the Extra-Terrestrial (1982), and WALL-E (2008). He is also a film editor and director, screenwriter, and voice actor.
He is most notable for creating many of the iconic sound effects heard in the Star Wars film franchise, including the "voice" of R2-D2, the lightsaber hum, the sound of the blaster guns, and the heavy-breathing sound of Darth Vader.



Burtt pioneered modern sound design, especially in the science fiction and fantasy film genres. Before his work in the first Star Wars (now known as Star Wars Episode IV: A New Hope) in 1977, science fiction films tended to use electronic sounding effects for futuristic devices. Burtt sought a more natural sound, blending in "found sounds" to create the effects. The lightsaber hum, for instance, was derived from a film projector idling combined with feedback from a broken television set, and the blaster effect started with the sound acquired from hitting a guy wire on a radio tower with a hammer.



Walter Scott Murch


Walter Murch is an American film editor and sound designer. Murch started editing and mixing sound with Francis Ford Coppola's The Rain People in 1969. Subsequently, he worked on George Lucas's THX 1138 and American Graffiti and Coppola's The Godfather before editing picture and mixing sound on Coppola's The Conversation, for which he received an Academy Award nomination in sound in 1974. Murch also mixed the sound for Coppola's The Godfather Part II, which was released in 1974, the same year as The Conversation. He is most famous for his sound designing work on Apocalypse Now, for which he won his first Academy Award in 1979.

Notice the sound of the rotor blades from the helicopters panning from left to right as they fly past (especially effective on headphones), and the way the sound of the blades combines with the image of the ceiling fan.


In 1979, he won an Oscar for the sound mix of Apocalypse Now as well as a nomination for picture editing. Murch is widely acknowledged as the person who coined the term Sound Designer, and along with colleagues developed the current standard film sound format, the 5.1 channel array, helping to elevate the art and impact of film sound to a new level. Apocalypse Now was the first multi-channel film to be mixed using a computerized mixing board.


Unlike most film editors today, Murch works standing up, comparing the process of film editing to "conducting, brain surgery and short-order cooking", since conductors, cooks and surgeons all stand when they work. In contrast, when writing, he does so lying down. His reason is that editing film is an editorial process, whereas writing is a creative one, so he lies down rather than sits or stands, to separate his editing mind from his creating mind.


Bernard Herrmann



Bernard Herrmann was an American composer noted for his work in motion pictures.
An Academy Award-winner for The Devil and Daniel Webster in 1941, Herrmann is particularly known for his collaborations with director Alfred Hitchcock, most famously Psycho, North by Northwest, The Man Who Knew Too Much, and Vertigo. He also composed notable scores for many other movies, including Citizen Kane, The Ghost and Mrs. Muir, Cape Fear, and Taxi Driver. He worked extensively in radio drama (most notably for Orson Welles), composed the scores for several fantasy films by Ray Harryhausen, and many TV programs including most notably Rod Serling's The Twilight Zone and Have Gun Will Travel.



Verbal Vigilante


Jode Steele and David Wainwright, a.k.a. "Verbal Vigilante", are two sound designers who specialise in huge percussion and orchestral scores for film and TV. They have a section on their website outlining their projects on movies such as "Skyline", "Shark Night 3D", "In Time", "Dream House" and "Tremors".



Richard King



Richard King is an American sound designer and editor who has worked on over 70 films. A native of Tampa, Florida, he graduated from Plant High School (1972) and the University of South Florida. He won the Academy Award for Best Sound Editing for the films Master and Commander: The Far Side of the World (2003), The Dark Knight (2008) and Inception (2010) and was also nominated for War of the Worlds (2005).

Since his early days editing sound at Cannon Films (known for the Death Wish sequels and Chuck Norris action pictures), sound designer Richard King has progressed to audio intensive and award-winning mega-budget films such as Master and Commander: The Far Side of the World, War of the Worlds and, most recently, Christopher Nolan’s The Dark Knight, his second collaboration with the director.
“The sheer sonic density of The Dark Knight was challenging,” says King. “There were always two or three things going on that had to aurally work in tandem with each other and work with the music. There were a couple of big set-pieces that required a lot of effort, a lot of sound-effects recording, and a lot of trial and error.”

Because Nolan regards the temp dub to be a “charcoal sketch of the oil painting that will be the final,” King says he had to have all his ducks in a row by the temp dub, including temp versions of the final score. There was constant mixing and remixing of the sound as well as constant re-editing and redesigning of the sound effects. “Luckily I had enough time on the show, I could design and mix down large sections of the film and send them to picture editor, Lee Smith to cut into the Avid,” he recalls. “Chris would hear it and give me feedback so the track was able to evolve alongside the picture editing. By the time we got to the temp dub, there were no huge surprises.” The temp was like a finished mix with additional refining for the final.

The sound designer’s favourite sequence in the film is a truck chase where the Joker, played by the late Heath Ledger, tries to capture Harvey Dent, played by Aaron Eckhart. The long non-stop action sequence was designed to play without score. “We needed to be as musical as we could with the sound effects and try to create a rhythm that accentuated Lee’s picture editing or worked as a counterpoint to it,” says King. So he put in as many interesting sounds and frequencies as he could to keep the track alive. He tried to find high-end elements, not just the low-end roar of the truck engines, and had fun adding abstract sounds, such as animal roars, to accentuate accelerations. “Weapons were oversized sounding for what they actually are,” he says, “but worked within the context of the scene, a scene that pulls out all the stops and gets more and more crazy.”





Thursday 4 October 2012

The role of sound effects and musical elements in the moving image


Sound and music, or the absence of it, has a profound impact on film and TV. It is used to enhance drama and to help illustrate the emotional content in the story. It is also used to manipulate the feelings and sentiments of viewers. This is universal. Whether you watch domestic or foreign content, music is used to this effect. Rarely do you see a film or TV program that doesn't employ music and sound.

Sound effects and sound libraries are used all over films and it is pretty much guaranteed that there will be at least one instance of this in every film and TV show you watch.

Jack Foley is widely credited as the originator of adding post-production sound effects to film. This process takes place because some sound effects can be hard to record on set, need to be more pronounced, need to be taken from the sound of something else, or need to be edited to achieve the desired sound. People who record sound effects separately from the film are known as "Foley artists".

You can see that they are experimenting with many different objects to achieve a fitting sound for the movie, whilst watching the film to keep it in time.

An example of a sound being taken from somewhere else is the process used to create the laser blast sound in George Lucas' "Star Wars" movies. Sound designer Ben Burtt climbed a radio tower and used a hammer to strike one of the guy wires whilst recording it to produce the “pew, pew” sounds.

Another example of this lies in Peter Jackson's "The Lord of the Rings: The Fellowship of the Ring". David Farmer (sound designer on the LotR project) came up with the original template for the Balrog: he wanted it to sound like something that would live in the very bowels of the world, sort of like a big flaming turd with a sword and a whip. Or a giant horned tapeworm, if you will.

To that end, the Balrog's voice, and some of its movement, wound up being something ingenious in its simplicity: a cinder block scraping along a wooden floor at different speeds. That delightfully cracky, grinding sound that accompanies the demon is made of a mixture of rocks grinding together and the cinder block tearing over someone's parquet.


Ben Burtt, who is the sound designer for the Star Wars movies, used a very interesting method for creating the famous sounds of the lightsabers in the films, as explained in the following video.


The sound of the lightsaber moving when a microphone is waved past the sound source is created by the scientific principle known as the "Doppler effect". This is the apparent change in the frequency of a wave caused by motion between the sound source and the observer.

For example, if you were to stand at the side of a road and a car drove past you, the sound waves given off by the car as it approaches are compressed against the front of the car, making the sound appear higher pitched. Then, when the car passes, the sound waves at the back of the car trail behind it and are spaced further apart, making the sound appear lower in pitch. Here is a video of a fire truck siren demonstrating the Doppler effect.
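For a rough worked example of the numbers involved, assuming a siren pitched at 700 Hz and a vehicle travelling at about 90 km/h (both made-up values), the standard Doppler formula for a stationary listener gives:

```python
# Worked Doppler-shift example for the passing siren described above.
# Speeds are illustrative; 343 m/s is the approximate speed of sound in air.
SPEED_OF_SOUND = 343.0     # m/s
source_speed = 25.0        # m/s, roughly 90 km/h
emitted_freq = 700.0       # Hz, the siren's true pitch

# Observed frequency for a stationary listener: f' = f * v / (v -/+ v_source)
approaching = emitted_freq * SPEED_OF_SOUND / (SPEED_OF_SOUND - source_speed)
receding    = emitted_freq * SPEED_OF_SOUND / (SPEED_OF_SOUND + source_speed)

print(round(approaching), "Hz while approaching")   # ~755 Hz, sounds higher
print(round(receding), "Hz while receding")         # ~652 Hz, sounds lower
```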

Music in Movies:

Since the dawn of motion pictures, music has played an integral part of the cinematic experience. Before the advent of "talkies," music quickly became a necessary tool to aid the narrative. These conventions have become movie-making standards and are still used today.

Music can help express character emotion. In the days of silent film, the only methods to express how a character felt were the dialog cards, the actor's face, and the music score, all of which worked together to convey the necessary emotion. In a silent film, if a character delivers what appears to be a tense or dramatic speech, tense and dramatic music is sure to accompany it. In modern film-making, the same can be said to be true.

Music score is a basic and effective way to heighten the drama of a given scene in a film. Regardless of the genre (drama, comedy, romance), a film score can add to nearly any scene. In an action set-piece, the music will match the action in terms of power and intensity. In a comedy scene, the score can be expected to be as light or silly as the action taking place within the scene.

Music can establish a mood. At the beginning of a movie or scene, music is often used (along with establishing shots) to help set a tone, before any dramatic action takes place. In the beginning of a horror film, shots of a desolate, empty street may have ominous, foreboding music accompanying them, to establish that the action about to transpire is very scary.

Music can establish a time or setting. In many period pieces, a film score or music from the era is used to help establish and reinforce the specific time period in which the movie is set. Example: a film set in medieval times will often use source or score music from the very era that it is trying to recreate, thus further embedding the viewer within a particular time and place.

Music can help advance the story. The music montage is a popular way to condense a large amount of information into a short amount of time. The use of a pop song or score selection, accompanied by thematically related shots (the lead character is sad, people falling in love, hero in training) advances the story without spending the days, months, or years it would take in real life for the actual events to transpire.

Music also can mislead the viewer. Musical misdirection is most often employed in, but not limited to, thriller and horror movies, usually to lull the viewer into a sense of complacency before a big scare. Example: the heroine babysitter walks into the upstairs bedroom as calm music plays, only to discover a killer behind the door, immediately cued by piercing, shocking music. In an opposite example, the babysitter walks upstairs, with tense and scary music underneath the scene, expecting to find a killer, when it turns out only to be the house cat.

Music guides your emotions in a film. Think of movies like Psycho, The Ten Commandments, Super Fly or the Indiana Jones films. Think of Seinfeld, Lost or Numb3rs. The music in those films and shows serves as a powerful guide as to how you should feel about what is happening on the screen. Music brings a film to life like nothing else can. It has its own language that can signal a certain theme, a mood, a quality in a particular character, or even give landscapes in the film a character-like quality. Certain kinds of instruments can be used to create this effect. The instruments can evoke a certain era or time period, culture, country or a fantasy land. The Portrait of a Lady, Batman, Amadeus and The Lord of the Rings all have scores that employ these techniques. It can range from ancient, ethnic sounds, or something evocative of antiquity like the scores for Conan the Barbarian or Spartacus with romantic orchestral-style underpinnings, to something very modern and atonal like The Matrix or Alien. Lest you think I've forgotten about soundtracks, soundtracks are every bit as effective as film scores.


The music for Raiders of the Lost Ark played a key role in the film, and John Williams provided an excellent score, earning a nomination for the Oscar for best score. The spirit of Williams' style for Raiders of the Lost Ark is finely tuned to the adventuresome tone of the film's story, matching the exuberance of each of its scenes with the same precision of theme and emotion. The title march attracts the most obvious attention when the masses recall Raiders of the Lost Ark, but in reality the extremely effective and even catchy subthemes are equally vital to the score's success. Still, it's the title march you hear in stadiums and in trailers for the following entries in the franchise; just as Monty Norman's theme for James Bond and Williams' theme for Darth Vader are ingrained in pop culture as the most obvious musical representations of one serial movie character, the march for Indiana Jones is worthy of the same distinction. The score is a rare occasion in which the entire package, with only a few small detriments in lesser cues, is better than the brightest moments of almost any other score. Williams so thoroughly nails the pulse of this picture, from the melodramatic awe of the Ark to the gritty rhythms of Jones' resilience as he battles a convoy of trucks, that Raiders of the Lost Ark is a cinematic experience much greater in both intensity and entertainment value because of Williams' contribution. Three major themes exist in the score, and the purpose of each is so clear that the composer would work all of them into the sequels in the franchise. A variety of lesser motifs, including secondary phrases of these major themes, occupy significant roles in the work. There has long been speculation about additional motifs in Raiders of the Lost Ark, and while Williams definitely conjures auxiliary ideas throughout the score, their direct application remains open for debate. What isn't contested is the harmonic beauty of the score. Even the film's major action sequences offer exhilarating tonal structures and readily enjoyable rhythms, producing a consistently fluid experience.

The music for The Lord of the Rings film trilogy was composed and conducted by Howard Shore. The scores use a technique called leitmotif, in which a musical phrase is associated with a character, a feeling, an event, etc. Below is one of the themes from The Lord of the Rings, called "Concerning Hobbits"; this is the theme for the Shire. On the soundtrack, it starts with a solo flute playing the theme, followed by a solo violin and then the complete orchestra repeating it. The track then quiets down and the solo flute plays a second melody, followed by the orchestra. The track ends with the violin and then the flute repeating the first melody.

  1. First we have an introduction given by the cellos and bassoons to set the "hobbit" mood. Notice that in most of the other tracks the low strings, when played by themselves, give a kind of foreboding. Here, by playing them staccato (detached) and playing 1-5-8-5-1-5-8, which is tonal, the music gains a more playful sound. (The numbers refer to the notes of the scale: if you are in C then C is 1, D is 2, E is 3 and so forth; see the short sketch after this list.)
  2. At 0:05 the tin whistle, I believe, gives us Frodo's theme as he says, "It's wonderful to see you, Gandalf!" Notice that it is played light and airy; in other words, there is little bass and it is completely tonal. This is definitely our happy Frodo! The theme also has much ornamentation, such as grace notes and slides, which is a folk derivative. The high strings take over the cello part, which helps maintain a steady but very subtle beat. Almost all of the tracks have a lot of dissonance in them, and the fact that this one is so tonal makes the music seem very folk-like. The harmony is also quite simple in its form.
  3. This time, at 0:23, the bassoons play the 1-5-8-5-1-5-8 intro by themselves. This is where Frodo asks about the outside world.
  4. At 0:28 a solo violin plays the folk-like melody. Each note is quite detached to make it sound like a fiddle, which is a folk instrument. A harpsichord is added, playing the notes of each chord one at a time to give a very simple accompaniment. The fact that Howard Shore used a harpsichord instead of a piano makes it seem even more folk-like and detached. (A piano hammers the strings, while a harpsichord plucks them.) The strings also do what I call a boom-chuck-chuck, boom-chuck-chuck, which is also very simple in its form and gives a kind of humour to the music. This is where Gandalf says, "Well, what can I tell you?" and you see Hobbiton for the first time.
  5. At 0:43 we hear a version of Frodo's theme again while we see the market place at Hobbiton. The music is very legato (the opposite of detached) and the contrast makes this section seem very warm and inviting. However, you can still hear the harpsichord playing its detached notes in the background to continue the happy, playful side of the hobbits.
  6. At 0:54 the solo violin enters again as we see the "Happy Birthday Bilbo Baggins" sign raised. The music also rises in pitch to heighten the expectation of the party. This doesn't last very long, since Gandalf changes the subject to Bilbo.
  7. At 1:04 the music changes to a more serious tone, a premonition perhaps? There is no tune until 1:13, just held chords in the strings. It's as if the playfulness of the hobbits paused for a moment of foreboding. However, there is still no dissonance here.
  8. Of course it doesn't last very long, because at 1:18 Frodo changes the subject and tells Gandalf that he can keep his secrets, and the boom-chuck-chuck and harpsichord return. Then he says, "Whatever you did, you've been officially labelled as a disturber of the peace." This lands right on 1:36. Here the violins play even more staccato (any more and I don't think the notes would exist), with the addition of the flute to give it as humorous a tone quality as possible as the grumpy old hobbit gives Gandalf the grim look.
  9. Then we see the countryside again with the young hobbits running toward Gandalf's cart at 1:44. Frodo's theme comes back played very lyrically, again to give us the warm, fuzzy hobbit feeling.
  10. At 1:57 the hobbit children cheer for the fireworks and we get the interaction with the grumpy hobbit and his wife. The music returns to the light and detached theme to say that this is funny. I think the wife's glare happens at the silence at 2:07.
  11. At 2:15, as Frodo says, "Gandalf, I'm glad you're back," his theme returns so we can end with the warm fuzziness as Gandalf pulls up to Bag End.
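
To make the 1-5-8-5-1-5-8 figure from point 1 concrete, here is a minimal sketch of my own (assuming the key of C major purely for the example, as in the explanation above) that maps scale degrees to note names:

# Minimal sketch: mapping the 1-5-8-5-1-5-8 figure to note names.
# Assumes the key of C major purely for illustration; degree 1 is the
# tonic, 5 the fifth above it, and 8 the octave above the tonic.

C_MAJOR = ["C", "D", "E", "F", "G", "A", "B", "C'"]  # scale degrees 1..8

def degree_to_note(degree, scale=C_MAJOR):
    """Convert a 1-based scale degree to a note name in the given scale."""
    return scale[degree - 1]

figure = [1, 5, 8, 5, 1, 5, 8]              # the staccato cello/bassoon figure
print([degree_to_note(d) for d in figure])  # ['C', 'G', "C'", 'G', 'C', 'G', "C'"]

Spelled out this way, you can see why the figure feels so stable and playful: it only ever touches the tonic and the fifth.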


Music in film is also about the imagery. Intimate drawing rooms, beer parties, a coronation, sweeping landscape vistas, a foggy road: all of these scenes require the music to adapt and fit what the audience sees on screen, and to help tell the story in those scenes.

Music can be broadly appealing and direct in its storytelling, as in Shrek or Spiderman, or subtle and minimalist, as in films like Spider or Memento. Music is not all that is used. Ambient sound in film is just as important. Animal sounds, flushing toilets, objects thrown together, paper being crushed, things dropped on surfaces or rubbed together: all of these techniques yield interesting sounds that can fit the scenes in a film or show. These sounds can be used in surprising and unexpected ways; the only limit is the creativity of the team doing the work. Sound, whether from ADR, film music, modern music or soundscapes, is not accidental. Some of the sounds you hear in films come from accidental noises, but they are kept and used because of some quality that can enhance the film. Sounds are recorded from nature or from urban settings. The point is that sound is designed, not simply recorded and thrown into the mix.

Like white space in graphic design, silence is also used to dramatic effect. Too much music can grate on the nerves. Beats of silence can be put into a scene, or silence can be used as an element to underscore something or someone, and it can be just as dramatic and effective as music. Sometimes sound has to be pulled out in order to be heard properly. Sound, like music, can overwhelm viewers and may have to be pulled back, which can make it more impactful, more visceral. Less can be more, and this technique can heighten the intensity of a scene without overwhelming it.

Overall, music and sound are an integral part of film and TV production. They help tell the story, guide the audience and evoke the moods, characters and themes that drive home whatever the story is trying to convey. Music has a profound effect on our emotions, and if the performances don't affect you, often the film score or the soundtrack will.


Sounds for Intrigue
Create intrigue by "asking questions" with the weird twanging of high strings, the plinking of a child's piano, or the buzzing of a synthesizer. War of the Worlds is possibly the best example of using weird sounds to fascinate and capture the audience's imagination.

Sounds for Tension
One of the best ways to create suspense is just to raise the volume or tempo of ambient sounds. It's amazing how irritating ambient sounds can be, and this can be used to your advantage. You can really grate on the audience, subtly building tension in them with annoying spaceship beeps or jungle insects.

Theme Music
Bring atmosphere to individual characters by giving them their own theme music. Of course ensure that it matches their style!

Sounds for Heroism
After all these years, horns and brass still provide the crescendo of sounds to represent heroism. Not just accompanying sounds for heroes, but acts of heroism are often represented with a powerful orchestra as well.

Sounds for Humour
Of course, using sounds for humour means not having to worry too much about making the sounds "invisible". In fact, making the sound overly obvious can provide a bit of humour in itself. Use techniques such as timing, exaggeration, and surprise.


Non-Pitched Atmospheric Sounds

Non-pitched atmospheric sounds are used in film-making to create a feeling similar to what the score tries to establish, but in a more subtle fashion. They aren't usually very noticeable unless you're listening for them, but they still have an effect regardless of whether you're aware of the sound being played.

A brilliant example of the effectiveness of atmospheric sounds is Oren Peli's "Paranormal Activity". Often throughout the film, when there is a silence, an extremely low rumble is used to create subtle suspense and a feeling of uneasiness in the viewer.
Note how the extremely low-frequency sound stops as soon as the door slams shut. The sound is used to build suspense, to make the viewer aware that something is going to happen. This is very effective on high-end sound systems, such as those in cinemas, where the sub-woofers can reproduce extremely low sounds. Found-footage movies like this generally have no score or non-diegetic sound, so these techniques are relied on to keep things subtle and make the film feel more realistic than a high-budget blockbuster.
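
As a rough illustration of the kind of rumble described above (my own sketch, not anything from the film's actual sound team), the effect boils down to a very low tone that swells slowly and then cuts off dead:

# Minimal sketch of a sub-bass "rumble" cue: a very low sine wave that
# fades in slowly and stops abruptly, roughly mimicking the rumble
# cutting off on the door slam. All values are illustrative assumptions,
# not taken from any real film session.
import numpy as np

SAMPLE_RATE = 48000        # Hz, a common audio rate for video
DURATION = 8.0             # seconds of rumble
FREQUENCY = 30.0           # Hz, felt on a cinema sub-woofer more than heard

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
rumble = 0.3 * np.sin(2 * np.pi * FREQUENCY * t)     # keep the level low and subtle

fade_in = np.minimum(t / 4.0, 1.0)                   # 4-second swell under the "quiet"
rumble *= fade_in
rumble[int(SAMPLE_RATE * (DURATION - 0.01)):] = 0.0  # hard cut in the final 10 ms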

The thing about non-pitched sounds is that they are the little sounds in movies that you know are real; a non-pitched sound could be a door slamming or birds chirping in the park. Sound effects, on the other hand, are usually made on synthesizers or created by recording and combining various different sounds to produce unrealistic sounds, like a laser gun or a hover car.

Non-pitched sounds are not always captured as their true sounds, though. For example, if you filmed a plane taking off, it is unlikely that the sound of the real plane would be usable on the footage; usually sound designers will come up with a different way to re-create the sound, make it sound like a plane, and then edit it in. This is where Foley comes in.

 Foley effects are sound effects added to the film during post production (after the shooting stops). They include sounds such as footsteps, clothes rustling, crockery clinking, paper folding, doors opening and slamming, punches hitting, glass breaking, etc. In other words, many of the sounds that the sound recordists on set did their best to avoid recording during the shoot.

The boom operator's job is to clearly record the dialogue, and only the dialogue. At first glance it may seem odd that we add back to the soundtrack the very sounds the sound recordists tried to exclude. But the key word here is control. By excluding these sounds during filming and adding them in post, they have complete control over the timing, quality, and relative volume of the sound effects.

For example, an introductory shot of a biker wearing a leather jacket might be enhanced if we hear his jacket creak as he enters the shot, but do we really want to hear it every time he moves? By adding the Foley sound fx in post, they can control its intensity, and fade it down once the dialogue begins. Even something as simple as boots on gravel can interfere with the comprehension of the dialogue if it is recorded too loudly. Far better for the actor to wear sneakers or socks (assuming their feet are off screen!) and for the boot-crunching to be added during Foley.


Foley is usually performed by Foley artists. Ideally they stand on a Foley stage (an area with a variety of possible surfaces and props) in a Foley studio (a specialized sound studio), though any post production sound studio will do with a little modification. The Foley artists can clearly see a screen which displays the footage they are to add sound fx to, and they perform their sound effects while watching this screen for timing. The actions they perform can include walking, running, jostling each other, rubbing their clothing, handling props, and breaking objects, all while closely observing the screen to ensure their sound fx are appropriate to the vision.

Increasingly, many simple Foley sound fx are done without Foley artists: the sound effects are stored electronically and performed by the post-production sound engineer on a keyboard while watching the visual (a rough sketch of the idea follows below). Done poorly, this type of "Foley" sounds bland and repetitive, and it is nowhere near as flexible as the real thing, but it is much cheaper than renting a Foley stage and paying Foley artists to create the sound effects.
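
Conceptually, this kind of "keyboard Foley" just means dropping a pre-recorded sample onto the soundtrack at the moments the action appears on screen. Here is a minimal, hypothetical sketch of that idea (the timecodes and the stand-in footstep sample are invented purely for illustration):

# Minimal sketch: placing a stored footstep sample on a timeline at the
# moments footsteps are seen on screen. The sample and timecodes are
# hypothetical stand-ins, not data from a real session.
import numpy as np

SAMPLE_RATE = 48000
SCENE_LENGTH_S = 10.0
timeline = np.zeros(int(SAMPLE_RATE * SCENE_LENGTH_S))

# Stand-in for a pre-recorded footstep: a short burst of noise.
footstep = np.random.uniform(-0.2, 0.2, int(SAMPLE_RATE * 0.15))

# Times (in seconds) where feet hit the ground on screen; in practice the
# engineer taps these in live on a keyboard while watching the picture.
hit_times = [1.0, 1.6, 2.2, 2.8, 3.4]

for hit in hit_times:
    start = int(hit * SAMPLE_RATE)
    timeline[start:start + len(footstep)] += footstep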

Without Foley, a film sounds empty and hollow - the actors seem to be talking in a vacuum. The sound recordist, if they did a good job, has given us the dialogue and excluded everything else, but our films need more than this for the picture to come alive. We need to hear the little sounds of clothes, furniture, etc. But we need to control those sound effects so they don't obscure any of the dialogue.
Another common use for Foley sound replacement is adding it to documentary footage. Old historical film seems lifeless when it is screened without sound, and adding Foley to it helps bring those long dead images to life. Next time you watch a history documentary that uses silent archival footage, listen closely and you should hear at least minimal Foley sound fx, mostly footsteps, behind the narration.

Foley can also be used to enhance comedy or action scenes. Watch most comedy films and you'll notice that many of the sounds are enhanced for comic effect, and sometimes the Foley sound is the joke. As for action, most fist fights do not involve the actors really hitting each other, and even if they did we would not be able to record a satisfying punch sound. By punching and variously molesting such objects as cabbages, celery and sides of beef, Foley artists can record unique and much more 'realistic' action sounds.

The three important types of sound we use in film and television are referred to as hard effects, Foley, and backgrounds. The difference between the types is primarily their length. Hard effects are ordinarily short sounds like gunshots, light switches, a big slap in the face, glass smashing, etc. Backgrounds normally begin at the start of the scene and carry on all the way to its finish. As an example, if the scene we are editing is staged in a city park, it will almost always contain effects like birds chirping and singing from the beginning of the scene to the end, even though the birds usually aren't seen all the time and, in many cases, aren't seen at all.

Why do we have to have the background sound effects?

If you ever listen to footage of a film as it was recorded on location, you will notice that it is somewhat dead as far as ambient sound goes. For example, if you watch a scene in a bar you will see a lot of people talking, but you may only hear the voices of the important characters. The way this is done is that the people you see talking are actually pretending to talk, or speaking in very low voices. Clearly this would feel very unnatural to us, considering that in a real bar there are a lot of people speaking. But if the people in the bar were talking normally, it would be hard for the mixer to mix the scene in a way that lets the audience easily understand the dialogue of the main actors (which is always the most important thing).

In order to make it more natural, the sound editor can add in the sound effect of men and women speaking (sometimes called "Walla"). The editor will also add other sounds to help the scene seem more realistic, such as glasses clinking, drinks being mixed, room tone, perhaps an off-screen billiard game, and much more.
What are room tones? In pretty much any room you enter you'll notice some kind of sound. It could be from the air conditioning, it could be city traffic outside, it could be a refrigerator or something else. You don't always notice it, but our brains are quite used to hearing those sounds all the time. So, to make the film feel as natural as real life, we always add room tone to interior scenes. When scenes are outdoors, we have the choice of adding traffic or wind. As well as making the movie feel much more natural, these backgrounds are also very helpful for the dialogue mixer, because the dialogue arriving at the mix stage may have moments where the background noise in the recordings changes noticeably, or drops out altogether. A background sound bed helps mask those holes (the sketch below illustrates the idea).
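
As a rough, hypothetical illustration of that masking effect (my own sketch, with random noise standing in for real recordings), a steady room-tone bed laid under two takes with different background hiss makes the join between them far less obvious:

# Minimal sketch: a steady room-tone bed masking a jump in background
# noise between two dialogue takes. All signals are random-noise
# stand-ins for illustration only.
import numpy as np

SAMPLE_RATE = 48000
half = int(SAMPLE_RATE * 3.0)

take_a = 0.01 * np.random.randn(half)        # quiet background hiss in take A
take_b = 0.03 * np.random.randn(half)        # noticeably louder hiss in take B
dialogue = np.concatenate([take_a, take_b])  # audible "jump" at the join

room_tone = 0.05 * np.random.randn(2 * half)  # continuous bed under the whole scene
mix = dialogue + room_tone                    # the change in hiss is now far less obvious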

But the most important reason we add background sound effects is to make a movie or television show richer. Backgrounds are excellent for setting the atmosphere the director would like the audience to experience. As an example, you can have two different films shot at the exact same location in a city, say one romantic movie and one suspenseful one. In the romantic film we will put in light city traffic elements, nearly no horns, wind in the trees, and birds. For the suspenseful film you can add very heavy city traffic, people honking, police sirens, nervous crowd voices, helicopters, etc. Together with music, background sound is a fantastic instrument for setting the mood of a film.

An example of how non-pitched sounds are used in a romantic movie is the final scene of "Pretty Woman". As well as the music helping the scene feel more romantic, pay attention to the little sounds that help make the film seem more realistic.

At the start of the clip you hear a slight breeze out on the street filling the background; when it cuts to Julia Roberts in her room you can hear her footsteps as she walks around. As you hear the car horn, she walks towards the window and you hear her footsteps again. When the music kicks in, unless you pay close attention you won't notice the next sounds, but the sound of the limo driving is there, as well as the birds' wings flapping away. There is also the screeching of the ladder being pulled down as he climbs up. All these sounds are important; even though you don't always notice them, they are there to make the film seem more realistic, and it is precisely because they are realistic that you don't notice them, since they are what you would expect to hear in a real-life situation.

My last example is a scene from "The Matrix". Listen to how the non-pitched sounds help create a more dramatic and suspenseful scene as Neo (the main character) attempts a daring escape from the agents.


At the start of the clip, when the delivery man comes in, you can hear the slight noise of Neo's chair turning around, then all the little sounds of Neo signing for a package, like the scratch of the pen and the handing over of the parcel. Then comes the sound of the package being ripped open, the ringing of the phone and the sound of the phone flicking open; you will hardly ever hear a real phone make that sound when flicked open, but it is added for dramatic effect.

Morpheus' voice on the phone would have been added in post-production, as it would not have been possible to record the scene and the phone call at the same time. The sound designer would also have altered the frequencies of Morpheus' voice using an equalizer to make it sound like the receiving end of a phone call, making the conversation on the phone more realistic (a rough idea of this kind of "telephone" EQ is sketched below).
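
As a hedged, minimal sketch of that technique (my own illustration, not the actual Matrix session): a "telephone" EQ essentially band-limits the voice to roughly the 300-3400 Hz range that landline calls carry, which is a large part of what makes it read as the far end of a phone call.

# Minimal sketch of a "telephone" EQ: band-pass a clean voice recording
# to roughly the 300-3400 Hz band of a landline call. The input here is
# random noise standing in for a real dialogue recording.
import numpy as np
from scipy.signal import butter, sosfilt

SAMPLE_RATE = 48000

def telephone_eq(voice, sample_rate=SAMPLE_RATE):
    """Band-limit a voice signal to the classic telephone band."""
    sos = butter(4, [300.0, 3400.0], btype="bandpass", fs=sample_rate, output="sos")
    return sosfilt(sos, voice)

voice = 0.1 * np.random.randn(SAMPLE_RATE * 2)   # stand-in for the studio recording
phone_voice = telephone_eq(voice)                 # sounds thin and boxy, like a call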

As the agents walk around the room you can hear their footsteps as well as the music being played, which adds suspense as they search for Neo. Neo's footsteps are also heard as he runs to the room at the end after being told to, but his footsteps are faster-paced, creating a feeling of haste and panic. As Neo enters the room you hear the sound of the door opening and closing, another non-pitched sound. Then comes the sound of the window opening, followed straight after by the sound of the wind from outside; all these sounds come together to make it feel as though he has actually opened the window, and the wind also helps build suspense because it makes the audience feel that he is high up.

After the phone call with Morpheus, Neo climbs out of the window, and again you can hear all the little sounds of his steps to make it feel real, while the wind grows more intense. You then hear Neo's feet shuffling along the ledge, making his movements feel panicked yet fast; at the same time the wind starts to blow hard, making almost a whistling sound and creating more suspense.

Neo then comes to an obstacle, and at this point you can hear the metal clanging and rattling as he grabs hold of it and tries to make his way around. Neo takes one final look down before a massive gust of wind blows, making a whistling sound and knocking the phone out of his hand. Neo quickly grabs onto the metal barrier and the clangs and bangs of the metal are heard as his hands lock on for dear life. The wind continues to blow and Neo finally says "I can't do this" and gives up his attempt to escape. As Neo is captured and taken into custody you hear light traffic noises in the background to ease the tension, and the doors of the car opening and shutting as Neo is put inside.

Without these little sounds, films would not be brought to life as much. These are the sounds you hear in everyday life, so as you watch a film you expect to hear the sounds you would normally hear in a real-life situation; when used properly, as I have demonstrated with these two examples, they make a real impact on a scene.