Nine Extraordinary Compositional Features

From 911composites

fig. 1 - Chopper 5 – West Coast or “Salter” version
fig. 2 - Chopper 7 – From Eric Salter

I begin by comparing the two live shots. Only two different airplane videos are confirmed to have been shown live – news helicopter shots known as “Chopper 5” and “Chopper 7”. Neither one actually shows an airplane hitting anything. They share a remarkable list of compositional characteristics:

  • Very brief (<1.5 seconds) appearance and disappearance of plane
  • High contrast between sky and tower edge
  • Plane path is across sky only
  • Plane disappears across straight vertical edge
  • All surfaces requiring airplane shadows are hidden
  • Actual impact wall is hidden
  • Camera is gyroscopically stabilized
  • Helicopter is as motionless as possible, drifting very slowly to the left
  • No panning, tilting, zooming or focusing while airplane is on screen

As it turns out, these are precisely the characteristics necessary for live video compositing. [1] Absent any one of these nine, real-time compositing becomes impossible. Given all nine, real-time compositing is quite feasible.


Pulling a Key

Both live shots are from the shady side of the towers, looking into a bright sky, making for a very high contrast between the two. A flying airplane image can be instantly added on top of any shot, but making it appear to pass behind a building requires “pulling a key”. That is, the software must be able to accurately distinguish between what is sky and what is tower.

The simplest type of key is “luminance keying” or “luma key”, in which the software decides what is what on the basis of brightness. Given this very high contrast, and also the razor-straight edge, pulling a key is easy. With lower contrast, or an irregular edge, realistically separating the elements is impossible.
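The luma-key idea can be sketched in a few lines of toy code. This is a minimal illustration under my own assumptions (the threshold value, function names, and tiny one-row “frame” are all invented for the example, not taken from any broadcast system): bright pixels are classified as sky, dark pixels as tower, and an airplane layer is only allowed to show where the matte says sky.

```python
# Minimal luma-key sketch (illustrative only; all names and values are
# hypothetical). Frames are 2-D lists of 0-255 brightness values.

THRESHOLD = 128  # assumed brightness cutoff between dark tower and bright sky

def pull_luma_key(frame):
    """Pull a matte from brightness alone: 1 where an inserted layer may
    show (sky), 0 where the background (tower) must stay on top."""
    return [[1 if px >= THRESHOLD else 0 for px in row] for row in frame]

def composite(background, plane_layer, matte):
    """Insert plane pixels only where the matte marks sky, so the plane
    appears to pass behind the dark tower edge."""
    out = []
    for bg_row, plane_row, matte_row in zip(background, plane_layer, matte):
        out.append([
            plane if (keep and plane is not None) else bg
            for bg, plane, keep in zip(bg_row, plane_row, matte_row)
        ])
    return out

# Toy 1x6 scanline: bright sky (220) meets a razor-straight tower edge (40).
sky_and_tower = [[220, 220, 220, 40, 40, 40]]
# A plane sprite (value 90) straddling the edge.
plane = [[None, 90, 90, 90, None, None]]

matte = pull_luma_key(sky_and_tower)
result = composite(sky_and_tower, plane, matte)
# The plane shows over sky but is clipped at the tower edge:
# result == [[220, 90, 90, 40, 40, 40]]
```

Note how the whole trick rests on the threshold cleanly separating the two regions: with lower contrast, sky and tower brightness values overlap and no single cutoff works.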

Keeping it Simple

Computer animation software can render very realistic shadows, but doing so requires an accurate model of the object casting the shadow, and also a model of whatever the shadow falls upon. The feasibility of animating shadows depends directly on the complexity of the surfaces involved. A flying airplane casts a shadow on the ground and buildings below, and if it smashed into a tower, it would cast a shadow on the wall in the process. Attempting to render accurate shadows in real-time would be a sure recipe for detection. Far easier would be to compose the shot in such a way as to not need them.

Making an airplane image disappear through a wall is done by masking. [2] A shape can simply be drawn, defining a region of transparency. As the airplane crosses into the mask, it disappears. However, the positioning and timing are critical. Misplacing the mask or the airplane image by even a few pixels, or having the explosion effect trigger too early, would be a dead giveaway. Compositors would not even contemplate trying to show a plane hitting the tower wall in real-time.
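The masking step can be sketched the same way. In this toy code (my own illustration; the mask position, sprite width, and function name are assumptions, not drawn from any real compositor), a region of transparency begins at a fixed column, and a sprite sliding into it simply stops being drawn:

```python
# Sketch of a transparency mask (illustrative only). Any plane pixel at
# or past MASK_START is masked out and never rendered.

MASK_START = 6  # hypothetical column where the drawn mask region begins

def apply_mask(plane_positions, width=10):
    """Render a 1-D scanline: '#' marks visible plane pixels, '.' empty.
    Pixels inside the mask region are transparent."""
    row = ['.'] * width
    for x in plane_positions:
        if 0 <= x < MASK_START:  # visible only outside the mask
            row[x] = '#'
    return ''.join(row)

# A 3-pixel "plane" sliding right, one pixel per frame:
print(apply_mask([4, 5, 6]))  # ....##....  (clipped at the mask edge)
print(apply_mask([5, 6, 7]))  # .....#....
print(apply_mask([6, 7, 8]))  # ..........  (fully inside the mask: gone)
```

The sketch also shows why placement is unforgiving: shift `MASK_START` by a few pixels and the sprite either vanishes early or pokes past the edge it is supposed to disappear behind.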

How convenient it was that all of the 9/11 news helicopters, including Chopper 5 and Chopper 7, were positioned north and west of the towers. None could see the south face of the south tower, the wall United flight 175 allegedly crashed into.

Keeping it Steady

News helicopter cameras are mounted in a very sophisticated gyroscopic stabilizer system. Though the helicopter itself is full of vibrations, and cannot hold still, helicopter video is remarkably stable. Attempting to real-time composite the smooth motion of an airplane onto any sort of non-stabilized shot is a non-starter.

Isn’t it strange that neither of the two camera operators followed the motion of the incoming airplane? Inserting an airplane image into a live shot requires the live camera to hold still. Zooming, panning, tilting, or focusing during the shot would expose the composite right away, because the airplane image would not show the same camera action. Compositing onto a moving camera shot is possible, using a process called “motion tracking”, but not in real-time. Real-time motion tracking did not exist in 2001, and even today it is not reliable enough to correctly insert an airplane in this situation.
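A one-function toy (assumptions mine; the numbers and name are invented for illustration) shows why camera motion is the problem: any inserted element must be shifted by the exact opposite of the camera's per-frame motion, and that motion first has to be measured – which is what motion tracking does.

```python
# Illustrative sketch: screen position of a composited element depends on
# how far the live camera has panned. If the pan is not measured and
# compensated, the inserted image lands in the wrong place.

def composited_screen_x(intended_x, unmeasured_pan_x):
    """Where the inserted element actually appears on screen when the
    camera pans by unmeasured_pan_x pixels that the compositor missed."""
    return intended_x - unmeasured_pan_x

# Static camera: the inserted plane lands exactly where intended.
assert composited_screen_x(100, 0) == 100
# A 12-pixel pan the compositor failed to track drags the plane 12 pixels
# off its intended path -- an instant giveaway.
assert composited_screen_x(100, 12) == 88
```

With a stabilized, motionless shot, `unmeasured_pan_x` stays near zero and no per-frame tracking is needed at all, which is the point of the nine characteristics above.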

When United 175 appears on the screen, both Chopper 5 and Chopper 7 are as motionless as possible, drifting slowly to the left. As soon as the plane is gone, both camera operators tilt and pan the camera around.


I invite all to please study other helicopter footage from 9/11, or from any live news event. Note the compositional characteristics. News helicopters are moving around all the time. They zoom in and out, pan left and right, tilt up and down. They follow the action – like, for example, an incoming passenger jet.

How likely is it that all nine of the compositional characteristics required for real-time compositing would occur by chance, on both live airplane shots, during the exact time the airplanes are on screen? Perhaps a rigorous study could be made to quantify the answer. For now I am content to say: extremely unlikely.

Thus, the compositional characteristics of Chopper 5 and Chopper 7 both strongly favor the compositing hypothesis, and make the real plane hypothesis extremely unlikely.
