Computer graphics created a whole new art form, but also a whole new way of looking at the world. That’s a big statement, but there is a historical analogy in art with the discovery of perspective.
Okay. Let’s take a step back.
You know how early art was kind of…flat? See, before roughly 1400 A.D., Western painting lacked perspective. Groups of people and objects were arranged on the canvas in ways that were…well…flat. Paintings had little or no depth.
But around 1400 that changed with the discovery of perspective, that is, an approximate representation, on a flat surface (such as paper), of an image as it is seen by the eye. The two most characteristic features of perspective are that objects are smaller as their distance from the observer increases, and that they are subject to foreshortening, meaning that an object’s dimensions along the line of sight are shorter than its dimensions across the line of sight.
Italian Renaissance painters and architects including Filippo Brunelleschi, Masaccio, Paolo Uccello, Piero della Francesca and Luca Pacioli studied linear perspective, wrote treatises on it, and incorporated it into their artworks, thus contributing to the mathematics of art.
Now. Imagine for a moment what it must have been like to be there when these principles were first discovered: the basic principles of how the eye sees depth, and how to use simple mathematics to make a visual representation of the world around you on a piece of parchment. To the artists it must have felt like unlocking the secrets of the universe.
It was like getting a peek at God’s work notes.
Okay. Let’s fast-forward to today. Or, at least, to the 1980s, when a similarly remarkable revolution in graphics was taking place.
Computer graphics were created from geometry and mathematics. By plotting enough points in space, by drawing enough lines between those points, by using them to form virtual polygons, by mapping textures over those structures…suddenly whole universes were opening up right in front of the computer programmer’s eyes.
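The core trick behind both Renaissance perspective and those early 3D renderers is the same: divide by depth, so that distant things shrink. Here is a minimal sketch of that idea, projecting the corners of a cube onto a flat "screen." All the numbers (focal length, cube position) are made up for illustration; real renderers add cameras, clipping, lighting, and rasterization on top of this.

```python
# Minimal sketch of perspective projection: map the 8 corners of a 3D cube
# onto a 2D screen using the classic perspective divide (x/z, y/z).
# Coordinates and focal length are arbitrary, chosen only for illustration.

from itertools import product

FOCAL = 100.0  # assumed focal length, in arbitrary screen units

def project(x, y, z, focal=FOCAL):
    """Perspective projection: points farther away (larger z) land closer
    to the center of the screen, i.e. they appear smaller."""
    return (focal * x / z, focal * y / z)

# A unit cube centered on the view axis, sitting 4 to 5 units from the eye.
cube = [(x, y, z + 4.5) for x, y, z in product((-0.5, 0.5), repeat=3)]

for x, y, z in cube:
    sx, sy = project(x, y, z)
    print(f"3D ({x:5.2f}, {y:5.2f}, {z:4.2f}) -> screen ({sx:6.2f}, {sy:6.2f})")
```

The near face of the cube (z = 4) projects to ±12.5 while the far face (z = 5) projects to ±10.0: the same "smaller with distance" rule the Renaissance painters formalized, now running as arithmetic.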
Wait…did I say computer programmers? Where were the artists?
Ah, well, that’s the thing. Suddenly a new art form had been stumbled upon by computer number crunchers, and there were no artists? Welcome to the beginning of a new breed of artist: the computer artist.
Suddenly there was a void that needed to be filled. Computer geeks were drawn to this new medium of computer art in ways that could not have been predicted. And, of course, Hollywood played its part.
So, up to this point, Hollywood had presented computer graphics as simply that: graphics that came from, or were created by, a computer. But what if computer graphics were capable of depicting more than just the products of computers? What if computer graphics could actually stand in for real objects?
Okay, spaceships aren’t technically real, but as most people know, before the advent of computer graphics, when you needed a spaceship in a movie you hired people to build a model. Miniature spaceships were filmed on wires in front of painted backdrops, then eventually in front of blue screens as compositing technology improved.
But what if you could just build the model inside the computer?
That’s exactly what happened with 1984’s The Last Starfighter, a science fiction film directed by Nick Castle. The film tells the story of Alex Rogan (Lance Guest), an average teenage boy recruited by an alien defense force to fight in an interstellar war.
3D-rendered models were used to depict the spaceships and many other objects. The computer graphics for the film were rendered by Digital Productions on a Cray X-MP supercomputer; the company created 27 minutes of effects for the film, considered an enormous amount of computer-generated imagery at the time. Digital Productions estimated that computer animation required only half the time, and one-half to one-third the cost, of traditional special effects. The result was a cost of $14 million for a film that made about $21 million at the box office.
Okay, fast-forward to 1988 and a film called Willow, an American-British high fantasy film directed by Ron Howard, produced by George Lucas, and based on a story by Lucas. Not a huge box office success, but it is important for one scene. In the film, the protagonist, played by Warwick Davis, has to use magic to transform a sorceress from a goat back into human form. But he’s not that good at it. The script calls for the hapless sorceress to transform from a goat into an ostrich, then a turtle, then a tiger, and finally into a human woman. Dennis Muren, the director of visual effects for the film, considered all sorts of traditional methods for the transformation.
Muren found both stop motion and optical effects to be too technically challenging and decided that the transformation scene would be a perfect opportunity for Industrial Light and Magic to advance digital morphing technology. He proposed filming each animal, and the actress playing the sorceress, and then feeding the images into a computer program developed by Doug Smythe. The program would then create a smooth transition from one stage to another before outputting the result back onto film. Smythe began development of the necessary software in September 1987. By March 1988, the impressive result Muren and fellow designer David Allen achieved would represent a breakthrough for computer-generated imagery.
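The heart of a morph, leaving aside the geometric warping that slides one outline into another, is a cross-dissolve: each output pixel is a weighted blend of the two source images, with the weight sweeping from 0 to 1 over the shot. Here is a toy sketch of that dissolve step in Python, using made-up flat lists of pixel intensities; Smythe’s actual ILM software is not public, so this is only an illustration of the principle.

```python
# Toy cross-dissolve: blend image A into image B over a sequence of frames.
# A full morph (as in Willow) also warps matched feature points so the
# goat's outline slides into the ostrich's; this shows only the dissolve.

def dissolve(img_a, img_b, t):
    """Blend two equal-sized images; t=0 returns A, t=1 returns B."""
    assert len(img_a) == len(img_b) and 0.0 <= t <= 1.0
    return [(1.0 - t) * a + t * b for a, b in zip(img_a, img_b)]

goat    = [200.0, 180.0, 160.0]   # made-up pixel intensities
ostrich = [ 40.0,  60.0,  80.0]

frames = 5
for i in range(frames + 1):
    t = i / frames
    print(f"t={t:.1f}: {dissolve(goat, ostrich, t)}")
```

At t=0 the frame is pure goat, at t=1 pure ostrich, and in between the images blend smoothly, which is what made the transformation read as continuous on screen rather than as a hard cut.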
You could be forgiven for not realizing that a computer was even involved in that effect because the results were so spectacular.
After that, CGI was used to create the water creature in 1989’s The Abyss. Computer-generated characters were showing up everywhere. The T-1000 in 1991’s Terminator 2: Judgment Day was a tour de force of digital moviemaking that blew audiences away.
Then, in 1993, Steven Spielberg decided to adapt Michael Crichton’s novel Jurassic Park to the big screen.
This is the moment where everything changes. And next week I will examine the impact of that moment, how it changed visual effects for good and how it began to change our own ability to perceive computer generated images at all.