Per-pixel lighting

Source: Wikipedia, the free encyclopedia.

In computer graphics, per-pixel lighting refers to any technique for lighting an image or scene that calculates illumination for each pixel on a rendered image. This is in contrast to other popular methods of lighting such as vertex lighting, which calculates illumination at each vertex of a 3D model and then interpolates
the resulting values over the model's faces to calculate the final per-pixel color values.
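
To make the contrast concrete, here is a minimal sketch in C (the vector type, the lambert helper, and the barycentric weights w0..w2 are illustrative names, not taken from any particular engine): per-vertex lighting evaluates the lighting model at the three vertices and interpolates the resulting colors, while per-pixel lighting interpolates the inputs and evaluates the model anew at every pixel.

```c
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static vec3 normalize3(vec3 v) {
    float len = sqrtf(dot3(v, v));
    vec3 r = { v.x/len, v.y/len, v.z/len };
    return r;
}

/* Diffuse (Lambertian) term for one light: max(N.L, 0). */
static float lambert(vec3 n, vec3 light_dir) {
    float d = dot3(normalize3(n), normalize3(light_dir));
    return d > 0.0f ? d : 0.0f;
}

/* Per-vertex (Gouraud) shading: light each vertex, then interpolate
 * the resulting *colors* across the face with barycentric weights. */
float gouraud(float w0, float w1, float w2,
              vec3 n0, vec3 n1, vec3 n2, vec3 light_dir) {
    return w0 * lambert(n0, light_dir)
         + w1 * lambert(n1, light_dir)
         + w2 * lambert(n2, light_dir);
}

/* Per-pixel shading: interpolate the *normal*, then light each pixel. */
float per_pixel(float w0, float w1, float w2,
                vec3 n0, vec3 n1, vec3 n2, vec3 light_dir) {
    vec3 n = { w0*n0.x + w1*n1.x + w2*n2.x,
               w0*n0.y + w1*n1.y + w2*n2.y,
               w0*n0.z + w1*n1.z + w2*n2.z };
    return lambert(n, light_dir);
}
```

The practical difference shows up with sharp lighting features: a specular highlight or small light that falls in the interior of a large triangle can be missed entirely by the per-vertex version but is captured by the per-pixel one.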

Per-pixel lighting is commonly used with techniques such as normal mapping, bump mapping, and shadow volumes. Each of these techniques provides some additional data about the surface being lit or the scene and light sources that contributes to the final look and feel of the surface.

Most modern video game engines implement lighting using per-pixel techniques instead of vertex lighting to achieve increased detail and realism. Engines such as id Tech 4 and Unreal Engine 3, among others, also implement per-pixel shading techniques.

Deferred shading is a recent development in per-pixel lighting notable for its use in the Frostbite Engine and Battlefield 3. Deferred shading techniques are capable of rendering potentially large numbers of small lights inexpensively (other per-pixel lighting approaches require full-screen calculations for each light in a scene, regardless of size).[1]
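
The cost claim can be sketched in C (all names here are hypothetical): a full-screen approach shades width × height pixels for every light, while a deferred renderer can bound each light's work by the pixels its projected volume actually covers, which is a small rectangle for a small light.

```c
/* Hypothetical sketch of why small lights are cheap in deferred shading.
 * shade(x, y, l) reads per-pixel surface data and accumulates light l. */

typedef struct { int x0, y0, x1, y1; } rect;

/* Full-screen approach: every light pays the full w * h shading cost. */
void light_fullscreen(int w, int h, int nlights,
                      void (*shade)(int x, int y, int light)) {
    for (int l = 0; l < nlights; ++l)
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x)
                shade(x, y, l);
}

/* Deferred approach: each light only pays for the pixels inside its
 * screen-space bounding rectangle, so many small lights stay cheap. */
void light_deferred(int nlights, const rect *bounds,
                    void (*shade)(int x, int y, int light)) {
    for (int l = 0; l < nlights; ++l)
        for (int y = bounds[l].y0; y < bounds[l].y1; ++y)
            for (int x = bounds[l].x0; x < bounds[l].x1; ++x)
                shade(x, y, l);
}
```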

History

While only recently have personal computers and video hardware become powerful enough to perform full per-pixel shading in real-time applications such as games, many of the core concepts used in per-pixel lighting models have existed for decades.

Frank Crow published a paper describing the theory of shadow volumes in 1977.[2] A shadow volume is a shape representing the region of space eclipsed from a light source by some object; in modern implementations, the stencil buffer is used to mark the areas of the screen that correspond to surfaces lying inside a shadow volume, and those areas are shaded as shadowed after the scene has been rendered to its buffers.
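
As a hedged sketch of how the stencil counting is commonly set up in legacy OpenGL (the depth-pass variant and the draw_shadow_volumes callback are illustrative assumptions, not Crow's original formulation): front faces of the volume geometry that pass the depth test increment the stencil value, back faces decrement it, so a pixel ends with a nonzero count exactly when the visible surface lies inside at least one volume.

```c
#include <GL/gl.h>

/* Depth-pass stencil counting over Crow-style shadow volumes.
 * Assumes the scene's depth buffer is already filled; draw_shadow_volumes
 * is a hypothetical callback that rasterizes the volume geometry. */
void mark_shadowed_pixels(void (*draw_shadow_volumes)(void)) {
    glEnable(GL_STENCIL_TEST);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); /* no color writes */
    glDepthMask(GL_FALSE);                               /* no depth writes */
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glEnable(GL_CULL_FACE);

    /* Front faces that pass the depth test: the eye ray enters a volume. */
    glCullFace(GL_BACK);
    glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
    draw_shadow_volumes();

    /* Back faces that pass the depth test: the eye ray leaves a volume. */
    glCullFace(GL_FRONT);
    glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
    draw_shadow_volumes();

    /* Pixels with stencil != 0 are inside at least one volume (shadowed);
     * a later lighting pass can use glStencilFunc(GL_EQUAL, 0, ~0u) to
     * light only the unshadowed pixels. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
}
```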

Jim Blinn first introduced the idea of perturbing surface normals, the technique now known as bump mapping, in a 1978 SIGGRAPH paper.[3] Blinn pointed out that the earlier idea of unlit texture mapping proposed by Edwin Catmull was unrealistic for simulating rough surfaces. Instead of mapping a texture onto an object to simulate roughness, Blinn proposed a method of calculating the degree of lighting a point on a surface should receive based on an established "perturbation" of the normals across the surface.
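
A small sketch of the perturbation idea in C (the height field and the finite-difference scheme are illustrative assumptions): a flat patch keeps its geometry, but its normal is tilted by the slopes of a height function before lighting, so the lighting model sees roughness that the surface does not actually have.

```c
#include <math.h>

typedef struct { float x, y, z; } vec3;

/* Hypothetical height field over the surface's (u, v) parameterization. */
static float height(float u, float v) {
    return 0.05f * sinf(40.0f * u) * sinf(40.0f * v);
}

/* Normal perturbation on a flat patch whose true normal is +Z: for a
 * surface z = h(u, v), the normal is proportional to (-dh/du, -dh/dv, 1).
 * Lighting computed with this normal shades bumps the geometry lacks. */
vec3 perturbed_normal(float u, float v) {
    const float e = 1e-3f;                 /* finite-difference step */
    float dhdu = (height(u + e, v) - height(u - e, v)) / (2.0f * e);
    float dhdv = (height(u, v + e) - height(u, v - e)) / (2.0f * e);
    vec3 n = { -dhdu, -dhdv, 1.0f };
    float len = sqrtf(n.x*n.x + n.y*n.y + n.z*n.z);
    n.x /= len; n.y /= len; n.z /= len;
    return n;
}
```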

Hardware rendering

Real-time applications, such as video games, usually implement per-pixel lighting through the use of pixel shaders, allowing the GPU hardware to process the effect. The scene to be rendered is first rasterized onto a number of buffers storing different types of data to be used in rendering the scene, such as depth, normal direction, and diffuse color. Then, the data is passed into a shader and used to compute the final appearance of the scene, pixel-by-pixel.
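
A minimal sketch of that final pass in C (the pixel_data layout and the single directional light are simplifying assumptions, not a specific API): the shader's job reduces to reading each pixel's buffered attributes and evaluating the lighting model.

```c
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

/* One pixel's worth of rasterized data, as described above. */
typedef struct {
    float depth;
    vec3  normal;   /* assumed already normalized */
    vec3  diffuse;  /* surface color */
} pixel_data;

/* Final per-pixel pass: evaluate the lighting model for every pixel
 * from the buffered attributes, with one directional light. */
void shade_pixels(const pixel_data *buf, vec3 *out, size_t count,
                  vec3 light_dir) {
    for (size_t i = 0; i < count; ++i) {
        float ndotl = buf[i].normal.x * light_dir.x
                    + buf[i].normal.y * light_dir.y
                    + buf[i].normal.z * light_dir.z;
        if (ndotl < 0.0f) ndotl = 0.0f;   /* surface faces away from light */
        out[i].x = buf[i].diffuse.x * ndotl;
        out[i].y = buf[i].diffuse.y * ndotl;
        out[i].z = buf[i].diffuse.z * ndotl;
    }
}
```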

With deferred shading, a "g-buffer" is generated containing information describing the surface at each pixel of the scene, such as specular data, diffuse data, emissive maps, and albedo, among others. Using multiple render targets, all of this data can be rendered to the g-buffer in a single pass, and a shader can calculate the final color of each pixel based on the data from the g-buffer in a final "deferred pass".
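
A sketch of that deferred pass in C (the g-buffer field choices, the point-light model, and the linear attenuation are illustrative assumptions, not any particular engine's layout): the geometry pass fills one record per pixel, and the lighting pass reads only those records, never the scene geometry.

```c
#include <math.h>
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

/* One g-buffer texel, written once per pixel by the geometry pass
 * (with multiple render targets, all fields land in a single pass). */
typedef struct {
    vec3  position;   /* world-space position (often rebuilt from depth) */
    vec3  normal;     /* assumed normalized */
    vec3  albedo;
    vec3  emissive;
    float specular;   /* unused in this simplified diffuse-only sketch */
} gbuffer_texel;

typedef struct { vec3 position; vec3 color; float radius; } point_light;

/* Deferred pass for one pixel: accumulate every light that reaches it,
 * reading only the g-buffer rather than re-rasterizing the scene. */
vec3 deferred_shade(const gbuffer_texel *g,
                    const point_light *lights, size_t nlights) {
    vec3 c = g->emissive;
    for (size_t i = 0; i < nlights; ++i) {
        vec3 d = { lights[i].position.x - g->position.x,
                   lights[i].position.y - g->position.y,
                   lights[i].position.z - g->position.z };
        float dist = sqrtf(d.x*d.x + d.y*d.y + d.z*d.z);
        if (dist <= 0.0f || dist >= lights[i].radius)
            continue;                       /* outside the light's volume */
        float ndotl = (g->normal.x*d.x + g->normal.y*d.y
                     + g->normal.z*d.z) / dist;
        if (ndotl < 0.0f) ndotl = 0.0f;
        float atten = 1.0f - dist / lights[i].radius;
        c.x += g->albedo.x * lights[i].color.x * ndotl * atten;
        c.y += g->albedo.y * lights[i].color.y * ndotl * atten;
        c.z += g->albedo.z * lights[i].color.z * ndotl * atten;
    }
    return c;
}
```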

Software rendering

Per-pixel lighting is also performed in software on many high-end commercial rendering applications which typically do not render at interactive frame rates. This is called offline rendering or software rendering. Nvidia's mental ray rendering software, which is integrated with suites such as Autodesk Softimage, is a well-known example.


Notes

  1. ^ "Forward Rendering vs. Deferred Rendering".
  2. ^ Crow, Franklin C.: "Shadow Algorithms for Computer Graphics", Computer Graphics (SIGGRAPH '77 Proceedings), vol. 11, no. 2, 242-248.
  3. ^ Blinn, James F.: "Simulation of Wrinkled Surfaces", Computer Graphics (SIGGRAPH '78 Proceedings), vol. 12, no. 3, 286-292.
  4. Hargreaves, Shawn, and Mark Harris: "6800 Leagues Under the Sea: Deferred Shading". Nvidia Developer Assets.