Science Fair Project Encyclopedia
Rendering is the process of generating an image from a description of three-dimensional objects by means of a software program. The description is written in a strictly defined language or data structure and contains geometry, viewpoint, texture, and lighting information. The output is a digital image, a raster graphics image.
It is one of the major sub-topics of 3D computer graphics, and in practice it is always connected to the others. In the graphics pipeline it is the last major step, giving the final appearance to the models and animation. As computer graphics has grown more sophisticated since the 1970s, rendering has become an increasingly distinct subject.
Rendering is used in computer games, simulators, movie and TV special effects, and design visualisation, each employing a different balance of features and techniques. As a product, a wide variety of renderers are available: some are integrated into larger modelling and animation packages, some are stand-alone, and some are free open-source projects. Internally, a renderer is a carefully engineered program based on a selective mixture of disciplines: light physics, visual perception, mathematics, and software development.
In the case of 3D graphics, rendering is either a slow, computationally intensive process, as is typical for movie production, or supported by real-time 3D hardware accelerators in graphics cards, as is typical for 3D computer games. The term is used by analogy with an "artist's rendering" of a scene.
When the pre-image (usually a wireframe sketch) is complete, rendering adds bitmap textures or procedural textures, lights, bump mapping, and the objects' positions relative to one another. The result is the completed image the consumer or intended viewer sees.
For movie animations, several images (frames) must be rendered and stitched together in a program capable of producing an animation of this sort. Most 3D image editing programs can do this.
A rendered image can be understood in terms of a number of visible features. Rendering research and development has been largely motivated by finding ways to simulate these efficiently. Some relate directly to particular algorithms and techniques, while others arise from combinations of them.
- shading — how the color and brightness of a surface varies with lighting
- texture-mapping — a method of applying detail to surfaces
- bump-mapping — a method of simulating small-scale bumpiness on surfaces
- fogging/participating medium — how light dims when passing through non-clear atmosphere or air
- shadows — the effect of obstructing light
- soft shadows — varying darkness caused by partially obscured light sources
- reflection — mirror-like or highly glossy reflection
- transparency — sharp transmission of light through solid objects
- translucency — highly scattered transmission of light through solid objects
- refraction — bending of light associated with transparency
- indirect illumination — surfaces illuminated by light reflected off other surfaces, rather than directly from a light source
- caustics (a form of indirect illumination) — reflection of light off a shiny object, or focusing of light through a transparent object, to produce bright highlights on another object
- depth of field — objects appear blurry or out of focus when too far in front of or behind the object in focus
- motion blur — objects appear blurry due to high-speed motion, or the motion of the camera
- non-photorealistic rendering — rendering of scenes in an artistic style, intended to look like a painting or drawing
Two families of overall light-transport techniques have emerged: radiosity, related to finite element mathematics, and ray tracing, related to Monte Carlo mathematics. Both can provide a framework for a fairly complete solution to the rendering equation, but such approaches can be very slow and computationally intensive.
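The Monte Carlo connection can be made concrete with a tiny estimator. The sketch below (illustrative names, standard library only) estimates a hemisphere integral of the kind that appears in the rendering equation by averaging random samples, which is essentially how a Monte Carlo ray tracer estimates incoming light:

```python
import math
import random

def estimate_hemisphere_integral(n_samples=100_000, seed=1):
    """Monte Carlo estimate of the integral of cos(theta) over the unit
    hemisphere (true value: pi).  A Monte Carlo ray tracer estimates the
    integral over incoming light directions in exactly this way: average
    random samples of the integrand, divided by the sampling pdf."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_samples):
        # For directions sampled uniformly on the hemisphere, cos(theta)
        # is uniform in [0, 1); the azimuth does not affect this integrand.
        cos_theta = random.random()
        total += cos_theta
    # The pdf of uniform hemisphere sampling is 1 / (2*pi).
    return (total / n_samples) * 2.0 * math.pi

print(estimate_hemisphere_integral())  # ≈ 3.14 (the true value is pi)
```

The same scheme, with the integrand replaced by incoming radiance times the surface reflection, underlies path tracing and distributed ray tracing.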
For real-time rendering, a complete calculation is not currently possible. It is much faster to simplify using one or both of these common approximations: no illumination at all, just texture mapping, since the intrinsic colors of an object have the greatest influence on its appearance; or direct illumination, light travelling from a light source to a surface and then reflected from the surface to the camera or eye, since this light path is usually dominant in a scene. These are often augmented with other special-case effects or precalculations.
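As a minimal sketch of the direct-illumination approximation (function and parameter names here are illustrative, not from any particular renderer), the following computes one light-to-surface-to-eye bounce using Lambert's cosine law:

```python
def lambert_direct(surface_color, normal, light_dir, light_color):
    """Single-bounce 'direct illumination' shading: light travels from
    the source to the surface and then to the eye, with no further
    bounces.  Vectors are unit-length (x, y, z) tuples; colors are
    (r, g, b) triples in [0, 1]."""
    # Lambert's cosine law: brightness falls off with the angle between
    # the surface normal and the direction toward the light.
    cos_term = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(s * lc * cos_term for s, lc in zip(surface_color, light_color))

# A red surface facing a white light head-on receives full intensity.
print(lambert_direct((1, 0, 0), (0, 0, 1), (0, 0, 1), (1, 1, 1)))  # (1.0, 0.0, 0.0)
```

Indirect bounces, shadows, and specular highlights are exactly the parts this approximation leaves out.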
- The painter's algorithm
- Scanline algorithms
- Z-buffer algorithms
- Global illumination
- Ray tracing
- Volume rendering
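One of the listed techniques, the Z-buffer, can be sketched in a few lines: per pixel, keep only the fragment nearest the camera. This is an illustrative toy; real rasterizers generate fragments by scan-converting triangles rather than receiving them as a list:

```python
def zbuffer_render(width, height, fragments):
    """Minimal Z-buffer: for each candidate fragment (x, y, depth, color),
    keep the one with the smallest depth, i.e. nearest the camera.
    Fragments may arrive in any order; visibility still comes out right."""
    INF = float("inf")
    depth = [[INF] * width for _ in range(height)]   # per-pixel nearest depth
    image = [[None] * width for _ in range(height)]  # per-pixel color
    for x, y, z, color in fragments:
        if z < depth[y][x]:        # nearer than what is stored?
            depth[y][x] = z
            image[y][x] = color
    return image

# Two fragments compete for pixel (0, 0); the nearer one (z=1.0) wins.
img = zbuffer_render(2, 1, [(0, 0, 5.0, "far"), (0, 0, 1.0, "near")])
print(img[0][0])  # near
```

The painter's algorithm achieves the same visibility result by sorting whole objects back-to-front instead of comparing depths per pixel, which fails for interpenetrating surfaces; the Z-buffer trades memory for robustness.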
Rendering for movies often takes place on a network of tightly connected computers known as a render farm.
The current state of the art in 3D image description for movie creation is the RenderMan scene description language, designed at Pixar (compare with simpler 3D file formats such as VRML, or with APIs such as OpenGL and DirectX, which are tailored for 3D hardware accelerators).
Most rendering development and use aims at photorealism — to produce images indistinguishable from photographs.
The implementation of a realistic renderer always has some basic element of physical simulation or emulation — some computation which resembles or abstracts a real physical process.
The term 'physically-based' indicates the use of physical models and approximations that are more general and widely accepted outside rendering. A particular set of related techniques has gradually become established in the rendering community.
The basic concepts are moderately straightforward but intractable to calculate exactly, and a single elegant algorithm or approach has remained elusive for more general-purpose renderers. To meet demands of robustness, accuracy, and practicality, an implementation is typically a complex combination of different techniques.
Rendering research is concerned with both the adaptation of scientific models and their efficient application.
The rendering equation
This is the key academic and theoretical concept in rendering. It serves as the most abstract formal expression of the non-perceptual aspect of rendering, and all of the more complete algorithms can be seen as solutions to particular formulations of this equation.
Its meaning: at a particular position and direction, the outgoing light (Lo) is the sum of the emitted light (Le) and the reflected light, where the reflected light is the sum of the incoming light (Li) from all directions, multiplied by the surface reflection and the cosine of the incoming angle. By connecting outward light to inward light via an interaction point, this equation stands for the whole 'light transport', that is, all the movement of light, in a scene.
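Written out in the standard form (as introduced by Kajiya in 1986; see the chronology below), with the BRDF $f_r$ playing the role of the "surface reflection" and the cosine factor that of the "incoming angle":

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Here $x$ is the surface point, $\omega_o$ and $\omega_i$ are the outgoing and incoming directions, $n$ is the surface normal, and $\Omega$ is the hemisphere of incoming directions above the surface.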
This expresses a simple model of light interaction with a surface. Light interaction is often approximated by the even simpler models: diffuse reflection and specular reflection, although both can be BRDFs.
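A minimal sketch of the diffuse-plus-specular split, in the style of the Phong reflection model (the constants and names below are illustrative defaults, not standard values):

```python
def dot(a, b):
    """Dot product of two (x, y, z) tuples."""
    return sum(x * y for x, y in zip(a, b))

def phong_intensity(normal, light_dir, view_dir, kd=0.7, ks=0.3, shininess=32):
    """Classic diffuse + specular split for one unit-strength light.
    Vectors are unit-length (x, y, z) tuples; kd and ks weight the
    diffuse and specular terms, and 'shininess' controls how tight
    the highlight is."""
    # Diffuse term: Lambert's cosine law.
    diffuse = max(0.0, dot(normal, light_dir))
    # Specular term: reflect the light direction about the normal,
    # r = 2(n.l)n - l, then compare it with the view direction.
    nl = dot(normal, light_dir)
    reflect = tuple(2 * nl * n - l for n, l in zip(normal, light_dir))
    specular = max(0.0, dot(reflect, view_dir)) ** shininess
    return kd * diffuse + ks * specular

# Light and viewer both directly above the surface: full diffuse
# contribution plus the maximum mirror-direction highlight.
print(phong_intensity((0, 0, 1), (0, 0, 1), (0, 0, 1)))  # ≈ 1.0
```

A full BRDF generalizes this: instead of two fixed lobes, it gives the reflected fraction for every pair of incoming and outgoing directions.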
Rendering is practically exclusively concerned with the particle aspect of light physics, known as geometric optics. Treating light, at its basic level, as particles bouncing around is a simplification, but an appropriate one: the wave aspects of light are negligible in most scenes and are significantly more difficult to simulate. Notable wave-aspect phenomena include diffraction (as seen in the colours of CDs and DVDs) and polarisation (as seen in LCDs). Both types of effect, if needed, are simulated by an appearance-oriented adjustment of the reflection model.
Though it receives less attention, an understanding of human visual perception is valuable to rendering, for two reasons: image displays have restricted ranges, and so does human perception. A renderer can simulate an almost infinite range of light brightness and color, but current displays (movie screens, computer monitors, etc.) cannot reproduce so much, and something must be discarded or 'compressed'. Human perception also has limits, and so does not need to be given large-range images to create realism. This helps solve the problem of fitting images onto displays and, furthermore, suggests which short-cuts could be used in the rendering simulation, since certain subtleties will not be noticeable.
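One common way to 'compress' a large simulated luminance range into what a display can show is a global tone-mapping curve. A minimal sketch, using the simple L/(1+L) operator in the style of Reinhard et al. (a choice made here for illustration, not specified above):

```python
def reinhard_tonemap(luminance):
    """Global tone mapping, L / (1 + L): squeezes an unbounded
    high-dynamic-range luminance into [0, 1) so it fits a display,
    spending most of the output range on the darker values the eye
    distinguishes best."""
    return luminance / (1.0 + luminance)

# Dim, mid, and very bright inputs all land in the displayable range:
for L in (0.1, 1.0, 100.0):
    print(round(reinhard_tonemap(L), 3))  # prints 0.091, 0.5, 0.99
```

Note how the curve is nearly linear for dim values but compresses bright ones heavily, mirroring the perceptual observation above that large absolute ranges are not needed for realism.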
Chronology of published ideas
- 1970 Scan-line algorithm (Bouknight, W. J. (1970). A procedure for generation of three-dimensional half-tone computer graphics presentations. Communications of the ACM)
- 1971 Gouraud shading (Gouraud, H. (1971). Computer display of curved surfaces. IEEE Transactions on Computers 20 (6), 623–629.)
- 1974 Texture mapping (Catmull, E. (1974). A subdivision algorithm for computer display of curved surfaces. PhD thesis, University of Utah.)
- 1974 Z-buffer (Catmull, E. (1974). A subdivision algorithm for computer display of curved surfaces. PhD thesis)
- 1975 Phong shading (Phong, B-T. (1975). Illumination for computer generated pictures. Communications of the ACM 18 (6), 311–316.)
- 1976 Environment mapping (Blinn, J.F. Newell, M.E. (1976). Texture and reflection in computer generated images. Communications of the ACM 19, 542–546.)
- 1977 Shadow volumes (Crow, F.C. (1977). Shadow algorithms for computer graphics. Computer Graphics (Proceedings of SIGGRAPH 1977) 11 (2), 242–248.)
- 1978 Shadow buffer (Williams, L. (1978). Casting curved shadows on curved surfaces. Computer Graphics (Proceedings of SIGGRAPH 1978) 12 (3), 270–274.)
- 1978 Bump mapping (Blinn, J.F. (1978). Simulation of wrinkled surfaces. Computer Graphics (Proceedings of SIGGRAPH 1978) 12 (3), 286–292.)
- 1980 BSP trees (Fuchs, H. Kedem, Z.M. Naylor, B.F. (1980). On visible surface generation by a priori tree structures. Computer Graphics (Proceedings of SIGGRAPH 1980) 14 (3), 124–133.)
- 1980 Ray tracing (Whitted, T. (1980). An improved illumination model for shaded display. Communications of the ACM 23 (6), 343–349.)
- 1981 Cook shader (Cook, R.L. Torrance, K.E. (1981). A reflectance model for computer graphics. Computer Graphics (Proceedings of SIGGRAPH 1981) 15 (3), 307–316.)
- 1983 Mipmaps (Williams, L. (1983). Pyramidal parametrics. Computer Graphics (Proceedings of SIGGRAPH 1983) 17 (3), 1–11.)
- 1984 Octree ray tracing (Glassner, A.S. (1984). Space subdivision for fast ray tracing. IEEE Computer Graphics & Applications 4 (10), 15–22.)
- 1984 Alpha compositing (Porter, T. Duff, T. (1984). Compositing digital images. Computer Graphics (Proceedings of SIGGRAPH 1984) 18 (3), 253–259.)
- 1984 Distributed ray tracing (Cook, R.L. Porter, T. Carpenter, L. (1984). Distributed ray tracing. Computer Graphics (Proceedings of SIGGRAPH 1984) 18 (3), 137–145.)
- 1984 Radiosity (Goral, C. Torrance, K.E. Greenberg, D.P. Battaile, B. (1984). Modelling the interaction of light between diffuse surfaces. Computer Graphics (Proceedings of SIGGRAPH 1984) 18 (3), 213–222.)
- 1985 Hemi-cube radiosity (Cohen, M.F. Greenberg, D.P. (1985). The hemi-cube: a radiosity solution for complex environments. Computer Graphics (Proceedings of SIGGRAPH 1985) 19 (3), 31–40.)
- 1986 Light source tracing (Arvo, J. (1986). Backward ray tracing. SIGGRAPH 1986 Developments in Ray Tracing course notes)
- 1986 Rendering equation (Kajiya, J.T. (1986). The rendering equation. Computer Graphics (Proceedings of SIGGRAPH 1986) 20 (4), 143–150.)
- 1987 Reyes algorithm (Cook, R.L. Carpenter, L. Catmull, E. (1987). The reyes image rendering architecture. Computer Graphics (Proceedings of SIGGRAPH 1987) 21 (4), 95–102.)
- 1991 Hierarchical radiosity (Hanrahan, P. Salzman, D. Aupperle, L. (1991). A rapid hierarchical radiosity algorithm. Computer Graphics (Proceedings of SIGGRAPH 1991) 25 (4), 197–206.)
- 1993 Tone mapping (Tumblin, J. Rushmeier, H.E. (1993). Tone reproduction for realistic computer generated images. IEEE Computer Graphics & Applications 13 (6), 42–48.)
- 1993 Subsurface scattering (Hanrahan, P. Krueger, W. (1993). Reflection from layered surfaces due to subsurface scattering. Computer Graphics (Proceedings of SIGGRAPH 1993) 27, 165–174.)
- 1995 Photon mapping (Jensen, H.W. Christensen, N.J. (1995). Photon maps in bidirectional monte carlo ray tracing of complex objects. Computers & Graphics 19 (2), 215–224.)
Books and summaries
- Foley; Van Dam; Feiner; Hughes (1990). Computer Graphics: Principles And Practice. Addison Wesley. ISBN 0201121107.
- Glassner (1995). Principles Of Digital Image Synthesis. Morgan Kaufmann. ISBN 1558602763.
- Pharr; Humphreys (2004). Physically Based Rendering. Morgan Kaufmann. ISBN 012553180X.
- Dutre; Bala; Bekaert (2002). Advanced Global Illumination. AK Peters. ISBN 1568811772.
- Jensen (2001). Realistic Image Synthesis Using Photon Mapping. AK Peters. ISBN 1568811470.
- Shirley; Morley (2003). Realistic Ray Tracing (2nd ed.). AK Peters. ISBN 1568811985.
- Glassner (1989). An Introduction To Ray Tracing. Academic Press. ISBN 0122861604.
- Cohen; Wallace (1993). Radiosity and Realistic Image Synthesis. AP Professional. ISBN 0121782700.
- Akenine-Moller; Haines (2002). Real-time Rendering (2nd ed.). AK Peters. ISBN 1568811829.
- Gooch; Gooch (2001). Non-Photorealistic Rendering. AKPeters. ISBN 1568811330.
- Strothotte; Schlechtweg (2002). Non-Photorealistic Computer Graphics. Morgan Kaufmann. ISBN 1558607870.
- Blinn (1996). Jim Blinn's Corner: A Trip Down The Graphics Pipeline. Morgan Kaufmann. ISBN 1558603875.
- Description of the 'Radiance' system
- SIGGRAPH The ACM's special interest group in graphics, the largest academic and professional association and conference in the field.
- Ray Tracing News A newsletter on ray tracing technical matters.
- Real-Time Rendering resources A list of links to resources, associated with the Real-Time Rendering book.
- http://www.graphicspapers.com/ Database of graphics papers citations.
- http://www.cs.brown.edu/~tor/ List of links to (recent) siggraph papers (and some others) on the web.
- http://www.pointzero.nl/renderers/ List of links to all kinds of renderers.
- 'Radiance' renderer. A highly accurate ray-tracing software system.
- 'Aqsis' renderer A free, RenderMan-compatible, open-source REYES renderer.
- http://www.povray.org/ A free ray tracer.
- 3D Computer Graphics and Visual Arts at Xmeta.com A directory of 3D graphics resources, including artists, tutorials, products, etc.
The contents of this article are licensed from www.wikipedia.org under the GNU Free Documentation License.