Sphere mapping


In computer graphics, sphere mapping (or spherical environment mapping) is a parameterization of directional radiance obtained by projecting the reflection of a mirrored sphere onto a plane using an orthographic projection. The resulting 2D texture encodes incident light as a function of direction and is used for reflection mapping by converting a surface reflection vector into texture coordinates. The method assumes the environment is distant (so radiance depends only on direction), but it introduces non-uniform distortion and contains a singularity in the direction opposite the viewing direction used to create the map.[1]

To use this data, the surface normal of the object, the view direction from the object to the camera, and/or the reflected direction from the object into the environment are used to calculate a texture coordinate for a lookup in the aforementioned texture map. The result appears as though the environment is reflected in the surface of the object being rendered.
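The general lookup can be sketched in code. The following is a minimal sketch (function names are illustrative) using the classic sphere-mapping formula, the same one used by OpenGL's GL_SPHERE_MAP texture-coordinate generation: the unit reflection vector R is offset toward the viewer along +z, renormalized, and its x and y components are scaled into the [0, 1] texture range. Note the division by m fails when R = (0, 0, −1), which is exactly the singularity opposite the viewing direction mentioned above.

```python
import math

def reflect(ix, iy, iz, nx, ny, nz):
    """Reflect incident direction I about unit normal N: R = I - 2(N.I)N."""
    d = 2.0 * (ix * nx + iy * ny + iz * nz)
    return ix - d * nx, iy - d * ny, iz - d * nz

def sphere_map_uv(rx, ry, rz):
    """Map a unit reflection vector R to sphere-map texture coordinates.

    m is twice the length of R + (0, 0, 1); dividing by it projects R
    onto the sphere map, and the +0.5 recenters the map on (0.5, 0.5).
    Undefined (m == 0) when R points directly away from the viewer.
    """
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5
```

For example, a viewer looking down −z at a surface with normal (0, 0, 1) produces the reflection vector (0, 0, 1), which lands at the center of the map, (0.5, 0.5).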

Usage example

In the simplest case for generating texture coordinates, suppose:

  • The map has been created as above, looking at the sphere along the z-axis.
  • The texture coordinate of the center of the map is (0,0), and the sphere's image has radius 1.
  • We are rendering an image in exactly the same situation as the sphere, but the sphere has been replaced with a reflective object.
  • The image being created is orthographic, or the viewer is infinitely far away, so that the view direction does not change as one moves across the image.

At texture coordinate (x, y), note that the depicted location on the sphere is (x, y, z) (where z is √(1 − x² − y²)), and the normal at that location is also (x, y, z). However, we are given the reverse task (a normal for which we need to produce a texture map coordinate). So the texture coordinate corresponding to normal (n_x, n_y, n_z) is simply (n_x, n_y).
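Under the simplifying assumptions listed above, this correspondence is a direct read-off in both directions. A minimal sketch (function names are illustrative, and the texture coordinate ranges over the unit disc centered at (0, 0) as assumed above):

```python
import math

def normal_to_texcoord(nx, ny, nz):
    """Simplest case: a front-facing unit normal (nz >= 0) maps directly
    to the sphere map at (nx, ny)."""
    assert nz >= 0.0, "back-facing normals fall outside the map"
    return nx, ny

def texcoord_to_normal(x, y):
    """Inverse direction: recover the depicted normal at texture
    coordinate (x, y) inside the unit disc, using z = sqrt(1 - x^2 - y^2)."""
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return x, y, z
```

For instance, the texture coordinate (0.6, 0.0) depicts the point on the sphere with normal (0.6, 0.0, 0.8), and looking up that normal returns the same coordinate.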
