High-dynamic-range rendering
Rendering a computer graphics scene
High-dynamic-range rendering (HDRR or HDR rendering), also known as high-dynamic-range lighting, is the rendering of computer graphics scenes by using lighting calculations done in high dynamic range (HDR). This allows preservation of details that may be lost due to limiting contrast ratios. Video games and computer-generated imagery movies and visual effects benefit from this as it creates more realistic scenes than with more simplistic lighting models. HDRR was originally required to tone map the rendered image onto Standard Dynamic Range (SDR) displays, as the first HDR capable displays did not arrive until the 2010s. However, if a modern HDR display is available, it is possible to instead display the HDRR with even greater contrast and realism.
Graphics processor company Nvidia summarizes the motivation for HDRR in three points: bright things can be really bright, dark things can be really dark, and details can be seen in both.
History
The use of high-dynamic-range imaging (HDRI) in computer graphics was introduced by Greg Ward in 1985 with his open-source Radiance rendering and lighting simulation software, which created the first file format to retain a high-dynamic-range image. HDRI then languished for more than a decade, held back by limited computing power, storage, and capture methods; only as hardware and tooling matured was it put into practical use.
In 1990, Eihachiro Nakamae and colleagues presented a lighting model for driving simulators that highlighted the need for high-dynamic-range processing in realistic simulations.
In 1995, Greg Spencer presented Physically-based glow effects for digital images at SIGGRAPH, providing a quantitative model for flare and blooming in the human eye.
In 1997, Paul Debevec presented Recovering high dynamic range radiance maps from photographs at SIGGRAPH. Together with Spencer's work, this paper laid the framework for creating HDR light probes of a location and then using the probe to light a rendered scene.
HDRI and HDRL (high-dynamic-range image-based lighting) have, ever since, been used in many situations in 3D scenes in which inserting a 3D object into a real environment requires the light probe data to provide realistic lighting solutions.
In gaming applications, Riven: The Sequel to Myst in 1997 used an HDRI postprocessing shader directly based on Spencer's paper. After E3 2003, Valve released a demo movie of its Source engine rendering a cityscape in high dynamic range. The term gained far wider attention at E3 2004, when Epic Games showcased Unreal Engine 3 and Valve announced Half-Life 2: Lost Coast (released in 2005); open-source engines such as OGRE 3D and open-source games such as Nexuiz followed.
By the 2010s, HDR displays first became available. With higher contrast ratios, HDRR can reduce or eliminate tone mapping, resulting in an even more realistic image.
Examples
One of the primary advantages of HDR rendering is that details in a scene with a large contrast ratio are preserved. Without HDRR, areas that are too dark clip to black and areas that are too bright clip to white, represented in hardware as floating-point values of 0.0 (pure black) and 1.0 (pure white), respectively.
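As a minimal sketch (all intensity values here are illustrative, not taken from any real engine), the difference can be shown by contrasting an LDR clamp with HDR floating-point storage:

```python
# Sketch of LDR clipping vs. HDR storage; all intensity values are illustrative.

def ldr_store(v):
    """LDR buffers clamp every value into [0.0, 1.0] before later passes."""
    return min(max(v, 0.0), 1.0)

sun = 5.0           # a source far brighter than display white
deep_shadow = 0.001 # a barely lit surface

# LDR: the sun clips to pure white; later passes cannot recover its true value.
print(ldr_store(sun))        # 1.0
# HDR: a floating-point buffer simply keeps the raw values until tone mapping.
hdr_buffer = [sun, deep_shadow]
print(hdr_buffer)            # [5.0, 0.001]
```

Because the HDR buffer still holds 5.0, later passes such as reflections, bloom, and tone mapping can make use of the lost-in-LDR detail.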
Another aspect of HDR rendering is the addition of perceptual cues which increase apparent brightness. HDR rendering also affects how light is preserved in optical phenomena such as reflections and refractions, as well as transparent materials such as glass. In LDR rendering, very bright light sources in a scene (such as the sun) are capped at 1.0. When this light is reflected the result must then be less than or equal to 1.0. However, in HDR rendering, very bright light sources can exceed the 1.0 brightness to simulate their actual values. This allows reflections off surfaces to maintain realistic brightness for bright light sources.
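A small sketch with assumed numbers illustrates the reflection case described above (the sun intensity and surface reflectivity are made-up values, not physical measurements):

```python
# Why HDR preserves bright reflections; the intensity and reflectivity
# below are illustrative numbers chosen for the example.
reflectivity = 0.5

# LDR: the sun was already capped at 1.0, so its reflection is a dull grey.
ldr_reflection = 1.0 * reflectivity            # 0.5

# HDR: the sun keeps an intensity above 1.0, so even after losing half its
# energy the reflection still exceeds display white and tone-maps as bright.
sun_intensity = 5.0
hdr_reflection = sun_intensity * reflectivity  # 2.5
print(ldr_reflection, hdr_reflection)
```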
Limitations and compensations
Human eye
The human eye can perceive scenes with a very high dynamic contrast ratio, around 1,000,000:1. Adaptation is achieved in part through adjustments of the iris and slow chemical changes, which take some time (e.g. the delay in being able to see when switching from bright lighting to pitch darkness). At any given time, the eye's static range is smaller, around 10,000:1. However, this is still higher than the static range of most display technology.
Output to displays
Although many manufacturers claim very high numbers, plasma displays, liquid-crystal displays, and CRT displays can deliver only a fraction of the contrast ratio found in the real world, and these are usually measured under ideal conditions. The simultaneous contrast of real content under normal viewing conditions is significantly lower.
Some increase in dynamic range in LCD monitors can be achieved by automatically reducing the backlight for dark scenes. For example, LG calls this technology "Digital Fine Contrast"; Samsung describes it as "dynamic contrast ratio". Another technique is to have an array of brighter and darker LED backlights, for example with systems developed by BrightSide Technologies.
OLED displays have better dynamic range capabilities than LCDs, similar to plasma but with lower power consumption. Rec. 709 defines the color space for HDTV, and Rec. 2020 defines a larger but still incomplete color space for ultra-high-definition television.
Since the 2010s, OLED and other HDR display technologies have reduced or eliminated the need for tone mapping HDRR to standard dynamic range.
Light bloom
Main article: Bloom (shader effect)
Light blooming is the result of scattering in the human lens, which the human brain interprets as a bright spot in a scene. For example, a bright light in the background appears to bleed over onto objects in the foreground. In rendering, this can be exploited to create the illusion that a bright spot is brighter than the display can actually show.
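A bloom pass is commonly implemented as a bright-pass filter followed by a blur that is composited back over the image. The following is a toy one-dimensional version; the pixel values, threshold, and spread factor are all assumptions for illustration:

```python
# Toy bloom pass on a 1-D row of HDR pixel intensities (illustrative data).
def bloom(row, threshold=1.0, spread=0.25):
    # Bright pass: keep only the energy above the threshold.
    bright = [max(v - threshold, 0.0) for v in row]
    # Crude blur: each pixel leaks a fraction of its excess to its neighbours.
    blurred = []
    for i in range(len(bright)):
        left = bright[i - 1] if i > 0 else 0.0
        right = bright[i + 1] if i + 1 < len(bright) else 0.0
        blurred.append(bright[i] + spread * (left + right))
    # Composite the glow additively over the original image.
    return [v + b for v, b in zip(row, blurred)]

row = [0.2, 0.2, 4.0, 0.2, 0.2]   # a single very bright pixel
print(bloom(row))
```

The bright pixel's excess energy bleeds onto its neighbours, which is exactly the "bright light bleeding over foreground objects" effect; real implementations use a separable 2-D Gaussian blur, often at reduced resolution.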
Flare
Main article: Lens flare
Flare is the diffraction of light in the human lens, resulting in "rays" of light emanating from small light sources, and can also result in some chromatic effects. It is most visible on point light sources because of their small visual angle.
Typical display devices cannot display light as bright as the Sun, and ambient room lighting prevents them from displaying true black. Thus, HDR rendering systems have to map the full dynamic range of what the eye would see in the rendered situation onto the capabilities of the device. This tone mapping is done relative to what the virtual scene camera sees, combined with several full screen effects, e.g. to simulate dust in the air which is lit by direct sunlight in a dark cavern, or the scattering in the eye.
Tone mapping and blooming shaders can be used together to help simulate these effects.
Tone mapping
Main article: Tone mapping
Tone mapping, in the context of graphics rendering, is a technique used to map colors from high dynamic range (in which lighting calculations are performed) to a lower dynamic range that matches the capabilities of the desired display device. Typically, the mapping is non-linear – it preserves enough range for dark colors and gradually limits the dynamic range for bright colors. This technique often produces visually appealing images with good overall detail and contrast. Various tone mapping operators exist, ranging from simple real-time methods used in computer games to more sophisticated techniques that attempt to imitate the perceptual response of the human visual system.
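As a concrete instance of such a non-linear mapping, the Reinhard operator (one of the simple real-time methods) compresses an unbounded luminance into [0, 1); the sample input values below are illustrative:

```python
# Reinhard global tone mapping: near-linear for dark values,
# asymptotically compressing highlights toward 1.0.
def reinhard(luminance):
    return luminance / (1.0 + luminance)

for v in (0.05, 1.0, 5.0, 100.0):
    print(f"{v:7.2f} -> {reinhard(v):.3f}")
```

Note how dark values pass through almost unchanged (0.05 maps to roughly 0.048) while a luminance of 100 still lands just under 1.0, preserving some separation between bright highlights instead of clipping them all to white.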
HDR displays with higher dynamic range capabilities can reduce or eliminate the tone mapping required after HDRR, resulting in an even more realistic image.
Applications in computer entertainment
- HPL Engine 3
- Babylon JS
- Torque 3D
- X-Ray Engine
References
- Simon Green and Cem Cebenoyan. (2004). "High Dynamic Range Rendering (on the GeForce 6800)". nVidia.
- Forcade, Tim. (February 1998). "Unraveling Riven". Computer Graphics World.
- [http://www.lge.com/about/press_release/detail/PRO%7CNEWS%5EPRE%7CMENU_20075_PRE%7CMENU.jhtml Digital Fine Contrast]
- [http://www.dolby.com/promo/hdr/technology.html BrightSide Technologies is now part of Dolby]. Archived 2007-09-10.
- (2006). "Rendering – Features – Unreal Technology". Epic Games.
- (2007). "SOURCE – RENDERING SYSTEM". Valve.
- (2015). "The Amazing Technology of The Witcher 3". PC-Gamer.
- (2004). "FarCry 1.3: Crytek's Last Play Brings HDR and 3Dc for the First Time". X-bit Labs.
- (2011). "CryEngine 2 – Overview". CryTek.
- Pereira, Chris. (December 3, 2016). "Kojima Partnering With Killzone, Horizon Dev Guerrilla for Death Stranding". CBS Interactive.
- (2011). "Unigine Engine – Unigine (advanced 3D engine for multi-platform games and virtual reality systems)". Unigine Corp..
- "BabylonDoc".
- (2019-08-22). "MIT Licensed Open Source version of Torque 3D from GarageGames: GarageGames/Torque3D".
This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.