Abstract
Physically based renderers produce high-quality images with high dynamic range (HDR) values. Therefore, these images need to be tone mapped in order to be displayed on low dynamic range (LDR) displays. A typical approach is to blindly apply tone mapping operators without taking advantage of the extra information that comes for free from the process of modeling a 3D scene. In this paper, we propose a novel pipeline for tone mapping HDR images generated by physically based renderers. Our work exploits information from the 3D scene, such as geometry, materials, luminaires, etc. This allows us to limit the assumptions that are typically made during the tone mapping step. As a consequence, we show improvements in terms of quality while keeping the entire process straightforward.