Z. Li and J. Zheng, "Visual-Salience-Based Tone Mapping for High Dynamic Range Images," in IEEE Transactions on Industrial Electronics, vol. 61, no. 12, pp. 7076-7082, Dec. 2014. doi: 10.1109/TIE.2014.2314066
Visual saliency aims to predict the attentional gaze of observers viewing a scene, making it highly relevant to the tone mapping of high dynamic range (HDR) images. In this paper, novel saliency-aware and edge-aware weightings are introduced for HDR images. They are incorporated into an existing guided image filter to form a perceptually guided image filter. The saliency-aware weighting and the proposed filter are applied to design a new local tone-mapping algorithm for HDR images such that both extreme highlight and shadow regions can be reproduced on conventional low dynamic range displays. In particular, the proposed filter is applied to decompose the luminance of the input HDR image into a base layer and a detail layer. The saliency-aware weighting is then adopted to design a saliency-aware global tone mapping for the compression of the base layer. The proposed filter preserves sharp edges in the base layer better than the existing guided filter, so halo artifacts are significantly reduced in the tone-mapped image. Moreover, the visual quality of the tone-mapped image, especially in attention-salient regions, is improved by the saliency-aware weighting.
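The base/detail decomposition described in the abstract can be illustrated with a minimal sketch. Note this is a generic guided-filter tone-mapping pipeline (He et al.'s plain guided filter, applied self-guided in the log-luminance domain), not the paper's perceptually guided filter: the saliency-aware and edge-aware weightings are omitted, and the function names and parameters (`r`, `eps`, `compression`) are illustrative assumptions.

```python
import numpy as np

def box_filter(img, r):
    """Mean filter over a (2r+1)x(2r+1) window, via an integral image.

    Edge padding makes the k*k normalization exact at the borders.
    """
    k = 2 * r + 1
    pad = np.pad(img, r, mode='edge')
    c = pad.cumsum(axis=0).cumsum(axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))  # zero row/column for the integral-image trick
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_filter(I, p, r, eps):
    """Plain guided filter: local linear model p ~ a*I + b (He et al.)."""
    mean_I = box_filter(I, r)
    mean_p = box_filter(p, r)
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    var_I = box_filter(I * I, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)   # eps controls edge preservation
    b = mean_p - a * mean_I
    return box_filter(a, r) * I + box_filter(b, r)

def tone_map(lum, r=8, eps=1e-3, compression=0.5):
    """Compress HDR luminance by attenuating only the base layer."""
    log_l = np.log10(np.maximum(lum, 1e-6))          # work in log domain
    base = guided_filter(log_l, log_l, r, eps)       # edge-preserving smoothing
    detail = log_l - base                            # fine texture, kept intact
    out = 10 ** (compression * base + detail)        # compress base only
    return (out - out.min()) / (out.max() - out.min() + 1e-12)  # to [0, 1]
```

Keeping the detail layer untouched while globally compressing the base is what preserves local contrast; the paper's contribution replaces the plain guided filter above with a saliency- and edge-weighted variant so that strong edges leak less into the detail layer (fewer halos).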
(c) 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.