Title: Object-Coherence Warping for Stereoscopic Image Retargeting
Authors: Shih-Syun Lin; Chao-Hung Lin; Shu-Huai Chang; Tong-Yee Lee
Keywords: Stereo image processing; Shape; Image edge detection; Image segmentation; Optimization; Vectors; Image color analysis
Issue Date: May-2014
Journal Volume: 24
Journal Issue: 5
Start page/Pages: 759-768
Source: IEEE Transactions on Circuits and Systems for Video Technology
Abstract: This paper addresses content-aware stereoscopic image retargeting. The key to this topic is consistently adapting a stereoscopic image to fit displays with various aspect ratios and sizes while preserving visually salient content. Most methods preserve the disparities and shapes of visually salient objects through nonlinear image warping, in which the distortions caused by warping are propagated to homogeneous, low-significance regions. However, disregarding the consistency of object deformation sometimes results in apparent distortions in both the disparities and shapes of objects. An object-coherence warping scheme is proposed to reduce this unwanted distortion. The basic idea is to utilize the information of matched objects rather than that of matched pixels in warping. Such information implies object correspondences in a stereoscopic image pair, which allows the generation of an object significance map and the consistent preservation of objects. This strategy enables the method to consistently preserve both the disparities and shapes of visually salient objects, leading to good content-aware retargeting. In the experiments, qualitative and quantitative analyses of various stereoscopic images show that the results are better than those generated by related methods in terms of the consistency of object preservation.
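Note: the following is an illustrative sketch, not the authors' implementation. The paper warps a mesh guided by object correspondences between the two views; the sketch below only shows the core intuition in a simplified one-dimensional form: per-column significance weights concentrate the width change in low-significance columns, and applying one shared warp to both views keeps matched objects, and hence their horizontal disparities, consistently scaled. All names (column_warp, retarget), the closed-form column solver, and the nearest-neighbor resampling are hypothetical simplifications.

```python
import numpy as np

def column_warp(width, significance, target_width):
    """Solve min sum_i s_i * (w_i' - 1)^2  s.t.  sum_i w_i' = target_width.

    Closed form via a Lagrange multiplier: each column's width change is
    proportional to 1/s_i, so low-significance columns absorb most of it.
    """
    inv = 1.0 / np.maximum(significance, 1e-3)        # avoid division by zero
    delta = (target_width - width) * inv / inv.sum()
    new_w = np.maximum(1.0 + delta, 0.1)              # forbid column fold-over
    new_w *= target_width / new_w.sum()               # renormalize after clamping
    return np.concatenate(([0.0], np.cumsum(new_w)))  # warped column boundaries

def retarget(img, boundaries, target_width):
    """Resample each row by inverting the monotone column mapping."""
    w = img.shape[1]
    centers = np.arange(target_width) + 0.5
    src_x = np.interp(centers, boundaries, np.arange(w + 1))
    src_x = np.clip(np.round(src_x - 0.5), 0, w - 1).astype(int)
    return img[:, src_x]                              # nearest neighbor, for brevity

# Shared warp for the stereo pair: one significance map (e.g., the per-column
# maximum over both views' object maps) drives both resamplings, so matched
# objects stay aligned and their disparities scale consistently.
left = np.random.rand(240, 320, 3)
right = np.random.rand(240, 320, 3)
sig = np.full(320, 0.1)
sig[100:220] = 1.0                     # a salient object occupying these columns
b = column_warp(320, sig, 256)
left_r, right_r = retarget(left, b, 256), retarget(right, b, 256)
assert left_r.shape == (240, 256, 3)
```

In this toy setup the salient columns shrink by about 3% while the homogeneous background shrinks by about 30%, mirroring the paper's goal of steering warping distortion away from visually salient objects; the full method achieves this in 2-D with per-quad shape and disparity terms rather than a per-column closed form.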
URI: http://scholars.ntou.edu.tw/handle/123456789/5752
ISSN: 1051-8215
DOI: 10.1109/TCSVT.2013.2291282
Appears in Collections: 資訊工程學系 (Department of Computer Science and Engineering)