
Fast Adaptive Edge-Aware Mask Generation

By Michael W. Tao and Aravind Krishnaswamy


Figure 1: An example of complex editing. The user uses one stroke to increase local contrast and another to increase local brightness on the original image (a). With just two strokes, our adaptive masking computes editing masks for the respective user strokes, as shown in (b) and (c), resulting in the output image (d). Our paper focuses on producing high-quality results in an efficient framework with low user intervention.

Abstract: Selective editing, also known as masking, is a common technique for creating localized effects on images, such as color (hue, saturation) and tonal management. Many current techniques require parameter tuning or many strokes to achieve suitable results. We propose a fast novel algorithm that requires minimal strokes and parameter tuning from users, segments the desired selection, and produces an adaptive feathered matte. Our approach consists of two steps: first, the algorithm extracts color similarities using radial basis functions; second, the algorithm segments the region the user selects to respect locality. Because of its linear complexity and the simplicity of the required user input, the approach is suitable for multiple applications, including mobile devices.
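The two steps of the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian kernel, the `sigma` and `threshold` parameters, and the 4-connected flood fill are all assumptions. Step one scores every pixel by its best radial-basis-function response to the colors sampled under the stroke; step two keeps only the connected region containing the stroke, to respect locality.

```python
import numpy as np
from collections import deque

def rbf_color_mask(image, stroke_colors, sigma=0.1):
    """Step 1 (sketch): soft mask from Gaussian RBFs centered at the
    stroke's sampled colors. The Gaussian kernel and sigma are
    assumptions, not the paper's exact formulation.

    image: (H, W, 3) float array in [0, 1]
    stroke_colors: (K, 3) list/array of colors sampled under the stroke
    Returns an (H, W) mask in [0, 1].
    """
    H, W, _ = image.shape
    pixels = image.reshape(-1, 1, 3)               # (H*W, 1, 3)
    centers = np.asarray(stroke_colors)[None]      # (1, K, 3)
    d2 = np.sum((pixels - centers) ** 2, axis=-1)  # squared color distances
    weights = np.exp(-d2 / (2.0 * sigma ** 2))     # Gaussian RBF responses
    return weights.max(axis=1).reshape(H, W)       # best-matching center wins

def restrict_to_stroke(mask, seed, threshold=0.5):
    """Step 2 (sketch): keep only the connected component of the
    thresholded mask that contains the stroke's seed pixel, via a
    4-connected BFS flood fill. The threshold is an assumption."""
    H, W = mask.shape
    keep = np.zeros_like(mask)
    active = mask >= threshold
    if not active[seed]:
        return keep
    queue, seen = deque([seed]), {seed}
    while queue:
        r, c = queue.popleft()
        keep[r, c] = mask[r, c]                    # retain the soft value
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < H and 0 <= nc < W and (nr, nc) not in seen and active[nr, nc]:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return keep

# Tiny example: two red pixels separated by a green one; the stroke
# touches only the left red pixel, so locality drops the right one.
img = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]])
mask = rbf_color_mask(img, stroke_colors=[[1.0, 0.0, 0.0]], sigma=0.2)
local = restrict_to_stroke(mask, seed=(0, 0))
```

Without step two, both red pixels receive a high weight; the flood fill is what turns a global color-similarity map into a local editing mask.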

Topics: I.4.6 [Image Processing and Computer Vision]
Year: 2013
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX