19.5. Image Interpolation and Extrapolation

In 1994, Paul Haeberli and Douglas Voorhies published an interesting paper describing imaging operations that can be performed with interpolation and extrapolation. These operations could actually be programmed on the high-end graphics systems of that time; today, thanks to the OpenGL Shading Language, they can be done quite easily on consumer graphics hardware. The technique is simple: determine a target image that can be used together with the source image to perform interpolation and extrapolation. The equation is set up as a simple linear interpolation that blends the two images:

result = (1 - alpha) * target + alpha * source
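A minimal GLSL sketch of this blend, assuming both images are bound as textures to hypothetical samplers named SourceImage and TargetImage, the image is drawn as a textured quad, and texture coordinates arrive in gl_TexCoord[0], looks like this:

// Generic interpolation/extrapolation sketch (assumed setup: source and
// target images bound as 2D textures, image drawn as a textured quad).
uniform sampler2D SourceImage;   // hypothetical sampler names
uniform sampler2D TargetImage;
uniform float     Alpha;         // 0 = target, 1 = source, > 1 = extrapolate

void main()
{
    vec4 source = texture2D(SourceImage, gl_TexCoord[0].st);
    vec4 target = texture2D(TargetImage, gl_TexCoord[0].st);

    // mix(a, b, t) computes a * (1.0 - t) + b * t; values of Alpha outside
    // [0, 1] extrapolate rather than interpolate.
    gl_FragColor = mix(target, source, Alpha);
}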
The target image is the image that you want to interpolate toward or extrapolate away from. Values of alpha between 0 and 1 interpolate between the two images, and values greater than 1 extrapolate beyond the source image, away from the target. For instance, to adjust brightness, the target image is one in which every pixel is black. When alpha is 1, the result is the source image. When alpha is 0, the result is that all pixels are black. When alpha is between 0 and 1, the result is a linear blend of the source image and the black image, effectively darkening the image. When alpha is greater than 1, the image is brightened.

Such operations can be applied to images (pixel rectangles in OpenGL jargon) with a fragment shader as they are being sent to the display. In cases in which a target image is really needed (in many cases, it is not, as we shall see), it can be stored in a texture and accessed by the fragment shader. If the source and target images are downloaded into memory on the graphics card (i.e., stored as textures), these operations can be blazingly fast, limited only by the memory speed and the fill rate of the graphics hardware. This should be much faster than performing the same operations on the CPU and downloading the image across the I/O bus every time it is modified.

19.5.1. Brightness

Brightness is the easiest example. The target image is composed entirely of black pixels (i.e., pixel values (0,0,0)). The first half of the interpolation equation therefore goes to zero, and the equation reduces to a simple scaling of the source pixel values by alpha. This is implemented with the fragment shader in Listing 19.3, and the results for several values of alpha are shown in Color Plate 30.

Listing 19.3. Fragment shader for adjusting brightness
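The listing declares Alpha as a uniform variable; a minimal sketch of such a shader, assuming the source image arrives through a hypothetical sampler named SourceImage with texture coordinates in gl_TexCoord[0], follows:

// Brightness sketch: the target image is black, so the blend reduces to
// a simple scale of the source color by Alpha.
uniform sampler2D SourceImage;   // hypothetical sampler name
uniform float     Alpha;         // < 1 darkens, 1 = unchanged, > 1 brightens

void main()
{
    vec4 source = texture2D(SourceImage, gl_TexCoord[0].st);

    // For the RGB channels this is mix(vec3(0.0), source.rgb, Alpha);
    // the alpha channel is passed through unchanged.
    gl_FragColor = vec4(source.rgb * Alpha, source.a);
}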
19.5.2. Contrast

A somewhat more interesting example is contrast (see Listing 19.4). Here the target image is chosen to be a constant gray image in which each pixel contains a value equal to the average luminance of the source image. This value and the alpha value are assumed to be computed by the application and sent to the shader as uniform variables. The results of the contrast shader are shown in Color Plate 31.

Listing 19.4. Fragment shader for adjusting contrast
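A minimal sketch of such a contrast shader, assuming hypothetical uniform names AvgLuminance and SourceImage, follows:

// Contrast sketch: the target is a constant gray image whose value is the
// average luminance of the source, supplied by the application.
uniform sampler2D SourceImage;     // hypothetical sampler name
uniform float     AvgLuminance;    // average luminance of the source image
uniform float     Alpha;           // < 1 reduces contrast, > 1 increases it

void main()
{
    vec4 source = texture2D(SourceImage, gl_TexCoord[0].st);
    vec3 target = vec3(AvgLuminance);

    gl_FragColor = vec4(mix(target, source.rgb, Alpha), source.a);
}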
19.5.3. Saturation

The target image for a saturation adjustment is an image containing only luminance information (i.e., a grayscale version of the source image). This image can be computed pixel by pixel by extracting the luminance value from each RGB value. The proper computation depends on knowing the color space in which the RGB values are specified. For RGB values specified according to the HDTV color standard, you could use the coefficients shown in the shader in Listing 19.5. Results of this shader are shown in Color Plate 32. As you can see, extrapolation can provide useful results for values of alpha well above 1.0.

Listing 19.5. Fragment shader for adjusting saturation
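Because the grayscale target can be computed per pixel, no second texture is needed. A minimal sketch, assuming a hypothetical SourceImage sampler and one common set of HDTV (Rec. 709) luminance coefficients (0.2126, 0.7152, 0.0722), follows:

// Saturation sketch: the target is a grayscale image computed per pixel
// from the source, so no separate target texture is required.
uniform sampler2D SourceImage;   // hypothetical sampler name
uniform float     Alpha;         // 0 = grayscale, 1 = unchanged, > 1 oversaturates

// Rec. 709 (HDTV) luminance coefficients
const vec3 LumCoeff = vec3(0.2126, 0.7152, 0.0722);

void main()
{
    vec4  source    = texture2D(SourceImage, gl_TexCoord[0].st);
    float luminance = dot(source.rgb, LumCoeff);
    vec3  target    = vec3(luminance);

    gl_FragColor = vec4(mix(target, source.rgb, Alpha), source.a);
}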
19.5.4. Sharpness

Remarkably, this technique also lends itself to adjusting the effect of any image convolution operation (see Listing 19.6). For instance, you can construct the target image by blurring the original image. Interpolation from the source image toward the blurred image reduces high frequencies, and extrapolation (alpha greater than 1) increases them. The result is image sharpening through unsharp masking. The results of the sharpness fragment shader are shown in Color Plate 33.

Listing 19.6. Fragment shader for adjusting sharpness
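A minimal sketch of such a sharpness shader, assuming the blurred copy has been prepared in advance and bound to a second, hypothetically named sampler, follows:

// Sharpness sketch: the target is a blurred (low-pass filtered) copy of the
// source image, prepared in advance and bound as a second texture.
uniform sampler2D SourceImage;    // hypothetical sampler names
uniform sampler2D BlurredImage;   // blurred copy of the source
uniform float     Alpha;          // < 1 blurs, 1 = unchanged, > 1 sharpens

void main()
{
    vec4 source  = texture2D(SourceImage,  gl_TexCoord[0].st);
    vec4 blurred = texture2D(BlurredImage, gl_TexCoord[0].st);

    // Extrapolating away from the blurred target (Alpha > 1) boosts the
    // high frequencies: classic unsharp masking.
    gl_FragColor = vec4(mix(blurred.rgb, source.rgb, Alpha), source.a);
}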
These examples show the simple case in which the entire image is modified with a single alpha value, but more complex processing is possible. The alpha value could be a function of other variables. A control texture could define a complex shape that indicates the portion of the image to be modified (as in the sketch that follows). A brush pattern could apply the operation selectively to small regions of the image. The operation could be applied selectively to pixels within a certain luminance range (e.g., shadows, highlights, or midtones). Fragment shaders can also interpolate between more than two images, and the interpolation need not be linear. Interpolation can even be done along several axes simultaneously with a single target image: a blurry, desaturated version of the source image can be used with the source image to produce a sharpened, saturated version in a single operation.
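One way such selective processing might look, sketched with hypothetical names and a control texture whose red channel acts as a mask, is the following:

// Selective-processing sketch: a control texture supplies a per-pixel
// blend factor, so the adjustment is applied only where the mask is set.
uniform sampler2D SourceImage;    // hypothetical sampler names
uniform sampler2D TargetImage;    // e.g., a grayscale or blurred version
uniform sampler2D ControlImage;   // mask or brush pattern in the red channel
uniform float     Alpha;          // full strength of the effect

void main()
{
    vec4  source = texture2D(SourceImage,  gl_TexCoord[0].st);
    vec4  target = texture2D(TargetImage,  gl_TexCoord[0].st);
    float mask   = texture2D(ControlImage, gl_TexCoord[0].st).r;

    // Where the mask is 0, the source passes through unchanged; where it
    // is 1, the full interpolation/extrapolation is applied.
    float a = mix(1.0, Alpha, mask);
    gl_FragColor = mix(target, source, a);
}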