Welcome to My Project 2

Part 1.1: Finite Difference Operator

I convolve the cameraman image with the finite difference operators D_x = [1, -1] and D_y = [1, -1].T using scipy.signal.convolve2d with mode='same', boundary='symm' to get the partial derivatives. With the partial derivatives computed, I calculate the gradient magnitude as np.sqrt(partial_x**2 + partial_y**2). Finally, I binarize the gradient magnitude image with a threshold of 0.27.
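
A minimal sketch of this step, assuming the cameraman image is loaded as a grayscale float array in [0, 1] (the filename is a placeholder):

```python
import numpy as np
import skimage.io as skio
from scipy.signal import convolve2d

# Load the image as a grayscale float array in [0, 1]
im = skio.imread('cameraman.png', as_gray=True).astype(np.float64)

# Finite difference operators
D_x = np.array([[1, -1]])
D_y = np.array([[1], [-1]])

# Partial derivatives via 2D convolution with symmetric boundary handling
partial_x = convolve2d(im, D_x, mode='same', boundary='symm')
partial_y = convolve2d(im, D_y, mode='same', boundary='symm')

# Gradient magnitude and binarized edge image
grad_mag = np.sqrt(partial_x**2 + partial_y**2)
edges = grad_mag > 0.27
```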

[Figures: cameraman, cameraman_dx, cameraman_dy, cameraman_gradient_magnitude, cameraman_binarized]

Part 1.2: Derivative of Gaussian (DoG) Filter

Now I blur the image first with a Gaussian filter G (by convolving the image with the Gaussian) and then repeat Part 1.1: compute the partial derivatives and gradient magnitude, and binarize the gradient magnitude. There is less noise in the result.
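
A sketch of the blur-then-differentiate step; the kernel size, sigma, and threshold are illustrative assumptions rather than the exact values used for the results shown, and im, D_x, and D_y are as in the sketch above:

```python
import numpy as np
from scipy.signal import convolve2d

def gaussian_2d(ksize, sigma):
    # 2D Gaussian kernel as the outer product of a 1D Gaussian with itself
    ax = np.arange(ksize) - (ksize - 1) / 2.0
    g = np.exp(-ax**2 / (2 * sigma**2))
    g /= g.sum()
    return np.outer(g, g)

G = gaussian_2d(ksize=11, sigma=2.0)   # assumed parameters
blurred = convolve2d(im, G, mode='same', boundary='symm')

blur_dx = convolve2d(blurred, D_x, mode='same', boundary='symm')
blur_dy = convolve2d(blurred, D_y, mode='same', boundary='symm')
blur_grad_mag = np.sqrt(blur_dx**2 + blur_dy**2)
blur_edges = blur_grad_mag > 0.1       # lower threshold for the smoother image (assumption)
```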

[Figures: blur_cameraman_dx, blur_cameraman_dy, blur_cameraman_gradient_magnitude, blur_cameraman_binarized]

Above, I blurred the image with a Gaussian first and then computed the partial derivatives and gradient magnitude. Now I instead convolve the finite difference operators D_x and D_y with the Gaussian filter first, convolve the image with the resulting derivative of Gaussian (DoG) filters, and then compute the gradient magnitude. The results are identical.
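
Because convolution is associative, convolving the image with a derivative of Gaussian filter is equivalent to blurring and then differentiating. A sketch, reusing G, D_x, and D_y from the snippets above:

```python
# Build the DoG filters once, then convolve the image with them
DoG_x = convolve2d(G, D_x)   # default mode='full' keeps the entire filter
DoG_y = convolve2d(G, D_y)

dog_dx = convolve2d(im, DoG_x, mode='same', boundary='symm')
dog_dy = convolve2d(im, DoG_y, mode='same', boundary='symm')
dog_grad_mag = np.sqrt(dog_dx**2 + dog_dy**2)

# Up to boundary handling, these match the blur-then-differentiate results
```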

[Figures: DoG_filter_x, DoG_filter_y, DoG_cameraman_dx, DoG_cameraman_dy, DoG_cameraman_gradient_magnitude, DoG_cameraman_binarized]

Part 2.1: Image "Sharpening"

To sharpen an image, I first convolve the image with a Gaussian to blur it, which gives a low frequency image. Then, I subtract the low frequency image from the original image to get a high frequency image. Finally, I add the high frequency image, scaled by a factor alpha, back to the original image to get the sharpened image.
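
A sketch of this unsharp masking step, reusing gaussian_2d from Part 1.2; the kernel parameters are assumptions, and for color images the same operation would be applied to each channel:

```python
import numpy as np
from scipy.signal import convolve2d

def sharpen(im, G, alpha):
    low = convolve2d(im, G, mode='same', boundary='symm')  # low frequency image
    high = im - low                                        # high frequency image
    return np.clip(im + alpha * high, 0, 1)                # boost the details

sharpened = sharpen(im, gaussian_2d(11, 2.0), alpha=3)
```

Equivalently, the whole operation can be folded into a single unsharp mask filter (1 + alpha) * e - alpha * G, where e is the unit impulse.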

The first example is the Taj Mahal. The original image is on the left and the sharpened image is on the right. The alpha value is 3.

[Figures: Original Taj Mahal, Sharpened Taj Mahal]

The second example is a porcelain piece. The original image is on the left and the sharpened image is on the right. The alpha value is 3.

[Figures: Original porcelain, Sharpened porcelain]

The third example shows the original image of a swallow, a blurred version of it, and the image sharpened from the blurred version. The alpha value is 3. The sharpened image looks similar to the original but is not as smooth; the edges look more artificial.

[Figures: Original swallow, blurred swallow, Sharpened swallow]

Part 2.2: Hybrid Images

Here I make a hybrid image from two images by combining the low frequency part of one image (obtained by passing it through a Gaussian filter) with the high frequency part of another.
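
A sketch of the hybrid construction, assuming the two images are already aligned, grayscale, and the same size; the variable names and cutoff sigmas are placeholders:

```python
import numpy as np
from scipy.signal import convolve2d

def hybrid(im_low, im_high, G_low, G_high):
    low = convolve2d(im_low, G_low, mode='same', boundary='symm')                 # keep low frequencies
    high = im_high - convolve2d(im_high, G_high, mode='same', boundary='symm')    # keep high frequencies
    return np.clip(low + high, 0, 1)

# cat keeps its low frequencies, man keeps its high frequencies (placeholder arrays)
hybrid_im = hybrid(cat, man, gaussian_2d(41, 8.0), gaussian_2d(21, 4.0))
```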

[Figures: man, cat, hybrid]

A quick frequency analysis of the hybrid image, with the cat in low frequency and the man in high frequency.
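
The spectrum shown below is the log magnitude of the 2D Fourier transform; a sketch of how such a plot can be computed, applied to the hybrid image from the sketch above (grayscale input assumed):

```python
import numpy as np

def log_spectrum(im):
    # Shift the zero frequency to the center and take the log magnitude
    return np.log(np.abs(np.fft.fftshift(np.fft.fft2(im))) + 1e-8)

spectrum = log_spectrum(hybrid_im)
```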

[Figure: hybrid_spectrum]

The first extra example is a hybrid of a peach and a cat, which is kind of a failure.

[Figures: peach in low frequency, cat in high frequency, hybrid2]

The second extra example is a hybrid of hotpot and noodles, which I think is kind of a success.

[Figures: hotpot in low frequency, noodle in high frequency, hybrid3]

Part 2.3: Gaussian and Laplacian Stacks

In this section, I build a Gaussian stack and a Laplacian stack for an image. The Gaussian stack is built by repeatedly applying a Gaussian filter to the image (without downsampling). The Laplacian stack is built by subtracting each level of the Gaussian stack from the previous level, with the coarsest Gaussian level kept as the last entry. Summing all levels of the Laplacian stack reconstructs the original image.
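
A sketch of the two stacks, assuming a fixed Gaussian kernel G at every level; the depth, kernel parameters, and input variable name are illustrative:

```python
import numpy as np
from scipy.signal import convolve2d

def gaussian_stack(im, G, depth):
    # Each level is the previous level blurred again; no downsampling
    stack = [im]
    for _ in range(depth):
        stack.append(convolve2d(stack[-1], G, mode='same', boundary='symm'))
    return stack

def laplacian_stack(g_stack):
    # Differences of consecutive Gaussian levels, plus the coarsest level last
    return [g_stack[i] - g_stack[i + 1] for i in range(len(g_stack) - 1)] + [g_stack[-1]]

g_stack = gaussian_stack(apple, gaussian_2d(25, 5.0), depth=5)   # 'apple' is a placeholder input
l_stack = laplacian_stack(g_stack)
```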

[Figure: Gaussian and Laplacian stacks of the apple and the orange]

Here we apply a mask stack (a Gaussian stack of the mask) to the Laplacian stacks.

[Figures: Mask, Masked Laplacian stacks of the apple and the orange]

Part 2.4: Multiresolution Blending (a.k.a. the oraple!)

With the Laplacian stacks of both images and the Gaussian stack of the mask built, we can blend the two images by blending each level of the stacks and then reconstructing the final image by summing the blended levels, as in the sketch below. Then we have an oraple!
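
A sketch of the blend, reusing the stack helpers from Part 2.3; at each level the mask's Gaussian stack weights the two Laplacian stacks, and the blended levels are summed to form the output. The input names and parameters are placeholders:

```python
import numpy as np

def blend(im1, im2, mask, G, depth):
    L1 = laplacian_stack(gaussian_stack(im1, G, depth))
    L2 = laplacian_stack(gaussian_stack(im2, G, depth))
    GM = gaussian_stack(mask, G, depth)   # mask stack has the same number of levels
    levels = [gm * l1 + (1 - gm) * l2 for gm, l1, l2 in zip(GM, L1, L2)]
    return np.clip(sum(levels), 0, 1)

oraple = blend(apple, orange, mask, gaussian_2d(25, 5.0), depth=5)
```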

[Figure: Oraple]

Here is a blended image of LaCroix grapefruit and LaCroix lime.

[Figures: lacroix_grapefruit, lacroix_lime]

To blend these two images, we build Laplacian stacks of both images and a Gaussian stack of the mask. Then we blend each level of the stacks and reconstruct the final image by summing them up. Here is the result.

[Figures: lacroix_stacks, lacroix_stacks_masked, lacroix_grapefruit_lime, mask]

Here is a blended image of the sun and the moon.

[Figures: sun, moon, sun_moon, mask]

Here is a blended image of a toy and a potato mine from Plants vs. Zombies. The mask is irregular.

[Figures: toy, potato_mine, toy_potato_mine, mask]