Image stitching with tone mapping

This is a course project for EECS 332: Introduction to Computer Vision at Northwestern University (Winter 2016).

The conventional image stitching pipeline consists of the following four steps:

  1. Scale-Invariant Feature Transform (SIFT)
  2. Feature matching
  3. Random sample consensus (RANSAC)
  4. Center-weighted blending
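Steps 1 and 2 are usually handled by a library (e.g. OpenCV's SIFT detector plus a brute-force matcher with a ratio test). As a sketch of step 3, the NumPy code below estimates a homography from putative matches with RANSAC; the DLT solver, iteration count, and 3-pixel inlier threshold are illustrative assumptions, not the exact parameters used in this project.

```python
import numpy as np

def fit_homography(src, dst):
    # Direct Linear Transform: solve for a 3x3 H from >= 4 point pairs
    # such that [u, v, 1]^T ~ H [x, y, 1]^T.
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)          # null-space vector, up to scale
    return H / H[2, 2]

def ransac_homography(src, dst, n_iter=500, thresh=3.0, rng=None):
    rng = np.random.default_rng(rng)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best_H, best_inliers = None, np.zeros(len(src), bool)
    for _ in range(n_iter):
        # fit a candidate model on a minimal random sample of 4 matches
        idx = rng.choice(len(src), 4, replace=False)
        H = fit_homography(src[idx], dst[idx])
        # project all src points through H, measure reprojection error
        p = np.c_[src, np.ones(len(src))] @ H.T
        proj = p[:, :2] / p[:, 2:3]
        err = np.linalg.norm(proj - dst, axis=1)
        inliers = err < thresh
        if inliers.sum() > best_inliers.sum():
            best_H, best_inliers = H, inliers
    # refit on the full consensus set for the final estimate
    if best_inliers.sum() >= 4:
        best_H = fit_homography(src[best_inliers], dst[best_inliers])
    return best_H, best_inliers
```

The returned homography is then used to warp one image into the other's frame before blending.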



However, this blending is only local. The seam becomes quite obvious when the two images have different white balance. The images above are two pictures I took within a minute; the white balance changed simply as I moved my phone. This color inconsistency happens often in outdoor scenes, where the camera may introduce more noise and shift its white balance. We therefore came up with the idea of using the overlapped region to blend colors globally.

  • Method: K-means clustering
  • Reason: pixel locations cannot be matched precisely, so pixel-wise adjustment is not a good choice. Instead, we find mutual color regions and match their pixel values.
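The idea can be sketched as follows: run K-means on the colors of each image's overlap region, put the resulting cluster means in correspondence, and derive a per-cluster color gain to bring one image onto the other. The NumPy sketch below is illustrative only: matching clusters by sorted brightness and correcting with a channel-wise gain are assumptions, since the exact adjustment rule is not spelled out here.

```python
import numpy as np

def kmeans(pixels, k=2, n_iter=20, rng=0):
    # Lloyd's algorithm on an (N, 3) array of RGB values.
    rng = np.random.default_rng(rng)
    uniq = np.unique(pixels, axis=0).astype(float)
    centers = uniq[rng.choice(len(uniq), k, replace=False)]
    for _ in range(n_iter):
        # assign each pixel to its nearest center, then update centers
        d = np.linalg.norm(pixels[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = pixels[labels == j].mean(axis=0)
    return centers, labels

def color_gain(overlap_a, overlap_b, k=2):
    # Cluster each overlap region's colors, sort the cluster means by
    # brightness so they correspond (an assumption), and return the
    # per-channel gains mapping image B's clusters onto image A's.
    ca, _ = kmeans(overlap_a.reshape(-1, 3), k)
    cb, _ = kmeans(overlap_b.reshape(-1, 3), k)
    ca = ca[np.argsort(ca.sum(axis=1))]
    cb = cb[np.argsort(cb.sum(axis=1))]
    return ca / np.maximum(cb, 1e-6)   # shape (k, 3)
```

Each pixel of image B can then be scaled by the gain of the cluster it falls into, which adjusts the color globally rather than pixel by pixel.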


Overlap region


K-means clustering (k = 2)

K-means clustering is a widely used method for image segmentation. It helps find the principal color clusters.


Stitching results: top – local stitching; middle – fusing two images; bottom – from right to left

The previous case solves the problem of a dim environment. What about sunny days?


When the sun is in the scene, the camera adjusts its exposure time. This leads to an inconsistent mapping from irradiance to pixel values between the two images.



Image stitching results.

We borrow the idea of high dynamic range (HDR) tone mapping.
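Tone mapping compresses the wide radiance range of the fused result back into displayable [0, 1] values. A specific operator is not named here, so as an illustrative stand-in the sketch below applies Reinhard's global operator in NumPy (the `key` value 0.18 is the conventional middle-grey choice, an assumption):

```python
import numpy as np

def reinhard_tonemap(radiance, key=0.18, eps=1e-6):
    # Reinhard global operator: scale luminance by its log-average,
    # then compress with L / (1 + L).
    lum = (0.2126 * radiance[..., 0] + 0.7152 * radiance[..., 1]
           + 0.0722 * radiance[..., 2])
    log_avg = np.exp(np.mean(np.log(lum + eps)))
    scaled = key * lum / log_avg
    mapped = scaled / (1.0 + scaled)           # in [0, 1)
    # apply the luminance ratio to every color channel
    ratio = (mapped / (lum + eps))[..., None]
    return np.clip(radiance * ratio, 0.0, 1.0)
```

Very bright regions (e.g. around the sun) are compressed hard while darker regions keep their relative contrast, which hides the exposure mismatch between the stitched images.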




Final tone-mapped result.