 Authors : Rupali M. Bora, Smita N. Chaudhari, Prajakta S. Vispute
 Paper ID : IJERTV3IS20746
 Volume & Issue : Volume 03, Issue 02 (February 2014)
Published (First Online): 17-02-2014
ISSN (Online) : 2278-0181
 Publisher Name : IJERT
 License: This work is licensed under a Creative Commons Attribution 4.0 International License
A Comparative Analysis of Motion Blur Magnitude Parameter Estimation Techniques
Rupali M. Bora,
KKWIEER, Nashik
Smita N. Chaudhari,
KKWIEER, Nashik
Prajakta S. Vispute
KKWIEER, Nashik
Abstract: Motion blur in a digital image is caused by motion of an object while capturing it. This motion blur may cause significant problems for image processing algorithms, so it may be required to remove it from the image, or to identify inconsistencies of it in an image for tampering detection.
The magnitude and direction are the two parameters of motion blur used to distinguish it from the rest of the image. Here, three algorithms are discussed for estimation of the magnitude parameter, viz. a. Radon transform based, b. Cepstral method, c. PBM (Perceptual Blur Metric) based. The results are compared for accuracy and execution time required. The PBM based method gives optimum results considering both parameters.

INTRODUCTION
Motion blur is caused by the relative motion between the camera and the pictured object during the time the shutter is open. As blurring can significantly degrade the visual quality of images, many researchers have been working either on preventing motion blurring during image capturing or on postprocessing of the image to remove motion blur.
The slow speed of the camera shutter relative to the object being imaged is one of the possible causes of motion blur. Camera shake is found to be the culprit for the presence of motion blur in many images. Reducing the exposure interval of the camera is a possible solution, but this often affects parameters like the amount of noise or the depth of field adversely. Some hardware like tripods and flashes also offer solutions to the problem of motion blur by allowing for more stable exposures or greater illumination in a short interval of time, respectively, but these are often impractical. Hence, many images containing motion blur do exist and so, it is useful to utilize the inconsistencies in motion blur in order to detect image tampering.

PROBLEM FORMULATION
Motion blur can be modelled by averaging the instantaneous intensity falling on a pixel over the shutter interval. Such an averaging process can be weighted by a soft Gaussian window instead of using the idealized shutter interval, in order to allow for non-ideal mechanical shutter effects. Alternatively, blurs arising from motion, like other types of blur, can also be considered as convolving an in-focus image with a blur kernel in the spatial domain. The motion blur kernel is determined by the relative velocities of the camera and the objects in the image.

Blur Model:
For uniform motion blur, the process of blurring is usually modelled as the following convolution:

I = (H * P) + N    (1)

where I is the blurred image, H is the sharp image, P is the blurring kernel, and N is the noise present.
For a horizontal uniform velocity motion blur, the blurring kernel P can be modelled as P = (1/L)[1 1 ... 1] of size 1×L, where L is the length of the kernel. Note that a directional blurring kernel P_θ can be formulated by rotating P by θ degrees about the x-axis. This estimated motion blur is represented as a two-element vector ψ = [mag, dir], where mag = L and dir = θ. Here L and θ are the motion blur parameters [1].
To identify the amount of blurring from the blurred version I, parametric knowledge of the blurring kernel P is required. Methods for calculating the magnitude parameter of motion blur are discussed next [1]-[4].

Radon Transform method based on image gradient
A periodic pattern that is easier to detect also exists in the gradient of the blurred image in the spectral domain. Differentiating (1),

∇I = (∇H * P) + ∇N    (2)

Taking the Fourier transform and omitting the noise term,

∇I(ω) = ∇H(ω) P(ω)    (3)

The Radon transform, which is widely used for detecting straight lines in noisy images, is used [5][6]. For a motion blurred image, there are periodic large negative lines in log|∇I(ω)| with slope and periodicity proportional to the L value. Denoting the Radon transform by R, R(log|∇I(ω)|) will have periodic peaks located at (±1/L, 90°), (±2/L, 90°), (±3/L, 90°), ... Therefore, this should correspond to a peak in the Fourier transform of R(log|∇I(ω)|). Calculating the Fourier transform of this Radon transform, the peak occurs at (ρ, φ). Then

ρ = 1/L and φ = 90°    (4)
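As an illustration, the noise-free horizontal case of the blur model in Eq. (1) can be sketched in Python with NumPy. The function names here are illustrative, not from the paper:

```python
import numpy as np

def motion_blur_kernel(L):
    """Uniform horizontal motion blur kernel P = (1/L)[1 1 ... 1], size 1 x L.
    A directional kernel P_theta would be obtained by rotating this row vector."""
    return np.ones((1, L)) / L

def blur_horizontal(img, L):
    """Circular convolution of each image row with the length-L box kernel,
    i.e. I = H * P with the noise term N of Eq. (1) omitted."""
    kernel = np.zeros(img.shape[1])
    kernel[:L] = 1.0 / L
    return np.real(np.fft.ifft(np.fft.fft(img, axis=1) * np.fft.fft(kernel), axis=1))

# A constant image is unchanged by averaging; a single bright pixel is
# smeared into a horizontal streak of length L.
flat = blur_horizontal(np.ones((4, 8)), 3)
streak = blur_horizontal(np.eye(1, 8), 3)   # one pixel at (0, 0)
```

The FFT-based row convolution is circular, which is convenient for the frequency-domain arguments that follow; a real implementation would typically pad or crop the borders instead.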

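The periodic spectral pattern that the Radon-based method exploits, zeros of the blurred spectrum at a frequency spacing proportional to 1/L, can be previewed in one dimension. This is a simplified, noise-free sketch (circular blur, N divisible by L), not the full 2-D Radon procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 240, 12                       # signal length and true blur magnitude
sharp = rng.random(N)                # stand-in for one image row H
kernel = np.zeros(N)
kernel[:L] = 1.0 / L

# Circular blur: I = H * P (Eq. 1 without the noise term)
blurred = np.real(np.fft.ifft(np.fft.fft(sharp) * np.fft.fft(kernel)))

# |FFT| of the blurred signal vanishes wherever the box kernel's spectrum
# has a zero, i.e. at frequency bins that are multiples of N / L.
spectrum = np.abs(np.fft.fft(blurred))
zeros = np.flatnonzero(spectrum[1:N // 2] < 1e-8 * spectrum.max()) + 1
L_est = round(N / zeros[0])          # first zero sits at bin N / L
```

With noise present these zeros become shallow dips, which is why the paper's method accumulates evidence over all orientations with the Radon transform instead of reading off a single bin.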
Cepstral Method
As the degraded image is the result of convolution with the blur model, it is impossible to separate the blur in the spatial domain. However, the blur information can be easily extracted in the cepstrum domain. The cepstrum of an image I is defined as [3],

C(I) = F⁻¹{log(|F(I)|)}    (5)

where F(I) is the Fourier transform of the motion blurred image I, and F⁻¹ is the inverse Fourier transform. As Eq. (5) shows, the image in the cepstrum domain is the inverse Fourier transform of the logarithm power spectrum of the original image.

PBM variation method
The L value is calculated using the Perceptual Blur Metric [2]. Let S be the set of edge pixels in the binary edge map of the image obtained by applying the Sobel operator in the vertical, horizontal and diagonal directions. A modified metric, named the oriented blur metric PBM_θ, is defined as [1]

PBM_θ(I) = ( Σ_{p∈S} E_θ(p) ) / |S|    (6)

where E_θ(p) is the width of the edge along the direction perpendicular to θ at the edge pixel p, and |·| denotes cardinality. The oriented PBMs are computed for orientations θ_i, i = 1 to t, where t is the number of orientations evaluated, and then the overall PBM is defined as,

PBM(I) = max_i (PBM_θi)    (7)

Results and Comparison

Test data for parameters estimation
Images with motion blur magnitudes in the range 1 to 50 and a fixed theta are generated and checked against all three methods for magnitude estimation. Fig. 1 shows the different images used for testing.

Fig. 1 Test images: a. Cameraman b. Lena c. Car d. Jerusalem

Cepstral Method
The graph of fig. 2 shows that the estimated blur magnitude values are linear and accurate only in the middle range, from about 5 to 25.

Fig. 2 Actual vs. Cepstral method magnitude

Radon Transform Method
The graph of fig. 3 shows that the estimated blur magnitude values are linear and accurate for a range of nearly 10 to 50.

Fig. 3 Actual vs. Radon Transform method magnitude

PBM variation Method
The graph of fig. 4 shows that the estimated blur magnitude values are linear and nearly accurate over the wide range of 1 to 50.

Fig. 4 Actual vs. PBM variation method magnitude
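The cepstral estimate of Eq. (5) can be sketched in one dimension: the log spectrum of a length-L box blur is quasi-periodic with period N/L, so its inverse transform (the cepstrum) shows a pronounced negative spike near lag L. This is a minimal noise-free sketch with illustrative names, not the paper's full 2-D procedure; averaging the row cepstra stabilizes the peak:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 256, 14
rows = rng.random((64, N))           # stand-in for sharp image rows
kernel = np.zeros(N)
kernel[:L] = 1.0 / L

# Circularly blur every row with the length-L box kernel.
blurred = np.real(np.fft.ifft(np.fft.fft(rows, axis=1) * np.fft.fft(kernel), axis=1))

# Cepstrum (Eq. 5): inverse FFT of the log magnitude spectrum; a small
# epsilon guards against log(0) at the kernel's spectral zeros.
spectrum = np.abs(np.fft.fft(blurred, axis=1))
cepstrum = np.real(np.fft.ifft(np.log(spectrum + 1e-12), axis=1)).mean(axis=0)

# The strongest negative spike away from lag 0 sits at lag L.
search = np.arange(2, N // 2)
L_est = search[np.argmin(cepstrum[search])]
```

Because convolution becomes addition of cepstra, the kernel's spike survives whatever the underlying image contributes, which is the reason the method is fast but fragile at small and large L.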

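The edge-width idea behind the PBM of Eq. (6) can be illustrated in one dimension: a step edge blurred by a length-L box becomes a ramp roughly L samples wide, so the measured edge width tracks the blur magnitude. A minimal sketch with illustrative names, using only a single horizontal orientation rather than the paper's Sobel-based multi-orientation version:

```python
import numpy as np

def edge_width_1d(signal):
    """Width of the strongest edge: number of samples needed to traverse
    from (near) the low plateau to (near) the high plateau."""
    grad = np.diff(signal)
    edge = int(np.argmax(np.abs(grad)))          # strongest edge position
    lo, hi = signal.min(), signal.max()
    left = edge
    while left > 0 and signal[left] > lo + 1e-9:
        left -= 1                                # walk down to the low plateau
    right = edge
    while right < len(signal) - 1 and signal[right] < hi - 1e-9:
        right += 1                               # walk up to the high plateau
    return right - left

L = 15
step = np.zeros(200)
step[100:] = 1.0                                 # ideal sharp edge
ramp = np.convolve(step, np.ones(L) / L, mode="same")
width = edge_width_1d(ramp)
```

Averaging such widths over all edge pixels in the set S, as Eq. (6) does, makes the estimate robust to individual noisy edges.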
Performance bounds
The PBM based method estimates the motion blur magnitude parameter quickly, and its accuracy is good for a wide range of input values (1 to 50).
As shown in fig. 5, the computation time is measured for a set of images with magnitude values in the range 1 to 50 and direction angle 45°. It shows that the Radon transform method requires the maximum time for magnitude estimation. The cepstral method requires the minimum time, but its accuracy is lower, as shown in fig. 2. Hence the PBM variation method is the best to use for further processing.
Fig. 5 Execution time (milliseconds) of Cepstral, Radon transform and PBM variation methods for magnitude estimation
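Execution-time comparisons like the one in Fig. 5 can be reproduced with a small harness such as the following; the three estimator functions are placeholders to be replaced by real implementations of the methods above:

```python
import time

def time_estimator(estimator, images, repeats=5):
    """Return the average wall-clock time (ms) the estimator takes per image."""
    start = time.perf_counter()
    for _ in range(repeats):
        for img in images:
            estimator(img)
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / (repeats * len(images))

# Placeholder estimators; substitute the Radon, cepstral and PBM routines.
estimators = {
    "Radon": lambda img: sum(img),
    "Cepstral": lambda img: max(img),
    "PBM": lambda img: min(img),
}
images = [list(range(64)) for _ in range(10)]
timings = {name: time_estimator(fn, images) for name, fn in estimators.items()}
```

Repeating the loop and averaging, as done here, smooths out scheduler jitter that would otherwise dominate millisecond-scale measurements.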



CONCLUSIONS
Motion blur in an image can be characterised by two parameters, viz. magnitude and direction. For magnitude parameter estimation, a comparative analysis of three techniques, viz. PBM based, Cepstral based, and Radon Transform based, is done.
The techniques are compared on 4 test images with magnitude values varying from 1 to 50. The experimental results show that the PBM based method gives fast results with good accuracy over a wide range of magnitude parameter values.
REFERENCES
[1] P. Kakar, N. Sudha, and W. Ser, "Exposing digital image forgeries by detecting discrepancies in motion blur," IEEE Transactions on Multimedia, vol. 13, no. 3, June 2011.
[2] P. Marziliano, F. Dufaux, S. Winkler, and T. Ebrahimi, "A no-reference perceptual blur metric," in Proc. IEEE Int. Conf. Image Processing, 2002, vol. 3, pp. 57-60.
[3] S. Wu, Z. Lu, E. P. Ong, and W. Lin, "Blind image blur identification in cepstrum domain," IEEE, 2007, pp. 1166-1171.
[4] R. M. Bora and N. M. Shahane, "Image forgery detection through motion blur estimates," in Proc. IEEE Int. Conf. Computational Intelligence & Computing Research (ICCIC), 2012, pp. 1-4.
[5] H. Ji and C. Liu, "Motion blur identification from image gradients," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2008, pp. 1-8.
[6] H. Sun, M. Desvignes, Y. Yan, and W. Liu, "Motion blur parameters identification from Radon transform image gradients," IEEE, 2009, pp. 2098-2103.