Image Quality Assessment From Free Image Interpolation

DOI : 10.17577/IJERTV1IS8025




B. Vinay Kumar¹, G. Narsimhulu²

Sree Chaitanya College of Engineering, Karimnagar, Andhra Pradesh, India

Abstract

Digital images are subject to a wide variety of distortions during acquisition, processing, compression, storage, transmission, and reproduction, any of which may result in a degradation of visual quality. For applications in which images are ultimately to be viewed by human beings, the only correct method of quantifying visual image quality is through subjective evaluation. In practice, however, subjective evaluation is usually too inconvenient, time-consuming, and expensive. The goal of research in objective image quality assessment is to develop quantitative measures that can automatically predict perceived image quality. An objective image quality metric can play a variety of roles in image processing applications.

Index Terms: Error sensitivity, HVS, image coding, image quality assessment, JPEG, perceptual quality, SSIM.

  1. Introduction

    Objective methods for assessing perceptual image quality have traditionally attempted to quantify the visibility of errors (differences) between a distorted image and a reference image, using a variety of known properties of the human visual system. Under the assumption that human visual perception is highly adapted for extracting structural information from a scene, we introduce an alternative, complementary framework for quality assessment based on the degradation of structural information. As a specific example of this concept, we develop a Structural Similarity (SSIM) index and demonstrate its promise through a set of intuitive examples, as well as through comparison with both subjective ratings and state-of-the-art objective methods on a database of images compressed with JPEG and JPEG2000.

    Objective image quality metrics can be classified according to the availability of an original (distortion-free) image with which the distorted image is to be compared. Most existing approaches are known as full-reference, meaning that a complete reference image is assumed to be known. In many practical applications, however, the reference image is not available, and a no-reference or "blind" quality assessment approach is desirable. In a third type of method, the reference image is only partially available, in the form of a set of extracted features made available as side information to help evaluate the quality of the distorted image. This is referred to as reduced-reference quality assessment. This paper focuses on full-reference image quality assessment.

    The simplest and most widely used full-reference quality metric is the mean squared error (MSE), computed by averaging the squared intensity differences of distorted and reference image pixels, along with the related quantity of peak signal-to-noise ratio (PSNR). These are appealing because they are simple to calculate, have clear physical meanings, and are mathematically convenient in the context of optimization, but they are not well matched to perceived visual quality. In the last three decades, a great deal of effort has gone into the development of quality assessment methods that take advantage of known characteristics of the human visual system (HVS); these methods, and their limitations, are discussed below. We then describe a new paradigm for quality assessment, based on the hypothesis that the HVS is highly adapted for extracting structural information. As a specific example, we develop a measure of structural similarity (SSIM) that compares local patterns of pixel intensities that have been normalized for luminance and contrast, and we compare the test results of different quality assessment models against a large set of subjective ratings gathered for a database of 344 images compressed with JPEG and JPEG2000.
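    As a concrete reference for the two baseline metrics named above, the following is a minimal sketch of MSE and PSNR for 8-bit grayscale images. The function names and the NumPy-based setup are illustrative choices, not part of the original paper.

import numpy as np

def mse(reference: np.ndarray, distorted: np.ndarray) -> float:
    """Mean squared error between a reference and a distorted image."""
    diff = reference.astype(np.float64) - distorted.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; `peak` is the maximum pixel value."""
    err = mse(reference, distorted)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / err)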

    This paper is organized as follows. Section 2 reviews image quality assessment, presents the proposed interpolation method, and discusses the limitations of the error-sensitivity approach. Section 3 presents experimental results. Section 4 concludes the paper.
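    For reference, the SSIM index mentioned above compares local means, variances, and cross-covariance of the two images. The following is a minimal single-window sketch; the full method applies this computation within local sliding windows, and the constants below are the commonly used defaults (C1 = (0.01 L)^2, C2 = (0.03 L)^2 for dynamic range L), not values taken from this paper.

import numpy as np

def ssim_global(x: np.ndarray, y: np.ndarray, peak: float = 255.0) -> float:
    """Single-window SSIM between two equally sized grayscale images."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1 = (0.01 * peak) ** 2  # stabilizes the luminance term
    c2 = (0.03 * peak) ** 2  # stabilizes the contrast/structure term
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )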

  2. Image Quality Assessment

    Fig. 1. A prototypical quality assessment system based on error sensitivity

    A. Image Upscaling

    The problem of creating artifact-free upscaled images that appear sharp and natural to a human observer is probably more interesting and less trivial than it may appear. The solution to this problem, often also referred to as single-image super-resolution, is related both to the statistical relationship between low-resolution and high-resolution image sampling and to the human perception of image quality. In many practical applications, simple linear or cubic interpolation algorithms are applied to this task, but the results are not really satisfactory, being affected by noticeable artifacts such as blurring and jagged edges. Several methods have been proposed to obtain better results, involving simple heuristics, edge modeling, or statistical learning. The most powerful of these, however, have high computational complexity and are not suitable for real-time applications, while fast methods, even if edge-adaptive, cannot provide artifact-free images. In this paper we describe a new upscaling method, ICBI (Iterative Curvature-Based Interpolation), based on a two-step grid filling and an iterative correction of the interpolated pixels, obtained by minimizing an objective function that depends on the second-order directional derivatives of the image intensity. We show that the constraints used to derive this function are related to those applied in another well-known interpolation method that provides good results but is computationally heavy. The high quality of the images enlarged with the new method is demonstrated with objective and subjective tests, while the computation time is reduced by one to two orders of magnitude with respect to NEDI; using a GPU implementation based on NVIDIA CUDA technology, we were thus able to obtain real-time performance.
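    To make the two-step scheme concrete, the following is a rough sketch of the first (grid-filling) step of an ICBI-style 2x enlargement. It fills only the diagonal centers, selects the interpolation direction from a simple absolute difference along each diagonal rather than the full second-order derivative estimate used by ICBI, and omits the iterative curvature-based refinement entirely; all names are illustrative.

import numpy as np

def icbi_fill_step1(img: np.ndarray) -> np.ndarray:
    """First grid-filling pass of a 2x directional enlargement (sketch)."""
    h, w = img.shape
    out = np.zeros((2 * h - 1, 2 * w - 1), dtype=np.float64)
    out[::2, ::2] = img  # original samples land on the even grid

    # Fill the diagonal centers (odd, odd) from their 4 known neighbors.
    for i in range(1, 2 * h - 2, 2):
        for j in range(1, 2 * w - 2, 2):
            nw, ne = out[i - 1, j - 1], out[i - 1, j + 1]
            sw, se = out[i + 1, j - 1], out[i + 1, j + 1]
            d1 = abs(nw - se)  # variation along the NW-SE diagonal
            d2 = abs(ne - sw)  # variation along the NE-SW diagonal
            # Interpolate along the smoother (lower-variation) diagonal.
            out[i, j] = (nw + se) / 2.0 if d1 < d2 else (ne + sw) / 2.0
    return out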

    B. Limitations

    The underlying principle of the error-sensitivity approach is that perceptual quality is best estimated by quantifying the visibility of errors. This is essentially accomplished by simulating the functional properties of early stages of the HVS, as characterized by both psychophysical and physiological experiments. Although this bottom-up approach to the problem has found nearly universal acceptance, it is important to recognize its limitations. In particular, the HVS is a complex and highly nonlinear system, but most models of early vision are based on linear or quasilinear operators that have been characterized using restricted and simplistic stimuli. Thus, error-sensitivity approaches must rely on a number of strong assumptions and generalizations. These have been noted by many previous authors, and we provide only a brief summary here.

    C. The Quality Definition Problem

    The most fundamental problem with the traditional approach is the definition of image quality. In particular, it is not clear that error visibility should be equated with loss of quality, as some distortions may be clearly visible but not particularly objectionable. An obvious example is multiplication of the image intensities by a global scale factor; a small numerical illustration appears at the end of this subsection. Previous studies have also suggested that the correlation between image fidelity and image quality is only moderate.

    The Natural Image Complexity Problem. Most psychophysical experiments are conducted using relatively simple patterns, such as spots, bars, or sinusoidal gratings. For example, the contrast sensitivity function (CSF) is typically obtained from threshold experiments using global sinusoidal images, and masking phenomena are usually characterized using a superposition of two (or perhaps a few) different patterns. All such patterns are much simpler than real-world images, which can be thought of as a superposition of a much larger number of simple patterns.
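    The illustration promised above: a clearly visible but structure-preserving distortion (a global intensity scaling) still produces a large MSE and a low PSNR. A synthetic image stands in for a real one here, and the scale factor is an arbitrary illustrative choice.

import numpy as np

rng = np.random.default_rng(0)
reference = rng.integers(0, 200, size=(64, 64)).astype(np.float64)
scaled = np.clip(reference * 1.2, 0, 255)  # visible, structure-preserving change

err = np.mean((reference - scaled) ** 2)
psnr_db = 10 * np.log10(255.0 ** 2 / err)
print(f"MSE: {err:.1f}, PSNR: {psnr_db:.1f} dB")  # low PSNR despite intact structure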

    D. The Cognitive Interaction Problem.

    It is widely known that cognitive understanding and interactive visual processing (e.g., eye movements) influence the perceived quality of images. For example, a human observer will give different quality scores to the same image if he or she is provided with different instructions [4], [30]. Prior information regarding the image content, or attention and fixation, may also affect the evaluation of image quality [4]. But most image quality metrics do not consider these effects, as they are difficult to quantify and not well understood.

  3. Experimental Results

    Figs. 2 and 3: Image upscaling by exploiting local scale-similarity in natural images. The patches (yellow), when downscaled, are very similar to their cropped versions (red). This relation holds for various types of singularities.
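    The self-similarity relation in the figure can be checked numerically: a patch cropped around a pixel in the original image should closely match the patch around the corresponding location after the image is slightly downscaled. The following sketch uses a crude nearest-neighbor downscale; the scale factor, patch radius, and function names are illustrative choices, and patch bounds are assumed valid.

import numpy as np

def nn_downscale(img: np.ndarray, factor: float) -> np.ndarray:
    """Crude downscale by sampling a coarser grid (nearest neighbor)."""
    h, w = img.shape
    ys = (np.arange(int(h * factor)) / factor).astype(int)
    xs = (np.arange(int(w * factor)) / factor).astype(int)
    return img[np.ix_(ys, xs)]

def local_similarity(img: np.ndarray, y: int, x: int,
                     radius: int = 4, factor: float = 0.8) -> float:
    """MSE between the cropped patch at (y, x) and the patch at the
    corresponding location in the downscaled image; low = self-similar."""
    small = nn_downscale(img.astype(np.float64), factor)
    sy, sx = int(y * factor), int(x * factor)
    r = int(radius * factor)
    patch_small = small[sy - r: sy + r + 1, sx - r: sx + r + 1]
    patch_crop = img[y - r: y + r + 1, x - r: x + r + 1].astype(np.float64)
    return float(np.mean((patch_small - patch_crop) ** 2))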

  4. Conclusion

    We presented a new example-based image upscaling method that performs fewer nearest-patch computations and uses custom-designed filter banks to synthesize the image explicitly. The faster search is based on a local self-similarity observation that we point out in natural images, where edges and other singularities are locally scale-invariant. The tests we performed to measure this invariance show that the assumption holds best for small scaling factors. Comparisons show that the localized search, permitted by this local scale invariance, outperforms approximate global searches in both quality and running time. We formulated and designed novel filter banks that allow us to perform such small, non-dyadic image scalings; these filter banks extend the common dyadic transformations and are designed to model the image upscaling process. With these filters we achieve, using explicit computations, an upscaled image that is highly consistent with the input image. Altogether, we propose fully local, high-quality upscaling algorithms that operate in real time when implemented on a GPU.
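    As an illustration of the localized search summarized above (a sketch, not the paper's implementation), the following restricts the nearest-patch search to a small window around the corresponding location in the downscaled image instead of scanning the whole image; the window size, patch shape, and function name are illustrative.

import numpy as np

def best_patch_local(small: np.ndarray, query: np.ndarray,
                     cy: int, cx: int, window: int = 3) -> tuple:
    """Return the top-left offset in `small`, within +/-window of (cy, cx),
    whose patch best matches `query` under sum of squared differences."""
    ph, pw = query.shape
    best, best_err = (cy, cx), np.inf
    for dy in range(-window, window + 1):
        for dx in range(-window, window + 1):
            y, x = cy + dy, cx + dx
            if y < 0 or x < 0 or y + ph > small.shape[0] or x + pw > small.shape[1]:
                continue  # candidate patch falls outside the image
            err = np.sum((small[y:y + ph, x:x + pw] - query) ** 2)
            if err < best_err:
                best, best_err = (y, x), err
    return best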

  5. Acknowledgement

We thank our thesis advisors, Associate Professors J. Seetaram and G. Narsimhulu, for their support and guidance. Their energy and insight at all levels were a constant inspiration.

References

  1. M. J. Chen, C. H. Huang, and W. L. Lee, "A fast edge-oriented algorithm for image interpolation," Image and Vision Computing, 23:791-798, 2005.

  2. R. Fattal, "Image upsampling via imposed edge statistics," ACM Transactions on Graphics, 26(3):95, 2007.

  3. W. T. Freeman, T. R. Jones, and E. C. Pasztor, "Example-based super-resolution," IEEE Computer Graphics and Applications, 22(2):56-65, 2002.

  4. A. Giachetti and N. Asuni, "Fast artifact free image interpolation," in Proc. BMVC 2008, 2008.

  5. D. Glasner, S. Bagon, and M. Irani, "Super-resolution from a single image," in Proc. 12th International Conference on Computer Vision, pages 349-356, IEEE, 2009.

  6. N. Asuni and A. Giachetti, "Accuracy improvements and artifacts removal in edge-based image interpolation," in Proc. 3rd Int. Conf. Computer Vision Theory and Applications (VISAPP 2008), 2008.

