Autofocus Mechanism for Telescope

DOI : 10.17577/IJERTV8IS030226

  • Authors : Mr. Aniket A. Kamat , Mr. Vipul V. Kulkarni , Mr. Manish A. Jadhav , Miss. Saishwari B. Deshmukh, Mr. Vijaykumar V. Patil
  • Paper ID : IJERTV8IS030226
  • Volume & Issue : Volume 08, Issue 03 (March – 2019)
  • Published (First Online): 30-03-2019
  • ISSN (Online) : 2278-0181
  • Publisher Name : IJERT
  • License: This work is licensed under a Creative Commons Attribution 4.0 International License


Mr. Aniket A. Kamat1, Mr. Vipul V. Kulkarni1, Mr. Manish A. Jadhav1, Miss Saishwari B. Deshmukh1

B.E. final year students, Department of Electronics Engineering,

Kolhapur Institute of Technology's College of Engineering, Kolhapur

Mr. Vijaykumar V. Patil2,

Assistant Professor, Department of Electronics Engineering,

Kolhapur Institute of Technology's College of Engineering, Kolhapur

Abstract- Focusing is a vital parameter in image recording. Autofocusing is the process of obtaining a sharp image projection on the camera sensor by using a closed-loop system. The basic requirements for a practical autofocusing system are speed, sharpness and robustness to noise. This paper presents an autofocusing algorithm in the spatial domain, specifically applicable to high-magnification imaging. We use the Laplacian operator to detect sharp edges in the selected region of interest and rotate a rack-and-pinion focusing mechanism with a stepper motor to obtain a sharp image projection on the camera sensor. The method has been tested on several images and scenarios.

Keywords – Autofocus; telescope; Laplacian; image processing; spatial domain.

  1. INTRODUCTION

    In the field of astrophotography, digital images are captured at extreme zoom through an optical telescope. Because of the extreme zoom and the large focal ratio of the telescope, the overall brightness of the image is low. Hence it is difficult to adjust the focus to obtain a sharp image. In observatories and professional astronomy research institutes, images are captured with advanced instruments, but in amateur astrophotography, people usually suffer from focus problems and as a result obtain blurred images.

    In this project we have developed an auto-focusing mechanism that can be connected to a telescope to obtain a sharp focus on the camera sensor, since the manual focus control provided by the telescope manufacturer is not sufficient for imaging. The mechanism consists of a rack-and-pinion telescope focuser driven by a stepper motor. The camera is attached to the eyepiece, and the whole assembly is controlled by a low-power single-board computer.


    As far as astrophotography is concerned, the number of stars visible in an image is drastically reduced when the captured image is not sharply focused. Especially in amateur asteroid hunting, searching is only possible when the image is very sharply focused. Hence an auto-focusing mechanism will help amateur astrophotographers and asteroid hunters to a great extent.

  2. LITERATURE REVIEW

    Autofocus techniques are mainly divided into two types: active autofocus and passive autofocus. The active autofocus method uses a distance-measuring device that may be based on either ultrasonic sound waves or infrared reflection. The sensor measures the distance between the camera and the object, and from this distance the algorithm decides the lens position at which a focused image is obtained on the camera sensor. [4]

    The other type is passive autofocus, which is mainly divided into two kinds: phase detection and contrast detection. In phase-detection autofocus, the incoming light ray is split into two rays. The phase-detection hardware has two separate tiny image recorders, each with its own micro lens; when the image is perfectly focused on the sensor, the two image recorders receive identical images. In contrast-detection autofocus, the contrast of successive images is measured pixel by pixel and the algorithm finds the maximum-contrast frame, which corresponds to the sharply focused image. [1, 3]
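    As a rough illustration of the contrast-detection idea, the following Python sketch scores each frame of a focus sweep by its gray-level variance and picks the maximum-contrast frame. The OpenCV-based helpers and the notion of a pre-recorded sweep are assumptions made for illustration, not details of the cited work.

```python
import cv2
import numpy as np

def contrast_score(frame):
    """Global contrast measure: variance of the grayscale pixel intensities."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(np.var(gray))

def best_of_sweep(frames):
    """Return the index of the sharpest (maximum-contrast) frame in a focus sweep."""
    scores = [contrast_score(f) for f in frames]
    return int(np.argmax(scores))
```

    In a real camera each frame corresponds to one lens position, so the index of the highest score identifies the focused position.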

    For higher-magnification imaging, the gradient-based sharpness measure has better focusing capability than correlation-based, statistics-based, transform-based and edge-based measures. The gray-level difference between neighboring pixels represents the sharpness of an image. Either the Tenengrad measure, which uses the horizontal and vertical gradients from the Sobel operator, or the Laplacian filter is a good choice for the gradient-based measure. [5, 2]
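    Both gradient-based measures named above can be computed with standard OpenCV filters. The sketch below assumes the input is already a grayscale region of interest and is given only as an illustration.

```python
import cv2
import numpy as np

def tenengrad(gray):
    """Tenengrad measure: mean squared gradient magnitude from Sobel derivatives."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(gx ** 2 + gy ** 2))

def laplacian_variance(gray):
    """Variance of the Laplacian response; higher values indicate sharper edges."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())
```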

    Table 1. Comparison of various sharpness detection methods [5]

    Sharpness measure | Advantages | Disadvantages
    Gradient based | Applicable to high magnification images; quick response | Large portion of saturation region
    Correlation based | Quick response; applicable to high magnification images; response slope is adjustable | Slightly increased computations
    Statistics based | — | Low accuracy and noise response
    Transform based | Applicable to high magnification images | Slightly increased computations
    Edge based | — | Computational complexity; separation of strong edges; not applicable to high magnification images

    Courtesy – Yi Yao, Besma Abidi, Narjes Doggaz, and Mongi Abidi, "Evaluation of Sharpness Measures and Search Algorithms for the Auto-Focusing of High Magnification Images"

  3. PROPOSED METHODOLOGY

    We implemented the auto-focusing mechanism on a self-assembled 15×50 telescope with Celestron optics. It has a precise rack-and-pinion focuser driven by a stepper motor. An open-source single-board computer, the Raspberry Pi, is the heart of the system. The Raspberry Pi captures the image through the telescope, and an algorithm in the form of a Python script measures the sharpness of the image. The Raspberry Pi has GPIOs, so the image-processing result is reflected on the GPIOs as stepper-motor rotation signals [6]. This covers the hardware and real-world implementation.
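    As a hedged sketch of how the GPIO side might look, the snippet below pulses a step/direction stepper driver from the Raspberry Pi using the RPi.GPIO library. The pin numbers, driver type and step timing are assumptions, since the paper does not specify them.

```python
import time
import RPi.GPIO as GPIO

# Hypothetical pins for a step/direction stepper driver;
# the actual wiring is not specified in the paper.
STEP_PIN = 20
DIR_PIN = 21

GPIO.setmode(GPIO.BCM)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def move_focuser(steps, clockwise=True, delay=0.002):
    """Pulse the stepper driver to move the rack-and-pinion focuser."""
    GPIO.output(DIR_PIN, GPIO.HIGH if clockwise else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(delay)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay)

# Example: move the focuser 10 steps clockwise
# move_focuser(10, clockwise=True)
```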

    For image sharpness measurement, we implement a gradient-based scheme. In an image, the gray-level differences among neighboring pixels provide a reasonable representation of its sharpness. To obtain the focus coordinates, an LCD display is interfaced with the hardware and continuously shows the current camera data. The user gives the focus coordinates either by a finger touch or by a console click event.
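    A small sketch of how the user-supplied focus coordinates could define the region of interest used for the sharpness measurement; the square ROI size is an arbitrary illustrative choice, not a value from the paper.

```python
import cv2

def crop_roi(frame, x, y, half_size=50):
    """Crop a square region of interest around the user-selected focus point
    and convert it to grayscale for the sharpness measurement."""
    h, w = frame.shape[:2]
    x0, x1 = max(0, x - half_size), min(w, x + half_size)
    y0, y1 = max(0, y - half_size), min(h, y + half_size)
    roi = frame[y0:y1, x0:x1]
    return cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
```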

    1. As the sharpness increases, at some point of the focuser movement the variance reaches its maximum value and then starts decreasing toward zero. This is quite similar to a hill-climb algorithm.

    2. The variance value of every image is stored in an array. Each time, the maximum variance in the array is computed. When the most recent value drops below 20% of that maximum, it is concluded that the descent of the hill is happening.

    3. At this point the motor direction is reversed, which moves the rack-and-pinion focuser in the opposite direction, and the sharpness of the image starts to increase.

    4. In the last step of the algorithm, every new value is compared with the maximum variance value in the array, and when the recent value reaches 95% of the maximum the motor is stopped. The resulting image is a sharply focused image.


    5. It has been found that, due to telescope vibrations and a slight shake caused by inaccuracies in the mount, some variance values show abrupt deviations between successive readings. The average of five successive values is taken in order to suppress those abrupt deviations, so the graph of focus position vs. variance becomes more accurate and smooth (a sketch of this averaging follows Fig 1).

      Fig 1. Block Diagram of the system
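    A minimal sketch of the five-sample averaging mentioned in step 5, using a fixed-length deque as the running window; the class name and interface are illustrative, not taken from the paper.

```python
from collections import deque

class VarianceSmoother:
    """Running average of the most recent variance readings (five by default)."""

    def __init__(self, window=5):
        self.window = deque(maxlen=window)

    def update(self, variance):
        """Add a new variance reading and return the current smoothed value."""
        self.window.append(variance)
        return sum(self.window) / len(self.window)
```

    Feeding each new variance reading through update() returns the smoothed value that is then used by the hill-climb logic.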

  4. PROPOSED ALGORITHM

    In this algorithm, comparison of successive variances is used to obtain a sharp image automatically. First, the image is recorded by the camera and the user is asked to specify a region of interest for focusing; further processing is done on this region of interest. The detailed sequence of the algorithm is given below.

      1. Project the live stream from the telescope camera on the user screen. Ask the user to give the focus coordinates through the provided console and select the region of interest on which the mathematical operations to measure sharpness will be performed.

      2. When the user gives the focus coordinates, it is necessary to park the focuser at its reference point, since the rack-and-pinion mechanism has two ends. When the focuser is parked, the reference point is set to zero by a limit switch (see the sketch after this list).

      3. With the focuser at the reference position, moving it in a predefined direction varies the distance between the two lenses of the telescope and hence changes the sharpness of the image.
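    A hedged sketch of the parking step in point 2: the focuser is stepped toward one end until the limit switch closes, and that position is taken as step zero. The GPIO pin numbers, switch polarity and timing are assumptions, since the paper only states that a limit switch defines the reference point.

```python
import time
import RPi.GPIO as GPIO

LIMIT_PIN = 16   # hypothetical limit-switch input pin
STEP_PIN = 20    # hypothetical step pin of the stepper driver
DIR_PIN = 21     # hypothetical direction pin

GPIO.setmode(GPIO.BCM)
GPIO.setup(LIMIT_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def park_focuser(delay=0.002):
    """Drive the focuser toward one end until the limit switch closes,
    then treat that position as step zero."""
    GPIO.output(DIR_PIN, GPIO.LOW)        # move toward the reference end
    while GPIO.input(LIMIT_PIN):          # switch assumed active-low
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(delay)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay)
    return 0                              # focuser position counter reset
```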

        [Fig 2 flow chart: stream camera data on the display; if no ROI is defined, keep streaming; once the ROI is defined, crop it and convert it to a grayscale image; park the focuser to the reference point; rotate the motor anticlockwise until V < 0.1 Vmax; then rotate the motor clockwise until V > 0.9 Vmax; output the sharp image.]

        Fig 2. Flow chart
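    The flow chart can be summarised as a short Python sketch. The capture_roi() and step_motor() helpers are hypothetical placeholders for the camera and stepper routines described earlier, and the 20% / 95% thresholds follow the description in Section 3; this is an illustrative sketch rather than the authors' exact script.

```python
import cv2

def laplacian_variance(gray):
    """Sharpness measure: variance of the Laplacian-filtered grayscale ROI."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def autofocus(capture_roi, step_motor, descend_frac=0.20, stop_frac=0.95, max_steps=500):
    """Hill-climb focus search following the flow chart.

    capture_roi()          -> current grayscale ROI (hypothetical camera helper).
    step_motor(clockwise)  -> moves the focuser one step (hypothetical GPIO helper).
    """
    history = []

    # Phase 1: sweep anticlockwise from the parked end, over the variance peak,
    # until the reading falls well below the maximum seen so far.
    for _ in range(max_steps):
        step_motor(clockwise=False)
        v = laplacian_variance(capture_roi())
        history.append(v)
        if v < descend_frac * max(history):
            break

    # Phase 2: reverse direction (clockwise) and stop close to the recorded maximum.
    v_max = max(history)
    for _ in range(max_steps):
        step_motor(clockwise=True)
        if laplacian_variance(capture_roi()) >= stop_frac * v_max:
            break   # sharply focused; the motor is stopped at this position
```

    The first loop corresponds to the anticlockwise sweep past the variance peak, and the second to the clockwise return that stops near the maximum.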

        Fig 3. Focus position vs. Variance without averaging.

        Fig 4. Focus position vs. Variance with averaging.

  5. STATISTICAL PARAMETERS

    1. Laplacian operator

      $\nabla^2 f(x, y) = \frac{\partial^2 f}{\partial x^2} + \frac{\partial^2 f}{\partial y^2}$

      Where,
      $\nabla^2 f(x, y)$ is the output image
      $f(x, y)$ is the pixel intensity value

    2. Variance

      $\sigma^2 = \frac{\sum_{i=1}^{N} (x_i - \mu)^2}{N}$

      Where,
      $\sigma^2$ is the variance
      $x_i$ is the value of an individual data point
      $\mu$ is the mean of the data points
      N is the total number of data points

    3. Averaging of variance

      $\bar{V} = \frac{1}{n} \sum_{i=0}^{n-1} V_i$

      Where $V_i$ is the variance of the $i$-th latest frame and $n$ is the number of successive frames averaged (five in this work).

  6. EXPERIMENTAL RESULT

    Fig 5. Focusing algorithm working, from defocused image (a) to sharp image (d). Images captured through the telescope at 15× magnification with a USB web camera. Measured variance of the ROI: (a) 139.8, (b) 814.3, (c) 1292.1, (d) 1565.8.

    Table 2. Real time data of variance, motor direction and focus status with respect to iteration steps.

    Step number | Motor direction | Variance of ROI | Focus status
    1  | Anticlockwise | 32.62928 | Defocus
    2  | Anticlockwise | 38.18802 | Defocus
    3  | Anticlockwise | 41.80423 | Defocus
    4  | Anticlockwise | 52.72631 | Defocus
    5  | Anticlockwise | 61.19632 | Defocus
    6  | Anticlockwise | 80.38739 | Defocus
    7  | Anticlockwise | 126.9063 | Defocus
    8  | Anticlockwise | 216.4991 | Defocus
    9  | Anticlockwise | 339.7576 | Defocus
    10 | Anticlockwise | 572.5896 | Defocus
    11 | Anticlockwise | 1059.164 | Moderate focus
    12 | Anticlockwise | 1950.688 | Moderate focus
    13 | Anticlockwise | 2716.945 | Perfect focus
    14 | Anticlockwise | 3157.174 | Perfect focus
    15 | Anticlockwise | 2966.333 | Perfect focus
    16 | Anticlockwise | 2225.528 | Moderate focus
    17 | Anticlockwise | 1479.209 | Moderate focus
    18 | Anticlockwise | 828.8435 | Defocus
    19 | Anticlockwise | 429.8095 | Defocus
    20 | Anticlockwise | 227.6932 | Defocus
    21 | Anticlockwise | 144.8395 | Defocus
    22 | Anticlockwise | 148.7029 | Defocus
    23 | Anticlockwise | 141.8684 | Defocus
    24 | Clockwise | 147.5137 | Defocus
    25 | Clockwise | 247.0296 | Defocus
    26 | Clockwise | 480.1231 | Defocus
    27 | Clockwise | 1029.202 | Moderate focus
    28 | Clockwise | 1761.385 | Moderate focus
    29 | Clockwise | 2642.486 | Perfect focus
    30 | Stopped | 2935.268 | Perfect focus

    Fig 6. Low light performance of the proposed algorithm: (a) Defocused, (b) Partially focused, (c) Focused.

  7. CONCLUSION

    A novel approach to telescope autofocusing using a gradient-based sharpness measure and a hill-climb search algorithm is presented. The algorithm was experimentally tested on both celestial and terrestrial objects. In the gradient-based measure, a large area of the focus vs. variance plot lies in the saturation region. The focus accuracy achieved by the proposed algorithm is found to be 95%. The accuracy increases, with a reduced chance of algorithm malfunction, when the variance-averaging method is used. To further improve accuracy, multiple sharpness detection algorithms mentioned in this paper can be combined.

  8. REFERENCES

  1. Hashim Mir, Peter Xu, and Peter van Beek, Cheriton School of Computer Science, University of Waterloo, "An Autofocus Algorithm for Digital Cameras Based on Supervised Machine Learning".

  2. J. L. Pech-Pacheco and G. Cristóbal, Imaging and Vision Dept., Instituto de Óptica (CSIC), Serrano 121, 28006 Madrid, Spain; J. Chamorro-Martínez and J. Fernández-Valdivia, Depto. Ciencias de la Computación e I.A., ETS de Ingeniería Informática, Avda. Andalucía 38, 18071 Granada, Spain, "Diatom Autofocusing in Brightfield Microscopy: A Comparative Study".

  3. Przemysław Śliwiński and Paweł Wachel, Institute of Computer Engineering, Control and Robotics, Wrocław University of Technology, Wrocław, Poland, "A Simple Model for On-Sensor Phase-Detection Autofocusing Algorithm", Journal of Computer and Communications, 2013, 1, 11-17.

  4. Daniel Vaquero, Natasha Gelfand, Marius Tico, Kari Pulli, and Matthew Turk, Nokia Research Center Palo Alto and University of California, Santa Barbara, "Generalized Autofocus".


  5. Yi Yao, Besma Abidi, Narjes Doggaz, and Mongi Abidi, Imaging, Robotics and Intelligent Systems Laboratory, 317 Ferris Hall, The University of Tennessee, Knoxville, TN 37996, USA, "Evaluation of Sharpness Measures and Search Algorithms for the Auto-Focusing of High Magnification Images".

  6. Achim Mester, "Contrast Detection Autofocus (CDAF) for Telescopes using a Stepper Motor and a Raspberry Pi", July 6, 2014.
