## Abstract

This paper analyzes the effects of the angular motion of a small-sized imaging system equipped with an optical image stabilizer (OIS) on image quality. Accurate lens moving distances required for the OIS to compensate for the ray distortion induced by angular motion are determined. To calculate the associated modulation transfer function, the integrated and the compensated point spread functions are defined. Finally, the deterioration of image resolution due to angular motion and the restorative performance of the OIS are analyzed for seven types of angular motion.

©2008 Optical Society of America

## 1. Introduction

As small-sized optical imaging systems for mobile phones become more widespread, the demand for high-quality imaging is growing. The image quality of an optical imaging system can be degraded by various factors; in particular, hand tremor can cause severe image blurring. Image stabilizers are used to compensate for hand tremor, which is aggravated under several conditions. First, as the exposure time increases, image blurring due to hand tremor becomes more severe; therefore, under indoor photographic conditions, which require long exposures, hand tremor generates strong image blurring. Second, the typical shape of a mobile phone is not well suited to taking photographs, and this poor ergonomics can lead to sizeable hand tremor. Since image blurring significantly reduces image resolution, an image stabilizer is required to improve image quality. Image stabilizers can be divided into two types, *viz*. digital and optical image stabilizers (DIS and OIS, respectively). A DIS reduces image blur by using an image-restoration filter [1, 2]. This technique estimates image motion from the blurred images themselves, without external measurement devices such as gyroscopes or accelerometers [3]. However, a DIS requires additional buffer memory for image processing and takes a long time to measure and correct the image; in addition, its compensation performance is limited. An OIS compensates for image motion via mechanical movement of the lens system or the sensor [4, 5]. Image blurring due to the motion of the imaging system arises from the difference between the initial and final positions of the optical rays on the image sensor. Moving the lens system or the sensor changes the optical path in the imaging system; therefore, image blurring can be reduced efficiently by optimizing the movement of either the lens group or the sensor.

The optical transfer function (OTF) is the most convenient way to quantify image degradation due to image motion. The magnitude of the OTF is the modulation transfer function (MTF), which expresses the resolution of an imaging system. A. Stern *et al.* calculated the OTF of an image blurred by arbitrary motion [6], and I. Klapp *et al.* defined the OTF by considering the space-variant effect of angular motion [7]. From this research, image resolution degradation due to imaging-system motion can be predicted. Although image-restoration filters for DIS have been designed and optimized [3, 8], the compensation effects of OIS have not been studied. Therefore, to improve the performance of an OIS, an accurate simulation model is needed for OIS as well as for DIS. In this paper, we describe our study of OIS compensation effects. We calculated the positions of the deviated rays on the sensor due to the imaging system's motion and the lens moving distance required to compensate for that motion. To support our results on OIS compensation effects, we calculated the compensated OTF, including both the effect of the imaging system's motion and the movement of the lens system.

## 2. Calculation of the ray distortion based on imaging system motion and the lens moving distance required for compensation

We designed a small-sized optical imaging system consisting of four aspherical plastic lenses, as shown in Fig. 1. The optical specifications of the imaging system are listed in Table 1. The *F*-number of the optical imaging system is 2.8. The effective focal length (EFL) and the overall distance from the aperture to the image sensor are 4.84 and 5.6 mm, respectively.

Rays originating from an object are focused on the image sensor by an optical imaging system. An imaging system generates an image by mapping everything within the field of view onto a plane image sensor. The angular motion of the imaging system causes transverse variation of the light beam on the image plane, which causes image blurring. Linear motions in the imaging system can also generate image blurring. However, we assumed that only angular motion affected the image blurring, since the effect of linear motion is much smaller than the effect of angular motion [9]. The OIS compensates for ray distortion by moving the overall lens system transversally, as shown in Fig. 2. The periphery of the imaging system has a greater refractive power than its center. It is therefore possible to compensate for image distortion caused by the system’s motion through transverse lens movements.

In this section, we calculate the deviations of the incident rays caused by the imaging system’s motion and the moving distance of the lens required for correcting the ray distortion. For our calculation, we assumed the following conditions.

1. The imaging system moves in only one angular direction.

2. The object is at an infinite distance.

Table 2 lists the original position of the chief ray and its deviation after an angular motion of the imaging system of 1°. We also list the change of the chief-ray position after compensation. In Table 2, the compensation rate, $\eta_{comp}$, is expressed as

$$\eta_{comp}=\frac{\mathrm{abs}(e_{motion})-\mathrm{abs}(e_{comp})}{\mathrm{abs}(e_{motion})}\times 100\,[\%],\tag{1}$$

where $e_{motion}$ and $e_{comp}$ are the changes of the chief-ray position from its original position due to the angular motion of the imaging system and due to the lens motion, respectively, and $\mathrm{abs}$ denotes the absolute value. The first column lists the field angle: 0.0 field indicates that the incident angle of the light onto the imaging system is 0°, and 1.0 field indicates the maximum field angle (i.e., 30°).

The change in the chief-ray position increases as the field angle increases. To compensate for these variations, we attempted to move the overall lens system by as much as -90 µm in the vertical direction. After lens movement, the variation in the chief-ray position becomes very small. Although the compensation rate varies as a function of field angle, the compensation rates from 0.0 to 0.8 fields are more than 90%.
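The compensation rate defined in Eq. (1) is straightforward to evaluate from ray-trace outputs. The sketch below uses hypothetical chief-ray deviations, not the values of Table 2, purely to illustrate the calculation:

```python
def compensation_rate(e_motion, e_comp):
    """Compensation rate of Eq. (1): the percentage of the chief-ray
    deviation that the compensating lens movement removes."""
    return (abs(e_motion) - abs(e_comp)) / abs(e_motion) * 100.0

# Hypothetical chief-ray deviations (mm) for a 1 deg tilt: the lens shift
# reduces a 95 um deviation to a 5 um residual (illustrative values only)
print(f"{compensation_rate(e_motion=0.095, e_comp=0.005):.1f} %")
```

A residual deviation near zero drives the rate toward 100%, matching the behavior reported for the 0.0 to 0.8 fields.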

We also determined the lens moving distance that maximizes the compensation rate as a function of the angular motion of the imaging system. To obtain the compensation rate over the whole field, we calculated the overall compensation rate, $\eta_{overall}$, weighting each field by its area on the image sensor. Figure 3 shows the lens moving distances and compensation rates as a function of the tilt angle due to the angular motion.

The lens moving distance required for image compensation is proportional to the tilt angle due to the angular motion of the imaging system. On the basis of our calculations, we conclude that the moving distance required for image compensation is determined by the EFL and the tilt angle,

$$d=\mathrm{EFL}\times\tan\theta,\tag{2}$$

where $d$ is the lens moving distance required for image correction and $\theta$ is the tilt angle. Substituting the EFL (4.84 mm) into Eq. (2), $d$ is 84.5 µm for a tilt angle of 1°. However, if we correct by moving the lens 84.5 µm, the overall compensation rate is smaller than that obtained with a moving distance of 90.0 µm. In general, a wide-angle imaging system is anisoplanatic, and its EFL varies as a function of the field angle. Therefore, to obtain from Eq. (2) the moving distance that maximizes the compensation rate, the EFL for the 0.4 or 0.5 field position must be adopted. The overall compensation rate gradually decreases as the tilt angle increases, but it remains in excess of 94.6% for a tilt angle of 1°.
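A quick numerical check of Eq. (2) reproduces the quoted moving distance; the helper name below is ours, not the paper's:

```python
import math

def lens_shift_um(efl_mm, tilt_deg):
    """Lens moving distance d = EFL * tan(theta) of Eq. (2), in micrometers."""
    return efl_mm * math.tan(math.radians(tilt_deg)) * 1000.0

# EFL = 4.84 mm (Table 1) and a 1 deg tilt reproduce the 84.5 um
# moving distance quoted in the text
print(f"{lens_shift_um(4.84, 1.0):.1f} um")
```

Using a larger effective EFL for the mid-field positions, as the text recommends for this anisoplanatic system, yields the 90 µm shift that maximizes the overall compensation rate.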

## 3. Calculation of the optical-transfer function taking into account the angular motion of the optical imaging system and the optical image stabilizer

Image blurring due to angular motion of the imaging system causes a significant reduction in the image resolution. For a quantitative analysis of the image resolution, in this section, we calculate the OTF, taking into account the angular motion of the imaging system. We also calculate the OTF for the OIS after compensating for image motion through adjustment of the lens position. To calculate the OTF, we applied linear systems theory using Fourier optics. First, we required the pupil function of the imaging system. A pupil function defines the irradiance and aberration characteristics over the pupil of the imaging system. A pupil function with a wavefront error can be represented mathematically as

$$P(x,y)=P_{0}(x,y)\exp\!\left[i\,\frac{2\pi}{\lambda}W(x,y)\right],\tag{3}$$

where $P_{0}(x,y)$ is the ideal exit pupil function, $W(x,y)$ is the aberration function generated in the imaging system, and $\lambda$ is the wavelength. The Fourier transform of the pupil function is the amplitude impulse response; thus, we can calculate the point spread function (PSF) by multiplying the amplitude impulse response by its complex conjugate [10]. The incoherent normalized OTF can then be expressed as the Fourier transform of the PSF,

$$\mathrm{OTF}(\omega_{x},\omega_{y})=\frac{S(\omega_{x},\omega_{y})}{S(0,0)},\tag{4}$$

where $S(\omega_{x},\omega_{y})$ is the Fourier transform of the PSF in the spatial-frequency domain [11]. The PSF under angular motion can be calculated by integrating the PSF over each position in the image plane covered during the exposure [7]. These PSFs, integrated over the full exposure time, yield OTFs that include the effects of the angular motion of the imaging system and of the compensating lens movement. Consequently, to calculate the integrated PSF, we must first know the PSF as a function of field position and of the changes induced by the angular motion of the imaging system and by the compensation. The "motion PSF" is the PSF affected by the angular motion of the camera, which can thus be expressed as

$$\mathrm{PSF}_{motion}(x,y)=\frac{1}{t_{exp}}\int_{0}^{t_{exp}}\mathrm{PSF}\bigl(x-x_{motion}(t),\,y-y_{motion}(t);\,\alpha_{motion}(t)\bigr)\,dt,\tag{5}$$

where $t$ and $t_{exp}$ are the exposure time of the imaging system and the maximum exposure time, respectively; $x_{motion}(t)$ and $y_{motion}(t)$ are the coordinates in the image plane during the imaging system's motion; and $\alpha_{motion}(t)$ is the incident angle during the angular motion of the imaging system. Moving the overall lens system to compensate for image blurring does not change the incident angle $\alpha_{motion}(t)$ (i.e., the combination of the static incident angle and the tilt angle); however, the PSF coordinates return toward their original positions. Therefore, the compensated PSF is expressed as

$$\mathrm{PSF}_{comp}(x,y)=\frac{1}{t_{exp}}\int_{0}^{t_{exp}}\mathrm{PSF}\bigl(x-x_{motion}(t)+x_{comp}(t),\,y-y_{motion}(t)+y_{comp}(t);\,\alpha_{motion}(t)\bigr)\,dt,\tag{6}$$

where $x_{comp}(t)$ and $y_{comp}(t)$ are the coordinates in the image plane during compensation by the OIS. As an example, we calculated the motion PSF and the compensated PSF for two simple linear motions. The first assumes a constant angular velocity of 0.5 deg/s and an exposure time of 0.1 s; the second assumes the same angular velocity of 0.5 deg/s for an exposure time of 0.2 s. The direction of motion is assumed to be vertical. Figure 4 shows the original, motion, and compensated PSFs.

On the basis of the shape of the motion PSF, we can predict that the resolution of an image affected by angular motion is reduced, whereas little difference exists between the compensated and the original PSFs. Substituting the integrated PSF into Eq. (4), the modulation transfer function (MTF) was calculated for the 0.0, 0.3, 0.7, and 0.95 fields, as shown in Figs. 5 and 6. At a spatial frequency of 100 line-pairs/mm (LP/mm), the MTF is reduced by the angular motion (for an exposure time of 0.1 s) by 52.6, 52.5, 38.2, and 26.5% for the 0.0, 0.3, 0.7, and 0.95 fields, respectively. The compensated MTF is shown in Figs. 5 and 6 as the curves with triangle markers. For the 0.0 and 0.3 fields, the image is corrected perfectly by moving the lens. For large incident angles, however, the MTF reduction is not compensated perfectly, despite application of the OIS.
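The MTF calculation of Eq. (4) reduces to a normalized Fourier transform of the integrated PSF. The one-dimensional sketch below, with an assumed Gaussian static PSF and a boxcar motion smear (both illustrative, not the paper's ray-traced data), shows how motion lowers the MTF at a given spatial frequency:

```python
import numpy as np

def mtf(psf, pitch_mm):
    """MTF as the magnitude of the normalized Fourier transform of the
    PSF (Eq. 4); returns spatial frequencies in cycles/mm and MTF values."""
    otf = np.fft.rfft(psf)
    freqs = np.fft.rfftfreq(len(psf), d=pitch_mm)
    return freqs, np.abs(otf) / np.abs(otf[0])

pitch = 0.0005                                   # sample pitch, mm (0.5 um)
x = (np.arange(400) - 200) * pitch
static = np.exp(-x**2 / (2 * 0.002**2))          # stand-in static PSF (1-D)
smear = np.where(np.abs(x) <= 0.0026, 1.0, 0.0)  # ~5 um boxcar motion smear
blurred = np.convolve(static, smear, mode="same")

f, mtf_static = mtf(static, pitch)
_, mtf_blur = mtf(blurred, pitch)
idx = int(np.argmin(np.abs(f - 100.0)))          # bin nearest 100 LP/mm
# the motion-blurred MTF is lower at every nonzero frequency
print(mtf_static[idx], mtf_blur[idx])
```

Because blurring is a convolution, the blurred MTF is the static MTF multiplied by the (sinc-like) transfer function of the smear, which is why the reduction grows with spatial frequency.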

## 4. Measurements of hand tremor and application of the simulation model

To characterize the motion of small-sized imaging systems in mobile phones, we measured the hand tremor involved in photography using a small-sized gyroscope [9]. In the experiment, the gyroscope was attached to a solid dummy body with a size and shape similar to those of a mobile phone. Figure 7 shows an example of the measured results; as shown, hand tremor occurs randomly. To characterize the hand tremor, we measured the average values of the maximum and momentary variations of the tilt angle over eight identical experiments. We assumed that the maximum exposure time of small-sized optical imaging systems in mobile phones is 0.1 s. The maximum variation of the tilt angle over the 0.1 s exposure time is 0.26°, and the momentary variation in any 0.01 s interval is 0.04°. Using these results, we defined seven motion patterns for mobile phones, as shown in Fig. 8.
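The windowed tilt-variation statistics above can be extracted from a sampled gyroscope trace as follows. The trace here is synthetic (a 10 Hz wobble plus drift and noise, with an assumed 1 kHz sampling rate), standing in for the measured data of Fig. 7:

```python
import numpy as np

FS = 1000                       # assumed gyroscope sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / FS)
rng = np.random.default_rng(0)
# synthetic tremor trace (deg): ~10 Hz wobble plus slow drift and noise;
# illustrative only -- not the measured data of Fig. 7
tilt = (0.04 * np.sin(2 * np.pi * 10 * t) + 0.1 * t
        + 0.005 * rng.standard_normal(t.size))

def max_variation(tilt, fs, window_s):
    """Largest peak-to-peak tilt change within any window of window_s seconds."""
    n = int(window_s * fs)
    return max(np.ptp(tilt[i:i + n]) for i in range(len(tilt) - n + 1))

print(max_variation(tilt, FS, 0.10))   # cf. the measured 0.26 deg per 0.1 s
print(max_variation(tilt, FS, 0.01))   # cf. the measured 0.04 deg per 0.01 s
```

Averaging these two statistics over repeated trials, as done over the eight experiments, gives the tremor bounds used to construct the motion patterns of Fig. 8.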

In types I and II, the tilt angle increased linearly with angular velocities of 1.3 and 2.6 deg/s, respectively. In types III, IV, and V, the angular accelerations were -40, 20, and 40 deg/s^{2}, respectively. In types VI and VII, the tilt angle varied at frequencies of 5 and 10 Hz, respectively, with a maximum tilt angle of 0.08°. The limits on the angular velocity and angular acceleration were chosen so as not to exceed the measured values. Figure 9 shows the MTF of each pattern applied to the 0.7 field. The MTF compensation rates achieved by the OIS for motion types I-VII were 91.9, 83.6, 86.0, 95.2, 92.3, 92.6, and 92.7%, respectively, at 200 LP/mm. We conclude from the simulation results that the angular velocity and acceleration of the imaging system affect both the MTF degradation and the OIS performance: as the velocity and acceleration increased, both the MTF level under motion and the compensation effect of the OIS decreased. Comparing types III and V, even though the final tilt angle was the same, the initial angular velocity affected the OIS performance; the OIS performed better for the smaller initial angular velocity. The frequency of the motion affected mainly the MTF degradation and had only a slight effect on the OIS performance (types VI and VII).

The time delay induced by the detection of hand tremor and the movement of the lens can affect OIS performance. Gyroscopes can generally detect high-frequency motion above 10 kHz, whereas the frequency of hand tremor is approximately 10 Hz [9]; therefore, hand tremor can be detected in real time. However, when the lens is moved by a small-sized actuator, frictional forces can generate a time delay. To examine the effect of actuator delay, we calculated the OIS performance for motion types II, III, VI, and VII of Fig. 8 with time delays of 0.01, 0.03, and 0.05 s. Figure 10 shows the simulation results. For motion type II, the OIS performance for delays of 0.01 and 0.03 s did not vary significantly, because the imaging system moved in the same linear direction; with a 0.05 s delay, however, the OIS performance decreased significantly. For motion type III, the OIS performance decreased significantly for all but the 0.01 s delay. When the frequency of the motion is considered, the time delay is the critical factor in OIS performance: for a large time delay, the MTF with OIS showed no improvement over the MTF without OIS. We therefore conclude that the OIS actuator should be designed with a time delay of less than 1/20 of the minimum period of the motion.
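The sensitivity to delay for oscillatory motion can be estimated with a simple pure-delay model (our assumption, not the paper's actuator model): for sinusoidal tilt $\theta(t)=A\sin(2\pi f t)$, an OIS that applies the correction a time $\tau$ late leaves the residual $\theta(t)-\theta(t-\tau)$, whose peak amplitude is $2A\sin(\pi f\tau)$:

```python
import math

def residual_fraction(freq_hz, delay_s):
    """Peak residual motion left by an OIS that tracks a sinusoidal tilt
    theta(t) = A*sin(2*pi*f*t) with a pure time delay tau: the residual
    theta(t) - theta(t - tau) has peak amplitude 2*A*sin(pi*f*tau)."""
    return 2.0 * abs(math.sin(math.pi * freq_hz * delay_s))

# 10 Hz tremor (type VII) with the delays examined in the text
for tau in (0.005, 0.01, 0.03, 0.05):
    print(f"delay {tau:.3f} s -> residual {residual_fraction(10, tau):.2f} * A")
# at tau = T/20 = 0.005 s the residual is only ~0.31*A; at tau = T/2 =
# 0.05 s the delayed correction adds to the motion instead of cancelling it
```

This simple model is consistent with the 1/20-of-a-period design rule: at that delay the residual is about 31% of the motion amplitude, and it grows rapidly for longer delays.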

## 5. Conclusions

To prevent deterioration of image quality due to the motion of small-sized imaging systems in mobile phones, an OIS is necessary. In this paper, we analyzed the effects on image resolution of the angular motion of optical imaging systems and of the application of an OIS. Using ray-tracing analysis, the compensation rate was calculated as a function of field position, taking into account the angular motion and the compensating lens movement. The compensation rate exceeds 94.6% for an angular motion of 1°.

Through an analysis of the PSF changes during exposure, we calculated the deterioration of image resolution induced by angular motion as well as the compensating effect of the OIS. The deterioration of image resolution differs for each motion type, and image resolution is reduced across the frequency domain by angular motion. However, we calculated that the use of an OIS can compensate for the reduced image resolution by 83.6 to 95.2% at a spatial frequency of 200 LP/mm (0.7 field). In addition, when the compensation is performed by a small-sized actuator, a time delay can arise; we concluded that the actuator should be designed with a time delay of less than 1/20 of the minimum period of the motion for efficient OIS performance.

It is necessary to analyze image resolution degradation due to the angular motion of the imaging system and the OIS compensation effects to derive more accurate moving algorithms for the OIS. For more accurate results, our analysis considered the effects of the space-variant and anisotropic properties of a practical wide-angle imaging system. Based on this analysis, we can predict OIS performance and optimize OIS moving algorithms.

## Acknowledgments

This study was supported by a Korea Science and Engineering Foundation (KOSEF) grant funded by the Korean government (MEST) (No. R17-2008-040-01001-0).

## References and links

**1. **O. Hadar, M. Robbins, Y. Novogrozky, and D. Kaplan, “Image motion restoration from a sequence of images,” Opt. Eng. **35**, 2898–2904 (1996) [CrossRef]

**2. **A. Stern and N.S. Kopeika, “General restoration filter for vibrated-image restoration,” Appl. Opt. **37**, 7596–7603 (1998) [CrossRef]

**3. **Y. Yitzhaky, I. Mor, A. Lantzman, and N.S. Kopeika, “Direct method for restoration of motion-blurred images,” J. Opt. Soc. Am. A **15**, 1512–1519 (1998) [CrossRef]

**4. **C.W. Chiu, P.C.-P. Chao, and D.Y. Wu, “Optimal design of magnetically actuated optical image stabilizer mechanism for cameras in mobile phones via genetic algorithm,” IEEE T. Magn. **43**, 2582–2584 (2007) [CrossRef]

**5. **B. Golik and D. Wueller, “Measurement method for image stabilizing systems,” Proc. SPIE **6502**, 65020O-1-10 (2007)

**6. **A. Stern and N. S. Kopeika, “Analytical method to calculate optical transfer functions for image motion and vibrations using moments,” J. Opt. Soc. Am. A **14**, 388–396 (1997) [CrossRef]

**7. **I. Klapp and Y. Yitzhaky, “Angular motion point spread function model considering aberrations and defocus effects,” J. Opt. Soc. Am. A **23**, 1856–1864 (2006) [CrossRef]

**8. **A. Stern and N. S. Kopeika, “Analytical method to calculate optical transfer functions for image motion and its implementation in vibrated image restoration,” in Proceedings of Nineteenth Convention of Electrical and Electronics Engineers in Israel (Institute of Electrical and Electronics Engineers, Israel, 1996), pp. 379–382. [CrossRef]

**9. **D. Sachs, S. Nasiri, and D. Goehl, “Image stabilization technology overview,” http://www.invensense.com/shared/pdf/ImageStabilizationWhitepaper_051606.pdf

**10. **J. W. Goodman, *Introduction to Fourier Optics* (Roberts and Company, 2005), Chap. 6.

**11. **N. S. Kopeika, *A System Engineering Approach to Imaging* (SPIE, 1998), Chap. 8.