Abstract

The slanted-edge method specified in ISO Standard 12233, which measures the modulation transfer function (MTF) by analyzing an image of a slightly slanted knife-edge target, is not robust against noise because it takes the derivative of each data line in the edge-angle estimation. We propose here a modified method that estimates the edge angle by fitting a two-dimensional function to the image data. The method has a higher accuracy, precision, and robustness against noise than the ISO 12233 method and is applicable to any arbitrary pixel array, enabling a multidirectional MTF estimate in a single measurement of a starburst image.

© 2014 Optical Society of America

1. Introduction

Ultra-high definition television (UHDTV) has high pixel counts of 4K (3840 × 2160 pixels) and 8K (7680 × 4320 pixels), which provide a better viewing experience than HDTV [1]. To fully exploit the features and effectiveness of UHDTV, the image resolution characteristics of 4K and 8K cameras are of primary importance. However, we currently have no clear criterion or measurement method for their spatial resolution performance.

Most end users prefer a single-valued criterion for a rapid pass/fail test of the resolution performance of their camera. For assessing broadcast HDTV cameras, the output signals are measured using a bar pattern to determine if a minimum threshold criterion is met: typically a contrast transfer function (CTF) of 45% at 800 TV lines per picture height (TVL/ph). For example, we can perform such a test using the bar pattern in the central area of the Institute of Image Information and Television Engineers (ITE) resolution chart. The camera being tested is positioned to properly frame the target marked with the horizontal and vertical framing arrows. Bar charts are easy to manufacture into targets and intuitive for measuring the CTF by visually observing the amplitude displayed on a waveform monitor. However, there is some ambiguity in measuring waveforms with amplitude fluctuations because of both signal noise and shift variance from the sampling process [2]. In addition, CTFs measured at different spatial frequencies using other bar patterns arranged horizontally in the chart are usually affected by degradation of the off-axis spatial resolution performance of the camera. The use of an ITE resolution chart for 4K and 8K cameras also raises some practical problems: a large chart will have to be prepared because wide-angle single-focus lenses are often used for these cameras, but even if a large-size ITE chart is available, it is often difficult to precisely adjust the focus because of the rough control of the focal ring and narrow depth of focus. The latter could result in a critical anisotropic modulation transfer function (MTF) and make the single-valued criterion misleading and insufficient. In that sense, a MTF measured in multiple directions is preferable for evaluating the performance of 4K and 8K cameras.

The MTF describes the magnitude response of an optical system to sinusoids of different spatial frequencies and can be used to evaluate the ability of an imaging system or its components to reproduce fine detail. For an MTF analysis, the imaging system under consideration is required to be approximately linear and shift invariant; furthermore, to preserve the convenience of a transfer-function approach for a shift-variant sampled imaging system, it is also assumed that the object being imaged has spatial frequency components with phases that are random and uniformly distributed with respect to the sampling sites. Thus, the MTF is an ensemble average of all possible MTFs and is assumed to be a shift-invariant quantity [2–4]. MTFs can be used in single-valued metrics, and MTF(f) can be approximated from CTF(f) as MTF(f) ≈ (π/4) CTF(f) when higher harmonic components are negligible. If we express the spatial frequency f in cycles per pixel, which is a more convenient unit for digital sensors, then the “45% CTF at 800 TVL/ph” criterion for broadcast HDTV cameras is equivalent to >35% MTF at 0.37 cycles/pixel. Assuming that some 4K or 8K images are to be cropped into HDTV-size images, it would be reasonable to adopt the same MTF criterion for broadcast 4K and 8K cameras.
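As a quick sanity check on this conversion (a sketch assuming the standard 1080-line HDTV picture height, with 800 TVL/ph counted as 400 full cycles per picture height):

```python
import math

# 800 TV lines per picture height = 400 cycles per picture height.
tvl_per_ph = 800
lines_per_ph = 1080                     # HDTV vertical pixel count (assumed)
f_cyc_per_px = tvl_per_ph / (2 * lines_per_ph)

# CTF-to-MTF conversion when higher harmonics are negligible.
ctf = 0.45
mtf = (math.pi / 4) * ctf

print(f"{f_cyc_per_px:.2f} cycles/pixel, MTF = {mtf:.2f}")  # 0.37 cycles/pixel, MTF = 0.35
```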

The slanted-edge method specified in ISO Standard 12233 [5] measures the MTF of digital image acquisition devices by analyzing the image data of a simple knife-edge target captured by the device. The target is framing free, enabling precise focus by adjusting the shooting distance instead of the rough focus ring. The estimated MTF is, however, a function of the horizontal or vertical spatial frequencies, not multidirectional. In addition, the method is not robust against noise [6].

Our research objective is to realize a method that estimates a multidirectional MTF with a high accuracy, precision, and robustness against noise. In this paper, we briefly discuss the ISO 12233 slanted-edge method in Section 2 and present in Section 3 a modified slanted-edge method for comparison with the ISO 12233 method in terms of the accuracy and precision of the MTF estimates. Two multidirectional MTF estimates based on the modified slanted-edge method are demonstrated in Section 4.

2. ISO 12233 slanted-edge method

The slanted-edge method specified in ISO 12233 measures the MTF as a function of the horizontal or vertical spatial frequencies of digital image acquisition devices by analyzing the image data in a user-defined rectangular region of interest (ROI) in an image of a knife-edge target captured by the device. Figure 1 shows an example slanted vertical edge and illustrates the procedure of the slanted-edge method. The data must be linear with respect to the light intensity. To measure the MTF as a function of the vertical spatial frequency, a slightly slanted horizontal edge is used, and the digital image data is rotated by 90° before performing the calculation. The edge must be slightly slanted so that the gradient of the edge is measured at various phases relative to the photoelements of the image sensor. The edge location is estimated after taking a one-dimensional derivative of each data line and finding the centroid to a subpixel accuracy. A simple linear regression is then performed on the collected line-to-line edge locations to estimate the edge angle. The pixels in the ROI are then projected along the direction of the estimated edge onto the horizontal axis, which is divided into bins with widths equal to a quarter of the sampling pitch to reduce the influence of signal aliasing on the estimated MTF. The values of the pixels collected in each bin are averaged, which generates a one-dimensional edge profile with 4 × oversampling. The pixel count in each bin must be large enough to obtain a reasonable average pixel value. The derivative of the edge profile yields the line spread function (LSF), and after applying a smoothing Hamming window to the LSF, performing a discrete Fourier transform, and normalizing, the MTF over a range of horizontal spatial frequencies beyond the Nyquist frequency can be estimated.
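The steps above can be sketched in Python with NumPy (this is an illustrative simplification, not the standard's reference implementation; the function name and the synthetic test edge are our own):

```python
import numpy as np

def iso12233_mtf(roi, nbin=4):
    """Simplified sketch of the ISO 12233 slanted-edge pipeline."""
    h, w = roi.shape
    x = np.arange(w)
    # 1. Derivative of each data line, then centroid -> subpixel edge location.
    locs = []
    for row in roi:
        d = np.abs(np.convolve(row, [0.5, 0, -0.5], mode="same"))
        locs.append(np.sum(x * d) / np.sum(d))
    # 2. Simple linear regression of the line-to-line edge locations.
    slope, _ = np.polyfit(np.arange(h), locs, 1)
    # 3. Project pixels along the edge onto the horizontal axis; bin at pitch/nbin.
    xs = (x[None, :] - slope * np.arange(h)[:, None]).ravel()
    bins = np.floor((xs - xs.min()) * nbin).astype(int)
    counts = np.bincount(bins)
    esf = np.bincount(bins, weights=roi.ravel()) / np.maximum(counts, 1)
    # 4. Derivative -> LSF; Hamming window; DFT; normalize -> MTF.
    lsf = np.convolve(esf, [0.5, 0, -0.5], mode="same")
    lsf[0] = lsf[-1] = 0.0              # drop convolution boundary artifacts
    lsf *= np.hamming(len(lsf))
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(len(lsf), d=1.0 / nbin)   # cycles/pixel
    return freqs, mtf

# Demo on a synthetic, slightly slanted logistic edge (slope of about 5 degrees).
yy, xx = np.mgrid[0:100, 0:60]
roi = 1.0 / (1.0 + np.exp(-(xx - (20 + 0.0875 * yy))))
freqs, mtf = iso12233_mtf(roi)
```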


Fig. 1 Slanted-edge method: (a) knife-edge target, and (b) projection, binning, and averaging before forming a one-dimensional edge profile.


The simplicity of the target and small test image area for the analysis are its main advantages. We can focus precisely on the target with the camera or target mounted on a linear positioning stage by shooting at several distances at regular intervals [3]. However, the ISO 12233 method is not robust against noise [6] because it takes the derivative of each data line in the edge-angle estimation. Errors in the estimation introduce a negative bias into the computed MTF, resulting in an underestimation of the actual MTF.

3. Modified slanted-edge method

We propose a modified slanted-edge method that estimates the edge angle by fitting a two-dimensional function vfit to the ROI data rather than taking the derivative of each data line. In the edge-angle estimation, when the right-hand side of the ROI has a higher level than the left-hand side,

vfit(x, y) = (vH − vL) · normcdf(x, μ + αy, σ) + vL,    (1)
where x and y are the coordinates of the pixels in the ROI, and normcdf(x, μ + αy, σ) is the cumulative distribution function of the normal distribution with mean μ + αy and standard deviation σ, evaluated at x. Here, α is the inverse of the edge slope, vH is the white level, and vL is the black level. When the left-hand side of the ROI has a higher level than the right-hand side, the sign of x is reversed. The values of α, μ, vH, and vL are optimized to minimize the sum of the squared differences between the input ROI data v(x, y) and the fitted vfit(x, y). The parameter σ is fixed at 0.5 to stabilize the optimization with fewer parameters; there is no need to optimize σ in the edge-angle estimation because neither σ nor the optimized values of μ, vH, and vL are used in the subsequent analysis. This optimization can be implemented easily using the MATLAB function “fminsearch” with proper termination criteria for the optimization run. The pixels in the ROI are then projected onto the horizontal axis along the estimated edge slope of 1/α, which forms a supersampled edge profile after binning and averaging. The MTF is then estimated in the same way as in the ISO 12233 method.
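A minimal sketch of this fit in Python, using SciPy's Nelder–Mead simplex routine as a stand-in for MATLAB's fminsearch (the function name, initial guess, and demo edge are our assumptions):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_edge_angle(roi, sigma=0.5):
    """Fit the Eq. (1) edge model to ROI data; return the inverse edge slope alpha."""
    h, w = roi.shape
    y, x = np.mgrid[0:h, 0:w]

    def sum_sq(p):
        alpha, mu, v_hi, v_lo = p
        vfit = (v_hi - v_lo) * norm.cdf(x, loc=mu + alpha * y, scale=sigma) + v_lo
        return np.sum((roi - vfit) ** 2)

    p0 = [0.0, w / 2.0, roi.max(), roi.min()]          # crude initial guess
    res = minimize(sum_sq, p0, method="Nelder-Mead")   # fminsearch analogue
    return res.x[0]

# Demo: recover a ~5-degree slant (alpha = tan 5 deg = 0.0875) from a clean edge.
hh, ww = 40, 40
yy, xx = np.mgrid[0:hh, 0:ww]
demo = norm.cdf(xx, loc=20 + 0.0875 * yy, scale=0.5)
alpha_hat = fit_edge_angle(demo)
```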

Forming an edge profile in the ISO 12233 method introduces negative bias into the computed MTF. Firstly, the binning and averaging procedures introduce a bias of sinc(f/nbin) into the computed MTF(f) [2], where f is the spatial frequency of the original image sampling, and nbin is the number of bins per native pixel. Secondly, the array [−0.5 0 0.5], which is used when the one-dimensional discrete derivative of the supersampled edge profile is calculated, adds a further bias of sinc(2f/nbin) [3,7]. Figure 2(a) shows the bias curves of sinc(f/nbin) · sinc(2f/nbin) for nbin = 4 (used in ISO 12233) and nbin = 8 as well as the estimated MTFs of a synthetically generated slanted-edge image with and without compensation of the MTF degradation for nbin = 4. The slanted-edge image was 100 (W) × 200 (H) pixels in size with an edge angle of 5° and had an MTF of |sinc(f)|^4. Compensation is recommended at least for nbin = 4, although it is neglected in the ISO 12233 method. The estimated MTF curve with compensation conforms to |sinc(f)|^4. We also compared the precision of the edge-angle estimation of the ISO 12233 method with that of the modified method using the synthetically generated slanted-edge image, adding Gaussian noise with a peak signal-to-noise ratio (PSNR) ranging from 20 to 60 dB in intervals of 2.5 dB. Figure 2(b) shows the means and standard deviations of the edge angles estimated by each method 1000 times. Figure 2(c) shows the standard deviations of the errors of the MTF estimates at 0.37 cycles/pixel estimated by each method with compensation of the MTF degradation for nbin = 4 and 8. The true MTF estimates were computed with the edge angle fixed to 5° for each method and each nbin. We see that the modified method with nbin = 8 estimates the MTF with higher precision and robustness against noise than the others.
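The compensation itself is a pointwise division of the computed MTF by the combined bias; a sketch using NumPy (whose sinc is the normalized sin(πx)/(πx), matching the convention assumed here):

```python
import numpy as np

def compensate_mtf(freqs, mtf, nbin=4):
    """Divide out the binning/averaging and derivative-filter biases.
    Note: the derivative-filter bias approaches zero as f -> nbin/2,
    so compensation should be restricted to frequencies below that."""
    bias = np.sinc(freqs / nbin) * np.sinc(2.0 * freqs / nbin)
    return mtf / bias

# Demo: an "MTF" consisting of pure bias compensates back to unity.
freqs = np.linspace(0.0, 1.0, 11)
biased = np.sinc(freqs / 4) * np.sinc(2 * freqs / 4)
flat = compensate_mtf(freqs, biased, nbin=4)
```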


Fig. 2 Errors in the slanted-edge methods: (a) bias due to binning, averaging, and derivative filtering, (b) the means and standard deviations of the estimated edge angles, and (c) the standard deviations of the errors of the MTF estimates at 0.37 cycles/pixel.


The modified method can also be used for images of edges that are curved due to lens distortions without the need to straighten the edge by a non-integer resampling of the images. The misalignment between a curved edge and a fitted straight edge also introduces a negative bias on the estimated MTF [6]. To accommodate the fit to a curved edge, we simply introduce into Eq. (1) a second-order term in y as follows:

vfit(x, y) = (vH − vL) · normcdf(x, μ + αy + βy², σ) + vL.    (2)
The pixels in the ROI are then projected along the estimated curve onto the horizontal axis. Figure 3 shows a synthetically generated curved-edge image simulating a barrel distortion with an MTF of |sinc(f)|^4 and a portion of a curved-edge image captured with a digital SLR camera (model D200; Nikon Corp., Tokyo, Japan) with a fisheye lens (Ai Fisheye-Nikkor 8 mm f/2.8S; Nikon Corp., Tokyo, Japan) at f/5.6, together with the MTF estimates using Eqs. (1) and (2) for each image. The MTF estimated with Eq. (2) conforms to |sinc(f)|^4, whereas Eq. (1) underestimates the MTF. We can customize the function vfit further, if necessary, as long as the optimization remains stable. Even if the edge appears to be straight, the MTF estimates of Eqs. (1) and (2) can be compared to ascertain whether any distortions are present.
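In code, accommodating the quadratic term is a one-line change to the Eq. (1) model function; a hedged sketch (names illustrative):

```python
import numpy as np
from scipy.stats import norm

def vfit_curved(x, y, alpha, beta, mu, v_hi, v_lo, sigma=0.5):
    """Eq. (2): cumulative-normal edge model with a second-order term in y."""
    return (v_hi - v_lo) * norm.cdf(x, loc=mu + alpha * y + beta * y**2,
                                    scale=sigma) + v_lo

# Evaluate the model on a small grid: a gently curved edge near x = 10.
x, y = np.meshgrid(np.arange(20.0), np.arange(20.0))
v = vfit_curved(x, y, alpha=0.05, beta=0.001, mu=10.0, v_hi=1.0, v_lo=0.0)
```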


Fig. 3 MTF estimates of curved-edge images using Eqs. (1) and (2): (a) synthetically generated curved-edge simulating a barrel distortion and (b) curved-edge captured with a fisheye lens.


This method can be applied to digital acquisition devices with a hexagonal-packed array, the quincunx grid of a Bayer array, or any arbitrary pixel structure without any preliminary pixel interpolation, so long as the pixel coordinates are known. The most important advantage of the modified method is that the pixels in the ROI need not be aligned horizontally or vertically, which lets us extend the method to obtain MTF estimates in multiple directions.

4. Multidirectional MTF estimate

The modified method with a custom-made chart is compared in this section to an existing method to examine the performance of the proposed method.

4.1 Modulated Siemens star method

The modulated Siemens star method is often used for multidirectional MTF estimates [8]. The modulated Siemens star chart (TE253; Image Engineering, Frechen, Germany) has a pattern that is sinusoidally modulated in the circumferential direction such that the spatial frequency increases with decreasing radius. In the analysis, an image of the modulated pattern is partitioned into several pie-shaped segments, and the MTF in each segment is estimated. The measurable spatial-frequency range with the modulated Siemens star method is, however, limited by both the chart size and the shooting conditions. Furthermore, a very high precision is required for fabricating the sinusoidally modulated pattern.

4.2 Multidirectional slanted-edge method

We applied the modified slanted-edge method described in Sec. 3 to estimate a multidirectional MTF in a single measurement of a starburst image. In this method, a bi-tonal image consisting of N radial spokes (N/2 white spokes and N/2 black spokes) spaced at angular intervals of 2π/N rad is used as the target. Figure 4(a) shows a starburst target (N = 24) and selected multidirectional ROIs on the spoke edges. The user selects an ROI (ROI0) and the center point on the image; the other ROIs (ROIn, n = 1, 2, …, N − 1) are located automatically by rotating the coordinates of the user-selected ROI0 in the clockwise direction by 2πn/N rad. When N is large, the ROIs have to be narrow or placed off the target center to minimize interference from neighboring stripes. The pixels in each ROI are rotated in the counterclockwise direction by 2πn/N rad and projected along the estimated edge slope onto the horizontal axis, as illustrated in Fig. 4(b). The edge angles of the ROIs are estimated separately, although the estimated edge angles are identical if there are no lens distortions or target misalignment and the center point is accurately selected. After the derivative of each edge profile is taken, the LSFs are aligned by their peaks and trimmed to the same length to obtain the MTFs at the same spatial frequency interval.
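The automatic ROI placement can be sketched as a rotation of ROI0's corner coordinates about the user-selected center (the function name and coordinate convention, with counterclockwise-positive angles, are our assumptions):

```python
import numpy as np

def rotate_roi(corners, center, n, N=24):
    """Locate ROIn by rotating ROI0's corners clockwise by 2*pi*n/N about center."""
    theta = -2.0 * np.pi * n / N        # negative angle = clockwise rotation
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    # Translate to the center, rotate, translate back.
    return (np.asarray(corners) - center) @ R.T + center

# Demo: a quarter turn (n = N/4) maps the point (1, 0) to (0, -1) about the origin.
out = rotate_roi([[1.0, 0.0]], np.array([0.0, 0.0]), n=6, N=24)
```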


Fig. 4 Multidirectional slanted-edge method: (a) starburst target and multidirectional ROIs selected on the spoke edges and (b) ROI rotation and pixel projection.


To demonstrate the high accuracy and precision of the multidirectional slanted-edge method, we compared the multidirectional MTF estimate with that of the modulated Siemens star method. Both the modulated Siemens star chart (130-mm diameter) and the starburst chart (N=24; Fig. 4(a)) were shot with a charge-coupled device (CCD)-based imaging colorimeter (PM-1400; Radiant Imaging, Inc., Redmond, WA, USA) and a telephoto lens (Ai AF-S Nikkor ED 300 mm f/4D IF-ED; Nikon Corp., Tokyo, Japan) at f/16. The colorimeter had a 3072 × 2048 pixel CCD and a motorized internal filter-wheel housing a color filter simulating the CIE photopic spectral luminous efficiency curve. Figure 5(a) shows the mean and standard deviation of the 24 MTF curves estimated using each method. The mean MTF estimated with the modulated Siemens star method is lower than that of the multidirectional slanted-edge method at the low spatial frequencies because the low-frequency sinusoidal pattern is relatively far from the optical axis. In addition, we see that there are some variations in the results of the modulated Siemens star method at high spatial frequencies. The contour plot of the multidirectional slanted-edge method results in polar coordinates (Fig. 5(b)) is approximately circularly symmetric, whereas that of the Siemens star method results (Fig. 5(c)) is slightly oval shaped. This discrepancy is due to an artifact of the Siemens star chart: when the chart is rotated, the resulting oval contour plot is also rotated. Next, we estimated the multidirectional MTF of our 8K video camera using a zoom lens with a focal length of 50 mm at f/4.0 [9] and the multidirectional slanted-edge method with a starburst chart made from a black anodized aluminum plate with chamfered holes on a 750-mm-diameter integral sphere illuminated by a simulated daylight lamp (SOLAX XC-100A; Seric Ltd., Tokyo, Japan). 
In this case, signal clipping due to the high contrast of the target can be prevented because the pedestal and white levels of the broadcast camera are adjustable. Figure 6 shows the estimated multidirectional MTF. The anisotropy is thought to be due to the narrow depth of focus and a small misalignment of the optical axis of the camera. The MTF at 0.37 cycles/pixel was higher than 35%, which met the current criterion for HDTV broadcast cameras.


Fig. 5 Multidirectional MTF estimates of an imaging colorimeter: (a) means (dashed) and standard deviations (solid), the contour plots in polar coordinates using the (b) slanted-edge method and (c) modulated Siemens star method.



Fig. 6 Multidirectional MTF estimate of an 8K video camera [9] using the multidirectional slanted-edge method: (a) the mean (dashed) and standard deviation (solid) and (b) the contour plot in polar coordinates.


5. Conclusions

We have proposed a modified slanted-edge method for estimating the MTF of digital image acquisition devices. The modified method offers a higher accuracy, precision, and robustness against noise than the ISO 12233 method and accommodates curved edges. Furthermore, the modified method can be applied to multidirectional MTF estimates over a range of spatial frequencies beyond the Nyquist frequency in a single measurement of a starburst image. The chart is framing free, which enables easy and precise focusing on the target. This method will also be useful in other situations besides broadcast cameras.

References and Links

1. K. Masaoka, Y. Nishida, M. Sugawara, E. Nakasu, and Y. Nojiri, “Sensation of realness from high-resolution images of real objects,” IEEE Trans. Broadcast 59(1), 72–83 (2013). [CrossRef]  

2. S. E. Reichenbach, S. K. Park, and R. Narayanswamy, “Characterizing digital image acquisition devices,” Opt. Eng. 30(2), 170–177 (1991). [CrossRef]  

3. F. Chazallet and J. Glasser, “Theoretical bases and measurement of the MTF of integrated image sensors,” Proc. SPIE 549, 131–144 (1985). [CrossRef]  

4. G. D. Boreman, Modulation Transfer Function in Optical and Electro-Optical Systems (SPIE, 2001).

5. Photography–Electronic Still Picture Cameras–Resolution Measurements, ISO Standard 12233: 2000.

6. P. D. Burns and D. Williams, “Refined slanted-edge measurement for practical camera and scanner testing,” in Proceedings of PICS 2002 (Society for Imaging Science and Technology, Springfield, VA, 2002), pp. 191–195.

7. P. D. Burns, “Slanted-edge MTF for digital camera and scanner analysis,” in Proceedings of PICS 2000 (Society for Imaging Science and Technology, Springfield, VA, 2000), pp. 135–138.

8. C. Loebich, D. Wueller, B. Klingen, and A. Jaeger, “Digital camera resolution measurement using sinusoidal Siemens stars,” Proc. SPIE 6502, 65020N (2007). [CrossRef]  

9. T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE Motion Imaging J. 120(8), 24–31 (2011). [CrossRef]  
