Abstract

Calibrating multiple camera-projector pairs for multi-view 3D surface reconstruction based on structured light is challenging. Here, we present a new automated calibration method for high-speed multi-camera-projector systems. The method uses printed and projected dot patterns on a planar calibration target, which is moved by hand through the calibration volume. Calibration is enabled by automated image processing and bundle-adjusted parameter optimization. We evaluated the performance of our method by reconstructing a sphere in 3D; the accuracy is −0.03 ± 0.09 %, expressed as a percentage of the diameter of the calibration volume. Applications include quality control, autonomous systems, engineering measurements, and motion capture, such as the preliminary 3D reconstruction of a bird in flight that we present here.
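To make the accuracy metric concrete, the following is a minimal sketch (our own illustration, not the authors' code) of how a sphere-based evaluation could be computed: fit a sphere to the reconstructed points by least squares, then express the radial residuals as a percentage of the calibration-volume diameter. The sphere radius, volume diameter, point count, and noise level are hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_sphere(points):
    """Least-squares sphere fit to an (N, 3) array of 3D points."""
    def residuals(params):
        center, r = params[:3], params[3]
        # Signed radial error of each point relative to the fitted sphere.
        return np.linalg.norm(points - center, axis=1) - r
    center0 = points.mean(axis=0)
    r0 = np.linalg.norm(points - center0, axis=1).mean()
    sol = least_squares(residuals, x0=np.append(center0, r0))
    return sol.x[:3], sol.x[3], residuals(sol.x)

# Hypothetical stand-in for points reconstructed by the structured-light system.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(2000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
true_radius = 0.1  # m (assumed)
points = dirs * (true_radius + rng.normal(scale=1e-4, size=(2000, 1)))

center, radius, res = fit_sphere(points)
volume_diameter = 0.9  # m, hypothetical calibration-volume diameter
err_pct = 100.0 * res / volume_diameter
print(f"accuracy: {err_pct.mean():+.3f} ± {err_pct.std():.3f} % of volume diameter")
```

Reporting the mean and standard deviation of `err_pct` yields a figure of the same form as the −0.03 ± 0.09 % quoted above.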

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement






Figures (8)

Fig. 1 The multi-view structured light system. a) Schematic showing the configuration of the four camera-projector pairs. To reduce cross-talk between the four projected patterns, we use three different colors matched by camera color filters for pairs 1–3. The color of the 4th pair duplicates that of the 2nd pair, and their signals are separated temporally by alternating reconstruction frames; the other signals are synchronized frame-by-frame. The calibration volume is indicated by the dashed contour, while the 2D calibration target that is moved through the volume is indicated by the dot-covered rectangle. b) The actual setup, with the four pairs of high-speed cameras and projectors that we used to reconstruct the flying bird from four camera view angles. The calibration volume and target are shown as in (a). The setup used to reconstruct the sphere, and for the majority of the calibration accuracy results, had similar spacing and the same camera-projector configuration.
Fig. 2 Multiple-view structured light calibration technique. a) An aluminum plate covered on both sides with printed dots is moved around the calibration volume by the user (two calibration target positions recorded by a single camera are shown). The gray, pseudo-random dots are printed on two paper stickers adhered to each side of the calibration target, while black tape covers the edges of the target. The printed dots for camera calibration are recorded by each high-speed camera when its paired projector is fully on (every pixel illuminated). The images are 12-bit and thus have much better contrast than can be conveyed in print. b) The same two calibration target positions as in (a), but now pseudo-random dots are projected onto the target and recorded by the high-speed camera for projector calibration. c) In our specific implementation, a sequence of 6 frames is projected and recorded at 1000 Hz every 0.5 seconds. Each projector projects four different frames imaged by different cameras. First, a projector projects uniform light (a fully on frame) in the color of the associated camera filter to make the gray dots on the calibration target visible to the paired camera. Second, a pseudo-random dot pattern is projected in all three colors so that it can be seen by the other cameras to determine the connectivity of the multi-view system.
Fig. 3 Image processing steps to detect dots. a-d) Identification of the printed dots captured by the high-speed camera. e-h) Identification of the projected dots captured by the high-speed camera. a,e) Original image as captured by the camera. The images captured by the cameras are 12-bit and thus have much better contrast than can be conveyed in print. b,f) Black-and-white image after local adaptive thresholding, which we designed to detect dots even in low-contrast lighting conditions; as a side effect, camera noise causes excess dot detections in low-contrast regions. Black regions of pixels are denoted as blobs and recorded as possible locations of printed or projected dots. Red points (·) are blobs eliminated by K-means clustering based on the following blob parameters: number of similar blobs, area, ellipticity, x position, and y position. Red crosses (X) indicate elliptical blobs eliminated in a second round of filtering because they are substantially dissimilar from the remaining ellipses. Blue circles (O) are the final set of detected dots. c,g) A subpixel edge detector identifies the edges of elliptical blobs (blue points). We then eliminate outlier edges due to overlapping printed and projected dots (labeled in red in g) using the RANSAC algorithm applied to an elliptic fit. The fitted ellipses are used to compute and display the center, major and minor axes, and orientation of each elliptical blob. d,h) Finally, we apply skew correction to the image (c,g) based on the median orientation and major-to-minor axis ratio of the detected elliptical dots. This correction transforms the ellipses back into circles and ensures that the spacing between circles is proportional to the spacing between printed or projected dots.
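
To make the thresholding step concrete, here is a minimal Matlab sketch of integral-image adaptive thresholding in the spirit of Bradley and Roth; the window size s and sensitivity t are illustrative placeholders, not the values used in our implementation.

function bw = adaptiveThreshold(img, s, t)
% Integral-image adaptive threshold: a pixel is marked as part of a dark
% blob when it is at least a fraction t darker than the local mean over
% an s-by-s window.
    img = double(img);
    intImg = cumsum(cumsum(img, 1), 2);            % integral image
    [h, w] = size(img);
    bw = false(h, w);
    half = floor(s/2);
    for x = 1:w
        for y = 1:h
            x1 = max(x-half, 1);  x2 = min(x+half, w);
            y1 = max(y-half, 1);  y2 = min(y+half, h);
            cnt = (x2-x1+1) * (y2-y1+1);
            % Window sum from the integral image by inclusion-exclusion
            S = intImg(y2, x2);
            if y1 > 1,           S = S - intImg(y1-1, x2);   end
            if x1 > 1,           S = S - intImg(y2, x1-1);   end
            if y1 > 1 && x1 > 1, S = S + intImg(y1-1, x1-1); end
            bw(y, x) = img(y, x) * cnt <= S * (1 - t);       % dark dots -> true
        end
    end
end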
Fig. 4 Automated matching of detected dots identified using image processing (blue) with pre-determined projected or printed dots (green). a) Detected dots overlaid on a skew-corrected image captured by the high-speed camera. The skewed pose of the elliptical detected dots has been corrected so that the spacing matches the original spacing when the dots were circles. b) Pre-determined printed dots. c) Detected dots are matched with pre-determined printed dots using the same procedure used to match them with pre-determined projected dots. A sample detected dot is selected along with its seven nearest neighbor dots; the dot is connected to its nearest neighbor with a dashed black line. d) All eight dots from (c) are then normalized according to the transformations needed to move the dot at the center to (0,0) and its nearest neighbor to (1,0). The same is done for a random pre-determined printed dot and its five nearest neighbor dots. In this case, the overlay of the two sets of dots shows a bad match, because the red lines that connect pairs of closest overlapping dots are long, indicating high error for this particular match between the two sets of dots. e) The best match of the set of dots in (c) with a subset of the dots from (b) occurs when the red error lines are barely visible. The matched dots in (e) are used to iteratively propagate matches across the list of all detected dots identified using image processing. f,g) The outcome of the propagated matching process is that the pre-determined dot positions (green) can be overlaid on the detected dot positions (blue) for both the printed dots (f) and the projected dots (g). For visualization purposes, the green dots displayed here are shown to the full extent of the printed or projected area, even if a matching dot was not detected with image processing.
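
The normalization at the core of (c)-(e) can be sketched in a few lines of Matlab. We assume each dot set is an n-by-2 matrix of (x, y) positions whose first row is the selected dot and whose second row is its nearest neighbor; the function names are ours and purely illustrative.

function N = normalizeDots(dots)
% Similarity transform that maps the selected dot (row 1) to (0,0) and
% its nearest neighbor (row 2) to (1,0); complex division applies the
% required translation, rotation, and scaling in one step.
    c = complex(dots(2,1) - dots(1,1), dots(2,2) - dots(1,2));
    z = complex(dots(:,1) - dots(1,1), dots(:,2) - dots(1,2)) / c;
    N = [real(z), imag(z)];
end

function e = matchError(A, B)
% Sum of distances from each normalized dot in A to its closest dot in B
% (the red error lines in Fig. 4d-e); a small e indicates a good match.
    e = 0;
    for i = 1:size(A, 1)
        e = e + min(sqrt(sum((B - A(i,:)).^2, 2)));
    end
end

The candidate pairing with the smallest matchError is accepted and then propagated to neighboring dots, as described in the caption.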
Fig. 5 Flow chart of the entire calibration method. At the beginning of the calibration process, the only known variables are the pre-determined printed and projected dot patterns; at the end, dots have been detected and matched, and all calibration parameters have been optimized. The variables that are known at each procedural calibration step are labeled in green throughout and summarized in the leftmost vertical set of boxes. The notation of select variables has been shortened in the figure; the full notation can be found in the equations referenced in the figure. a-e) Each procedural calibration step is shown in the next column of boxes, with the first three steps shown in greater detail in Figs. 2-4. e) The large box outlined in gray on the right side of the figure is a zoomed-in view of the full simultaneous calibration algorithm detailed in Section 2.6 of the text. Within this optimization, the calibration parameters are computed by minimizing the reprojection error of the 3D locations of the pre-determined dots compared to the detected dots. f) Partway through the full optimization (e), the positions of the detected dots are refined as detailed in Section 2.7 of the text. The key at the bottom is a quick reference, with further detail given in the text and summarized in Table 4. The meanings of superscripts and subscripts, which can be applied to multiple variables, are generalized by replacing the main variable with a dashed box.
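
For orientation, the following is a minimal sketch of how such a simultaneous optimization can be driven with Matlab's nonlinear least-squares solver. The helper functions unpackParams, transformBoard, and projectDots are hypothetical stand-ins for the projection model in the equations below, and the variable layout is illustrative only.

% theta0, lb, ub, Jp, detected, predetermined are assumed given.
% Trust-region-reflective nonlinear least squares (cf. Coleman & Li)
% with a sparse Jacobian pattern Jp (see Table 2):
opts  = optimoptions('lsqnonlin', 'Algorithm', 'trust-region-reflective', ...
                     'JacobPattern', Jp);
theta = lsqnonlin(@(t) reprojectionResiduals(t, detected, predetermined), ...
                  theta0, lb, ub, opts);

function res = reprojectionResiduals(theta, detected, predetermined)
% Stacked reprojection error over all devices and target positions;
% detected{i,k} holds the detected dot pixels of target position k in
% camera i, and predetermined{k} the known dot pattern of that target.
    [Kc, R, T, boards] = unpackParams(theta);   % intrinsics, extrinsics, poses
    res = [];
    for k = 1:numel(boards)
        X = transformBoard(predetermined{k}, boards{k});  % board -> world
        for i = 1:numel(Kc)
            P = projectDots(Kc{i}, R{i}, T{i}, X);        % world -> pixels
            res = [res; P(:) - detected{i,k}(:)];         % residual vector
        end
    end
end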
Fig. 6 Final calibration of the four cameras and four projectors that are calibrated simultaneously [Fig. 1]. Three different colors are used for the four camera-projector pairs (the two shades of blue in this figure serve only to graphically distinguish the two pairs that share the same blue projection color and filter). The 3D locations of the detected dots and the corresponding detected calibration target positions are displayed in the color corresponding to the camera in which the dots were detected. The coordinate system is relative to the first camera, subscript c1, and all units are in millimeters. The mean absolute value of the reprojection error is 0.094 mm, which is 0.023 % of the diameter of the calibration volume.
Fig. 7 We demonstrate the accuracy of our structured light system by automatically 3D reconstructing a sphere of known dimensions (structured light system: Fig. 1; calibration: Fig. 6). a) The 3D reconstructed surface of the sphere at each of the 8 locations in the calibrated volume [Fig. 6]; at each location, 8 separate imaging frames are plotted on top of each other. b,c) The best accuracy is achieved at the stripe intersections, because they correspond to two known projection planes; the remaining points are reconstructed from a single projection plane, which reduces accuracy. The mean ± s.d. error of the 3D surface reconstruction is −0.11 ± 0.38 mm for intersection points (b) and −0.25 ± 0.79 mm for all of the points combined (c). d) Surface reconstruction of the sphere in position 2 (Table 3) based on 8 imaging frames. Filled circles indicate 3D reconstructed stripe intersections of the hash pattern. e) Error in the reconstruction of the sphere in (d) based on the more accurate stripe intersections. f) Error in the reconstruction for all pixel positions along each stripe.
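
As an aside, a sphere-based accuracy check of this kind can be reproduced with a simple linear least-squares sphere fit. This sketch assumes XYZ is an n-by-3 matrix of reconstructed surface points in mm; it is our illustration, not necessarily the exact evaluation code.

% Fit a sphere: x^2 + y^2 + z^2 = 2*x0*x + 2*y0*y + 2*z0*z + (r^2 - |c|^2)
A = [2*XYZ, ones(size(XYZ, 1), 1)];
b = sum(XYZ.^2, 2);
w = A \ b;                                     % [x0; y0; z0; r^2 - |c|^2]
c = w(1:3)';                                   % fitted center
r = sqrt(w(4) + sum(c.^2));                    % fitted radius
rKnown = 82.55 / 2;                            % known sphere radius [mm]
radialError = sqrt(sum((XYZ - c).^2, 2)) - rKnown;  % signed error per point
fprintf('diameter error: %.2f mm; radial error: %.2f +/- %.2f mm\n', ...
        2*r - 2*rKnown, mean(radialError), std(radialError));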
Fig. 8 Snapshot of a preliminary 3200 fps 3D surface reconstruction of a Pacific Parrotlet during takeoff, flapping its morphing wings at 20 Hz. a) Color recording of the four projected color patterns on the bird (recorded with a 5th camera not shown in Fig. 1; Phantom LC-310; Vision Research, Wayne, NJ, USA). b) The images captured by the four cameras recording the structured light projected on the bird's surface (contrast enhanced). From left to right, the views show the camera looking down at the bird (blue light), up at the bird (blue light), at the right side of the bird (green light), and at the left side of the bird (red light). c) A preliminary 3D surface reconstruction of a downstroke of the Pacific Parrotlet in flight. The image-processing artifacts and gaps seen in these reconstructions are similar to others found in this region of the downstroke. The bottom surface of the bird (light blue) is reconstructed from frames delayed by 1/3200 s relative to the other camera-projector pairs, so that the two blue camera-projector pairs do not overlap. The third reconstructed frame in (c) corresponds to all of the frames in (a) and (b). The first three frames in (c) correspond to 73 %, 87 %, and 100 % of the first downstroke after takeoff, and the final frame corresponds to 13 % of the subsequent upstroke.

Tables (4)

Table 1 Calibration pattern descriptions. Each calibration pattern consists of uniform dots spread in a pseudo-random pattern, with locations determined by custom Matlab code. The printed dots are used for camera calibration, while the projected dots are used for both camera and projector calibration. There are two separate printed dot patterns, one for each side of the calibration target.
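
That custom code is not reproduced here, but a pseudo-random pattern with a minimum dot spacing can be generated along these lines; all values, including the pattern size, are placeholders.

% Illustrative pseudo-random dot layout via rejection sampling.
rng(1);                       % reproducible pattern
nDots = 300; minSep = 20;     % dot count and minimum separation [pixels]
W = 912; H = 1140;            % pattern resolution (placeholder values)
dots = zeros(0, 2);
while size(dots, 1) < nDots
    p = [rand*W, rand*H];     % candidate dot location
    if all(sqrt(sum((dots - p).^2, 2)) > minSep)
        dots(end+1, :) = p;   %#ok<SAGROW>
    end
end
% Note: naive rejection sampling; loosen minSep if nDots cannot be placed.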

Table 2 Summary of the variables that are optimized during simultaneous calibration of all cameras and projectors. These variables are optimized by minimizing the reprojection error function in Eq. (18) using a nonlinear least-squares solver. The number of variables optimized for our particular setup is listed for each category (cameras, projectors, and calibration target positions). The superscript a indicates that the rotation and translation of the first camera are fixed to the identity matrix and the origin, respectively, by design. Including more optimization variables increases computational cost, and for this particular calibration trial the primary expense lies in solving for the calibration target positions. While these calibration target positions are not the primary goal of the calibration, they are necessary building blocks for computing the primary parameters of interest: the camera and projector calibration variables.
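
Because each residual depends only on one device's parameters and one target position, the Jacobian seen by the solver is sparse [cf. Eqs. (30)-(32)], which is what keeps the target-position cost manageable. A hypothetical sketch of building such a sparsity pattern, with the index bookkeeping (camIdx, boardIdx, camCols, boardCols) assumed to be precomputed:

% Logical sparsity pattern for lsqnonlin's 'JacobPattern' option:
% residual m touches only the parameter columns of its camera (or
% camera-projector pair) and of its calibration target position.
Jp = sparse(nResiduals, nParams);
for m = 1:nResiduals
    Jp(m, camCols(camIdx(m), :))     = 1;   % device intrinsics + extrinsics
    Jp(m, boardCols(boardIdx(m), :)) = 1;   % target position pose terms
end
opts = optimoptions('lsqnonlin', 'JacobPattern', Jp);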

Table 3 Accuracy of the 3D reconstruction of a sphere of known dimensions (diameter 82.55 ± 0.13 mm), averaged over 8 imaging frames for each of the 8 locations in Fig. 7(a). Stripe intersection points are more accurate than the rest of the points because they are reconstructed from two projection planes rather than one. The accuracy is given as the mean error ± precision (s.d.). The absolute error is given in mm and the relative error as a percentage of the average calibration volume diameter of 406 mm.

Table 4 Variable definitions.

Equations (35)


(1) $S_b = n_b \,(\bar{E}_b)^2 \,(\bar{A}_b)$,
(2) $\min_{H} \left\| P H - \tilde{P} \right\|^2$,
(3) $K X = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \\ b_1 & b_2 & \cdots & b_n \\ c_1 & c_2 & \cdots & c_n \end{bmatrix} = \begin{bmatrix} a_1/c_1 & a_2/c_2 & \cdots & a_n/c_n \\ b_1/c_1 & b_2/c_2 & \cdots & b_n/c_n \\ 1 & 1 & \cdots & 1 \end{bmatrix} \mathrm{diag}(c_1, c_2, \ldots, c_n) = P \lambda$,
(4) $\min_{{}^{c_i}R_{b_{k:i}},\; {}^{c_i}T_{c_i b_{k:i}},\; K_{c_i}} \sum_{k=1}^{K} \left\| {}^{c_i b_k}\tilde{P}_{c_i} - {}^{c_i b_k}P_{c_i} \right\|^2$,
(5) $K = \begin{bmatrix} \alpha & 0 & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}$,
(6) ${}^{c_i b_k}P_{c_i}\,\lambda = K_{c_i}\,{}^{c_i b_k}X_{c_i} = K_{c_i}\left( {}^{c_i}R_{b_{k:i}}\,{}^{c_i b_k}X_{b_{k:i}} + {}^{c_i}T_{c_i b_{k:i}}\,J_{1,n} \right)$,
(7) $\min_{K_{p_i},\; {}^{p_i}R_{b_{k:i}},\; {}^{p_i}T_{b_{k:i}}} \sum_{k=1}^{K} \left\| {}^{c_i p_i b_k}\tilde{P}_{p_i} - {}^{c_i p_i b_k}P_{p_i} \right\|^2$,
(8) ${}^{c_i p_i b_k}\tilde{P}_{c_i}\,\lambda = K_{c_i}\,{}^{c_i p_i b_k}\tilde{X}_{c_i} = K_{c_i}\left( {}^{c_i}R_{b_{k:i}}\,{}^{c_i p_i b_k}\tilde{X}_{b_{k:i}} + {}^{c_i}T_{c_i b_{k:i}}\,J_{1,n} \right)$,
(9) ${}^{c_i p_i b_k}\tilde{P}_{c_i}\,\lambda = K_{c_i} \left[ {}^{c_i}R_{b_{k:i}}[:,1] \;\; {}^{c_i}R_{b_{k:i}}[:,2] \;\; {}^{c_i}T_{c_i b_{k:i}} \right] \begin{bmatrix} {}^{c_i p_i b_k}\tilde{X}_{b_{k:i}}[1,:] \\ {}^{c_i p_i b_k}\tilde{X}_{b_{k:i}}[2,:] \\ 1 \end{bmatrix}$,
(10) $\begin{bmatrix} {}^{c_i p_i b_k}\tilde{X}_{b_{k:i}}[1] \\ {}^{c_i p_i b_k}\tilde{X}_{b_{k:i}}[2] \\ 1 \end{bmatrix} = \left( K_{c_i} \left[ {}^{c_i}R_{b_{k:i}}[:,1] \;\; {}^{c_i}R_{b_{k:i}}[:,2] \;\; {}^{c_i}T_{c_i b_{k:i}} \right] \right)^{-1} \left( {}^{c_i p_i b_k}\tilde{P}_{c_i}\,\lambda \right)$.
(11) ${}^{c_i p_i b_k}\tilde{P}_{p_i}\,\lambda = K_{p_i}\,{}^{c_i p_i b_k}\tilde{X}_{p_i} = K_{p_i}\left( {}^{p_i}R_{b_{k:i}}\,{}^{c_i p_i b_k}\tilde{X}_{b_{k:i}} + {}^{p_i}T_{p_i b_{k:i}}\,J_{1,n} \right)$.
(12) $\min_{K_{c_i},\, K_{p_i},\, {}^{c_i}R_{p_i},\, {}^{c_i}T_{c_i p_i}} \sum_{k=1}^{K} \left\| {}^{c_i p_i b_k}\tilde{P}_{p_i} - {}^{c_i p_i b_k}P_{p_i} \right\|^2$,
(13) ${}^{c_i}R_{p_i} = {}^{c_i}R_{b_{k:i}} \left( {}^{p_i}R_{b_{k:i}} \right)^{T}$,
(14) ${}^{c_i}T_{c_i p_i} = {}^{c_i}T_{c_i b_{k:i}} - {}^{c_i}R_{p_i}\,{}^{p_i}T_{p_i b_{k:i}}$,
(15) $\min_{{}^{c_i}R_{c_j},\, {}^{c_i}T_{c_i c_j}} \sum_{k=1}^{K} \left\| {}^{c_i b_k}\tilde{P}_{c_i} - {}^{c_j b_k}\tilde{P}_{c_i} \right\|^2$.
(16) ${}^{c_i}R_{c_j} = {}^{c_i}R_{b_{k:i}} \left( {}^{c_j}R_{b_{k:i}} \right)^{T}$,
(17) ${}^{c_i}T_{c_i c_j} = {}^{c_i}T_{c_i b_{k:i}} - {}^{c_i}R_{c_j}\,{}^{c_j}T_{c_j b_{k:i}}$.
(18) $\min_{\text{all terms in Table 2}} \left( E + E' \right)$,
(19) $E = \sum_{i=1}^{I} \sum_{k=1}^{K} \left\| {}^{c_i b_k}\tilde{X}_{b_{1,k}} - {}^{c_i b_k}X_{b_{1,k}} \right\|^2$.
(20) $\begin{bmatrix} {}^{c_i b_k}\tilde{X}_{b_{1,k}}[1,:] \\ {}^{c_i b_k}\tilde{X}_{b_{1,k}}[2,:] \\ 1 \end{bmatrix} = \left[ R_F[:,1] \;\; R_F[:,2] \;\; T_F \right]^{-1} \left[ {}^{c_i b_k}\tilde{P}_{c_i} \right]_D$.
(21) $R_F = {}^{c_i}R_{w}\, \mathrm{rod}^{-1}\!\left( \mathrm{rod}\!\left( {}^{w}R_{b_{1,k}} \right) + {}^{B}R_{b_k} \left( \tfrac{f_r(i)}{F} + \tfrac{1}{2} \right) \right) R^{*}$,
(22) $T_F = {}^{c_i}R_{w} \left( {}^{w}T_{w b_{1,k}} + {}^{B}T_{b_k} \left( \tfrac{f_r(i)}{F} + \tfrac{1}{2} \right) - {}^{w}T_{w c_i} - T^{*} \right)$,
(23) $R^{*} = {}^{b_{2,k}}R_{b_{1,k}}(\theta)$,
(24) $T^{*} = {}^{w}R_{b_{1,k}}\, {}^{b_{2,k}}T_{b_{1,k} b_{1,k}}$,
(25) $\left[ {}^{c_i b_k}\tilde{P}_{c_i}[:,m] \right]_D = \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$,
(26) $\min_{x,\,y} \left\| {}^{c_i b_k}\tilde{P}_{c_i}[:,m]\,\lambda - K_{c_i}\, D(x,y) \right\|^2$,
(27) $D(x,y) = \begin{bmatrix} \left( 1 + d_{r1} r^2 + d_{r2} r^4 + d_{r3} r^6 \right) x + 2 d_{t1} x y + d_{t2} \left( r^2 + 2 x^2 \right) \\ \left( 1 + d_{r1} r^2 + d_{r2} r^4 + d_{r3} r^6 \right) y + 2 d_{t2} x y + d_{t1} \left( r^2 + 2 y^2 \right) \\ 1 \end{bmatrix}$,
(28) $K = \begin{bmatrix} \alpha & d_s \alpha & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}$.
(29) $E' = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} \left\| {}^{c_i p_j b_k}\tilde{X}_{b_{1,k}} - {}^{c_i p_j b_k}X_{b_{1,k}} \right\|^2$,
(30) $N_{\mathrm{Jacobian}} = 2 \left( (16 I - 6) + (16 J) + (12 K + 4) \right) \left( N + N' \right)$,
(31) $N_{\mathrm{Jacobian}}^{\mathrm{adj}} = 2 \left( 16 (N + N') - 6 N_{c_1} \right) + 2 \left( 16 N' \right) + 2 \left( 12 (N + N') + 4 N_{b_2} \right)$.
(32) $N_{\mathrm{Jacobian}}^{\mathrm{adj}} \approx 2 \left( 30 - \tfrac{6}{I} \right) (N + N') + 2 \left( 16 N' \right)$.
(33) $\left( u_{c_i},\, v_{c_i},\, I_{c_i}(u_{c_i}, v_{c_i}) \right) \rightarrow \left( a_{b_k/p_j},\, b_{b_k/p_j},\, I_{c_i}(u_{c_i}, v_{c_i}) \right)$.
(34) $\min_{a,\, b,\, R,\, c_1,\, c_2,\, c_3} \sum_{\text{dot pixels}} \left\| \frac{c_1}{1 + e^{\,c_3 \left( r(a,b) - R \right)}} + c_2 - I_{c_i}(u_{c_i}, v_{c_i}) \right\|^2$,
(35) $r(a,b) = \sqrt{ \left( a_{b_k/p_j} - a \right)^2 + \left( b_{b_k/p_j} - b \right)^2 }$.
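
As one way to implement the refinement in Eqs. (33)-(35), the radial sigmoid can be fit per dot with a generic least-squares solver; the initial guesses (a0, b0, R0, taken from the earlier ellipse fit) and the use of lsqnonlin are our assumptions, not a prescription.

% Fit I(u,v) ~ c1 ./ (1 + exp(c3*(r - R))) + c2, where r is the distance
% from pixel (u,v) to the dot center (a,b); uv is n-by-2, I is n-by-1.
sigmoid = @(q, uv) q(4) ./ (1 + exp(q(6) * ...
              (sqrt((uv(:,1) - q(1)).^2 + (uv(:,2) - q(2)).^2) - q(3)))) + q(5);
q0 = [a0, b0, R0, max(I) - min(I), min(I), 1];   % [a b R c1 c2 c3]
q  = lsqnonlin(@(q) sigmoid(q, uv) - I, q0);     % minimize Eq. (34)
subpixelCenter = q(1:2);                         % refined dot center (a, b)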
