Abstract

In this paper, we propose a new method for measuring the rotation angle of a rotor, named the 'visual encoder.' The method combines the principles of vision-based measurement and the optical encoder, and is realized with a high-speed vision system. The visual encoder offers non-contact operation, high resolution, and robustness against free motion and fluctuation of the rotation axis. A high resolution method that further increases the measurement resolution is also presented. The accuracy and robustness of the visual encoder were confirmed experimentally, and operation at 6,000 rpm was possible even under fluctuation of the rotation axis.

© 2016 Optical Society of America





Figures (10)

Fig. 1 Concept of the visual encoder. (a) The shape of the marker is based on the disc of an optical encoder, but R-G-B colors replace the repeating transparent-opaque pattern. (b) The rotation angle of the rotor is measured by the vision-based method. Tracking the image center of the marker keeps the position of the color-detecting point unique.
Fig. 2 Pattern design of the disc. (a) The pattern used on an incremental optical encoder; a two-phase type is generally used to detect both the rotation angle and the direction of rotation. (b) The color-gradient type; a reflection model of light is needed to measure the rotation angle accurately. (c) The switching RGB type; the three-state structure allows detection of the direction of rotation as well as the rotation angle.
Fig. 3 Flow from image capture to calculation of the rotation angle. (a) Determination process that keeps the position of the detecting point unique. A pixel shifted from the centroid of the object area in the image plane is inspected to detect its dominant color. An ROI is set to reduce the computational cost and thereby increase the sampling rate. (b) Extraction process to acquire the rotation variables. Ncnt and Nstate indicate the number of full turns and sub-turns of the rotor, respectively. The high resolution method can be used to enhance the measurement resolution of the rotation angle, with Nsub as the variable for that process. The rotation angle N is calculated from Ncnt and Nstate (and Nsub in the high resolution case). (c) Example of (a). (d) Example of (b).
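The determination step in Fig. 3(a) can be sketched as follows: the centroid of the marker area is obtained from the zeroth- and first-order image moments, and the dominant color is read at a pixel shifted from that centroid. This is a minimal sketch, not the authors' implementation; the function names, the boolean-mask input, and the fixed shift vector are all assumptions for illustration.

```python
import numpy as np

def centroid(mask):
    """Centroid of the marker area from image moments.

    mask: 2-D boolean array, True where the marker is detected.
    Returns (xc, yc) = (M10/M00, M01/M00).
    """
    ys, xs = np.nonzero(mask)
    m00 = xs.size                 # M00: object area in pixels
    if m00 == 0:
        raise ValueError("marker not found")
    return xs.sum() / m00, ys.sum() / m00

def dominant_color(image, cx, cy, shift):
    """Dominant RGB channel at a point shifted from the centroid.

    image: H x W x 3 array; (cx, cy): centroid; shift: (dx, dy)
    offset placing the detecting point on the color ring.
    """
    x = int(round(cx + shift[0]))
    y = int(round(cy + shift[1]))
    r, g, b = image[y, x]
    return "RGB"[int(np.argmax((r, g, b)))]
```

In practice the mask would come from thresholding the captured frame inside the ROI, which keeps the per-frame cost low enough for kilohertz sampling.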
Fig. 4 Relation between M and Vmax as Q varies. M is the number of color blocks on the disc, and Vmax is the maximum rotation speed of the rotor that the visual encoder can measure. Q is the frame rate of the camera (fps) used in the visual encoder. A higher Q allows a higher resolution in the measurement of the rotation angle.
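The trade-off in Fig. 4 follows directly from the relation Vmax [rpm] = Q/M × 60: the camera must observe every one of the M color blocks, so a finer disc pattern lowers the maximum measurable speed. A minimal sketch (the function name is an assumption, not from the paper):

```python
def v_max_rpm(q_fps, m_blocks):
    """Maximum measurable rotation speed of the visual encoder.

    q_fps: camera frame rate Q [frame/sec]
    m_blocks: number of color blocks M [frame/rev]
    Returns Vmax [rpm] = Q / M * 60.
    """
    return q_fps / m_blocks * 60.0
```

For example, a 1,000 fps camera with an 8-block disc gives Vmax = 7,500 rpm; doubling M to refine the angular resolution halves the measurable speed.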
Fig. 5 Testbed of the visual encoder with a real-time controller. The entire system operates at 1 kHz, which serves as both the sampling rate and the control frequency. (a) Visual encoder with a 1,000 fps high-speed camera. (b) Reference system with an optical encoder.
Fig. 6 Measurement of the rotation angle at constant rotor speed. (upper) Each dashed line indicates the reference angle measured by the optical encoder, and each continuous line the angle measured by the visual encoder, at each rotation speed [rpm]. (lower) Errors between the two measurements. The two red dashed lines indicate the designed resolution; as long as the errors stay between them, the measurement by the visual encoder is stable.
Fig. 7 Acceleration test to verify the maximum detectable speed. (upper) The rotation angle measured by the optical encoder (blue) as ground truth, and by the visual encoder (red). (middle) The reference rotor was rotated at constant acceleration. (lower) As the rotation speed increases, the error can exceed the tolerance at a certain moment (here, at 12.8 s), resulting in measurement failure (at 7,740 rpm).
Fig. 8 Motion of the rotation axis under fluctuation. When sinusoidal oscillation is introduced in the X and Y directions, the trajectory of the rotation axis draws a circle. The amplitude and frequency were 5 mm and 1 Hz, respectively. The fluctuation was generated intentionally to test the robustness of the visual encoder.
Fig. 9 Rotation angle and measurement error during the motion shown in Fig. 8. (upper) The rotation angle of the rotor measured by the visual encoder at 6,000 rpm. The magnified graph shows that the high resolution method improved the measurement accuracy. (lower) The measurement error was substantially reduced by the high resolution method.
Fig. 10 Sequential images of the color pattern on the disc during image processing. Images were taken at rotation speeds of (a) 1,000 rpm and (b) 6,000 rpm. A color-filled dot indicates the detecting point, and the two white dots represent SU and SL in Section 3.

Tables (1)

Table 1 Performance of High Resolution Method (HR)

Equations (7)


$$C(x_c, y_c) = \left( \frac{M_{10}}{M_{00}}, \frac{M_{01}}{M_{00}} \right)$$
$$M_{pq} = \sum_{x} \sum_{y} x^p y^q I(x, y)$$
$$N\,[\mathrm{rad}] = 2\pi \left\{ N_{\mathrm{cnt}} + \frac{N_{\mathrm{state}}}{M} \right\}$$
$$V_{\max}\,[\mathrm{rpm}] = \frac{Q\,[\mathrm{frame/sec}]}{M\,[\mathrm{frame/rev}]} \times 60\,[\mathrm{sec/min}]$$
$$\begin{bmatrix} x_s \\ y_s \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_d - x_c \\ y_d - y_c \end{bmatrix} + \begin{bmatrix} x_c \\ y_c \end{bmatrix}$$
$$N\,[\mathrm{rad}] = 2\pi \left\{ N_{\mathrm{cnt}} + \frac{N_{\mathrm{state}}}{M} + \frac{N_{\mathrm{dir}}}{M} \cdot \frac{\theta_U}{\theta_U + \theta_L} \right\}$$
$$N_{\mathrm{dir}} = \begin{cases} 1 & \text{if } N_i \ge N_{i-1} \\ -1 & \text{if } N_i < N_{i-1} \end{cases}$$
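The rotation-angle equations above combine the turn count Ncnt, the sub-turn state Nstate, and (optionally) the high-resolution interpolation term into a single value N. A minimal sketch of that combination, assuming hypothetical argument names not taken from the paper:

```python
import math

def rotation_angle(n_cnt, n_state, m, theta_u=None, theta_l=None, n_dir=1):
    """Rotation angle N [rad] from the extracted variables.

    n_cnt: number of full turns N_cnt
    n_state: sub-turn state index N_state (0 .. M-1)
    m: number of color blocks M on the disc
    theta_u, theta_l: angles to the upper/lower block boundaries; when
    both are given, the high-resolution term
    N_dir * (theta_U / (theta_U + theta_L)) / M is added.
    n_dir: +1 if the angle is increasing, -1 otherwise.
    """
    n = n_cnt + n_state / m
    if theta_u is not None and theta_l is not None:
        n += n_dir * (theta_u / (theta_u + theta_l)) / m
    return 2.0 * math.pi * n
```

Without the high-resolution term the resolution is one block (2π/M rad); the interpolation term subdivides the current block by the ratio of the boundary angles.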
