Optica Publishing Group

Endoscopic displacement measurement based on fiber optic bundles

Abstract

In-line monitoring and routine inspection are essential for using and maintaining complex equipment. Implementing visual positioning and displacement measurement simultaneously allows accurate acquisition of characteristics, including object dimensions and mechanical vibrations, while rapidly locking onto the target position. However, the internal structure of equipment is frequently obscured, making direct visual inspection challenging; flexible, bendable fiber optic endoscopes are therefore extremely valuable in harsh conditions. This study achieves all-fiber visual displacement measurement using a single-mode fiber and an imaging fiber bundle. Based on optical triangulation and a spot-center extraction method for fiber bundle images, a vertical precision of 0.07 mm at a measurement distance of 40.12 mm is achieved for rough objects. We demonstrate its surface reconstruction and vibration measurement functions. Factors that affect measurement accuracy, such as the light source and object roughness, are also discussed.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

In confined spaces and around complex mechanisms that rule out conventional measuring instruments, an endoscope is often the only effective means of industrial inspection. In recent years, industrial endoscopes have been widely applied in industrial science [1], equipment condition monitoring [2] and aerospace manufacturing [3]. Electronic endoscopes struggle to maintain good operating performance in environments with high temperatures and strong electromagnetic interference, while rigid optical endoscopes are inflexible, making it challenging to reach the interiors of complex equipment. Fiber optic endoscopes have attracted attention for their bendability and applicability in harsh environments. Fiber optic bundle (FB)-based endoscopes can transmit two-dimensional (2D) images directly back from the distal end, without the mechanical scanning elements or complicated algorithms otherwise needed for in-line monitoring and routine inspection of machine interiors [4,5].

During industrial inspection, several important parameters to be detected are related to displacement. Microdisplacement measurements allow effective monitoring of changes in surface morphology and vibration. Obtaining the absolute distance from the probe to the detection surface helps in recovering the object's pose and in correcting the acquisition models for physical quantities such as temperature and velocity. A number of optical fiber displacement sensors [6–9] have been developed to enable in-line displacement measurements. The accuracy of fiber-optic displacement sensing systems based on the time-of-flight (TOF) method [8,10] is limited by the detector bandwidth. Intensity-based fiber-optic sensors [11–18] and laser interferometers [19–21] offer sub-micron measurement accuracy, but they are better suited to mirror-like objects. Laser Doppler profile sensors [22,23] achieve nanometer-level accuracy on rough objects, but the system setup is more complex and susceptible to environmental influence. Moreover, with these fiber displacement sensors it is difficult to monitor the measurement position in real time. Several active optical measurement methods have also been applied to fiber optic endoscopes for visualization. Fiber-optic endoscopes based on time-of-flight methods [24,25] and light field imaging [26] can achieve video-rate 3D imaging and obtain vertical displacement information, but usually require complex detection equipment or long measurement times. Depth information can also be obtained using binocular structures and algorithms for objects with distinct features [27,28]. Shape from focus [29] is an effective approach for rough surfaces, but the inter-core crosstalk of the FB and the honeycomb structure in the images greatly affect the local depth calculation.

Laser-triangulation-based displacement sensors [13,30,31] are widely used for measuring rough surfaces owing to their non-contact operation, high speed, high accuracy and low cost. Many studies have sought to improve their performance, for example by increasing the measurement accuracy [32], extending the measurement range [33] and broadening the range of measurable materials [34]. In addition, optimization methods have been proposed to eliminate sources of measurement error, including laser power fluctuation, scattering noise and surface roughness [35–37]. However, these system structures and optimization methods have usually only been proven effective at large spatial scales. Applying laser triangulation to an FB-based endoscope for sensing and measurement introduces new challenges for spot extraction, such as undersampling and inter-core coupling [38–40].

In this work, we designed an all-fiber endoscopic displacement sensor to achieve in-line displacement measurements of rough objects in narrow areas and evaluated its measurement accuracy. Compared to traditional triangulation systems, our probe is at least an order of magnitude smaller while maintaining a large dynamic range. Its surface reconstruction and vibration measurement capabilities were demonstrated. Visualizing the target position facilitates multidimensional monitoring. Implementing displacement measurement in a flexible FB-based endoscope is extremely valuable for industrial inspection, including defect detection of engine turbine blades, positioning of critical parts in complex equipment or pipelines, and self-awareness of endoscopic robots.

2. Methods

2.1 Experimental setup

Figure 1(a) shows a diagram of the FB-based experimental setup used to achieve distal displacement measurements. The measurement light is generated by a semiconductor laser (λ = 655 nm, MRL-III-655L), transmitted through a single-mode fiber, and irradiated onto the object surface through a distal collimation device (Thorlabs, CFS2-532-FC) of the endoscopic probe. A gradient-index (GRIN) lens (Edmund Optics) is combined with an FB (FUJIKURA, FIGH-03-215S) to form a relay-receiving system that collects the light scattered by the object's surface. A collimator (Thorlabs, PAF-X-2-A) projects the proximal pattern of the FB onto an industrial camera (Mindvision U500-C) at the proximal end. The acquired pattern comprises a 2D visual image and a light spot carrying displacement information (marked with an orange circle). The commercial FB contains about 3000 cores within a 215 µm diameter and an effective imaging area approximately 190 µm in diameter, with a core-to-core spacing of 3.3 µm. Figure 1(b) shows the probe's specific structure. The cylindrical GRIN lens has a diameter of 1 mm and an effective focal length of 0.92 mm, keeping objects in focus over a wide range. The collimating exit and acquisition modules are fixed by a clamping structure with an outer diameter of 10 mm, fabricated by three-dimensional (3D) printing. The front segments of the two modules are aligned at a fixed clamping angle.

Fig. 1. Diagram of the FB endoscopic displacement measurement based on the triangulation method. (a) An experimental diagram of the displacement sensor based on the FB endoscope. (b) The structure of the endoscopic probe. A collimating lens (CL) is connected to the front end of single-mode fiber (SM) for collimating the outgoing light. A gradient index (GRIN) lens is attached to the end of the fiber optic bundles (FB) for long-range imaging. (c) A diagram of the triangulation principle.

2.2 Laser triangulation principle

The oblique configuration of laser triangulation is used in this study for displacement measurement to reduce the effect of directly reflected light [2]. The measurement principle is as follows: the laser transmitter, acquisition device and object lie in the same plane (Fig. 1(c)), and the imaging objective is focused on the object's surface. The transmitter emits an approximately collimated beam directed at the surface of the object. Owing to diffuse reflection, the light is received by the imaging lens at a certain angle and forms a spot on the distal end of the FB. The spot is transmitted through the FB back to the near end and captured by the image sensor. When the object moves, the spot position on the distal face of the FB shifts, which in turn shifts the spot position on the image sensor. By similar triangles, the distance from the object to the front of the probe can be calculated from the laser spot position on the complementary metal-oxide-semiconductor (CMOS) sensor:

$$d = \frac{f \times s}{\frac{f}{\tan ({90^\circ - \beta})} + \frac{l}{2} - \frac{l}{X} \times X_{\text{pixel}}},$$
where d is the distance from the object to the front of the measurement probe, f is the focal length of the imaging objective, β is the angle between the laser emitter and vertical direction, s is the distance from the laser emitter’s center (A) to the imaging lens’s center (O), l is the diameter of the FB, and Xpixel is the X-axis center pixel position of the spot in the recorded image. X is the total number of pixels in the X direction on the image sensor. The dashed line CD is the line passing through point O parallel to the emitted light and intersecting the FB surface at point C (Fig. 1(c)).
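As a quick sanity check of this geometry, the mapping from spot pixel position to distance can be sketched in a few lines. The parameter values below (f, s, β, l, X) are taken from the probe described elsewhere in the text and are illustrative nominal values, not the authors' calibrated geometry:

```python
import math

def triangulation_distance(x_pixel, f=0.92, s=4.5, beta_deg=2.5,
                           l=0.215, x_total=1280):
    """Distance d (mm) from the probe front to the object, computed from
    the spot's X-axis pixel position via the triangulation formula.
    Units: f, s, l in mm; beta in degrees; x_pixel in sensor pixels."""
    beta = math.radians(beta_deg)
    denominator = (f / math.tan(math.pi / 2 - beta)   # f / tan(90° - beta)
                   + l / 2
                   - (l / x_total) * x_pixel)         # (l/X) * X_pixel
    return f * s / denominator
```

With these nominal parameters the readout is strongly nonlinear: the computed distance grows from roughly 28 mm at pixel 0 toward infinity as the spot approaches the pixel where the denominator vanishes, which is why calibration over the working subrange (Section 3.1) is essential.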

The choice of structural parameters significantly influences the measurement resolution and range of the triangulation rangefinder. As in conventional triangulation-based displacement sensors, there is a tradeoff between the dynamic measurement range and the measurement accuracy. For fixed system parameters, a closer measurement position yields higher measurement accuracy, owing to the geometric relationship in Fig. 1(c). An appropriate increase in β improves the displacement resolution but reduces the measurement range; conversely, a smaller β yields a larger measurement range. Increasing s benefits both the measurement accuracy and the measurement range. To meet the application requirements of narrow spaces and a large dynamic range, we designed an FB-based triangulation probe with s = 4.5 mm and β = 2.5°, an order of magnitude smaller than conventional triangulation systems.
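These qualitative claims can be checked numerically. The sketch below evaluates, for a given emitter-receiver spacing s, the closest readable distance (the reading at pixel 0) and the depth change corresponding to one sensor pixel at a reference distance, obtained by differentiating the triangulation formula; all parameter values are illustrative assumptions, not the authors' calibrated design:

```python
import math

def range_and_resolution(s, f=0.92, beta_deg=2.5, l=0.215,
                         x_total=1280, d_ref=50.0):
    """Return (d_min, depth_step): the distance read at pixel 0 and the
    per-pixel depth increment at d_ref (mm), from d = f*s / (B - c*x)
    with B = f*tan(beta) + l/2 and c = l/X, so dd/dx = c*d**2/(f*s)."""
    beta = math.radians(beta_deg)
    b = f * math.tan(beta) + l / 2
    d_min = f * s / b                                   # reading at pixel 0
    depth_step = (l / x_total) * d_ref ** 2 / (f * s)   # mm per pixel at d_ref
    return d_min, depth_step
```

Doubling s doubles the distance read at every pixel (and hence the absolute range swept across the sensor) while halving the per-pixel depth step at a fixed distance, consistent with the statement that increasing s benefits both range and accuracy.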

2.3 Spot center extraction algorithm for FB images

Image acquisition through an FB differs from that of conventional image sensors, whose pixels are neatly arranged with small spacing and negligible crosstalk. An FB has a finite spacing between cores, and only the power coupled into the cores is received and transmitted to the near end. This incomplete sampling makes it challenging to lock onto the spot boundaries directly and locate the spot center accurately. If the core spacing alone is taken as the minimum resolution of the spot boundary at the acquisition end, the measurement accuracy of FB-based triangulation is poor. Although frequency-domain, Gaussian and interpolation filtering can remove the structural noise, they do not improve the image resolution and they destroy the original spatial information. A small core spacing yields higher resolution, but it causes power crosstalk between adjacent cores, which alters the intensity distribution of the spot at the near end. Moreover, because light propagates multimodally within each core, the detected spot is frequently a speckle with uneven distribution, making it even more difficult to extract the spot center from FB images.

This study proposes an uncomplicated solution, whose flow is shown in Fig. 2(a). The spot pattern collected in real time is binarized after background correction. The processing is based on the Otsu method [41], a widely used threshold-selection algorithm for image segmentation that is computationally simple and independent of image brightness and contrast. The binarized image can be processed directly with the center-of-mass algorithm to obtain the spot center, and morphological processing can be used to obtain the closest contour of the spot. Figure 2(a) displays the original acquired images and the extracted center-of-mass positions after processing at two locations 250 µm apart. A small vertical displacement of the object causes a lateral shift of the measured spot; as a result, the propagation modes of light within each fiber core change. Using the fiber bundle as a relay therefore allows a lateral resolution beyond the core-spacing limit. Furthermore, based on Anderson localization theory [42–44], even light initially coupled into the cladding is confined to a certain region and eventually detected, owing to the irregular shapes of the fiber cores. Our method thus breaks the restriction that the core spacing places on the minimum detectable displacement of the spot.
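The binarize-then-centroid pipeline described above can be sketched as follows. This is a minimal NumPy reimplementation under our own assumptions (8-bit grayscale input with background correction already applied), not the authors' code; the morphological contour refinement is omitted:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method on an 8-bit grayscale image: pick the threshold that
    maximizes the between-class variance of the intensity histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # class-0 probability up to level k
    mu = np.cumsum(p * np.arange(256))    # class-0 cumulative mean
    mu_t = mu[-1]                         # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0
    return int(np.argmax(sigma_b))

def spot_center(img):
    """Binarize the FB spot pattern and return its center of mass
    (row, col) with sub-pixel coordinates."""
    mask = (img > otsu_threshold(img)).astype(float)
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

Even though the mask itself is binary, the centroid is returned with sub-pixel coordinates, which is the property exploited here to beat the core-spacing limit.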

Fig. 2. Spot center extraction algorithm for FB images. (a) A flow chart of the FB-based spot center extraction algorithm. Original maps of detected spots at different positions and their center-of-mass position. (b) The sensor’s vertical resolution at d = 50 mm and d = 100 mm with the limitations of the FB’s inherent structural (C-B RS) and after algorithmic processing (P-B RS).

The minimum detectable lateral displacement of the spot center in the FB output pattern is related to the magnification of the collimating lens and pixel density of the image sensor. In our setup, the theoretical minimum resolvable depth displacement is 0.08 mm at d = 50 mm and 0.34 mm at d = 100 mm (Fig. 2(b)). Our pixel-based method improves the measurement resolution by an order of magnitude compared with the core-based method, which uses the fiber core in the FB as the smallest detection unit. Figure 2(b) shows the system resolution of our method (P-B RS) and measurements based on the intrinsic core spacing of FB (C-B RS) for different β and s values at d = 50 mm and d = 100 mm, respectively.
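The scaling of these numbers with distance can be traced back to the triangulation formula. Writing it as d = fs/(B − cX_pixel), with B = f/tan(90° − β) + l/2 and c = l/X, differentiation gives the depth change per sensor pixel:

$$\frac{\partial d}{\partial X_{\text{pixel}}} = \frac{fs\,c}{{({B - cX_{\text{pixel}}})}^2} = \frac{c\,d^2}{fs}.$$

With the nominal values f = 0.92 mm, s = 4.5 mm, l = 0.215 mm and X = 1280 (our assumptions; the calibrated geometry may differ slightly), this evaluates to roughly 0.1 mm per pixel at d = 50 mm and 0.4 mm at d = 100 mm, the same order as the 0.08 mm and 0.34 mm quoted above; the quadratic dependence on d explains the roughly fourfold degradation between the two distances.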

3. Results and discussions

3.1 Calibration and testing

The vertical measurement resolution of the FB-based displacement sensor depends on several parameters, including the spacing, the misalignment and the clamping angle between the transmitter and receiver front ends. Considering the limitation on probe size and the requirement of a large measurement range for in-tube measurement, we set s = 4.5 mm and β = 2.5° and constructed an FB displacement sensor. An image sensor with a resolution of 1280 × 960 pixels recorded the spot position and image. Because the machining and assembly accuracy of the mechanical workpiece is limited, the displacement sensor must be calibrated. Calibration is performed against a board covered with white paper, recording the position of the measuring spot in the image at known distances (sampled from 40 to 125 mm). In Fig. 3(a), the blue dots represent the values measured during calibration and the red curve is the theoretical value for the set parameters. When the spot center is far from the sensor center, there is some deviation between the two, probably caused by aberration of the GRIN lens. The effect of such deviations can be eliminated by aberration-corrected lens sets or by calibration; some machining error is also considered acceptable. When using this sensor for measurements, the calibrated curve takes precedence and the theoretical parameter curve is for reference only.
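Given the hyperbolic form of the theoretical curve, the calibration step can be sketched as a two-parameter least-squares fit of d = a/(b − x) to the recorded (pixel, distance) pairs; the model is linear in 1/d, so no iterative optimizer is needed. This is an illustrative fit under our own assumptions, not the authors' calibration routine:

```python
import numpy as np

def calibrate(x_pixels, distances):
    """Fit d = a / (b - x) to calibration samples by linear least squares
    on the linearized model 1/d = b/a - x/a."""
    slope, intercept = np.polyfit(x_pixels, 1.0 / np.asarray(distances), 1)
    a = -1.0 / slope        # slope of 1/d vs x is -1/a
    b = intercept * a       # intercept of 1/d vs x is b/a
    return a, b

def predict(x, a, b):
    """Distance predicted by the fitted calibration curve."""
    return a / (b - x)
```

In practice the authors calibrate directly against known distances, which also absorbs lens aberration and assembly errors that this simple two-parameter model cannot capture.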

Fig. 3. Calibration and evaluation of FB displacement sensors. (a) A comparison of theoretical and measured values with set parameters. The blue dots represent the measured values during calibration. The red curve is the theoretical value of the set parameters. (b) Measurement accuracy at different distances. The above two figures show the results of 50 repeated measurements at d = 40 mm and d = 125.5 mm, respectively. (c) The maximum deviation and corresponding error measured at other distances.

Using the calibrated curve, we evaluated the measurement accuracy of the FB displacement sensor over the range 40.12–125.76 mm, conducting fifty repeated measurements at each displacement position. Figure 3(b) shows the results of 50 measurements at d = 40.12 mm and d = 125.76 mm, with standard deviations of 0.07 and 1.04 mm, respectively; the corresponding maximum deviations are 0.28 and 1.9 mm. In theory, the probe can even measure displacements in excess of 400 mm. However, as the measurement distance increases, the measurement deviation also increases (Fig. 3(c)) owing to the tradeoff between accuracy and dynamic range. Within the range 40.12–125.76 mm, the measurement error remains within 1.9%. Nearly linear behavior and higher accuracy can be obtained when a measurement subinterval is selected.
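The repeatability figures quoted here (standard deviation, maximum deviation and percentage error) amount to simple statistics over the repeated readings at each position; a minimal bookkeeping sketch, with made-up example readings:

```python
import numpy as np

def repeatability(measurements, nominal):
    """Sample standard deviation, worst-case absolute deviation from the
    nominal distance, and that deviation as a percentage of the nominal."""
    m = np.asarray(measurements, dtype=float)
    std = m.std(ddof=1)                       # sample standard deviation
    max_dev = np.abs(m - nominal).max()       # worst-case excursion
    pct_err = 100.0 * max_dev / nominal
    return std, max_dev, pct_err
```

For example, readings spread around a nominal 40.12 mm with a worst-case excursion of 0.28 mm correspond to a maximum error of about 0.7%.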

Note that our fiber optic displacement sensor is suitable for large-range displacement measurement of rough object surfaces, which is difficult for traditional fiber optic displacement sensors. For rough surfaces, shape profilers based on triangulation [45,46] offer higher accuracy, but their measurement range is usually limited to a few tens of millimeters. Free-space triangulation laser rangefinders [2] have a large measuring range, but they have difficulty reaching narrow areas and maintaining good performance in harsh environments. Beyond optimizing the system's structural parameters, some of the schemes and algorithms used in traditional triangulation systems [30–37], such as the addition of diffraction gratings, are applicable to improving the accuracy and measurement range of our fiber optic sensor. In addition, FBs with more tightly packed, crosstalk-free cores can be used to improve measurement accuracy.

3.2 Surface reconstruction and vibration measurement

With displacement measurement, various functions can be realized in a narrow area. We illustrate the application value of our work with 3D surface profile measurement as a typical example. When observing a laterally moving object, we can obtain the object's 2D reflectance information through the endoscope and realize 3D surface profile measurements. Figure 4(a) shows the results of imaging the surface of a 3D-printed structure; the stepped shape is reconstructed using the FB displacement sensor, and the red dotted line marks the scanning area. The red scatter in Fig. 4(b) represents the measured displacements, and the solid blue lines show the mean values (40.24, 44.07 and 50.01 mm) measured on the different steps, with standard deviations of 0.11, 0.21 and 0.16 mm, respectively. Because of the sensor's large dynamic displacement range, real-time measurement can be performed without much prior knowledge of the target while the target is being observed, and precise surface reconstruction can be achieved once the target position is locked.
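Turning a line scan over the stepped object into per-step statistics only requires segmenting the displacement trace at the jumps. A minimal sketch, assuming the step heights are well above the measurement noise so that a fixed jump threshold suffices:

```python
import numpy as np

def segment_steps(profile, jump=1.0):
    """Split a 1D scan of measured distances into steps wherever the
    point-to-point change exceeds `jump` (mm); return (mean, std) per step."""
    profile = np.asarray(profile, dtype=float)
    breaks = np.nonzero(np.abs(np.diff(profile)) > jump)[0] + 1
    return [(seg.mean(), seg.std()) for seg in np.split(profile, breaks)]
```

The per-step means and standard deviations reported in Fig. 4(b) are statistics of exactly this kind, computed over the samples falling on each step.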

Fig. 4. Application of FB displacement sensor in surface reconstruction and vibration measurement. (a) Surface reconstruction of stepped objects. (b) Analysis of surface measurement. Red Dot: measured value on different steps. Blue lines: mean of measurements. (c) Displacement measurement results at different vibration speeds: v = 1.25 mm/s and v = 0.625 mm/s.

Another outstanding advantage of triangulation is its capability for high-speed measurement: the measurement rate depends only on the frame rate of the image sensor. Fast measurement makes it easy to capture even slight vibrations of an object. Here, we used a fast-moving motorized displacement stage to simulate a vibrating object with a displacement amplitude of 1 mm, with the image sensor's frame rate set to 5 frames per second. Figure 4(c) shows the displacement information, with approximately linear accuracy, recorded at displacement speeds of 1.25 mm/s and 0.625 mm/s. The amplitude and frequency of the vibration can be obtained quickly by means of the Fourier transform or in combination with signal processing methods such as curve fitting. In addition, the visualization window allows the target position to be locked rapidly in real time, which helps eliminate the effect of measurement-position offset on the results. High-speed displacement measurement is useful for real-time vibration measurement and abnormal-vibration monitoring.
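Extracting the vibration frequency and amplitude from the recorded displacement trace via the Fourier transform, as suggested above, can be sketched as follows, assuming uniform sampling at the camera frame rate and a vibration frequency below the Nyquist limit of fs/2:

```python
import numpy as np

def dominant_vibration(displacement, fs):
    """Estimate the dominant vibration frequency (Hz) and amplitude (mm)
    of a uniformly sampled displacement trace via the discrete Fourier
    transform; fs is the camera frame rate in frames per second."""
    x = np.asarray(displacement, dtype=float)
    x = x - x.mean()                       # remove the DC (standoff) offset
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    k = np.argmax(np.abs(spec[1:])) + 1    # strongest non-DC bin
    amplitude = 2.0 * np.abs(spec[k]) / x.size
    return freqs[k], amplitude
```

At the 5 fps used here the Nyquist limit is 2.5 Hz, so faster vibrations would require a higher frame rate or aliasing-aware processing.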

3.3 Influence of light source and surface roughness

When a laser is used to measure objects with rough surfaces, surface unevenness often causes the shape and intensity distribution of the reflected spot to vary significantly, which hinders locating the spot center. Incoherent light helps reduce these effects and makes the detected spot intensity distribution more uniform: despite aberrations, the outgoing spot profile remains centrosymmetric and speckle is suppressed. Note, however, that our method exploits inter-core power coupling to obtain a more uniform detected spot. When laser light propagates in an FB, adjacent cores can be regarded as oscillating couplers; differing coupling lengths and efficiencies can cause drastic changes in the spot's intensity distribution at the detection end and hence measurement errors. Incoherent light is more readily confined to the cores (Fig. 5(b)) [26]. In practice, this also effectively reduces the impact of vibration and other disturbances during transmission, because the inter-core coupling effect is attenuated. The intensity confined to the cores is then more sharply defined, which is detrimental to the resolution of our method but improves the stability of the measurement. Because our method requires a boundary relative to the background for acquiring the spot center, the power in the cladding is also used in calculating the threshold. We evaluated broadband light for displacement measurement; it exhibited a vertical resolution better than 0.25 mm, exceeding the resolution achieved when a fiber core is used as the minimum detection unit. Figure 5(a) shows the transverse coordinates of the spot centers and the corresponding distances, sampled at 250-µm intervals; the linear fit coefficient R is as high as 0.9978. Figure 5(b) shows the raw and binarized patterns for two spot positions 250 µm apart. Although the intensity is more restricted to the fiber cores, the weaker inter-core coupling and the multimode transmission within each core still ensure accurate binarization thresholds and sub-pixel detection and processing.

Fig. 5. Influence of light source and surface roughness on measurement accuracy. (a) Displacement testing in steps of 250 µm using broadband light. (b) Measured spot and binarized results at two positions 250 µm apart. (c) Calibration curves for objects of varied materials. (d) Deviation of objects of varied materials at different displacements. (e) Original images of the measured spots for three kinds of objects at d = 50 mm.

The FB receives the light scattered from the measured surface; when the surface's inclination is ignored, its characteristics, such as roughness and reflectivity, influence the measurement accuracy of the FB sensor. In descending order of roughness, the tested surfaces are: machined aluminium, black PVC tape and A4 white paper. In descending order of reflectivity, they are: A4 white paper, machined aluminium and black PVC tape. The spots detected from objects of different roughness and reflectivity at the same measurement position are shown in Fig. 5(e); the test light source power and image sensor parameters were kept identical. Figure 5(c) shows the measurement results for the different objects, sampled at 5 mm intervals between d = 40 mm and d = 90 mm, and Fig. 5(d) shows the deviation from the theoretical value at different displacements. The average pixel deviations, representing the deviation of the measured value from the theoretical value, were 12.0, 8.2 and 10.4 pixels for A4 white paper, black PVC tape and machined aluminium, with standard deviations of 5.7, 6.1 and 7.0 pixels, respectively. Despite the oblique configuration, specular light can still affect the measurement accuracy: a higher reflectance results in a larger average pixel deviation. Although our method uses an FB as a relay rather than detecting the spot directly, roughness also affects the measurement accuracy. As in conventional triangulation, larger roughness makes the spot more inhomogeneous and increases the measurement standard deviation. Several methods [28,29] have been proposed to improve the triangulation accuracy of displacement measurements, and these are also applicable to our FB-based displacement sensor.

4. Conclusion

Applying triangulation to FB-based fiber optic displacement sensors affords a large measurement range for rough objects while maintaining a small size and flexibility. An FB-based displacement measurement probe was fabricated, enabling displacement measurements without affecting visualization. The measurement error is less than 1.9% over a measurement range of 40.12–125.76 mm. Two typical application scenarios demonstrated the large measuring range and fast measurement for rough objects, which supports the use of this new measurement tool in challenging settings such as in-line measurement of engine turbine blades and the positioning and self-awareness of endoscopic robots. Several approaches to improving the sensor's accuracy have also been proposed, including the use of broadband light sources and more tightly packed fiber bundles. Future work will focus on its performance in specific scenarios and on further accuracy improvements. The displacement sensor achieves remarkable accuracy for rough objects and has great potential for application in industrial equipment testing in harsh environments.

Funding

National Natural Science Foundation of China (61925502, 62135007).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. R. Dhawan, N. Kawade, and B. Dikshit, “Design and performance of a laser-based compact position sensor for long standoff distance,” IEEE Sens. J. 18(16), 6557–6562 (2018). [CrossRef]  

2. B. Sun and B. Li, “Laser displacement sensor in the application of aero-engine blade measurement,” IEEE Sens. J. 16(5), 1377–1384 (2016). [CrossRef]  

3. A. Maekawa, M. Noda, and M. Shintani, “Experimental study on a noncontact method using laser displacement sensors to measure vibration stress in piping systems,” Measurement 79, 101–111 (2016). [CrossRef]  

4. A. Pösch, J. Schlobohm, S. Matthias, and E. Reithmeier, “Rigid and flexible endoscopes for three-dimensional measurement of inside machine parts using fringe projection,” Optics and Lasers in Engineering 89, 178–183 (2017). [CrossRef]  

5. D. Wang, H. Liu, and X. Cheng, “A miniature binocular endoscope with local feature matching and stereo matching for 3D measurement and 3D reconstruction,” Sensors 18(7), 2243 (2018). [CrossRef]  

6. P. J. Boltryk, M. Hill, J. W. McBride, and A. Nasce, “A comparison of precision optical displacement sensors for the 3D measurement of complex surface profiles,” Sensors and Actuators A: Physical 142(1), 2–11 (2008). [CrossRef]  

7. G. Berkovic and E. Shafir, “Optical methods for distance and displacement measurements,” Adv. Opt. Photonics 4(4), 441–471 (2012). [CrossRef]  

8. Y. Na, C. Jeon, C. Ahn, M. Hyun, D. Kwon, J. Shin, and J. Kim, “Ultrafast, sub-nanometre-precision and multifunctional time-of-flight detection,” Nat. Photonics 14(6), 355–360 (2020). [CrossRef]  

9. I. Coddington, W. C. Swann, L. Nenadovic, and N. R. Newbury, “Rapid and precise absolute distance measurements at long range,” Nat. Photonics 3(6), 351–356 (2009). [CrossRef]  

10. S. Foix, G. Alenya, and C. Torras, “Lock-in Time-of-Flight (ToF) Cameras: A Survey,” IEEE Sens. J. 11(9), 1917–1926 (2011). [CrossRef]  

11. H. Golnabi and P. Azimi, “Design and operation of a double-fiber displacement sensor,” Opt. Commun. 281(4), 614–620 (2008). [CrossRef]  

12. W. H. Ko, K. M. Chang, and G. J. Hwang, “A fiber-optic reflective displacement micrometer,” Sensors and Actuators A: Physical 49(1-2), 51–55 (1995). [CrossRef]  

13. Y. Libo and Q. Anping, “Fiber-optic diaphragm pressure sensor with automatic intensity compensation,” Sensors and Actuators A: Physical 28(1), 29–33 (1991). [CrossRef]  

14. S. Xie, X. Zhang, B. Wu, and Y. Xiong, “Output characteristics of two-circle coaxial optical fiber bundle with regard to three-dimensional tip clearance,” Opt. Express 26(19), 25244–25256 (2018). [CrossRef]  

15. A. G. Leal-Junior, A. Frizera, C. Marques, M. R. A. Sánchez, W. M. dos Santos, A. A. G. Siqueira, M. V. Segatto, and M. J. Pontes, “Polymer Optical Fiber for Angle and Torque Measurements of a Series Elastic Actuator's Spring,” J. Lightwave Technol. 36(9), 1698–1705 (2018). [CrossRef]  

16. A. G. Leal-Junior, C. R. Díaz, C. Marques, M. J. Pontes, and A. Frizera, “Multiplexing technique for quasi-distributed sensors arrays in polymer optical fiber intensity variation-based sensors,” Opt. Laser Technol. 111, 81–88 (2019). [CrossRef]  

17. N. D. Acha, A. B. Socorro-Leránoz, C. Elosúa, and I. R. Matías, “Trends in the Design of Intensity-Based Optical Fiber Biosensors (2010–2020),” Biosensors 11(6), 197 (2021). [CrossRef]  

18. M. Loyez, M. Lobry, E. M. Hassan, M. C. DeRosa, C. Caucheteur, and R. Wattiez, “HER2 breast cancer biomarker detection using a sandwich optical fiber assay,” Talanta 221, 121452 (2021). [CrossRef]  

19. D. Rugar, H. J. Mamin, and P. Guethner, “Improved fiber–optic interferometer for atomic force microscopy,” Appl. Phys. Lett. 55(25), 2588–2590 (1989). [CrossRef]  

20. C. A. Regal, J. D. Teufel, and K. W. Lehnert, “Measuring nanomechanical motion with a microwave cavity interferometer,” Nature Phys 4(7), 555–560 (2008). [CrossRef]  

21. F. Mammano and J. F. Ashmore, “Reverse transduction measured in the isolated cochlea by laser michelson interferometry,” Nature 365(6449), 838–841 (1993). [CrossRef]  

22. P. Günther, R. Kuschmierz, T. Pfister, and J. W. Czarske, “Displacement, distance, and shape measurements of fast-rotating rough objects by two mutually tilted interference fringe systems,” J. Opt. Soc. Am. A 30(5), 825–830 (2013). [CrossRef]  

23. T. Pfister, L. Büttner, and J. Czarske, “Laser Doppler profile sensor with sub-micrometre position resolution for velocity and absolute radius measurements of rotating objects,” Meas. Sci. Technol. 16(3), 627–641 (2005). [CrossRef]  

24. J. Lee, Y. Kim, K. Lee, S. Lee, and S. Kim, “Time-of-flight measurement with femtosecond light pulses,” Nat. Photonics 4(10), 716–720 (2010). [CrossRef]  

25. D. Stellinga, D. B. Phillips, S. P. Mekhail, A. Selyem, S. Turtaev, T. Čižmár, and M. J. Padgett, “Time-of-flight 3D imaging through multimode optical fibers,” Science 374(6573), 1395–1399 (2021). [CrossRef]  

26. A. Orth, M. Ploschner, E. R. Wilson, I. S. Maksymov, and B. C. Gibson, “Optical fiber bundles: ultra-slim light field imaging probes,” Sci. Adv. 5(4), 1–10 (2019). [CrossRef]  

27. A. Geiger, J. Ziegler, and C. Stiller, “StereoScan: dense 3d reconstruction in real-time,” 2011 IEEE Intelligent Vehicles Symposium (IV), 963–968 (2011). [CrossRef]  

28. A. Schoob, D. Kundrat, S. Lekon, L. A. Kahrs, and T. Ortmaier, “Color-encoded distance for interactive focus positioning in laser microsurgery,” Opt. Lasers Eng. 83, 71–79 (2016). [CrossRef]  

29. S. K. Nayar and Y. Nakagawa, “Shape from focus: an effective approach for rough surfaces,” Proceedings, IEEE International Conference on Robotics and Automation 2, 218–225 (1990). [CrossRef]  

30. R. G. Dorsch, G. Hausler, and J. M. Herrmann, “Laser triangulation: fundamental uncertainty in distance measurement,” Appl. Opt. 33(7), 1306–1314 (1994). [CrossRef]  

31. M. Rioux, “Laser range finder based on synchronized scanners,” Appl. Opt. 23(21), 3837–3844 (1984). [CrossRef]  

32. G. Ye, Y. Zhang, W. Jiang, S. Liu, L. Qiu, X. Fan, H. Xing, P. Wei, B. Lu, and H. Liu, “Improving measurement accuracy of laser triangulation sensor via integrating a diffraction grating,” Opt. Lasers Eng. 143, 106631 (2021). [CrossRef]  

33. S. A. Reza, T. S. Khwaja, M. A. Mazhar, H. K. Niazi, and R. Nawab, “Improved laser-based triangulation sensor with enhanced range and resolution through adaptive optics-based active beam control,” Appl. Opt. 56(21), 5996–6006 (2017). [CrossRef]  

34. K. Žbontar, M. Mihelj, B. Podobnik, F. Povše, and M. Munih, “Dynamic symmetrical pattern projection based laser triangulation sensor for precise surface position measurement of various material types,” Appl. Opt. 52(12), 2750–2760 (2013). [CrossRef]  

35. B. Muralikrishnan, W. Ren, D. Everett, E. Stanfield, and T. Doiron, “Performance evaluation experiments on a laser spot triangulation probe,” Measurement 45(3), 333–343 (2012). [CrossRef]  

36. B. Li, F. Li, H. Liu, H. Cai, X. Mao, and F. Peng, “A measurement strategy and an error-compensation model for the on-machine laser measurement of large-scale free-form surfaces,” Meas. Sci. Technol. 25(1), 015204 (2014). [CrossRef]  

37. S. H. Patil and R. Kulkarni, “Surface roughness measurement based on singular value decomposition of objective speckle pattern,” Opt. Lasers Eng. 150, 106847 (2022). [CrossRef]  

38. K. L. Reichenbach and C. Xu, “Numerical analysis of light propagation in image fibers or coherent fiber bundles,” Opt. Express 15(5), 2151–2165 (2007). [CrossRef]  

39. X. Chen, K. L. Reichenbach, and C. Xu, “Experimental and theoretical analysis of core-to-core coupling on fiber bundle imaging,” Opt. Express 16(26), 21598–21607 (2008). [CrossRef]  

40. A. Perperidis, H. E. Parker, A. Karam-Eldaly, Y. Altmann, K. Dhaliwal, R. R. Thomson, M. G. Tanner, and S. McLaughlin, “Characterization and modelling of inter-core coupling in coherent fiber bundles,” Opt. Express 25(10), 11932–2165 (2017). [CrossRef]  

41. N. Otsu, “A threshold selection method from gray level histograms,” IEEE Trans. Syst., Man, Cybern. 9(1), 62–66 (1979). [CrossRef]  

42. S. Karbasi, R. J. Frazier, K. W. Koch, T. Hawkins, J. Ballato, and A. Mafi, “Image transport through a disordered optical fiber mediated by transverse Anderson localization,” Nat. Commun. 5(1), 3362 (2014). [CrossRef]  

43. B. Abaie, E. Mobini, S. Karbasi, T. Hawkins, J. Ballato, and A. Mafi, “Random lasing in an Anderson localizing optical fiber,” Light: Sci. Appl. 6(8), e17041 (2017). [CrossRef]  

44. G. Ruocco, B. Abaie, W. Schirmacher, A. Mafi, and M. Leonetti, “Disorder-induced single-mode transmission,” Nat. Commun. 8(1), 14571 (2017). [CrossRef]  

45. N. Van Gestel, S. Cuypers, P. Bleys, and J. Kruth, “A performance evaluation test for laser line scanners on CMMs,” Opt. Lasers Eng. 47(3-4), 336–342 (2009). [CrossRef]  

46. M. F. M. Costa and J. B. Almeida, “Inspection of rough surfaces by optical triangulation,” Proc. SPIE 1712, 14th Symposium on Photonic Measurement (1993).

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

Figures (5)

Fig. 1.
Fig. 1. Diagram of the FB endoscopic displacement measurement based on the triangulation method. (a) An experimental diagram of the displacement sensor based on the FB endoscope. (b) The structure of the endoscopic probe. A collimating lens (CL) is connected to the front end of single-mode fiber (SM) for collimating the outgoing light. A gradient index (GRIN) lens is attached to the end of the fiber optic bundles (FB) for long-range imaging. (c) A diagram of the triangulation principle.
Fig. 2.
Fig. 2. Spot center extraction algorithm for FB images. (a) A flow chart of the FB-based spot center extraction algorithm, with original maps of detected spots at different positions and their center-of-mass positions. (b) The sensor’s vertical resolution at d = 50 mm and d = 100 mm, limited by the FB’s inherent structure (C-B RS) and after algorithmic processing (P-B RS).
Fig. 3.
Fig. 3. Calibration and evaluation of FB displacement sensors. (a) A comparison of theoretical and measured values with set parameters. The blue dots represent the measured values during calibration. The red curve is the theoretical value of the set parameters. (b) Measurement accuracy at different distances. The two upper panels show the results of 50 repeated measurements at d = 40 mm and d = 125.5 mm, respectively. (c) The maximum deviation and corresponding error measured at other distances.
Fig. 4.
Fig. 4. Application of the FB displacement sensor in surface reconstruction and vibration measurement. (a) Surface reconstruction of stepped objects. (b) Analysis of surface measurement. Red dots: measured values on different steps. Blue lines: means of the measurements. (c) Displacement measurement results at different vibration speeds: v = 1.25 mm/s and v = 0.625 mm/s.
Fig. 5.
Fig. 5. Influence of light source and surface roughness on measurement accuracy. (a) Displacement testing in steps of 250 µm using broadband light. (b) Measured spot and binarized results at two positions 250 µm apart. (c) Calibration curves for objects of varied materials. (d) Deviation of objects of varied materials at different displacements. (e) Original images of the measured spots for three kinds of objects at d = 50 mm.
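The spot center extraction described in Fig. 2(a) combines a threshold step (Otsu’s method [41]) with an intensity-weighted center of mass over the binarized FB image. A minimal sketch of that pipeline, written against a plain NumPy array rather than the authors’ actual processing chain (function names and the synthetic image are illustrative):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method [41]: choose the gray level that maximizes
    the between-class variance of the intensity histogram."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    total = img.size
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, 0.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                    # background mean
        m1 = (sum_all - sum0) / w1        # foreground mean
        var = w0 * w1 * (m0 - m1) ** 2    # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def spot_center(img):
    """Binarize the FB image with the Otsu threshold, then return the
    intensity-weighted center of mass of the spot as (row, col) in pixels."""
    t = otsu_threshold(img)
    weights = np.where(img > t, img.astype(float), 0.0)
    total = weights.sum()
    rows, cols = np.indices(img.shape)
    return (rows * weights).sum() / total, (cols * weights).sum() / total
```

Because the centroid averages over all above-threshold pixels, it can resolve spot displacements finer than the fiber-core pitch, which is the point of the algorithmic processing (P-B RS) in Fig. 2(b).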

Equations (1)


$$d = \frac{f \times s}{f\tan(90^\circ - \beta) + \left(\frac{l}{2} - (l - X)\right) \times X_{pixel}},$$
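As a numerical sketch, the triangulation relation in Eq. (1) can be evaluated directly. All parameter values below are hypothetical placeholders, not the paper’s calibration: f is the imaging focal length, s the laser-to-lens baseline, β the triangulation angle, l the number of pixels per sensor line, and X_pixel the pixel pitch.

```python
import math

def triangulation_distance(X, f=5.0, s=10.0, beta_deg=75.0,
                           l=1024, pixel=5.5e-3):
    """Distance d (mm) from the spot's pixel index X on the image sensor.
    f: focal length (mm), s: baseline (mm), beta_deg: triangulation
    angle (deg), l: pixels per line, pixel: pixel pitch (mm).
    All defaults are illustrative placeholders."""
    denom = f * math.tan(math.radians(90 - beta_deg)) \
            + (l / 2 - (l - X)) * pixel
    return f * s / denom
```

With these placeholder values, a spot at the sensor center (X = l/2) gives d = f·s / (f·tan 15°) ≈ 37 mm, on the same scale as the distances reported in the paper; as the spot moves toward larger X the computed distance decreases, which is the expected triangulation behavior.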