
Study of 256 fiber array biaxial LiDAR optical assembly measurements

Open Access

Abstract

This paper presents a method for measuring the optical assembly results of a multi-beam biaxial LiDAR. The method analyzes the optical assembly parameters that affect LiDAR operation, and an experimental measurement system is built using a collimator to simulate an imaging field at infinity. An InGaAs infrared camera photographs the laser spots from the LiDAR transmitter and receiver, and the spot images are then fitted with Gaussian functions to calculate the biaxial LiDAR optical assembly results. Finally, factors that may affect the LiDAR alignment results are analyzed. Experiments show that this method can measure the optical assembly results of a large-scale multi-beam LiDAR. The possibility of further optimizing the measurement method by shaping the transmitted laser is also reported.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

LiDAR (light detection and ranging) is an active remote sensing technique that determines target range by measuring the time interval between laser emission and reception of the laser reflected from the target surface; it is now used in spaceborne and airborne mapping, target detection, and three-dimensional imaging [1–4]. With growing application requirements, LiDAR demands larger imaging fields, faster imaging speeds, and longer working distances. One way to expand the imaging field is to increase the divergence angle of the transmitted laser, but in long-range detection, once the divergence angle reaches the milliradian level, the laser footprint exceeds 10 m in diameter at a distance of 10 km. Such a large footprint causes low energy utilization, echo pulse broadening, poor range accuracy, and poor spatial resolution. Therefore, most long-range LiDAR systems keep the transmit-laser divergence angle at the submilliradian level [5–8].

Early LiDAR systems used a single beam and mapped with a scanning system, so wide-area imaging took a long time. Recently, laser beam splitting and fiber array technologies have opened a new approach to LiDAR transmitter-receiver design, and LiDAR has gradually developed from a single beam to dozens or even hundreds of beams. The advantage of this approach is that it enlarges the imaging field by increasing the number of transmitted beams without increasing the divergence angle of each beam. A larger detection area requires a larger FoV (field of view): a single-beam LiDAR can only enlarge its single FoV, whereas fiber array technology can arrange the receiver fibers in the same pattern as the transmit-laser footprints so that each FoV overlaps its corresponding footprint. In this way, the FoVs can be kept narrow, reducing the influence of ambient light and avoiding crosstalk from the reflected laser of other channels.

A specific example of a long-range multi-beam LiDAR is the main payload of the ICESat-2 satellite: the Advanced Topographic Laser Altimeter System (ATLAS). Its diffractive optical element splits the single outgoing beam into 6 beams, and the laser reflected from the target surface is focused onto the receiver optics by the receiving telescope. Each laser footprint is less than 17 m in diameter on the ground from an average orbital altitude of 500 km; such a small footprint is essential for achieving 3 cm vertical precision when measuring sea surface height [9].

An example of a large-scale multi-beam LiDAR is airborne LiDAR. A long-range photon-counting 3D imaging LiDAR designed by Tan et al. contains a 32 × 32 transceiver array. The system uses a 2D microlens array to divide the outgoing laser and couple it into a 32 × 32 fiber array to obtain a large-scale footprint array, and then uses a 64 × 64 Gm-APD array to collect the echoes. Such large-scale footprints and FoVs enable 3D imaging of a building 500 m away without scanning [10].

Submilliradian-level transmit-laser divergence angles and FoVs combined with multi-beam arrays have become a trend, but a LiDAR with a large-scale multi-beam array and submilliradian beams poses a problem for achieving the best SNR (signal-to-noise ratio): only when the laser footprints and FoVs overlap at the target surface can the reflected laser be accepted by the receiver at maximum efficiency. Aligning submilliradian divergence angles and FoVs, however, requires microradian-level alignment precision. Furthermore, machining errors in the LiDAR optical components, the accuracy of the optical assembly system, and ambient temperature changes introduce unexpected errors that affect the assembly results. One of the biggest challenges for multi-beam LiDAR is therefore to minimize the alignment error so as to maximize the SNR.

In this paper, the LiDAR is used to track the path of a small, long-range moving target, so we design a multi-beam array of 256 fibers arranged in 4 columns and 64 rows. For the transmitter, two neighboring columns in one row are vertically tangent, as are two neighboring rows, so the multi-beam array has no blind area in the vertical direction. The receiver uses the same multi-beam array arrangement as the transmitter, except that the FoV is larger than the transmit-laser divergence angle. However, because of assembly and machining errors, the multi-beam array will not be perfectly vertical after the LiDAR optical system is assembled. An angle between the multi-beam array and the horizontal line then creates gaps between neighboring beams, which leads to a scanning blind area. This raises a further challenge: a multi-beam configuration must consider the three-dimensional state of the whole array rather than the alignment of a single beam, and we must ensure during optical system assembly that the multi-beam array leaves no scanning blind area.

In this paper, we analyze the main parameters affecting LiDAR detection during optical system assembly. We then establish an optical system for measuring the LiDAR assembly parameters and obtain laser spot images of representative channels of the fiber array. We use a Gaussian fitting method to extract the spot centers and calculate the assembly parameters. Finally, we analyze other factors that may affect the results. The method described in this paper may offer a simple way to measure the assembly parameters of a densely arranged fiber-array LiDAR while maintaining a certain level of accuracy.

2. Principle and parameters

2.1 Alignment principle

The principle of LiDAR is to transmit a laser beam, receive the light reflected from the target, and measure the distance from the laser's time of flight. For transmitter-receiver systems, there are two fundamentally different configurations: coaxial and biaxial. The coaxial configuration uses a single lens set for both the transmit and receive paths; this simple optical path does not require alignment. In the biaxial configuration, the transmit and receive paths are completely separated: the laser is sent through the transmitter lens set, while the reflected laser passes through the receiver lens set and is focused onto the image-side focal plane. This configuration requires two different optical designs for the transmitter and receiver. Because the two optical paths are separated, the biaxial configuration has a blind area at close range; this characteristic avoids detector saturation caused by nearby objects, which matters especially when the system transmits a high-energy laser for long-distance detection [11].

In this paper, we design a biaxial LiDAR that uses a fiber array of 256 fibers to transmit the laser and obtain a large array of laser footprints, and another fiber array to receive the reflected laser, so that each FoV receives only the reflected laser transmitted by the corresponding channel of the transmitter fiber array. Its working principle is shown in Fig. 1: 256 uniformly arranged fibers transmit 256 laser beams from the image-side focal plane of the transmitter lens, forming 256 laser footprints in the far field. At the same time, 256 uniformly arranged beams are transmitted from the object-side focal plane of the receiver lens, also forming 256 footprints in the far field. If the transmitter and receiver lenses are adjusted so that the two sets of footprints overlap in the far field, then by the reversibility of the optical path, the transmitter and receiver can be considered aligned.

Fig. 1. Multi-beam biaxial LiDAR alignment principle.

2.2 Boresight alignment parameter

According to the alignment principle of biaxial LiDAR, the transmit-laser footprint and the FoV overlap at infinite distance only when the optical axes of the transmitter and receiver lenses are exactly parallel, which is important for maximizing the target reflection signal [12,13]. In this situation, the angular distance θ between the transmit-laser footprints and the FoVs is

$$\theta = \arcsin \frac{d}{R}$$
where d is the distance between the optical axes of the transmitter and receiver lenses and R is the target distance. However, owing to transmitter-receiver optical assembly errors, there will be an angle δ between the two optical axes, as shown in Fig. 2.

Fig. 2. Biaxial LiDAR overlap schematic representation.

The angular distance θδ between the transmit-laser footprint and the FoV is then

$${\theta _\delta } = \delta + \theta$$

When detecting a long-distance target, θ in Eq. (2) is negligible, so the angular distance θδ is approximately equal to δ. As shown in Figs. 3(a)–(d), different values of δ lead to four different situations.

Fig. 3. Biaxial LiDAR overlap schematic representation, where θT is the divergence angle of the transmit laser and θR is the FoV. (a) Center overlapping; (b) overlapping; (c) partial overlapping; (d) non-overlapping.

In Fig. 3(a), δ equals zero, which is the ideal transmitter-receiver alignment. Figure 3(b) shows that the transmit-laser footprint and the FoV are fully overlapped when

$$\delta \le \frac{1}{2}({\theta _R} - {\theta _T})$$

Figure 3(c) shows that the transmit-laser footprint and the FoV are partially overlapped when

$$\frac{1}{2}({\theta _R} - {\theta _T}) \le \delta \le \frac{1}{2}({\theta _R} + {\theta _T})$$

Figure 3(d) shows that the transmit-laser footprint and the FoV are non-overlapped when

$$\frac{1}{2}({\theta _R} + {\theta _T}) \le \delta.$$

Therefore, Eq. (3) must be satisfied to ensure that the transmit-laser footprint and the FoV overlap and the target reflection signal is maximized. The optical design parameters of the LiDAR in this paper are listed in Table 1, where θT = 333 µrad and θR = 417 µrad; according to Eq. (3), the footprint and FoV overlap when |δ| ≤ 42.00 µrad.

Table 1. Multi-beam biaxial LiDAR system design parameter

2.3 Fiber array incline parameter

Section 2.2 discussed the alignment principle for a single laser beam. In this paper, we design a fiber array of 4 columns and 64 rows in which each transmitter fiber core is 50 µm in diameter. The fibers are spaced at a 200 µm pitch, so two neighboring columns in one row, as well as two neighboring rows, are vertically tangent, and the fiber array has no blind area in the vertical direction. Each fiber in the array connects to a fiber laser, and a telecentric lens with a 150 mm focal length collimates the output into multiple beams; each beam has a divergence angle of 333 µrad and a 1332 µrad pitch from its neighbors. In the receiver optical system, the telecentric lens and fiber array layout are the same as in the transmitter (except that the fiber core diameter is 62.5 µm). Figure 4(a) shows the ideal state of the LiDAR optical system, in which the transmit-laser footprints and FoVs are center-overlapped and there is no blind area in the vertical direction. In practice, assembly errors introduce an angle between the fiber array and the horizontal axis, which causes blindness in the vertical direction, so the installation state of the fiber array must be considered.

Fig. 4. Fiber array incline state. (a) Best situation: the small circles are the laser spots from the transmitter, the large circles are the FoVs from the receiver, and (X, Y) represents the beam coordinates in the fiber array (X the row position, Y the column position); (b) inclined transmitter fiber array; (c) inclined receiver fiber array.

As shown in Fig. 4(b), there is an angle α between the fiber array and the horizontal axis. In that case there is a distance ΔdT between two adjacent beams of the fiber array in the vertical direction. When imaging a long-distance target, ΔdT may be larger than the target, in which case the system will miss the target during scanning. It is therefore necessary to ensure that ΔdT for the transmitter fiber array and ΔdR for the receiver fiber array are smaller than the target size. From the geometric relationship in Fig. 4(b), the angle α between the fiber array and the horizontal axis is

$$\alpha = \arcsin \frac{{\Delta {d_\textrm{T}}}}{{3L}} $$
where L is the horizontal distance between two adjacent beams in the same row. In this paper, the maximum ranging distance of the LiDAR is 6 km and the minimum detection target is 10 cm. Taking half of the target size as the detection margin, ΔdT must be less than 5 cm at 6 km, so the angle αT between the transmitter fiber array and the horizontal axis is at most 0.83 mrad. The receiver fiber array has the same arrangement as the transmitter, but its FoV is 417 µrad; from the geometric relationship, the angle αR between the receiver fiber array and the horizontal axis is at most 43.00 mrad.
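Eq. (6) can be sketched as a pair of inverse relations between the tilt α and the vertical gap ΔdT. The horizontal spacing value used below is a hypothetical placeholder, not a figure from the paper; it is chosen only so that a 5 cm gap yields a tilt of the same order as the 0.83 mrad transmitter limit quoted above.

```python
import math

# Eq. (6): fiber-array tilt alpha from the vertical gap between adjacent
# beams, where horiz_spacing (L) is the horizontal distance between two
# adjacent beams in the same row at the target plane.
def incline_angle(delta_d, horiz_spacing):
    """Tilt angle (rad) producing a vertical beam gap delta_d, Eq. (6)."""
    return math.asin(delta_d / (3.0 * horiz_spacing))

def vertical_gap(alpha, horiz_spacing):
    """Inverse relation: vertical gap produced by a tilt of alpha (rad)."""
    return 3.0 * horiz_spacing * math.sin(alpha)

# Example with a hypothetical spacing of 20 m at the target plane and the
# 5 cm gap budget from the text:
alpha = incline_angle(0.05, 20.0)
print(f"alpha = {alpha*1e3:.2f} mrad")  # small-angle: ~0.05/(3*20) rad
```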

3. Methods

3.1 Experimental setup

We design an experimental system, shown in Fig. 5, in which all optical components are mounted on an aluminum platform so that every component shares the same reference plane. The system incorporates a 1550 nm fiber laser, a fiber-optic splitter, a collimator with a 3500 mm focal length, an InGaAs infrared camera (GS3841-SW64015-C), and the LiDAR transmitter-receiver optical system. The transmitter-receiver optical system contains two collimating lens sets, two fiber arrays, and a bandpass filter mounted in front of the transmitter lens. The LiDAR optical system is placed on a two-dimensional turntable in front of the collimator, and the InGaAs infrared camera is placed at the focal plane of the collimator. The 1550 nm fiber laser serves as the laser source, and its output is split into two beams by the fiber-optic splitter.

Fig. 5. Schematic diagram of the experimental system.

When measuring the boresight alignment parameter, we select the same fiber channel from the transmitter fiber array and the receiver fiber array and connect both to the split fiber. The laser emitted from the end of each fiber passes through the collimating lens (the transmitter and receiver lenses are identical in optical design) and converges at the focal plane of the collimator, simulating the laser spots at the infinity field. By the law of reversibility of the optical path, the laser spot emitted through the receiver represents the FoV. The two-dimensional turntable is then adjusted so that the two laser spots fall within the FoV of the InGaAs infrared camera, and the two spots are photographed separately. By analyzing the positions of these two spots and again invoking the reversibility of the optical path, the laser spot and the FoV overlap when the distance between their center points satisfies Eq. (3).

For the fiber array incline parameter measurement, the two laser spots are photographed in the same way as in the boresight alignment measurement, but they are emitted from the first and last rows of the same column of the same fiber array. Because of the limited FoV of the InGaAs infrared camera, the two spots cannot be photographed at the same time, so the two-dimensional turntable must be rotated to take the two images separately. By analyzing the relative positions of these two spots, the incline angle of the fiber array can be calculated according to Eq. (6). By the law of reversibility of the optical path, this measurement method also applies to the receiver fiber array.

3.2 Measurement processing

After acquiring the laser spot image with the experimental setup, the image must be processed to extract the spot center. The usual method is to take the centroid as the particle center [14]; the centroid of the acquired laser spot bitmap is calculated as

$$\left( {{X_{cen}},{Y_{cen}}} \right) = \left( {\frac{{\sum\limits_k {{X_k}{I_k}} }}{{\sum\limits_k {{I_k}} }},\frac{{\sum\limits_k {{Y_k}{I_k}} }}{{\sum\limits_k {{I_k}} }}} \right)$$
where (Xk, Yk) is the position of pixel k and Ik is its intensity. The centroid of the laser spot is obtained by solving for the center along the X and Y axes. The laser spot image photographed by the InGaAs infrared camera, shown in Fig. 6(a), is a grayscale image with pixel values from 0 to 255, each value being the relative laser intensity detected at that pixel (the camera software has an attenuation function to prevent CCD saturation, so pixel values are relative). In Fig. 6(a), the central region of the spot has a flat top because the CCD is saturated by the high-power transmitted laser beyond the attenuation range, which distorts the image. Taking the centroid as the spot center would therefore introduce errors due to saturation. In addition, CCD defects produce several noisy pixels in the edge region of the image, which would also bias the centroid method.
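For reference, Eq. (7) is simply an intensity-weighted mean of the pixel coordinates. A minimal sketch (the paper replaces this estimator with Gaussian fitting precisely because saturation and edge noise bias it):

```python
import numpy as np

# Centroid estimate of Eq. (7): intensity-weighted mean of pixel coordinates.
def centroid(image):
    """Return (x_cen, y_cen) of a 2-D intensity array via Eq. (7)."""
    img = np.asarray(image, dtype=float)
    ys, xs = np.indices(img.shape)  # row (Y) and column (X) coordinate grids
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

# A small symmetric synthetic spot centred at column 3, row 2:
spot = np.zeros((5, 7))
spot[2, 3] = 4.0
spot[2, 2] = spot[2, 4] = spot[1, 3] = spot[3, 3] = 1.0
print(centroid(spot))  # -> (3.0, 2.0)
```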

In this paper, the transmitted laser is a fundamental-mode Gaussian beam. Although the saturated pixels in the center of the spot image distort the image, they affect only the intensity values of the spot, not its symmetry: the spot remains Gaussian-symmetric. We can therefore take advantage of this symmetry and fit the transmitted laser spot with a Gaussian function [15,16].

$$y = A\exp [ - {(x - \mu )^2}/2{\sigma ^2}]. $$

This function is a symmetric bell-shaped curve with center position µ, peak height A, and width controlled by σ. Fitting the photographed laser spot images with the Gaussian function determines the center of the Gaussian beam very effectively by exploiting the symmetry of the function, which avoids both the fitting error caused by distortion in the central region of the image and that caused by noisy pixels in the edge region.

Fig. 6. (a) Original laser spot image of channel 1 of the receiver; (b) laser intensity curve solving method; (c) black points are the accumulated pixel intensity distribution along the X axis of receiver channel 1, and the blue curve is the Gaussian fit; (d) the same along the Y axis.

For the original image of a measured laser spot, we read the value of each pixel in the grayscale image and accumulate the intensity along the X and Y axes, as shown in Fig. 6(b), to obtain the intensity distributions of the laser along both axes; the resulting curves are then fitted with Eq. (8). As shown in Figs. 6(c) and (d), the black points are the measured data and the blue curves are the Gaussian fits. The fitted curves are

$${y_x} = 1.595 \times {10^4} \times \exp [ - {(x - 290.1)^2}/2312.6], $$
$${y_y} = 1.540 \times {10^4} \times \exp [ - {(y - 219.1)^2}/2537.1], $$
where the coefficient of determination R² of the fitted curve is 0.93 in Fig. 6(c) and 0.92 in Fig. 6(d).
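The processing pipeline above (accumulate along each axis, then fit Eq. (8)) can be sketched as follows. This is an illustrative implementation on a synthetic spot, not the authors' code; the function names and initial-guess strategy are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    """Eq. (8): symmetric bell curve with centre mu and width sigma."""
    return a * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

def spot_center(image):
    """Sum the image along each axis and fit a 1-D Gaussian to each profile.

    Returns the fitted centres (x_center, y_center). Saturation flattens the
    peak amplitude but, by symmetry, barely shifts the fitted mu.
    """
    img = np.asarray(image, dtype=float)
    centers = []
    for axis in (0, 1):  # axis 0 sum -> X profile, axis 1 sum -> Y profile
        prof = img.sum(axis=axis)
        x = np.arange(prof.size)
        p0 = [prof.max(), float(np.argmax(prof)), prof.size / 10.0]
        popt, _ = curve_fit(gaussian, x, prof, p0=p0)
        centers.append(popt[1])  # mu: fitted spot centre along this axis
    return tuple(centers)

# Synthetic Gaussian spot centred at x = 40, y = 25:
yy, xx = np.indices((60, 80))
img = 200 * np.exp(-((xx - 40) ** 2 + (yy - 25) ** 2) / (2 * 6.0 ** 2))
print(spot_center(img))  # approximately (40.0, 25.0)
```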

4. Results and validation

4.1 Parameter analysis

In this paper, the fiber array has 256 channels and the relative spatial position of each channel is fixed, so the edge channels exhibit the worst boresight alignment parameter. The overall alignment result can therefore be calculated by observing these edge channels and accounting for the machining error of the fiber arrangement. Figure 7 shows the overlapped laser spot images, collected with the InGaAs infrared camera, of the edge channels (1,1), (64,1), (1,4), and (64,4) of both the transmitter and the receiver. Figure 7(a) shows the image of channel (1,1), where the white curves are the intensity curves of the spot transmitted through the receiver and the black curves are those of the spot from the transmitter. Figures 7(b)–(e) show the spot centers extracted with the method of Section 3.2 from the intensity distributions along the x and y axes of the receiver spot and along the x and y axes of the transmitter spot, respectively.

Fig. 7. (a) Overlapped image of the laser spots from channel (1, 1) of both the transmitter and the receiver; (b)–(e) laser spot intensity distributions and fitted Gaussian curves; (f)–(t) overlapped laser spot images and Gaussian fitting results for channels (64, 1), (1, 4), and (64, 4).

The curves in Fig. 7 show the processed laser spot images, and the center-point distance ΔL between a pair of laser spots is the distance between the centers of their fitted Gaussian curves. The LiDAR optical assembly error θe is then

$${\theta _e} = \frac{{\Delta L}}{{{f_0}}}$$
where f0 = 3500 mm is the focal length of the collimator. The calculated results are listed in Table 2; the average LiDAR optical assembly error over channels (1, 1), (64, 1), (1, 4), and (64, 4) is 33.09 µrad.
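Eq. (11) converts a spot-center separation on the collimator focal plane into an angular error. A minimal sketch; the ΔL value below is illustrative (chosen to give an error of the order of the measured 33.09 µrad average), not a value from Table 2.

```python
# Eq. (11): angular assembly error from the spot-centre separation on the
# collimator focal plane, with f0 = 3500 mm.
F0 = 3.5  # collimator focal length (m)

def assembly_error(delta_l):
    """Angular boresight error (rad) from the spot-centre distance (m)."""
    return delta_l / F0

# A centre separation of 0.116 mm on the focal plane corresponds to:
print(f"{assembly_error(0.116e-3)*1e6:.1f} urad")  # ~33.1 urad
```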

Table 2. Transmitter-receiver assembly errors of multi-beam biaxial LiDAR

For the fiber array incline parameter, the method of Section 3.2 is used to extract the center points of the laser footprints of the first and last rows of the same column of the same fiber array, and the horizontal distance ΔL between the two spots is calculated. As shown in Figs. 4(b) and (c), the angle α between the fiber array and the horizontal axis is

$$\alpha = \arcsin \frac{{\Delta L}}{L}. $$

The measured incline parameter of the transmitter fiber array αT is 0.65 mrad, and that of the receiver fiber array αR is 0.66 mrad.

4.2 Error analysis

When measuring the boresight alignment parameter, not all channels were observed in the experimental setup, so the machining error between the fibers in the fiber array must be considered. There is a ±20 nm position machining error between two adjacent fibers, and accounting for error accumulation across multiple fiber bundles, the maximum position machining error between the center channel and the edge channel of the transmitter and receiver fiber arrays corresponds to 0.88 µrad.

Besides machining accuracy, external factors also affect the LiDAR alignment accuracy. To ensure that the alignment accuracy meets the requirement, the error introduced during assembly and the errors caused by external factors must be analyzed separately. The assembly accuracy was measured in Section 4.1. The external errors fall into two categories: the measurement instrument's own error and environmentally induced error. The instrument error is that introduced by the finite pixel size of the InGaAs infrared camera, and the environmental error is caused by ambient temperature changes that alter the lens temperature and hence shift the optical axes of the lenses.

The pixel size of the InGaAs infrared camera in the experimental setup is 15 µm × 15 µm, and the observed image may deviate by up to one pixel, so the error introduced by the camera is 0.64 µrad. When the optical system temperature deviates from the design temperature, the optical axes of the transmitter and receiver shift because of thermal expansion of the materials; the lens barrel is made of Invar 36 and the lenses of synthetic silica glass NIFS-S. For a temperature control range of 19 °C to 21 °C, the material properties of the barrel and lenses give an optical axis shift of 3.30 µrad for the transmitter and receiver.

In summary, the maximum root-mean-square error of all the error sources is 4.91 µrad. Thus, the maximum average error of the optical system alignment is 38.00 µrad, which satisfies the theoretical limit of 42.00 µrad.
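The root-sum-square combination can be reproduced if each error source (machining, camera pixel, thermal shift) is counted once for the transmitter path and once for the receiver path; this is an interpretation, since the text does not spell out the combination explicitly.

```python
import math

# Root-sum-square of the independent error sources listed above, each
# counted for both the transmitter and receiver paths (an assumption).
sources_urad = [0.88, 0.64, 3.30]  # machining, camera pixel, thermal shift

rss = math.sqrt(2 * sum(e ** 2 for e in sources_urad))
total = 33.09 + rss  # measured average assembly error + external errors
print(f"RSS = {rss:.2f} urad, total = {total:.2f} urad")  # 4.91, 38.00
```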

5. Summary

This paper presented a method to evaluate the optical assembly results that affect the performance of a multi-beam biaxial LiDAR, applicable to the special fiber array arrangement designed in this paper. By analyzing the boresight alignment parameter and the fiber array incline parameter, it can be verified that the LiDAR operates with well-overlapped footprints and FoVs and with a tangentially arranged fiber array that has no blind area in the vertical direction. For the laser spot images measured with the InGaAs infrared camera, a Gaussian function is fitted to each spot, and the spot center is extracted by exploiting the symmetric spot distribution, avoiding the pixel saturation that would otherwise degrade the accuracy of the result. In the measured images, the laser spot is not exactly a fundamental-mode Gaussian distribution, because the laser mode changes after transmission through the multimode fiber; the Gaussian function therefore does not exactly match the actual spots, which influences the results. A proven remedy is to shape the transmitted laser beam to improve the beam discrimination and obtain more accurate measurements, which is a direction for our future work.

Funding

National Key Research and Development Program of China (2022YFB3903102); The Key Project of Science and Technology of Anhui Province (202103a13010006).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. M. A. Albota, R. Gurjar, A. Mangognia, D. Dumanis, and B. Edwards, “Contributed Review: Advanced Three-Dimensional Laser Radar Imaging with the Airborne Optical Systems Testbed,” Rev. Sci. Instrum. 89(10), 101502 (2018). [CrossRef]  

2. Z. Li, E. Wu, C. Pang, B. Du, Y. Tao, H. Peng, H. Zeng, and G. Wu, “Multi-Beam Single-Photon-Counting Three-Dimensional Imaging Lidar,” Opt. Express 25(9), 10189 (2017). [CrossRef]  

3. M. McGill, T. Markus, V. S. Scott, and T. Neumann, “The Multiple Altimeter Beam Experimental Lidar (MABEL): An Airborne Simulator for the ICESat-2 Mission,” J. Atmos. Oceanic Technol. 30(2), 345–352 (2013). [CrossRef]  

4. Z. Li, X. Huang, Y. Cao, B. Wang, Y. Li, W. Jin, C. Yu, J. Zhang, Q. Zhang, C. Peng, F. Xu, and J. Pan, “Single-Photon Computational 3D Imaging at 45 Km,” Photonics Res. 8(9), 1532 (2020). [CrossRef]  

5. W. Lian, S. Li, G. Zhang, Y. Wang, X. Chen, and H. Cui, “Accuracy Verification of Airborne Large-Footprint Lidar Based on Terrain Features,” Remote Sens. 12(5), 879 (2020). [CrossRef]  

6. Z. Li, X. Huang, P. Jiang, Y. Hong, C. Yu, Y. Cao, J. Zhang, F. Xu, and J. Pan, “Super-Resolution Single-Photon Imaging at 8.2 Kilometers,” Opt. Express 28(3), 4076 (2020). [CrossRef]  

7. A. McCarthy, R. J. Collins, N. J. Krichel, V. Fernández, A. M. Wallace, and G. S. Buller, “Long-Range Time-of-Flight Scanning Sensor Based on High-Speed Time-Correlated Single-Photon Counting,” Appl. Opt. 48(32), 6241 (2009). [CrossRef]  

8. A. Kumar and S. Paul, “Space borne LIDAR and future trends,” (2007).

9. T. A. Neumann, A. J. Martino, T. Markus, et al., “The Ice, Cloud, and Land Elevation Satellite – 2 Mission: A Global Geolocated Photon Product Derived from the Advanced Topographic Laser Altimeter System,” Remote Sens. Environ. 233, 111325 (2019). [CrossRef]  

10. C. Tan, W. Kong, G. Huang, J. Hou, S. Jia, T. Chen, and R. Shu, “Design and Demonstration of a Novel Long-Range Photon-Counting 3D Imaging LiDAR with 32 × 32 Transceivers,” Remote Sens. 14(12), 2851 (2022). [CrossRef]  

11. U. Hofmann, M. Aikio, J. Janes, F. Senger, V. Stenchly, J. Hagge, H. Quenzer, M. Weiss, T. Wantoch, C. Mallas, B. Wagner, and W. Benecke, “Resonant Biaxial 7-Mm MEMS Mirror for Omnidirectional Scanning,” J. Micro/Nanolithogr., MEMS, MOEMS 13(1), 011103 (2013). [CrossRef]  

12. M. Li, J. Hou, C. Zhou, and R. Shu, “Experimental verification of transmitting-receiving registration method with high precision used in multi-beam lidar,” Infrared Laser Eng. 46(7), 730001 (2017). [CrossRef]  

13. L. Ramos-Izquierdo, V. S. Scott, S. Schmidt, J. Britt, W. Mamakos, R. Trunzo, J. Cavanaugh, and R. Miller, “Optical System Design and Integration of the Mercury Laser Altimeter,” Appl. Opt. 44(9), 1748 (2005). [CrossRef]  

14. Y. Ivanov and A. Melzer, “Particle Positioning Techniques for Dusty Plasma Experiments,” Rev. Sci. Instrum. 78(3), 033506 (2007). [CrossRef]  

15. Y. Feng, J. Goree, and B. Liu, “Accurate Particle Position Measurement from Images,” Rev. Sci. Instrum. 78(5), 053704 (2007). [CrossRef]  

16. H. Guo, “A Simple Algorithm for Fitting a Gaussian Function,” IEEE Signal Process. Mag. 28(5), 134–137 (2011). [CrossRef]  
