Optica Publishing Group

Generic precise augmented reality guiding system and its calibration method based on 3D virtual model

Open Access

Abstract

Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. The proposed system is realized with a digital projector, and the general back projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. A corresponding calibration method is also designed to obtain the parameters of the projector. To validate the proposed back projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projecting indication accuracy of the system is verified with a subpixel pattern projecting technique.

© 2016 Optical Society of America

1. Introduction

Robotics and automatic equipment are widely applied in a variety of fields but still cannot replace humans in certain areas such as surgery, component assembly, and small-batch manufacturing. Due to the inherent limitations of human beings, operation quality cannot be guaranteed without measuring equipment. Currently, much measuring equipment does not provide data in a user-friendly format: operators need to read and process the data at the specific interfaces offered by the instruments (e.g., monitor, data terminal) before the work can be carried on. Due to the limitations of human eyes and hands, additional instruments are usually needed to help operators find the operating target, especially in the following three conditions: (1) the target is obscured and/or has unobvious features; (2) the operation requires high precision, for example in surgery and large-scale manufacturing; (3) the target has a large number of similar features, such as rivets or an antenna array. In order to improve the quality and efficiency of manual operation, it is necessary to find a user-friendly way to present the operating data to the operator.

An augmented reality guiding system (ARGS), also known as a virtual marking and reality guiding system, provides intuitive indication for manual operation to avoid these restrictions and to improve adaptability and efficiency. After fusing the data from other measuring equipment, the ARGS can present key features and indicative characters on the object surface directly, so that the operator can focus better on the manual operation. This kind of system has been studied in biology and surgery. For example, a vein contrast enhancement system based on a projector-camera structure is helpful in finding veins invisible to the naked eye [1,2]. Some surgical operations can be carried out with the assistance of an image-guided system based on a digital projector and medical imaging equipment [3,4]. Moreover, some three-dimensional (3D) imaging devices equipped with a digital projector can project the measured feature onto the surface of the object. The projecting indication technique also brings dramatic efficiency gains in large-scale manufacturing such as airplane/ship assembly and skin painting.

As a kind of AR system, the ARGS also faces the common problem of how to fuse the indication content with the real object precisely. For industrial applications, the minimum requirement for an ARGS is to meet the design tolerances. Fortunately, manual operation often appears in large-scale manufacturing, which is not suitable for numerical control (NC) machining. When designing a large-scale workpiece, the accuracy of manual operation must be taken into account. Generally speaking, an error of 0.1 mm is the limit that can be guaranteed by manual operation, so the design tolerances of large-scale workpieces are mostly on the millimeter scale [5]. In most cases, therefore, a local indication accuracy of 0.1 mm is sufficient to meet the requirements of the manual manufacturing phase.

Current research on ARGSs mainly focuses on specific applications. Most current ARGSs are based on a camera-projector structure; in other words, they are self-guiding systems. For such an ARGS, the effective indication range is limited by the common field of view (FOV) and depth of field (DOF) of the camera and projector. High accuracy and a large FOV cannot be satisfied at the same time, especially in a large-scale workspace.

Moreover, the model of an ARGS based on the camera-projector structure is not generally applicable to other projecting apparatus (e.g., laser-galvo projecting systems, scanning mirror projecting systems [6]). In principle, an ARGS based on the camera-projector structure should be calibrated with the stereo vision technique [2], in which the transformation between camera and projector is fixed. However, other projecting apparatus (e.g., galvo scanners) have different projecting models. If such apparatus is used as the projecting unit of an ARGS, the model derivation is quite different from that of the camera-projector system. Therefore, a general design model of ARGS not relying on the camera-projector structure should be studied.

To avoid the restrictions of self-guiding systems and maintain compatibility with various projecting apparatus, an ARGS can be designed with a general method based on rigid body transformation and point remapping. The main objective in designing an ARGS is to find the coordinates of the indicating points in the projecting apparatus coordinate system, in order to generate the input for various projecting apparatus so that these points can be displayed on the surface of the target. The general procedure can be summarized as follows:

  • 1. Obtain a 3D model of the object to be indicated. The original design data or 3D measuring equipment (e.g., computed tomography (CT), a 3D scanner) can be used.
  • 2. Obtain the position and posture of the projecting apparatus and the object to be indicated.
  • 3. Calculate the coordinates of target indicating points in the projecting apparatus coordinate frame.
  • 4. Convert the coordinates of target indicating points to the input of the projecting apparatus.
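The four-step procedure above amounts to chaining homogeneous rigid-body transforms. A minimal Python sketch of steps 2 and 3, assuming numpy; the poses and the target point are hypothetical values for illustration, not data from the paper:

```python
import numpy as np

def rigid_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    W = np.eye(4)
    W[:3, :3] = R
    W[:3, 3] = t
    return W

# Hypothetical poses: object frame -> world frame, and world frame -> holder frame,
# as would be reported by external 3D positioning equipment (step 2).
W_o_w = rigid_transform(np.eye(3), np.array([100.0, 0.0, 0.0]))
W_w_h = rigid_transform(np.eye(3), np.array([0.0, -50.0, 20.0]))

# Step 3: chain the transforms to express a target indicating point,
# known in the object frame, in the projecting apparatus holder frame.
P_o = np.array([10.0, 20.0, 30.0, 1.0])   # homogeneous target point
W_o_h = W_w_h @ W_o_w                     # combined object -> holder transform
P_h = W_o_h @ P_o
print(P_h[:3])                            # point ready for step 4
```

Step 4, converting this holder-frame point into the input of a particular projecting apparatus, is apparatus-specific and is what the extrinsic/intrinsic calibration in Section 2 provides.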

Based on the basic process mentioned above, this paper proposes a general method to design an ARGS based on a 3D digitized model and coordinate transformation. This method treats the projecting apparatus as a separate part that can work alone, so it is compatible with various projecting apparatus as well as with all coordinate measurement equipment (e.g., total station, laser tracker, or stereo vision system). Furthermore, the proposed ARGS can be placed relatively arbitrarily within the measuring range of the external guiding equipment. When handling a large-scale workpiece (e.g., aircraft skin), the calibrated ARGS can simply be put in front of the target area on the workpiece to be processed. The proposed externally guided ARGS can maintain a high local indicating accuracy in a large workspace, limited only by the specific 3D positioning equipment. In other words, it has high relative indication accuracy.

To validate the proposed method, an ARGS with a digital projector as the projecting apparatus is designed. To achieve high local indication accuracy in a considerably large workspace, a combined calibration method for the projecting apparatus is proposed. The intrinsic and extrinsic parameters are calibrated in a well-arranged working process with a short error transformation chain. The final indication accuracy is evaluated with a high-resolution camera and the calibration plate at different projecting poses.

The rest of the paper is organized as follows. In Section 2, the principle of the proposed ARGS is detailed; the system based on a digital projector is demonstrated as an example, and an optimized calibration method is applied. In Section 3, the experimental system configuration and calibration procedure are detailed step by step, and quantitative evaluations illustrate the final indicating accuracy. Finally, Section 4 concludes the paper.

2. Principle

In the following statements, the naming rule for spatial points is defined as follows. The subscript of a point refers to a specific coordinate system; the superscript refers to the actual location of the point (e.g., $P_h^o$ means the point $P$ is located on the object workpiece and represented in the ARGS holder coordinate frame). $W$ is a coordinate transformation matrix whose subscript refers to the original coordinate system and whose superscript refers to the target coordinate system. The definition of the coordinate system transformation is detailed in [7].

2.1 Generic method for designing ARGS

In this paper, the principle of the proposed ARGS is divided into two parts: (1) calculating the coordinate transformation chain; (2) obtaining the extrinsic parameters of the projecting apparatus.

As shown in Fig. 1, the primary objective of the ARGS is to find the point $P_h^o(x_h, y_h, z_h)$ in the projecting apparatus holder coordinate system corresponding to the known indicating target point $P_o^o(x_o, y_o, z_o)$ in the workpiece coordinate system. Therefore, several rigid body transformations should be applied to transform $P_o^o$ into the projecting apparatus holder coordinate system for further processing.

Fig. 1 The working configuration of the proposed ARGS (a digital projector is applied as an example projecting apparatus).

The simplest set of coordinate systems for the proposed ARGS consists of the projecting apparatus frame, holder frame, object (workpiece) frame, and world (global) frame, as illustrated in Fig. 1. The target point $P_o^o(x_o, y_o, z_o)$ to be indicated in the object frame should be transformed to $P_h^o(x_h, y_h, z_h)$ in the holder frame. The relationship between $P_o^o$ and $P_h^o$ is

$$P_h^o = W_o^h P_o^o = W_w^h W_o^w P_o^o, \tag{1}$$

where $W_o^w$ denotes the transformation between the object frame and the world frame, and $W_w^h$ is the transformation matrix from the world frame to the holder frame. Both $W_o^w$ and $W_w^h$ are variable because the ARGS and the workpiece are set arbitrarily. Calculating the coordinate transformation chain $W_o^h$ means obtaining $W_o^w$ and $W_w^h$ simultaneously.

When $P_h^o$ is obtained, it should be transferred to the projecting apparatus frame in order to drive the different kinds of projecting apparatus. The origin of the projecting apparatus frame is a virtual point that must be calibrated. The holder is firmly fixed to the projecting apparatus, so the transformation matrix $W_h^p$ is constant after the system is built. In other words, $W_h^p$ provides the compatibility with different projecting apparatus, and it can be defined as the extrinsic parameter set of the selected projecting apparatus.

2.2 Principle of the proposed demonstration ARGS

Based on the proposed method, a demonstration ARGS can be built with a digital projector as the projecting apparatus. In order to obtain $W_h^p$, the projecting principle of the digital projector should be analyzed.

The digital projector can be treated as a reverse imaging optical system, so the pinhole model can be used to map a point $P_p^o(x_p, y_p, z_p)$ from the projector coordinate system to the projector's image plane using the projector intrinsic matrix $M_p$:

$$\begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} = s M_p \begin{bmatrix} x_p \\ y_p \\ z_p \end{bmatrix}, \tag{2}$$

where $s$ is an arbitrary scale factor, and

$$M_p = \begin{bmatrix} f_{px} & 0 & c_{px} \\ 0 & f_{py} & c_{py} \\ 0 & 0 & 1 \end{bmatrix}$$

is the projector intrinsic matrix, with $(c_{px}, c_{py})$ the coordinates of the principal point and $f_{px}$, $f_{py}$ the scale factors in the projector's image coordinate system. In this way, the image $(u_p, v_p)$ of the indicating target point is obtained.

Next, the target point $(u_p, v_p)$ in the linear (pinhole) model is generated with a subpixel projecting display method [8]. To compensate for the distortion of the projecting lens, a pre-distortion procedure is carried out on the generated image using radial and tangential distortion parameters [9].
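The pinhole mapping followed by lens distortion can be sketched as below. This uses the conventional Brown radial/tangential model with coefficients $(k_1, k_2, p_1, p_2)$ as one common form of "radial and tangential distortion parameters"; the intrinsic values and the test point are hypothetical, not the calibrated values of Table 2:

```python
import numpy as np

def project_point(Xp, Mp, dist):
    """Map a 3D point in the projector frame to projector image coordinates
    via the pinhole model, then apply radial/tangential (Brown) distortion."""
    x, y = Xp[0] / Xp[2], Xp[1] / Xp[2]            # normalized coordinates
    k1, k2, p1, p2 = dist
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2            # radial term
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)   # tangential terms
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    fx, fy = Mp[0, 0], Mp[1, 1]
    cx, cy = Mp[0, 2], Mp[1, 2]
    return np.array([fx * xd + cx, fy * yd + cy])

# Hypothetical intrinsics (fx, fy, cx, cy) and an undistorted test point.
Mp = np.array([[1100.0,    0.0, 456.0],
               [   0.0, 2200.0, 570.0],
               [   0.0,    0.0,   1.0]])
uv = project_point(np.array([50.0, 25.0, 500.0]), Mp, (0.0, 0.0, 0.0, 0.0))
print(uv)
```

In the ARGS, the inverse of this distortion step (the pre-distortion) is applied to the generated pattern so that the physically projected image matches the linear model.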

The procedure to calibrate $M_p$ and the distortion parameters of the projector will be detailed in Subsection 2.3.1. Figure 2 shows the working flow chart of the proposed demonstration ARGS.

Fig. 2 The working flow chart of the demonstration ARGS.

2.3 Combined calibration method for the demonstration ARGS

As the projecting optical system can be treated as a reverse imaging optical system, the pinhole model and distortion representation of a camera are also suitable for a projector. However, commonly used camera calibration methods cannot be applied to a projector directly, because the positions of projected features on the projector's object plane are unavailable. Therefore, a camera is indispensable in the projector calibration procedure, and it should be calibrated separately to obtain its intrinsic matrix $M_c$ and distortion parameters by Zhang's method [10]. All images captured by the camera should be corrected with the camera's distortion parameters during the projector calibration procedure.

A method relying on a virtual calibration pattern to obtain the projector's intrinsic matrix and distortion parameters is detailed in [11]. In order to achieve accurate calibration results and calculate the ARGS holder-projector transformation in one procedure, a combined calibration method is proposed and elaborated in the following subsections.

2.3.1 Obtaining the optimized estimate of projector’s intrinsic and extrinsic parameters

Figure 3 illustrates the calibration configuration of the ARGS. The projector's intrinsic matrix $M_p$ is calibrated with the camera and the calibration plate first. The camera is used to locate the calibration plate frame and to remap the projected virtual calibration patterns into the calibration plate frame. When an image of the mark point array on the calibration plate is captured, the homography $H_{cp}^{c}$ from the calibration plate to the captured image can be calculated. Next, a known calibration pattern is projected onto the calibration plate while the relative locations of the camera, projector, and calibration plate are fixed. The corner points $P_c^{vp}$ in the captured pattern image are extracted by a subpixel corner detection method [12]. These corner points can be transformed into the calibration plate frame as $P_{cp}^{vp}$ by using the following equation

Fig. 3 The calibration configuration of the ARGS.

$$P_{cp}^{vp} = s \left(H_{cp}^{c}\right)^{-1} P_c^{vp}. \tag{3}$$

The camera calibration method [10] can be applied to estimate the intrinsic parameters and the extrinsic parameters of the current position and orientation from these corner points on the calibration plate and the corresponding corner points on the projector's image plane. In order to adapt to different working distances and orientations, the projected pattern, named the virtual calibration pattern, should be captured in several positions and orientations. After that, the optimized estimate of $M_p$ and the distortion parameters valid for all projector positions can be obtained. Besides, the homography $H_{cp_i}^{p}$ from the calibration plate plane to the projector's image plane in the $i$th position is also calculated and further used to estimate the extrinsic parameters of the projector.
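The remapping of detected corner points through the inverse homography, including the homogeneous scale division, can be sketched as follows (numpy assumed; the homography and corner coordinates are hypothetical illustration values):

```python
import numpy as np

def remap_to_plate(H_cp_c, pts_c):
    """Transform corner points detected in the camera image back onto the
    calibration plate plane via the inverse homography.
    pts_c: (N, 2) pixel coordinates; returns (N, 2) plate coordinates."""
    H_inv = np.linalg.inv(H_cp_c)
    pts_h = np.column_stack([pts_c, np.ones(len(pts_c))])  # homogeneous coords
    mapped = pts_h @ H_inv.T
    return mapped[:, :2] / mapped[:, 2:3]   # divide out the scale factor s

# Hypothetical homography: 10 pixels per mm plus a pixel offset.
H = np.array([[10.0,  0.0, 100.0],
              [ 0.0, 10.0, 200.0],
              [ 0.0,  0.0,   1.0]])
corners_c = np.array([[100.0, 200.0], [200.0, 250.0]])
print(remap_to_plate(H, corners_c))
```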

To achieve more accurate results, it is necessary to shorten the error propagation chain. 3D positioning equipment, instead of the camera-projector (stereo vision) structure [2], is applied to obtain $W_{h_i}^{cp}$ in the $i$th position directly. $W_{h_i}^{p}$ can then be calculated by the following equation:

$$W_{h_i}^{p} = W_{cp_i}^{p} W_{h_i}^{cp}. \tag{4}$$

As shown in Fig. 3, the reference points on the combined calibration plate ($R_{cp}$) and on the ARGS holder ($R_h$) can be measured while these two components are static relative to each other. $R_h$ defines the ARGS holder frame. The coordinates of $R_{cp}$, available in the calibration plate frame after being measured by a coordinate measuring machine (CMM), can be transformed from the world frame to the ARGS holder frame. Thus, $W_{h_i}^{cp}$ can be calculated by the singular value decomposition (SVD) algorithm [13] with the common reference points $R_{cp}$.
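A minimal sketch of the SVD algorithm [13] for recovering a rigid transform from matched reference points (the Arun/Kabsch formulation; the synthetic points and pose below are hypothetical test data, not measured values):

```python
import numpy as np

def rigid_from_points(src, dst):
    """Estimate the rigid transform (R, t) with dst ~= R @ src + t from
    matched reference points by singular value decomposition."""
    src_c = src - src.mean(axis=0)                  # center both point sets
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)       # cross-covariance SVD
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical reference points (like R_cp) seen in two frames.
rng = np.random.default_rng(0)
src = rng.uniform(-100, 100, size=(4, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([5.0, -3.0, 12.0])
dst = src @ R_true.T + t_true                       # noise-free measurements
R, t = rigid_from_points(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

With noise-free correspondences the pose is recovered exactly; with real laser tracker data the same least-squares fit minimizes the residual over all reference points.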

The corresponding $H_{cp_i}^{p}$ and the optimized estimate of $M_p$ (already calculated during the projector intrinsic calibration) can be used to solve each $W_{cp_i}^{p}$. After that, $W_{h_i}^{p}$ is calculated by Eq. (4). To get the optimized estimate of $W_h^p$, a new method is proposed to obtain the optimized value (the "mean" coordinate transformation matrix) from the set of $W_{h_i}^{p}$ by the following steps.

  • 1) Estimate the working range of the projector and simulate ideal virtual target points $P_p^{p_j}$ ($j = 1, 2, 3 \dots n$, where $n$ is the number of target points) distributed uniformly within the projector's working range. Transform these points into the ARGS holder frame with the $W_{h_i}^{p}$ of the corresponding positions.
  • 2) $m$ positions of each point are obtained as $P_{h_i}^{p_j}$ ($j = 1, 2, 3 \dots n$; $i = 1, 2, 3 \dots m$). Calculate each point's mean value as follows:
    $$\bar{P}_h^{p_j} = \left(\sum_{i=1}^{m} P_{h_i}^{p_j}\right) / m. \tag{5}$$
  • 3) Calculate $W_{h\_mean}^{p}$ from $\bar{P}_h^{p_j}$ ($j = 1, 2, 3 \dots n$) and $P_p^{p_j}$ ($j = 1, 2, 3 \dots n$) with the SVD algorithm as the optimized estimate of $W_h^p$.
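The three averaging steps can be sketched as follows. The grid of virtual target points is hypothetical, and the SVD-based rigid fit is restated so the sketch is self-contained; in the degenerate check below the two input transforms are identical, so the averaged result must reproduce them:

```python
import numpy as np

def rigid_from_points(src, dst):
    """SVD-based rigid fit (dst ~= R @ src + t), as in [13]."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    return R, dst.mean(axis=0) - R @ src.mean(axis=0)

def mean_transform(W_list, pts_p):
    """Steps 1-3: push virtual target points through each estimated transform,
    average the mapped points over the m positions, then refit one rigid
    transform to the averaged points."""
    mapped = np.stack([pts_p @ W[:3, :3].T + W[:3, 3] for W in W_list])
    pts_h_mean = mapped.mean(axis=0)          # step 2: mean over positions
    R, t = rigid_from_points(pts_p, pts_h_mean)
    W = np.eye(4)
    W[:3, :3], W[:3, 3] = R, t
    return W

# Hypothetical estimates of W_h_i^p (identical here, so the mean must match).
W0 = np.eye(4)
W0[:3, 3] = [10.0, 20.0, 30.0]
grid = np.array([[0.0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]])
W_mean = mean_transform([W0, W0], grid)
print(np.allclose(W_mean, W0))
```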

2.3.2 Procedure of the proposed calibration method

After simplifying the above process, a flow chart of the proposed method for obtaining the data and images needed to calibrate the ARGS is shown in Fig. 4.

Fig. 4 Flow diagram of the proposed method to obtain the necessary data for calibration.

As in the traditional camera calibration procedure, the more projecting positions $m$ are measured, the better the calibration result [10,14]. The projecting positions and orientations should be distributed as uniformly as possible within the effective projecting space, which is restricted by the DOF of the digital projector.

The entire ARGS calibration process can be summarized in the following steps:

  • 1) Calibrate the camera and obtain the data and images for ARGS calibration with the process detailed in the flow diagram of Fig. 4.
  • 2) Correct all the captured images with Mc and camera’s distortion parameters.
  • 3) Select the data and corrected images in the first position.
  • 4) Calculate the homography $H_{cp_1}^{c}$ from $im_{ring1}$.
  • 5) Extract the corner points $P_c^{vp_1}$ from the virtual calibration pattern $im_{vp1}$ and transform them into the calibration plate coordinate system ($P_{cp}^{vp_1} = s (H_{cp}^{c})^{-1} P_c^{vp_1}$).
  • 6) Repeat steps 4 and 5 to calculate $P_{cp}^{vp_i}$ ($i = 1, 2, 3 \dots m$) for the different projecting positions.
  • 7) Calculate the optimized estimate of $M_p$ and $H_{cp_i}^{p}$ ($i = 1, 2, 3 \dots m$) with $P_{cp}^{vp_i}$ ($i = 1, 2, 3 \dots m$) and the corresponding corner coordinates on the projector's image plane.
  • 8) Calculate $W_{h_i}^{cp}$ ($i = 1, 2, 3 \dots m$) with the obtained $R_{h_i}$ ($i = 1, 2, 3 \dots m$) and $R_{cp}$ by the SVD algorithm.
  • 9) Solve $W_{cp_i}^{p}$ ($i = 1, 2, 3 \dots m$) with $M_p$ and $H_{cp_i}^{p}$ ($i = 1, 2, 3 \dots m$).
  • 10) Calculate $W_{h_i}^{p}$ ($i = 1, 2, 3 \dots m$) by Eq. (4).
  • 11) Calculate the optimized estimate of $W_h^p$ from $W_{h_i}^{p}$ ($i = 1, 2, 3 \dots m$) using the averaging method proposed at the end of Subsection 2.3.1.

3. Experiments and results

3.1 The combined calibration target

As shown in Fig. 5, a ceramic calibration plate with ring marks on its surface is attached to an aluminum plate, which is used to calibrate the ARGS and verify its accuracy. Four ball pedestals are mounted on the aluminum plate to hold the ball reflector of the laser tracker; each ball center can be treated as a reference point. The center of the ring mark in the upper left corner defines the origin of the calibration plate coordinate system. Each of the four ball reflectors is then set on a pedestal one by one, and the coordinates of the ball centers are measured by a measuring microscope CMM (TESA VISIO300 DCC). The measuring error is guaranteed to be within 0.005 mm. The coordinates of the four reference points are transformed from the CMM coordinate system to the calibration plate coordinate system, so that the position and orientation of the calibration plate can be measured by the laser tracker.

Fig. 5 The combined calibration plate.

3.2 ARGS holder and the projector

Figure 6 shows the hardware setup of the experiment: a digital projector (PRO4500 from Wintech Digital) and a CCD camera (B3320M-8MP from Imperx) with a prime lens (f = 25 mm) are mounted on an aluminum holder. Five ball pedestals are also fixed on it; pedestals No. 1, 2, and 3 define the ARGS holder coordinate system, as shown in Fig. 6. The camera is indispensable for calibration as well as for verification of the indication accuracy of the ARGS.

Fig. 6 The hardware setup of the ARGS (The camera is only for calibration and verification).

3.3 Results of camera calibration

The camera is applied to capture the virtual calibration patterns and to verify the final indicating accuracy [10,14,15]. A high-resolution, well-calibrated camera improves the projector calibration accuracy and the credibility of the final verification; therefore, a camera with a resolution of 3312×2488 from Imperx was selected. The pixels of the image sensor are square. The ring centers on the calibration plate are utilized to calibrate the camera, and the least-squares ellipse fitting technique is applied to extract the central coordinates of the ring marks [15,16]. 32 images (108 rings per image) with different orientations were captured to calculate $M_c$, and the obtained distortion parameters [17] are listed in Table 1.

Table 1. Intrinsic Parameters of the Camera

Figure 7 illustrates the reprojection error of the camera, which determines the credibility of the subsequent experiments. The working distance of the camera is approximately 500 mm, and one camera pixel covers 0.091 mm. The mean value and the root mean square error (RMSE) of the reprojection error are 0.2321 pixels and 0.2647 pixels, respectively. The mean measuring error of the camera in metric units is therefore about 0.021 mm, which is sufficient for the following experiments.
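The pixel-to-metric conversion of the stated reprojection error is a one-line check using the stated pixel footprint at the ~500 mm working distance:

```python
# Values as stated in Section 3.3: pixel footprint and mean reprojection error.
mm_per_pixel = 0.091
mean_err_px = 0.2321
mean_err_mm = mean_err_px * mm_per_pixel
print(round(mean_err_mm, 3))   # ~0.021 mm
```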

Fig. 7 Reprojection error of the camera (Unit: camera pixel).

3.4 Virtual pattern generation for projector calibration

Existing research has mainly used a software-generated checkerboard as the virtual calibration pattern projected by the digital projector. However, the Digital Micromirror Device (DMD) of the projector used in the experiments is equipped with diamond-shaped pixels, so projected horizontal and vertical lines are not smooth but have jagged edges, as demonstrated in Figs. 8(c) and 8(d). The commonly used subpixel corner extraction method [12] has low accuracy on such jagged edges. To tackle this problem, a diamond array pattern is generated so that sharp edges and corners can be obtained, as illustrated in Figs. 8(e) and 8(f). All the pixel coordinates of these corners are used to calculate $M_p$. In addition, the corner points overlapping the ring marks shown in Fig. 8(d) are all removed before calculating the parameters of the projector.

Fig. 8 Virtual calibration patterns captured by camera. (a) Checkerboard pattern, (b) Diamond array pattern, (c) (d) Enlarged view of checkerboard pattern, (e) (f) Enlarged view of diamond array pattern.

3.5 Results of the ARGS calibration

The calibration follows the procedure described in Subsection 2.3.2. A laser tracker (Leica AT901-LR) is applied to measure all the reference points on the calibration plate and the ARGS holder. The ARGS holder was placed in six different positions; at each position, a calibration image and the coordinates of the reference points were obtained to calculate the parameters of the projector. The projector's resolution is 912×1140 without optical offset and its aspect ratio is 16:10 due to the diamond-shaped pixels, so the projector is equivalent to one equipped with rectangular pixels with a 2:1 aspect ratio. The optimized estimate of $M_p$ and the distortion parameters of the projector are listed in Table 2.

Table 2. Intrinsic Parameters of the Projector (Mp)

The reprojection error of the projector is illustrated in Fig. 9. This is an important intermediate result because it shows the additional error introduced by the projecting apparatus; in other words, these errors are caused only by the projecting apparatus. The working distance of the projector is approximately 500 mm. One projector pixel covers about 0.274 mm in the X direction and about 0.137 mm in the Y direction. The mean value and the RMSE of the reprojection error are 0.1126 pixels and 0.1523 pixels, respectively.

Fig. 9 Reprojection error of the projector (Unit: projector pixel).

Based on the calculated $M_p$, the optimized estimate of $W_h^p$ is obtained as:

$$W_h^p = \begin{bmatrix} 0.2263 & 0.9322 & 0.2824 & 28.5454 \\ 0.9715 & 0.2369 & 0.0045 & 54.5598 \\ 0.0703 & 0.2736 & 0.9593 & 169.8250 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

The intermediate results of $W_h^p$ in the six different positions are listed in Table 3.

Table 3. $W_h^p$ in 6 Different Positions and the Final Optimized Result

3.6 Accuracy verification

The experiment for accuracy verification of the ARGS is based on Subsection 2.1. As shown in Fig. 10, the calibration plate is fixed and the coordinates of the four reference points are measured by laser tracker. The ARGS holder is placed in three arbitrary positions and the reference points on the ARGS holder are measured by the laser tracker.

Fig. 10 Accuracy verification configuration of ARGS system.

The calibration plate acts as a workpiece with a known numerical model, and the centers of the ring marks on the plate are used to define the features to be indicated. In order to verify at subpixel scale, a cross pattern is generated and decomposed into two parts, which are projected onto the calibration plate in turn. Each segment of the cross pattern is generated by a Gaussian function and spread over five pixels, as shown in Fig. 11. The intersection point of the two segments is the coordinate of the dot feature, which is projected back to the projector's image plane. Meanwhile, the camera is used to capture the two segments of the cross pattern and the ring marks on the calibration plate. To evaluate the final indicating accuracy, the center of the ring mark is considered the actual position to be indicated; it is extracted by subpixel edge detection [18] and the ellipse fitting technique [16]. The images of the two segments are processed with a gravity line fitting method to obtain the intersection point on the captured image. As shown in Fig. 11(g), the deviation between the segments' intersection point and the ring mark center defines the indicating accuracy in camera pixel units. To express the indicating accuracy in metric units, the intersection points and the ring mark centers are projected from the camera image plane onto the calibration plate plane. The distance between these two projected points defines the final indicating accuracy of the ARGS.
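The line fitting and intersection computation can be sketched as follows. Synthetic centroid samples stand in for the per-row gravity centers of the real Gaussian segment profiles, and a total-least-squares fit replaces the paper's exact gravity line fitting implementation, which is not fully specified here:

```python
import numpy as np

def fit_line(pts):
    """Total-least-squares line fit: returns a point on the line (the centroid)
    and the unit direction (principal axis of the point cloud)."""
    c = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - c)
    return c, Vt[0]

def intersect(c1, d1, c2, d2):
    """Intersection of two 2D lines given in point + direction form."""
    A = np.column_stack([d1, -d2])          # solve c1 + s0*d1 = c2 + s1*d2
    s = np.linalg.solve(A, c2 - c1)
    return c1 + s[0] * d1

# Hypothetical centroid samples from the two cross segments:
# segment 1 lies on y = x + 1, segment 2 on y = 9 - x.
seg1 = np.array([[0.0, 1.0], [2.0, 3.0], [4.0, 5.0]])
seg2 = np.array([[5.0, 4.0], [6.0, 3.0], [7.0, 2.0]])
pt = intersect(*fit_line(seg1), *fit_line(seg2))
print(pt)   # intersection near (4, 5)
```

In the real evaluation, this intersection point and the fitted ellipse center are both mapped onto the calibration plate plane before the deviation is measured in millimeters.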

Fig. 11 Cross feature extraction technique with subpixel accuracy. (a),(b) Two segments of the cross feature captured by the camera; the red/green lines show the linear fitting result. (c) One ring mark captured by the camera. (d) Profile line and the sample width in pixels. (e) Profile gray distribution of the feature captured by the camera. (f) Profile gray distribution of the feature generated for the projector. (g) The length of the black segment connecting the two lines' intersection and the center of the ellipse defines the final indication error in pixels.

The final indication accuracy of the proposed ARGS and the error distribution are illustrated in Fig. 12. The mean indication error is 0.105 mm and the standard deviation is 0.060 mm over a 260 mm × 180 mm indication area. The indication error in the Y direction is smaller than in the X direction because the digital projector in the demonstration system has a 2:1 pixel aspect ratio (the pixel density in the Y direction is doubled).

Fig. 12 Distribution of the final indication error in three arbitrary positions (Unit: mm).

3.7 Qualitative evaluation

A real workpiece with a known numerical model is fixed onto an appliance equipped with ball pedestals. As shown in Fig. 13, four circle features are generated from the numerical model and projected onto the real workpiece. The ball pedestals on the appliance define the appliance frame (not shown in Fig. 13). The positions of the locating holes on the appliance and the workpiece are pre-measured by CMM, so the transformation matrix from the workpiece frame (numerical model) to the appliance frame can be calculated. Based on this transformation matrix, the target features' coordinates can be transformed from the workpiece frame to the appliance frame. The laser tracker is used to measure the ball pedestals on the appliance and on the ARGS, respectively. Finally, all four target features can be transformed to the projector's image plane to finish the indicating procedure.

Fig. 13 Qualitative evaluation of the proposed ARGS (a) The numerical model of workpiece with the features to be indicated (b) The real workpiece with blue indication features.

Figure 13(b) shows that the circle features are displayed on the surface of the real workpiece accurately.

In practical applications, manufacturing reference points are often used to calculate the current relative position and orientation of the ARGS. Pre-calibrated measuring accessories can be fixed directly to references on the workpiece (e.g., reference holes, edge intersections).

4. Conclusion

In this paper, a novel type of ARGS and the corresponding design method are proposed and analyzed. Based on the proposed method, a demonstration system using a digital projector is designed. With the help of 3D positioning equipment, the system can maintain high relative indicating accuracy in a large-scale workspace. A corresponding method is developed to precisely calibrate the parameters of the proposed ARGS. To obtain the optimized estimate of the intrinsic and extrinsic parameters of the whole ARGS, a simplified calibration procedure is designed, in which the 3D positioning equipment, instead of a camera, is applied to obtain the extrinsic parameters of the projector directly. With a combined calibration plate and a camera, the entire calibration procedure can be implemented in the working field of large-scale assembly. Through a subpixel pattern projecting technique for the digital projector, the resolution limit of the digital projector can be partly overcome; the subpixel pattern helps operators find the target location precisely, especially in a large one-shot indication area. The verification experiments show that the final indication accuracy is sufficient for manual operation in a large-scale workspace. The results of these experiments can be used as a benchmark to predict the indication accuracy of other ARGSs based on a similar principle.

With this new type of system, the application domain of augmented reality guiding systems can be greatly expanded. The proposed system is well suited to large-scale manufacturing and assembly (e.g., digitalized airplane manufacturing). It can easily be integrated into an existing production line to provide precise feature indication or to display extra information directly on the workpiece. It could also be used in painting instruction, auxiliary lofting, and other human-participated applications.

Acknowledgments

The authors acknowledge the National Natural Science Foundation of China (NSFC) (61171048, 51225505), the National High-technology Research and Development Program of China (2012AA041205), the Key Basic Research Project of Applied Basic Research Programs Supported by Hebei Province (15961701D), the Research Project for High-level Talents in Hebei University (GCC2014049), the Talents Project Training Funds in Hebei Province (A201500503), and the Program for Changjiang Scholars and Innovative Research Team in University (IRT1275, IRT1232).




Figures (13)

Fig. 1 The working configuration of the proposed ARGS (a digital projector is applied as an example projecting apparatus).
Fig. 2 The working flow chart of the demonstration ARGS.
Fig. 3 The calibration configuration of the ARGS.
Fig. 4 Flow diagram of the proposed method to obtain the necessary data for calibration.
Fig. 5 The combined calibration plate.
Fig. 6 The hardware setup of the ARGS (the camera is only for calibration and verification).
Fig. 7 Reprojection error of the camera (unit: camera pixel).
Fig. 8 Virtual calibration patterns captured by the camera. (a) Checkerboard pattern, (b) diamond array pattern, (c), (d) enlarged view of the checkerboard pattern, (e), (f) enlarged view of the diamond array pattern.
Fig. 9 Reprojection error of the projector (unit: projector pixel).
Fig. 10 Accuracy verification configuration of the ARGS.
Fig. 11 Cross feature extraction technique with subpixel accuracy. (a), (b) Two segments of the cross feature captured by the camera; the red/green lines show the linear fitting result. (c) One ring mark captured by the camera. (d) Profile line and the sample width in pixels. (e) Profile gray distribution of the feature captured by the camera. (f) Profile gray distribution of the feature generated for the projector. (g) The length of the black segment connecting the two lines' intersection and the center of the ellipse defines the final indication error in pixels.
Fig. 12 Distribution of the final indication error in three arbitrary positions (unit: mm).
Fig. 13 Qualitative evaluation of the proposed ARGS. (a) The numerical model of the workpiece with the features to be indicated. (b) The real workpiece with blue indication features.

Tables (3)

Table 1 Intrinsic Parameters of the Camera
Table 2 Intrinsic Parameters of the Projector ($M_p$)
Table 3 $W_h^p$ in 6 Different Positions and the Final Optimized Result

Equations (7)


$$P_o^h = W_o^h P_o^o = W_w^h W_o^w P_o^o \tag{1}$$
$$\begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} = s\,M_p \begin{bmatrix} x_p \\ y_p \\ z_p \end{bmatrix}, \tag{2}$$
$$M_p = \begin{bmatrix} f_{px} & 0 & c_{px} \\ 0 & f_{py} & c_{py} \\ 0 & 0 & 1 \end{bmatrix} \tag{3}$$
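As a sanity check on the pinhole model of Eqs. (2) and (3), the sketch below maps a 3D point expressed in the projector frame to projector pixel coordinates, with the homogeneous scale fixed by the third row. The focal lengths and principal point are assumed placeholder values, not the calibrated intrinsics reported in the paper.

```python
import numpy as np

# Hypothetical intrinsic matrix M_p; f_px, f_py, c_px, c_py are
# assumed values for illustration only.
M_p = np.array([[1500.0,    0.0, 512.0],
                [   0.0, 1500.0, 384.0],
                [   0.0,    0.0,   1.0]])

def project(point_p, M):
    """Map a 3D point in the projector frame to pixel coordinates per
    Eqs. (2)-(3): [u, v, 1]^T = s * M_p * [x, y, z]^T, where the
    scale s = 1/z enforces the homogeneous constraint."""
    uvw = M @ point_p
    return uvw[:2] / uvw[2]

uv = project(np.array([0.1, -0.05, 1.0]), M_p)
# u = 1500*0.1/1.0 + 512 = 662.0, v = 1500*(-0.05)/1.0 + 384 = 309.0
```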
$$P_{cp}^{vp} = s\,(H_{cp}^{c})^{-1} P_{c}^{vp}. \tag{4}$$
$$W_{h\_i}^{p} = W_{cp\_i}^{p}\,W_{h\_i}^{cp}. \tag{5}$$
$$P_{hp\_j}^{mean} = \Big( \sum_{i=1}^{m} P_{hp\_j}^{i} \Big) \Big/ m. \tag{6}$$
$$W_h^p = \begin{bmatrix} 0.2263 & 0.9322 & 0.2824 & 28.5454 \\ 0.9715 & 0.2369 & 0.0045 & 54.5598 \\ 0.0703 & 0.2736 & 0.9593 & 169.8250 \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{7}$$
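Putting the extrinsic transform $W_h^p$ and the intrinsic model together gives the full back projection of a model feature onto a projector pixel. The sketch below uses a hypothetical transform (identity rotation plus a z translation) and assumed intrinsics rather than the calibrated values above, purely to show the chain of operations.

```python
import numpy as np

# Hypothetical extrinsic transform W_h^p: identity rotation, 500 mm
# translation along z (NOT the calibrated matrix from the paper).
W_hp = np.eye(4)
W_hp[2, 3] = 500.0

# Assumed placeholder intrinsics M_p.
M_p = np.array([[1500.0,    0.0, 512.0],
                [   0.0, 1500.0, 384.0],
                [   0.0,    0.0,   1.0]])

def back_project(point_h, W, M):
    """Back projection of a 3D model feature: transform the point from
    the model/workpiece frame into the projector frame with W_h^p,
    then apply the pinhole model M_p of Eqs. (2)-(3)."""
    ph = np.append(point_h, 1.0)   # homogeneous model point
    pp = (W @ ph)[:3]              # point in the projector frame
    uvw = M @ pp
    return uvw[:2] / uvw[2]

uv = back_project(np.array([10.0, 20.0, 0.0]), W_hp, M_p)
# point becomes (10, 20, 500) in the projector frame:
# u = 1500*10/500 + 512 = 542.0, v = 1500*20/500 + 384 = 444.0
```

Driving the projector with the pixel returned by this chain is what produces the indication mark on the real workpiece.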