Optica Publishing Group

Optical camera communication (OCC) using a laser-diode coupled optical-diffusing fiber (ODF) and rolling shutter image sensor

Open Access

Abstract

We demonstrate an optical-camera-communication (OCC) system utilizing a laser-diode (LD) coupled optical-diffusing-fiber (ODF) transmitter (Tx) and a rolling-shutter based image sensor receiver (Rx). The ODF is a glass optical fiber produced for decorative lighting or for embedding into small areas where bulky optical sources cannot fit. Moreover, decoding the high-data-rate rolling-shutter pattern from the thin ODF Tx is very challenging. Here, we propose and experimentally demonstrate a pixel-row-per-bit based neural network (PPB-NN) to decode the rolling-shutter pattern emitted by the thin ODF Tx. The proposed PPB-NN algorithm is discussed. The proposed PPB-NN method can satisfy the pre-forward-error-correction (FEC) BER at a data rate of 3,300 bit/s over a transmission distance of 35 cm. A theoretical analysis of the maximum ODF Tx rotation angle is also presented, and our experimental values agree with the theoretical results.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

The increasing demand for wireless communication bandwidth is expanding research attention from radio-frequency (RF) communication technologies to optical wireless communication (OWC) technologies, which can provide huge bandwidths using the infra-red (IR), visible light, and ultra-violet (UV) spectra [1–4]. OWC has the advantages of being electromagnetic-interference (EMI) free and license-free. It does not interfere with RF communication devices; hence, it can be utilized in conjunction with RF signal transmission to provide extra communication bandwidth. OWC systems also offer many interesting applications, such as visible light positioning (VLP) [5,6] and underwater communication [7,8]. Among the different OWC technologies, visible light communication (VLC) [9,10] is appealing because it can employ the white light-emitting diodes (LEDs) that are already used in countless applications, including illumination systems, sign boards, display panels, mobile-phone flash lamps, and vehicle front and tail lamps [11–13]. Besides, VLC is also considered one of the promising solutions for 6G wireless networks [14].

To increase the applicability and popularity of VLC, image-sensor or camera based VLC, also called optical camera communication (OCC) [15], can be adopted. OCC systems can be easily implemented using existing LED lamps or display panels as transmitters (Txs), and smart-phone cameras, vehicle cameras or surveillance cameras as receivers (Rxs). Recently, many works have reported OCC applications in robotics for communication and positioning [16–18]. In an OCC system, no significant hardware modification is needed in either the Tx or the Rx. One kind of OCC system is based on under-sampled modulation (USM) [19]. In this approach, the LED Tx is modulated at a data rate higher than the camera frame rate in order to provide a non-flickering light source. At the Rx side, only a small time portion of each data-bit period is sampled and captured by the camera. USM based on under-sampled frequency-shift on-off keying (UFSOOK) [19], red-green-blue (RGB) UFSOOK [20], under-sampled phase-shift on-off keying (UPSOOK) [21], and RGB UPSOOK [22] has been reported.

Besides the USM OCC systems, rolling-shutter based OCC systems using a complementary metal-oxide-semiconductor (CMOS) camera are also promising [23,24]. During rolling-shutter operation, each pixel-row in the CMOS camera is activated sequentially. The CMOS image sensor can capture bright and dark fringes denoting light “ON” and “OFF” when the LED Tx is modulated faster than the frame rate (i.e. frames per second, fps) but slower than the pixel-row activation rate. Each pixel-row is activated sequentially from top to bottom with a row-by-row exposure delay time (tpixel-row-delay). When all the pixel-rows in the image sensor have completed their exposure, an image frame is constructed. During this image-frame construction period, also known as the processing time-gap (tgap), the CMOS image sensor cannot detect any optical signal. Recently, different techniques have been adopted in rolling-shutter based OCC systems to improve their performance, including multiple-input multiple-output (MIMO) [25] and adaptive thresholding to decode the rolling-shutter pattern [26]. Techniques such as logistic regression [27], 2-D convolutional neural networks (CNNs) [28], Z-score normalization with an artificial neural network (ANN) [29,30], and an accumulative sampling scheme [31] have been proposed to enhance rolling-shutter pattern decoding. To increase the applicability of VLC systems, the field-of-view (FOV) of Txs and Rxs should also be considered. For example, optical Txs utilizing optical beam steering achieving steering angles of 20° [32] and 12° [33] have been reported. Besides, a 240° FOV Rx using a fluorescent-fiber optical concentrator was demonstrated [34]. As an optical fiber can provide a uniform light-emission pattern around its circumference, it could also be a promising optical Tx providing 360° light emission. Recently, OCC systems using a light-emitting-diode (LED) coupled illuminating plastic optical fiber (POF) have been demonstrated [35,36] at a data rate of 3 kbit/s (i.e. 6 bit × 500 Hz). Table 1 summarizes recent demonstrations of rolling-shutter based OCC systems.


Table 1. Recent Demonstrations of Rolling Shutter Based OCC Systems

In this work, we demonstrate an OCC system utilizing a laser-diode (LD) coupled optical-diffusing-fiber (ODF) Tx and a rolling-shutter based image sensor Rx. The ODF, also known as optical illuminating fiber or optical leaky fiber [37], is a glass optical fiber produced for decorative lighting or for embedding into small areas where bulky optical sources cannot fit. The ODF can provide a 360° uniform light-emission pattern around the fiber circumference. However, decoding the high-data-rate rolling-shutter pattern from the thin ODF Tx is very challenging. The thin ODF Tx occupies only a few pixels in each pixel-row of the CMOS image sensor; hence, the column-matrix selection during rolling-shutter decoding is extremely difficult. Inspired by the decoding idea proposed in [38], based on pixel-per-symbol labeling for 4-level pulse-amplitude-modulation (PAM4) decoding, we utilize a pixel-row-per-bit based neural network (PPB-NN) to decode the 2-level on-off-keying (OOK) data emitted by the thin ODF Tx. The proposed PPB-NN algorithm is discussed, and its implementation is compared with a traditional ANN. The proposed PPB-NN method can satisfy the pre-forward-error-correction (FEC) BER at a data rate of 3,300 bit/s over a transmission distance of 35 cm. Efficient rolling-shutter-pattern decoding over 360° around the Tx circumference and over a ±80° Rx rotation angle at a data rate of 2,100 bit/s can also be achieved.

2. Experiment and algorithm

Figure 1(a) shows the experimental setup of the OCC system using the ODF optical Tx and a mobile-phone camera Rx. The optical source is a blue LD at a wavelength of 450 nm and an output power of 20 mW. The optical signal is coupled into the ODF via fiber pig-tailing. Figures 1(b) and 1(c) illustrate the fiber pig-tail connecting the blue LD and the ODF without and with the blue LD switched on, respectively. The ODF (Corning Fibrance) has a light-diffusion length of 1 m, with core, cladding and outer-jacket diameters of 170 µm, 230 µm and 900 µm respectively. The blue LD is electrically driven by a simple driver circuit, which combines a direct-current (DC) bias of 3.7 V and the OOK data generated by an arbitrary waveform generator (AWG, Tektronix AFG3252C). The OCC data packet consists of a fixed 10-bit header and a payload of variable bit length. The AWG can be operated at different driving frequencies fTx to produce OCC data packets at different data rates. The OCC signal is transmitted over different free-space distances and then received by a mobile-phone CMOS camera with a resolution of 1920 × 1080 pixels and a frame rate of 30 fps. Figures 1(d) and 1(e) illustrate photos of the rolling-shutter patterns captured by the mobile-phone camera at 0° and 45° rotation angles respectively. As described above, the CMOS camera is operated in rolling-shutter mode. Bright and dark fringes representing optical signal “ON” and “OFF” are captured in each image frame when the ODF Tx is modulated faster than the frame rate but slower than the pixel-row activation rate.
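The OOK packet framing described above can be sketched as follows. This is a minimal sketch: the 10-bit header pattern shown is a hypothetical placeholder, as the paper states only the header length, not the actual bits.

```python
# Hypothetical 10-bit fixed header; the paper specifies only its length.
HEADER = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]

def build_packet(payload_bits):
    """Prepend the fixed header to an OOK payload of arbitrary bit length."""
    assert all(b in (0, 1) for b in payload_bits), "OOK bits must be 0/1"
    return HEADER + list(payload_bits)

packet = build_packet([1, 0, 1, 1, 0, 1] * 10)  # 60-bit payload
print(len(packet))  # -> 70 (10-bit header + 60-bit payload)
```

At a driving frequency fTx, the AWG clocks these bits onto the LD drive current, so the achievable data rate scales with both fTx and the packet bit length.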


Fig. 1. (a) Experimental setup of the proposed OCC system using the ODF optical Tx and mobile-phone camera Rx. Photos of the fiber pig-tail connecting the blue LD and the ODF (b) without and (c) with the blue LD switched on. Photos of rolling-shutter patterns captured by the mobile-phone camera at (d) 0° and (e) 45° rotation angles.


The architecture of the decoding mechanism, including the training phase and the testing phase, is illustrated in Figs. 2(a) and 2(b) respectively. First, training images are input to the Image-to-Data Preprocessing (IDP) block as illustrated in Fig. 2(c). In the IDP module, the image frames are converted into grayscale values from 0 to 255, representing total darkness and full brightness respectively. Since the ODF Tx is thin and occupies only a few pixels, a noise-reduction scheme is needed. In this process, a 2nd-order polynomial fitting curve is applied to each pixel-row, and the maximum point of each fitting curve is used to represent the grayscale value of that pixel-row. Other techniques [39] may also be used to identify the ODF Tx. After this, the grayscale values representing the ODF Tx in the different pixel-rows are extracted to form a grayscale-value column matrix, which is then used to construct the grayscale-value pattern. Afterwards, the packet payload can be retrieved between two headers in the grayscale-value pattern. Finally, the PPB calculation and re-sampling are performed. The PPB is defined in Eq. (1), where Ppayload is the number of pixels occupied by the whole payload and Bpayload is the bit length of the payload.

$$PPB = \textrm{roundup}\left(\frac{P_{payload}}{B_{payload}}\right)$$
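A minimal sketch of the IDP row extraction and the PPB of Eq. (1), assuming a NumPy grayscale frame. This is our reconstruction of the described procedure, not the authors' code:

```python
import numpy as np

# For each pixel-row, fit a 2nd-order polynomial to the grayscale profile and
# take the fitted curve's maximum as that row's value (the IDP noise-reduction
# step described in the text).
def extract_column_matrix(gray_frame):
    rows, cols = gray_frame.shape
    x = np.arange(cols)
    values = np.empty(rows)
    for r in range(rows):
        coeffs = np.polyfit(x, gray_frame[r].astype(float), 2)  # 2nd-order fit
        values[r] = np.polyval(coeffs, x).max()  # peak of the fitted curve
    return values  # grayscale-value column matrix, one entry per pixel-row

def ppb(p_payload, b_payload):
    """Eq. (1): PPB = roundup(P_payload / B_payload)."""
    return -(-p_payload // b_payload)  # integer ceiling

print(ppb(420, 60))  # a 420-pixel, 60-bit payload -> PPB = 7
```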

The PPB re-sampling ensures that every input to the PPB-NN is an integer number of pixels and has the same length for each logic bit. If the payload length in pixels is not equal to PPB × Bpayload, it is up-sampled to PPB × Bpayload. Figure 3(a) illustrates the operation principle of PPB re-sampling. As discussed above, after the data-packet payload is retrieved from the grayscale-value pattern, the whole grayscale-value pattern is split into several segments, each of length equal to the PPB. The grayscale values in each segment are used as the inputs of the PPB-NN. For example, if each logic bit occupies 7 pixel-rows in the image sensor, the PPB is 7, which also means there are 7 nodes in the input layer of the PPB-NN, as illustrated in Fig. 3(b). The proposed PPB-NN is composed of 6 fully connected layers. The first is the input layer, with a node number equal to the PPB. Then there are 4 fully connected (FC) hidden layers, in which ReLU is the activation function. At the output layer, Softmax is used to obtain the probabilities of logic 0 and 1. The loss function used is the sparse categorical cross-entropy [40], and the Adam optimizer is employed to update the parameters during the training phase. After the PPB-NN model is constructed, the testing phase can be executed. Different image-frame sets are used for the training and testing phases; the numbers of frames used for training, validation and testing are 16, 4 and 200 respectively. Finally, the PPB-NN model is evaluated using bit-error-rate (BER) measurements.
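The PPB-NN structure described above can be sketched as a forward pass. The hidden-layer widths below are our assumption (the paper specifies only the layer count, ReLU, and Softmax), and the random weights are placeholders standing in for trained parameters:

```python
import numpy as np

# Input layer of PPB nodes, 4 fully connected hidden layers with ReLU,
# and a 2-node Softmax output giving P(logic 0) and P(logic 1).
rng = np.random.default_rng(0)
PPB = 7  # e.g. each logic bit occupies 7 pixel-rows
sizes = [PPB, 32, 32, 16, 16, 2]  # hidden widths are our assumption
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(segment):
    """Map one PPB-long grayscale segment to Softmax class probabilities."""
    a = np.asarray(segment, dtype=float)
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.maximum(a @ W + b, 0.0)        # ReLU hidden layers
    logits = a @ weights[-1] + biases[-1]     # output layer
    e = np.exp(logits - logits.max())         # numerically stable Softmax
    return e / e.sum()

probs = forward([200, 210, 205, 198, 190, 202, 208])  # one bright segment
print(probs)  # two probabilities summing to 1
```

In training, the sparse categorical cross-entropy loss and the Adam optimizer mentioned in the text would update `weights` and `biases`; any deep-learning framework's dense layers express the same structure.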


Fig. 2. Architecture of the decoding mechanism, including (a) training phase, (b) testing phase, and (c) Image-to-Data Preprocessing (IDP) module.



Fig. 3. (a) Operation principle of PPB re-sampling. (b) Structure of the proposed PPB-NN model.


3. Results and discussions

Figures 4(a) and 4(b) show the measured BER of the proposed OCC system using the ODF Tx and mobile-phone camera Rx in different scenarios. As shown in Fig. 4(a), the BERs of the traditional ANN and the proposed PPB-NN are compared at different data rates and free-space transmission distances (100 cm and 35 cm). The proposed PPB-NN shows a significant BER improvement over the traditional ANN, since it can provide efficient distortion mitigation under the high inter-symbol interference (ISI) that arises when both the data rate and the transmission distance are high. The proposed PPB-NN method can satisfy the pre-FEC BER at a data rate of 3,300 bit/s over a transmission distance of 35 cm. When the traditional ANN decoding scheme is employed, a data rate of only 1,500 bit/s can be achieved. The PPB-NN can also achieve 2,100 bit/s fulfilling the pre-FEC BER at a 100 cm transmission distance, while the ANN can only achieve 1,500 bit/s. Besides, the BERs around the ODF circumference at a distance of 35 cm are also measured, as shown in Fig. 4(b). Here, only the PPB-NN is utilized. The measured BER is nearly the same around the 360° ODF circumference.


Fig. 4. Measured BER of the proposed OCC system (a) using ANN and proposed PPB-NN at different data rates and distance; (b) using PPB-NN around the ODF Tx circumference.


We then analyze and evaluate the rotation angle of the smart-phone camera with respect to the ODF Tx. The transmission distance is kept at 35 cm. Figures 5(a) and 5(b) show an example in which the ODF Tx is rotated at an angle θ with respect to the width of the mobile-phone screen. Each image frame has a width × height resolution of 1080 × 1920 pixels. As illustrated in Fig. 5(b), assume a bright fringe at 0° occupies x and y pixels in the horizontal and vertical directions respectively. When the ODF Tx is rotated from 0° to θ, the bright fringe of the ODF observed on the screen is elongated: it still occupies x pixels in the horizontal direction, but the number of pixels in the vertical direction increases. As a result, the maximum ODF Tx rotation angle is limited by the resolution of the image-sensor height, not the width. To calculate the maximum ODF Tx rotation angle, Eq. (2), derived from the principle of the rolling-shutter effect, can be obtained, where fTx and tpixel-row-delay are the AWG driving frequency and the row-by-row exposure delay time as discussed in Section 2, and Bpacket is the bit length of the whole packet.

$$PPB \approx \frac{1}{f_{Tx}\,(\textrm{Hz})} \times \frac{1}{t_{\textrm{pixel-row-delay}}\,(\textrm{s})} \times \frac{1}{B_{packet}\,(\textrm{bits})}\;(\textrm{pixels})$$

Based on the analysis in Fig. 5(b), the whole payload length should be less than or equal to Pixelheight/tan θ. Hence, the maximum ODF Tx rotation angle θM can be expressed as in Eq. (3).

$$\theta_M \approx \tan^{-1}\left(\frac{Pixel_{height}\,(\textrm{pixels})}{PPB\,(\textrm{pixels/bit}) \times B_{payload}\,(\textrm{bits})}\right)$$


Fig. 5. (a) Evaluation of the rotation angle of the smart-phone camera with respect to the ODF Tx. (b) Bright and dark fringes observed on the screen are elongated during rotation.


Here, the measured average tpixel-row-delay is 2.0142 × 10−5 s. When the OCC data packet length is 70 bits, the PPBs obtained from Eq. (2) at AWG frequencies of 90, 120 and 150 Hz are 7, 5 and 4 respectively. When Pixelheight is 1920 pixels, the maximum theoretical ODF Tx rotation angles based on Eq. (3) are 75.68°, 79.67° and 81.70° at AWG frequencies of 90, 120 and 150 Hz respectively.
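These numbers can be reproduced from Eqs. (2) and (3). Two details are our inference rather than explicit in the text: Eq. (2) is truncated to an integer (which reproduces the reported PPBs of 7, 5 and 4), and the full 70-bit packet length is used as the bit count in Eq. (3):

```python
import math

T_ROW = 2.0142e-5    # measured row-by-row exposure delay time (s)
B_PACKET = 70        # OCC data packet length (bits)
PIXEL_HEIGHT = 1920  # image-sensor height (pixels)

def ppb_theory(f_tx):
    """Eq. (2): pixels per bit at AWG driving frequency f_tx (Hz)."""
    return int(1.0 / (f_tx * T_ROW * B_PACKET))  # truncation is our inference

def theta_max(f_tx):
    """Eq. (3): maximum ODF Tx rotation angle (degrees)."""
    return math.degrees(math.atan(PIXEL_HEIGHT / (ppb_theory(f_tx) * B_PACKET)))

for f in (90, 120, 150):
    print(f, ppb_theory(f), round(theta_max(f), 2))
# -> 90 7 75.68 / 120 5 79.67 / 150 4 81.7
```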

The experimental evaluation of the ODF Tx rotation is performed as shown in Fig. 6. When the mobile-phone rotation angle is increased, the observed rolling-shutter pattern on the mobile-phone screen is tilted, as illustrated in the photo of Fig. 1(e). When the OCC data packet length is 70 bits, we can observe in Fig. 6 that at AWG frequencies of 90 and 120 Hz, the maximum experimental ODF Tx rotation angle satisfying the pre-FEC requirement is 70°. These values agree with our theoretically obtained angles. When the AWG frequency is increased to 150 Hz, the maximum experimental ODF Tx rotation angle satisfying the pre-FEC requirement increases to 80°, which agrees with the theoretically obtained angle discussed above. However, the BER performance is slightly degraded compared with the cases of 90 and 120 Hz, because the PPB is reduced to 4 at 150 Hz. It is worth noting that, as the ODF is bendable, a higher rotation angle may be achieved by bending the ODF optical Tx.


Fig. 6. Experimental BER at different ODF optical Tx rotation angles.


4. Conclusion

We proposed and demonstrated an OCC system utilizing a blue-LD coupled ODF Tx and a rolling-shutter image sensor Rx. We also proposed and experimentally demonstrated the PPB-NN to decode the rolling-shutter pattern emitted by the very thin ODF Tx, and discussed the proposed PPB-NN algorithm. The proposed PPB-NN method can satisfy the pre-FEC BER at a data rate of 3,300 bit/s over a transmission distance of 35 cm, and can achieve 2,100 bit/s fulfilling the pre-FEC BER at a 100 cm transmission distance. In addition, we experimentally and theoretically evaluated the maximum ODF Tx rotation angle, which was about ±80° at a data rate of 2,100 bit/s. As the ODF Tx is bendable, it is believed that a higher rotation angle could be achieved by bending the ODF optical Tx.

Funding

Ministry of Science and Technology, Taiwan (MOST-109-2221-E-009-155-MY3, MOST-110-2221-E-A49-057-MY3).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. C. W. Chow, C. H. Yeh, Y. Liu, Y. Lai, L. Y. Wei, C. W. Hsu, G. H. Chen, X. L. Liao, and K. H. Lin, “Enabling techniques for optical wireless communication systems,” Proc. OFC2020, paper M2F.1. (Invited).

2. R. Zhang, P. C. Peng, X. Li, S. Liu, Q. Zhou, J. He, Y. W. Chen, S. Shen, S. Yao, and G. K. Chang, “4 × 100-Gb/s PAM-4 FSO transmission based on polarization modulation and direct detection,” IEEE Photonics Technol. Lett. 31(10), 755–758 (2019). [CrossRef]  

3. H. H. Lu, Y. P. Lin, P. Y. Wu, C. Y. Chen, M. C. Chen, and T. W. Jhang, “A multiple-input-multiple-output visible light communication system based on VCSELs and spatial light modulators,” Opt. Express 22(3), 3468–3474 (2014). [CrossRef]  

4. C. H. Chang, C. Y. Li, H. H. Lu, C. Y. Lin, J. H. Chen, Z. W. Wan, and C. J. Cheng, “A 100-Gb/s multiple-input multiple-output visible laser light communication system,” J. Lightwave Technol. 32(24), 4723–4729 (2014). [CrossRef]  

5. Y. C. Chuang, Z. Q. Li, C. W. Hsu, Y. Liu, and C. W. Chow, “Visible light communication and positioning using positioning cells and machine learning algorithms,” Opt. Express 27(11), 16377–16383 (2019). [CrossRef]  

6. C. W. Hsu, S. J. Su, Y. W. Chen, Q. Zhou, Y. Alfadhli, and G. K. Chang, “Real-time demonstration of 5G MMW beamforming and tracking using integrated visible light positioning system,” Proc. OFC2021, paper Tu5E.6.

7. C. Shen, Y. Guo, H. M. Oubei, T. K. Ng, G. Liu, K. H. Park, K. T. Ho, M. S. Alouini, and B. S. Ooi, “20-meter underwater wireless optical communication link with 1.5 Gbps data rate,” Opt. Express 24(22), 25502–25509 (2016). [CrossRef]  

8. H. H. Lu, C. Y. Li, H. H. Lin, W. S. Tsai, C. A. Chu, B. R. Chen, and C. J. Wu, “An 8 m/9.6 Gbps underwater wireless optical communication system,” IEEE Photonics J. 8(5), 1–7 (2016). [CrossRef]  

9. J. Vučić, C. Kottke, S. Nerreter, K. D. Langer, and J. W. Walewski, “513 Mbit/s visible light communications link based on DMT-modulation of a white LED,” J. Lightwave Technol. 28(24), 3512–3518 (2010). [CrossRef]  

10. H. L. Minh, D. O’Brien, G. Faulkner, L. Zeng, K. Lee, D. Jung, Y. J. Oh, and E. T. Won, “100-Mb/s NRZ visible light communications using a post-equalized white LED,” IEEE Photonics Technol. Lett. 21(15), 1063–1065 (2009). [CrossRef]  

11. C. W. Chow, C. H. Yeh, Y. F. Liu, and Y. Liu, “Improved modulation speed of LED visible light communication system integrated to main electricity network,” Electron. Lett. 47(15), 867–868 (2011). [CrossRef]  

12. P. Ji, H. M. Tsai, C. Wang, and F. Liu, “Vehicular visible light communications with LED taillight and rolling shutter camera,” IEEE VTC Spring2014, pp. 1–6.

13. C. W. Chow, R. J. Shiu, Y. C. Liu, C. H. Yeh, X. L. Liao, K. H. Lin, Y. C. Wang, and Y. Y. Chen, “Secure mobile-phone based visible light communications with different noise-ratio light-panel,” IEEE Photonics J. 10(2), 1–6 (2018). [CrossRef]  

14. N. Chi, Y. Zhou, Y. Wei, and F. Hu, “Visible light communication in 6G: advances, challenges, and prospects,” IEEE Veh. Technol. Mag. 15(4), 93–102 (2020). [CrossRef]  

15. C. W. Chow, Y. Liu, C. H. Yeh, Y. H. Chang, Y. S. Lin, K. L. Hsu, X. L. Liao, and K. H. Lin, “Display light panel and rolling shutter image sensor based optical camera communication (OCC) using frame-averaging background removal and neural network,” J. Lightwave Technol. 39(13), 4360–4366 (2021). [CrossRef]  

16. W. Guan, L. Huang, S. Wen, Z. Yan, W. Liang, C. Yang, and Z. Liu, “Robot localization and navigation using visible light positioning and SLAM fusion,” J. Lightwave Technol. 39(22), 7040–7051 (2021). [CrossRef]  

17. L. Huang, S. Wen, Z. Yan, H. Song, S. Su, and W. Guan, “Single LED positioning scheme based on angle sensors in robotics,” Appl. Opt. 60(21), 6275–6287 (2021). [CrossRef]  

18. Z. Yan, W. Guan, S. Wen, L. Huang, and H. Song, “Multirobot cooperative localization based on visible light positioning and odometer,” IEEE Trans. Instrum. Meas. 70, 1–8 (2021). [CrossRef]  

19. R. D. Roberts, “Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom),” Proc. WOCC 2013, 2013.

20. C. W. Chow, R. J. Shiu, Y. C. Liu, Y. Liu, and C. H. Yeh, “Non-flickering 100 m RGB visible light communication transmission based on a CMOS image sensor,” Opt. Express 26(6), 7079–7084 (2018). [CrossRef]  

21. P. Luo, Z. Ghassemlooy, H. Le Minh, X. Tang, and H. M. Tsai, “Undersampled phase shift ON-OFF keying for camera communication,” Proc. WCSP, pp. 1–6, 2014.

22. P. Luo, M. Zhang, Z. Ghassemlooy, H. L. Minh, H. M. Tsai, X. Tang, L. C. Png, and D. Han, “Experimental demonstration of RGB LED-based optical camera communications,” IEEE Photonics J. 7(5), 1–12 (2015). [CrossRef]  

23. C. Danakis, M. Afgani, G. Povey, I. Underwood, and H. Haas, “Using a CMOS camera sensor for visible light communication,” Proc. OWC’12, 1244–1248.

24. C. W. Chow, C. Y. Chen, and S. H. Chen, “Visible light communication using mobile-phone camera with data rate higher than frame rate,” Opt. Express 23(20), 26080–26085 (2015). [CrossRef]  

25. K. Liang, C. W. Chow, and Y. Liu, “RGB visible light communication using mobile-phone camera and multi-input multi-output,” Opt. Express 24(9), 9383–9388 (2016). [CrossRef]  

26. C. W. Chen, C. W. Chow, Y. Liu, and C. H. Yeh, “Efficient demodulation scheme for rolling-shutter-patterning of CMOS image sensor based visible light communications,” Opt. Express 25(20), 24362–24367 (2017). [CrossRef]  

27. Y. C. Chuang, C. W. Chow, Y. Liu, C. H. Yeh, X. L. Liao, K. H. Lin, and Y. Y. Chen, “Using logistic regression classification for mitigating high noise-ratio advisement light-panel in rolling-shutter based visible light communications,” Opt. Express 27(21), 29924–29929 (2019). [CrossRef]  

28. L. Liu, R. Deng, and L. K. Chen, “47-kbit/s RGB-LED-based optical camera communication based on 2D-CNN and XOR-based data loss compensation,” Opt. Express 27(23), 33840–33846 (2019). [CrossRef]  

29. Y. S. Lin, Y. Liu, C. W. Chow, Y. H. Chang, D. C. Lin, S. H. Song, K. L. Hsu, and C. H. Yeh, “Z-score averaging neural network and background content removal for high performance rolling shutter based optical camera communication (OCC),” Proc. OFC, 2021, paper F1A.4.

30. K. L. Hsu, C. W. Chow, Y. Liu, Y. C. Wu, C. Y. Hong, X. L. Liao, K. H. Lin, and Y. Y. Chen, “Rolling-shutter-effect camera-based visible light communication using RGB channel separation and an artificial neural network,” Opt. Express 28(26), 39956–39962 (2020). [CrossRef]  

31. P. Zhang, Q. Wang, Y. Yang, Y. Wang, Y. Sun, W. Xu, J. Luo, and L. Chen, “Enhancing the performance of optical camera communication via accumulative sampling,” Opt. Express 29(12), 19015–19023 (2021). [CrossRef]  

32. H. Chun, A. Gomez, C. Quintana, W. Zhang, G. Faulkner, and D. O’Brien, “A wide-area coverage 35 Gb/s visible light communications link for indoor wireless applications,” Sci. Rep. 9(1), 4952 (2019). [CrossRef]  

33. T. Koonen, J. Oh, K. Mekonnen, Z. Cao, and E. Tangdiongga, “Ultra-high capacity indoor optical wireless communication using 2D-steered pencil beams,” J. Lightwave Technol. 34(20), 4802–4809 (2016). [CrossRef]  

34. A. Riaz and S. Collins, “A wide field of view VLC receiver for smartphones,” Proc. ECOC, 2020, pp. 1–4.

35. S. R. Teli, K. Eollosova, S. Zvanovec, Z. Ghassemlooy, and M. Komanec, “Optical camera communications link using an LED-coupled illuminating optical fiber,” Opt. Lett. 46(11), 2622–2625 (2021). [CrossRef]  

36. S. R. Teli, K. Eollosova, S. Zvanovec, Z. Ghassemlooy, and M. Komanec, “Experimental characterization of fiber optic lighting - optical camera communications,” 2021 IEEE Int. Sym. Personal, Indoor and Mobile Radio Comm. (PIMRC), 2021, pp. 1–5.

37. D. C. Tsai, Y. H. Chang, Y. Liu, C. W. Chow, Y. S. Lin, and C. H. Yeh, “Wide field-of-view (FOV) light-diffusing fiber optical transmitter for rolling shutter based optical camera communication (OCC),” Proc. OFC, 2022. Paper Tu3C.3.

38. Y. S. Lin, C. W. Chow, Y. Liu, Y. H. Chang, K. H. Lin, Y. C. Wang, and Y. Y. Chen, “PAM4 rolling-shutter demodulation using a pixel-per-symbol labeling neural network for optical camera communications,” Opt. Express 29(20), 31680–31688 (2021). [CrossRef]  

39. K. Liang, C. W. Chow, and Y. Liu, “Mobile-phone based visible light communication using region-grow light source tracking for unstable light source,” Opt. Express 24(15), 17505–17510 (2016). [CrossRef]  

40. P. T. de Boer, D. P. Kroese, S. Mannor, and R. Y. Rubinstein, “A tutorial on the cross-entropy method,” Ann. Oper. Res. 134(1), 19–67 (2005). [CrossRef]  
