Abstract

In this paper, facial images from video sequences are used to obtain heart rate readings. A video camera is used to capture the facial images of eight subjects whose heart rates vary dynamically between 81 and 153 BPM. Principal component analysis (PCA) is used to recover the blood volume pulses (BVP), from which the heart rate is estimated. An important consideration for the accuracy of dynamic heart rate estimation is determining the shortest video duration that achieves it. This duration is chosen as the point at which the six principal components (PCs) are least correlated with one another; the first PC is then used to obtain the heart rate. The results obtained from the proposed method are compared to readings from a Polar heart rate monitor. Experimental results show that the proposed method estimates dynamic heart rate readings with lower computational requirements than the existing method. The mean absolute error and the standard deviation of the absolute errors between experimental and actual readings are 2.18 BPM and 1.71 BPM, respectively.

© 2015 Optical Society of America

1. Introduction

Video-based heart rate estimation builds on the photoplethysmography (PPG) technique [1,2]. PPG is a non-invasive optical technique for measuring the blood volume pulses (BVP) through intensity variations in the reflected light [3].

Poh et al. [4] used independent component analysis (ICA) to recover the BVP signals from the facial images of subjects. They showed that the color components of the facial images, namely red, green and blue (RGB), are dependent on each other. The method was tested on several 60-second videos and accurate results were obtained.

Kumar et al. [5] proposed a model, known as DistancePPG, to improve the signal-to-noise ratio of the camera-based PPG signal by combining the color change signals obtained from different regions of the face using a weighted average. Additionally, they introduced a method to track different regions of the face separately to extract the PPG signals under motion. The method was evaluated on people with diverse skin tones, under various lighting conditions and natural motion scenarios. Kumar et al. concluded that the accuracy of heart rate estimation was significantly improved by the proposed method.

On the other hand, Xu et al. [6] designed a simplified mathematical model to predict human heart rates based on light absorbance by the skin. In this work, the change of blood concentration due to arterial pulsation was defined as a pixel quotient in log space. The method was tested on stationary subjects, and Xu et al. concluded that it gave accurate results.

Previous works did not focus on dynamic heart rate variation, for which short video sequences are essential for real-time implementation. Yu et al. [7] proposed a method that uses a combination of ICA and mutual information to compute the dynamic heart rate variation from short video sequences. For short video sequences, the challenge in using ICA is that the ICA sources may not be sufficiently independent of one another. Hence, mutual information was used to establish the independence of the sources and obtain an accurate reading. However, this method is computationally intensive.

In this paper, we propose to use principal component analysis (PCA), which is less computationally intensive than ICA, to estimate the instantaneous heart rate, which varies dynamically, from short video sequences. PCA is a data reduction technique commonly used in image and video analysis [8,9]. Since the pixel intensities in log space for the facial images are correlated with each other, PCA is used to recover decorrelated principal components. However, for short video sequences, the principal components (PCs) may still be highly correlated, which may yield inaccurate readings. The Pearson correlation coefficient is used to determine the video duration that gives uncorrelated PCs.

Section 2 discusses the relationship between the RGB and YCbCr components and hemoglobin concentration, and how it can be used to determine the heart rate from video sequences. The proposed method, which uses PCA and the Pearson correlation coefficient, is presented in Section 3. The results obtained from the method are presented and discussed in Section 4. Section 5 concludes the study.

2. Relationship between RGB and YCbCr components of facial images

Human skin is composed of different layers [10] and its color is highly related to melanin and hemoglobin concentrations. Xu et al. [6] derived the relationship between the RGB pixel intensities obtained from a facial image and the hemoglobin and melanin concentrations, ch and cm, in the skin layer as:

log P_R = {v_m(R)c_m + v_h(R)c_h + A_0(R)} + log kE(R),  (1)
log P_G = {v_m(G)c_m + v_h(G)c_h + A_0(G)} + log kE(G),  (2)
log P_B = {v_m(B)c_m + v_h(B)c_h + A_0(B)} + log kE(B),  (3)
where R, G, and B represent the red, green and blue components of the image, respectively, while v_h and v_m are the products of the pigment extinction coefficients of hemoglobin and melanin, respectively, and their mean path lengths of photons in the skin layer. A_0 denotes the baseline skin absorbance, k is a constant for the camera gain, and E is the power of the incident light for each color component.

Assuming the video is captured under constant background light and for a short duration, the AC components of the RGB pixel intensities in log space consist mostly of the hemoglobin concentration. Since hemoglobin concentration is related to blood concentration, the frequency of this hemoglobin concentration variation is taken as the BVP, i.e., the heart rate pulse.

As Eqs. (1)-(3) depend only on pigment concentration, baseline skin absorbance and the incident light, we may conclude that the RGB components in log space are correlated with each other. Figure 1 shows the distribution of RGB pixel intensities in log space over a period of time in our experiment, and Table 1 shows the correlation among log PR, log PG, and log PB. The values in Table 1 confirm that the RGB pixel intensities in log space are highly correlated. Therefore, PCA can be used to decorrelate these color components and recover the corresponding uncorrelated PCs.
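As an illustration of this decorrelation step, the following sketch computes the pairwise correlations of three correlated log-space channels and then recovers uncorrelated PCs via an eigendecomposition of the covariance matrix. The synthetic per-frame channel means (a shared pulsatile component plus channel noise) are illustrative assumptions standing in for real facial-image data, not values from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for per-frame mean pixel intensities in log space:
# a shared 2 Hz pulse (120 BPM) plus small channel noise makes the three
# channels highly correlated, mimicking the traces in Fig. 1.
t = np.arange(300) / 50.0                      # 6 s at 50 fps
pulse = np.sin(2 * np.pi * 2.0 * t)
log_rgb = np.column_stack([
    0.8 * pulse + 0.05 * rng.standard_normal(t.size),
    1.0 * pulse + 0.05 * rng.standard_normal(t.size),
    0.6 * pulse + 0.05 * rng.standard_normal(t.size),
])

# Pairwise Pearson correlations of the input channels (high, as in Table 1).
corr_in = np.corrcoef(log_rgb, rowvar=False)

# PCA via eigendecomposition of the sample covariance matrix: projecting
# onto the eigenvectors yields mutually uncorrelated principal components.
x = log_rgb - log_rgb.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(x, rowvar=False))
pcs = x @ eigvecs[:, ::-1]                     # first column = largest variance

corr_out = np.corrcoef(pcs, rowvar=False)
print(np.round(corr_in, 3))
print(np.round(corr_out, 3))
```

The off-diagonal entries of `corr_in` are close to 1, while those of `corr_out` vanish up to numerical precision, which is exactly the property the PCA step exploits.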

Fig. 1 The distribution of log PR, log PG and log PB.

Table 1. Correlation coefficient among log PR, log PG, and log PB

However, for short video durations, the correlation amongst the PCs is still high, which may lead to inaccurate heart rate readings in the subsequent processing. To address this issue, we add three more color components in log space as input features: luminance Y and chrominance Cb and Cr. The YCbCr components are correlated with the RGB components and can be derived from them [11] as follows:

Y = 16 + 65.481R + 128.553G + 24.966B,  (4)
Cb = 128 - 37.797R - 74.203G + 112B,  (5)
Cr = 128 + 112R - 93.786G - 18.214B.  (6)
With these six color components as input features, the corresponding PCs have much lower correlation as compared to the PCs whose input features are RGB components only. Figure 2 illustrates the correlation coefficients for six PCs and three PCs respectively. It shows that the correlation coefficient for the six PCs decreases as the video duration increases while the correlation coefficient for the three PCs is not consistent and relatively higher when compared to the six PCs. Therefore, in this paper, we use PCA to recover the PPG signals from these six color components. After applying the PCA, the first PC that has the largest possible variance is considered as the PPG signal that consists of the hemoglobin concentration. The heart rate can be computed from this PPG signal.
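Eqs. (4)-(6) can be applied directly to the spatially averaged RGB values; a minimal sketch, assuming R, G, B are normalized to [0, 1] so that full white maps to Y = 235, Cb = Cr = 128 under the BT.601 convention of [11]:

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 conversion of Eqs. (4)-(6); r, g, b are floats in [0, 1]."""
    y  =  16.0 +  65.481 * r + 128.553 * g +  24.966 * b
    cb = 128.0 -  37.797 * r -  74.203 * g + 112.000 * b
    cr = 128.0 + 112.000 * r -  93.786 * g -  18.214 * b
    return y, cb, cr
```

For example, `rgb_to_ycbcr(1.0, 1.0, 1.0)` returns (235.0, 128.0, 128.0) and `rgb_to_ycbcr(0.0, 0.0, 0.0)` returns (16.0, 128.0, 128.0), matching the nominal luma range of the standard.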

Fig. 2 The graph of correlation coefficient amongst PCs vs video duration for 3 PCs and 6 PCs respectively.

3. Proposed method

In this section, the proposed model to estimate the dynamic heart rate measurements using PCA is presented. The relationship between the correlation among PCs, video duration and heart rate accuracy is also discussed. As the video duration affects the accuracy of the heart rate reading, a stopping criterion is set to determine the video duration needed for dynamic heart rate estimation. The details are described in this section.

3.1 Relationship between the correlation among PCs and video duration

Ideally, PCA computes PCs that are mutually uncorrelated from the correlated input features. However, for short video durations, the PCs may still exhibit high correlation. Hence, it is important to determine the minimum video duration that gives the least correlation. We use the Pearson correlation coefficient to measure the correlation between any two PCs. For any two PCs x and y, the Pearson correlation coefficient R is given as:

R(x,y) = C(x,y) / √(C(x,x) C(y,y)),  (7)

where C(x,y) is the covariance of PCs x and y, and C(x,x) and C(y,y) are the variances of PCs x and y, respectively. Since six PCs are recovered from the PCA, the averaged correlation coefficient amongst the PCs, Ravg, is computed using

Ravg = (1/C(6,2)) Σ_{m=2}^{6} Σ_{n=1}^{m-1} R(m,n),  (8)

where C(6,2) = 15 is the number of distinct PC pairs.

Figure 3 illustrates the relationship between the averaged correlation coefficient Ravg and the video duration for a particular heart rate reading in the experiment. A power-function curve is fitted to represent Ravg. The value of Ravg decreases significantly at the beginning, but remains almost constant once the video duration exceeds a specific point, varying very little thereafter. Hence, the stopping criterion that determines the video duration is met when the difference in Ravg over 3 consecutive video frames is smaller than 2 × 10−4.
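The averaged correlation coefficient and the stopping test can be sketched as follows. Taking the absolute value of each pairwise coefficient and averaging over the 15 distinct pairs is our reading of Eq. (8), and `stopping_met` is a hypothetical helper implementing the 2 × 10−4 criterion; neither name comes from the paper.

```python
import numpy as np
from itertools import combinations

def r_avg(pcs):
    """Averaged pairwise Pearson correlation of Eq. (8).

    pcs is a (frames, 6) array; each column is one principal component.
    """
    c = np.corrcoef(pcs, rowvar=False)
    pairs = list(combinations(range(pcs.shape[1]), 2))   # C(6,2) = 15 pairs
    return sum(abs(c[m, n]) for m, n in pairs) / len(pairs)

def stopping_met(ravg_history, tol=2e-4):
    """True once R_avg has changed by less than tol over 3 consecutive frames."""
    if len(ravg_history) < 3:
        return False
    last = ravg_history[-3:]
    return max(last) - min(last) < tol
```

In use, one value of `r_avg` would be appended to `ravg_history` per added video frame, and frames keep accumulating until `stopping_met` returns True.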

Fig. 3 The relationship between the averaged correlation coefficient Ravg and the video duration and the respective computed heart rates.

To illustrate how correlated PCs affect the accuracy of the computed heart rate readings, two points X and Y are selected in Fig. 3. The actual heart rate reading at this particular instant is 143 BPM. Point X represents a very short video duration where Ravg does not meet the stopping criterion, while point Y represents a suitable video duration where it does. Point Y gives a more accurate heart rate estimate, i.e., 142.38 BPM, compared to point X, which gives 63.75 BPM. When the stopping criterion is met, the corresponding video duration is used to compute the instantaneous heart rate for that instant.

3.2 Block diagram of proposed model

The block diagram of the proposed model is illustrated in Fig. 4. The face region is identified using the model described in [12], and the region of interest (ROI) is fixed at the area below the eyes and above the upper lip. For each frame, the spatial averages of the RGB and YCbCr components, i.e., µR, µG, µB, µY, µCb, and µCr, are computed. All six color components are projected into log space. Therefore, at any time instant, a set of six input features log PR, log PG, log PB, log PY, log PCb and log PCr is formed. The input features are then detrended using the model developed in [13]. PCA is used to recover six PCs from these six input features, and the set of PCs is bandpass filtered (128-point Hamming window, 0.8-4 Hz).
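The per-frame pipeline of Fig. 4 can be sketched as below. Two implementation details are assumptions on our part: linear detrending stands in for the smoothness-priors method of [13], and the PCA step uses the covariance-eigendecomposition formulation, since the text does not specify either exactly.

```python
import numpy as np
from scipy.signal import detrend, firwin, filtfilt

FS = 50.0  # frames per second, as in the experimental setup

def extract_pcs(channel_means):
    """Sketch of the Fig. 4 pipeline.

    channel_means is a (frames, 6) array of ROI spatial means
    (mu_R, mu_G, mu_B, mu_Y, mu_Cb, mu_Cr), all strictly positive.
    Returns the six bandpass-filtered PCs, largest-variance first.
    """
    x = np.log(channel_means)            # project into log space
    x = detrend(x, axis=0)               # simplified stand-in for [13]

    # PCA: project the centred features onto covariance eigenvectors.
    x = x - x.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(x, rowvar=False))
    pcs = x @ eigvecs[:, ::-1]           # columns sorted by variance, descending

    # 128-point Hamming-window FIR bandpass, 0.8-4 Hz passband.
    taps = firwin(128, [0.8, 4.0], fs=FS, pass_zero=False, window="hamming")
    return filtfilt(taps, [1.0], pcs, axis=0)
```

Note that `filtfilt` with 128 taps needs on the order of 400 frames of input, which is consistent with the method operating on several seconds of video at 50 fps.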

Fig. 4 Flow chart of the proposed method.

The entire process is repeated by increasing the number of previous video frames, until the stopping criterion described in Section 3.1 is met. At this point, the corresponding number of frames is chosen as the video duration needed to compute the instantaneous heart rate reading. The first PC is then chosen as the PPG signal. The corresponding frequency of this PPG signal is considered as the instantaneous heart rate reading for that particular instant.
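The final step, reading the instantaneous heart rate off the first PC, can be sketched as locating the dominant spectral peak within the cardiac band; using an FFT-peak search for "the corresponding frequency" is our assumption, as the paper does not name the spectral estimator.

```python
import numpy as np

def heart_rate_bpm(first_pc, fs=50.0):
    """Estimate the heart rate as the dominant frequency of the first PC,
    searched within the plausible cardiac band (0.8-4 Hz), in BPM."""
    spectrum = np.abs(np.fft.rfft(first_pc - first_pc.mean()))
    freqs = np.fft.rfftfreq(first_pc.size, d=1.0 / fs)
    band = (freqs >= 0.8) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]
```

For instance, a clean 2.3 Hz sinusoid sampled at 50 fps yields 138 BPM, comparable to the upper readings reported in the experiment.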

4. Experimental study

In this section, the experimental setup and the experimental results are discussed and analysed. A comparative study between the proposed method and the method used in [7] is also presented.

4.1 Experimental setup

All experiments were set up under constant office fluorescent light. A Sony camcorder (HDR-PJ260VE) was used for video recording. All videos were recorded and sampled at 50 frames per second. The camcorder was fixed at a distance of about 0.60 m from the subject's face. In the experiment, eight subjects were selected and requested to carry out a cycling activity. In the first stage of the experiment, four subjects were asked to cycle at different speeds for about two minutes and then to stop for one minute; the camcorder captured their facial images during that time. In the second stage, the remaining four subjects were asked to cycle continuously while their facial images were captured for one minute, and an increasing heart rate trend was observed. Throughout the video recordings, all subjects were asked to keep their faces stationary. Sixty heart rate readings (sampled once per second) were computed for every subject.

As a reference, the instantaneous heart rates of each subject obtained from the proposed method were compared to the actual heart rate readings measured with a Polar heart rate monitor (Polar Team2 Pro). The Polar Team2 Pro samples and computes the instantaneous heart rate by measuring at least one ECG signal waveform, as described in the patents [14,15].

4.2 Experimental results and analysis

A total of 480 instantaneous heart rate readings were obtained from this experiment, with the subjects' heart rates varying between 81 BPM and 153 BPM. Table 2 summarizes the computed heart rate readings of all subjects. The highest and lowest mean absolute errors are 2.99 and 1.37 BPM, respectively. Figure 5 shows the scatter plot of all computed and actual heart rate readings; the computed readings are closely correlated with the actual readings, with a correlation coefficient of 0.99. The mean absolute error for all readings is 2.18 BPM and the standard deviation of the absolute errors is 1.71 BPM. The Bland-Altman plot in Fig. 6 shows that only a small number of computed heart rate readings fall outside the 95% limits of agreement.
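The reported error metrics and the Bland-Altman limits of agreement can be reproduced from paired readings as follows; the use of the sample standard deviation and the conventional ±1.96 SD limits are assumptions about the exact formulas used in the paper.

```python
import numpy as np

def agreement_stats(computed, actual):
    """Mean absolute error, SD of absolute errors, and the Bland-Altman
    95% limits of agreement (mean difference +/- 1.96 SD of differences)."""
    computed = np.asarray(computed, dtype=float)
    actual = np.asarray(actual, dtype=float)
    abs_err = np.abs(computed - actual)
    diff = computed - actual
    loa = (diff.mean() - 1.96 * diff.std(ddof=1),
           diff.mean() + 1.96 * diff.std(ddof=1))
    return abs_err.mean(), abs_err.std(ddof=1), loa
```

Feeding in the 480 paired readings from the experiment would yield the 2.18 BPM mean absolute error, the 1.71 BPM standard deviation, and the limits drawn in Fig. 6.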

Table 2. Summary of Heart Rate Readings Results Obtained from Proposed Method

Fig. 5 Comparison of actual heart rate readings and computed heart rate readings.

Fig. 6 Bland-Altman Plot for all computed heart rate readings.

4.3 Comparative study between proposed method and existing method

A comparative study was conducted to compare the accuracy (mean error and standard deviation of error), the video duration needed for heart rate computation, and the computational cost of the method described in this paper and the method suggested in [7]. To estimate the computational cost, both the ICA [7] and PCA computations were repeated 1000 times and the average time taken was recorded. Table 3 summarizes the results. As can be seen in Table 3, the accuracy and video duration of the two methods do not differ much. However, in terms of computational cost, the proposed method is much more efficient than the method in [7]. Additionally, the proposed method directly uses the first PC to compute the heart rate, whereas [7] first inspects all ICA sources and then chooses the source with a high peak in the frequency domain as the one carrying heart rate information. As low computational cost and a small memory footprint are important for an eventual implementation on mobile phones, the proposed method is more practical than the previous method.

Table 3. Comparison of proposed method (using PCA) and method suggested in [7] (using ICA)

5. Conclusion

In this study, it is found that heart rate readings can be obtained by applying PCA to facial images. When the PCs are uncorrelated with each other, an accurate reading can be obtained. An important consideration for dynamic heart rate estimation is the video duration required. Instead of using the RGB components only, three additional components, YCbCr, are used; in doing so, a shorter video duration suffices. To ensure the reliability of the heart rate estimation, the PCs must have the least correlation, and the Pearson correlation coefficient is used to validate this criterion. Experimental results show that this method is able to estimate dynamic heart rates from short video sequences with lower computational requirements when compared to [7].

Acknowledgment

This research is supported by High Impact Research Chancellory Grant UM.C/HIR/MOHE/ENG/42 from the University of Malaya.

References and links

1. A. A. Kamshilin, S. Miridonov, V. Teplov, R. Saarenheimo, and E. Nippolainen, “Photoplethysmographic imaging of high spatial resolution,” Biomed. Opt. Express 2(4), 996–1006 (2011).

2. M. Z. Poh, D. J. McDuff, and R. W. Picard, “Non-contact, automated cardiac pulse measurements using video imaging and blind source separation,” Opt. Express 18(10), 10762–10774 (2010).

3. A. B. Hertzman and C. R. Spealman, “Observations on the finger volume pulse recorded photoelectrically,” Am. J. Physiol. 119(334), 3 (1937).

4. M. Z. Poh, D. J. McDuff, and R. W. Picard, “Advancements in noncontact, multiparameter physiological measurements using a webcam,” IEEE Trans. Biomed. Eng. 58(1), 7–11 (2011).

5. M. Kumar, A. Veeraraghavan, and A. Sabharwal, “DistancePPG: Robust non-contact vital signs monitoring using a camera,” Biomed. Opt. Express 6(5), 1565–1588 (2015).

6. S. Xu, L. Sun, and G. K. Rohde, “Robust efficient estimation of heart rate pulse from video,” Biomed. Opt. Express 5(4), 1124–1135 (2014).

7. Y. P. Yu, P. Raveendran, and C. L. Lim, “Dynamic heart rate measurements from video sequences,” Biomed. Opt. Express 6(7), 2466–2480 (2015).

8. X. Liu, D. Wang, F. Liu, and J. Bai, “Principal component analysis of dynamic fluorescence diffuse optical tomography images,” Opt. Express 18(6), 6300–6314 (2010).

9. J. Vargas, J. A. Quiroga, and T. Belenguer, “Phase-shifting interferometry based on principal component analysis,” Opt. Lett. 36(8), 1326–1328 (2011).

10. A. Krishnaswamy and G. V. Baranoski, “A study on skin optics,” Natural Phenomena Simulation Group, School of Computer Science, University of Waterloo, Canada, Tech. Rep. 1, 1–17 (2004).

11. C. A. Poynton, A Technical Introduction to Digital Video (John Wiley & Sons, 1996).

12. P. Viola and M. Jones, “Rapid object detection using a boosted cascade of simple features,” in Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 2001), Vol. 1, p. I-511.

13. M. P. Tarvainen, P. O. Ranta-Aho, and P. A. Karjalainen, “An advanced detrending method with application to HRV analysis,” IEEE Trans. Biomed. Eng. 49(2), 172–175 (2002).

14. A. Pietila and T. Tammi, U.S. Patent 5,622,180 (1997).

15. I. Heikkila, U.S. Patent 5,840,039 (1998).
