FIRST WEEK

June 3, 2013 - June 7, 2013

This is the first week of my summer at Texas A&M. I am glad to be part of this program and I plan to get the best out of it. During the summer program, I will be working at the Perception Sensing and Instrumentation (PSI) laboratory at Texas A&M University under Dr. Ricardo Gutierrez-Osuna and the supervision of the Ph.D. student Avinash Parnandi. The PSI laboratory conducts research on stress measurement using mobile phones. The research project that I will be working on during the summer is called “Estimating respiratory rate from heart rate using a cellphone camera”. The purpose of the study is to establish the feasibility of extracting respiration rate from the heart rate (HR) measured by placing the index finger over a cellphone camera.

The proposed system will reduce the intrusiveness of current sensors and could potentially help in future studies. I am interested in this project because I have always been fascinated by human physiology. Previously I worked on Brain Computer Interfaces, specifically with electroencephalography (EEG). By working at the PSI laboratory I plan to extend this interest further.

At the end of this research program I expect to have improved my research, technical, and problem-solving skills. While conducting the studies I will be analyzing data using Matlab and programming an application for smartphones using the Android SDK. I do not have previous experience working with either of these tools, but during the study I will be learning how to use them in order to accomplish the research goal. Hopefully, this research study will lead to a publication.

Also, I expect to build a new network of friends and mentors so that we can potentially collaborate in future studies. During this program I will gain knowledge and skills that will help me prepare for my career goals, one of which is to go to graduate school.

During this week I have been taking Matlab tutorials and conducting a literature review with the purpose of finding previous methods and techniques for extracting HR from a cellphone camera. This way I can have a baseline and a better understanding of how the HR is obtained. Also, I started working on my research plan and my research plan timeline. Once I have these two documents done, I will have a clear, organized, and sequential plan for the objectives that I want to accomplish during the ten-week program.

SECOND WEEK

June 10, 2013 - June 14, 2013

During this week I finished my research plan and the research plan timeline. From now on, I am going to start implementing some techniques and algorithms based on previous studies, with the purpose of extracting respiration rate from heart rate.

One of the issues that I have is that determining heart rate involves a lot of signal analysis. Unfortunately, I don’t have any previous experience with signal processing. However, I have been taking some tutorials online in order to understand more about signals. If I have any questions, I can always ask the Ph.D. student.

On Wednesday, I recorded a set of data in a controlled environment, using the cellphone camera and a commercial sensor (the BioHarness). I am using the BioHarness sensor in order to validate the data recorded from the cellphone’s camera. I started doing some trial-and-error work in Matlab with the purpose of understanding the data gathered from the cellphone and the BioHarness sensor.

Every Monday there is a meeting at the laboratory with Dr. Gutierrez to discuss accomplishments from the previous week and the goals for the current week. In addition, every Friday there is a presentation at the lab; this week I was chosen to present.

On Friday I gave the laboratory presentation, which was based on my research plan. The purpose of this presentation was to expose the Ph.D. students to the work that I am going to be conducting during the next weeks. At the end of the presentation I received some comments and observations that will potentially benefit and improve the way I conduct the studies.

THIRD WEEK

June 17, 2013 - June 21, 2013

In order to obtain heart rate and respiration rate using the cell phone camera, a subject has to place the index finger over the cell phone camera with the flashlight turned on. By computing the amount of light absorbed by the finger tissue, the phone acquires the photoplethysmographic (PPG) signal. PPG works as follows: during the cardiac cycle, each heartbeat creates a wave of blood that reaches the capillaries at the tip of the finger. When the capillaries are full of blood, they block part of the light that would otherwise pass through; when the blood retracts, more light can pass through the tissue. If these changes are recorded over time, the resulting waveform corresponds to the pulsatile changes in the arterial blood in that tissue, and these changes in arterial blood volume correspond to the heart rate.

This week, I collected three sets of controlled data using a cell phone camera and a commercial sensor (BioHarness) in order to validate heart rate and respiration rate. The data were recorded at three different breathing frequencies, 0.1, 0.2, and 0.16 Hz, which correspond to 6, 12, and 10 breaths per minute. From the recorded video, the green values from every frame were extracted in order to acquire the PPG signal. In this signal, I was able to see two types of peaks: small peaks corresponding to the cardiac pulse, and bigger peaks, shaped by the small ones, corresponding to the breathing frequency. In order to compute HR I need to detect every cardiac peak in the signal, so I used a peak detection algorithm. Once the peaks were found, the time difference between consecutive peaks was computed. This time difference is known as the R-R interval (RRI). From the RRI values (in seconds) the HR was estimated using the following formula: HR = 60/RRI.
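To make the procedure concrete, here is a rough sketch of the peak detection and HR computation in Java (the same language I will use later for the app; my actual offline analysis was done in Matlab). The input is the sequence of average green values sampled at the camera frame rate, and the minimum peak distance is an illustrative parameter, not the exact value I used.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the peak detection and HR estimation described above.
public class HrFromPpg {

    /** Indices of local maxima that are at least minDistance samples apart. */
    static List<Integer> findPeaks(double[] ppg, int minDistance) {
        List<Integer> peaks = new ArrayList<Integer>();
        for (int i = 1; i < ppg.length - 1; i++) {
            boolean isLocalMax = ppg[i] > ppg[i - 1] && ppg[i] >= ppg[i + 1];
            boolean farEnough = peaks.isEmpty() || i - peaks.get(peaks.size() - 1) >= minDistance;
            if (isLocalMax && farEnough) {
                peaks.add(i);
            }
        }
        return peaks;
    }

    /** Average HR in beats per minute from peak indices and the frame rate (frames/s). */
    static double averageHeartRate(List<Integer> peaks, double fps) {
        if (peaks.size() < 2) {
            return 0;                             // not enough peaks to form an R-R interval
        }
        double sumRri = 0;                        // sum of R-R intervals in seconds
        for (int k = 1; k < peaks.size(); k++) {
            sumRri += (peaks.get(k) - peaks.get(k - 1)) / fps;
        }
        double meanRri = sumRri / (peaks.size() - 1);
        return 60.0 / meanRri;                    // HR = 60 / RRI
    }
}
```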

FOURTH WEEK

June 24, 2013 - June 28, 2013

This week, the heart rate extraction from the cell phone camera was validated using a commercial sensor (Zephyr BioHarness), which provides the ECG signal at a sampling rate of 250 Hz. It also provides other physiological parameters such as heart rate, breathing rate, skin temperature, and activity information. For validation, the BioHarness sensor was strapped around the subject’s chest to monitor heart rate and respiration rate while simultaneously recording PPG/cardiac information from the index finger using the cell phone camera.

The correlation coefficient (CC) was computed in order to measure the strength of the agreement between the two signals. The results showed a strong correlation between the measurements from the cell phone camera and those from the commercial sensor: CCs of 0.92, 0.96, and 0.51 for the samples collected at 6 bpm, 10 bpm, and 12 bpm, respectively.
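For reference, the correlation coefficient between two equally long HR series (the cell phone estimate and the BioHarness estimate, resampled to a common rate) can be computed as below. This is a generic Pearson-correlation sketch; in practice I used Matlab for this step.

```java
// Generic Pearson correlation coefficient between two signals of equal length;
// in practice this was computed with Matlab's corrcoef on the resampled HR series.
public class Correlation {
    static double pearson(double[] x, double[] y) {
        int n = x.length;
        double meanX = 0, meanY = 0;
        for (int i = 0; i < n; i++) {
            meanX += x[i];
            meanY += y[i];
        }
        meanX /= n;
        meanY /= n;
        double cov = 0, varX = 0, varY = 0;
        for (int i = 0; i < n; i++) {
            cov += (x[i] - meanX) * (y[i] - meanY);
            varX += (x[i] - meanX) * (x[i] - meanX);
            varY += (y[i] - meanY) * (y[i] - meanY);
        }
        return cov / Math.sqrt(varX * varY);
    }
}
```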

A delay was observed between the signal collected using the cellphone and the BioHarness signal. This is because there is a delay in delivering blood to the capillaries at the tip of the finger due to the distance between the finger and the heart. The commercial sensor belt, on the other hand, is worn around the thoracic cavity, which places it closer to the heart and therefore lets it detect the pulse earlier.

After acquiring HR from the PPG signal, the next step was extracting BR from the HR in the spectral domain. This is possible because respiration modulates the amplitude and frequency of the heart rate signal, similarly to respiratory sinus arrhythmia (RSA). Before the spectral analysis, I had to interpolate the HR signal in order to address the irregular sampling from the cellphone and the fact that R-waves are not equidistantly timed events. After this, the Fast Fourier Transform (FFT) of the HR signal was computed. The FFT plots showed a clear peak at the frequency corresponding to the respective breathing rate.
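A sketch of this step in Java, assuming the instantaneous HR value is stored together with the time of the peak it was computed at: resample onto a uniform grid by linear interpolation, then look for the strongest spectral component in a plausible breathing band (0.05-0.5 Hz here, which is my illustrative choice). The real analysis used Matlab's interpolation and fft functions; the plain DFT below is only to show the idea.

```java
// Sketch of BR extraction from the HR series: resample the irregularly timed HR values
// onto a uniform grid, then find the dominant spectral peak in an assumed breathing band.
// hr[k] is assumed to be the instantaneous HR assigned to peak time peakTimes[k] (seconds).
public class BreathingRate {
    static double estimateHz(double[] peakTimes, double[] hr, double resampleHz) {
        // 1. Linear interpolation onto a uniform time grid.
        double duration = peakTimes[peakTimes.length - 1] - peakTimes[0];
        int n = (int) Math.floor(duration * resampleHz);
        double[] uniform = new double[n];
        for (int i = 0, j = 0; i < n; i++) {
            double t = peakTimes[0] + i / resampleHz;
            while (j < peakTimes.length - 2 && peakTimes[j + 1] < t) {
                j++;
            }
            double w = (t - peakTimes[j]) / (peakTimes[j + 1] - peakTimes[j]);
            uniform[i] = (1 - w) * hr[j] + w * hr[j + 1];
        }
        // 2. Remove the mean so the DC component does not dominate the spectrum.
        double mean = 0;
        for (double v : uniform) {
            mean += v;
        }
        mean /= n;
        // 3. Plain DFT; keep the frequency with the largest magnitude in 0.05-0.5 Hz.
        double bestFreq = 0, bestMag = -1;
        for (int k = 1; k < n / 2; k++) {
            double freq = k * resampleHz / n;
            if (freq < 0.05 || freq > 0.5) {
                continue;
            }
            double re = 0, im = 0;
            for (int i = 0; i < n; i++) {
                double angle = 2 * Math.PI * k * i / n;
                re += (uniform[i] - mean) * Math.cos(angle);
                im -= (uniform[i] - mean) * Math.sin(angle);
            }
            double mag = Math.hypot(re, im);
            if (mag > bestMag) {
                bestMag = mag;
                bestFreq = freq;
            }
        }
        return bestFreq;   // multiply by 60 to get breaths per minute
    }
}
```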

FIFTH WEEK

July 1, 2013 - July 5, 2013

This week I applied the same procedure to the samples collected in uncontrolled settings. The samples were collected as follows: breathing at a regular pace, doing sit-ups, and walking. Of the three uncontrolled sets, the heart rate results from breathing at a regular pace were similar to those obtained from the commercial sensor. However, for the sit-ups and walking, the heart rate data did not correlate closely with the data obtained from the commercial sensor. After applying the FFT to the three data sets, the respiration rate could not be clearly identified at any frequency; there was no strong peak in the HR spectrum corresponding to the breathing rate. These results were expected. They might be due to the abrupt movements between the finger and the camera while the subject was performing the tasks (sit-ups and walking). These movements could introduce interference, which makes the signal noisy. Therefore, when looking for a specific frequency in the spectral domain, it is hard to pick out the component that corresponds to the respiration rate.

I talked to my advisor about the situation. He told me that I can start working on the app implementation. Based on these studies, the app is going to be able to estimate breathing rate and heart rate under controlled settings. I started taking some Android tutorials.

SIXTH WEEK

July 8, 2013 - July 12, 2013

This week I started the app development with the purpose of implementing, in real time, the same method that I used offline in Matlab to estimate heart rate and breathing rate. It seems to be a challenge to implement the same method, because I will have to write my own functions. In order to develop the app I am using an Android smartphone and the Android SDK. I went over some Android tutorials since I don’t have any previous experience in Android development. The positive side is that the SDK is Java based and I am familiar with that programming language. There is an existing open-source HR application. The application measures the heart rate, but the HR measurement is not accurate; the app doesn’t use the correct formula to estimate HR, but it is a good reference.

I began the project by developing a camera app that displays the camera preview and keeps the camera flash turned on while the app is in use. I was using the cell phone that was provided to me in the lab. However, I had some problems with it while I was developing the application. After spending some time trying to figure out why the camera flash wasn’t working on the phone, I looked online for help and realized that the phone’s API level was too old. This means that the cell phone’s built-in functions and functionality are sometimes not compatible with the application that I am developing. From this I can already see that the application is not going to work on old phones. I tried the app on a different Android phone and it worked fine. So, as of this week, the application displays the camera preview.
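A minimal sketch of the kind of preview activity I mean, using the android.hardware.Camera API that was current on Android 2.2 (it has since been deprecated). Class and variable names here are illustrative, not the ones in my app.

```java
// Minimal preview activity that keeps the flash in torch mode so the finger is illuminated.
public class CameraPreviewActivity extends android.app.Activity
        implements android.view.SurfaceHolder.Callback {

    private android.hardware.Camera camera;

    @Override
    protected void onCreate(android.os.Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        android.view.SurfaceView preview = new android.view.SurfaceView(this);
        preview.getHolder().addCallback(this);
        // Required on pre-3.0 devices so the camera can push frames to the surface.
        preview.getHolder().setType(android.view.SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        setContentView(preview);
    }

    @Override
    public void surfaceCreated(android.view.SurfaceHolder holder) {
        camera = android.hardware.Camera.open();
        android.hardware.Camera.Parameters params = camera.getParameters();
        // Keep the LED on continuously while the preview runs.
        params.setFlashMode(android.hardware.Camera.Parameters.FLASH_MODE_TORCH);
        camera.setParameters(params);
        try {
            camera.setPreviewDisplay(holder);
        } catch (java.io.IOException e) {
            e.printStackTrace();
        }
        camera.startPreview();
    }

    @Override
    public void surfaceChanged(android.view.SurfaceHolder holder, int format, int w, int h) {
    }

    @Override
    public void surfaceDestroyed(android.view.SurfaceHolder holder) {
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}
```

The app also needs the CAMERA permission (and, on some devices, the FLASHLIGHT permission) declared in the manifest.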

SEVENTH WEEK

July 15, 2013 - July 19, 2013

I am developing the application using an HTC EVO 4G cell phone running Android OS 2.2. I looked online for some real-time image processing applications in order to compute the amount of light absorbed by the finger tissue in every frame. I found a good application (Viewfinder) developed at Stanford University. This application analyzes the amount of red, green, and blue in the camera preview and displays a histogram on the screen. This application was a great find, and I used it to understand more about how the frames need to be processed.

After looking for documentation and applications related to the one that I am trying to develop, I came up with an algorithm to process the frames from the camera in real time and estimate HR. First, I need to create a camera application that captures all the frames from the camera preview and records a timestamp for each frame. Second, I have to acquire the bitmap from every frame. Third, from the bitmap, decode every pixel from YCbCr (NV21) format to red, green, and blue (RGB) components. Fourth, use the green values to obtain the PPG signal from the image. Fifth, the average green intensity over time forms the PPG signal, whose peaks correspond to the cardiac pulse; a peak detection algorithm needs to be implemented in order to find all the cardiac peaks in the signal. Sixth, once the peaks are found, the time difference between consecutive peaks needs to be computed. This time difference is known as the R-R interval (RRI). From the RRI values the HR will be estimated using the same formula as before: HR = 60/RRI.
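A sketch of steps one through four in Java: a preview callback that timestamps every frame and computes the average green value from the raw NV21 data. For simplicity the sketch skips the intermediate bitmap and decodes the green component directly from the NV21 bytes, which amounts to the same result; the YUV-to-RGB constants are the common approximation, and the class name is illustrative.

```java
// Preview callback that produces one (timestamp, mean green) sample per frame.
class GreenAverageCallback implements android.hardware.Camera.PreviewCallback {

    @Override
    public void onPreviewFrame(byte[] data, android.hardware.Camera camera) {
        long timestamp = System.currentTimeMillis();      // step 1: frame timestamp
        android.hardware.Camera.Size size = camera.getParameters().getPreviewSize();
        double meanGreen = averageGreen(data, size.width, size.height);  // steps 2-4
        // ...append (timestamp, meanGreen) to the PPG buffer for peak detection...
    }

    /** Decodes NV21 data pixel by pixel and returns the mean green component. */
    static double averageGreen(byte[] nv21, int width, int height) {
        long sum = 0;
        int frameSize = width * height;                   // Y plane size; VU plane follows
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int y = nv21[row * width + col] & 0xFF;
                int vuIndex = frameSize + (row / 2) * width + (col & ~1);
                int v = (nv21[vuIndex] & 0xFF) - 128;
                int u = (nv21[vuIndex + 1] & 0xFF) - 128;
                int g = (int) (y - 0.344 * u - 0.714 * v); // green from YUV
                if (g < 0) g = 0;
                if (g > 255) g = 255;
                sum += g;
            }
        }
        return (double) sum / frameSize;
    }
}
```

The callback would be registered with camera.setPreviewCallback(), and the resulting (timestamp, mean green) pairs would feed the peak detection and HR formula from the earlier weeks.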

I am still working on the first step. Right now, I am trying to find a way to capture every frame and its timestamp. Once I have that ready, I can move to the next step. Next week is going to be very busy. This week I also started working on writing the paper.

EIGHTH WEEK

July 22, 2013 - July 26, 2013

This week I was able to finish the first, second, and third steps. However, while getting the frames, I realized that the number of frames per second obtained by the application was very low. The cell phone is supposed to record at 25 frames per second, but my application is getting only 4 frames per second after the image processing (bitmap decoding and conversion) and 5-6 without processing anything. This is a problem because the fewer frames that are processed, the less data I have to estimate HR. This situation could potentially affect the accuracy of the estimations.

In order to find a solution, I looked for some references online. I found that this is a very common problem. I decided to check whether the open-source HR monitor app and the Viewfinder app had the same problem, and in fact they did. The frame rate varies from second to second, but usually stays in the same range while the program is running. The Viewfinder was acquiring 4 fps and the HR monitor up to 20 fps. I decided to use the approach that the HR monitor app is using. Now I am getting 9-13 fps, usually around 10. I am going to use those frames to acquire the PPG signal and compute HR.
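I do not want to claim this is exactly what the HR monitor app does internally, but one common way to keep the preview frame rate up on Android is to reuse a preallocated callback buffer and keep the work inside the callback to a minimum, as in this sketch:

```java
// Reuse a preallocated buffer and do as little work as possible inside the callback,
// handing heavy processing off to another thread.
class BufferedPreview {
    static void start(final android.hardware.Camera camera) {
        android.hardware.Camera.Size size = camera.getParameters().getPreviewSize();
        // NV21 uses 12 bits per pixel, so the buffer needs width * height * 3 / 2 bytes.
        byte[] buffer = new byte[size.width * size.height * 3 / 2];
        camera.addCallbackBuffer(buffer);
        camera.setPreviewCallbackWithBuffer(new android.hardware.Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, android.hardware.Camera cam) {
                // ...hand the frame to a worker thread for the green extraction...
                cam.addCallbackBuffer(data);   // return the buffer for the next frame
            }
        });
        camera.startPreview();
    }
}
```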

NINTH WEEK

July 29, 2013 - August 2, 2013

During this week I decided to use a different phone, this time an HTC Nexus One running Android OS 2.3.6. This cell phone gives me a higher number of frames. I finished steps four through six, which means that I finished the first version of the HR & BR Estimator prototype. The app is estimating HR in real time. I collected two sets of data under controlled settings using the app prototype and the BioHarness sensor in order to validate the HR estimations from the app. When I plotted the HR data from both measurements in Matlab, I found that there was not a strong correlation between the HR estimations. This week I also worked on the poster and the final research paper report.

TENTH WEEK

August 5, 2013 - August 9, 2013

This was the last week of the summer program. I had two poster presentations on my project. I felt so good when I presented because a lot of people were very interested in the research.

This week I also worked on the app, but due to time constraints I wasn't able to finish the application completely (estimating both HR and BR). I was, however, able to improve the HR estimations by implementing a filtering algorithm, although the application is still not estimating HR accurately. I recorded two sets of data under controlled settings along with the BioHarness sensor, but the results were not strongly correlated. There could be many factors behind the low correlation of the results from the HR & BR Estimator application. One of them might be the inconsistent frame rate from the camera, which makes it difficult to detect cardiac peaks; furthermore, the peak detection algorithm might not be detecting every peak in the PPG signal.
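As an illustration only (not necessarily the exact filter in the app), one simple option for smoothing the noisy PPG samples before peak detection is a moving average; the window length is left as a parameter.

```java
// Moving-average filter over the raw green samples, one common way to smooth a noisy
// PPG signal before peak detection.
public class Smoothing {
    static double[] movingAverage(double[] signal, int window) {
        double[] smoothed = new double[signal.length];
        double sum = 0;
        for (int i = 0; i < signal.length; i++) {
            sum += signal[i];
            if (i >= window) {
                sum -= signal[i - window];    // drop the sample that left the window
            }
            smoothed[i] = sum / Math.min(i + 1, window);
        }
        return smoothed;
    }
}
```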

This summer has been one of my best ones ever. I had the opportunity to meet awesome people and I had so much fun. I also learned a lot and improved my skills. The summer program has definitely increased my desire to attend grad school by giving me the opportunity to experience what grad school life is like.