Today we went over more outlines at the morning meeting, and Matt gave me the idea of using eye-tracking videos as a way to help explain various eye movements. Then I went to my lab and worked on programming the Gaussian filter to smooth the data. This took longer than I expected because, for some reason, the part of my code that normalized the gaze vectors (turned them into unit vectors) was not working.
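The two pieces described above — normalizing gaze vectors into unit vectors and smoothing them with a Gaussian filter — can be sketched roughly as follows. This is a minimal illustration, not my actual lab code: the function names are my own, and it assumes NumPy and SciPy are available and that the gaze data is an N×3 array of direction vectors sampled over time.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def normalize_gaze(vectors):
    """Turn raw gaze vectors (N x 3) into unit vectors.

    Each row is divided by its own length, so every row ends up
    with norm 1 (hypothetical helper, for illustration only).
    """
    norms = np.linalg.norm(vectors, axis=1, keepdims=True)
    return vectors / norms

def smooth_gaze(vectors, sigma=2.0):
    """Smooth the gaze signal with a 1-D Gaussian filter.

    The filter runs along the time axis (axis=0), smoothing each
    x/y/z component independently; sigma is in units of samples.
    """
    return gaussian_filter1d(vectors, sigma=sigma, axis=0)
```

Running the smoothing before normalization (or the other way around) is a real design choice: smoothing first and normalizing afterwards guarantees the output vectors are still unit length, which matters for the angle calculations later.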
In between my work on the code, I went to the undergraduate research symposium. I attended a talk on the LFEPR project, which is similar to what Anjana is doing. I also saw many posters, including one on the "Computational Power of Quantum Turing Machines", one on "Developing Instrumentation and Analysis Tools for Single-Walled Carbon Nanotubes", and one on "Laser Light Scattering of Ternary Mixtures to Derive the Gibbs Free Energy". I found the posters and talks very interesting, and they gave me ideas on how to describe and present what I have been doing here. After seeing the posters, I went back to the lab and discovered that quitting out of PyCharm and retyping the unit-vector function made my code work again. Once it was fixed, I helped write down observations Titus and I made while watching the eye-tracking data we had collected earlier. I had to leave early today since I had a college interview.
Here is an image that illustrates the smoothing effect of applying different filters to raw data, and what the resulting angular velocity versus time graphs (which I have been creating after passing the raw data through various filters) should look like:
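The angular velocity traces mentioned above come from comparing consecutive gaze directions: the angle between each pair of successive unit vectors, divided by the time between samples, gives degrees per second. A rough sketch of that calculation, again with hypothetical names and assuming the gaze data has already been normalized to unit vectors:

```python
import numpy as np

def angular_velocity(unit_vectors, dt):
    """Angular velocity (deg/s) from consecutive unit gaze vectors.

    For each adjacent pair of rows, the dot product gives the cosine
    of the angle between them; arccos recovers the angle, and dividing
    by the sample interval dt converts it to a rate.
    """
    dots = np.sum(unit_vectors[:-1] * unit_vectors[1:], axis=1)
    # Clamp to [-1, 1] so floating-point noise can't break arccos.
    dots = np.clip(dots, -1.0, 1.0)
    angles_deg = np.degrees(np.arccos(dots))
    return angles_deg / dt
```

For a sanity check, a gaze vector rotating in a plane at a steady one degree per sample, sampled every 10 ms, should produce a flat trace near 100 deg/s.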