Today I continued work on filtering the data by converting the Cartesian (x, y, z) vectors to spherical coordinates (radius, azimuth, elevation). Since I had already normalized the vectors to unit vectors, their radius was always one, so I created graphs of the azimuth (in degrees) versus time and the elevation (in degrees) versus time. Both of these graphs were noisy, which meant the vectors had to be filtered further: once you calculate the angular velocity from the vectors, the noise gets worse because differentiation amplifies it. Tomorrow, I will be working on creating functions that filter the data with a Gaussian filter and a mean filter.
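As a rough sketch of the steps above, here is how the Cartesian-to-spherical conversion and the two planned smoothing filters could look in Python with NumPy/SciPy (the actual work may have been done in MATLAB; the function names, array shapes, and filter parameters here are my own assumptions for illustration):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, uniform_filter1d

def cartesian_to_spherical(xyz):
    """Convert an (N, 3) array of unit gaze vectors to spherical angles.

    Because the vectors are unit vectors the radius is always 1,
    so only azimuth and elevation (both in degrees) are returned.
    """
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    azimuth = np.degrees(np.arctan2(y, x))                     # angle in the x-y plane
    elevation = np.degrees(np.arcsin(np.clip(z, -1.0, 1.0)))  # angle above the x-y plane
    return azimuth, elevation

def gaussian_smooth(signal, sigma=2.0):
    # Gaussian filter: each sample is a weighted average of its
    # neighbours, with weights from a Gaussian kernel of width sigma.
    return gaussian_filter1d(signal, sigma=sigma)

def mean_smooth(signal, window=5):
    # Mean (moving-average) filter: each sample becomes the plain
    # average of `window` neighbouring samples.
    return uniform_filter1d(signal, size=window)
```

Smoothing before differentiating matters here because numerical differentiation amplifies high-frequency noise, which is why the raw azimuth/elevation traces need filtering before angular velocity is computed.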
I also worked on labelling some eye tracking data using software created in MATLAB. To label the data, I would watch the eye tracking video, and whenever I saw a blink, a fixation, or a saccade, I would select the respective option in the sidebar and then, on the graph next to the video, highlight the time interval of that type of gaze movement.
Here is an image of the data labelling software I used. The gaze point is marked with the red cross, and based on the movement of the gaze point and on whether the person's head was also moving, I classified the various types of eye movements.