footballdaniel/gazeclassify

# GazeClassify

PyPI package to algorithmically annotate eye-tracking data. Recommended Python version: 3.7



## What is GazeClassify?

GazeClassify provides automated, standardized annotation of eye-tracking data. Anyone can analyze gaze data online with fewer than 10 lines of code.

The exported csv contains, for each video frame, the distance from the gaze point (red circle) to human joints (left image) and human shapes (right image).

| frame number | classifier name | gaze_distance [pixel] | person_id | joint name |
|---|---|---|---|---|
| 0 | Human_Joints | 79 | 0 | Neck |
| ... | ... | ... | ... | ... |
| 0 | Human_Shape | 0 | None | None |
| ... | ... | ... | ... | ... |
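Once the csv is exported, it can be analyzed with ordinary data tools. The sketch below uses pandas to find, per frame, the joint closest to the gaze point. The column names follow the table above, but the exact csv header and the joint label "Right Wrist" are assumptions for illustration; the rows are constructed inline rather than read from the file GazeClassify writes.

```python
import pandas as pd

# Illustrative rows shaped like the exported csv above
# (in practice you would read the exported file with pd.read_csv(...))
df = pd.DataFrame({
    "frame_number": [0, 0, 0],
    "classifier_name": ["Human_Joints", "Human_Joints", "Human_Shape"],
    "gaze_distance_pixel": [79.0, 35.0, 0.0],
    "person_id": [0, 0, None],
    "joint_name": ["Neck", "Right Wrist", None],  # "Right Wrist" is a hypothetical label
})

# For each frame, keep the joint row with the smallest gaze distance
joints = df[df["classifier_name"] == "Human_Joints"]
nearest = joints.loc[joints.groupby("frame_number")["gaze_distance_pixel"].idxmin()]

# A gaze distance of 0 for Human_Shape means the gaze fell on a person's silhouette
on_person = df[(df["classifier_name"] == "Human_Shape") & (df["gaze_distance_pixel"] == 0)]
```

This kind of post-processing is deliberately left to the user: the package only standardizes the annotation step.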

## Run on example data

```python
from gazeclassify import Analysis, PupilLoader, SemanticSegmentation, InstanceSegmentation
from gazeclassify import example_trial

analysis = Analysis()

# Load the bundled example recording
PupilLoader(analysis).from_recordings_folder(example_trial())

# Segment human shapes and detect joint positions in each frame
SemanticSegmentation(analysis).classify("Human_Shape")
InstanceSegmentation(analysis).classify("Human_Joints")

analysis.save_to_csv()
```

## Run on your own data

Capture eye-tracking data with a Pupil eye tracker, then export the recording with the Pupil software. The export folder contains the world video and the gaze timestamps. Finally, let GazeClassify analyze the exported data:

```python
from gazeclassify import Analysis, PupilLoader, SemanticSegmentation, InstanceSegmentation

analysis = Analysis()

# Point the loader at your Pupil export folder
PupilLoader(analysis).from_recordings_folder("path/to/your/folder_with_exported_data/")

SemanticSegmentation(analysis).classify("Human_Shape")
InstanceSegmentation(analysis).classify("Human_Joints")

analysis.save_to_csv()
```