I have learned a number of skills that have aided my research. I describe some of them on this page.
You can find code samples from a few of my projects on GitHub.
I use eye tracking in my research primarily to understand how attention is allocated across space. I also use it to program gaze-contingent events in my experiments. For example, the video to the right shows a display in which moving squares turn red when the participant looks at them. I programmed this task with pygame, pylink, and an EyeLink 2K eye tracker. You can find the code for this task on GitHub.
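The core of a gaze-contingent event like this is a simple hit test run every frame: take the newest gaze sample and check whether it falls on any square. The sketch below shows that logic in isolation; it is a hypothetical illustration, not the actual task code. In the real experiment, pylink would supply the gaze samples from the EyeLink and pygame would draw the squares, and the `Square` class, `slack` parameter, and function names here are my own inventions for the example.

```python
from dataclasses import dataclass

@dataclass
class Square:
    """A moving square on screen; `hit` flips to True once it is fixated."""
    x: float
    y: float
    size: float
    hit: bool = False

def gaze_inside(sq: Square, gx: float, gy: float, slack: float = 10.0) -> bool:
    # `slack` (in pixels) pads the square's bounds to absorb small
    # calibration and tracking error in the gaze estimate.
    return (sq.x - slack <= gx <= sq.x + sq.size + slack and
            sq.y - slack <= gy <= sq.y + sq.size + slack)

def update_hits(squares: list[Square], gaze: tuple[float, float]) -> list[Square]:
    # Called once per frame with the newest gaze sample; any square the
    # gaze lands on is marked hit (and would be drawn red thereafter).
    gx, gy = gaze
    for sq in squares:
        if gaze_inside(sq, gx, gy):
            sq.hit = True
    return squares
```

In the real task, the gaze tuple would come from something like the EyeLink's newest sample each frame, and the drawing loop would color any square with `hit == True` red.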
As a graduate student, I conducted numerous experiments using a NADS MiniSim driving simulator (pictured on the right) and its proprietary software. Most of these experiments were human factors studies contracted by the Florida Department of Transportation, examining issues surrounding cognitive aging, the visibility of road devices, and driver attention.