I will use the IR camera in the Kinect and enhance the retroreflections from the user's eyes (or rather, pupils) by adding IR LEDs. Ideally, I would like to track the whole eye area and look for the pupil within that rectangle. The relative position of the pupil tells me where the user is looking: roughly up, down, left, or right. So, for now, this setup will not result in a very accurate tracking tool.
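A minimal sketch of that idea, assuming the eye rectangle and the pupil position are already tracked (the function name, dead-zone threshold, and direction labels are my own illustration, not code from this project):

```cpp
#include <cmath>
#include <string>

// Given the pupil centre inside the tracked eye rectangle, classify the
// rough gaze direction by comparing the pupil's offset from the
// rectangle's centre. A small dead zone near the centre counts as
// looking straight ahead.
struct EyeRect { int x, y, w, h; };

std::string roughGaze(const EyeRect& eye, int pupilX, int pupilY) {
    // Offset of the pupil from the rectangle's centre,
    // normalised to roughly [-1, 1] on each axis.
    double dx = (pupilX - (eye.x + eye.w / 2.0)) / (eye.w / 2.0);
    double dy = (pupilY - (eye.y + eye.h / 2.0)) / (eye.h / 2.0);
    const double deadZone = 0.25;  // assumed threshold, would need tuning
    if (std::abs(dx) < deadZone && std::abs(dy) < deadZone) return "centre";
    if (std::abs(dx) >= std::abs(dy)) return dx < 0 ? "left" : "right";
    return dy < 0 ? "up" : "down";
}
```

With only four coarse directions plus a centre state, this stays robust even when the pupil blob jitters by a few pixels.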
This setup is inspired by Golan Levin's installation "Eyeshine".
The purpose of this project is…
1. …to use it in my Major Studio final project: an interactive installation that displays multiple news videos on multiple devices. Depending on where the user looks, the focused video is sharp and loud, while the other videos become blurry and their volume decreases, though they are never muted, creating an underlying noise. It is a comment on how we perceive news and a statement that we can only focus on so many things. Eye tracking is important for this art installation's experience because I want the technology to be as "invisible" as possible, so the message is clearer and is revealed more as a surprise.
2. …to create a relatively simple eye tracking tool similar to the Eyewriter by Zach Lieberman, Theo Watson, Chris Sugrue and others, obviously less accurate, but also less complicated to construct. Looking into the Eyewriter, the effort to build it is quite big (the number of pieces and the time required), and the code does not work with current software (Mac OS 10.10, oF 0.8.4). Hence, if I get this tool to work, I could share it with the openFrameworks community, helping others who want to use it for their art.
In the end I hacked the PSEye and swapped its lens for a tele lens (the field of view is much narrower in comparison). I added 4 IR LEDs to enhance the contrast in the eye (bringing out the pupil). I think that during the final presentation demo the battery was dying, so the IR LEDs were not fully working; hence Connor's pupil couldn't be tracked very well.
This setup is actually quite cheap:
Here is the final result:
Here are some background videos for my Major Studio final, for context on why I wanted to explore eye tracking technology.
Here I used ofxFaceTracker to detect face orientation, and I mapped the values accordingly to control the videos just by looking at them.
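A sketch of how such a mapping could look, assuming a yaw angle is already available from the face tracker (the angle range, blur amount, and background volume here are assumptions for illustration, not values from the installation):

```cpp
#include <algorithm>
#include <vector>

// The yaw angle picks which of N side-by-side videos is "focused".
// The focused video gets full volume and no blur; the others are
// blurred and quieted, but never fully muted, so they form an
// underlying noise.
struct VideoState { double blur; double volume; };

std::vector<VideoState> mapGazeToVideos(double yawDegrees, int numVideos,
                                        double maxYaw = 30.0) {
    // Map yaw in [-maxYaw, maxYaw] onto a video index 0..numVideos-1.
    double t = (yawDegrees + maxYaw) / (2.0 * maxYaw);
    t = std::min(1.0, std::max(0.0, t));
    int focused = std::min(numVideos - 1, static_cast<int>(t * numVideos));

    std::vector<VideoState> states(numVideos);
    for (int i = 0; i < numVideos; ++i) {
        if (i == focused) states[i] = {0.0, 1.0};  // sharp and loud
        else              states[i] = {8.0, 0.2};  // blurry, quiet, not muted
    }
    return states;
}
```

In an openFrameworks app, the blur value would feed a shader and the volume would go to each video player, updated every frame as the head turns.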