geekhack

geekhack Community => Other Geeky Stuff => Topic started by: Playtrumpet on Thu, 26 July 2012, 16:57:10

Title: Tobii Eye Control
Post by: Playtrumpet on Thu, 26 July 2012, 16:57:10

By my understanding, the reason we don't all use our eyes to move our cursors is that the technology has yet to be perfected and/or made affordable. But besides that, how many people really think about eye control? With a high enough refresh rate, imagine actually playing games with this. Look at a pixel and your cursor's there. Forget about clicking with your eyes; leave that to the physically impaired. Everyone else can click with their mice as usual. But instead of relying on arm/wrist/hand speed during high-speed gaming, you could rely on just your eyes.

The fact is that while using our cursors normally, we often look at one part of the screen (or read something) while moving our cursors somewhere else, because peripheral vision lets us multitask. I'm guessing eye control might hinder that.

What are your thoughts?
Title: Re: Tobii Eye Control
Post by: Djuzuh on Thu, 26 July 2012, 17:00:58
Your eyes do not move as you think they move.

For example, they move in rapid jumps (called saccades) rather than smoothly; your brain just assembles everything back into a stable image.

So it would be pretty weird.
Title: Re: Tobii Eye Control
Post by: alaricljs on Thu, 26 July 2012, 17:05:56
But if they nailed the system down so that where you were looking was definitely where the "pointer" was, then you could make the pointer invisible so you don't see it jittering.  The only time it matters where you are looking is when you click.  Or, if you want mouse-over/hover effects, wherever your gaze dwells for more than N ms.
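The dwell idea could be sketched roughly like this (a hypothetical illustration, not any shipping tracker's API; the detector class, threshold names, and values are all made up):

```python
# Hypothetical sketch of dwell-based hover: the pointer is never drawn,
# clicks would use the latest gaze sample, and a hover event fires only
# once the gaze has stayed within a small radius for N ms.
# All names and thresholds here are invented for illustration.

import math
import time

DWELL_MS = 300          # the "N ms" before a hover fires
DWELL_RADIUS_PX = 40    # jitter tolerance around the dwell point

class DwellDetector:
    def __init__(self, dwell_ms=DWELL_MS, radius_px=DWELL_RADIUS_PX):
        self.dwell_ms = dwell_ms
        self.radius_px = radius_px
        self.anchor = None       # (x, y) where the current dwell started
        self.anchor_time = None  # timestamp of that start, in ms
        self.fired = False       # report each dwell only once

    def feed(self, x, y, now_ms=None):
        """Feed one gaze sample; return (x, y) once a dwell completes."""
        if now_ms is None:
            now_ms = time.monotonic() * 1000
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            # Gaze jumped away: start a new potential dwell here.
            self.anchor = (x, y)
            self.anchor_time = now_ms
            self.fired = False
            return None
        if not self.fired and now_ms - self.anchor_time >= self.dwell_ms:
            self.fired = True
            return self.anchor   # hover target: where the dwell began
        return None
```

You'd feed it raw gaze samples every frame; the saccadic jumps Djuzuh mentions just keep resetting the anchor, and only a genuinely steady gaze ever produces a hover event.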
Title: Re: Tobii Eye Control
Post by: pyro on Thu, 26 July 2012, 17:33:16
I totally agree with playtrumpet. Moving your mouse to make the cursor appear at the spot you're looking at is simply overhead, or redundant if you will.

Also, you're basically blind while your eyes are moving (you can't see your own eyes move in a mirror, for example), so you wouldn't notice the jumping. And as alaricljs said, there's no need for an arrow-like cursor with gaze tracking anyway. There's also no real need to hit exactly the right pixel in most user scenarios.

This seems like quite a popular subject already, and I hope independent devs will pick it up as soon as the "office" Kinect comes out. It will certainly have the capabilities for this.

PS: I also wouldn't care about the refresh rate; just add a push-to-move button and it can eat all the CPU time it wants.
Title: Re: Tobii Eye Control
Post by: Findecanor on Thu, 26 July 2012, 21:27:36
I have tried it at Tobii headquarters outside Stockholm. I was looking for work there, but was rejected because I did not show enough excitement about it in the end ...
To be honest, part of the reason that I had applied was to get a chance to try their tech demos and learn from them first hand about the tech and its possibilities. I'm a UI nerd ...

The software I used was two simple games, and then an integration with the Windows UI. All are shown to all visitors in the lobby and/or have been shown at trade shows also, so I am not disclosing any privileged information.
The first game was a version of Asteroids, where the "ship" was stationary and I gazed at rocks to shoot them. That worked pretty well, apparently, except that afterwards I was struck by a feeling of "What the hell did I just do?". I did not feel that I was in control of the game.
The second game was about using my eyes as a searchlight to locate items, but I was given no instructions before I started, so I failed at it completely.

The integration with the Windows UI:
The biggest feature was mouse acceleration: look at a place on screen, move the mouse in the direction of that place, and the mouse pointer jumps to your gaze point, unless it was already close.
There was also a gaze-navigated Start menu replacement, activated by looking at the screen border. It was really weird to use, and I can imagine that applications like it would cause eye strain in the long run. Part of why it was weird was that the system's accuracy is not high enough: I sometimes had to look at a spot right next to the menu item I wanted. The accuracy is very far from that of a mouse, or even a touchscreen.
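The gaze-assisted mouse acceleration described above could work roughly like this. This is a guess at the logic, not Tobii's actual algorithm, and every threshold is invented: if the mouse starts moving roughly toward the gaze point while the pointer is still far from it, warp the pointer there; otherwise apply the mouse delta normally.

```python
# A rough guess at the gaze-warp behaviour described above; NOT Tobii's
# actual algorithm. If the mouse delta points roughly toward the gaze
# point and the pointer is still far from it, jump straight to the gaze
# point; otherwise apply normal relative mouse movement.

import math

WARP_MIN_DIST_PX = 200   # don't warp if pointer is already close to gaze
ALIGN_COS = 0.7          # mouse motion must point within ~45 degrees of gaze

def next_pointer(pointer, gaze, mouse_delta):
    """Return the new pointer position given one mouse movement sample."""
    px, py = pointer
    gx, gy = gaze
    dx, dy = mouse_delta
    to_gaze = (gx - px, gy - py)
    dist = math.hypot(*to_gaze)
    speed = math.hypot(dx, dy)
    if dist > WARP_MIN_DIST_PX and speed > 0:
        # Cosine of the angle between the mouse motion and the
        # pointer-to-gaze direction: 1.0 means exactly toward the gaze.
        cos = (dx * to_gaze[0] + dy * to_gaze[1]) / (speed * dist)
        if cos >= ALIGN_COS:
            return gaze            # warp: jump straight to the gaze point
    return (px + dx, py + dy)      # normal relative mouse movement
```

The nice property of this scheme is the one Findecanor hints at: the eyes never control the pointer on their own, they only shortcut a movement the hand has already started, so you stay free to look wherever you want.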

There are weaknesses in the tech. It is camera-based: only one person can use it at a time, and it must be calibrated to that person before use. It is also sensitive to lighting conditions; too much glare throws it off, and eyeglasses cause particular problems.

The biggest reason I don't believe in gaze interaction is that while pointing tools such as mice and fingers are extensions of ourselves, gaze is not. Our brains simply are not used to working that way: our eyes are for monitoring the world around us, not manipulating it.
I don't believe in gaze interaction by itself as an input device to control a computer. Even if it worked flawlessly and people got used to it, it would create invisible barriers around the user. The user would adapt to the machine and become part of it, instead of using the machine as a tool, and I find that wrong on both a conceptual and a moral level. Users must be free to look wherever they want.
I think some kind of eye tracking could be used together with other input devices to improve their accuracy, but the way the tech works would also have to change.