I have tried it at Tobii's headquarters outside Stockholm. I was looking for work there, but was rejected because, in the end, I did not show enough enthusiasm for it ...
To be honest, part of the reason I had applied was to get a chance to try their tech demos and learn first-hand about the tech and its possibilities. I'm a UI nerd ...
The software I used consisted of two simple games and an integration with the Windows UI. All of them are shown to visitors in the lobby and/or have been shown at trade shows as well, so I am not disclosing any privileged information.
The first game was a version of Asteroids, where the "ship" was stationary and I gazed at rocks to shoot them. That worked pretty well, apparently, except that afterwards I was struck by a feeling of "What the hell did I just do?" I never felt that I was in control of the game.
The second game was about using my eyes as a searchlight to locate items, but I was given no instructions before I started, so I failed at it completely.
The integration with the Windows UI:
The biggest feature was mouse acceleration: look at a place on the screen, move the mouse toward it, and the pointer jumps to your gaze point, unless it is already close.
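As I understood the behavior, it can be sketched roughly like this. This is a minimal sketch based on my own observation: the threshold value and the "moving toward the gaze point" check are my assumptions, not Tobii's actual logic.

```python
# Hypothetical sketch of gaze-assisted pointer warping.
# WARP_THRESHOLD is an assumed value, not taken from any real product.
WARP_THRESHOLD = 150  # px: if the pointer is already this close to the gaze, don't warp


def update_pointer(pointer, gaze, mouse_delta):
    """Return the new pointer position given the gaze point and mouse motion."""
    px, py = pointer
    gx, gy = gaze
    dx, dy = mouse_delta

    # No mouse motion: the eyes alone never move the pointer.
    if dx == 0 and dy == 0:
        return pointer

    dist = ((gx - px) ** 2 + (gy - py) ** 2) ** 0.5
    # Dot product > 0 means the mouse is moving roughly toward the gaze point.
    toward_gaze = (gx - px) * dx + (gy - py) * dy > 0

    if toward_gaze and dist > WARP_THRESHOLD:
        return (gx, gy)            # warp: jump straight to the gaze point
    return (px + dx, py + dy)      # otherwise: ordinary relative movement
```

So a small flick of the mouse in the right direction covers most of the distance, and fine positioning is still done with the mouse itself once the pointer is near the gaze point.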
There was also a gaze-navigated start menu replacement, activated by looking at the screen border. It was really weird to use, and I can imagine that applications like it would cause eye strain in the long run. Part of why it was weird was that the accuracy of the system is not high enough: I sometimes had to look at a spot right next to the menu item I wanted to activate. The accuracy is very far from that of a mouse, or even a touchscreen.
There are weaknesses in the tech. It is camera-based: only one person can use it at a time, and it must be calibrated to that person before use. It is also sensitive to lighting conditions; too much glare throws it off, and eyeglasses cause particular problems.
The biggest reason why I don't believe in gaze interaction is that while our pointing tools, such as a mouse or our fingers, are extensions of ourselves, gaze is not. Our brains simply are not used to working that way. Our eyes are for monitoring the world around us, not manipulating it.
I don't believe in gaze interaction by itself as an input method for controlling a computer. Even if it worked flawlessly and people got used to it, it would create invisible barriers around the user. The user would adapt to the machine and become part of it, instead of using the machine as a tool, and I find that wrong on both a conceptual and a moral level. Users must be free to look wherever they want.
I think that some kind of eye tracking could be used together with other input devices to improve their accuracy, but the way the tech works would also have to change.