We're constantly seeing Google patents show up with new possibilities for how some Google services could evolve in the future. We recently saw Google get awarded a patent for a new style of lock screen that could give Android the ability to launch apps directly from the lock screen. Now a new patent has popped up, but in this case it's for Google Glass, the company's augmented reality glasses. The patent describes a method for Glass (or future versions of it) to track the user's gaze using a two-camera system: one camera facing forward and a second facing the user's eyes for eye tracking. Used together, they would determine exactly what the user is looking at.
Google already has a patent that would allow Glass to recognize what's in front of the user and offer additional information about it. In the example given in that patent application, you could look at the Chrysler Building in New York and Glass would display a small card with more info, like the name of the architect, the exact location, or the year it was built.
If we put these two patents together, the possibilities are huge. Glass could give you information on exactly what you're looking at, not just what's in front of you, and that information could also change depending on how you're looking at something. If you stare at something for long enough, the information could start scrolling to reveal more. If you're looking somewhere near a landmark, Glass could tell you to look a little to your left, pointing out something important you're missing.
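To make the idea concrete, here's a minimal sketch of how that kind of gaze-driven overlay logic could work. Everything here is invented for illustration: the function name, the thresholds, and the angle-based model are assumptions, not anything disclosed in the patents.

```python
# Hypothetical sketch of gaze-driven overlay behavior.
# All names and thresholds below are assumptions for illustration only;
# neither patent discloses an actual API or algorithm.

DWELL_SECONDS = 2.0   # assumed stare duration before expanding the info card
NEAR_DEGREES = 10.0   # assumed angular radius for a "look over there" hint

def overlay_action(gaze_angle_deg, landmark_angle_deg, dwell_seconds):
    """Decide what the heads-up display should do for one gaze sample.

    gaze_angle_deg     -- horizontal gaze direction from the eye-tracking camera
    landmark_angle_deg -- direction of the nearest known landmark
    dwell_seconds      -- how long the gaze has stayed on the current target
    """
    offset = landmark_angle_deg - gaze_angle_deg
    if abs(offset) < 1.0:                   # looking straight at the landmark
        if dwell_seconds >= DWELL_SECONDS:
            return "scroll extra info"      # a long stare brings up more detail
        return "show info card"             # a brief glance shows the basic card
    if abs(offset) <= NEAR_DEGREES:         # close, but missing it
        side = "left" if offset < 0 else "right"
        return f"hint: look a little to your {side}"
    return "idle"                           # nothing notable in view
```

For example, staring at a landmark for three seconds would return "scroll extra info", while gazing five degrees to its right would return the "look a little to your left" hint. A real implementation would of course work from camera frames and pupil positions rather than precomputed angles.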
If the eye-tracking camera were also able to detect things like pupil dilation, the device could trigger different reactions based on emotion, inferring either that the user is in a certain state or that what they just saw provoked that reaction.
Engadget noted that Google also received a patent for a quantum-dot-based eye display, which could mean that future Glass devices would have an improved display with much better resolution and color, both very important features if Google intends to show more information through Glass.
Patents are never a sure thing. None of this means that Google Glass will have these features when it comes to market at the end of the year; it doesn't even mean that Glass 2.0 will have them. It just means that Google has an idea it likes, one that could be implemented in the future.