Google Fiber is already quite fast in the growing list of areas it’s been rolled out to, in some cases managing to scare local players like cable companies and telecoms. Speeds sit around 1 gigabit per second for most customers, ten times faster than the fairly zippy 100 megabit per second connections that have become the status quo these days. Although Google is still working out the kinks of rolling the service out to a wider variety of markets, a job posting on its website points to the possibility that the service could become even faster at some point in the future.
The posting details a position for a photonics engineer who would innovate on current internet technologies to break the current speed barrier and bring customers speeds “beyond Gb/s per user in a cost effective manner.” To be sure, this is not a posting for a novice; while networks faster than a gigabit per second certainly exist, they’re far from cost effective and, for now, are normally hosted only by big-name telecoms, smaller players that devote all their resources to them, or universities. Bringing Fiber customers those speeds on a massive scale without sending the price soaring would be no small accomplishment. Local companies offer customers in some areas speeds as high as 10 gigabits per second, but most of the time, this comes at great cost to both the provider and the user.
It should be stated that Google has made no announcement regarding this, meaning it’s most likely a decidedly far-future affair. The posting is certainly a cage-rattler and is quite likely to attract the kind of talent that could pull it off, but the focus, for the time being, seems to be on getting Fiber out to more customers in more areas. Next-generation network speeds, up into terabits-per-second territory, do not seem to be on the agenda, but it is a distinct possibility that the technology could become feasible for a rollout through Fiber at some point. For now, such technology is mostly used for backend transfers between continents and between data centers, and for other use cases where tons of money can be spent and tons of resources allotted to move tons of data quickly.