Google Glass Used To Improve Productivity Of Boeing Workers


Numbers from Index AR Solutions show that augmented reality is set to become a major market in the enterprise sector within the next 15 years or so, growing to as much as $105 billion. With technology like Magic Leap on the horizon, it's not hard to see how that could happen. Another big name in the space is Google Glass. Once thought to be all but dead, the project has come back with a vengeance lately, with the recent reveal of an enterprise-specific version of the hardware. Many enterprise customers, of course, are already using Glass. With the right developer support, just about any task from filing paperwork to working an assembly line simply becomes easier; the information you would otherwise have to turn away from your work or leave your workstation to access is right there. One customer taking advantage of that feature of Glass in a big way is Boeing.

The story of this relationship starts back in 2013, when a higher-up in the company became curious about Glass and snagged a few units of the Explorer Edition to hand out to some team members, asking for ideas about how they could be used. It wasn't long before they found their biggest niche in the company's workflow: wire harnesses, the often huge and always complicated wiring systems that go into aircraft. Workers assembling wire harnesses currently use a laptop with a file pulled up showing the harness they're working on. They have to pull up the harness manually, follow the blueprints very carefully, use CTRL+F to search for individual bits and bobs, and of course, turn away from their work and manipulate the laptop with their hands to see what they need to do. For the workers using a pilot version of Google Glass with a special app called Skylight, this was a thing of the past.

After putting out a call for bids from developers for something that would make technicians' and assembly line workers' lives a bit easier, Boeing ended up with Skylight from the winning bidder. The way the app works is much simpler than the laptop-based system. Once a worker has their Glass unit, they scan in the work order for the harness in front of them. The app pulls up that harness in an AR display on the Glass unit and shows the worker where to start. From there, they can follow the display without ever having to look away from their work. If they end up lost, something goes wrong, or they need to pull up a different harness or a different part of their current harness for reference, they can do so with voice commands. The program is still in pilot status, with no word on exactly when a wider rollout is planned, but things are going very smoothly for the time being, and many workers who use the app and Glass are very excited about it.
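The scan-then-follow loop described above could be sketched as a simple session object: load a harness diagram from a scanned work order, then step through it with voice commands. To be clear, everything below is a hypothetical illustration; none of the class names, methods, or data structures come from Boeing's workflow or the actual Skylight app.

```python
# Hypothetical sketch of the scan -> display -> voice-command loop
# described in the article. All names here are illustrative only,
# not taken from the real Skylight application.

class HarnessSession:
    """Tracks which wire-harness instruction a worker is viewing."""

    def __init__(self, diagrams):
        # diagrams: maps a work-order ID to an ordered list of steps
        self.diagrams = diagrams
        self.current_order = None
        self.step = 0

    def scan_work_order(self, order_id):
        """Simulates scanning a work order to load its harness diagram."""
        if order_id not in self.diagrams:
            raise KeyError(f"unknown work order: {order_id}")
        self.current_order = order_id
        self.step = 0
        return self.diagrams[order_id][0]

    def voice_command(self, command):
        """Handles hands-free navigation between steps."""
        steps = self.diagrams[self.current_order]
        if command == "next step":
            self.step = min(self.step + 1, len(steps) - 1)
        elif command == "previous step":
            self.step = max(self.step - 1, 0)
        return steps[self.step]
```

A usage sketch: `scan_work_order("WO-1")` would return the first instruction for that harness, and saying "next step" advances the display, mirroring the hands-free navigation the article describes.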



Copyright ©2016 Android Headlines. All Rights Reserved.

Senior Staff Writer

Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site. These include machine learning, voice assistants, and AI technology development news in the Android world. Contact him at [email protected]
