Google is looking to bring together printed media, one of the most popular shows on Netflix, and its Lens-branded mobile augmented reality technology with a brand new Stranger Things 3 promotion in The New York Times.
To get in on the promotion, users will need to go out and buy a physical print edition of The New York Times and look for three advertisements for “Starcourt Mall.” The mall is central to events in the latest season of the hit Netflix Original, which is set -- with a fair amount of nostalgia -- in the mid-1980s.
Once the ads are found, users will be able to scan those using Google Lens. That’s Google’s AI-driven machine vision and AR tool found inside of Google Assistant, whether in the Google app or elsewhere, or in some cases as an option in the primary camera on a select handful of Android devices.
Details surrounding exactly what users will see when they scan the ads have not been divulged, so there won't be any spoilers here. The hints Google has provided are vague at best, even for anybody following the series: the search giant alludes only to "Demogorgons. Mindflayers. Shadowy government agencies."
Any season giveaways hidden within the ads themselves are unlikely to stay under wraps for long, though, since the company is also encouraging participants to share what they find online via the hashtag "#StrangerThings3."
How strange could this get?
The new Stranger Things promotion is the latest in a string of similar efforts Google has made to highlight the possibilities of AR since the company began releasing major updates to Google Lens with those features earlier this year.
Given the few hints the search giant has provided, the promotion will likely involve overlaying key characters or creatures from Stranger Things onto the real world. The effect will probably be similar to what the company did with a great white shark at this year's I/O developer conference -- allowing users to view and interact with the predatory fish at a realistic, life-size scale.
The AI tool, when used that way, additionally surfaces information about whatever has been scanned, as shown through Google's partnerships with museums, magazines, and retailers. Museum visitors, for instance, can get key details and the backstory behind a painting by scanning it, while retailers can pair AR representations of their products with pricing, availability, and other details.
While not guaranteed, similar capabilities will probably be provided to those who scan the ads in question.
To get involved
To take part in the promotion, as mentioned above, Android users will need both a print copy of The New York Times from today (July 11) and a compatible phone with Google Lens capabilities. Regardless of whether the feature lives in the camera app or in the Google app via Google Assistant, Google Lens is generally tucked away behind a square-shaped camera icon.
After opening the feature, users will need to frame one of the ads from the paper, and the AI -- as long as the latest update has been installed -- should recognize the interactive content in the image. Tappable dots should then appear over the top of the image, and on-screen instructions will likely guide users through the experience from there.