Amazon appears to be exploring a new AI-enhanced security solution driven by behavioral analysis that might be better suited to a dystopian future out of a George Orwell novel, according to new intellectual property documentation reviewed by Android Headlines. The firm's latest invention is described as encompassing a video searching device, systems, and methods to determine whether behavior fits user-defined criteria across multiple video clips. That analysis would be conducted using machine vision and artificial intelligence, with comparative methods not dissimilar to those found in other technologies in those categories. 'Search' terms in the resulting behavioral search engine would effectively be scanned for in real time, with a focus on the subject's path of movement, allowing secondary devices that normally work independently to be incorporated as well. For example, a series of smart motion-detection lights might determine that a subject is moving toward a monitored area where nobody is currently supposed to be.
More complex iterations could incorporate cameras and audio too, with clips being sent to the owner if movements match certain behavioral patterns. Taking things a step further still, Amazon's patent describes similar mechanics being used in a number of other scenarios, extending from front-door security using Ring-branded cameras or doorbells to floodlighting and wide-area monitoring. In fact, use cases range from a home or business's security-related devices working together to analyze behavior to an entire neighborhood of security devices all working in tandem to the same effect. In each scenario, the system in question could be controlled either onsite or offsite, and similar methods could be used to search clips or sensor readings after the fact.
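The patent itself does not disclose implementation details, but the mechanism described above — independent devices pooling their observations so a subject's path of movement can be matched against user-defined criteria — can be loosely sketched as follows. All names, zones, and the matching rule here are hypothetical illustrations, not anything taken from Amazon's filing:

```python
from dataclasses import dataclass

# Hypothetical event reported by one of several independent devices
# (a motion-sensing light, camera, or doorbell) that the patent
# describes working in tandem.
@dataclass
class MotionEvent:
    device_id: str
    zone: str         # the area the reporting device monitors
    timestamp: float  # seconds since monitoring began

def path_of_movement(events):
    """Merge events from all devices into one time-ordered movement path."""
    return [e.zone for e in sorted(events, key=lambda e: e.timestamp)]

def matches_search(events, restricted_zones):
    """Flag a subject whose path enters a user-defined restricted zone."""
    return any(zone in restricted_zones for zone in path_of_movement(events))

# A subject crossing the property, seen by three otherwise-independent devices:
events = [
    MotionEvent("porch-light", "driveway", 1.0),
    MotionEvent("side-camera", "side-yard", 4.5),
    MotionEvent("shed-sensor", "back-shed", 9.2),
]
flagged = matches_search(events, restricted_zones={"back-shed"})
```

In a real system the "search term" would presumably be a learned behavioral model rather than a simple zone check, but the aggregation step — fusing readings from devices that normally operate alone — is the part the patent emphasizes.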
Background: Amazon’s latest patented invention takes a clear step away from its usual consumer AI business, which centers on convenience-driven home automation and other Alexa features. Those generally revolve around a homeowner's use of the technology to interact at will with their security or smart home implementations. A user might connect Alexa to their outdoor smart lights and Ring doorbell to recognize who is at the door, receive notifications based on facial recognition, and initiate contact. The same camera might be used to catch a package thief if the homeowner isn't already participating in Amazon's Key program -- a package delivery service and hardware combination that allows parcel carriers to leave deliveries just inside the door. Similar remote control and functionality can be extended to a number of in-home or in-business security and smart home accessories, from televisions and speakers to appliances, and in tandem with Alexa Routines.
The newly proposed patent is much closer to a more controversial project called out earlier this year in Orlando, where local police were using Amazon Rekognition with at least three IRIS surveillance cameras located in the downtown district. Rekognition is, in short, an AI-powered facial recognition tool able to identify, track, and analyze up to 100 people in a single image. In Orlando, the police chief initially stated that none of the cameras in the city were linked to the software, before rescinding that remark and admitting that footage had been viewed by at least seven volunteer officers. The unannounced AI-powered surveillance was justified on the basis that no expectation of public privacy actually exists and that the department had ensured Amazon itself only had limited access to the camera feeds and footage.
Impact: Documents and images associated with the Amazon AI behavioral analysis patent primarily point to its use in at-home scenarios, but it goes quite a bit further than Rekognition in the above-mentioned city-spying scenario. The concept of a neighborhood-wide interconnected security system does have its own appeal where ongoing criminal activity is a problem, or as a deterrent to those behaviors. At the same time, the technology raises further questions in light of security breaches and vulnerabilities, as well as the probability that such a system would be abused. Concerns would most likely center on control of access to the footage, something that could feasibly be addressed with a blockchain-based networking solution or other features. Secondary to that are questions about who maintains or controls the definition of suspicious or abnormal behavior, and whether or not those rules can be changed.
Analyzing behavior adds a completely new layer of complexity that the retail shipper and global tech giant is going to have to account for. That's especially true given that AI systems have previously learned bigoted behavior, exhibiting biases based on race and other factors such as gender. That raises further concerns about whether the machine learning inherent in the invention could lend itself to discrimination, either deliberately or by accidentally mimicking the unrecognized or subconscious prejudices of the real humans whose decisions trained it. Setting aside the problems derived from relatively straightforward spying and privacy concerns, the invention could make any given user's smart home environment much more secure. But there is a plethora of questions that Amazon will need to both solve, and convince the general populace it has solved, before the newly described methods and tools can be used beyond individual homes and businesses.