- Lens Live adds live visual search to the Amazon Lens scanner.
- Integration with Rufus for summaries, suggested questions, and product insights.
- Initial rollout on the Amazon iOS app in the US, with progressive expansion.
- Technology based on Amazon SageMaker and OpenSearch to operate at large scale.
Amazon has begun to activate Lens Live, an AI-powered shopping feature that turns the phone's camera into a real-time product search engine. When you open the Amazon Lens camera in the app, the system instantly begins recognizing objects and displays matches in a sliding carousel, with options to compare quickly without leaving the camera view.
The company positions Lens Live as an extension of its visual search tool, not a replacement for Amazon Lens. Instead of taking a photo and waiting, the live component lets you point at what's in front of you and see matches instantly, which is especially useful for comparing prices in physical stores or finding similar alternatives in the Amazon catalog.
What is Lens Live and how to use it

When you activate the Amazon Lens camera, the live feature starts scanning right away to identify what you see, displaying the most similar items in a results bar at the bottom of the screen. From there, you can swipe through similar options to compare features, prices, and variants at a glance.
If you want to focus on something specific, just tap that object within the camera view; Lens Live will focus on that product to refine the search and show more accurate matches. This gesture is useful when there are several elements in the frame and you're only interested in one in particular.
When an option that suits you appears, you can add it directly to your cart with the (+) icon or save it to your wish list by tapping the heart, all without leaving the camera. This immediacy reduces steps and turns visual search into a seamless shopping experience from the first touch.
In addition to the results, the interface presents quick summaries below the carousel and suggested questions to discover what makes each article stand out. It's a quick way to get key details and resolve doubts on the fly before deciding on a specific purchase.
Rufus and AI in the service of informed shopping
Lens Live integrates with Rufus, Amazon's AI shopping assistant, to generate product summaries and suggest conversational queries. This way, customers can get condensed insights, review key aspects, and ask specific questions without leaving the real-time visual experience.
Object identification relies on a detection model that matches what the camera captures against billions of marketplace listings. The goal is to accelerate the leap from "what you see" to "the option you can buy", a field where Amazon competes with tools such as Google Lens and Pinterest Lens, and which it reinforces by prioritizing purchasing actions integrated into the camera view itself.
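To illustrate the general idea, here is a minimal sketch, assuming an embedding-based matching approach, of how a camera frame could be ranked against catalog listings with cosine similarity. Amazon has not published Lens Live's actual model or pipeline, so the function names, vector sizes, and listing IDs below are purely hypothetical.

```python
# Illustrative sketch only: ranking catalog listings against a camera frame
# by comparing embedding vectors with cosine similarity. This is NOT Amazon's
# published method; it stands in for whatever detection/matching model is used.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_listings(frame_embedding: np.ndarray,
                  listing_embeddings: dict[str, np.ndarray],
                  top_k: int = 10) -> list[tuple[str, float]]:
    """Return the top_k listing IDs most similar to the current camera frame."""
    scores = [(listing_id, cosine_similarity(frame_embedding, emb))
              for listing_id, emb in listing_embeddings.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:top_k]

# Example with random vectors standing in for a real vision model's output.
rng = np.random.default_rng(0)
frame = rng.normal(size=512)
catalog = {f"ASIN-{i:04d}": rng.normal(size=512) for i in range(1000)}
print(rank_listings(frame, catalog, top_k=5))
```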
Beyond Lens Live, the retailer has long been rolling out AI tools to fine-tune the experience: generative shopping guides, AI-generated review summaries, personalized suggestions, virtual try-on for fashion, and tools for sellers. All of this creates an ecosystem where AI reduces friction and provides context for every purchasing decision.
Availability, deployment and the technology behind it
The feature is beginning to reach tens of millions of customers in the United States through the Amazon app for iOS, with a rollout that will progressively extend to more users in the country over the coming weeks and months. The company has not specified plans for other markets for now.
On the technical side, Lens Live relies on Amazon SageMaker to deploy machine learning models at scale and runs on the AWS-managed Amazon OpenSearch Service. This foundation enables a real-time experience over massive volumes of images and queries without degrading response times.
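As an illustration of the search side, the sketch below shows how a service could run a nearest-neighbor lookup against an OpenSearch k-NN index of product embeddings using the opensearch-py client. The endpoint, index name, and vector field are hypothetical; Amazon has not described Lens Live's internal setup.

```python
# Hedged sketch: querying an OpenSearch k-NN vector index with the embedding
# of the current camera frame. Endpoint, index and field names are made up
# for illustration; they do not reflect Amazon's actual infrastructure.
from opensearchpy import OpenSearch

client = OpenSearch(
    hosts=[{"host": "search-demo.us-east-1.es.amazonaws.com", "port": 443}],
    use_ssl=True,
)

def find_similar_products(frame_embedding: list[float], k: int = 10) -> list[dict]:
    """Return the k catalog documents whose vectors are closest to the frame."""
    body = {
        "size": k,
        "query": {
            "knn": {
                "image_embedding": {   # hypothetical vector field in the index
                    "vector": frame_embedding,
                    "k": k,
                }
            }
        },
    }
    response = client.search(index="product-listings", body=body)  # hypothetical index
    return [hit["_source"] for hit in response["hits"]["hits"]]
```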
The approach also leverages widespread habits, such as comparing prices in physical stores: you point your camera, see matches, and if you like one, you add it to your cart or save it for later. This way, Amazon seeks to make the transition between the real world and its catalog as seamless as possible.
For those who already know Amazon Lens, the new feature is the “live layer”: instead of isolated captures, the camera remains active and delivers instant results. It's a natural evolution that combines visual search with purchasing actions and AI-generated context in the same flow.
Powered by AI and deeper app integration, Lens Live aims to reduce the steps from discovery to purchase, reinforcing the trend toward more conversational visual search, with smart assistance and immediate options to complete the transaction when the user decides.