Amazon launches Lens Live for real-time AI shopping

Amazon introduced Lens Live, an AI-driven real-time upgrade to Amazon Lens that surfaces matching products in a swipeable carousel as you point your phone. Integrated with the Rufus shopping assistant and built on SageMaker and OpenSearch, it targets in-store comparison shopping and expands Amazon’s suite of AI shopping tools.

Published September 2, 2025 at 04:09 PM EDT in Artificial Intelligence (AI)

Amazon is expanding its AI shopping toolkit with Lens Live, a real-time upgrade to Amazon Lens that surfaces matching products as you point your phone at items in the physical world.

Unlike the original Amazon Lens — which analyzes photos, uploaded images, or barcodes — Lens Live adds a live camera view and a swipeable product carousel at the bottom of the screen so shoppers can discover items instantly.

Lens Live pairs with Amazon’s AI shopping assistant Rufus to surface AI-generated product summaries and suggested conversational prompts, letting users do quick research before tapping to add items to cart or save them.

Key interactions in Lens Live include:

  • Tap an on-screen item to focus recognition
  • Swipe through matching products in the carousel
  • Tap + to add an item to cart, or the heart to save it to a wish list

Under the hood, Lens Live runs on Amazon SageMaker for scalable ML model deployment and uses the managed Amazon OpenSearch Service for search operations. That backbone lets Amazon match in-camera images to catalog products quickly and at scale.
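Amazon hasn't published Lens Live's internals, but the stack it names suggests a familiar pattern: embed each camera frame with a model served from a SageMaker endpoint, then run an approximate nearest-neighbor query against an OpenSearch k-NN index of catalog images. The sketch below is a minimal illustration under that assumption; the endpoint name, index name, vector field, and response shape are all hypothetical.

```python
import json

import boto3
from opensearchpy import OpenSearch

# Hypothetical names: the real Lens Live endpoint, index, and schema are not public.
EMBEDDING_ENDPOINT = "lens-image-embedder"   # SageMaker real-time inference endpoint
CATALOG_INDEX = "product-catalog"            # OpenSearch index with a k-NN vector field
VECTOR_FIELD = "image_embedding"

sagemaker = boto3.client("sagemaker-runtime")
search = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])


def embed_frame(jpeg_bytes: bytes) -> list[float]:
    """Send a camera frame to the embedding model and return its vector."""
    resp = sagemaker.invoke_endpoint(
        EndpointName=EMBEDDING_ENDPOINT,
        ContentType="application/x-image",  # assumed content type for the model container
        Body=jpeg_bytes,
    )
    return json.loads(resp["Body"].read())["embedding"]  # assumed response shape


def match_products(jpeg_bytes: bytes, k: int = 10) -> list[dict]:
    """Find the k catalog items whose image vectors are nearest the frame's vector."""
    body = {
        "size": k,
        "query": {"knn": {VECTOR_FIELD: {"vector": embed_frame(jpeg_bytes), "k": k}}},
        "_source": ["asin", "title", "price"],
    }
    hits = search.search(index=CATALOG_INDEX, body=body)["hits"]["hits"]
    return [hit["_source"] for hit in hits]
```

Results from a query like this are what would populate the swipeable carousel; the latency budget for a live camera experience is tight, which is one reason a managed ANN index matters.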

Why Lens Live matters

Lens Live formalizes something many shoppers already do: comparison shopping in brick-and-mortar stores. By lowering friction between seeing a product and checking Amazon’s inventory or price, the feature can shift purchase decisions and intensify online/offline price competition.

For brands and merchants, real-time visual search raises new priorities: ensuring accurate product images and metadata, optimizing catalog coverage for look-alike items, and monitoring how in-store experiences drive conversions or returns.

Lens Live also raises the stakes for privacy, model fairness, and false-match rates. Visual search has to balance helpfulness with transparent explanations and easy ways to correct mismatches; otherwise shoppers can lose trust fast.

Amazon is launching Lens Live on the iOS Shopping app for “tens of millions” of U.S. customers first, with no public timeline yet for global expansion.

Lens Live joins a wave of Amazon AI features — from product summaries and clothing-fit tools to personalized prompts and merchant tools — all designed to shorten the journey from discovery to purchase.

What businesses should do next

Retailers and brands can start by auditing visual discoverability: ensure high-quality, multi-angle images; enrich product metadata with materials, patterns, and dimensions; and test how visual matches map to SKUs. Pricing engines should monitor in-store-to-Amazon parity in near real time to avoid losing sales during comparison shopping.
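As a concrete illustration of that parity monitoring (the data model here is invented for the example; a real pipeline would feed it from shelf-price capture and a live listing feed), a minimal check might flag SKUs where the online price undercuts the shelf price beyond a tolerance:

```python
from dataclasses import dataclass


@dataclass
class PricePoint:
    sku: str
    shelf_price: float   # price observed in-store
    online_price: float  # current online listing price


def parity_alerts(prices: list[PricePoint], tolerance: float = 0.02) -> list[str]:
    """Flag SKUs where the online price undercuts the shelf price by more
    than `tolerance`, expressed as a fraction of the shelf price."""
    alerts = []
    for p in prices:
        gap = (p.shelf_price - p.online_price) / p.shelf_price
        if gap > tolerance:
            alerts.append(f"{p.sku}: online undercuts shelf by {gap:.1%}")
    return alerts


# Example: a $24.99 shelf price against a $21.99 listing trips the 2% threshold.
print(parity_alerts([PricePoint("B0EXAMPLE", 24.99, 21.99)]))
```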

For teams building or integrating similar tech, Lens Live is also a reminder that scalable ML ops and search infrastructure matter. Low latency, robust indexing, and user-facing prompts that explain AI suggestions are core to delivering helpful real-time experiences.

QuarkyByte approaches features like Lens Live by mapping customer journeys, stress-testing model outputs in real-world lighting and angles, and designing operational dashboards that tie visual-search signals back to conversion and inventory metrics. That practical focus helps organizations measure impact and prioritize fixes that move the business needle.

QuarkyByte can help retailers and brands evaluate how real-time visual search affects discovery and pricing strategies. We map AI integration paths, test visual-search UX changes, and benchmark model ops to reduce latency and false matches. Start a tailored assessment to measure conversion and margin impact.