Amazon announced it’s developing smart glasses for its delivery drivers, which include a display for real-time navigation and delivery instructions.
The News
Amazon announced the news in a blog post, partially confirming a recent report from The Information alleging that Amazon is developing smart glasses for both its delivery drivers and consumers.
The report, released in September, maintained that Amazon’s smart glasses for delivery drivers will be bulkier and less sleek than the consumer model. Codenamed ‘Jayhawk’, the delivery-focused smart glasses are expected to roll out as soon as Q2 2026, with an initial production run of 100,000 units.

Amazon says the smart glasses were designed and optimized with input from hundreds of delivery drivers, and include the ability to identify hazards, scan packages, capture proof of delivery, and navigate by serving up turn-by-turn walking directions.
The company hasn’t confirmed whether the glasses’ green monochrome heads-up display is monoscopic or stereoscopic, though images suggest it features a single waveguide in the right lens.
Moreover, the glasses aren’t meant to be used while driving; Amazon says they “automatically activate” when the driver parks their vehicle. Only then does the driver receive instructions, ostensibly to reduce the risk of driver distraction.
In addition to the glasses, the system also features what Amazon calls “a small controller worn in the delivery vest that contains operational controls, a swappable battery ensuring all-day use, and a dedicated emergency button to reach emergency services along their routes if needed.”
Amazon also says the glasses support prescription lenses, as well as transition lenses that automatically adjust to light.
As for the reported consumer version, Amazon may be looking to evolve its current line of ‘Echo Frames’ glasses. First introduced in 2019, Echo Frames support AI voice control, music playback, calls, and Alexa smart home control, although they notably lack any sort of camera or display.
My Take
I think Amazon has a good opportunity to dogfood (i.e., use its own technology) here at a pretty large scale, probably much larger than Meta or Google could manage initially with their first generation of display-equipped smart glasses.
That said, gains made in enterprise smart glasses can be difficult to translate to consumer devices, which necessarily include more functions and apps, and likely require more sophisticated input: all things that can make or break a consumer product.

Amazon’s core strength, though, has generally been less about high-end innovation and more about creating cheap, reliable hardware that feeds into recurring revenue streams: Kindle, Fire TV, Alexa devices, etc. Essentially, if Amazon can’t immediately figure out a way to make consumer smart glasses feed into its existing ecosystems, I wouldn’t expect the company to put its full weight behind the device, at least not initially.
After the 2014 failure of the Fire Phone, Amazon may still be gun-shy about diving head-first into a segment where it has near-zero experience. And I really don’t count Echo Frames, since they’re primarily just Bluetooth headphones with Alexa support baked in. Still, real smart glasses with cameras and displays represent a treasure trove of data the company may not be so keen to pass up.
Using object recognition to peep into your home or otherwise follow you around could allow Amazon to better target personalized suggestions, figure out brand preferences, and even track users as they shop at physical stores. Whatever the case, I bet the company will give it a go, if only to occupy the top slot when you search “smart glasses” on Amazon.