Amazon Reportedly Planning Massive Corporate Layoffs
Yes, it’s true that many companies went out of business during the pandemic, and many people lost their jobs. The tech industry, however, actually saw a surge in hiring. But it looks like that hiring spree is coming to an end. According to a report from Reuters, Amazon could be looking to conduct corporate layoffs.
Amazon to conduct corporate layoffs
Based on the report, these corporate layoffs could be the largest the company has conducted since 2023. The Reuters report suggests that Amazon could be looking to lay off up to 30,000 corporate positions — more than the 27,000 job cuts the company made back then.
30,000 positions may sound like a lot, and for Amazon it represents roughly 10% of its corporate workforce. The cuts are expected to span various departments, including human resources, cloud computing, and advertising. The total number of staff reductions hasn’t been finalized, but for now it is estimated at around 30,000.
Is AI coming for our jobs?
Many companies, not just Amazon, conduct layoffs periodically as a way to cut costs. But when there’s a surge in demand, these companies tend to hire again. However, what’s worrying about the recent layoffs is that some of these positions may never be filled by humans again.
Amazon CEO Andy Jassy said back in June that some jobs will be cut because of AI. Jassy said at that time, “As we roll out more Generative AI and agents, it should change the way our work is done. We will need fewer people doing some of the jobs that are being done today, and more people doing other types of jobs.”
It is true that as AI becomes smarter and more capable, jobs that humans used to do may no longer be necessary. If an AI can generate reports and summarize information in minutes rather than the hours it takes a human, it’s clear who a company would rather keep from an efficiency point of view.
But this doesn’t necessarily mean that humans are doomed to be jobless. The rise of AI represents a new industry that we can transition to. As the saying goes, when one door closes, another opens.
The post Amazon Reportedly Planning Massive Corporate Layoffs appeared first on Android Headlines.
GeekWire
Amazon confirms 14,000 corporate job cuts, says push for ‘efficiency gains’ will continue into 2026

Amazon confirmed Tuesday that it is cutting about 14,000 corporate jobs, citing a need to reduce bureaucracy and become more efficient in the new era of artificial intelligence.
In a message to employees, posted on the company’s website, Amazon human resources chief Beth Galetti signaled that the cutbacks are expected to continue into 2026, while indicating that the company will also continue to hire in key strategic areas.
Reuters reported Monday that the number of layoffs could ultimately total as many as 30,000 people, which is still a possibility as the cutbacks continue into next year. At that scale, the overall number of job cuts could eventually be the largest in Amazon’s history, exceeding the 27,000 positions that the company eliminated in 2023 across multiple rounds of layoffs.
“This generation of AI is the most transformative technology we’ve seen since the Internet, and it’s enabling companies to innovate much faster than ever before,” wrote Galetti, senior vice president of People Experience and Technology. Amazon needs “to be organized more leanly, with fewer layers and more ownership, to move as quickly as possible for our customers and business,” she explained.
Amazon’s corporate workforce numbered around 350,000 people in early 2023, the last time the company provided a public number. At that scale, the initial reduction of 14,000 represents about 4% of Amazon’s corporate workforce. However, the number is a much smaller fraction of its overall workforce of 1.55 million people, which includes workers in its warehouses.
Although the cuts are expected to be global, they are likely to hit especially hard in the Seattle region, home to the company’s first headquarters and its largest corporate workforce. The tech hub has already felt the impact of major layoffs by Microsoft and many other companies in recent months.

The cuts come two days before Amazon’s third quarter earnings report. Amazon and other cloud giants have been pouring billions into capital expenses to boost AI capacity. Cutting jobs is one way of showing operating-expense discipline to Wall Street.
In a memo to employees in June, Amazon CEO Andy Jassy wrote that he expected Amazon’s total corporate workforce to get smaller over time as a result of efficiency gains from AI.
Jassy took over as Amazon CEO from founder Jeff Bezos in mid-2021. In recent years he has been pushing to reduce management layers and eliminate bureaucracy inside the company, saying he wants Amazon to operate like the “world’s largest startup.”
Bloomberg News reported this week that Jassy has told colleagues that parts of the company remain “unwieldy” despite the 2023 layoffs and other efforts to streamline operations.
Reuters cited sources saying the magnitude of the cuts is also a result of Amazon’s strict return-to-office policy failing to cause enough employees to quit voluntarily. Amazon brought workers back five days a week earlier this year.
Impacted teams and people will be notified of the layoffs today, Galetti wrote.
Amazon is offering most impacted employees 90 days to find a new role internally, though the timing may vary based on local laws, according to the message. Those who do not find a new position at Amazon or choose to leave will be offered severance pay, outplacement services, health insurance benefits, and other forms of support.
Amazon reportedly set to lay off up to 30,000 corporate employees in massive workforce cut

Amazon is preparing to lay off as many as 30,000 corporate employees in a sweeping workforce reduction intended to reduce expenses and compensate for over-hiring during the pandemic, according to a report from Reuters on Monday.
GeekWire has contacted Amazon for comment.
Layoff notifications will start going out via email on Tuesday, according to Reuters, which cited people familiar with the matter. One employee at Amazon told GeekWire the workforce is on “pins and needles” in anticipation of cuts.
Bloomberg reported that cuts will impact several business units, including logistics, payments, video games, and Amazon Web Services.
Amazon’s corporate workforce numbered around 350,000 in early 2023. It has not provided an updated number since then.
The company’s last significant layoff occurred in 2023 when it cut 27,000 corporate workers in multiple stages. Since then the company has made a series of smaller layoffs across different business units.
Fortune reported this month that Amazon planned to cut up to 15% of its human resources staff as part of a wider layoff.
Amazon has taken a cautious hiring approach with its corporate workforce, following years of huge headcount growth. The company’s corporate headcount tripled between 2017 and 2022, according to The Information.
The reported cuts come as Amazon is investing heavily in artificial intelligence. The company said earlier this year it expects to increase capital expenditures to more than $100 billion in 2025, up from $83 billion in 2024, with a majority going toward building out capacity for AI in AWS.
Amazon CEO Andy Jassy also hinted at potential workforce impact from generative AI earlier this year in a memo to employees that was shared publicly.
“We will need fewer people doing some of the jobs that are being done today, and more people doing other types of jobs,” he wrote. “It’s hard to know exactly where this nets out over time, but in the next few years, we expect that this will reduce our total corporate workforce as we get efficiency gains from using AI extensively across the company.”
Amazon reported 1.54 million total employees as of June 30 — up 3% year-over-year. The majority of the company’s workforce is made up of warehouse workers.
Amazon employs roughly 50,000 corporate and tech workers in buildings across its Seattle headquarters, with another 12,000 in Bellevue.
The company reports its third quarter earnings on Thursday afternoon.
Fellow Seattle-area tech giant Microsoft has laid off more than 15,000 people since May as it too invests in AI and data center capacity. Microsoft has cut more than 3,200 roles in Washington this year.
Last week, The New York Times cited internal Amazon documents and interviews to report that the company plans to automate as much as 75% of its warehouse operations by 2033. According to the report, the robotics team expects automation to “flatten Amazon’s hiring curve over the next 10 years,” allowing it to avoid hiring more than 600,000 workers even as sales continue to grow.
GeekWire reporter Kurt Schlosser contributed to this story.
What it’s like to wear Amazon’s new smart glasses for delivery drivers

SAN FRANCISCO — Putting on Amazon’s new smart delivery glasses felt surprisingly natural from the start. Despite their high-tech components and slightly bulky design, they were immediately comfortable and barely heavier than my normal glasses.
Then a few lines of monochrome green text and a square target popped up in the right-hand lens — reminding me that these were not my regular frames.
Occupying just a portion of my total field of view, the text showed an address and a sorting code: “YLO 339.” As I learned, “YLO” represented the yellow tote bag where the package would normally be found, and “339” was a special code on the package label.
My task: find the package with that code. Or, more precisely, let the glasses find it.

As soon as I looked at the correct package label, the glasses recognized the code and scanned the label automatically. A checkmark appeared on a list of packages in the glasses.
Then an audio alert played from the glasses: “Dog on property.”
When all the packages were scanned, the tiny green display immediately switched to wayfinding mode. A simple map appeared, showing my location as a dot and the delivery destinations marked with pins. In this simulation, there were two pins, indicating two stops.
After putting the package on the doorstep, it was time for proof of delivery. Instead of reaching for a phone, I looked at the package on the doorstep and pressed a button once on the small controller unit — the “compute puck” — on my harness. The glasses captured a photo.
With that, my simulated delivery was done, without ever touching a handheld device.
In my very limited experience, the biggest concern I had was the potential to be distracted — focusing my attention on the text in front of my eyes rather than the world around me. I understand now why the display automatically turns off when a van is in motion.
But when I mentioned that concern to the Amazon leaders guiding me through the demo, they pointed out that the alternative is looking down at a device. With the glasses, your gaze is up and largely unobstructed, theoretically making it much easier to notice possible hazards.
That simplicity — along with the fact that they’re not intended for public release — is a key difference between Amazon’s utilitarian design and other augmented reality devices, such as Meta’s Ray-Ban glasses, Apple Vision Pro, and Magic Leap, which aim to more fully enhance or overlay the user’s environment.
One driver’s experience
KC Pangan, who delivers Amazon packages in San Francisco and was featured in Amazon’s demo video, said wearing the glasses has become so natural that he barely notices them.
Pangan has been part of an Amazon study for the past two months. On the rare occasions when he switches back to the old handheld device, he finds himself thinking, “Oh, this thing again.”
“The best thing about them is being hands-free,” Pangan said in a conversation on the sidelines of the Amazon Delivering the Future event, where the glasses were unveiled last week.
Without needing to look down at a handheld device, he can keep his eyes up and stay alert for potential hazards. With another hand free, he can maintain the all-important three points of contact when climbing in or out of a vehicle, and more easily carry packages and open gates.
The glasses, he said, “do practically everything for me” — taking photos, helping him know where to walk, and showing his location relative to his van.
While Amazon emphasizes safety and driver experience as the primary goals, early tests hint at efficiency gains, as well. In initial tests, Amazon has seen up to 30 minutes of time savings per shift, although execs cautioned that the results are preliminary and could change with wider testing.

Regulators, legislators and employees have raised red flags over new technology pushing Amazon fulfillment and delivery workers to the limits of human capacity and safety. Amazon disputes this premise, and calls the new glasses part of a larger effort to use technology to improve safety.
Using the glasses will be fully optional for both its Delivery Service Partners (DSPs) and their drivers, even when they’re fully rolled out, according to the company. The system also includes privacy features, such as a hardware button that allows drivers to turn off all sensors.
For those who use them, the company says it plans to provide the devices at no cost.
Despite the way it may look to the public, Amazon doesn’t directly employ the drivers who deliver its packages in Amazon-branded vans and uniforms. Instead, it contracts with DSPs, ostensibly independent companies that hire drivers and manage package deliveries from inside Amazon facilities.
This arrangement has periodically sparked friction, and even lawsuits, as questions have come up over DSP autonomy and accountability.
With the introduction of smart glasses and other tech initiatives, including a soon-to-be-expanded training program, Amazon is deepening its involvement with DSPs and their drivers — potentially raising more questions about who truly controls the delivery workforce.
From ‘moonshot’ to reality
The smart glasses, still in their prototype phase, trace their origins to a brainstorming session about five years ago, said Beryl Tomay, Amazon’s vice president of transportation.
Each year, the team brainstorms big ideas for the company’s delivery system. During one of those sessions, a question emerged: What if drivers didn’t have to interact with any technology at all?
“The moonshot idea we came up with was, what if there was no technology that the driver had to interact with — and they could just follow the physical process of delivering a package from the van to the doorstep?” Tomay said in an interview. “How do we make that happen so they don’t have to use a phone or any kind of tech that they have to fiddle with?”

That question led the team to experiment with different approaches before settling on glasses. It seemed kind of crazy at first, Tomay said, but they soon realized the potential to improve safety and the driver experience. Early trials with delivery drivers confirmed the theory.
“The hands-free aspect of it was just kind of magical,” she said, summing up the reaction from early users.
The project has already been tested with hundreds of delivery drivers across more than a dozen DSPs. Amazon plans to expand those trials in the coming months, with a larger test scheduled for November. The goal is to collect more feedback before deciding when the technology will be ready for wider deployment.
Typically, Amazon would have kept a new hardware project secret until later in its development, but Reuters reported on the existence of the project nearly a year ago. (The glasses were reportedly code-named “Amelia,” but they were announced without a name.) Unveiling them now also lets Amazon get more delivery partners involved, gather input, and make improvements.
Future versions may also expand the system’s capabilities, using sensors and data to automatically recognize potential hazards such as uneven walkways.
How the technology works
Amazon’s smart glasses are part of a system that also includes a small wearable computer and a battery, integrated with Amazon’s delivery software and vehicle systems.
The lenses are photochromic, darkening automatically in bright sunlight, and can be fitted with prescription inserts. Two cameras — one centered, one on the left — support functions such as package scanning and photo capture for proof of delivery.
A built-in flashlight switches on automatically in dim conditions, while onboard sensors help the system orient to the driver’s movement and surroundings.

The glasses connect by a magnetic wire to a small controller unit, or “compute puck,” worn on the chest of a heat-resistant harness. The controller houses the device’s AI models, manages the visual display, and handles functions such as taking a delivery photo. It also includes a dedicated emergency button that connects drivers directly to Amazon’s emergency support systems.
On the opposite side of the chest, a swappable battery keeps the system balanced and running for a full route. Both components are designed for all-day comfort — the result, Tomay said, of extensive testing with drivers to ensure that wearing the gear feels natural when they’re moving around.
Connectivity runs through the driver’s official Amazon delivery phone via Bluetooth, and through the vehicle itself using a platform called “Fleet Edge” — a network of sensors and onboard computing modules that link the van’s status to the glasses.
This connection allows the glasses to know precisely when to activate, when to shut down, and when to sync data. When a van is put in park, the display automatically activates, showing details such as addresses, navigation cues, and package information. When the vehicle starts moving again, the display turns off — a deliberate safety measure so drivers never see visual data while driving.
Data gathered by the glasses plays a role in Amazon’s broader mapping efforts. Imagery and sensor data feed into “Project Wellspring,” a system that uses AI to better model the physical world. This helps Amazon refine maps, identify the safest parking spots, pinpoint building entrances, and optimize walking routes for future deliveries.
Amazon says the data collection is done with privacy in mind. In addition to the driver-controlled sensor shut-off button, any imagery collected is processed to “blur or remove personally identifiable information” such as faces and license plates before being stored or used.
The implications go beyond routing and navigation. Conceivably, the same data could also lay the groundwork for greater automation in Amazon’s delivery network over time.
Testing the delivery training
In addition to trying the glasses during the event at Amazon’s Delivery Station in Milpitas, Calif., I experienced firsthand just how difficult the job of delivering packages can be.

- Strapped into a harness for a slip-and-fall demo, I learned how easily a driver can lose footing on slick surfaces if not careful to walk properly.
- I tried a VR training device that highlighted hidden hazards like pets sleeping under tires and taught me how to navigate complex intersections safely.
- My turn in the company’s Rivian van simulator proved humbling. Despite my best efforts, I ran red lights and managed to crash onto virtual sidewalks.

The simulator, known as the Enhanced Vehicle Operation Learning Virtual Experience (EVOLVE), has been launched at Amazon facilities in Colorado, Maryland, and Florida, and Amazon says it will be available at 40 sites by the end of 2026.
It’s part of what’s known as the Integrated Last Mile Driver Academy (iLMDA), a program available at 65 sites currently, which Amazon says it plans to expand to more than 95 delivery stations across North America by the end of 2026.
“Drivers are autonomous on the road, and the amount of variables that they interact with on a given day are countless,” said Anthony Mason, Amazon’s director of delivery training and programs, who walked me through the training demos. One goal of the training, he said, is to give drivers a toolkit to pull from when they face challenging situations.
Suffice it to say, this is not the job for me. But if Amazon’s smart glasses live up to the company’s expectations, they might be a step forward for the drivers doing the real work.
Amazon nails the fundamentals with first NBA broadcast — with a sports betting twist

“It is here, it is real, it is happening,” said play-by-play announcer Ian Eagle. “The NBA on Prime.”
And with that, Amazon’s foray into live streaming NBA games tipped off.
Amazon marked a major milestone with its growing sports portfolio on Friday, broadcasting its first-ever live NBA game around the world. The matchup — Celtics vs. Knicks — was part of an 11-year deal that gives Amazon exclusive rights to select regular season and playoff games.
We watched the game via Prime Video — accessible with a $139/year Prime subscription — and came away impressed.
The stream ran seamlessly across Fire TV, iPhone, and MacBook. The quality was crisp, load times near-instant, and there wasn’t a hint of lag — at least on a home WiFi connection. Amazon’s 1080p HDR video and 5.1 surround sound were a slam dunk.
The broadcast looked and felt like a traditional national telecast. The graphics mirrored what fans expect from ESPN or TNT, the commentary came from familiar voices — Eagle and Stan Van Gundy — and the pregame show featured a slick set with former NBA stars at Amazon MGM Studios.

But under the surface, Amazon quietly tested a new frontier: in-stream sports betting.
The most noticeable new feature was the FanDuel integration, Amazon’s latest experiment in blending live sports and interactive technology.
Fans watching on Fire TV could log into their FanDuel accounts through Prime Video to view real-time betting information and track wagers directly within the broadcast.
You can’t place actual bets on Prime Video — not yet, at least — but it marks a subtle yet significant shift in how live sports may evolve on streaming platforms.
And it comes at a fascinating moment: the NBA is dealing with a major betting scandal that made headlines this week and involves the FBI.

I was surprised when NBA Commissioner Adam Silver joined the broadcast for a live interview. Sideline reporter Cassidy Hubbarth opened by asking about the scandal.
Silver said he was “deeply disturbed” upon hearing the news.
“There’s nothing more important to the league and its fans than the integrity of the competition,” he said.
Silver also praised Amazon’s coverage: “I should have started [by saying] how excited we are to be on Amazon,” he said. “I guess I wouldn’t have predicted that my first interview on Amazon would be about sports betting.”
The interview underscored how Amazon’s coverage didn’t shy away from real-time news relevance — adding a traditional journalistic layer within a tech-powered broadcast.
It was also a surreal moment: the NBA’s top official discussing a sports betting scandal during the league’s debut on a platform now integrating betting tools into its stream.
Amazon has other new tech-fueled features, including advanced NBA stats powered by Amazon Web Services — but I didn’t notice them during Friday’s broadcast.
One of the only stumbles for me came on the Fire TV user experience, which feels clunky compared to mobile or desktop. Navigation wasn’t intuitive, and the remote’s button mapping made simple actions harder than expected.
But overall, the whole experience felt less like a tech demo and more like a finished product.

Amazon’s sports strategy is crystallizing: use live sports to drive Prime signups and boost engagement across its ecosystem. The broadcast was promoted on Amazon’s homepage and apps. Live sports also helps fuel Amazon’s growing advertising business.
Bloomberg reported that Amazon is paying $1.8 billion annually for the NBA rights.
As more people cut the cord, sports leagues are increasingly partnering with tech companies as their existing deals with traditional cable providers expire. Companies like Amazon, Apple, and Netflix are hungry for valuable content such as live sports to draw more subscribers to their respective platforms.
Amazon also aired the Timberwolves vs. Lakers game on Friday evening. It will stream 66 regular season games this year, along with some playoff games.
The company also has separate deals to air the NFL’s Thursday Night Football, the WNBA, and the Premier League, among other sports-related programming, on its Prime Video platform.
The NBA debut on Friday was a reminder of Amazon’s approach to live sports: combine the reliability of broadcast TV with subtle tech layers — such as betting, data, and e-commerce — built on its AWS cloud infrastructure and Prime membership model.
Out of Office: Amazon design technologist makes ‘robot art’ and the tools to help others be creative

Out of Office is a new GeekWire series spotlighting the passions and hobbies that members of the Seattle-area tech community pursue outside of work.
- Name: Maksim Surguy.
- Day job: Senior design technologist for Amazon Devices, working on concepts for new devices or new features on existing devices, such as Fire TV, Alexa, and Echo smart speakers.
- Out-of-office passion: Using machines to create art.
Before he pursued a bachelor’s degree in computer science, Maksim Surguy made an initial — and brief — run at a bachelor’s in art.
“Two weeks later, I realized that I suck at art and I switched to computer science,” he laughed.
Fourteen years after completing his education at California State University, Fullerton, Surguy has found happiness and success in marrying the two disciplines, as a technologist and an artist in Seattle.
“My sketching is not to the level that I want, so instead I use code to create artwork,” he said in describing the “robot art” that occupies his free time.
Surguy not only relies on machines to generate his artwork, he creates the software tools that facilitate such art, whether the finished pieces exist as digital NFTs or as physical works such as pen plotter drawings made via scalable vector graphics.
“I spend a lot more time making the tools than actually using them,” Surguy said. “But other people use them to actually make something. So I enjoy both sides of this.”

Surguy is a 2018 graduate of the University of Washington’s Master of Science in Technology Innovation (MSTI), a program at the UW’s Global Innovation Exchange (GIX) — a joint initiative of the College of Engineering and Foster School of Business.
For a hardware/software project, he created a 3D-printed drawing machine with his own electronics program. During the process, he couldn’t find a community for like-minded people who make such things. So he started DrawingBots, a website/Discord that’s attracted thousands of artists and engineers.
Surguy was born and raised in Ukraine and was an accomplished breakdancer who competed as a professional in Eastern Europe when he was younger. He moved to the U.S. in 2004.
He’s been at Amazon for six years, and his artwork has been displayed in the company’s headquarters buildings, in public exhibitions — including at Seattle’s NFT Museum — and on his website and social media channels. He’s also written extensively about technology.
And in the blurring space between human and AI-created artwork, he’s leaning further into technology.
“I use AI for a lot of things, and especially now with code, it makes it easier to create tools that are custom and specific for whatever use case,” Surguy said. “I just open-sourced one last weekend. It’s a tool that allows artists to preview their artwork, how it’s going to look before they make it on paper. So it saves them time and money and art supplies.”

Most rewarding aspect of this pursuit: Surguy most enjoys the growing community he helped foster around the tools and art he makes.
“I got to know thousands of people that do this kind of stuff and are very interesting people,” he said. “Some of them were TED speakers. Some of them are PhDs, very well known researchers, scientists, artists. I had conversations with all of these people and consider some of them my friends. So that’s the most rewarding part.”
The lessons he brings back to work: “This kind of procedural and algorithmic art definitely has a place in making products that are digital experiences,” Surguy said of the connection between his hobby and his work at Amazon.
For example, his Devices team launched a dynamic art feature for Fire TV: a screen saver that created artwork on the fly based on data such as weather, time of day, and other inputs.
Surguy said the ideas he generates outside of work serve as inspiration for what he creates at work, whether it’s creative coding or simply expanding the boundaries of what he makes and how he makes it.
Read more Out of Office profiles.
Do you have an out-of-office hobby or interesting side hustle that you’re passionate about that would make for a fun profile on GeekWire? Drop us a line: tips@geekwire.com.
Amazon and the media: Inside the disconnect on AI, robots and jobs

SAN FRANCISCO — Amazon showed off its latest robotics and AI systems this week, presenting a vision of automation that it says will make warehouse and delivery work safer and smarter.
But the tech giant and some of the media at its Delivering the Future event were on different planets when it came to big questions about robots, jobs, and the future of human work.
The backdrop: On Tuesday, a day before the event, The New York Times cited internal Amazon documents and interviews to report that the company plans to automate as much as 75% of its operations by 2033. According to the report, the robotics team expects automation to “flatten Amazon’s hiring curve over the next 10 years,” allowing it to avoid hiring more than 600,000 workers even as sales continue to grow.
In a statement cited in the article, Amazon said the documents were incomplete and did not represent the company’s overall hiring strategy.
On stage at the event, Tye Brady, chief technologist for Amazon Robotics, introduced the company’s newest systems — Blue Jay, a setup that coordinates multiple robotic arms to pick, stow, and consolidate items; and Project Eluna, an agentic AI model that acts as a digital assistant for operations teams.
Later, he addressed the reporters in the room: “When you write about Blue Jay or you write about Project Eluna … I hope you remember that the real headline is not about robots. The real headline is about people, and the future of work we’re building together.”

He said the benefits for employees are clear: Blue Jay handles repetitive lifting, while Project Eluna helps identify safety issues before they happen. By automating routine tasks, he said, AI frees employees to focus on higher-value work, supported by Amazon training programs.
Brady coupled that message with a reminder that no company has created more U.S. jobs over the past decade than Amazon, noting its plan to hire 250,000 seasonal workers this year.
His message to the company’s front-line employees: “These systems are not experiments. They’re real tools built for you, to make your job safer, smarter, and more rewarding.”
‘Menial, mundane, and repetitive’
Later, during a press conference, a reporter cited the New York Times report, asking Brady whether he believes Amazon’s workforce could shrink on the scale the paper described.
Brady didn’t answer the question directly, but described the premise as speculation, saying it’s impossible to predict what will happen a decade from now. He pointed instead to the past 10 years of Amazon’s robotics investments, saying the company has created hundreds of thousands of new jobs — including entirely new job types — while also improving safety.
He said Amazon’s focus is on augmenting workers, not replacing them, by designing machines that make jobs easier and safer. The company, he added, will continue using collaborative robotics to help achieve its broader mission of offering customers the widest selection at the lowest cost.
In an interview with GeekWire after the press conference, Brady said he sees the role of robotics as removing the “menial, mundane, and repetitive” tasks from warehouse jobs while amplifying what humans do best — reasoning, judgment, and common sense.
“Real leaders,” he added, “will lead with hope — hope that technology will do good for people.”
When asked whether the company’s goal was a “lights-out” warehouse with no people at all, Brady dismissed the idea. “There’s no such thing as 100 percent automation,” he said. “That doesn’t exist.”

Instead, he emphasized designing machines with real utility — ones that improve safety, increase efficiency, and create new types of technical jobs in the process.
When pressed on whether Amazon is replacing human hands with robotic ones, Brady pushed back: “People are much more than hands,” he said. “You perceive the environment. You understand the environment. You know when to put things together. Like, people got it going on. It’s not replacing a hand. That’s not the right way to think of it. It’s augmenting the human brain.”
Brady pointed to Amazon’s new Shreveport, La., fulfillment center as an example, saying the highly automated facility processes orders faster than previous generations while also adding about 2,500 new roles that didn’t exist before.
“That’s not a net job killer,” he said. “It’s creating more job efficiency — and more jobs in different pockets.”
The New York Times report offered a different view of Shreveport’s impact on employment. Describing it as Amazon’s “most advanced warehouse” and a “template for future robotic fulfillment centers,” the article said the facility uses about 1,000 robots.
Citing internal documents, the Times reported that automation allowed Amazon to employ about 25% fewer workers last year than it would have without the new systems. As more robots are added next year, it added, the company expects the site to need roughly half as many workers as it would for similar volumes of items under previous methods.
Wall Street sees big savings
Analysts, meanwhile, are taking the potential impact seriously. A Morgan Stanley research note published Wednesday — the same day as Amazon’s event and in direct response to the Times report — said the newspaper’s projections align with the investment bank’s baseline analysis.
Rather than dismissing the report as speculative, Morgan Stanley’s Brian Nowak treated the article’s data points as credible. The analysts wrote that Amazon’s reported plan to build around 40 next-generation robotic warehouses by 2027 was “in line with our estimated slope of robotics warehouse deployment.”
More notably, Morgan Stanley put a multi-billion-dollar price tag on the efficiency gains. Its previous models estimated the rollout could generate $2 billion to $4 billion in annual savings by 2027. But using the Times’ figure — that Amazon expects to “avoid hiring 160,000+ U.S. warehouse employees by ’27” — the analysts recalculated that the savings could reach as much as $10 billion per year.
Back at the event, the specific language used by Amazon executives aligned closely with details in the Times report about the company’s internal communications strategy.
According to the Times, internal documents advised employees to avoid terms such as “automation” and “A.I.” and instead use collaborative language like “advanced technology” and “cobots” — short for collaborative robots — as part of a broader effort to “control the narrative” around automation and hiring.
On stage, Brady’s remarks closely mirrored that approach. He consistently framed Amazon’s robotics strategy as one of augmentation, not replacement, describing new systems as tools built for people.
In the follow-up interview, Brady said he disliked the term “artificial intelligence” altogether, preferring to refer to the technology simply as “machines.”
“Intelligence is ours,” he said. “Intelligence is very much a human thing.”
Amazon’s New Help Me Decide Button Will Make You Pick the Right Product
Almost every major tech brand is integrating AI into its apps and services. Now, e-commerce giant Amazon has introduced a new AI feature designed to help buyers choose the right product. When you compare similar products on the Amazon app or website, the “Help Me Decide” button will recommend the best-suited product based on your purchase history and the product’s reviews.
Amazon’s new Help Me Decide button will simplify the shopping experience
The idea behind this new feature is to simplify decision-making when comparing or buying two or more similar products. For example, if you are looking to buy a smartphone, the AI might draw on your past searches, related order history, and other signals, and based on that, suggest a few particular smartphone models. Essentially, the tool studies what you’ve viewed or purchased before and uses that context to give a personalized recommendation.
Now, once you click the “Help Me Decide” button, Amazon’s AI will recommend a product with a brief explanation of why it might suit you. Alongside the main recommendation, you may also see a “budget pick” and an “upgrade pick.” The former adjusts the recommendation for budget-conscious shoppers, while the latter is for those who prefer premium choices.
Amazon wants AI to shape the way its consumers shop
This isn’t the first time the e-commerce giant has integrated AI into its app or service. Amazon previously launched Rufus, an AI chatbot that guides customers through purchases, and a tool that automatically generates product buying guides. More recently, Amazon introduced Lens Live AI, which scans your surroundings using your phone’s camera and suggests matching items from its store.
Now, with the latest addition, the Help Me Decide tool, the company continues its push to make online shopping faster and more intuitive. For now, the feature is rolling out to millions of users across the US. It may roll out to users in other regions later.
The post Amazon’s New Help Me Decide Button Will Make You Pick the Right Product appeared first on Android Headlines.
How the AWS outage happened: Amazon blames rare software bug and ‘faulty automation’ for massive glitch

A detailed explanation of this week’s Amazon Web Services outage, released Thursday morning, confirms that it wasn’t a hardware glitch or an outside attack but a complex, cascading failure triggered by a rare software bug in one of the company’s most critical systems.
The company said a “faulty automation” in its internal systems — two independent programs that began racing each other to update records — erased key network entries for its DynamoDB database service, triggering a domino effect that temporarily broke many other AWS tools.
AWS said it has turned off the flawed automation worldwide and will fix the bug before bringing it back online. The company also plans to add new safety checks and to improve how quickly its systems recover if something similar happens again.
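To illustrate the failure mode the postmortem describes, here is a minimal sketch, in hypothetical Python, of two independent updaters racing to modify the same record. All names and values are illustrative stand-ins, not Amazon’s actual systems; the point is that an unguarded updater lets a delayed, stale update clobber a newer one, while a monotonic version check discards it.

```python
# Hypothetical sketch of a record-update race (not Amazon's actual code).
# Two independent updaters write to the same DNS-style record; without a
# version guard, whichever runs last wins, even if its plan is stale.

records = {"dynamodb.example": "10.0.0.1"}  # shared record store
state = {"version": 0}                       # last plan version applied

def apply_plan_unsafe(value):
    # Applies unconditionally: a delayed updater holding an old plan
    # silently overwrites (or erases) the newer record.
    records["dynamodb.example"] = value

def apply_plan_safe(version, value):
    # Monotonic version check: a plan older than the last one applied
    # is rejected instead of being written.
    if version <= state["version"]:
        return False
    state["version"] = version
    records["dynamodb.example"] = value
    return True
```

With the guard, applying plan 2 and then a stale plan 1 leaves the record at plan 2’s value; the unguarded path would have let the stale write win.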
Amazon apologized and acknowledged the widespread disruption caused by the outage.
“While we have a strong track record of operating our services with the highest levels of availability, we know how critical our services are to our customers, their applications and end users, and their businesses,” the company said, promising to learn from the incident.
The outage began early Monday and impacted sites and online services around the world, again illustrating the internet’s deep reliance on Amazon’s cloud and showing how a single failure inside AWS can quickly ripple across the web.
Related: The AWS outage is a warning about the risks of digital dependence and AI infrastructure
The AWS outage is a warning about the risks of digital dependence and AI infrastructure

Unless you’ve been on a “digital cleanse” this week, you know that Amazon Web Services (AWS) had a major outage at the start of the week.
You know this because apps and sites you use were down. Credible reports estimate at least 1,000 sites and apps were affected. Large swaths of modern digital life went dark: from finance (Venmo and Robinhood) to gaming (Roblox and Fortnite) to communications (Signal and Slack). Some people couldn’t even get a good night’s sleep because the outage took out “smart beds.” Even sporting events were impacted when Ticketmaster failed.
We’ve seen outages before, but this one seemed broader and harder to ignore.
In the wake of the outage, many well-intentioned hot takes boiled down to: “They should’ve used more cloud providers.”
Setting aside the subtle victim-blaming, there’s also the fact that in a world with only three major cloud providers (AWS, Microsoft Azure, Google Cloud), there isn’t much diversity out there if you want to “diversify.”
And the argument for diversity in cloud providers is really about market diversity, not individual organizations juggling multiple vendors. More competition in the cloud market would mean fewer cascading failures when one provider goes down.
The key question when something like this happens is whether we’re taking the risk lessons and expanding them beyond the immediate problem to see the emerging problems.
Instead of saying organizations need multiple cloud providers, we should be asking how we’re dealing with the reality of highly concentrated risks with exceptionally broad impact. We just had an object lesson in what that really means.
This outage also points to where we should proactively apply that lesson next: generative AI. The AWS outage holds two lessons for the emerging generative AI ecosystem.
Concentration crisis in AI
To be clear about the generative AI ecosystem, I’m not talking about chatbots; I mean AI-native applications that are built on generative AI as a platform. We just saw that when there’s no cloud, there’s no cloud-native application. Likewise, when there’s no generative AI provider, there’s no AI-native application.
The first lesson from the AWS outage for AI-native applications is what happens to an industry when there’s a limited number of providers for centralized resources and there’s an outage. We just saw: it has huge rippling effects across the industry and all walks of life built on it.
It’s a throwback to the mainframe era: when “the computer” is down, it’s down for everyone.
There are as few generative AI providers as there are cloud providers, if not fewer. A major outage is inevitable; that’s just engineering reality. When it happens, every AI-native app built on that generative AI platform will also go down, full stop.
The impact could be even more severe than the AWS outage. It will be more like “the computer is down, and the people are gone” for many different industries and services. Ironically, the “smarter” the industry and service, the greater the potential fallout.
The second lesson is one of intertwined risk. OpenAI itself was affected by this week’s AWS outage.
That means AI-native apps have double exposure to the risks around a limited number of providers for critical, centralized resources. For AI-native apps, it’s like the mainframe era squared. If the generative AI platform fails, everything built on it fails. And if the cloud that hosts the AI platform fails, it all goes down, too.
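The “double exposure” point can be made with back-of-the-envelope math: an AI-native app is only up when both the AI platform and the cloud hosting that platform are up, so the availabilities of the layers multiply. The figures below are illustrative assumptions, not real SLAs.

```python
# Sketch of stacked availability: each layer in a serial dependency
# chain must be up for the app to be up, so availabilities multiply.

def stacked_availability(*layers):
    """End-to-end availability of a chain of serial dependencies."""
    total = 1.0
    for availability in layers:
        total *= availability
    return total

cloud = 0.999        # hypothetical cloud availability
ai_platform = 0.999  # hypothetical AI platform availability
app = stacked_availability(cloud, ai_platform)
# Every added layer can only lower the end-to-end number.
```

Two layers at 99.9% each yield roughly 99.8% for the app, and each further dependency pushes the number down again.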
This is not to say don’t do cloud or don’t do AI. But it is to say we need to understand the new, complex intertwining of risks in a world where everything relies on a small number of key providers, and those key providers in turn rely on a small number of key providers of their own.
The realities of physical requirements and capital investment required for cloud and generative AI make a truly diverse ecosystem impracticable for either. I don’t think anyone sees more than a literal handful of providers for either of these in the future.
The bottom line
Highly concentrated risks with exceptionally broad impact aren’t going away anytime soon.
But the growth of generative AI providers, and their reliance on cloud providers, shows where the growth will come from and what those risks will be. The growth will be upward, as technologies stack on top of and rely on each other. And that means these risks will only become more concentrated and the impacts even broader.
In the world of security, there’s the “CIA” triad: “confidentiality,” “integrity,” and “availability.” In the first days of “Trustworthy Computing” at Microsoft, availability was among the core principles. But in recent years, availability has often been overlooked as security and privacy concerns understandably dominate.
A thoughtful reading of the AWS outage tells us that failures like this aren’t an anomaly: they’re inherent in the nature of today’s technology reality. And since there are no easy solutions, only increasingly complex problems, we need to start understanding this new reality and thinking seriously about how to mitigate these risks.
Amazon unveils AI-powered augmented reality glasses for delivery drivers

MILPITAS, Calif. — Amazon is bringing delivery details directly to drivers’ eyeballs.
The e-commerce giant on Wednesday confirmed that it’s developing new augmented reality glasses for delivery drivers, using AI and computer vision to help them scan packages, follow turn-by-turn walking directions, and capture proof of delivery, among other features.
Amazon says the goal is to create a hands-free experience, making the job safer and more seamless by reducing the need for drivers to look down at a device.
Scenarios shown by the company make it clear that the devices activate after parking, not while driving, which could help to alleviate safety and regulatory concerns.
[Update, Oct. 23: Amazon executives said in briefings Wednesday that the glasses will be fully optional for drivers, and they’re designed with a hardware-based privacy button. This switch, located on the device’s controller, allows drivers to turn off all sensors, including the camera and microphone.
From a customer perspective, the company added that any personally identifiable information, such as faces or license plates, will be blurred to protect privacy.
Overall, Amazon is positioning the glasses as a tool to improve safety and the driver’s experience. We had a chance to try the glasses first-hand this week, and we’ll have more in an upcoming post.]
The wearable system was developed with input from hundreds of drivers, according to the company. It includes a small controller worn on the driver’s vest that houses operational controls, a swappable battery for all-day use, and a dedicated emergency button.

The glasses are also designed to support prescription and transitional lenses. Amazon says future versions could provide real-time alerts for hazards, like pets in the yard, or notify a driver if they are about to drop a package at the wrong address.
According to Amazon, the smart glasses are an early prototype, currently in preliminary testing with hundreds of drivers in North America. The company says it’s gathering driver feedback to refine the technology before planning a broader rollout.
The announcement at Amazon’s Delivering the Future event in the Bay Area today confirms a report by The Information last month. That report also said Amazon is developing consumer AR glasses to compete with Facebook parent Meta’s AI-powered Ray Ban smart glasses.
The enterprise AR market is in flux, with early mover Microsoft pivoting away from HoloLens hardware, creating an opening for players like Magic Leap and Vancouver, Wash.-based RealWear.
A demo video released by Amazon shows a delivery driver using augmented reality (AR) glasses throughout their workflow. It begins with the driver in an electric Rivian van, where the glasses overlay the next delivery address directly onto a view of the road.
“Dog on property,” the audio cue cautions the driver.
Upon parking, the driver moves to the cargo area. The AR display then activates to help with sorting, with green highlights overlaid on the specific packages required for that stop. As the driver picks each item, it’s scanned and a virtual checklist in their vision gets updated.
After retrieving all the packages from the cargo hold, the driver begins walking to the house. The glasses project a digital path onto the ground, guiding them along the walkway to the front door.
Once at the porch, the display prompts the driver to “Take photo” to confirm the delivery. After placing the items, the driver taps a chest-mounted device to take the picture. A final menu then appears, allowing the driver to “Tap to finish” the stop before heading back to the van.
‘Too dumb to fail’: Ring founder Jamie Siminoff promises gritty startup lessons in upcoming book

Jamie Siminoff has lived the American Dream in many ways — recovering from an unsuccessful appearance on Shark Tank to ultimately sell smart doorbell company Ring to Amazon for a reported $1 billion in 2018.
But as with most entrepreneurial journeys, the reality was far less glamorous. Siminoff promises to tell the unvarnished story in his debut book, Ding Dong: How Ring Went From Shark Tank Reject to Everyone’s Front Door, due out Nov. 10.
“I never set out to write a book, but after a decade of chaos, failure, wins, and everything in between, I realized this is a story worth telling,” Siminoff said in the announcement, describing Ding Dong as the “raw, true story” of building Ring, including nearly running out of money multiple times.
He added, “My hope is that it gives anyone out there chasing something big a little more fuel to keep going. Because sometimes being ‘too dumb to fail’ is exactly what gets you through.”
Siminoff rejoined the Seattle tech giant earlier this year after stepping away in 2023. He’s now vice president of product, overseeing the company’s home security camera business and related devices including Ring, Blink, Amazon Key, and Amazon Sidewalk.
Preorders for the book are now open on Amazon.
At Amazon event, San Francisco Mayor Daniel Lurie defends city and touts AI-driven rebound

SAN FRANCISCO — Facing renewed threats of federal intervention from President Trump, Mayor Daniel Lurie used an appearance at an Amazon event Tuesday to make the case that San Francisco is “on the rise,” citing its AI-fueled revival as proof of a broader comeback.
Without naming Trump or explicitly citing the proposal to deploy the National Guard, Lurie pushed back on the national narrative of urban decline — pointing to falling crime rates, new investment, and the city’s central role in the AI boom.
Lurie, who took office earlier this year, said San Francisco is “open for business” again, name-checking OpenAI and other prominent companies in the city as examples of the innovation fueling its recovery. Mayors of other cities, he said, would die to have one of the many AI companies based in San Francisco.
“Every single metric is heading in the right direction,” Lurie said, noting that violent crime is at its lowest level since the 1950s and car break-ins are at a 22-year low, among other stats.
He was speaking at the San Francisco-Marin Food Bank, as Amazon hosted journalists from around the country and the world on the eve of its annual Delivering the Future event, where the company shows its latest robotics and logistics innovations.
“I want you to tell everybody, wherever you come from, that San Francisco’s on the rise,” he said. “You tell them there’s a new mayor in town, that we’ve got this, and we do.”
Amazon and leaders of San Francisco-Marin Food Bank highlighted their partnership that uses the company’s delivery network to bring food to community members who can’t get to a pantry. The company said Tuesday it has delivered more than 60 million meals for free from food banks across the US and UK, committing to continue the program through 2028.
A New York Times report on Tuesday, citing internal Amazon documents, said the company wants to automate 75% of its operations in the coming years to avoid hiring hundreds of thousands of workers. It noted that the company is looking to burnish its image through community programs to counteract the long-term fallout.
Executives noted that Amazon has focused in the Seattle region on affordable housing, in line with its approach of adapting to different needs in communities where it operates.
Lurie pointed to the company’s San Francisco food bank partnership as a model for other companies. “Amazon is showing that they are committed to San Francisco,” he said.
Amazon customers report delivery delays after major AWS outage

Amazon’s e-commerce customers are experiencing unusual delivery delays following the Amazon Web Services outage on Monday — suggesting that the cloud glitch has impacted the company’s own operations more than previously reported.
Customers posting on Reddit and X reported Amazon orders that were scheduled for Monday delivery but did not arrive. Some of the comments:
- “I received a delay email on everything due today. Coming tomorrow and I’m fine with that.”
- “I have 4 items that are suppose to be delivered today as well and they haven’t even left the facility. So I’m sure it’s the outage.”
- “My amazon fresh order was cancelled at 5:15PM.”
Amazon workers posting on the “r/AmazonFC” Reddit community cited downtime at fulfillment centers.
- “Today was the first day I’ve experienced an entire day of downtime, and not as a shutdown for maintenance. Very odd feeling to maintain a constant state of readiness for 10 hours in case the system comes back at any moment.”
We reached out to Amazon for details about delayed deliveries.
Amazon’s package fulfillment systems run atop AWS infrastructure — so disruptions in key AWS services can ripple directly into its retail and logistics network.
Amazon’s logistics arm processes about 17.2 million delivery orders per day, according to Capital One.
The fallout from delayed deliveries could lead to increased costs due to potential refund obligations and additional labor needs.
The outage started shortly after midnight Monday and lasted for about three hours, but the aftershock effects were felt by Amazon’s cloud customers for much of the day. The company blamed a DNS resolution issue with its DynamoDB service in the US-EAST-1 region, its oldest and largest digital hub. Major outages originating from this same region also caused widespread disruptions in 2017, 2021, and 2023.
The outage impacted everything from sites including Facebook, Coinbase, and Ticketmaster, to check-in kiosks at LaGuardia Airport. Amazon’s own retail site, its Prime Video streaming service, and its Ring subsidiary were also affected.
Despite the major outage, Amazon’s stock was up Monday and in early Tuesday trading.
AWS outage affects Ticketmaster for pivotal Mariners vs. Blue Jays playoff game in Toronto

The effects of the massive AWS outage reached the sports world on Monday.
Ticketmaster was dealing with ticket management issues as a result of the outage, according to messages shared by several sports teams hosting games on Monday, including the Toronto Blue Jays and Seattle Seahawks.
The Blue Jays, facing off against the Seattle Mariners in a Game 7 MLB playoff bout at Rogers Centre in Toronto, posted a statement earlier Monday about the outage and advised fans to “hold off on managing your tickets as we work through this.”
A few hours later, the team said ticket management was returning to normal.
>World Series appearance on the line
>AWS outage sends Ticketmaster down
>Blue Jays fans can't access Game 7 tickets
>Blue Jays opponent…Seattle
>Amazon headquarters…Seattle
— Morning Brew ☕️ (@MorningBrew) October 20, 2025
The Seahawks, who were hosting the Houston Texans for Monday Night Football in Seattle, issued a statement about the outage “that may impact access to Ticketmaster, Seahawks Account Manager, and the Seahawks Mobile App.”
The Detroit Lions, hosting their own Monday Night Football game, also had ticketing impacted.
The outage effects went beyond just ticketing. The Premier League said its VAR tech system, used to determine offside calls in soccer, would not be available for Monday’s match between West Ham and Brentford.
The outage began shortly after midnight Pacific in Amazon’s Northern Virginia (US-EAST-1) region, AWS’s oldest and largest cloud region and a popular nerve center for online services.
In an initial update, AWS said the outage was related to a DNS resolution issue with its DynamoDB product, meaning the internet’s phone book failed to find the correct address for a database service used by thousands of apps to store and find data.
Amazon later said the root cause of the outage was an “underlying internal subsystem responsible for monitoring the health of our network load balancers.”
By 3 p.m. PT, the company said all AWS services had returned to normal operations.
Major sites and services including Facebook, Snapchat, Coinbase and Amazon itself were impacted — reviving concerns about the internet’s heavy reliance on the cloud giant.
The outage suggests that many sites have not adequately implemented the redundancy needed to quickly fall back to other regions or cloud providers in the event of AWS outages.
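The redundancy in question is often a simple client-side pattern: try the primary region, and fall back to a secondary when it fails. The sketch below is a minimal, hypothetical illustration; the endpoints and the `fetch` callable are stand-ins, not any real site’s implementation.

```python
# Minimal sketch of region failover: try each endpoint in order and
# return the first successful result. Endpoint names are hypothetical.

def fetch_with_failover(fetch, endpoints):
    """Call fetch(endpoint) for each endpoint until one succeeds."""
    last_error = None
    for endpoint in endpoints:
        try:
            return fetch(endpoint)
        except ConnectionError as err:
            last_error = err  # region unreachable: try the next one
    raise last_error  # every region failed

# Simulated usage: the primary region is down, the secondary serves it.
def fake_fetch(endpoint):
    if endpoint == "us-east-1":
        raise ConnectionError("region outage")
    return "ok from " + endpoint

result = fetch_with_failover(fake_fetch, ["us-east-1", "us-west-2"])
```

Real deployments add timeouts, health checks, and data replication between regions, which is where most of the actual cost and complexity lives.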
Previously: