The Nintendo Switch 2 is for sale on Amazon, no invite required

Getting your hands on Nintendo’s latest console just got easier, as Amazon is now listing the Nintendo Switch 2 for sale outright, no invitation needed. It’s been three months since the release of the Nintendo Switch 2, and the launch day frenzy is just starting to taper off. The handhelds have been selling at a blistering pace, with just under 6 million units sold in the first four weeks.

Online inventories in those initial weeks sold out in a flash, and sparse restocks were gone just as quickly. Amazon was left out of the initial pre-order process and didn’t list the console at all until over a month after its release. This conspicuous absence may have been due to Nintendo’s frustration with third-party sellers undercutting the company’s own pricing for games on the site.

We really loved the Nintendo Switch 2 in our hands-on review, and thought it was a great follow-up to the 2017 console that launched a handheld renaissance. We gave the Switch 2 a score of 93 out of 100, and were particularly impressed with its larger 7.9-inch LCD screen, the magnetic Joy-Cons, better base storage and, of course, significantly improved performance over the original. The pricing is a bit steep, the battery life could be better and the dock could use more USB-C ports, but aside from those details the Switch 2 is almost perfect.

If you’ve been waiting to pick up a Nintendo Switch 2 without having to go on a scavenger hunt, then the Amazon listing should be a welcome option. The months since release have also seen a great selection of Switch 2 ports and exclusive games hit the market. Amazon’s listing offers the standalone console for $449 or the Mario Kart World bundle for $499. Sales are limited to one unit per customer.

Follow @EngadgetDeals on X for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/gaming/nintendo/the-nintendo-switch-2-is-for-sale-on-amazon-no-invite-required-132503609.html?src=rss 

Meta CTO explains the cause of its embarrassing smart glasses demo failures

When Mark Zuckerberg announced Meta’s latest smart glasses at the company’s Connect 2025 keynote, he encountered two glitches that prevented him from properly demonstrating some of the devices’ features. Now, Meta’s Chief Technology Officer, Andrew Bosworth, said in an AMA on Instagram that they were demo failures and not actual product failures. The first glitch took place in the middle of a live demo with a cooking content creator, who asked Live AI for instructions on how to make a Korean-inspired steak sauce on his Meta glasses. Instead of giving him detailed instructions, his glasses’ AI skipped ahead by several steps and continued glitching. The chef told Zuckerberg that the “WiFi might be messed up” in the venue.

Bosworth said, however, that this was not the case. Apparently, when the chef said “Hey Meta, start Live AI,” it activated Live AI on every pair of Meta Ray-Bans in the building. And since the event was all about the company’s smart glasses, there were a lot of them in the venue at the time. Meta had routed Live AI traffic to its dev server to isolate the demo, but that server ended up receiving the Live AI traffic from everyone’s glasses in the building. “We DDoS’d ourselves, basically,” he said. He added that it didn’t happen at rehearsal because there weren’t as many people wearing the glasses when they tested it out.

Zuckerberg also ran into an issue when he tried demonstrating taking WhatsApp video calls on the Meta Ray-Ban Display. The audience could see him getting calls on the glasses’ HUD, but he couldn’t answer them to start the call. Bosworth said it was caused by a “never-before-seen bug” that put the display to sleep at the very instant the incoming call notification arrived. Even after Zuckerberg woke up the display, there was no option to answer the call. The CTO said Meta had never come across the bug before the demo and that it has since been fixed. “You guys know we can do video calling… we got WhatsApp, we know how to do video calling,” he said, but admitted it was a missed opportunity to show on stage that the feature actually works.

This article originally appeared on Engadget at https://www.engadget.com/wearables/meta-cto-explains-the-cause-of-its-embarrassing-smart-glasses-demo-failures-123011790.html?src=rss 

The Morning After: Meta’s Ray-Ban Display is the closest thing yet to true smart glasses

Revealed at Meta’s Connect 2025 conference, the Ray-Ban Display has a small, integrated display on the right lens, designed for quick, discreet glances at notifications, directions and even video calls. The clever part is its subtlety; to an onlooker, you’re just wearing a pair of Ray-Bans, not accessing a tiny screen with your peripheral vision. (Although you will appear to offer multiple pensive stares into the middle distance.)

Paired with a Meta Neural Band, which you wear on your wrist, the glasses respond to subtle hand gestures. A simple swipe of your thumb across your index finger navigates the interface, while a twist of the wrist handles volume control. This system makes interacting with the glasses feel impressively seamless and intuitive.

While these glasses aren’t about to make your smartphone obsolete, they represent a significant refinement of the smart eyewear concept. According to Engadget’s Karissa Bell, who tested them earlier this week, they are a practical step towards integrating digital information more naturally into our daily lives.

She also tested the Conversational Focus feature, which gives you live captions of the person you’re speaking with, even in a loud environment where they may be hard to hear.

The Ray-Ban Display glasses are priced at $799 — once again, a pricey test of new tech. They’re heading to select US store shelves on September 30. Check out our full impressions right here.

— Mat Smith

Get Engadget’s newsletter delivered direct to your inbox. Subscribe right here!

The news you might have missed

Meta unveils its second-gen Ray-Ban smart glasses at Connect

The best October Prime Day deals to shop now

Can-Am Origin electric motorcycle review: Good for a fun time, not a long time

Inside the Apple audio lab where AirPods are tested and tuned

Oakley Meta Vanguard are the smart glasses athletes might actually want

iPhone 17 review: Closer to Pro

Fiverr lays off employees to turn into AI-first company

250 jobs, gone.

Fiverr, best known for offering gig economy job listings for all kinds of creative endeavors, is laying off 250 employees. It says it’s pivoting to being an AI-first company.

CEO Micha Kaufman says the ultimate goal is to turn Fiverr into “an AI-first company that’s leaner, faster” — comparing it to “startup mode.” I’m sure Kaufman’s salary and CEO benefits won’t be at startup levels.

Continue reading.

iPhone 17 Pro and Pro Max review

An impactful redesign.

While the iPhone Air might be the scene-stealer, for the best specs and cameras, the iPhone 17 Pro remains the pick. And let’s not forget: it got a redesign too! It has a versatile triple-sensor rear camera system, while the new aluminum unibody is scratch-resistant and feels sturdy. So do you want the technically more capable Pro or the sleeker new Air?

Continue reading.

iPhone Air review

Thinness with purpose.

It’s here. Maybe it’s just a stepping stone on the way towards that first foldable iPhone, but the iPhone Air is officially here. It might not be the most affordable iPhone or the one with the most cameras, but for style and sleekness, the iPhone Air is without a doubt Apple’s coolest smartphone since it ditched the home button. And you know what? Battery life isn’t terrible.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/general/the-morning-after-engadget-newsletter-102144004.html?src=rss 

Meta Ray-Ban Display hands-on: Discreet and intuitive

I’ve been testing smart glasses for almost a decade. And in that time, one of the questions I’ve been asked the most is “oh, but can you see anything in them?” For years, I had to explain that no, glasses like that don’t really exist yet.

That’s no longer the case. And while I’ve seen a bunch of glasses over the last year that have some kind of display, the Meta Ray-Ban Display glasses feel the closest to fulfilling what so many people envision when they hear the words “smart glasses.”

To be clear, they don’t offer the kind of immersive AR that’s possible with Meta’s Orion prototype. In fact Meta considers “display AI glasses” to be a totally separate category from AR. The display is only on one lens — the right — and its 20-degree field of view is much smaller than the 70 degrees on Orion. That may sound like a big compromise, but it doesn’t feel like one.

Karissa Bell for Engadget

The single display feels much more practical for a pair of glasses you’ll want to wear every day. It’s meant to be something you can glance at when you need it, not an always-on overlay. The smaller size also means that the display is much sharper, at 42 pixels per degree. This was especially noticeable when I walked outside with the glasses on; images on the display looked even sharper than in indoor light, thanks to automatic brightness features.

I also appreciated that you can’t see any light from the display when you’re looking at someone wearing the glasses. In fact, the display is only barely noticeable at all when you look at them up close.

Having a smaller display also means that the glasses are cheaper, at $799, and that they don’t look like the chunky AR glasses we’ve seen so many times. At 69 grams, they are a bit heavier and thicker than the second-gen Meta Ray-Bans, but not much. As someone who has tried on way too many pairs of thick black smart glasses, I’m glad Meta is offering these in a color besides black. All Wayfarer-style frames look wide on my face but the lighter “sand” color feels a lot more flattering.

The Meta Ray-Ban Display (left) and the second-gen Ray-Ban Meta glasses (right). The display glasses are a little thicker.

Karissa Bell for Engadget

The Meta Neural Band wristband that comes with the display glasses functions pretty much the same as the band I used on the Orion prototype. It uses sensors to detect the subtle muscle movements on your hand and wrist and can translate that into actions within the glasses’ interface.

It’s hard to describe, but the gestures for navigating the glasses’ interface work surprisingly well. I can see how it could take a little time to get used to the various gestures for navigating between apps, bringing up Meta AI, adjusting the volume and other actions, but they are all fairly intuitive. For example, you use your thumb to swipe along the top of your index finger, sort of like a D-pad, to move up and down and side to side. And you can raise and lower the speaker volume by holding your thumb and index finger together and rotating your wrist right or left like it’s a volume knob.

It’s no secret that Meta’s ultimate goal for its smart glasses is to replace, or almost replace, your phone. That’s not possible yet, but having an actual display means you can look at your phone a whole lot less.

Karissa Bell for Engadget

The display can surface incoming texts, navigation with map previews (for walking directions), and info from your calendar. I was also able to take a video call from the glasses — unlike Mark Zuckerberg’s attempted live demo during his keynote — and it was way better than I expected. I could not only clearly see the person I was talking to and their surroundings, I could turn on my glasses’ camera and see a smaller version of the video from my side.

I also got a chance to try the Conversational Focus feature, which gives you live captions of the person you’re speaking with, even in a loud environment where they may be hard to hear. There was something very surreal about getting real-time subtitles to a conversation with a person standing directly in front of me. As someone who tries really hard not to look at screens when I’m speaking to people, it almost felt a little wrong. But I can also see how this would be incredibly helpful to people who have trouble hearing or processing conversations. It would also be great for translations, something Meta AI already does very well.

I also appreciated that the wristband allows you to invoke Meta AI with a gesture so you don’t always have to say “Hey Meta.” It’s a small change, but I’ve always felt weird about talking to Meta AI in public. The display also addresses another one of my longtime gripes with the Ray-Ban Meta and Oakley glasses: framing a photo is really difficult. But with a display, you can see a preview of your shot, as well as the photo after the fact, so you no longer have to just snap a bunch and hope for the best.

I’ve only had about 30 minutes with the glasses, so I don’t really know how having a display could fit into my daily routine. But even after a short time with them, they really do feel like the beginning of the kind of smart glasses a lot of people have been waiting for.

This article originally appeared on Engadget at https://www.engadget.com/wearables/meta-ray-ban-display-hands-on-discreet-and-intuitive-002334346.html?src=rss 

Meta will let outside developers create AI-powered apps for its smart glasses

Meta’s lineup of smart glasses could soon get a lot more capabilities. The company will begin allowing outside developers to bring their apps to its Ray-Ban and Oakley smart glasses, Meta announced on the second day of its Connect event.

Up to now, Meta has only had a limited number of third-party integrations for its glasses, with apps like Spotify and Audible. But Meta will now allow developers to start experimenting with apps that can take advantage of the built-in sensors and audio capabilities of its glasses. This means other companies will be able to create their own custom experiences that use Meta’s multimodal AI features.

The company is already working with a set of early partners, like Twitch, which is creating livestreaming capabilities for the glasses, and Disney, which is experimenting with an app for inside its parks. A demo video shows a visitor walking around Disneyland and asking the AI assistant about the rides she’s seeing and other park information. 18Birdies, a golf app, is working on an integration that can give players club recommendations and yardage stats.

Notably, these apps all seem to work with Meta’s non-display glasses, which means that even people who have first-gen Ray-Ban Meta glasses could see a bunch of new functionality. It’s not clear if the company will also allow developers to build experiences that take advantage of the display on its newest Meta Ray-Ban Display frames, but that could open up even more possibilities.

Meta’s new set of tools, officially called the “Wearables Device Access Toolkit,” will roll out as a limited developer preview ahead of broader availability in 2026.

This article originally appeared on Engadget at https://www.engadget.com/wearables/meta-will-let-outside-developers-create-ai-powered-apps-for-its-smart-glasses-194159233.html?src=rss 

Who Was Brad Everett Young? Remembering the Actor & Photographer From ‘Grey’s Anatomy’

Young died in September 2025 at just 46, succumbing to injuries from a car accident. Learn more about the late photographer and actor here.

Discord will launch a native Meta Quest app next year

In addition to new hardware announcements, Meta had software news to share during its Meta Connect 2025 conference today. The company revealed that Discord will be making a native app for the Meta Quest headset. According to Meta, the native app will be available sometime in 2026.

The development makes sense. VR is a platform with a lot of gaming presence, so having Discord for easy social and voice connections while playing is a win for players and a natural match for the two businesses. Having a native app can make a big difference in ease of use. I’m primarily a member of the PlayStation nation, and I swear I heard an angelic choir singing when the PS5 finally got call support.

Meta positioned the upcoming availability of the native app as a boon for the developers of VR experiences to reach new audiences, thanks to Discord’s more than 200 million monthly active players. We’ve reached out to Discord for additional comment and will update with any more details we receive.

This article originally appeared on Engadget at https://www.engadget.com/discord-will-launch-a-native-meta-quest-app-next-year-183939524.html?src=rss 
