Court reduces damages Meta will get from spyware maker NSO Group but bans it from WhatsApp

US District Judge Phyllis Hamilton has reduced the damages Meta will receive from NSO Group from $167 million to $4 million, but she has also ordered the Israeli spyware maker to stop targeting WhatsApp. If you’ll recall, Meta sued NSO Group in 2019 over its Pegasus spyware, which it said was used to spy on 1,400 people from 20 countries, including journalists and human rights activists. Meta said at the time that Pegasus can infect targets’ devices without any action on their part, by sending WhatsApp messages containing malicious code. Even a missed call is enough to infect somebody’s device.

According to Courthouse News Service, Hamilton reduced the damages because they had to follow a legal framework designed to keep damages proportionate. However, she also handed down a permanent injunction against NSO Group’s efforts to break into WhatsApp. In her decision, she took note of statements made by NSO’s lawyers and its own CEO revealing that the company hasn’t stopped collecting WhatsApp messages and trying to get around the messaging app’s security measures. The defendants previously said that the injunction Meta was requesting would “put NSO’s entire enterprise at risk” and “force NSO out of business,” since WhatsApp is one of the Pegasus spyware’s main ways of infecting targets’ devices.

“Today’s ruling bans spyware maker NSO from ever targeting WhatsApp and our global users again,” said Will Cathcart, Head of WhatsApp. “We applaud this decision that comes after six years of litigation to hold NSO accountable for targeting members of civil society. It sets an important precedent that there are serious consequences to attacking an American company.” 

Hamilton wrote that the proposed injunction requires the Israeli company to delete and destroy computer code related to Meta’s platforms, and that she concluded that the provision is “necessary to prevent future violations, especially given the undetectable nature of defendants’ technology.” It’s not quite clear how Meta will ensure that NSO Group doesn’t use WhatsApp to infect its users’ devices again. Notably, NSO Group was recently acquired by an American investment group, which put tens of millions of dollars into the company to take controlling ownership.

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/court-reduces-damages-meta-will-get-from-spyware-maker-nso-group-but-bans-it-from-whatsapp-163016648.html?src=rss 

Prince Andrew’s Kids: Everything to Know About Princesses Beatrice & Eugenie

Queen Elizabeth II has a total of eight grandchildren, including the two daughters of Prince Andrew, Duke of York. Meet Princesses Beatrice and Eugenie.

Google has killed Privacy Sandbox

Google’s Privacy Sandbox is officially dead. In an update on the project’s website, Google Vice President Anthony Chavez announced that the company is sunsetting the remaining technologies developed for the Sandbox due to their “low levels of adoption.” A spokesperson confirmed to AdWeek that Google isn’t just killing those technologies; it’s retiring the whole initiative altogether. “We will be continuing our work to improve privacy across Chrome, Android and the web, but moving away from the Privacy Sandbox branding,” the spokesperson said. “We’re grateful to everyone who contributed to this initiative, and will continue to collaborate with the industry to develop and advance platform technologies that help support a healthy and thriving web.”

The company launched Privacy Sandbox in 2019 as a future replacement for third-party cookies. It’s a set of open standards meant to enable personalized ads without divulging identifying data. Over the years, Google’s plans to deprecate third-party cookies got pushed back again and again due to a series of delays and regulatory hurdles. Specifically, both the UK’s Competition and Markets Authority (CMA) and the US Department of Justice looked into the Privacy Sandbox out of concerns that it could harm smaller advertisers.

In 2024, Google ultimately decided not to kill third-party cookies in Chrome and instead chose to roll out “a new experience in Chrome that lets people make an informed choice that applies across their web browsing.” Just this April, Google announced that it wasn’t going to make any changes to how third-party cookies work in the Chrome browser at all, and that it was going to “maintain [its] current approach to offering users third-party cookie choice in Chrome.” At the time, the company said that it was going to keep the Privacy Sandbox initiative alive, but things have clearly changed since then. Chavez wrote in the latest update that Google will “continue to utilize learnings from the retired Privacy Sandbox technologies.”

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/google-has-killed-privacy-sandbox-130029899.html?src=rss 

A spooky NES platformer, more N++ and other new indie games worth checking out

Welcome to our latest roundup of what’s going on in the indie game space. I’ve reluctantly paused Ball x Pit for long enough to share some neat new releases and more details on upcoming games — some of which are arriving very soon. We’ve got a notable update for a classic as well. 

Steam Next Fest is taking place at the minute, and you still have until Monday to join in by checking out some of the many, many demos that have gone live for the event. Thanks partially (okay, almost entirely) to being unable to escape Ball x Pit, I’ve only tried a few Next Fest demos so far.

I’m a fan of Aerial_Knight’s Never Yield and after last year’s sequel, it’s cool to see solo developer Neil Jones (aka Aerial_Knight) trying something totally different. Aerial_Knight’s DropShot is a skydiving first-person shooter with finger guns and dragons. It’s a single-player game in which the aim is to take out your opponents and reach the ground first. Like Jones’ previous games, it’s stylish and fast-paced. I’m planning to check out the full game when it arrives down the line.

It certainly helps to be a fast, accurate typist when you put words together for a living, but I wasn’t quick or precise enough to win any rounds in the Final Sentence demo. This is a battle royale for up to 100 players in which you’re at a typewriter and have to bash out sentences (or other strings of letters, numbers and symbols) in a race to the finish. If you run out of time, make too many mistakes or don’t win, it’s lights out, courtesy of the masked figure with a revolver who’s standing in front of you.

There are some nice touches here. Having to type out the rules in the first few rounds is a clever idea on the part of developer Button Mash. I haven’t won a round myself yet (I finished in second place a couple of times), but I’ve watched some streamers play. It’s very funny when the winning player flips the bird at the guy holding a revolver in front of them.

Final Sentence is coming to Steam later this year. Maybe I’ll have learned how to spell “sphinx” by then.

There are a few other Next Fest demos I’d like to try this weekend, namely:

Crashout Crew (Overcooked-style co-op chaos with forklifts)

Reanimal (co-op horror from the team behind Little Nightmares and Little Nightmares 2)

Slots & Daggers (as if I need another slot-machine-based roguelite in my life right now) 

The Last Caretaker (sci-fi survival)

Goodnight Universe (we’ll get to that)

I’ve been looking forward to Skate Story for forever, but I think I’m going to skip that demo. I’m already sold and I’m fine with waiting a couple more months before playing the whole thing.

There are a couple of showcases coming up next week that might be worth keeping an eye on. The third annual edition of DreadXP’s indie horror showcase is set for 1PM ET on October 23. You can catch that on the publisher’s YouTube channel.

Two hours later, you’ll be able to tune into the Galaxies Autumn showcase. This will feature more than 50 games, including world premieres, gameplay trailers and other announcements. Games that will be featured include PowerWash Simulator 2, Mouse: PI For Hire and Denshattack, all of which are firmly on my to-play list.

New releases

Mister Scary is a weird little guy. I love when you get to play a game as a weird little guy. The game of the same name is a spooky NES homebrew platformer from Calgames. 

Mister Scary can stomp on his enemies, or freeze or burn them after eating a snack. When Mister Scary ducks, he becomes immune to damage because he’s taking a nap. I appreciate that. Nothing scary can happen while you’re snoozing.

Mister Scary is $10 on Itch. You’ll need to plug the ROM into an NES emulator to become Mister Scary.

The only reason I still have Flash Player installed on my PC is so I can open N, which sits on my desktop, once in a while. I’ve been playing that classic freeware platformer for a long time, and now there’s a good reason for many people to revisit the third entry in the series. 

As a thank you to the N++ community, Metanet Software is releasing a free update to mark the 10th anniversary of the game’s PS4 debut. TEN++ is said to include the developer’s “most challenging levels yet.” Given how darn tough these games are already, that’s saying something. The update is available now on Steam and it’s coming to the console versions of N++ soon.

The Cabin Factory is an anomaly-hunting (i.e. spot the difference) game in the style of The Exit 8. You’ll examine horror-themed cabins that are built for use in movies and theme parks to make sure they aren’t actually haunted. If you spot an anomaly, you’ll want to get out of the cabin posthaste.

This $3 horror walking sim from International Cat Studios and publisher Future Friends Games debuted on Steam last year, and it just hit consoles in time for Halloween. It’s out now on PS4, PS5, Xbox One, Xbox Series X/S and Nintendo Switch.

Upcoming 

After CloverPit and Ball x Pit, I was planning to take a break from roguelikes/roguelites before diving into Hades 2. Alas, the latest game from the legendary Ron Gilbert now has a release date, and it’s very soon!

In Death by Scrolling, the aim is to collect enough gold to pay a ferryman so you can escape purgatory. However, there’s a wall of fire coming after you the whole time, so you’ll need to keep moving in order to try to stay alive. You’ll also need to avoid or stun an unkillable grim reaper as you collect gold and gems that unlock upgrades. 

Death by Scrolling is from Gilbert’s Terrible Toybox and MicroProse Software. It’s coming to Steam on October 28.

There’s a lot going on in Silly Polly Beast. It’s safe to say this game is a shooter, but the release date trailer rapidly flits between perspectives and genres. There’s an emphasis on survival horror, along with puzzles and stealth segments. Polly will even sometimes remove the board that’s strapped to her back for some skateboarding sequences. 

This is said to have a story that morphs and evolves as much as the gameplay does. After escaping her hellish orphanage, Polly lands right in the underworld and has to navigate her way out of that too. 

Developer Andrei Chernyshov and publisher Top Hat Studios are behind Silly Polly Beast. It’s coming to Steam, PS4, PS5, Xbox One, Xbox Series X/S and Nintendo Switch on October 28.

Also coming to Steam on October 28 is a project from Autoscopia Interactive that’s designed to be played in a single sitting. As Long As You’re Here is a first-person game that places you in the role of Annie, a woman with Alzheimer’s disease. Her memories of the past, including her late brother, blend into the present as she settles into living with her family.

As Long As You’re Here started as a student project by Marlène Delrive, who was trying to better understand what her grandmother was experiencing in her final years. “The aim is to create a mature and nuanced experience that shows the difficult repercussions of losing not only your memory, but also your agency and sense of time and place,” the developers said.

Let’s close things out with a new trailer for Goodnight Universe. This is a cinematic adventure in which you play as a six-month-old baby. This particular infant, Isaac, is incredibly intelligent and has psychic powers. He simply desires familial love and acceptance, but (shock horror!) a tech company wants to take the tot away.

As with Nice Dreams’ last game (the stupendous Before Your Eyes), you control Goodnight Universe with your peepers via your device’s camera. It seems fascinating, and I really have to check out the Next Fest demo. Publisher Skybound Games is bringing it to Steam, Xbox Series X/S, PS5, Nintendo Switch and Switch 2 on November 11.

This article originally appeared on Engadget at https://www.engadget.com/gaming/a-spooky-nes-platformer-more-n-and-other-new-indie-games-worth-checking-out-110000259.html?src=rss 

SpaceX’s Starshield satellites are reportedly transmitting signals on unauthorized frequencies

SpaceX may be violating international telecommunication standards by allowing its Starshield satellites to transmit to Earth on frequencies it’s not supposed to use, NPR reports. Starshield is a classified version of SpaceX’s Starlink satellite network offered on contract to government agencies “to support national security efforts,” according to the company’s website.

The report is based on findings from amateur satellite tracker Scott Tilley, who observed what appeared to be Starshield satellites broadcasting on frequencies normally dedicated to “uplink” transmissions from the Earth to satellites in orbit. Using the frequencies that way violates standards set by the International Telecommunication Union, a United Nations agency dedicated to coordinating the use of radio spectrum across the world.

Standards around which frequencies are used for uplink and downlink broadcasts to satellites were created to avoid interference, among other technical issues. “Nearby satellites could receive radio-frequency interference and could perhaps not respond properly to commands — or ignore commands — from Earth,” Tilley told NPR. It’s not clear yet whether SpaceX ignoring these rules is causing any issues with satellite communication, but should problems arise, there’s now a possible cause.

SpaceX’s first major Starshield project was a $70 million contract with the US Space Force in 2023. More recently, in 2024, there were reports that SpaceX’s Starshield division had been tasked with building out a network of spy satellites to gather imagery of Earth for the Department of Defense’s National Reconnaissance Office.

This article originally appeared on Engadget at https://www.engadget.com/science/space/spacexs-starshield-satellites-are-reportedly-transmitting-signals-on-unauthorized-frequencies-212939991.html?src=rss 

Ring’s latest partnership allows police to access camera footage through Flock

Amazon’s Ring brand is entering into a new partnership with surveillance company Flock Safety to make it possible for law enforcement to request footage from smart doorbell owners. The move is part of a pivot back to collaborating with police, after Ring spent several years distancing itself and its products from law enforcement agencies.

As part of the partnership, “public safety agencies” using Flock’s Nova platform or FlockOS will be able to use Ring’s previously announced “Community Requests” program to receive footage captured by the camera of a Ring customer. Agencies investigating an event that might have been captured on camera will have to provide details like the “specific location and timeframe of the incident, a unique investigation code, and details about what is being investigated” before the request is passed on to relevant users. Throughout the process, Ring users remain anonymous, as does their decision on whether to share footage. The entire process is also entirely optional.

Amazon and Ring’s approach to working with law enforcement has varied over the years. While Ring reportedly removed the ability for police to make warrantless video requests in 2024, there were documented cases of the company providing access to law enforcement in years prior. This pivot back towards a more police-friendly stance might have been prompted by Ring founder Jamie Siminoff returning to the Amazon subsidiary in April 2025. Now Amazon is reportedly pitching its cloud and AI services to law enforcement agencies and Ring is looking to work with Flock and other surveillance companies.

That might not bother the average Ring customer who already planned to opt out of sharing, but there are reasons to be concerned about Amazon buddying up with Flock. 404 Media reports the company’s surveillance tools have been used by Immigration and Customs Enforcement (ICE) to find and detain people, without a formal contract. Navy and Secret Service employees also reportedly had access to Flock’s network. That doesn’t implicate Ring in anything, but it does make the connection between the two camera networks feel more fraught.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/rings-latest-partnership-allows-police-to-access-camera-footage-through-flock-194609879.html?src=rss 

Facebook’s latest AI feature can scan your phone’s camera roll

A Facebook feature that scans your phone’s photo library to make AI collages and edits is now available in North America. Meta tested it earlier this year. It’s an opt-in feature, but the company may train its models on your media if you use its AI editing or share the results.

From a user experience perspective, the idea is to help you find “hidden gems” in your library and turn them into something shareable. After scanning your photo library (with your permission), it will cough up suggestions. For example, it might recommend a collage based on a vacation or a recap of a graduation party, or simply offer to spruce up some photos with AI. For better or worse, it’s another step in the direction of automating creativity and skill.

Zooming out to Meta’s business motives, it’s easy to imagine this is a move for more AI training data. The company says it won’t train its AI on your camera roll “unless you choose to edit this media with our AI tools, or share.” If you find it useful enough to use, your media may help train Meta’s AI models.

The company says the feature’s suggestions are private to you until you choose to share them. Its permissions state, “To create ideas for you, we’ll select media from your camera roll and upload it to our cloud on an ongoing basis, based on info like time, location or themes.” However, Meta says your media won’t be used for ad targeting.

Fortunately, it’s opt-in, so you can safely ignore this altogether without privacy worries. If you grant it permission, you’ll see its suggestions (visible only to you) in Stories and Feed. And should you activate it but change your mind later, you can turn it off again through Facebook’s camera roll settings.

The feature is available now in the US and Canada. Meta says it will soon begin testing it in other countries.

This article originally appeared on Engadget at https://www.engadget.com/ai/facebooks-latest-ai-feature-can-scan-your-phones-camera-roll-200056906.html?src=rss 

The US and Saudi Arabia just derailed a global plan to cut shipping emissions

The US and Saudi Arabia have managed to derail negotiations regarding a landmark deal to cut global shipping emissions, according to a report by the BBC. The deal had already been approved and would have made shipping the world’s first industry to adopt internationally mandated emissions guidelines.

Representatives from more than 100 countries had gathered in London to formally approve the so-called global carbon tax, after nearly ten years of negotiations. However, the US government had been pressuring countries to vote “no” on the measure, threatening tariffs on those that didn’t comply.

The US also threatened other sanctions, including blocking vessels from ports and visa restrictions. President Trump has called it a “global green new scam.” The country withdrew from talks back in April, just before the plan was approved.

Saudi Arabia then instituted a plan to derail negotiations. The country tabled a motion to adjourn talks for a year, just as most countries were set to vote on the measure. The motion passed by a handful of votes, with support from both the US and Russia.

This essentially destroys the plan, despite technically being just a delay, as timelines will have to be renegotiated. US Secretary of State Marco Rubio declared the outcome a “huge win” for Trump.

Even the shipping industry was on-board with the plan, as it offered consistent global standards that don’t currently exist. Industries like certainty. Thomas Kazakos, secretary-general of the International Chamber of Shipping, said that the organization is “disappointed that member states have not been able to agree a way forward at this meeting.” He also said that the “industry needs clarity to be able to make the investments.”


Meanwhile, carbon dioxide levels reached record highs in 2024 and we aren’t doing much about it. This agreement would’ve forced ship owners to use cleaner fuels beginning in 2028, or face fines. Shipping currently makes up around three percent of global carbon emissions, but that’s expected to rise by anywhere from 10 to 150 percent by 2050.

Countries are expected to reconvene in April to discuss the plan, but that meeting will likely not feature a vote. Instead, expect a renegotiation from the ground up.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/the-us-and-saudi-arabia-just-derailed-a-global-plan-to-cut-shipping-emissions-184204170.html?src=rss 

Meta Ray-Ban Display review: Chunky frames with impressive abilities

I’ve been wearing the $800 Meta Ray-Ban Display glasses daily for ten days and I’m still a bit conflicted. On one hand, I’m still not entirely comfortable with how they look. I’ve worn them on the bus, at the office, on walks around my neighborhood and during hangouts with friends. Each time, I’m very aware that I probably look a bit strange.

On the other hand, there’s a lot I really like about using these glasses. The built-in display has helped me look at my phone less throughout the day. The neural band feels more innovative than any wrist-based device I’ve tried. Together, it feels like a significant milestone for smart glasses overall. But it’s also very much a first-generation device with some issues that still need to be worked out.

Chunky statement glasses or hideously nerdy?

To once again state the obvious: The frames are extremely chunky and too wide for my face. The dark black frames I tried for this review unfortunately accentuate the extra thickness. I won’t pretend it’s my best look and I did feel a bit self-conscious at times wearing these in public. Meta also makes a light brown “sand” color that I tried at the Connect event, and I think that color is a bit more flattering, even if the frames are just as oversized. (Sidenote: Smart glasses companies, please, please make your frames available in something other than black!) 

But everyone has a different face shape, skin tone and general ability to “pull off” what one of my friends charitably described as “chunky statement glasses.” What looks not-great on my face may look good on someone else. I really wish Meta could have squeezed this tech into slightly smaller frames, but I did get more used to the look the more I wore them. Overall, I do think the size is a reasonable tradeoff for a first-generation product that’s pretty clearly aimed at early adopters.

Here’s how they look in the lighter “sand” color.

Karissa Bell for Engadget

The reason the glasses are so thick compared with Meta’s other frames is that there are a lot of extra components to power the display, including a mini projector and waveguide. And, at 69 grams, the display glasses are noticeably heavier. I didn’t find them particularly uncomfortable at first, but there is a noticeable pressure after six or seven hours of wear. Plus, the extra weight and width also made them consistently slide down my nose. I’m not sure I’d feel comfortable wearing these on a bike ride or a jog, as I’d worry about them falling off.

While I tested these, I was very interested to get reactions from friends and family. I didn’t get many positive comments about how they looked on my face, though a few particularly generous colleagues assured me I was “pulling them off.” But seeing people’s reactions as soon as the display activated was another matter. Almost everyone had the same initial reaction: “whoa.”

Quality display with some limitations

As I discussed in my initial impressions, these glasses have a monocular display on the right side, so it doesn’t offer the kind of immersive AR I experienced with the Orion prototype last year. You have to look slightly up and to the right to focus on the full-color display. It’s impressively bright and clear, but doesn’t overtake your vision. 

At 20 degrees, the field of view is small, but it never felt like a limitation. Because the content you see isn’t meant to be immersive, it never feels like what’s on the display is being cut off or like you have to adjust where you’re looking to properly see it. The display itself has three main menus: an app launcher, a kind of home screen where you can access Meta AI and view notifications, and a settings page for adjusting brightness, volume and other preferences.

Like Meta’s other glasses, there’s an LED that lights up when the camera is in use.

Karissa Bell for Engadget

For now, there are only a handful of Meta-created “apps” available. You can check your Instagram, WhatsApp and Messenger inboxes and chat with Meta AI. There’s also a simple maps app for walking navigation, a music/audio player, camera and live translation and captioning features. There’s also a mini puzzle game called “Hypertrail.”

One of my favorite integrations was the ability to check Instagram DMs. Not only can you quickly read and respond to messages, you can watch Reels sent by your friends. While the video quality isn’t as high as what you’d see on your phone, there’s something very cool about quickly watching a clip without having to pull out your phone. Meta is also working on a standalone Reels experience that I’m very much looking forward to.

I also enjoyed being able to view media sent in my family group chats on WhatsApp. I often would end up revisiting the photos and videos once I pulled out my phone, but being able to instantly see these messages as they came in tickled whatever part of my brain responds to instant gratification.

There’s some impressive tech inside those thick frames.

Karissa Bell for Engadget

The display also solves one of my biggest complaints with Meta’s other smart glasses: that it’s really difficult to frame photos. When you open the camera app on the display model, you can see a preview of the photo and even use a gesture to zoom in to properly frame your shot. Similarly, if you’re on a WhatsApp video call you can see both the other person’s video as well as a small preview of your own like you would on your phone’s screen. It’s a cool trick but the small display felt too cramped for a proper video call. People I used this with also told me that my video feed had some quality issues despite being on Wi-Fi.

The glasses’ live captioning and translation features are probably the best examples of Meta bringing its existing AI features into the display. I’ve written before about how Meta AI’s translation abilities are one of my favorite features of the Ray-Ban smart glasses. Live translation on the display is even better, because it delivers a real-time text feed of what the person in front of you is saying. I tried it out with my husband, a native Spanish speaker, and it was even more natural than the non-display glasses because I didn’t have to pause and wait for the audio to relay what he was saying. The translation still wasn’t perfect, and there were a few occasions when it didn’t catch everything he said, but it made the process so much simpler overall.

Likewise, live captioning transcribes conversations in real time into a similar text feed. I’ve found that it’s a cool way to demo these glasses’ capabilities, but I haven’t yet found an occasion to use it outside of a demo. However, I still think it could be useful as an accessibility aid for anyone who has trouble hearing or processing audio.

Another feature that’s useful for travel is walking navigation. Dictate an address or location (you can say something like “take me to the closest Starbucks”) and the glasses’ display will guide you on your route. The first time I tried this was the roughly 10-minute walk from my bus stop to Yahoo’s San Francisco office. The route only required two turns, but it didn’t quite work. My glasses confidently navigated me to an alleyway behind the office building rather than the entrance. These kinds of mishaps happen with lots of mapping tools — Meta’s maps rely on data from OpenStreetMap and Overture — but it was a good reminder that it’s still early days for this product. 

I don’t use Meta AI a ton on any of my smart glasses, but having a bit of visual feedback for these interactions was a nice change. I retain information much better from reading than listening, so seeing text-based output to my queries felt a lot more helpful. It’s also nice that for longer responses from the assistant, you can stop the audio playback and swipe through informational cards instead.

Meta AI on the glasses’ display delivers information in a card-like interface.

Meta

While cooking dinner one night, I asked for a quick recipe for teriyaki salmon and Meta AI put what seemed like a passable recipe on the display. The only drawback was that the display goes to sleep pretty quickly unless you continue to interact with the content you’re seeing, so the recipe I liked disappeared before I could actually attempt it. (You can view your Meta AI history in the Meta AI app if you really want to revisit something.)

My main complaint is that I want to be able to do much more with the display. Messaging app integrations are nice, but I wish the display worked with more of the apps on my phone. When it worked best, I was happy to be able to view and dismiss messaging notifications without having to touch my phone; I just wish it worked with all my phone’s notifications. 

There are also some frustrating limitations on sending and receiving texts. For example, there’s no simple way to take a photo with your glasses and then text it to a friend directly from them. You have to wait for the glasses to send a “preview” of your message to your phone and then manually send the text. Or, you can opt in to Meta’s cloud services and send the photo immediately as a link, but I’m not sure many of my friends would readily open a “media.meta.com” URL.

The glasses also don’t really support non-WhatsApp group chats. You can receive messages sent in group chats, but there’s no indication the message originated in a group thread. And, it’s impossible to reply in the same thread; instead, replies are sent directly to the person who texted, which can get confusing if you’re not checking your phone. It was also a little annoying that reading and even replying to texts from my glasses wouldn’t mark the text as read in my phone’s inbox. Meta blames all this on Apple’s iOS restrictions, and says it’s hoping to work with the company to improve the experience. 

The band + battery life

The glasses are controlled using Meta’s Neural Band, which can translate subtle gestures like finger taps into actions on the display. Because the band relies on electromyography (EMG), you do need a fairly snug fit for it to work properly. I didn’t find it uncomfortable, but, like the glasses, I don’t love how it looks as a daily accessory. It also requires daily charging if you wear the glasses all day.

But the band does work surprisingly well. In more than a week, it almost never missed a gesture, and it never falsely registered one, despite my efforts to confuse it by fidgeting or rubbing my fingers together. The gestures themselves are also pretty intuitive and don’t take long to get used to: double-tapping your thumb and middle finger together wakes the display or puts it to sleep, single taps of your index and middle fingers let you select an item or go back, and swiping your thumb along the side of your index finger lets you navigate around the display. There are a few others, but those are the ones I used most often.

The Meta Neural Band requires a snug fit to work properly.

Karissa Bell for Engadget

Each time you make a gesture, the band emits a small vibration so you get a bit of haptic feedback letting you know it registered. I’ve used hand tracking-based navigation in various VR, AR and mixed reality devices and I’ve always felt a bit goofy waving my hands around. But the neural band gestures work when your hand is by your side or in your pocket. 

The other major drawback of these glasses is that heavy use of the display drains the battery pretty quickly. Meta says the Ray-Ban Display’s battery can go about six hours on a single charge, but it really depends on how much you’re using the display. With very limited use, I was able to stretch the battery to about seven hours, but if you’re doing display-intensive tasks like video calling or live translation, it will die much, much more quickly. 

The Meta Ray-Ban Display glasses, charging case and neural band.

Karissa Bell for Engadget

The glasses do come with a charging case that can deliver a few extra charges on the go, but I was a bit surprised at how often I had to recharge the case itself. With my normal Ray-Ban Meta glasses I can go several days without topping up the charging case, but with the Meta Ray-Ban Display case, I’m charging it at least every other day. 

Privacy and safety

Whenever I write or post on social media about a pair of Meta-branded glasses, I inevitably hear from people concerned about the privacy implications of these devices. As I wrote in my recent review of Meta’s second-gen Ray-Ban glasses, I share a lot of these concerns. Meta has made subtle but meaningful changes to its glasses’ privacy policy over the last year, and its track record suggests these devices will inevitably scoop up more of our data over time.

In terms of privacy implications, the display-enabled glasses aren’t meaningfully different from their display-free counterparts; Meta’s policies are the same for all its wearables. I suppose you could use live translation to surreptitiously eavesdrop on a conversation you wouldn’t typically understand, though that’s technically possible with Meta’s other glasses too. And the addition of a wrist-based controller means taking photos is a bit less obvious, but there’s still an LED indicator that lights up when the camera is on. 

The neural band allows you to snap photos without touching the capture button or using a voice command.

Karissa Bell for Engadget

I have been surprised at how many people have asked me if these glasses have some kind of facial recognition abilities. I’m not sure if that’s a sign of people’s general distrust of Meta, or an assumption based on seeing similar glasses in sci-fi flicks, but I do think it’s telling. (They don’t, to be clear. Meta currently only uses facial recognition for two safety-related features on Facebook and Instagram.) Meta hasn’t done much to earn people’s trust when it comes to privacy, and I wish the company would use its growing wearables business to try to prove otherwise.

On a more practical level, I have some safety concerns. The display didn’t hinder my situational awareness while walking, but I could see how it might for others. And I’m definitely not comfortable using the display while driving. Meta does have an audio-only “driving detection” setting that can automatically kick in when you’re traveling in a car, but the feature is optional, which seems potentially problematic. 

Should you buy these?

In short: probably not. As much as I’ve been genuinely impressed with Meta’s display tech, I don’t think these glasses make sense for most people right now. And, at $800, the Meta Ray-Ban Display glasses are more than twice as much as the company’s very good second-generation Ray-Ban glasses, which come in a wide range of much more normal-looking frame styles and colors. 

The Meta Ray-Ban Display glasses, on the other hand, still look very much like a first-gen product. There are some really compelling use cases for the display, but its functionality is limited. The glasses are also too thick and bulky for what’s meant to be an everyday accessory. At the end of the day, most people want glasses that make them look good. There’s also the fact that, right now, these glasses are somewhat difficult to actually buy. They are only available at a handful of physical retailers, which currently have a very limited supply. Meta is also requiring would-be buyers to schedule demo appointments in order to buy, though some stores — like the LensCrafters where I bought my pair — aren’t enforcing this.

Still, there’s a lot to be excited about. Watching people’s reactions to trying these has been almost as much fun as using them myself. Meta also has a solid lineup of new features already in the works, including a standalone Reels app, a teleprompter and gesture-based handwriting for message replies. If you’re already all-in on smart glasses or, like me, you’ve been patiently waiting for glasses with a high quality, usable display, then the Meta Ray-Ban Display glasses are worth the investment now — as long as you can accept the thick frames.

This article originally appeared on Engadget at https://www.engadget.com/wearables/meta-ray-ban-display-review-chunky-frames-with-impressive-abilities-193127070.html?src=rss 
