Lawsuit says Mark Zuckerberg approved Meta’s use of pirated materials to train Llama AI

Meta knowingly used pirated materials to train its Llama AI models — with the blessing of company chief Mark Zuckerberg — according to an ongoing copyright lawsuit against the company. As TechCrunch reports, the plaintiffs in the Kadrey v. Meta case submitted court documents describing the company’s use of the LibGen dataset for AI training. 

LibGen is generally described as a “shadow library” that provides file-sharing access to academic and general-interest books, journals, images and other materials. The counsel for the plaintiffs, which include writers Sarah Silverman and Ta-Nehisi Coates, accused Zuckerberg of approving the use of LibGen for training despite concerns raised by company executives and employees who described it as a “dataset [they] know to be pirated.”

The company removed copyright information from LibGen materials, the complaint also said, before feeding them to Llama. Meta apparently admitted in a document submitted to court that it “remov[ed] all the copyright paragraphs from beginning and the end” of scientific journal articles. One of its engineers even reportedly made a script to automatically delete copyright information. The counsel argued that Meta did so to conceal its copyright infringement activities from the public. In addition, the counsel mentioned that Meta admitted to torrenting LibGen materials, even though its engineers felt uneasy about sharing them “from a [Meta-owned] corporate laptop.”

Silverman, alongside other writers, sued Meta and OpenAI for copyright infringement in 2023. They accused the companies of using pirated materials from shadow libraries to train their AI models. The court previously dismissed some of their claims, but the plaintiffs said their amended complaint supports their allegations and addresses the court’s earlier reasons for dismissal. 

This article originally appeared on Engadget at https://www.engadget.com/ai/lawsuit-says-mark-zuckerberg-approved-metas-use-of-pirated-materials-to-train-llama-ai-141548827.html?src=rss 

A new initiative will fund and support open-source Chromium projects

Google has teamed up with the Linux Foundation to establish a new initiative called the “Supporters of Chromium-Based Browsers.” At the moment, most of the funding for Chromium, the open-source browser project whose codebase powers Chrome, comes from Google. The company says it has no intention of reducing its contribution going forward, but it also continues to “welcome others stepping up to invest more.” 

Under the Linux Foundation’s management, the new initiative aims to fund the open development of Chromium projects and ensure proper support for contributions that could lead to technological advancements. It’s also meant to provide a “neutral space” where developers, academics and big industry players can work together. Aside from Google, Microsoft, Meta and Opera have also pledged their support for the initiative. 

Google said it established the new program after hearing from “many companies and developers about how critical the Chromium project is to their work” and how they would like to give it more than direct engineering support over the years. Chrome is just one of the browsers built on Chromium — Microsoft’s Edge and Opera are also based on the project’s codebase, so their involvement in the initiative doesn’t really come as a surprise. 

It’s worth noting that the Department of Justice called for the breakup of Google last year, including a sale of the Chrome web browser. Google said in its announcement that it intends to continue supporting the Chromium project, but only time will tell if selling off Chrome will affect its contributions. 

This article originally appeared on Engadget at https://www.engadget.com/computing/a-new-initiative-will-fund-and-support-open-source-chromium-projects-143028118.html?src=rss 

Samsung isn’t talking about Eclipsa Audio at CES 2025

Before CES 2025 kicked off in Las Vegas, Samsung announced that its spatial audio collaboration with Google would be available on its 2025 TVs and soundbars. Finer details on the platform were noticeably absent from that announcement, with the company only noting that the 3D Eclipsa Audio would be available this year for YouTube content creators. There was also the general explanation that the platform would enable creators “to adjust audio data such as the location and intensity of sounds, along with spatial reflections, to create an immersive three-dimensional sound experience,” according to the press release.

If that sounds like Dolby Atmos to you, that’s because replicating it is exactly what I assume Samsung and Google are trying to do here. And if Samsung really wants its own immersive audio standard, there’s a backstory worth revisiting. In 2023, Samsung and Google first revealed their spatial audio ambitions. At the time, Samsung said its research division had been working on 3D audio since 2020, and the first fruit of the collaboration was the open-source Immersive Audio Model and Formats (IAMF), adopted by the Alliance for Open Media (AOM) in October 2023. 

There’s also the fact that Samsung doesn’t offer Dolby Vision on its TVs. Instead, the company uses HDR10+, an open-source and royalty-free platform for encoding HDR metadata. And in that 2023 audio announcement, Samsung Research’s WooHyun Nam explained that 3D sound technology needed to be open to everyone too. “Providing a complete open-source framework for 3D audio, from creation to delivery and playback, will allow for even more diverse audio content experiences in the future,” he said.

Samsung currently supports Dolby Atmos on its soundbars, including its flagship Q990 series and the newly announced QS700F. It sounds like the company no longer wants to pay to license Atmos from Dolby. And in order to still offer immersive 3D audio on its products, this collaboration with Google aims to build the alternative. It’s worth noting that AOM counts Amazon, Apple and Netflix among its members, in addition to Google, Samsung and others. The group’s AV1 video format was introduced in 2018 and is now used across Netflix, YouTube, Twitch and other sites.

Samsung’s Q990F soundbar

Billy Steele for Engadget

The bizarre thing about all of this is that no one from Samsung wants to talk about Eclipsa Audio. I attended multiple events and product demos that the company hosted this week and the response when I asked about it was either “we haven’t been told anything” or “let me see if I can find someone who can talk about it.” The latter, of course, never manifested a “someone” or a follow-up. I even asked for a rep to tell me if the company wasn’t ready to discuss details and never heard back on that either. 

The most detailed explanation I’ve seen this week came from Arm, which is apparently also working on the development of Eclipsa Audio alongside Samsung and Google. The chip designer said that Eclipsa is a multi-channel surround sound format that’s built on IAMF. Vertical and horizontal channels will create the immersive sound, with the goal of making movies, music and television shows more compelling in your living room. Again, that’s exactly what Dolby Atmos already does. 

Arm further explained that Eclipsa Audio can automatically adjust sound based on the scene and that there will be a degree of customization for users. The bitstream can contain up to 28 input channels that can be fixed (instruments or microphones) or dynamic (vehicles in movie scenes), with support for LPCM, AAC, FLAC and Opus codecs. Binaural rendering is also available for earbuds and headphones, and the new tech will be available to content creators using consumer devices in their workflow. 
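Going only on the constraints Arm described — up to 28 input channels, each fixed or dynamic, carried in LPCM, AAC, FLAC or Opus — the per-stream metadata could be modeled roughly like this. This is an illustrative sketch, not the actual IAMF/Eclipsa bitstream layout; all class and field names here are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum

class Codec(Enum):
    # The four codecs Arm says the bitstream supports
    LPCM = "lpcm"
    AAC = "aac"
    FLAC = "flac"
    OPUS = "opus"

@dataclass
class InputChannel:
    name: str
    dynamic: bool  # False: fixed source (instrument, mic); True: moving source (e.g. a vehicle in a scene)

@dataclass
class EclipsaStream:
    codec: Codec
    channels: list[InputChannel] = field(default_factory=list)

    def add_channel(self, channel: InputChannel) -> None:
        # Arm's description caps a bitstream at 28 input channels
        if len(self.channels) >= 28:
            raise ValueError("at most 28 input channels per bitstream")
        self.channels.append(channel)

stream = EclipsaStream(codec=Codec.OPUS)
stream.add_channel(InputChannel("dialogue mic", dynamic=False))
stream.add_channel(InputChannel("car pass-by", dynamic=True))
print(len(stream.channels))  # 2
```

The fixed/dynamic split matters because it determines whether a channel carries a static position or a position track that a renderer must update over time.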

So far, Samsung and Google have only listed YouTube as the platform or service where Eclipsa Audio content will be available. If the duo truly wants to compete with Dolby Atmos, that list needs to expand quickly. Plus, Dolby already has the brand recognition and wide adoption in both the audio and home theater categories for Atmos. It’s even available in cars. 

Samsung said in its pre-CES announcement that it and Google would work with the Telecommunications Technology Association (TTA) to develop a certification program for devices that support Eclipsa Audio. So, it seems like serious groundwork has been laid to get this technology on devices, starting with Samsung’s own 2025 TVs and soundbars. But, as we saw with Sony 360 Reality Audio and the early days of Dolby Atmos Music, it can take time to build out a compelling library of content. That means Samsung will likely have to keep reminding us that Eclipsa Audio is a thing, even when it doesn’t have much more to say. 

This article originally appeared on Engadget at https://www.engadget.com/home/home-theater/samsung-isnt-talking-about-eclipsa-audio-at-ces-2025-130041782.html?src=rss 

Tesla finally launches the refreshed 2025 Model Y in the Asia-Pacific region

Tesla has quietly unveiled its facelifted Model Y with new styling that will help it keep up with rivals like Kia and Volvo. Though currently only available in the Asia-Pacific region, the refreshed “Juniper” model is likely to appear stateside in the coming months. That was the case with the revised Model 3, which first appeared in Asia in September 2023 and went on sale in the US in January the following year. 

The new Model Y retains the gawky proportions of its predecessor, but looks sleeker thanks to smoothed out front and rear ends. The smaller headlights bookend a slim lightbar across the front, with a similar treatment for the taillights. In the case of the lights, the new design language is more aligned with the Cybertruck than the Model 3. 


Many interior treatments on the Model Y are similar to the Model 3, with one notable exception. Like the Model 3, it has new ventilated seats, a rear-seat display and a light strip that wraps around much of the cabin. However, the new steering wheel lacks the turn signal buttons found on the Model 3; instead, the Juniper Model Y keeps a stalk like its predecessor. Tesla may have done that to stay competitive with rivals, particularly in China, where it’s up against the juggernaut BYD. 

Tesla is offering rear-wheel drive and long-range all-wheel drive versions in Australia, but no performance option for now. It’s promising up to 342 miles (551 km) of range by the WLTP cycle on the long-range model, or around 307 miles by US EPA standards. However, US models could have different battery specs and thus different range numbers.
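The quoted WLTP figure converts cleanly between units (a quick sanity check; note the ~307-mile EPA estimate is a different, stricter test cycle, not a unit conversion):

```python
# Tesla quotes 551 km WLTP for the long-range Model Y; verify the miles figure.
KM_PER_MILE = 1.609344
wltp_km = 551
wltp_miles = wltp_km / KM_PER_MILE
print(round(wltp_miles))  # 342
```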


The new model arrives in good time for Tesla. In 2024, the company saw its first drop in vehicle deliveries since 2012, even though it improved in its key market, China. The redesigned Model Y will start shipping there in March 2025 and is likely to arrive elsewhere in several months, though the company has yet to nail down a date for US deliveries. 

This article originally appeared on Engadget at https://www.engadget.com/transportation/evs/tesla-finally-launches-the-refreshed-2025-model-y-in-the-asia-pacific-region-133010038.html?src=rss 

The Morning After: Introducing the best of CES 2025 winners

As we finish up our live coverage of all things CES, it’s time to pick the best in show. So many of the new things we saw this year had an AI component, with a noticeable uptick in AR glasses, hearing aid earbuds, solar-powered tech, emotional support robots and robot vacuums. (Why this year, robovacs?)

Our list of CES 2025 winners covers various categories, ranging from typical Engadgety things like PCs, home entertainment and gaming to themed winners in sustainability and accessibility.

In fact, our best-in-show winner was an accessibility pick: the WeWalk Smart Cane 2. A high-tech version of the mobility cane for people who are blind struck us as the most genuinely helpful application of AI at the show. With a new voice assistant powered by GPT, users can speak directly to the cane to get navigation guidance, and its sensors alert the user to upcoming obstacles. Since the cane can handle things like turn-by-turn navigation, users don’t have to worry about holding a smartphone while trying to get around.

There were plenty of other winners too. Which laptop beat the rest? Read on for more!

— Mat Smith

Get this delivered daily direct to your inbox. Subscribe right here!

The biggest tech stories you missed

The CES gadgets you can actually buy right now

Ropet is the cute-as-hell emotional robot that the modern Furby wishes it could be

Sony’s XYN mixed-reality headset is being used in very different ways at CES 2025

Sony Honda Mobility’s Afeela 1 feels like a PlayStation 4 in the PS5 era

As the EV approaches the finish line, it’s time to get critical.


The automotive talk of CES was the Sony Afeela 1 — again. The company has been showing off some variation of this EV for five years at this point. Now the car is almost ready to launch, and the more specifications we hear, the warier we get. The maximum charge rate of the Afeela 1 is 150 kW for its 91 kWh battery, which provides an estimated 300 miles of range. Compare that to a cheaper Lucid Air, which can charge twice as quickly and cover over 400 miles on a charge, and you begin to see the problems. All of this in a car that costs nearly $90,000. The charming Tim Stevens takes Sony Honda Mobility to task — and not just for the company name.
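The charging gap is easy to put numbers on. Here's a back-of-envelope sketch assuming the peak rate could be held constantly over a typical 10–80% session (real-world charging tapers well before 80%, so actual times are longer; the function name and charge window are my assumptions):

```python
def ideal_charge_minutes(battery_kwh: float, peak_kw: float,
                         start: float = 0.10, end: float = 0.80) -> float:
    """Best-case 10-80% charge time in minutes if the peak rate never tapers."""
    return (end - start) * battery_kwh / peak_kw * 60

# Afeela 1: 91 kWh pack, 150 kW maximum charge rate
print(round(ideal_charge_minutes(91, 150)))  # 25
# A same-size pack charging "twice as quickly" at 300 kW
print(round(ideal_charge_minutes(91, 300)))  # 13
```

Even in this idealized case the Afeela needs roughly twice as long as a pack charging at double the rate, and real-world taper only widens that gap.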

Continue reading.

The weirdest tech of CES 2025

Sloth-koala robots? Sure.


We’ve curated all the crazy (and sometimes useful) devices we spotted out in the wild of the show floor at CES. Weird doesn’t necessarily mean bad — it just might not have the might of a multinational corporation… or the desire to change the world. Still, solar sun hat? Yes, please.

Continue reading.

Samsung’s The Frame Pro is a big upgrade for the art TV series

Better screen, a better premise.

Samsung’s The Frame TV lineup was a success. Rather than hanging on your wall as a black box when you’re not using it, it blends in with your home decor by showing art on the screen, and a single-cable design tidies the usual mess behind the TV. It inspired many imitators, but Samsung is finally back with a pro iteration. Most importantly, The Frame Pro now has a Neo QLED display: the same Mini LED tech that powers the company’s high-end QN900 series TVs.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/general/the-morning-after-engadget-newsletter-121506805.html?src=rss 

X’s Grok AI assistant is now a standalone app

Grok, the AI assistant that’s for some reason baked into X, is now available as a standalone app. Like the version that exists as a tab on the social media platform, the Grok app can be used to generate images, summarize text and answer questions, with a conversational tone xAI, the AI assistant’s creator, calls “humorous and engaging.”

The app was first tested with a limited set of users in December 2024, right around the same time X debuted a free tier of Grok that’s available to anyone. Prior to that, you needed to pay at least $8 a month for X Premium to have the privilege of using the AI.


The limitations of that free access — 10 requests every two hours, three image analysis requests per day — may also apply to the Grok app. You can use the app without signing in, or sign in with an Apple account, X account, Google account or a plain old email address. It’s not clear whether an X Premium subscription gets you added benefits in the Grok app the way it does on X.

Grok has struggled with similar issues around accuracy and bizarre image generation choices as other AI assistants like Gemini and ChatGPT. The chatbot mainly stands out from its competitors because xAI pitched it as being able to answer “spicy questions” other AI assistants avoid, and a version of the Grok AI model is open source. You’ll have to see for yourself how “spicy” the Grok app ultimately is, but at least you don’t have to go to X to use it now.

This article originally appeared on Engadget at https://www.engadget.com/apps/xs-grok-ai-assistant-is-now-a-standalone-app-225151579.html?src=rss 

Brelyon’s immersive display is the TARDIS of monitors

At CES 2025, Brelyon showed off its latest immersive display, the Ultra Reality Extend, and even after seeing it in person, my brain still can’t fully comprehend a monitor that looks bigger and deeper on the inside than it does on the outside.

Billed as the world’s first commercial multi-focal monitor, the Ultra Reality Extend merges the ease of use and simplicity of a traditional desktop display with the kind of spatial depth you can normally only get from a VR headset. Granted, the maximum simulated depth the Extend delivers is only 2.5 meters, which isn’t nearly as far as you’d get from devices like a Meta Quest 3S or an Apple Vision Pro, but considering that Brelyon’s monitor doesn’t require any additional equipment (aside from a connected PC), the effect is truly impressive. It’s much easier to use, too: all you have to do is sit in front of it and the monitor does the rest, which results in far less of the eye strain and potential nausea that many people experience with modern VR goggles.


This allows the monitor to defy its dimensions: even though it’s much chunkier than a typical display, the view inside is absolutely monstrous. From a 30-inch frame, the Ultra Reality Extend provides a virtual display that’s equivalent to a curved 122-inch screen. Meanwhile, its 4K/60Hz panel uses 1-bit of monocular depth to deliver spatial content that looks closer to 8K, with elements of the scene appearing closer or further away depending on the situation.
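The "equivalent to a 122-inch screen" claim checks out geometrically if you assume the virtual image sits at the monitor's stated maximum simulated depth of 2.5 meters (that placement is my assumption; Brelyon hasn't published the exact geometry):

```python
import math

def subtended_angle_deg(diagonal_m: float, distance_m: float) -> float:
    """Visual angle a screen diagonal spans from a given viewing distance."""
    return math.degrees(2 * math.atan((diagonal_m / 2) / distance_m))

INCH = 0.0254
# A virtual 122-inch image placed 2.5 m away...
virtual = subtended_angle_deg(122 * INCH, 2.5)
print(round(virtual, 1))  # 63.6 (degrees of visual field)

# ...spans the same angle as a 30-inch panel viewed from about 0.61 m,
# i.e. ordinary desk distance, which is why the image reads as enormous.
desk = (30 * INCH / 2) / math.tan(math.radians(virtual / 2))
print(round(desk, 2))  # 0.61
```

In other words, the physical 30-inch frame and the virtual 122-inch image fill the same slice of your vision; the depth cue is what makes the latter feel like a wall-sized screen.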

When I watched a game clip from Spider-Man, the trees and light poles whipping past my face felt so real I started to flinch subconsciously. Then in other scenes, Brelyon’s monitor was able to separate different layers of the content, making snow in the foreground look blurry as it whipped across the screen while characters in the distance remained tack sharp. It’s rather uncanny because the effect is visceral in a way that games and movies on flat screens just can’t match.

Meanwhile, underpinning the monitor is Brelyon’s Visual Engine, which allows the display to automatically assign different depths to elements in games and videos on the fly without additional programming. That said, developers can further optimize their content for Brelyon’s tech, allowing them to add even more depth and immersion.

Unfortunately, the Ultra Reality Extend’s unique approach to spatial content is quite expensive. While the monitor is available now, the company is targeting pricing between $5,000 and $8,000 per unit, with the exact number depending on the customer and any partnerships with Brelyon. Sadly, this means the display will be limited to enterprise buyers who will use it for things like ultra-realistic flight simulators with depth-enabled UI, rather than normal folks who might want a fancy monitor for movies and games. But if Brelyon’s tech takes off, one day, maybe…

This article originally appeared on Engadget at https://www.engadget.com/computing/breylons-immersive-display-is-the-tardis-of-monitors-233606873.html?src=rss 

Ropet is the cute-as-hell emotional robot at CES 2025 that the modern Furby wishes it could be

I wouldn’t go as far as to say it’s been dethroned, but Mirumi — the clingy fluffball with a staring problem — now has some serious competition for the title of cutest robot at CES 2025. I just met Ropet, a wide-eyed companion robot with warm, soft fur, little flapping arms and big feelings. And damn is that thing adorable.

Ropet’s sole mission is to love and be loved. Think of it like a living plushie; it has a personality, will listen to your deepest darkest secrets without judgment, and will reach out to hug you when you’re sad. Its appearance is customizable, and it has optional ChatGPT integration, so more advanced conversations with the robot are there if you want them, but you’re not forced into it. Its little button nose is a camera used for face and object recognition; Ropet can identify and bond more closely with its owner, but it will remember other people too. All of this data is stored and processed locally, meaning it never leaves the device.

If you don’t trust that (fair), you can keep Ropet entirely disconnected from the internet, and it’ll still be capable of performing all of its functions minus ChatGPT. That includes reacting to audio, touch and gesture inputs. If you pet or cuddle it, its expression will change to visible happiness. Or, it might look grumpy if you shake it around. (What are you, evil? Don’t do that). Show it one of the few dozen objects it knows, like a hotdog or a banana, and you’ll see an emoji of that pop up in its eyes. You can give it the “shush” sign to quiet it down, and it’ll dance along if you’re listening to music.


We’re at the point of CES week where we’re all running on fumes, and Ropet brought unexpected childlike glee to this burnt-out gremlin for a few minutes. Emotional companion robots are a ubiquitous presence at CES, but the ones that actually have some degree of smarts are not usually so snuggleable. And the ones that are snuggleable usually aren’t very smart, tending to come across more like animatronics. Ropet looks kind of like a fluffy baby seal — but not realistic enough to dip into uncanny valley territory — and its body gives off heat to simulate the feeling of holding a living creature.

It’s hard not to draw comparisons to Furby, which is probably the best-known example of a robotic creature pet that responds to voice and touch commands. But Ropet takes the whole idea to another level. You can change the color of its eyes in the app, and buy different face plates and furs if you want to mix things up or just can’t decide how you want it to look. There are also little outfits you can purchase.

A Kickstarter campaign for Ropet pulled in $228,091, wildly surpassing its $1,285 goal, so I’m definitely not the only one who thinks this little guy seems pretty promising. There are two purchase options for anyone who’s interested: Ropet Basic ($299), which comes with a case and a USB-C charging cord, and Ropet Pro ($329), which adds a charging base with light effects that lets Ropet rotate a little. The Kickstarter doesn’t end until January 21, so if you catch it before then you can get one for significantly less. Early bird orders are expected to begin shipping in March, and the rest will be unleashed upon the world later this year. 

Now we sit back and see whether Ropet will follow in the footsteps of Furby to develop its own mildly sinister lore that endures for decades to come.

This article originally appeared on Engadget at https://www.engadget.com/home/ropet-is-the-cute-as-hell-emotional-robot-at-ces-2025-that-the-modern-furby-wishes-it-could-be-214046211.html?src=rss 

Pick up BioShock 2 Remastered and Deus Ex in Prime Gaming’s January freebies

Amazon shared the latest list of video game titles that Prime members can snag for free this month. Members can pick up a code for BioShock 2 Remastered right now, and if you’re patient, you can also grab a free copy of Deus Ex GOTY Edition or Super Meat Boy Forever later in January.

The cloud-based Amazon Luna gaming service has also shared its current lineup of titles that Prime members can play. Airhead, Guacamelee! 2 Complete, The Magical Mixture Mill, Metro Exodus and Super Meat Boy are in the rotation for that service this month alongside Fallout 3: Game of the Year Edition, Fallout New Vegas: Ultimate Edition, Fortnite, LEGO Fortnite, Fortnite Festival, Fortnite Battle Royale, Rocket Racing and Trackmania.

Some of Prime Gaming’s freebies last for longer than 30 days, so you’ve also got some time left to pick up a copy of some of the December titles if you haven’t already loaded up on those deals. But if you’re looking ahead, here’s the full lineup of upcoming free Prime Gaming titles this month and when they’ll be available.

Now

BioShock 2 Remastered (GOG)

The Bridge (Epic Games Store)

Eastern Exorcist (Epic Games Store)

SkyDrift Infinity (Epic Games Store)

Spirit Mancer (Amazon Games App)

January 16

Are You Smarter Than a 5th Grader (Epic Games Store)

GRIP (GOG)

SteamWorld Quest: Hand of Gilgamech (GOG)

January 23

Deus Ex GOTY Edition (GOG)

Spitlings (Amazon Games App)

Star Stuff (Epic Games Store)

To the Rescue! (Epic Games Store)

Zombie Army 4: Dead War (Epic Games Store)

January 30

Blood West (GOG)

ENDER LILIES: Quietus of the Knights (Epic Games Store)

Super Meat Boy Forever (Epic Games Store)

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/pick-up-bioshock-2-remastered-and-deus-ex-in-prime-gamings-january-freebies-215515330.html?src=rss 

Sony’s XYN XR headset is being used in very different ways at CES 2025

At CES last year, Sony teased an AR/VR headset prototype focused on “spatial content creation.” At the same time, Siemens announced it was working with Sony to use that same hardware, including the two new controllers it developed, for something it was calling the “industrial metaverse.” That’s a lot of buzzwords, but at CES 2025, both Siemens and Sony showed the headsets and associated software in action, which cleared up a lot of what the companies are trying to do here.

During Sony’s CES press conference, it announced its XYN brand of software and hardware solutions, with the headset being a key part of the equation. The XYN “spatial capture solution” uses mirrorless cameras to scan objects and turn them into photorealistic 3D models. Using the XYN headset, you can then view those objects in 3D production software for animation, video games and other potential uses.

I got a chance to try the XYN headset on, as well as see some samples of the 3D objects that were scanned and manipulated. The demo itself was a little rocky, as so many VR demos can be, but essentially I was placed inside an animated world that had already been constructed. From there, I was able to import a geode / crystal-like object that had been scanned using the spatial capture tools. I could move it all around the virtual space, scaling it up to massive size or shrinking it down to a tiny pebble.


The headset itself felt well-constructed and sturdy for a prototype — the display flips up so you can get back into the real world quickly, and the headband was pretty comfortable and secure. As usual, though, it’s hard to evaluate how it’ll feel after an hour or two around your noggin. The controller wand felt a little fiddly to me — its somewhat unusual shape makes it well-suited to pointing, but figuring out how to “grab” down on things took me a bit. I can’t say how steep the learning curve is, but at least everything felt responsive and well-made.

While the demo itself wasn’t groundbreaking, it was a good example of the whole XYN pipeline, from capturing a 3D object to manipulating it and using it to build out a virtual environment. Sony says the XYN headset and its controllers are still in the prototype phase, but it wouldn’t surprise me if we hear more about public availability sooner rather than later.


That’s because Siemens announced this week that what appears to be the exact same headset and controllers are now on sale, albeit with a very different focus. Siemens coined the “industrial metaverse” phrase last year, and I got a chance to learn more about just what that means. It turns out that Sony originally built the headset for internal use for designers and engineers to build things in 3D space. They were already using Siemens software, so the companies started working together to optimize both sides of the experience — and now Siemens thinks they’re at a point where they can sell the headset and software bundles to enterprise customers.

Siemens highlighted its AR capabilities a bit more, showing off its NX Immersive Designer software and how the headset can serve as a virtual workspace — but one that lets you enlarge and manipulate the 3D objects you’re designing. You can also jump into VR mode to see the objects at full size and move around them using the headset’s controller. In this demo, I got to fly around massive 3D reproductions of a few airplanes, and while they weren’t the most detailed objects, the utility was clear.


I also used the second controller Sony developed in the Siemens demo. In addition to the pointer-style device, I had a ring over the index finger of my left hand. I used that to move around the virtual space; holding and turning my hand in a specific direction moved me forward and backward or up and down. As always, it took a minute to get my bearings, but before long I was getting right up close to the virtual planes and “flying” up to check out their details.

Siemens is definitely further along in the quest to bring this product to end users: the XR HMD is up for pre-order now for $4,750, and the company says it’ll begin shipping next month. So the hardware is definitely beyond the prototype phase — in Sony’s case, it’s probably more a matter of making sure the whole pipeline of XYN software and hardware works together before making it widely available.

Sony and Siemens definitely face a challenge showing people how these tools can be useful — a four-minute demo doesn’t really do the trick, and I’m neither an engineer nor a “content creator” who might use the XYN tools. But what I find most intriguing about this strategy is that Sony is recognizing that its headset isn’t a broad consumer product; instead, it’s finding different places and industries where the hardware might be useful. At this point, that’s probably a smart strategy, given that consumer-grade AR and VR remains very niche outside of the gaming sphere. But assuming Sony’s headset hardware is up to snuff, it wouldn’t surprise me to see other companies adopt it for their specific needs.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/sonys-xyn-xr-headset-is-being-used-in-very-different-ways-at-ces-2025-204020872.html?src=rss 
