Xbox is bringing Avowed to PS5

Another first-party Xbox game is making the leap to PlayStation 5. This time around, Obsidian’s Avowed — one of our favorite games of last year — is crossing the great divide. The fantasy action RPG will hit Sony’s console on February 17, one day shy of the game’s first anniversary. 

As it happens, an anniversary update is set to go live on all platforms at the same time. This includes a New Game+ mode (allowing those who have beaten the RPG to replay it with all their gear and upgrades from their previous run), a photo mode, a new weapon type and more. 

Avowed is set in the same universe as Obsidian’s Pillars of Eternity games. It tasks you with investigating a fungal plague that has infested the world. “The writing is stellar throughout, though the sidequests that reveal your companions’ backstories are particularly poignant,” Engadget senior reporter Jessica Conditt wrote. “Avowed is gorgeous, its combat systems are fully customizable, its characters are intriguing and its encumbrance limit is generous. There’s a real sense of magic about the entire game — and no, that’s not just the mind-altering mushrooms talking.”

Microsoft has brought a string of first-party Xbox games to PS5 over the last couple of years, freeing them from console exclusivity. Forza Horizon 5, Indiana Jones and the Great Circle, Senua’s Saga: Hellblade II and Sea of Thieves are among the games that have crossed over to PlayStation. Later this year, you’ll even be able to play a Halo game on PS5, something that was utterly unthinkable not too long ago. 

This article originally appeared on Engadget at https://www.engadget.com/gaming/playstation/xbox-is-bringing-avowed-to-ps5-120000035.html?src=rss 

Lumus brought a massively wider FOV to smartglasses at CES 2026

Lumus got a major boost in brand recognition when one of its waveguides was selected for use in the Meta Ray-Ban Display glasses. But that already feels like old tech: at CES 2026, the company brought some of its latest components to the show, and based on what I saw, they seem poised to seriously elevate the optical quality of the next wave of high-end smartglasses. 

When the Meta Ray-Ban Display glasses came out, they wowed users as they were (and still are) one of a handful of smartglasses to feature a full-color in-lens display with at least a 20-degree field of view. But going by the specs on Lumus’ newest waveguides, we’re set for a major upgrade in terms of future capabilities. 

If you look closely, you can see where light from the waveguide propagates into one of the smartglasses’ lenses.

Sam Rutherford for Engadget

The first model I tried featured Lumus’ optimized Z-30 waveguides, which not only offer a much wider 30-degree FOV, but are also 30 percent lighter and 40 percent thinner than previous generations. On top of that, Lumus says they are more power efficient, with the waveguides capable of hitting more than 8,000 nits per watt. This is a big deal because smartglasses are currently quite limited by the size of batteries they can use, especially if you want to make them small and light enough to wear all day. When I tried them on, I was dazzled by both the brightness and sharpness I saw from the Z-30s despite them being limited to 720 x 720 resolution. Not only did the increase in FOV feel much larger than 10 degrees, but colors were also very rich, including white, which is often one of the most difficult shades to properly reproduce.

I had to take a photo of one of Lumus’ non-functioning smartglasses with the company’s 70-degree FOV waveguide, because two out of three of the working ones had already broken and the last one I used was being held together by tape.

Sam Rutherford for Engadget

However, even after seeing how good that first model was, I was totally not prepared for Lumus’ 70-degree FOV waveguides. I was able to view some videos and a handful of test images and I was completely blown away by how much area they covered. It was basically the entire center portion of the lens, with only small unused areas around the corners. And while I did notice some pincushion distortion along the sides of the waveguide’s display, a Lumus representative told me that it will be possible to correct for that in final retail units. But make no mistake, these waveguides undoubtedly produced some of the sharpest, brightest and best-looking optics I’ve seen from any smartglasses, whether retail models or prototypes. It almost made me question how much wider a FOV these types of gadgets really need, though to be clear, I don’t think we’ve hit the point of diminishing returns yet. 

This is one of Lumus’ thinnest waveguides measuring in at just 0.8mm.

Sam Rutherford for Engadget

Other advantages of Lumus’ geometric reflective waveguides include better overall efficiency than their refractive counterparts along with the ability to optically bond the displays to smartglasses lenses. That means unlike a lot of rivals, Lumus’ waveguides can be paired with transition lenses instead of needing to resort to clip-on sunglass attachments when you go outside. Lumus also claims its design simplifies the manufacturing process, resulting in thinner waveguides (as small as 0.8mm) and generally higher yields. 

Unfortunately, taking high-quality photos of content from smartglasses displays is incredibly challenging, especially when you’re using extremely delicate prototypes, so you’ll just have to take my word for it for now. But with Lumus in the process of ramping up production of its new waveguides with help from partners including Quanta and SCHOTT, it feels like there will be a ton of smartglasses makers clamoring for these components as momentum continues to build around the industry’s pick for the next “big” thing. 

This article originally appeared on Engadget at https://www.engadget.com/wearables/lumus-brought-a-massively-wider-fov-to-smartglasses-at-ces-2026-233245949.html?src=rss 

YouTube will let you exclude Shorts from search results

YouTube introduced some new filters to its advanced search tools today. Possibly the most exciting change is that Shorts are now listed as a content type, so the three-minute-or-less videos can be excluded from your search results.

This is a welcome update for any of us who have been on the hunt for a long-form explainer only to wade through dozens of ten-second clips before finding anything close to our goal. With the flood of even more AI slop last year thanks to Google’s Veo 3 engine, an option to exclude Shorts looks more appealing than ever.

The other updates include a pair of renamed features within advanced search. The “Sort By” menu will now be called “Prioritize.” Likewise, the “View Count” option has been renamed to “Popularity”; this will allow YouTube’s algorithms to account for other metrics such as watch time to gauge how much other users are engaging with a particular video. A pair of former filter options have also been removed; there will no longer be choices to search for “Upload Date – Last Hour” and “Sort by Rating.”

This article originally appeared on Engadget at https://www.engadget.com/entertainment/youtube/youtube-will-let-you-exclude-shorts-from-search-results-204500097.html?src=rss 

Razer put a waifu in a bottle at CES 2026

Last year, Razer showed off Project Ava, a digital assistant that lived inside your computer to help adjust settings or provide gaming tips. But now at CES 2026, the company’s AI companion platform has gotten a major glow-up while moving into some new digs. 

Now, instead of being confined entirely to your PC’s screen, Razer has given Project Ava a real home in the form of a small tube that can display a 5.5-inch animated hologram of the AI’s avatar. You’ll still need to connect it to your computer via USB-C to provide Ava with the power and data it needs. However, all of your companion’s other components are built into its abode, including dual far-field mics so you can talk to it, a down-firing full-range speaker so it can talk back and an HD camera with an ambient light sensor so the AI can see and react to its surroundings. 

But perhaps the biggest upgrade to the project is that instead of just Ava, who Razer describes as “a calm, reliable source of energy to help you keep things clear, efficient, and always on point,” there are three or four new personas (depending on how we’re counting) joining the roster. Kira looks like a TikTok e-girl decked out in a frilly outfit complete with Razer neon green accents, while Zane is her edgy masculine alternative who kind of reminds me of the Giga Chad meme, but with extra snake tattoos. Then there’s Sao, who appears to be directly inspired by iconic Japanese salarywoman Saori Araki. Finally, there’s an avatar made in the likeness of Faker (Lee Sang-hyeok), the most successful and well-known League of Legends player of all time and one of Razer’s sponsored esports athletes.

The new peripheral for Project Ava is a cylinder that can display a 5.5-inch hologram of an AI companion.

Sam Rutherford for Engadget

The idea now is that instead of being trapped inside your computer, Ava or one of Razer’s other personas can sit on your desk and be your companion for everything. They can remind you of upcoming events, respond to questions or even comment on your outfit using Razer’s built-in camera. That said, if you need some privacy, the device’s mics can be muted, and the company says it’s planning to put a physical camera shutter on final retail models. Of course, Ava or any of the other avatars can still hang out while you game and give you advice. During my demo, Kira helped pick out a loadout in Battlefield 6 based on user criteria and even provided pros and cons for some of the game’s other equipment options. 

Project Ava’s expanded roster of AI companions

Razer

Unfortunately, while I did get to see Kira and Zane talk, dance and sway in their little bottles, Sao and Faker weren’t quite ready to make their holographic debuts. But according to Razer, that’s sort of by design as Project Ava is very much a work in progress. Currently, the avatars’ responses are generated by xAI’s Grok (yikes!), but the platform was created as a sort of open-source project that will support other models like Gemini or ChatGPT.

Down the line, Razer is hoping to add the ability for users to create their own unique avatars and companions based on their input or inspiration from real-world objects. Meanwhile, for avatars like Faker’s, because he’s an actual person, Razer wants additional time to make the AI companion helpful with topics like real-time League of Legends coaching.

Say hello to Giga Chad, I mean Zane.

Sam Rutherford for Engadget

That said, while some folks might find Project Ava a bit weird or unnerving, it actually feels pretty tame (almost cute even) in an era where people are already marrying their AI partners. And if you’re the kind of person who prefers digital companions over flesh-and-blood alternatives (you know, people), I guess it’s kind of nice to have a more tangible representation of your electronic waifus and husbandos.

Faker’s avatar was only viewable in this nearly life-size mock up.

Sam Rutherford for Engadget

Sadly, Razer has not announced pricing for Project Ava’s holographic peripheral, though a representative said that it will be in the same ballpark as the company’s other peripherals. I’m estimating a final cost of around $200. Reservations for Project Ava are currently live with a $20 deposit before official shipments begin sometime in the second half of 2026.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/razer-put-a-waifu-in-a-bottle-at-ces-2026-205315908.html?src=rss 

IXI’s autofocusing lenses are almost ready to replace multifocal glasses

While wave upon wave of smartglasses and face-based wearables crash on the shores of CES, traditional glasses really haven’t changed much over the hundreds of years we’ve been using them. The last innovation, arguably, was the progressive multifocal, which blended near and distance prescriptions into a single lens, and that was back in the 1950s. It makes sense that autofocusing glasses maker IXI thinks it’s time to modernize glasses.

After recently announcing a 22-gram (0.7-ounce) prototype frame, the startup is here in Las Vegas to show off working prototypes of its lenses, a key component of its autofocus glasses, which could be a game-changer. 

IXI’s glasses are designed for age-related farsightedness, a condition that affects many, if not most people over 45. They combine cameraless eye tracking with liquid crystal lenses that automatically activate when the glasses detect the user’s focus shifting. This means that, instead of having two separate prescriptions, as in multifocal or bifocal lenses, IXI’s lenses automatically switch between each prescription. Crucially — like most modern smartglasses — the frames themselves are lightweight and look like just another pair of normal glasses.

Mat Smith for Engadget

With a row of prototype frames and lenses laid out in front of him, CEO and co-founder Niko Eiden explained the technology, which can be separated into two parts. First, the IXI glasses track the movement of your eyes using a system of LEDs and photodiodes, dotted around the edges of where the lenses sit. The LEDs bounce invisible infrared light off the eyes and then measure the reflection, detecting the subtle movements of your eye and how both eyes converge when focusing on something close.

Using infrared with just a “handful of analog channels” takes far less power than the millions of pixels and 60-times-per-second processing required by camera-based systems. IXI’s system not only tracks eye movements, but also blinking and gaze direction, while consuming only 4 milliwatts of power.

Mat Smith for Engadget

Most of the technology, including memory, sensors, driving electronics and eye tracker, is in the front frame of the glasses and part of the arms closest to the hinge. The IXI prototype apparently uses batteries similar in size to those found in AirPods, which gives some sense of the size and weight of the tech being used. The charging port is integrated into the glasses’ left arm hinge. Naturally, this does mean they can’t be worn while charging. IXI says that a single charge should cover a whole day’s usage.

The prototype frames I saw this week appeared to be roughly the same weight as my traditional chunky specs. And while these are early iterations, IXI’s first frames wouldn’t look out of place in a lineup of spectacle options.

The team has also refined the nose pieces and glasses arms to accommodate different face shapes. Apparently, when testing expanded from Finland to the UK, British faces were “…different.” A little harsh when talking to me, a Brit.

Eiden pulled out some prototype lenses, made up of layers of liquid crystal and a transparent ITO (indium tin oxide) conductive layer. This combination is still incredibly thin, and it was amazing to watch the layers switch almost instantly into a prescription lens. It seemed almost magical. As they’re so thin, they can be easily integrated into lenses with existing prescriptions. The lenses can also provide cylindrical correction for astigmatism.

Autofocus lenses could eliminate the need for multiple pairs of glasses, such as bifocals and progressives. Even if the glasses were to run out of power, they’d still function as a pair of traditional specs with your standard prescription, just lacking the near-sighted boost. IXI’s sensor sensitivity can also offer insight into other health conditions, detect dry eyes, estimate attentiveness and, by tracking where you’re looking, even monitor posture and neck movement. According to Eiden, blink rate changes with focus, daydreaming and anxiety, and all that generates data that can be shown in the companion app.

Mat Smith for Engadget

Hypothetically, the product could even adapt prescriptions dynamically, going beyond the simple vision correction of Gen 1. For example, it could offer stronger corrections as your eyes get fatigued through the day.

IXI appears to be putting the pieces in place to make these glasses a reality. It still needs to obtain the necessary medical certifications to sell its glasses and finalize production, though it has already partnered with Swiss lens-maker Optiswiss for manufacturing. Eiden says the final product will be positioned as a high-end luxury glasses option, selling through existing opticians. The company hopes to finally launch its first pair sometime next year.

This article originally appeared on Engadget at https://www.engadget.com/wearables/ixis-autofocusing-lenses-multifocal-glasses-ces-2026-212608427.html?src=rss 

Handwriting is my new favorite way to text with the Meta Ray-Ban Display glasses

When Meta first announced its display-enabled smart glasses last year, it teased a handwriting feature that allows users to send messages by tracing letters with their hands. Now, the company is starting to roll it out, with people enrolled in its early access program getting it first.

I got a chance to try the feature at CES and it made me want to start wearing my Meta Ray-Ban Display glasses more often. When I reviewed the glasses last year, I wrote about how one of my favorite things about the neural band is that it reduced my reliance on voice commands. I’ve always felt a bit self-conscious about speaking to my glasses in public. 

Up to now, replying to messages on the display glasses has still generally required voice dictation or generic preset replies. But handwriting means that you can finally send custom messages and replies somewhat discreetly. 

Sitting at a table wearing the Meta Ray-Ban Display glasses and neural band, I was able to quickly write a message just by drawing the letters on the table in front of me. It wasn’t perfect — it misread a capital “I” as an “H” — but it was surprisingly intuitive. I was able to quickly trace out a short sentence and even correct a typo (a swipe from left to right will let you add a space, while a swipe from right to left deletes the last character). 

Alongside handwriting, Meta also announced a new teleprompter feature. Copy and paste a bunch of text — it supports up to 16,000 characters (roughly a half-hour’s worth of speech) — and you can beam your text into the glasses’ display. 

If you’ve ever used a teleprompter, Meta’s version works a bit differently in that the text doesn’t automatically scroll while you speak. Instead, the text is displayed on individual cards you manually swipe through. The company told me it originally tested a scrolling version, but that in early tests, people said they preferred to be in control of when the words appeared in front of them. 

Teleprompter is starting to roll out now, though Meta says it could take some time before everyone is able to access it. 

The updates are among the first major additions Meta has made to its display glasses since launching them late last year and a sign that, as with its other smart glasses, the company plans to keep them fresh with new features. Elsewhere at CES, the company announced some interesting new plans for the device’s neural band and that it was delaying a planned international rollout of the device.

This article originally appeared on Engadget at https://www.engadget.com/wearables/handwriting-is-my-new-favorite-way-to-text-with-the-meta-ray-ban-display-glasses-213744708.html?src=rss 

Who Is Jennette McCurdy’s ‘Half His Age’ Book Inspired by? What We Know

Jennette’s 2026 novel is fictional but offers a meta version of what she experienced as a teen in a relationship with an older man.

Kenan Thompson’s Wife: Everything to Know About His Ex Christina Evangeline and Their Split

From how they met to why they split, here’s everything to know about Kenan Thompson’s ex, Christina Evangeline, their marriage, family, and breakup.

Three months of Audible is only $3 right now

Have a hankering for some audiobooks? Audible is holding one heck of a sale right now, giving users three months of access for $3. That’s a dollar per month. This is something of a winter tradition for the Amazon-owned platform and the promotion ends on January 21.

An Audible subscription grants one audiobook per month to keep. This can be selected from a massive catalog of new releases and bestsellers. The collection here has just about everything.

However, it’s easy to plow through a single book well before the month is up. Users also get streaming access to thousands of curated titles. Think of it like Netflix for audiobooks. The catalog is limited, but it gets the job done in a pinch. Subscribers also get access to all Audible original content and receive discounts on purchasing audiobooks outright.

In other words, it’s a neat little service and well worth a buck. The regular price is $15 per month, so make sure to cancel at the end of those three months if you aren’t enjoying the platform.

Follow @EngadgetDeals on X for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/deals/three-months-of-audible-is-only-3-right-now-193859847.html?src=rss 

The US withdraws from dozens of international bodies, including climate-focused organizations

In a new executive order, President Donald Trump has declared that the United States will withdraw from 66 international organizations and bodies, including several focused on tackling climate change. Trump made his disregard for climate change clear when he withdrew the US from the Paris climate agreement for a second time in early 2025, but these new withdrawals further confirm the second Trump administration is against global collaboration in general.

Among the organizations targeted, the US will no longer participate in the UN Framework Convention on Climate Change, the Intergovernmental Panel on Climate Change and organizations focused on trade, conservation, reproductive rights and immigration, like the International Trade Centre, the International Union for the Conservation of Nature, the UN Population Fund and the Global Forum on Migration and Development. In the case of the United Nations-affiliated organizations, the US ending its participation also means withdrawing funding.

According to the White House, the organizations the US is leaving “promote radical climate policies, global governance and ideological programs that conflict with US sovereignty and economic strength.” Withdrawing is supposed to save taxpayers money, though the White House’s fact sheet on the executive order neglects to say how much will be saved or how that saved money will be spent now that it’s not supporting the United Nations.

“By withdrawing from the IPCC, UNFCCC, and the other vital international partnerships, the Trump administration is undoing decades of hard-won diplomacy, attempting to undermine climate science and sowing distrust around the world,” former Vice President Al Gore said in a statement responding to the executive order.

While losing financial backing likely doesn’t help anyone, the actual impact of the US’s withdrawals is a bit of an unknown, The Washington Post reports. For example, the US remains involved with the International Energy Agency, which works on global clean energy solutions. Also, many of the organizations the White House decided to exit were deliberative bodies, or ones that the US was only marginally involved in, according to a UN official The Post spoke to.

Directly pushing back against global organizations and regulation has been a consistent theme of the second Trump administration, particularly in regards to tech regulation. The US withdrew from trade talks with Canada in June 2025 over the country’s digital services tax, and just last month the US banned former EU commissioner Thierry Breton from entering the US for his role in the creation of the Digital Services Act.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/the-us-withdraws-from-dozens-of-international-bodies-including-climate-focused-organizations-195259578.html?src=rss 
