As Apple pursues AI, spare a thought for the poor HomePod

When the video kicking off Apple’s “Awe dropping” event began yesterday, I was pleased to see a HomePod in the opening shot. I hadn’t given any thought to Apple’s beleaguered smart home speaker in ages, so I was all set for Tim Cook and crew to deliver an actual surprise and reveal new stuff for HomePod. When the entire presentation then passed without another peep about the product, I was disappointed, but resigned.

HomePod has always been behind the curve. When smart speakers first came on the scene, the sector was quickly dominated by Amazon and Google, with Sonos as the boutique third-party option. Apple announced the HomePod in June 2017, almost three years after Amazon unveiled its original Echo speaker. 

The HomePod arrived too late and cost too much, and (in an echo of the company's current woes) Siri was too unhelpful for the smart speaker to claw back much market share from its rivals. The HomePod mini came out in late 2020, and the second-generation HomePod, released in February 2023, added support for the Thread smart home standard. Not exactly pushing the boundaries of smart speaker innovation.

And things got worse for the poor HomePod as voice-controlled AI assistants transitioned away from being centered around dedicated speakers. AI is now integrated ever more tightly into our smartphones and computers, with less need for a separate intermediary device. Which brings us to the ongoing issue of Apple's shaky foothold in the AI race.

Apple has been promising a big overhaul to Siri for a while, with the new and improved version currently not due until spring 2026. To show off that new AI assistant, Apple is reportedly working on an interactive smart home hub that is expected to have HomePod-like audio capabilities baked in, but won’t be ready for launch until at least 2027. What’s a company to do in the interim?

It makes sense for Apple to hold back on any big developments to its existing Siri-centric smart home speaker. I get it. But I would have been hyped to see a stripped-down HomePod that focused more on being a speaker than on being smart. I have a Sonos that I love, and part of me assumed that I would eventually upgrade to an Apple offering. The AirPods are such a great element in the Apple ecosystem, and I use mine almost daily. Having a powerful, high-quality home speaker that delivers the easy interconnectivity that’s such a big part of Apple’s value proposition might have helped bridge the time gap, keeping Apple in people’s minds as a player in home tech while the company works on its next, more innovative move. Take one step back to take three steps forward.

In practice, though, it feels like the window for the HomePod to become a star in Apple’s lineup has shrunk to almost nothing. As I now look at the recent trends in AI and home tech, I don’t see an obvious space for a smart speaker. That’s not limited to Apple; it’s also pretty telling that both Amazon and Google haven’t been giving much love to their smart speakers either. In fact, a majority of Engadget’s favorite smart speakers this year are from specialist Sonos as the big tech names have put their focus on AI instead of audio. We’ll still have smart speakers, but they’ll be folded into multi-purpose gadgets and pitched as general smart-home aids. 

To be clear, there’s been no indication that Apple will sunset the HomePod. But my personal prediction is that HomePod will stay on the sidelines for now and get pushed even further from the spotlight when the new smart home products are unveiled, receiving only occasional attention until Apple officially and quietly pulls the plug for good. And so will end the life of a star-crossed product that never got the chance to shine at its full potential. Here’s hoping I’m wrong.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/as-apple-pursues-ai-spare-a-thought-for-the-poor-homepod-223250670.html?src=rss 

The PS Plus Game Catalog additions include Persona 5 Tactica and WWE 2K25

On Wednesday, Sony rolled out its September Game Catalog additions for PS Plus subscribers. This month’s entries run the gamut from turn-based tactics to survival horror and pro wrestling.

Persona 5 Tactica (PS5/PS4) may be the most critically acclaimed title in the batch. The 2023 Persona spinoff takes the mainline games’ battles and shifts them into grid-based tactics. Think XCOM with Phantom Thieves.

The Invincible (PS5) is an adaptation of the 1963 sci-fi novel by Stanisław Lem. As its inspiration may suggest, this isn’t an action-heavy combat-fest. Instead, the narrative adventure invites you to explore the planet Regis III, searching for lost crew members. Your decisions will shape the story, so tread carefully.


If exploring lost worlds as an astrobiologist isn’t your thing, then maybe pile drivers and elbow drops are. (No judgment!) WWE 2K25 (PS5/PS4) is also on this month’s list, letting you step into the ring as a steroid-infused behemoth. You can take satisfaction in knowing the outcomes of your video game matches are less predetermined than the scripted bouts you see on TV.

Other games in this month’s entries include the action RPG title Fate/Samurai Remnant, the survival horror game Crow Country and the first-person survival sim Green Hell. You can check out the PlayStation Blog’s announcement for all the details.

This article originally appeared on Engadget at https://www.engadget.com/gaming/playstation/the-ps-plus-game-catalog-additions-include-persona-5-tactica-and-wwe-2k25-211006881.html?src=rss 

Sony is rolling out a PlayStation parental controls mobile app

Sony is finally catching up to something Nintendo and Microsoft have had for years. The new PlayStation Family app mainly serves as a mobile extension of on-console parental controls. However, parents also get a few extra perks in the mobile version.

The app includes a “thoughtfully guided” onboarding process. (I imagine many people will prefer their phone or tablet over the console for that.) Once things are set up, parents can do everything they already could on the console. This includes setting playtime limits, viewing activity reports (daily and weekly), managing spending and creating content filters. Parents can also use the app to configure privacy settings for social features.

One of the mobile app’s nicer perks is real-time notifications of what the child is playing. Parents can also approve or deny requests from their children for extra playtime or access to restricted games from within the app. That feature will likely get a lot of use.

Although it’s a welcome rollout, Sony is quite late to the party. The Xbox Family Settings app launched over five years ago. Nintendo’s parental controls came even earlier, alongside the original Switch’s arrival.

The app is rolling out globally starting today. If you don’t see it yet, you can try the storefront links for iOS or Android.

This article originally appeared on Engadget at https://www.engadget.com/gaming/playstation/sony-is-rolling-out-a-playstation-parental-controls-mobile-app-195002596.html?src=rss 

Meta tests letting anyone rate Community Notes

As part of a new test, Meta will let anyone rate a Community Note or request one for a post, Meta’s Chief Information Security Officer Guy Rosen shared on X. After testing the feature in March, the company formally introduced Community Notes as a replacement for its fact-checking program in April of this year.

You have to apply to actually write Community Notes, but Meta’s new test means that anyone who sees one can rate it to signal whether it’s helpful or not. They’ll also be able to request a note if a post is incorrect or needs additional context. Based on the screenshot Rosen shared, Meta’s rating system is a simple thumbs up or down, but the fact that the company is opening the system up to more input at all is one sign of its continued expansion.

We’re testing new Community Notes features at Meta:
– Anyone can now request a note or rate if a note is helpful
– Users get notified when posts they’ve interacted with receive a Community Note
– 70,000+ contributors have written 15,000+ notes (6% published).
Learn more or join:… pic.twitter.com/WCQC3CMnbe

— Guy Rosen (@guyro) September 10, 2025

The test also includes a new system for notifying users if they interact with a post that receives a Community Note. Meta did something similar with posts that were fact-checked in the past, so this seems like a good way to let people know if they’ve read something misleading. Don’t expect to be receiving those notifications too often just yet, though. Rosen says that while there are over 70,000 people writing Community Notes and over 15,000 notes have actually been written, only six percent have been published. Meta is still very early in this whole process.

Community Notes are just one component of a larger right-wing turn Meta has taken in the wake of Trump’s reelection. While the system has been styled as pro-free speech, it doesn’t necessarily offer the same ability to counter misinformation that fact-checking does. For example, multiple reports found that X’s Community Notes program did little to address the platform’s misinformation problem.

This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-tests-letting-anyone-rate-community-notes-201208279.html?src=rss 

Apple is slowly morphing AirPods into an always-on wearable

The AirPods Pro 3 Apple introduced at the iPhone 17 event yesterday have better active noise cancellation and foam-filled ear tips, but their most important new feature is a subtle one: Apple came up with even more reasons for you to never take them out.

Wearing headphones while you’re talking to someone or interacting in public was at one point a social faux pas, but the ubiquity of AirPods and new features Apple has added have started to change that. The AirPods Pro’s Conversation Awareness feature, which can automatically duck audio while you’re talking to someone, is the simplest expression of this idea, but the vast majority of the improvements the company has made to its wireless earbuds have also created reasons to keep them in.

Take the hearing health features Apple debuted in 2024. Not only do they let your AirPods Pro act as a tool for checking your ear health, they can also act as a hearing aid and even hearing protection in a loud environment. With the AirPods Pro 3, you can add heart rate monitoring and live translation to the growing list of reasons to constantly wear AirPods. The Pro 3’s new heart rate sensor means you can use them to track some workouts and display your health metrics on your TV during an Apple Fitness+ class, a feature usually exclusive to the Apple Watch. The Live Translation feature, meanwhile, lets your AirPods translate the world around you, and can even beam your translated voice into another pair of AirPods Pro 3. The fact that the feature will also be available on AirPods 4 and AirPods Pro 2 should make keeping your headphones in even more common.

It’s hard to say how useful these new AirPods Pro 3 features will be without trying them, but they do highlight how much Apple seems to view its headphones as more than just an add-on purchase to every iPhone. Not many people are going to buy the $249 AirPods Pro 3 as a replacement for the $249 Apple Watch SE 3, but the fact the headphones can fill in for the smartwatch could be attractive to some. More importantly for Apple, it could make it easier to convince someone to subscribe to Fitness+ or buy an Apple Watch if they like the company’s approach to tracking workouts.

Apple has reportedly investigated going further down the path of making the AirPods Pro even more of a standalone device. Bloomberg reported last year that the company has explored adding cameras to AirPods so they can be used for Apple Intelligence features and visually understand the world around you. Whether or not that ever happens, the more immediate explanation for all this feature-creep is that making AirPods an always-on wearable is good for the company’s bottom line. The relationship between the AirPods and the Apple Watch could become similar to the iPad and the Mac in time. New features get added, and functionality continues to overlap, but the devices are always distinct and useful enough that many people are compelled to buy both.

Maybe there’s a future where your AirPods feel as essential to daily life as a smartphone does, and we’re wearing them all the time. For now, though, Apple seems to have decided that tiptoeing toward that wearable future is a pretty good way to sell new wireless earbuds in the present, and maybe several of its other products in the process.

This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/apple-is-slowly-morphing-airpods-into-an-always-on-wearable-203511552.html?src=rss 

Lyft launches autonomous fleet with May Mobility in Atlanta

Lyft and May Mobility have teamed up to launch a fleet of autonomous vehicles in Atlanta. It’s a pilot program, so it’s currently only available to Lyft riders in Midtown Atlanta. The companies promise a “measured, safety-first approach” with this rollout.

The fleet consists of hybrid-electric Toyota Sienna Autono-MaaS vehicles equipped with May Mobility’s self-driving technology. Lyft and May Mobility announced this partnership last year, but Atlanta is the first city to get a fleet of self-driving vehicles.

Atlanta, we’re here! Our autonomous vehicle pilot program with @Lyft is now live in Midtown. Find us in the Lyft app! https://t.co/dUqF95q93r pic.twitter.com/TFfDg23D8Y

— May Mobility (@May_Mobility) September 10, 2025

The rides will be fully autonomous, but each vehicle will have a human operator on board in case something goes wrong. These standby operators are trained to take the wheel if needed. The companies haven’t announced a timeframe for when standby operators will no longer be required.

Customers will have access to temperature controls, which is nice. However, hailing one of these cars is something of a crapshoot. You have to be in the service area, use the app and hope for the best. Lyft and May Mobility say they will increase the number of available vehicles and expand service hours in the “months ahead.”

This is May Mobility’s second launch in Georgia, as it already operates a fleet of driverless vehicles in the Atlanta suburb of Peachtree Corners. Lyft’s primary rival Uber has also been making serious moves in this space. The company has entered into a partnership with Lucid to create a massive fleet of 20,000 autonomous vehicles. It also has plans to launch self-driving pilot programs around the globe.

This article originally appeared on Engadget at https://www.engadget.com/transportation/lyft-launches-autonomous-fleet-with-may-mobility-in-atlanta-184942285.html?src=rss 

A closer look at the AirPods Pro 3: ANC, Live Translation and heart-rate tracking

The AirPods Pro 3 are a big upgrade over the AirPods Pro 2. Even though Apple has continuously added new features to those earbuds over the last three years, it hasn’t changed the design or shape of the earbuds since the first model arrived in 2019. The AirPods Pro 3 finally tweak that design, though subtly enough that you might not notice until you remove the new ear tips; the most impactful upgrades are all on the inside. After my brief hands-on immediately following the iPhone 17 launch yesterday, I’ve since spent more time test driving all of the new features on the AirPods Pro 3, from the improved active noise cancellation (ANC) to Live Translation and heart-rate monitoring.

Stronger ANC through tech and ear tips

Apple says the ANC on the AirPods Pro 3 blocks twice as much noise as the AirPods Pro 2 and four times as much as the original AirPods Pro. A big part of this is due to the ultra-low noise microphones and computational audio on the earbuds, but the new foam-infused ear tips are also playing a vital role.

The latest ear tips are still silicone on the outside like the Pro 2’s, but they’re now injected with foam. This provides much better passive noise isolation to block out distractions, helping with high-frequency sounds like human voices. In fact, there were several times this week when an Apple representative had to tap me on the shoulder because, with the AirPods Pro 3 in, I couldn’t hear them speaking right next to me. However Apple managed it, the fact that the AirPods Pro 3 do a better job of silencing chatty co-workers is a welcome change. It’s impressive when you consider how much of the competition struggles to reduce the volume of human voices on their earbuds and headphones.

The AirPods Pro 3 are no slouch in general noise cancellation performance either. Against simulated airplane noise and recorded sounds of a bustling cafe, the earbuds did a good job of reducing both distractions. What’s more, the AirPods Pro 3 silenced the busy demo area outside of the keynote, providing a welcome respite for a few seconds during an otherwise stressful day.

Live Translation finally arrives


After Apple chatted up Live Translation in iOS 26 at WWDC, I was disappointed that those initial plans didn’t include AirPods. I should’ve known the announcement for the earbuds would come with the next iteration of the AirPods Pro. Like Google’s Pixel Buds, Apple’s take on the feature relies on a connected iPhone to do all of the heavy lifting, powered by the Translate app. However, you don’t need a prolonged interaction with a phone to turn on Live Translation. You can press and hold on both AirPods, ask Siri or set the shortcut for the Action Button to the task. As a reminder, Live Translation will be available on AirPods Pro 3, AirPods Pro 2 and AirPods 4 with ANC, because they all carry the H2 chip.

During a quick demo, Live Translation worked well, quickly converting the Spanish an Apple representative was speaking into English, which Siri then conveyed through the AirPods Pro 3. There’s a slight delay, which is expected, since the captured audio is processed on an iPhone and then translated into the second language. That might make for some awkward pauses, but I’ll have to wait for more real-world testing to know for sure. I did notice that text translations appeared in the app before they came through the earbuds, but again, that’s not really a surprise since the iPhone is the brains of the operation. Plus, you’ll want to use the phone as a horizontal display here, since the app provides a real-time transcription for the person you’re talking to.

One aspect of Live Translation that may go unnoticed until you actually use it on the AirPods is the role ANC plays in the process. After you activate the translation feature, active noise cancellation kicks in to reduce the speaker’s voice so that you can clearly hear the translation from Siri in the earbuds. This happens automatically, and during my demo I never felt like I needed to manually adjust the volume so I could better hear the translated English over the speaker’s Spanish.

Heart-rate tracking, but only for workouts


Another big addition to the AirPods Pro 3 is heart-rate monitoring. Apple first debuted this capability on the Beats Powerbeats Pro 2, using a photoplethysmography (PPG) sensor to measure light absorption in blood flow. Heart rate stats are visible only in the Fitness app during workouts though, so if you’re looking to keep tabs in other apps or widgets, you’re out of luck. But when it comes to activity tracking, the combination of accelerometers, a gyroscope, GPS and a new on-device AI model works with the PPG sensor to monitor stats for 50 different workouts.

This is another feature I’ll need to test at home before I can properly gauge its merits, especially since my testing here in Cupertino consisted only of a three-minute walk. Sure enough, my live heart rate was displayed on the workout screen alongside distance covered, average pace, calories burned and elapsed time. Once I completed that strenuous session, I could see my average heart rate in the Workout Details summary, just above a graph of the info.

Improved audio through more air flow

Apple loves to discuss air flow when it comes to audio performance in AirPods, and the company redesigned the venting system in the AirPods Pro 3 to improve sound quality. It also angled the ear tip so it beams audio more directly into the ear. Along with Adaptive EQ, this combination provides noticeably deeper bass and a wider soundstage for more immersive spatial listening.

To move all of that air around, Apple’s acoustics team devised a new set of fine-tuned chambers to maximize the overall flow. As a result, the vent system had to be larger, so the one on top of the earbuds is now nearly twice as large as the one on the AirPods Pro 2. Then, to properly harness all of that available air space, Apple slightly redesigned the driver/transducer to achieve the necessary frequencies. Adaptive EQ has also been expanded, since the inward-facing microphones on the AirPods Pro 3 have been moved so they’re not obstructed by the sides of the ear canal as much.


Maggie Rogers’ “Alaska” was the test track of choice during my demo, a song I’m familiar with since I’ve listened to the album Heard It in a Past Life a ton. Beyond the enhancements to bass and the spatial effect, the thing that struck me about the audio upgrades was the level of detail the AirPods Pro 3 now provide. The separation of the bass drum and hand pan enhance the immersion, but there’s also the texture in the sound of both that is typically lost on most earbuds and headphones. I listened to the AirPods Pro 2 on the flight out here to refresh my memory and it was immediately apparent that Apple has made some big upgrades to sound quality on this new model.

The AirPods Pro 3 are available for preorder now for $249. They arrive September 19 alongside the iPhone 17 family and new Apple Watches.

This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/a-closer-look-at-the-airpods-pro-3-anc-live-translation-and-heart-rate-tracking-193956229.html?src=rss 

Grammarly’s AI writing assistance tools now work in five new languages

Since its debut in 2009, Grammarly has only been available in one language: English. Sure, you could switch between dialects, including Canadian and Indian English, but if you wrote in any other language, you were out of luck. That’s changing today with Grammarly rolling out beta support for five additional languages: French, German, Italian, Portuguese and Spanish.

The update is available to all Grammarly customers, whether you live in a country that speaks the language you want to write in or not, with support for the platform’s signature features included. As you write in any of the new languages, you’ll see Grammarly highlight spelling and grammatical errors, as well as offer suggestions for how you might rewrite certain paragraphs to refine their tone, style and flow. Additionally, with any of the six primary languages Grammarly now supports, the app offers in-line translation, with the ability to convert your text into 19 different languages.

“The new features are Grammarly’s first step toward more comprehensive multilingual writing assistance,” said Grammarly. “In the coming year, the company plans to launch more advanced clarity suggestions in the supported languages, similar to what it offers in English.”

If you want to start writing in French, German, Italian, Portuguese or Spanish, you don’t need to tweak any settings in Grammarly. Provided you’re using the Windows or Mac app or Chrome extension, you can simply start writing in one of the new languages. In addition to being available to Pro, Enterprise and Education customers, free users can also take advantage of the expanded support — though with some limitations.

This article originally appeared on Engadget at https://www.engadget.com/grammarlys-ai-writing-assistance-tools-now-work-in-five-new-languages-180432231.html?src=rss 

Reddit, Yahoo, Medium and more are adopting a new licensing standard to get compensated for AI scraping

With web publishers in crisis, a new open standard lets them set the ground rules for AI scrapers. (Or, at least it will try.) The new Really Simple Licensing (RSL) standard creates terms that participants expect AI companies to abide by. Although enforcement is an open question, it can’t hurt that some heavy hitters back it. Among others, the list includes Reddit, Yahoo (Engadget’s parent company), Medium and People Inc.

RSL adds licensing terms to the robots.txt protocol, the simple file that provides instructions for web crawlers. Supported licensing options include free, attribution, subscription, pay-per-crawl and pay-per-inference. (The latter means AI companies only pay publishers when the content is used to generate a response.)
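Concretely, robots.txt today only tells crawlers where they may go; RSL layers licensing terms on top of those crawl rules. As a rough, hypothetical sketch of the idea (the directive name and URLs below are illustrative assumptions, not the published RSL syntax), a publisher’s file might pair its usual rules with a pointer to a machine-readable license:

```
# Standard Robots Exclusion Protocol rules
User-agent: *
Disallow: /drafts/

# Hypothetical RSL-style addition: point AI crawlers at a
# machine-readable license spelling out the terms above
# (free, attribution, subscription, pay-per-crawl or pay-per-inference)
License: https://example.com/license.xml
```

The licensing terms themselves would live in the referenced license file, which is what makes options like pay-per-inference expressible at all; robots.txt alone has no vocabulary for payment.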

Launching alongside the standard is a new managing nonprofit, the RSL Collective. It views itself as an equivalent of nonprofits like ASCAP and BMI, which manage music industry royalties. The new group says its standard can “establish fair market prices and strengthen negotiation leverage for all publishers.”

Participating brands include plenty of internet old-schoolers. Reddit, People Inc., Yahoo, Internet Brands, Ziff Davis, wikiHow, O’Reilly Media, Medium, The Daily Beast, Miso.AI, Raptive, Ranker and Evolve Media are all on board. Former Ask.com CEO Doug Leeds and RSS co-creator Eckart Walther lead the group.

“The RSL Standard gives publishers and platforms a clear, scalable way to set licensing terms in the AI era,” Reddit CEO Steve Huffman wrote in a press release. “The RSL Collective offers a path to do it together. Reddit supports both as important steps toward protecting the open web and the communities that make it thrive.” (It’s worth noting that Reddit has licensing deals with OpenAI and Google.)

It’s unclear whether AI companies will honor the standard. After all, they’ve been known to simply ignore robots.txt instructions. But the group believes its terms will be legally enforceable.

In an interview with Ars Technica, Leeds pointed to Anthropic’s recent $1.5 billion settlement, suggesting “there’s real money at stake” for AI companies that don’t train “legitimately.” (However, that settlement is up in the air after a judge rejected it.) Leeds told The Verge that the standard’s collective nature could also help spread legal costs, making challenges to violations more feasible.

As for technical enforcement, the RSL standard can’t block bots on its own. For that, the group is partnering with the cloud company Fastly, which can act as a sort of gatekeeper. (Perhaps Cloudflare, which recently launched a pay-per-crawl system, could eventually play a part, too.) Leeds said Fastly could serve as “the bouncer at the door to the club.”

Leeds suggested to Ars that there are incentives for AI companies, too. Financially, it could be simpler for them than inking individual licensing deals. It could also address a quirk of AI-generated content: pulling from multiple sources for an answer to avoid taking too much from any single one. If content is legally licensed, the AI app can simply use the best source, which provides the user with a higher-quality answer and minimizes the risk of hallucinations.

He also referenced complaints from AI companies that there’s no effective means of licensing web-wide content. “We have listened to them, and what we’ve heard them say is… we need a new protocol,” Leeds told Ars Technica. With the RSL standard, he said, AI firms get a “scalable way to get all the content” they want, with an incentive structure under which they only pay for the content their models actually reference: “If they’re using it, they pay for it, and if they’re not using it, they don’t pay for it.”

This article originally appeared on Engadget at https://www.engadget.com/ai/reddit-yahoo-medium-and-more-are-adopting-a-new-licensing-standard-to-get-compensated-for-ai-scraping-180946671.html?src=rss 

‘No Tax on Tips’ apparently also applies to your favorite streamer

Streamers, YouTubers and other content creators are eligible for the new “No Tax on Tips” policy in the One Big Beautiful Bill Act President Donald Trump signed into law on July 4, 2025. “Digital Content Creators” are included in a preliminary list of occupations eligible for the new tax deduction on tips that the US Treasury Department released last week. That means a podcaster could receive the same tax relief as a waiter or bartender.

Under that guidance, the “Bits” received during a Twitch stream or the “Super Thanks” a YouTuber receives for a great upload could go untaxed when next year’s tax season rolls around. As The Hollywood Reporter notes, though, there are limits to how much of that tipped income will be deducted — up to $25,000 per year and it’s phased out for single filers who make more than $150,000 per year — and language that suggests not every tipping scenario content creators face might apply.

According to the Treasury, tips won’t qualify for the deduction “if they are received in the course of certain specified trades or businesses,” which includes “the fields of health, performing arts, and athletics.” Does that mean this is a much narrower carve-out for content creators than it appears? Possibly, but these classifications will need to be finalized before anyone can say for sure. Ultimately, content creators have multiple possible sources of income: direct subscriptions, ad revenue, paid partnerships, direct sales and digital tips. How much a new tax deduction changes their calculus will vary.

Making tips tax-deductible was one of several campaign promises Trump made leading up to his reelection in November 2024. The idea was eventually folded into the One Big Beautiful Bill, which is perhaps better known for the catastrophic cuts it made to social welfare and clean energy spending. As it turns out, the bill might reshape the creator economy, too.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/youtube/no-tax-on-tips-apparently-also-applies-to-your-favorite-streamer-182932748.html?src=rss 
