ESA releases stunning first images from Euclid, its ‘dark universe detective’

The European Space Agency (ESA) has released the first images from its Euclid space telescope — a spacecraft peering 10 billion years into the past to create the largest 3D map of the universe yet. From the distinctive Horsehead Nebula (pictured above) to a “hidden” spiral galaxy that looks much like the Milky Way, Euclid is giving us the clearest look yet at both known and previously unseen objects speckling enormous swathes of the sky.

Euclid is investigating the “dark” universe, searching for signs of how dark energy and dark matter have influenced the evolution of the cosmos. It’ll observe one-third of the sky over the next six years, studying billions of galaxies with its 4-foot-wide telescope, visible-wavelength camera and near-infrared camera/spectrometer. Euclid launched in July 2023, and while its official science mission doesn’t start until early 2024, it’s already blowing scientists away with its early observations.

ESA

Euclid’s observation of the Perseus Cluster (above), which sits 240 million light-years away, is the most detailed ever, showing not just the 1,000 galaxies in the cluster itself, but roughly 100,000 others that lie farther away, according to ESA. The space telescope also caught a look at a Milky-Way-like spiral galaxy dubbed IC 342 (below), or the “Hidden Galaxy,” nicknamed as such because it lies behind our own and is normally hard to see clearly.

ESA

Euclid is able to observe huge portions of the sky, and it’s the only telescope in operation able to image certain objects like globular clusters in their entirety in just one shot, according to ESA. Globular clusters like NGC 6397, pictured below, contain hundreds of thousands of gravity-bound stars. Euclid’s observation of the cluster is unmatched in its level of detail, ESA says.

The spacecraft is able to see objects that have been too faint for others to observe. Its detailed observation of the well-known Horsehead Nebula, a stellar nursery in the Orion constellation, for example, could reveal young stars and planets that have previously gone undetected.

ESA

Euclid also observed the dwarf galaxy NGC 6822 (pictured above), which sits just 1.6 million light-years away. This small, ancient galaxy could hold clues on how galaxies like our own came to be. It’s only the beginning for Euclid, but it’s already helping to unlock more information on the objects in our surrounding universe, both near and far.

“We have never seen astronomical images like this before, containing so much detail,” said René Laureijs, ESA’s Euclid Project Scientist, of the first batch of images. “They are even more beautiful and sharp than we could have hoped for, showing us many previously unseen features in well-known areas of the nearby universe.”

This article originally appeared on Engadget at https://www.engadget.com/esa-releases-stunning-first-images-from-euclid-its-dark-universe-detective-203948971.html?src=rss 

Spotify subscribers in the US now get 15 hours of audiobooks every month

In addition to music and podcasts, Spotify has recently been working to cement its presence in the audiobook space. Today, the company announced Premium users in the US will be able to stream 15 hours of free audiobook content monthly as a part of their subscription. This offering was previously only available to Premium users in the UK and Australia.

The company says there’s no need for users to do anything. Audiobooks that are available to stream will be marked as “Included in Premium” and users can hit play right away. Spotify notes that 15 hours is roughly two average audiobooks per month. If you end up hitting the limit, you can purchase a 10-hour top-up.
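The allowance described above amounts to simple metering. A minimal sketch of that math in Python — the function and names are illustrative, not Spotify’s actual implementation:

```python
# Toy model of the audiobook allowance the article describes:
# 15 free streaming hours per month, with optional 10-hour top-ups.
# Function and constant names are hypothetical, not Spotify's API.

MONTHLY_ALLOWANCE_HOURS = 15
TOP_UP_HOURS = 10

def hours_available(listened_hours: float, top_ups_bought: int = 0) -> float:
    """Return remaining streaming hours for the month (never negative)."""
    total = MONTHLY_ALLOWANCE_HOURS + top_ups_bought * TOP_UP_HOURS
    return max(0.0, total - listened_hours)

# Spotify pegs an "average" audiobook at roughly 7.5 hours (15 hours ≈ two books).
print(hours_available(7.5))      # one average book in: 7.5 hours left
print(hours_available(16.0))     # past the base limit: 0.0 without a top-up
print(hours_available(16.0, 1))  # with one 10-hour top-up: 9.0 hours left
```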

The company says its Spotify Premium audiobook catalog now has something for everyone. Users with a Premium subscription can access over 70 percent of today’s bestsellers, including Britney Spears’ The Woman in Me and Jesmyn Ward’s Let Us Descend. There are also many classic pieces of literature, like Emily Brontë’s Wuthering Heights. Spotify believes its listeners will “love exploring the depths of our 200,000-strong catalog, unearthing genres from ‘cozy mystery’ to ‘historical romance.'”

Books that aren’t eligible for free streaming will need to be purchased outright; those titles show a lock on the play button. To make a purchase, you’ll follow a link to your browser, and once that’s completed, you’ll be taken back to the app to listen to your new book. All your purchased titles will show up in your library and be available for offline listening. Spotify also lets you control playback speed so you can listen at your own pace.

It makes sense that Spotify has included audiobooks in its app, but there are a few things that may deter users from tapping in. Yes, having a single place to listen to your music, podcasts and books is convenient, but unlike with music and podcasts, you have a streaming limit here. Additionally, only a limited number of books are free to stream with your $11 subscription. While Audible also charges a subscription fee, its users get one book to own every month, which may make it the more appealing and affordable option for some.

This article originally appeared on Engadget at https://www.engadget.com/spotify-subscribers-in-the-us-now-get-15-hours-of-audiobooks-every-month-192000398.html?src=rss 

Airbnb will soon let you open smart locks in its app

Winter is almost upon us and Airbnb has announced a new feature that could help folks avoid fumbling for keys while wearing a bunch of layers. Starting in the US and Canada later this year, Airbnb hosts who are in the invite-only Early Access program will be able to link compatible smart locks to their Airbnb account and generate a unique code for each reservation. Guests will then be able to input the code in the Airbnb app to open the lock. At the outset, Airbnb will support some models from Schlage, August and Yale.

That could make some Airbnb pain points much easier to deal with. Hosts won’t have to worry about bad actors sharing entry codes with other people after they check out, and guests should find it more straightforward to find and enter their code. They also won’t have to download a separate app if they’re staying at a place that uses a compatible smart lock.

Airbnb is making a string of other changes as part of its winter update. You’ll be able to access a collection of the 2 million most-loved homes on the platform. These Guest Favorites all have an average rating of above 4.9, with high marks for things like value, the check-in process, cleanliness, listing accuracy, host communication and location. Hosts of Guest Favorites all have strong track records of reliability, and almost two-thirds of the listings are from Superhosts.

You’ll soon start seeing a badge denoting a listing as a Guest Favorite on the listing page and in search results. There’ll also be an option to filter results by Guest Favorites.

Elsewhere, you’ll be able to sort reviews by recency or rating, while a new chart should make the distribution of reviews on the five-star scale easier to grok. When you leave a review, you’ll be able to include more details that may be useful for context, such as where you’re from, how long you stayed and whether you traveled with family, another group or pets. Airbnb is starting to roll out the reviews and Guest Favorites updates this week.

Since last year, Airbnb has been making its pricing more transparent. To that end, service fees will now be included in the prices that hosts set. According to Airbnb, that will give hosts a better idea of how much guests are paying overall. It should be easier for hosts to compare their prices to similar listings through the calendar too.

Hosts will have access to other new listing tools, such as an AI-powered photo tour. Airbnb says its AI engine can recognize photos and assign them to up to 19 rooms to help guests better understand the layouts of properties. Hosts will be able to edit the photo tour whenever they like and pinpoint amenities in each room.

This article originally appeared on Engadget at https://www.engadget.com/airbnb-will-soon-let-you-open-smart-locks-in-its-app-192753343.html?src=rss 

Lego’s 5,200-piece Avengers Tower set ships with 31 minifigures, including Kevin Feige

Lego just unveiled another set based on the Marvel Cinematic Universe, and boy is it a doozy. The massive 5,200-piece Avengers Tower set measures nearly three feet tall and ships with 31 minifigures, including Marvel Studios head honcho Kevin Feige. It also includes several dioramas that let you create many of the important scenes that took place in Avengers Tower, from the Chitauri battle of the original film to the party scene from Age of Ultron and beyond.

The set releases on November 24 and will cost an eye-watering $500. Still, this is the 17th-largest set the company has ever made and the one with the most minifigures. Beyond Feige, other figures include Captain America, Thor, Loki, some Ultron drones and just about every other major character that appeared in Avengers Tower throughout the films. There’s even an appropriately scaled Hulk.

In addition to the tower itself, which actually opens to allow for interior sequences, the set ships with a Quinjet and a Chitauri invasion ship. You also get plenty of accessories to help pose the minifigures in a variety of action-packed scenarios. About the only thing missing is the shawarma shop down the street.

As previously mentioned, this isn’t Lego’s first MCU-adjacent set. The company has released a giant Hulkbuster suit from Age of Ultron, a battle scene based on Black Panther: Wakanda Forever and Iron Man’s armory, among others. It has also shipped some sets based on other Marvel properties, like a Miles Morales figure and a Daily Bugle collection. Beyond superheroes, Lego launched a nifty Pac-Man arcade console set this year and one based on the Xbox 360.

This article originally appeared on Engadget at https://www.engadget.com/legos-5200-piece-avengers-tower-set-ships-with-31-minifigures-including-kevin-feige-193359347.html?src=rss 

Google’s AI-empowered search feature goes global with expansion to 120 countries

Google’s Search Generative Experience (SGE), which currently provides generative AI summaries at the top of the search results page for select users, is about to be much more available. Just six months after its debut at I/O 2023, the company announced Wednesday that SGE is expanding to Search Labs users in 120 countries and territories, gaining support for four additional languages and receiving a handful of helpful new features.

Unlike its frenetic rollout of the Bard chatbot in March, Google has taken a slightly more measured approach in distributing its AI search assistant. The company began with English-language searches in the US in May, expanded to English-language users in India and Japan in August and on to teen users in September. As of Wednesday, users from Brazil to Bhutan can give the feature a try. SGE now supports Spanish, Portuguese, Korean and Indonesian, in addition to the existing English, Hindi and Japanese, so you’ll be able to search and converse with the assistant in natural language, whichever form it might take. These features arrive on Chrome desktop Wednesday, with the Search Labs for Android app versions slowly rolling out over the coming week.

Among SGE’s new features is an improved follow-up function where users can ask additional questions of the assistant directly on the search results page. Like a mini-Bard window tucked into the generated summary, the new feature enables users to drill down on a subject without leaving the results page or even needing to type their queries out. Google will reportedly restrict ads to specific, denoted areas of the page so as to avoid confusion between them and the generated content. Users can expect follow-ups to start showing up in the coming weeks. They’re only for English-language users in the US to start but will likely expand as Google continues to iterate the technology.

SGE will also start helping to clarify ambiguous terms in translations. For example, if you’re trying to translate “Is there a tie?” into Spanish, the output, the gender and the speaker’s intention all change depending on whether you mean a tie as in a draw between two competitors (“un empate”) or the tie you wear around your neck (“una corbata”). The new feature will automatically recognize such words and highlight them; clicking one pops up a window asking you to pick between the two versions. This should be especially helpful with languages that assign grammatical gender to nouns, where, say, cars are masculine but bicycles are feminine, and you need to specify which version you intend. Spanish is one of those languages, and this capability is coming first to US users for English-to-Spanish translations.
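The behavior described above boils down to flagging words that have multiple sense-dependent translations and letting the user choose. A minimal, table-driven sketch of that user-facing logic — the word list and sense mappings here are illustrative, not Google’s actual system, which presumably uses a model rather than a lookup table:

```python
# Toy sense disambiguation for English-to-Spanish translation.
# Maps an ambiguous English word to its possible senses and translations.

AMBIGUOUS_SENSES = {
    "tie": {
        "a draw between competitors": "un empate",
        "neckwear": "una corbata",
    },
}

def translation_choices(word: str):
    """Return the sense-to-translation options to surface for an ambiguous
    word, or None if the word needs no disambiguation prompt."""
    return AMBIGUOUS_SENSES.get(word.lower())

# A UI would highlight "tie" and pop up these options for the user to pick:
choices = translation_choices("tie")
for sense, spanish in choices.items():
    print(f"{sense}: {spanish}")
```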

Finally, Google plans to expand its interactive definitions normally found in the generated summaries for educational topics like science, history or economics to coding and health related searches as well. This update should arrive within the next month, again, first for English language users in the US before spreading to more territories in the coming months. 

This article originally appeared on Engadget at https://www.engadget.com/googles-ai-empowered-search-feature-goes-global-with-expansion-to-120-countries-180028084.html?src=rss 

GM recalls nearly 1,000 Cruise robotaxis after pedestrian collision

Cruise, the autonomous vehicle company owned by General Motors, has issued a recall for 950 of its robotaxis following a collision with a pedestrian in San Francisco last month, as originally reported by NBC. This move comes after California revoked the company’s driverless permits, requiring a human on board at all times.

The collision that started all of this occurred on October 2 when a pedestrian was thrown into the path of a Cruise robotaxi after being hit by a human driver. The robotaxi tried to brake aggressively and pull over to the side of the road, but ended up dragging the pedestrian 20 feet before finally stopping.

This triggered a federal probe and several independent investigations into the company, which dredged up some unsavory data. For instance, reports indicate that Cruise’s algorithm had real trouble identifying children. The data also suggests that Cruise knew about this incredibly dangerous blind spot but still kept its vehicles on the streets.

Internal safety documents acquired by The Intercept state that “Cruise AVs may not exercise additional care around children,” and that the robotaxis may “need the ability to distinguish children from adults so we can display additional caution around.” The company responded by touting its safety features, writing in a statement that it has “the lowest risk tolerance for contact with children.”

All of that’s moot now, as Cruise’s robotaxis are being recalled. GM and Cruise have not issued statements as to when and if the cars would return to the streets. GM did announce, however, that it has already lost $1.9 billion on the venture through September of this year, as reported by CNBC.

Rival companies like Google-owned Waymo are still operating driverless vehicles in California and beyond. As a matter of fact, the company just doubled the service area for its robotaxis in San Francisco and Phoenix.

This article originally appeared on Engadget at https://www.engadget.com/gm-recalls-nearly-1000-cruise-robotaxis-after-pedestrian-collision-183049933.html?src=rss 

Google workers publish letter criticizing company’s Israel-Palestine ‘double standard’

A group of Google employees has published an open letter on Medium calling out an alleged double standard in the company related to freedom of expression surrounding the Israel-Palestine war. The essay condemns “hate, abuse and retaliation” within the company against Muslim, Arab and Palestinian workers. The employees who penned the letter, which doesn’t include specific names out of fear of retaliation, demand that CEO Sundar Pichai, Google Cloud CEO Thomas Kurian and other senior leaders publicly condemn “the ongoing genocide in the strongest possible terms.” In addition, they urge the company to cancel Project Nimbus, a $1.2 billion deal to supply AI and other advanced tech to the Israeli military.

“We are Muslim, Palestinian, and Arab Google employees joined by anti-Zionist Jewish colleagues,” the letter opens. “We cannot remain silent in the face of the hate, abuse, and retaliation that we are being subjected to in the workplace in this moment.”

The letter cites specific examples of emotionally charged and inappropriate workplace behavior. These include unnamed Googlers accusing Palestinians of supporting terrorism, committing “slander against the Prophet Muhammad,” and publicly calling Palestinians “animals” on official Google work platforms. The group describes leadership as “standing idly by” in the latter two cases, and it says Google managers have called employees “sick” and “a lost cause” for expressing empathy toward Gaza residents.

The employees say Google managers have publicly asked Arab and Muslim people in the company if they support Hamas as a response to their concern for Palestinian families. “There are even coordinated efforts to stalk the public lives of workers sympathetic to Palestine and to report them both to Google and law enforcement for ‘supporting terrorism,’” the letter reads.

Google CEO Sundar Pichai

ASSOCIATED PRESS

Other examples cited include “heartfelt appeals” to donate to a charity for Gaza citizens being “met with multiple comments dehumanizing Gazans as being ‘animals,’ disregarding their plight and calling upon Googlers to boycott relief work for civilians due to the fact that Palestinian schools and hospitals were being used for ‘terrorism.’” The letter also accuses Google managers of using their rank to “question, report, and attempt to get fired Muslim, Arab, and Palestinian Googlers who express sympathy with the plight of the besieged Palestinian people.” It describes one manager endorsing “surveillance of Google employees on social media,” and then openly harassing them on Google work platforms.

“You have to be very, very, very careful, because any sort of criticism toward the Israeli state can be easily taken as antisemitism,” Sarmad Gilani, a Google software engineer who took part in the letter, said in an interview with The New York Times. “It feels like I have to condemn Hamas 10 times before saying one tiny, tiny thing criticizing Israel.”

Engadget contacted Google for a comment but didn’t immediately receive a response. We will update this article if we hear back.

The tensions inflamed in the last month by the Israel-Palestine war have resurfaced resentments about Google’s involvement in Project Nimbus. In 2021, Google and Amazon workers penned a similar open letter calling on their companies to pull out of the deal, which they said would enable surveillance of and unlawful data collection on Palestinians. Today’s letter echoes that sentiment. “We demand that Google stop providing material support to this genocide by canceling its Project Nimbus contract and immediately cease doing business with the Israeli apartheid government and military,” it reads.

This article originally appeared on Engadget at https://www.engadget.com/google-workers-publish-letter-criticizing-companys-israel-palestine-double-standard-181516404.html?src=rss 

The director of Sundance darling ‘We Met in Virtual Reality’ launches a VR studio

We Met in Virtual Reality, a documentary shot entirely inside VRChat (now available to stream on Max), was one of the highlights of last year’s Sundance Film Festival. It deftly showed how people can form genuine friendships and romantic connections inside of virtual worlds — something Mark Zuckerberg could only dream of with his failed metaverse concept. Now the director of that film, Joe Hunting, is making an even bigger bet on virtual reality: He’s launching Painted Clouds, a production studio devoted to making films and series set within VR.

What’s most striking about We Met in Virtual Reality, aside from the furries and scantily clad anime avatars, is that it looks like a traditional documentary. Hunting used VRCLens, a tool developed by Hirabiki, to perform cinematic techniques like pulling focus, executing deliberate camera movements and capturing aerial drone shots. Hunting says he aims to “build upon VRCLens to give it more scope and make it even more accessible to new filmmakers,” as well as use it for his own productions.

Additionally, Hunting is launching “Painted Clouds Park,” a world in VRChat that can be used for production settings and events. It’s there that he also plans to run workshops and media events to teach people about the possibilities of virtual reality filmmaking.

His next project, which is set to begin pre-production next year, will be a dramedy focused on a group of online friends exploring an ongoing mystery. Notably, Hunting says it will also be shot with original avatars and production environments, not just cookie-cutter VRChat worlds. His aim is to make it look like a typical animated film — the only difference is that it’ll be shot inside of VR. It’s practically an evolution of the machinima concept, which involved shooting footage inside of game engines, using existing assets.

“Being present in a headset and being in the scene yourself, holding the camera and capturing the output, I find creates a much more immersive filmmaking experience for me, and a much more playful and joyful one, too,” Hunting said. “I can look up and everyone is their characters. They’re not wearing mo-cap [suits] to represent the characters. They just are embodying them. Obviously, that experience doesn’t translate completely on screen as an audience member. But in terms of directing and the kind of relationship I can build with my actors and the team around me, I find that so fun.”

Throughout all of his work, including We Met in Virtual Reality and earlier shorts, Hunting has been focused on capturing virtual worlds for playback on traditional 2D screens. But looking forward, he says he’s interested in exploring 360-degree immersive VR projects as well, perhaps as behind-the-scenes footage for his next VR film or as part of an experimental project in the future. In addition to his dramedy project, Hunting is also working on a short VR documentary, as well as a music video.

This article originally appeared on Engadget at https://www.engadget.com/the-director-of-sundance-darling-we-met-in-virtual-reality-launches-a-vr-studio-164532412.html?src=rss 

Walmart Black Friday deals 2023: Save $50 on the Apple Watch Series 9, plus up to 70 percent on AirPods, Roku devices and more

Walmart has already kicked off its Black Friday sale. That’s good news for anyone who wants to spend the day after Thanksgiving doing something other than shopping. The early Black Friday deals went live online today — Walmart+ members get a few hours of early access before everyone else — and will hit physical stores this Friday.

We’ve unearthed the best tech deals from this portion of Walmart’s Black Friday sale and gathered them here so you can start ticking gifts off your holiday list — or grab something for yourself at a discount. The highlights are probably the brand new Apple Watch Series 9 for $349 and the second-generation AirPods for $69 — both are all-time lows. 

Apple Watch Series 9

If you’ve been waiting to get Apple’s brand new flagship smartwatch, your patience has paid off. The Apple Watch Series 9 is down to $349 for the 41mm, GPS-only model. That’s a $50 discount and the lowest price yet for the barely-two-month-old wearable. If you’d prefer a little more room on the screen, you can go for the 45mm case size for $379, also $50 off and a new low price. The biggest improvement this time around is the S9 processor. It allows for a new Double Tap feature and onboard Siri processing for faster responses to your queries. It’s also got a brighter screen and, when paired with the Sport Loop, is carbon neutral. We gave the wearable a solid score of 92 in our review, praising the new features and the comprehensive fitness and health tracking.

Apple AirPods (second gen)

The second-generation AirPods are now on sale for $69, which is a $90 discount and their lowest price ever, thanks to Walmart’s sale. They are a little older at this point; we gave them a review score of 84 when they came out back in 2019. But they’re a good pick for someone who needs a knock-around pair of buds that pair seamlessly with an iPhone. Some people even prefer the smooth fit of the older model, which is more like Apple’s wired EarPods. To be sure, both the third-generation AirPods and the new, second-generation AirPods Pro have seen significant improvements in both sound quality and features like noise cancellation, Spatial Audio and Transparency mode. The newer buds may make more sense for audiophiles, but at this price, the second-generation AirPods will make a nice stocking stuffer for an iPhone user. 

Canon EOS R50 

We tested out the Canon EOS R50 mirrorless camera when it came out earlier this year and gave it an 87. During the early Black Friday sale, it’s $180 off, which is a new all-time low. We think this is a great camera for street photographers or travelers because it has a slim and light design and is capable of shooting 4K video. It offers fast shooting, has a reliable auto-focus and takes great images — particularly for the price point. And now that it’s on sale, you’ll be hard pressed to find a higher quality hybrid camera that can do pretty much everything vloggers and photographers need. 

Sony WF-C500 earbuds 

Sony’s WF-C500 earbuds are 70 percent off right now, which makes them $29 instead of $100. That’s a pretty great price, though the buds are about two years old at this point. Still, they’re our favorite budget pair of wireless headphones for working out. They were one of the lightest pairs of earbuds we tested, and though they have a more bulbous design, they were still comfortable. While they don’t have active noise cancellation, the shape does a good job of passively blocking out most sounds (though traffic noises still get through, which is important for outdoor workouts).

Roku Ultra LT

The Roku Ultra LT is 57 percent off thanks to the Walmart Black Friday sale. We’re fans of Roku streaming devices and recommend the Ultra set-top box in our guide. The Ultra LT is a Walmart-exclusive version with a lower starting price and a few tradeoffs compared to the regular Ultra: its remote lacks the personalization buttons, there’s no Bluetooth connectivity and it drops the USB port on the back of the box. But you still get one of the best smart TV interfaces, plus support for 4K video and Dolby Vision. And at this price, it’s $65 cheaper than the other version.


This article originally appeared on Engadget at https://www.engadget.com/walmart-black-friday-deals-2023-save-50-on-the-apple-watch-series-9-plus-up-to-70-percent-on-airpods-roku-devices-and-more-170010855.html?src=rss 

NVIDIA’s Eos supercomputer just broke its own AI training benchmark record

Depending on the hardware you’re using, training a large language model of any significant size can take weeks, months, even years to complete. That’s no way to do business — nobody has the electricity and time to be waiting that long. On Wednesday, NVIDIA unveiled the newest iteration of its Eos supercomputer, one powered by more than 10,000 H100 Tensor Core GPUs and capable of training a 175 billion-parameter GPT-3 model on 1 billion tokens in under four minutes. That’s three times faster than the previous benchmark on the MLPerf AI industry standard, which NVIDIA set just six months ago.

Eos represents an enormous amount of compute. It leverages 10,752 GPUs strung together using NVIDIA’s Infiniband networking (moving a petabyte of data a second) and 860 terabytes of high-bandwidth memory (36PB/sec of aggregate bandwidth and 1.1PB/sec of interconnect) to deliver 40 exaflops of AI processing power. The entire cloud architecture comprises 1,344 nodes — individual servers that companies can rent access to for around $37,000 a month to expand their AI capabilities without building out their own infrastructure.

In all, NVIDIA set six records in nine benchmark tests: the 3.9-minute notch for GPT-3, a 2.5-minute mark to train a Stable Diffusion model using 1,024 Hopper GPUs, a minute even to train DLRM, 55.2 seconds for RetinaNet, 46 seconds for 3D U-Net and just 7.2 seconds for the BERT-Large model.

NVIDIA was quick to note that the 175 billion-parameter version of GPT-3 used in the benchmarking is not the full-sized iteration of the model (neither was the Stable Diffusion model). The full training run uses around 3.7 trillion tokens and is just flat-out too big and unwieldy for use as a benchmarking test. For example, it’d take 18 months to train on the older A100 system with 512 GPUs, whereas Eos needs just eight days.

So instead, NVIDIA and MLCommons, which administers the MLPerf standard, leverage a more compact version that uses 1 billion tokens (the smallest denominator unit of data that generative AI systems understand). This test uses a GPT-3 version with the same number of potential switches to flip as the full-size model (those 175 billion parameters), just a much more manageable data set (a billion tokens vs. 3.7 trillion).

The impressive improvement in performance, granted, came from the fact that this recent round of tests employed 10,752 H100 GPUs, compared to the 3,584 Hopper GPUs the company used in June’s benchmarking trials. However, NVIDIA explains that despite tripling the number of GPUs, it managed to maintain 2.8x scaling in performance — a 93 percent efficiency rate — through the generous use of software optimization.
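That efficiency figure is straightforward arithmetic: the observed speedup divided by the ideal, perfectly linear speedup from the added hardware. A quick check of the numbers in this piece:

```python
# Scaling efficiency = observed speedup / ideal (linear) speedup.
# Figures from the article: 3,584 Hopper GPUs in June vs. 10,752 H100s now,
# with a 2.8x observed performance gain.

gpus_june = 3584
gpus_now = 10752
observed_speedup = 2.8

ideal_speedup = gpus_now / gpus_june        # exactly 3.0x if scaling were linear
efficiency = observed_speedup / ideal_speedup

print(f"{ideal_speedup:.1f}x ideal, {efficiency:.0%} efficiency")
# 3.0x ideal, 93% efficiency
```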

“Scaling is a wonderful thing,” Salvator said. “But with scaling, you’re talking about more infrastructure, which can also mean things like more cost.” An efficiently scaled increase means users are “making the best use of your infrastructure so that you can basically just get your work done as fast [as possible] and get the most value out of the investment that your organization has made.”

The chipmaker was not alone in its development efforts. Microsoft’s Azure team submitted a similar 10,752 H100 GPU system for this round of benchmarking, and achieved results within two percent of NVIDIA’s.

“[The Azure team have] been able to achieve a performance that’s on par with the Eos supercomputer,” Dave Salvator, director of accelerated computing products at NVIDIA, told reporters during a Tuesday prebriefing. What’s more, “they are using Infiniband, but this is a commercially available instance. This isn’t some pristine laboratory system that will never have actual customers seeing the benefit of it. This is the actual instance that Azure makes available to its customers.”

 NVIDIA plans to apply these expanded compute abilities to a variety of tasks, including the company’s ongoing work in foundational model development, AI-assisted GPU design, neural rendering, multimodal generative AI and autonomous driving systems.

“Any good benchmark looking to maintain its market relevance has to continually update the workloads it’s going to throw at the hardware to best reflect the market it’s looking to serve,” Salvator said, noting that MLCommons has recently added an additional benchmark for testing model performance on Stable Diffusion tasks. “This is another exciting area of generative AI where we’re seeing all sorts of things being created” — from programming code to discovering protein chains.

These benchmarks are important because, as Salvator points out, the current state of generative AI marketing can be a bit of a “Wild West.” The lack of stringent oversight and regulation means, “we sometimes see with certain AI performance claims where you’re not quite sure about all the parameters that went into generating those particular claims.” MLPerf provides the professional assurance that the benchmark numbers companies generate using its tests “were reviewed, vetted, in some cases even challenged or questioned by other members of the consortium,” Salvator said. “It’s that sort of peer reviewing process that really brings credibility to these results.”

NVIDIA has been steadily focusing on its AI capabilities and applications in recent months. “We are at the iPhone moment for AI,” CEO Jensen Huang said during his GTC keynote in March. At that time, the company announced its DGX Cloud system, which portions out slivers of the supercomputer’s processing power — specifically by either eight H100 or A100 chips running 60GB of VRAM (640GB of memory in total). The company expanded its supercomputing portfolio with the release of the DGX GH200 at Computex in May.

This article originally appeared on Engadget at https://www.engadget.com/nvidias-eos-supercomputer-just-broke-its-own-ai-training-benchmark-record-170042546.html?src=rss 
