Amazon to use Nvidia tech in AI chips, roll out new servers

Reuters · 12-03

<a href="https://laohu8.com/S/AMZN">Amazon</a> to use <a href="https://laohu8.com/S/NVDA">Nvidia</a> tech in AI chips, roll out new servers

AWS to adopt Nvidia's NVLink Fusion in future AI chips

Amazon's cloud computing unit unveils AI Factories for faster AI model training

New AWS servers offer 4 times computing power, 40% less energy use

By Stephen Nellis and Greg Bensinger

LAS VEGAS, Dec 2 (Reuters) - Amazon.com's AMZN.O AWS cloud computing unit on Tuesday said it will adopt key Nvidia NVDA.O technology in future generations of its artificial intelligence computing chips, as the company ramps up efforts to attract major AI customers to its services.

AWS, or Amazon Web Services, said it will adopt a technology called "NVLink Fusion" in a future chip known as Trainium4, which has no specified release date. The NVLink technology creates speedy connections between different kinds of chips and is one of Nvidia's crown jewels.

The companies made the announcement as part of AWS' annual week-long cloud computing conference in Las Vegas, which draws some 60,000 people. Amazon is also expected to show off new versions of its Nova AI model, which was initially unveiled last year.

Nvidia has been pushing to get other chip firms to adopt its NVLink technology, with Intel, Qualcomm and now AWS on board.

The technology will help AWS build bigger AI servers that can recognize and communicate with one another faster, a critical factor in training large AI models, in which thousands of machines must be strung together. As part of the Nvidia partnership, customers will have access to what AWS is calling AI Factories, exclusive AI infrastructure inside their own data centers for greater speed and readiness.

"Together, Nvidia and AWS are creating the compute fabric for the AI industrial revolution - bringing advanced AI to every company, in every country, and accelerating the world's path to intelligence," Nvidia CEO Jensen Huang said in a statement.

Separately, Amazon said it is rolling out new servers based on a chip called Trainium3. The new servers, available on Tuesday, each contain 144 chips and offer more than four times the computing power of AWS' previous generation of AI servers while using 40% less power, Dave Brown, vice president of AWS compute and machine learning services, told Reuters.

Brown did not give absolute figures on power or performance, but said AWS aims to compete with rivals - including Nvidia - based on price.

"We've got to prove to them that we have a product that gives them the performance that they need and get a right price point so they get that price-performance benefit," Brown said. "That means that they can say, 'Hey, yeah, that's the chip I want to go and use.'"

(Reporting by Stephen Nellis in San Francisco and Greg Bensinger in Las Vegas; Editing by Paul Simao)

((stephen.nellis@thomsonreuters.com))
