In a sign that the tech industry’s next big boom is taking hold, Nvidia on Wednesday predicted a sharp increase in the already huge demand for chips made to build artificial intelligence systems.
The Silicon Valley company’s products, called graphics processing units, or GPUs, are used to build most AI systems, including the popular ChatGPT chatbot. Tech companies ranging from start-ups to industry giants are struggling to get hold of them.
Nvidia said revenue for the second quarter, which ended in July, jumped 101 percent from a year earlier to $13.5 billion, while profit rose more than ninefold to about $6.2 billion, driven by strong demand from cloud computing services and other customers for chips that power AI systems.
The results were even better than Nvidia had forecast in late May, when its $11 billion revenue estimate for the quarter surprised Wall Street and helped push the company’s market value above $1 trillion for the first time.
Nvidia’s forecast and elevated market cap have become emblematic of the growing excitement around AI, which is changing many computing systems and the way they are programmed. They also raised interest in what Nvidia might say next about chip demand in its current quarter, which ends in October.
Nvidia forecast third-quarter sales of $16 billion, nearly three times the year-ago level and $3.7 billion above analysts’ average expectation of about $12.3 billion.
Chip makers’ financial performance is often considered a harbinger for the rest of the tech industry, and Nvidia’s strong results rekindled enthusiasm for tech stocks on Wall Street. While other tech companies like Google and Microsoft are spending billions on AI and so far making little money from it, Nvidia is reaping the benefits now.
Nvidia Chief Executive Jensen Huang said major cloud services and other companies are investing to bring Nvidia’s AI technology to every industry. “A new computing era has begun,” he said in prepared remarks.
Nvidia’s share price was up more than 8 percent in after-hours trading.
Until recently, Nvidia received the largest portion of its revenue from the sale of GPUs for rendering images in video games. But AI researchers began using those chips in 2012 for tasks like machine learning, a trend that Nvidia exploited over the years by adding enhancements to its GPUs and software to ease the work of AI programmers.
Chip sales to data centers, where most AI training is done, are now the company’s biggest business. Nvidia said revenue from that business rose 171 percent to $10.3 billion in the second quarter.
Patrick Moorhead, an analyst at Moor Insights & Strategy, said the rush to add generative AI capability has become a basic imperative for corporate heads and boards of directors. Nvidia’s only limitation at the moment, he said, is its struggle to supply enough chips — a gap that could create opportunities for major chip companies like Intel and Advanced Micro Devices, and start-ups like Groq.
Nvidia’s strong sales stand in stark contrast to the fortunes of some of its chip industry peers, which have been hit by soft demand for personal computers and data center servers used for general-purpose tasks. Intel said in late July that second-quarter revenue declined 15 percent, though the results were better than Wall Street expected. Advanced Micro Devices’ revenue fell 18 percent during the same period.
Some analysts believe that spending on AI-specific hardware, such as Nvidia’s chips and the systems that use them, is taking money away from spending on other data center infrastructure. Market research firm IDC estimates that cloud services will increase their spending on server systems for AI by 68 percent over the next five years.
Demand has been particularly strong for the H100, a new GPU made by Nvidia for AI applications that began shipping in September. Companies large and small are struggling to find supplies of the chips, which are manufactured in an advanced production process and require equally sophisticated packaging that combines GPUs with specialized memory chips.
Nvidia’s ability to increase deliveries of the H100 is largely tied to the actions of Taiwan Semiconductor Manufacturing Co., which handles both the manufacturing of the GPUs and their packaging.
Industry executives expect the H100 shortage to continue throughout 2024, a problem for AI start-ups and cloud services hoping to sell computing services that take advantage of the new GPUs.