by Prasanth Aby Thomas

Nvidia’s strong earnings highlight AI’s rapid incursion across industries

News
Feb 22, 2024 · 5 mins
Artificial Intelligence, Technology Industry

Despite the potential for heightened growth from the swift expansion of AI, supply limitations and broader geopolitical issues troubling the semiconductor industry continue to cause some concerns for the company.


Nvidia has announced stellar earnings for the fourth quarter, further fueling the momentum behind its soaring valuations. This highlights a broad interest in AI across various sectors, including enterprise, healthcare, and automotive.

Revenue for the quarter, which ended January 28, reached $22.1 billion, marking a surge of 265% year-on-year. For fiscal 2024, revenue climbed to $60.9 billion, a 126% rise from the previous year.

The excitement surrounding AI has boosted many companies lately, but Nvidia appears to be the major tech company that has benefited the most. Its valuation recently surged past those of Alphabet and Amazon, and it dethroned Tesla as the most traded stock.

However, despite the growth potential from AI’s rapid expansion, supply limitations and the broader geopolitical issues troubling the semiconductor industry remain a concern for the company.

AI factories fueling growth

For fiscal 2024, data centers were a significant growth segment, with revenue reaching $47.5 billion, more than tripling the previous year’s figure. The fourth quarter saw record data center revenue of $18.4 billion, up 409% year-over-year.

This surge was primarily fueled by Nvidia’s Hopper GPU computing platform and InfiniBand end-to-end networking.

“As Moore’s Law slows while computing demand continues to skyrocket, companies may accelerate every workload possible to drive future improvement in performance, TCO, and energy efficiency,” Colette Kress, EVP & CFO of Nvidia, said in the earnings call. “At the same time, companies have started to build the next generation of modern data centers, what we refer to as AI factories, purpose-built to refine raw data and produce valuable intelligence in the era of generative AI.”

AI breaking barriers across verticals

The growth in data center operations during the fourth quarter was attributed to the training and inference of generative AI and large language models, spanning a diverse array of industries, use cases, and regions.

At the earnings call, Nvidia noted that AI is expanding across various sectors and solutions, with the field of large language models flourishing. Companies like Recursion Pharmaceuticals and Generate:Biomedicines are developing foundation models for biology, while adoption is also spreading across industries including automotive, healthcare, and financial services.

“Almost 80 vehicle manufacturers across global OEMs, new energy vehicles, trucking, robotaxi, and tier 1 suppliers are using NVIDIA’s AI infrastructure to train LLMs and other AI models for automated driving and AI cockpit applications,” Kress said.

Nvidia’s suite of AI solutions, spanning text generation, chatbots, and even protein prediction, empowers businesses to develop tailored AI applications for automating their operations, according to Thomas George, president of Cybermedia Group and CMR.

“Forging strategic alliances with tech titans such as Google, Microsoft, and Amazon have also bolstered Nvidia’s standing as a frontrunner in AI technology,” George added. “The ubiquitous integration of Nvidia’s GPUs, exemplified by groundbreaking initiatives like ChatGPT, has driven growth in the company’s AI revenue streams and enriched its business portfolio with diversified offerings.”

Navigating supply constraints and China restrictions

The Hopper GPU, the main driver of the company’s data center revenue, presents a potential concern due to expected supply constraints, as demand substantially outstrips supply. Replying to analysts’ questions about how the supply constraints may pan out, Jensen Huang, CEO of Nvidia, said that while supply is improving overall, Hopper’s complexity makes the shortfall difficult to resolve.

“…the NVIDIA Hopper GPU has 35,000 parts,” said Huang. “It weighs 70 pounds. These things are really complicated things we’ve built. People call it an AI supercomputer for good reason. If you ever look in the back of the data center, the systems and the cabling system are mind-boggling. It is the most dense complex cabling system for networking the world’s ever seen.”

Recent headlines have also highlighted concerns regarding the US government’s restrictions on exports to China, which affect companies like Nvidia. Following these restrictions, Nvidia had to pause shipments of its most advanced data center GPUs to China and revise its product offerings for the Chinese market.

“And we’re going to do our best to compete in that marketplace and succeed in that marketplace…within the specifications of the restriction,” Huang added.

Leveraging enterprise customers’ foray into AI

Leading OEMs such as Dell, HPE, Lenovo, and Super Micro are collaborating with Nvidia through their global sales channels to extend Nvidia’s AI solutions to enterprises around the globe. Nvidia is poised to ship Spectrum-X within this quarter, according to Kress.

“Nvidia is now entering the ethernet networking space with the launch of our new Spectrum-X end-to-end offering designed for an AI-optimized networking for the data center,” Kress said. “Spectrum-X introduces new technologies over ethernet that are purpose-built for AI.”

The technologies embedded in the Spectrum switch, BlueField DPU, and associated software stack are expected to deliver 1.6 times the networking performance for AI processing compared with traditional Ethernet setups.