With its meteoric rise, Nvidia has also become a massive target.

Mon Feb 26 2024
Rajesh Sharma
Jensen Huang is fond of saying he has no long-term strategy. “Now is the most important time,” he says, which is why he refuses to wear a watch.

This is a time of tremendous success—and more than a hint of danger—for the CEO of Nvidia.

Huang has become the undisputed king of the tech world following an AI-driven business bonanza that has propelled Nvidia’s valuation into the stratosphere of $2 trillion. He has successfully shifted the focus of his company’s semiconductor designs from computer graphics to training AI systems, making his chips indispensable to tech giants like Tesla and Microsoft. The effort has brought Huang a personal fortune of about $68 billion.

The rise of the Taiwanese immigrant who, more than three decades ago, famously hatched business plans at a Denny’s restaurant is nothing short of remarkable. At Nvidia’s annual conference, he now wears his trademark black jacket, jeans, and sneakers while mingling with world leaders, courting fellow billionaires who are hooked on his chips, and delivering lengthy speeches on his views on artificial intelligence.

Many businesses, including Nvidia’s competitors and even some of the customers shelling out billions of dollars a year for its essential chips, have set their sights on undermining the company’s supremacy.

Both Advanced Micro Devices and Intel have quickened the pace at which they release AI chips. Amazon, Google, and Microsoft all have their own chip designs or are working on them, even though they buy large volumes of Nvidia’s AI-training chips. Microsoft and Intel announced a partnership on Wednesday, coinciding with Nvidia’s latest blockbuster quarterly results; under the deal, Microsoft will use Intel’s manufacturing operation to create custom chips.

While Huang is focused on the here and now, he is also deeply concerned with the company’s future. He is developing new applications for Nvidia chips, funding entrepreneurs who are building their companies around his technology, and pitching it to governments around the world so they can create their own artificial intelligence infrastructure.

The 61-year-old Huang promotes a startup-like culture in which his 30,000 employees behave as if the company were about to go bankrupt, drawing inspiration from Andy Grove’s “Only the Paranoid Survive,” a book on turning near-failure into lasting success. He says he seeks out difficult problems, finds solutions, and then puts the pain of the struggle behind him.

“It must be the feeling that marathoners go through,” he told a gathering of Indian technology graduates in Santa Clara in September, adding that the entertainment value of pain and suffering should not be underestimated, a line he says he now uses less often than he once did.

Many people, including Huang’s rivals and critics, are surprised that Nvidia has maintained its lead in the artificial intelligence race. Revenue in the most recent quarter was more than 3.5 times that of the same period last year, and profit increased by a factor of about nine. The stock has risen sixfold in sixteen months. Expectations are high.

In the lightning-fast world of Silicon Valley, Huang’s tenure as CEO of Nvidia, which he founded over 30 years ago, is virtually unprecedented. He came up with the idea at a Denny’s in San Jose with his co-founders, engineers Chris Malachowsky and Curtis Priem. Last September, after Nvidia reached a $1 trillion valuation, the booth where they talked received a commemorative plaque, one already outdated by the company’s latest milestone.

Accelerated computing was Nvidia’s founding idea. Central processing units (CPUs), the chips at the heart of every computer, are digital jacks-of-all-trades that can perform a wide range of tasks reasonably well. Accelerated computing rests on the idea that specialized chips can do certain jobs much better. Nvidia’s original goal was to enhance computer graphics.

Graphics processing units (GPUs) were the company’s bread and butter for the first two decades as it catered to PC gamers who craved higher resolution and quicker screen refresh rates.

After seeing that medical researchers were using Nvidia graphics cards, Huang opened the GPUs up to general-purpose programming in 2006.

A few years later, outside researchers found Nvidia’s GPUs to be excellent for AI computing. The mathematics required to build intricate AI systems maps naturally onto graphics circuits, which perform many calculations simultaneously.
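To make that alignment concrete: training a neural network is dominated by large matrix multiplications, in which thousands of independent arithmetic operations can run at once, exactly the workload graphics chips were built for. A minimal sketch in Python, with NumPy standing in for the GPU and all sizes invented for illustration:

```python
import numpy as np

# One neural-network layer boils down to a single large matrix multiplication.
# Every output value is an independent dot product over the same inputs, so a
# GPU can compute thousands of them in parallel. (Sizes here are made up.)
rng = np.random.default_rng(0)
inputs = rng.standard_normal((512, 1024))    # a batch of 512 examples
weights = rng.standard_normal((1024, 4096))  # one layer's parameters
outputs = inputs @ weights                   # 512 x 4096 independent dot products
print(outputs.shape)
```

On a CPU those dot products largely run one after another; a GPU’s thousands of cores evaluate them side by side, which is why the same hardware that once shaded game pixels now trains language models.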

Language models such as OpenAI’s ChatGPT, part of the AI systems driving the current generative AI boom, rely heavily on Nvidia GPUs. Training these systems requires data processing on an unprecedented scale; tens of thousands of Nvidia GPUs were used to build ChatGPT.

The launch of ChatGPT in late 2022 set off a technological arms race, and demand for Nvidia GPUs outstripped production. While supplies have improved, the company said on Wednesday that they remain tight.

Almost overnight, the tech world’s center of gravity shifted to Nvidia. Tech moguls griped about the scarcity of H100s, the cutting-edge GPUs the company manufactures.

“GPUs at this point are considerably harder to get than drugs,” Elon Musk quipped last May, while working on AI capabilities at Tesla and launching a new startup called xAI.

Larry Ellison, chairman and founder of business-software giant Oracle, described at a company conference in September a dinner he and Musk had with Huang at Nobu, a posh Japanese restaurant in Palo Alto, California. He and Musk, Ellison recalled, were simply pleading: “An hour of begging and sushi.”

Meta Platforms CEO Mark Zuckerberg recently touted plans to invest billions more dollars this year in his expanding artificial intelligence ambitions, predicting that Meta would have 350,000 Nvidia H100s by year’s end.

Because of the scarcity of Nvidia chips, Huang now wields tremendous influence: who comes out on top in the AI race may depend on how Nvidia distributes its limited supplies. Nvidia has declined to say how it makes those decisions. Asked by an analyst on Wednesday, Huang said the company strives for equitable allocation and avoids selling chips to customers who won’t use them right away.

Nvidia frequently partners with cloud-computing businesses and provides them with chip allocations, Huang said, moves that benefit those companies.

More than a year into the AI boom, Nvidia is estimated to control over 80% of the AI chip market.

If customers succeed in creating their own AI processors, billions of dollars of Nvidia’s revenue could be at risk. In-house AI processors developed by tech giants like Google and Amazon are getting better all the time, and the likes of Microsoft and Meta have only recently joined the fray.

Nvidia’s largest customers are critically important to it. One customer, which the company didn’t name, accounted for almost $11 billion, nearly 20% of Nvidia’s sales, in its most recent fiscal year.

In the most recent quarter, cloud-computing companies such as Google, Microsoft, and Amazon spent over $35 billion on capital expenditures and accounted for more than $9.2 billion of Nvidia’s sales, over a quarter of the total.

In a statement, Microsoft said its in-house processors complement Nvidia’s rather than substitute for them, giving customers more choices on price and performance. People familiar with Google’s chip strategy say the firm likewise offers both its own processors and Nvidia’s chips, letting customers pick the best fit for their needs and budget. Amazon extended its long-term partnership with Nvidia last year and boasted that its cloud service offered the widest variety of the company’s chips, in addition to the homegrown alternatives Amazon is developing.

Meanwhile, Nvidia said on Wednesday that it expects gross profit margins of around 76% in the current quarter, numbers that make the company a tempting target for customers and competitors alike. According to Raymond James analyst Srini Pajjuri, each of Nvidia’s high-end H100 chips costs the company somewhat more than $3,000 to make and retails for around $25,000.
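A quick back-of-the-envelope check on those figures (the per-chip numbers are the analyst’s estimates, not Nvidia’s own disclosures) shows why the H100 draws so much envy:

```python
# Analyst-estimated economics of one H100: ~$3,000 to make, ~$25,000 at retail.
cost, price = 3_000, 25_000
per_chip_margin = (price - cost) / price
print(f"per-chip gross margin: {per_chip_margin:.0%}")  # well above the ~76% company-wide figure
```

The company-wide margin comes out lower, presumably because it blends in cheaper products and other costs of revenue.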

Chips designed to rival Nvidia’s are already on sale from Advanced Micro Devices, while Intel and British chip designer Arm Holdings are both promoting AI-ready offerings, and shifts in the market may play to their strengths. Nvidia may have the best chips for training AI models, but Intel, AMD, and a slew of AI chip startups can hold their own when it comes to running trained systems, a task known as inference.

Huang has also been cultivating prospective government customers to expand Nvidia’s business. Speaking at a government conference in Dubai earlier this month, he promoted “sovereign AI,” the idea that governments should not outsource the development of artificial intelligence but instead maintain their own data and computing infrastructure.

However, Huang must navigate geopolitical issues surrounding Nvidia’s semiconductors, particularly regarding China. For the last two years, the United States has restricted the export of AI chips from companies like Nvidia to China, over fears that the Chinese military could use the processors to develop sophisticated weaponry and launch cyberattacks. Nvidia’s Chinese sales declined precipitously in the final quarter of its latest fiscal year.

Despite investigations into Nvidia’s business practices initiated by competition regulators in China, the United Kingdom, France, and the European Union—all of which are aware of the company’s commanding market share—no consequences have been imposed thus far.

As Nvidia has grown, Huang has poured resources into businesses in artificial intelligence, robotics, automation, and healthcare that are building technologies around Nvidia’s chips. As of the end of January, the company’s stakes in other companies were worth over $1.55 billion, a fivefold increase from the previous fiscal year. According to data compiled by Dealogic, Nvidia invested in around 30 businesses in 2023, more than triple its activity the previous year.

Nvidia’s investment approach reflects its co-founder’s propensity to place bets on a vision ten years out, according to Umesh Padval, a managing director at venture-capital firm Thomvest Ventures who has known Huang for decades. Last year, Thomvest and Nvidia both took part in an investment round for the Canadian AI startup Cohere.

“He began insistently asking, ‘Listen, how can I build the ecosystem to increase sales of my chips and systems?’” Padval said.

Nvidia also poured money into CoreWeave, a cloud-computing startup that runs massive data centers stocked with Nvidia’s AI chips and rents out their processing power, putting rivals like Amazon and Microsoft in its crosshairs.

Unlike its bigger competitors, CoreWeave has bought a sizable fleet of Nvidia’s most cutting-edge chips rather than investing in in-house chip development. The chips are valuable enough that CoreWeave used them as collateral in a $2.3 billion financing arrangement last year.

Huang is also counting on the staying power of CUDA, Nvidia’s software gateway for programming its GPUs, which has become central to the AI boom. Over the years, CUDA’s millions of lines of code have simplified the creation of new artificial intelligence applications that run on Nvidia’s GPUs. Competitors have built their own AI software to challenge the company, though none has achieved widespread momentum.

Because access to Nvidia’s chips is so valuable, customers who do business with its rivals may fear retaliation. Groq CEO Jonathan Ross argued that customers seeking alternatives were understandably afraid to poke the beast.

He said many of the people Groq meets with claim Nvidia would deny the meeting ever happened if it found out. “The issue is that you’re required to prepay Nvidia a year before you can receive your hardware. The delivery timeframe is uncertain, and it’s frustrating to deal with third parties, so it’s understandable that there might be some delay.”

Nvidia’s expansion has changed Huang’s life. He says he no longer drives himself, for safety reasons; someone else chauffeurs him in a black Mercedes EQS electric vehicle.

Beyond artificial intelligence, one of his long-term goals is to see his chips used in computational biology and drug discovery, a subject he often raises when asked about his future plans. Nvidia put $50 million into the drug-discovery startup Recursion last year.

Huang calls it one of his “zero-billion-dollar markets,” and many in the chip industry see it the way artificial intelligence looked a decade ago: a seeming long shot with a tiny potential customer base.

At a healthcare conference hosted by J.P. Morgan in January, he talked with Recursion’s chairman about specialized industry concepts like structure prediction, cryo-electron microscopy, and X-ray crystallography.

“I don’t intend to brag, but I don’t think any other CEO of a semiconductor company would speak that way,” he remarked. “Please do not hesitate to contact us; we have the algorithms, mathematics, and expertise to work with you as a partner throughout the whole drug and medicine discovery pipeline.”

Huang says that jumping into emerging businesses like AI early gives him more time to solve difficult problems.

“I’m not very fast, but I’m very persistent,” he recently told a group of Chinese semiconductor professionals. “I work on a single project for hours at a time. As a result, I have plenty of time to tackle challenging problems if I so desire, and our organization is happy to do the same.”

Rajesh Sharma

Rajesh Sharma is a stock-market correspondent for South East Asia based in Mumbai. He has been covering Asian markets for more than five years.