Nvidia is seeking to reduce its dependence on big tech companies by striking new partnerships to sell its artificial intelligence chips to nation states, enterprise groups and challengers to the likes of Microsoft, Amazon and Google.
This week, the US chip giant announced a multibillion-dollar chip deal with Saudi Arabia's Humain, while the United Arab Emirates announced plans to build one of the world's biggest data centers in coordination with the US government, as Gulf states push to build out massive AI infrastructure.
These "sovereign" deals form a crucial part of Nvidia's push to court customers far beyond Silicon Valley. According to executives, industry insiders and analysts, the $3.2tn chipmaker intends to grow its business beyond the so-called hyperscalers, the large cloud computing groups that Nvidia says account for more than half of its data center revenue.
The US company is working to strengthen potential competitors to Amazon Web Services, Microsoft's Azure and Google Cloud. These include so-called "neoclouds" such as CoreWeave, Nebius, Crusoe and Lambda, which form part of its growing network of "Nvidia Cloud Partners".

These companies receive preferential access to the chipmaker's internal resources, such as teams that advise on how to design and optimize their data centers for its specialized hardware.
Nvidia also makes it easier for its cloud partners to work with suppliers that integrate its chips into servers and other data center equipment, for example by speeding up the procurement process. In some cases, Nvidia has also invested in the neoclouds, notably CoreWeave and Nebius.
In February, the chipmaker announced that CoreWeave was "the first cloud service provider to make the Nvidia Blackwell platform generally available", referring to its latest generation of processors for AI data centers.
In recent months, Nvidia has also struck alliances with suppliers including Cisco, Dell and HP to help sell to enterprise customers who manage their own corporate IT infrastructure rather than outsourcing to the cloud.
"I am more certain [of the business opportunity beyond the big cloud providers] today than I was a year ago," Jensen Huang, Nvidia's chief executive, told the Financial Times in March.

Huang's tour of the Gulf this week alongside US President Donald Trump showcased a strategy the company hopes to replicate around the world.
Analysts believe that the agreements with Humain, Saudi Arabia's new AI company, and Emirati AI group G42's plans for a giant data center in Abu Dhabi will add billions of dollars to its annual revenue. Nvidia executives say the company has been approached by several other governments seeking to buy its chips for similar sovereign projects.
Huang is becoming increasingly explicit about Nvidia's efforts to diversify its business. In 2024, the launch of its Blackwell chips was accompanied by endorsements from all the big tech companies. But when Huang unveiled its successor, Rubin, at his GTC conference in March, those allies were less visible in his presentation, replaced by CoreWeave and Cisco.
He said at the event that "every industry" would have its own "AI factories", purpose-built facilities dedicated to its powerful chips, representing a new sales opportunity worth hundreds of billions of dollars.
The challenge for Nvidia, however, is that the big tech companies are the "only ones who can monetize AI sustainably", according to one neocloud executive who works closely with the chipmaker. "The enterprise market may be the next frontier, but it's not there yet."
Nvidia's enterprise data center sales doubled year on year in its latest fiscal quarter, which ended in January, while regional cloud providers accounted for a growing share of its sales. Even so, Nvidia has warned investors in regulatory filings that it still depends on a "limited number of customers", widely believed to be the big tech companies that operate the largest cloud and consumer internet services.
Those same big tech groups are developing their own rival AI chips and pushing them to their customers as an alternative to Nvidia's.
Amazon, the largest cloud provider, is eyeing a position in AI training, a market Nvidia has dominated in the two and a half years since OpenAI's ChatGPT launched the generative AI boom. AI start-up Anthropic, which counts Amazon as a major investor, uses AWS Trainium processors to train and run its next models.
"There are a number of customers right now kicking the tires on Trainium and working on models," said Dave Brown, vice-president of compute and networking at AWS.
Vipul Ved Prakash, chief executive of Together AI, an open-source AI neocloud that became an Nvidia Cloud Partner in March, said the designation "gives you very good access to the Nvidia organization itself".
"If the hyperscalers are eventually going to become competitors and stop being customers, it would be important for Nvidia to have its own cloud ecosystem. I think this is one of the focus areas, to build this."
An executive at another neocloud provider said the chipmaker was "concerned" that the big tech companies were moving to their own custom chips.
"That's why, I think, they are investing in the neoclouds. Half of their revenue is hyperscalers, but they will eventually lose it, more or less."