
“The more you buy, the more you save.”

Speaking recently in a keynote at SIGGRAPH, the annual conference of the ACM Special Interest Group on Computer Graphics and Interactive Techniques, Nvidia CEO Jensen Huang called the newly released GH200 “the world’s fastest AI chip.”

Huang said that to meet the growing demands of generative AI, data centers need purpose-built accelerated computing platforms. The new GH200 Grace Hopper Superchip platform offers improved memory technology and bandwidth for higher throughput, the ability to connect GPUs and aggregate their performance without loss, and a server design that can be easily deployed across a data center.

Huang said that in the age of AI, Nvidia’s technology can replace traditional data centers: an $8 million investment in the new technology can replace a $100 million facility built with older equipment, while cutting electricity consumption by a factor of 20. “That’s why data centers are moving to accelerated computing. The more you buy, the more you save.”
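Taken at face value, the claim reduces to two simple ratios. A minimal sketch (the dollar and power figures are the ones Huang quoted, not independent measurements; the function names are illustrative):

```python
def capex_ratio(old_cost_usd: float, new_cost_usd: float) -> float:
    """How many times cheaper the new build is, per the quoted figures."""
    return old_cost_usd / new_cost_usd


def new_power(old_power_kw: float, reduction_factor: float = 20.0) -> float:
    """Power draw of the new facility, given the claimed 20x reduction."""
    return old_power_kw / reduction_factor


# Keynote figures: a $100M legacy facility vs. an $8M accelerated build.
assert capex_ratio(100_000_000, 8_000_000) == 12.5

# A hypothetical 1,000 kW legacy facility would drop to 50 kW under the claim.
assert new_power(1000.0) == 50.0
```

So the quoted numbers imply a 12.5x reduction in capital cost alongside the 20x reduction in power, which is the arithmetic behind “the more you buy, the more you save.”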

The chip market’s destocking cycle is drawing to a close, but demand has yet to recover. Huang, who claims never to do strategic planning or talk about market share, is hardly sitting idle: with the GH200, Nvidia has distanced itself from the low-end, homogeneous red-ocean competition and further consolidated its position as the leader in AI chips.

With the popularity of ChatGPT, generative AI foundation models are rapidly moving out of research laboratories and into real-world scenarios and applications. Management consulting firm McKinsey believes that as generative AI is applied in depth, high tech, banking, retail and consumer packaged goods, health care, and advanced manufacturing will be the most affected industries, generating roughly US$1 trillion in economic value annually. Nvidia’s AI chips are an indispensable part of that picture.


Large Chinese orders for Nvidia GPUs

Nvidia’s GPUs have become the hottest commodity in the global tech industry since the breakthrough of generative AI. However, the worldwide GPU shortage has the tech giants worried.

Chinese internet giants have ordered Nvidia chips worth $5 billion, according to sources. Baidu, ByteDance, Tencent, and Alibaba have reportedly placed orders worth US$1 billion with Nvidia for about 100,000 A800 chips, scheduled for delivery this year. Because of restrictions imposed by the U.S. government, Chinese companies can only buy the A800, which is slightly less powerful than Nvidia’s cutting-edge A100. The same companies have also purchased a further $4 billion worth of GPUs for delivery in 2024.

These orders give the internet giants the high-performance hardware needed for generative AI systems, but with global demand for GPUs far outstripping supply, Nvidia GPU prices have risen as well. According to one Nvidia reseller, A800 prices in the channel have climbed by more than 50%, reflecting the market’s growing demand for Nvidia GPUs.

At the same time, China’s large-model products have begun to achieve commercial viability. Vendors are stepping up their compute deployments, data-compliance work, and product refinement, and some products have already been well received by early industry customers. As these deployments deepen, vendors’ demand for GPUs will be released further, which will only worsen the supply-demand imbalance and push GPU prices to new highs.

Among China’s large-model companies, iFlytek will release a major upgrade, Spark Cognitive Large Model V2.0, on August 15, with breakthroughs in code capability and multimodal interaction; the company expects the Spark model to reach its goal of benchmarking against ChatGPT by October 24.

Baidu released an internal iOS beta of Wenxin Yiyan in early July, and 150,000 companies have applied for access to its testing program. Wenxin and more than 300 ecosystem partners have achieved good test results in more than 400 scenarios, taking the lead in more than ten industries including energy, automotive, government affairs, and transportation. As policies for large AI models are gradually issued and refined, Wenxin’s consumer-facing business may begin rolling out, centered on Baidu’s own search and small-business services.

On August 9, 360 released the “360 Security Model”, China’s first deliverable large model for the security industry.

In addition, Huawei’s HarmonyOS 4.0 has been connected to the Pangu large model, and Tencent’s self-developed Hunyuan large model has entered internal application testing.


Saudi Arabia, the UAE, China, and the U.S. rush to buy AI chips

Supply of AI chips, the hottest commodity in Silicon Valley right now, is being squeezed by the global AI race.

So far, the world’s most advanced large language models are owned by American companies such as OpenAI and Google, which are also major buyers of Nvidia’s H100 and A100 chips. The H100 is Nvidia’s latest chip; the A100 is its predecessor.

According to multiple sources close to Nvidia and its chip foundry TSMC, Nvidia will ship about 550,000 of the latest H100 chips globally in 2023, mainly to US technology companies.

Gulf powerhouses Saudi Arabia and the United Arab Emirates have joined the AI arms race, procuring thousands of the high-performance Nvidia chips critical to building AI software. To accelerate economic development and transformation, both countries are pursuing ambitious AI initiatives: each has launched a national AI strategy and publicly declared its goal of becoming a global leader in AI.

Saudi Arabia has purchased at least 3,000 Nvidia H100 chips, worth about $120 million, through the King Abdullah University of Science and Technology (Kaust), a public research institution, for delivery by the end of 2023, according to people familiar with the matter. Is that a lot of chips? For comparison, OpenAI reportedly trained the advanced GPT-3 model on 1,024 A100 chips in a little over a month.
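The per-chip prices implied by these reported purchases can be sanity-checked with simple division (both totals and chip counts are the figures reported in this article; the function name is illustrative):

```python
def implied_unit_price(total_usd: float, unit_count: int) -> float:
    """Average price per chip implied by a reported bulk purchase."""
    return total_usd / unit_count


# Saudi Kaust purchase reported above: ~$120M for at least 3,000 H100s.
assert implied_unit_price(120_000_000, 3_000) == 40_000.0

# Chinese internet giants' order reported earlier: ~$1B for ~100,000 A800s.
assert implied_unit_price(1_000_000_000, 100_000) == 10_000.0
```

On these reported figures, an H100 is running roughly four times the price of an export-compliant A800, which is consistent with the reseller markups described earlier.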

Kaust also has at least 200 A100 chips and is developing a supercomputer called Shaheen III, due to come online this year, the people said. The machine will be equipped with 700 Nvidia Grace Hopper Superchips, designed for cutting-edge AI applications.

To sum up, the arrival of the AI era is an established trend, and building an AI business has become an important strategy for countries around the world. AI chips, especially those of GPU leader Nvidia, are increasingly sought after for their excellent performance and unrivaled competitiveness, and are bound to be prized by the market. Against the backdrop of sluggish demand in traditional chip segments, they have become the bright spot of the chip industry, with deep and lasting market potential.

With these chips now a hard requirement of any AI strategy, recall Huang’s earlier remark: Nvidia’s technology can replace the traditional data center, an $8 million investment in the new technology can replace a $100 million facility built with older equipment, and electricity consumption can be cut by a factor of 20. “The more you buy, the more you save.” It is hard to argue with that.