TMTPost 05-15
Tencent Touts Enough GPU Stockpile for AI Model Training amid Tightening U.S. Chip Curbs

TMTPOST -- Chinese internet behemoth Tencent Holdings Ltd. on Wednesday played down the tightening U.S. export controls on advanced chips that power artificial intelligence (AI).

Credit: China Daily

"I think we have a pretty strong stockpile of chips that we acquired previously and that would be very useful for us in executing our AI strategy,"   Tencent President Martin Lau said when asked about the impact of the recent U.S. licensing requirement for the high-end graphics processing unit ( GPU )   on an earnings conference on Wednesday.

As for the allocation of these chip stocks, Tencent will first use them "for the applications that will generate immediate returns for us", namely the advertising business and content recommendation products, while the training of Tencent's large language models (LLMs) is the company's second priority, Lau told analysts. Although LLM training requires higher-end chips, Lau noted that a new trend suggests it won't need as much silicon as people previously expected.

"Over the past few months, we start to move off the concept or the belief of the American tech companies, which they call the scaling law, which required continuous expansion of the training cluster," Lau said. "And now we can see even with a smaller cluster you can actually achieve very good training results." He continued that there's a lot of potential that Tencent   can get on the post-training side which do not necessarily meet very large clusters. Therefore, he concluded "we should have enough high-end chips to continue our training of models for a few more generations going forward."

Lau acknowledged that the demand for GPUs in AI inference workloads keeps growing, "and if we move into agentic AI, it requires even more tokens, so there's actually a lot of need on the inference side." He said there are many ways for Tencent to meet the expanding inference needs.

One of these ways is software optimization, as Lau believes there is still quite a bit of room for the company to keep improving inference efficiency. "So if you can improve inference efficiency 2x, then basically that means the amount of GPUs get doubled in terms of capacity," he said, calling investment in inference efficiency "a very good way" forward.
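Lau's point is simple arithmetic: effective serving capacity scales with both the number of GPUs and per-GPU throughput, so doubling efficiency on a fixed fleet doubles capacity. A minimal illustrative sketch in Python (the fleet size and throughput figures below are hypothetical assumptions, not Tencent data):

```python
# Illustrative sketch: effective inference capacity = number of GPUs x per-GPU throughput.
# All figures are hypothetical assumptions, not Tencent data.

def effective_capacity(num_gpus: int, tokens_per_gpu_per_sec: float) -> float:
    """Aggregate tokens per second a GPU fleet can serve."""
    return num_gpus * tokens_per_gpu_per_sec

baseline = effective_capacity(num_gpus=10_000, tokens_per_gpu_per_sec=500)
optimized = effective_capacity(num_gpus=10_000, tokens_per_gpu_per_sec=1_000)  # 2x efficiency

print(f"baseline:  {baseline:,.0f} tokens/s")
print(f"optimized: {optimized:,.0f} tokens/s -- same GPUs, doubled capacity")
```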

Lau said Tencent can also customize models of different sizes, since some applications do not require very large models. The company can tailor-make and distill models for different use cases, which saves on GPU usage for inference.
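Distillation, the technique Lau alludes to, trains a smaller "student" model to mimic a larger "teacher" so the student can serve a given use case on fewer or cheaper GPUs. A minimal sketch of the standard distillation loss in PyTorch (the temperature and mixing weight are generic illustrative choices, not anything Tencent disclosed):

```python
# Minimal knowledge-distillation loss sketch (PyTorch); illustrative only,
# not Tencent's actual training pipeline.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 2.0,       # softening temperature (assumed value)
                      alpha: float = 0.5):  # weight on the soft-target term (assumed value)
    """Blend soft-target KL divergence (teacher guidance) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```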

In addition, Lau believes Tencent can diversify its sourcing of chips. "We can potentially make use of other chips, compliant chips available in China or available for us to be imported, as well as ASICs and GPUs in some cases for smaller models," he said. "We just need to sort of keep exploring these avenues and spend probably more time on the software side rather than just force buying GPUs."

Noting the U.S. export ban on Nvidia's H20 chips and this week's rescission of the Biden-era rule to curb worldwide AI chip exports, Lau said Tencent has to strike a balance in a very dynamic situation. "On one end, sort of in a completely compliant way, and on the other end, sort of, we try to figure out the right solution for us to make sure that our AI strategy can still be executed," the president said.

Nvidia Corporation, the leading AI chipmaker, disclosed on April 9 that it had been informed by the U.S. government the same day that a license is required for exports to China, including its two special administrative regions Hong Kong and Macau, of the company's "H20 integrated circuits and any other circuits achieving the H20's memory bandwidth, interconnect bandwidth, or combination thereof."

China accounted for $17 billion, or 13%, of Nvidia's revenue in its fiscal year 2025, Bernstein noted earlier this month, though DA Davidson analyst Gil Luria estimates that as much as 40% of Nvidia's revenue ultimately comes from China once chip smuggling is taken into account.

The Bureau of Industry and Security (BIS) under the U.S. Department of Commerce announced on Tuesday that the department had initiated a rescission of the Biden administration's AI Diffusion Rule (AIDR), and that BIS enforcement officials would not enforce the rule's compliance requirements, which were set to come into effect on May 15.

Scrapping the AIDR can be deemed a win for Nvidia and other chipmakers. The export caps under the rule were unveiled during Biden's last week in office in January as part of the Democratic administration's efforts to limit access to the chips needed to power cutting-edge AI. The rule, had it taken effect, would have effectively closed backdoors that let Chinese firms circumvent existing U.S. export controls on high technology.

"These new requirements would have stifled American innovation and saddled companies with burdensome new regulatory requirements. The AI Diffusion Rule also would have undermined U.S. diplomatic relations with dozens of countries by downgrading them to second-tier status, "   BIS said in a statement. The agency said it will issue a replacement rule in the future.  

"The Trump Administration will pursue a bold, inclusive strategy to American AI technology with trusted foreign countries around the world, while keeping the technology out of the hands of our adversaries,"said Kesslar.
