AI Chip Giants Vie for Dominance: Google, Nvidia, Intel, Huawei AI Chips Set to Debut Next Week
By Zhang Shaohua
Today, Yu Chengdong, Huawei's senior vice president, posted a video on Weibo to hype up the company's upcoming AI chip. He emphasized that "the pursuit of speed never stops with imagination," predicting that the AI chip will debut on September 2 at IFA 2017.
At Huawei's mid-year performance media briefing last month, Yu Chengdong revealed that the company would release an AI chip this fall and that Huawei would be the first to put an artificial-intelligence processor in a smartphone. At the 2017 China Internet Conference, Yu Chengdong added that the chips produced by Huawei's HiSilicon division will integrate CPU, GPU, and AI functions, possibly based on the new AI chip design unveiled by ARM at Computex earlier this year.
According to Yu Chengdong’s latest video, Huawei’s AI processor is expected to significantly boost the data processing speed of the Kirin 970. If the AI chip can be integrated into the Huawei Mate 10 smartphone launching in October, the data processing capabilities of Huawei Mate 10 will be quite impressive.

Like Huawei, global tech giants such as Intel, Lenovo, Nvidia, Google, and Microsoft are all actively embracing AI, with AI chips becoming a top priority.
Intel
In an interview with Media Xinzhiyuan this month, Song Jiqiang, director of Intel China Research Institute, highlighted the significance of AI chips, stating that we need to leverage technology to process large amounts of data and create value for customers. In this process, the chip plays an indispensable role.
By 2020, it is conservatively estimated that 50 billion devices will be interconnected globally. Future data will no longer come solely from human actions like calling, playing mobile games, or sending emails. Instead, data will be generated by autonomous vehicles, smart homes, surveillance cameras, and more.
Each driverless car generates over 4,000 gigabytes of data daily, and not all of it can be transmitted over 5G networks. Instead, much of it must be processed and analyzed locally, with only selected results uploaded. This will demand local computing capabilities beyond those of today's servers.
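A back-of-envelope calculation (assuming the article's 4,000 GB/day figure) shows why streaming everything to the cloud is impractical: even spread evenly over 24 hours, one car would need a continuous uplink of roughly 46 MB/s.

```python
# Back-of-envelope: sustained upload rate needed to stream one
# driverless car's daily data (4,000 GB/day figure from the article).
GB_PER_DAY = 4000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

# Convert GB/day to MB/s (decimal units: 1 GB = 1000 MB).
rate_mb_per_s = GB_PER_DAY * 1000 / SECONDS_PER_DAY
print(f"{rate_mb_per_s:.1f} MB/s sustained")  # about 46.3 MB/s, around the clock
```

At roughly 46 MB/s per vehicle, around the clock, a fleet of cars would saturate any realistic wireless uplink, which is the article's point about processing data at the edge.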
As a traditional chip powerhouse, Intel launched a new generation of Xeon server chips in July this year, showing dramatic performance improvements: its deep learning performance is 2.2 times that of the previous generation, and it can handle both training and inference tasks. Intel also showcased its field-programmable gate array (FPGA) technology, which will play a critical role in AI's future, and plans to release the Lake Crest processor, designed specifically for deep learning workloads.
Lenovo
According to Lenovo Group President Yang Yuanqing, "The AI general-purpose processor chip is the strategic commanding height of the AI era." Senior Vice President He Zhiqiang echoed this view, emphasizing that AI chips are the engines of artificial intelligence and will play a decisive role in the development of the smart internet.
Just last week, Lenovo Capital Partners and top investors such as Alibaba Ventures jointly invested in Cambricon Technologies, reportedly the world's first AI-chip unicorn startup.
Nvidia
Over the past few years, Nvidia has shifted its focus to AI and deep learning. In May this year, Nvidia released a heavyweight AI processor: the Tesla V100.
The chip contains 21 billion transistors, far more than the 15 billion in the Pascal processor Nvidia released last year. Despite a die roughly the size of an Apple Watch face, it packs 5,120 CUDA (Compute Unified Device Architecture) cores and delivers double-precision floating-point performance of 7.5 teraflops. Nvidia CEO Jensen Huang said developing the chip cost about $3 billion, and the DGX-1 system built around it is priced at $149,000.
Google
Shifting to an "AI-first" strategy, Google revealed its TPU (Tensor Processing Unit) last year, a chip designed specifically for machine learning. Compared with contemporary CPUs and GPUs, the TPU runs machine-learning workloads 15 to 30 times faster and delivers 30 to 80 times better performance per watt.
At this year's Google I/O developer conference in May, Google announced a new product, the Cloud TPU, which combines four processing chips to deliver 180 teraflops of compute. Connecting 64 Cloud TPUs forms a supercomputer called a TPU Pod, delivering 11.5 petaflops of computational power, which will be an essential tool for AI research.
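The quoted Pod figure follows directly from the per-unit number, as a quick sanity check shows: 64 Cloud TPUs at 180 teraflops each come to about 11.5 petaflops.

```python
# Sanity check on the quoted TPU Pod figures: 64 Cloud TPUs at
# 180 teraflops each should land near the claimed 11.5 petaflops.
TFLOPS_PER_CLOUD_TPU = 180
TPUS_PER_POD = 64

pod_pflops = TFLOPS_PER_CLOUD_TPU * TPUS_PER_POD / 1000  # 1 PFLOPS = 1000 TFLOPS
print(f"{pod_pflops:.2f} PFLOPS")  # 11.52 PFLOPS, matching the ~11.5 figure
```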
Currently, TPU has been deployed across nearly all of Google’s products, including Google Search, Google Assistant, and even in AlphaGo’s matches against Lee Sedol, where TPU played a crucial role.
Microsoft
Last month, reports indicated that Microsoft will include an independently designed AI coprocessor in the next-generation HoloLens to analyze the content users see and hear on the device without wasting time transferring data to the cloud. This AI chip is still under development but will be integrated into the next-generation HoloLens Holographic Processing Unit (HPU). Microsoft stated that the AI coprocessor will be their first chip designed for mobile devices.
In recent years, Microsoft has invested heavily in developing its own AI chips. It created a motion-tracking processor for the Xbox Kinect gaming system and deployed customized field-programmable gate arrays (FPGAs) to compete with Google and Amazon in cloud services, buying the programmable chips from Intel subsidiary Altera and writing custom software for them to meet specific demands.
Last year, Microsoft demonstrated at a conference how thousands of these AI chips could translate all of English Wikipedia, about five million articles, into Spanish in under a tenth of a second. Moving forward, Microsoft aims to let customers on its cloud platform use AI chips running machine-learning algorithms for tasks such as image recognition and consumer purchase prediction.
With AI chips set to debut soon from these tech giants, the race for dominance in the AI era is heating up.