Taipei, Wednesday, Nov 27, 2024, 06:27

Article

AI-Driven Industrial Transformation and Trends

By Ji Ping
Published: Sep 01,2024

The immense computing power of AI is reshaping the technology industry. From the development of ChatGPT by OpenAI in late 2022 to the launch of Sora, a text-to-image AI capable of generating 60-second videos from single prompts, in February 2024, AI's extraordinary computing, learning abilities, and evolutionary speed are evident. OpenAI also unveiled GPT-4o, a higher-performance and more efficient AI model, ahead of the I/O developer conference. GPT-4o can engage in realistic voice conversations, boasting comprehension and dialogue capabilities comparable to a super Siri. It can process 50 languages and operates twice as fast as GPT-4 Turbo at half the cost. Powering GPT-4o is Nvidia's graphics processing unit (GPU).

More on This

SnapMagic and Harwin Partner to Help Engineers Design Faster

SnapMagic and Harwin have partnered to release over 3500 new computer-aided design (CAD) models that will help engineers design electronics faster...

NexGen Wafer Systems Introduces SERENO: A Versatile, High-Throughput Wet Etch and Clean Solution

NexGen Wafer Systems announced the launch of SERENO, its latest multi-chamber platform designed for Wet Etch and Clean applications...



Figure 1 :   GPT-4o can adeptly and swiftly narrate the same story in various tones and voices based on user preferences.(Source:OpenAI)
Figure 1 : GPT-4o can adeptly and swiftly narrate the same story in various tones and voices based on user preferences.(Source:OpenAI)

Tech Giants Wave the AI Banner, Accelerating Industrial Transformation

Following Google's launch of the generative AI chatbot Gemini in 2023, the company unveiled its sixth-generation AI chip, Trillium, at the Google I/O 2024 developer conference on May 14th this year. Trillium boasts a 4.7x performance boost compared to its fifth-generation predecessor, enhancing the execution efficiency of the Google Cloud platform while being Google's "most energy-efficient" AI chip to date. Google also introduced the more powerful Gemini 1.5 Pro AI model and the lightweight Gemini 1.5 Flash model. The rapid iteration of AI models by these two leading companies underscores the astonishing pace of AI's "intelligence growth." Gemini boasts a multitude of features, including the real-time voice conversation function Gemini Live. Project Astra offers robust visual analysis capabilities, allowing users to simply point their phone camera and converse with Gemini to identify objects and sounds, even providing vivid descriptions. The video creation function "Veo" enables the generation of up to 1-minute high-definition AI videos with simple prompts.



Figure 2 :   Google CEO Sundar Pichai unveiled the more powerful AI model Gemini 1.5 Pro at the I/O developer conference.(Source:Google)
Figure 2 : Google CEO Sundar Pichai unveiled the more powerful AI model Gemini 1.5 Pro at the I/O developer conference.(Source:Google)

Generative AI has made it clear that AI is no longer just a distant promise, but a reality that has entered our daily lives. The changes brought about by AI are rapidly transforming global industries and supply chains. The "Big Seven" tech giants in the United States have fostered or adopted AI in their respective fields. Nvidia, Apple, Alphabet (Google's parent company), Microsoft, Amazon, Meta, and Tesla are all leading the AI wave, while others like Super Micro, AMD, and Intel face the pressure to keep up. These companies collaborate and compete simultaneously. For instance, the four major players in AI cloud computing—Amazon, Google, Meta, and Microsoft—are the largest buyers of Nvidia's GPUs, yet they are also developing their own chips or actively partnering with startups beyond Nvidia.


Nvidia, the leader in the AI chip market with nearly 90% market share, virtually monopolizes AI computing resources. The H100, a powerful data center-grade AI accelerator chip released by Nvidia in 2023, plays a crucial role. Generative AI primarily utilizes existing data to train AI models in various tasks like translation, summarization, and image synthesis, which demands immense computing power. The H100 is four times faster than its predecessor, the A100, in training large language models (LLMs), and thirty times faster in responding to user prompts.


In March of this year, Nvidia CEO Jensen Huang showcased a humanoid robot with generative AI capabilities at the Nvidia GTC technology conference, while also unveiling the next-generation AI chip, the Blackwell B200 GPU. It is understood that OpenAI, Amazon, Google, Meta, Microsoft, Oracle, and Tesla will all adopt the Blackwell architecture chip. Huang stated that the era of smarter, faster-reacting humanoid robots is imminent. The following day, not only did "robot concept stocks" in the Taiwan stock market surge, but AI-related industries also benefited from this future trend. In fact, besides Nvidia, companies like Tesla, Figure, and Sanctuary AI are actively investing in general-purpose humanoid robots. It seems the day when AI/robots become our friends, colleagues, partners, children, pets, and caregivers is not far off.



Figure 3 :   The GX B200, equipped with eight Nvidia Blackwell GPUs, boasts a 3x increase in training performance and a 15x increase in inference performance compared to previous generations. (Source:Nvidia)
Figure 3 : The GX B200, equipped with eight Nvidia Blackwell GPUs, boasts a 3x increase in training performance and a 15x increase in inference performance compared to previous generations. (Source:Nvidia)

AI robot control systems have largely integrated Generative AI technologies such as Large Language Models (LLMs) and Large Behavior Models (LBMs). To mimic human behavior, AI humanoid robots rely on semiconductors for perception, analysis, and motor control, with companies like NXP Semiconductors and Infineon Technologies playing a significant role. The automation applications of humanoid robots require the assistance of computer-aided design software from companies like Altair and Ansys, along with the collaboration of humanoid robot manufacturers such as FANUC Corporation, Teradyne, and YASKAWA.


In the fiercely competitive AI chip market, Nvidia is not without rivals. AMD has been actively building its AI hardware ecosystem in recent years, introducing the flagship MI300X AI chip while expanding product functionality and ecosystem support to attract users with more tools and resources.


Taiwan's AI Supply Chain Looks Promising

The soul of AI lies in its chips, specifically designed for AI algorithms. These chips can execute machine learning tasks on field-programmable gate arrays (FPGAs), graphics processing units (GPUs), and application-specific integrated circuit (ASIC) accelerators. They handle more variables and detailed calculations with significantly larger data volumes than traditional chips. AI chips play a crucial role in managing complex, data-intensive computing tasks, and are essential in high-end image processing, servers, automobiles, and mobile phones. This highlights the importance of the AI leader, Nvidia.


The ChatGPT-driven generative AI boom has fueled the demand for related software and hardware. With Nvidia's shipments, profits, and market value soaring, its supply chain, including raw material suppliers, component manufacturers, assembly OEMs, brand manufacturers, and software service providers, has also benefited. The upstream and midstream segments of the AI supply chain comprise key component hardware manufacturers, with servers playing a pivotal role. Servers are responsible for the execution speed, accuracy, and system stability of AI algorithms. The downstream segment primarily consists of software service providers.


Under the leadership of tech giants, Taiwan's AI supply chain is also riding the wave, especially TSMC, the "guardian deity of the nation." TSMC holds over 50% of the global wafer foundry market and over 90% of the advanced AI chip application market, driving the booming development of Taiwan's AI supply chain. AMD's MI300A product utilizes TSMC's 5nm process for its CPU and GPU chiplets, and the 6nm process for its IO chiplets, all integrated using TSMC's advanced packaging technologies like SoIC and CoWoS. Nvidia is also one of TSMC's major customers, with its H200 and GH200 chips in high demand. New products like the B100 and GB200, based on TSMC's 3nm process, are expected to be launched by the end of the year.



Figure 4 :   TSMC commands over 50% of the entire wafer foundry market and an even more dominant share of over 90% in the AI advanced chip application market.(source:TSMC)
Figure 4 : TSMC commands over 50% of the entire wafer foundry market and an even more dominant share of over 90% in the AI advanced chip application market.(source:TSMC)

As the four major cloud service providers (CSPs) - Amazon, Meta, Google, and Microsoft - actively embrace AI and shift towards developing AI chips using application-specific integrated circuits (ASICs), IP companies are reaping the benefits, with Taiwanese firms like Alchip, Global Unichip, and eMemory also set to gain. With the CSPs heavily investing in AI servers, their combined capital expenditures for this year and next are projected to reach 370 billion US dollars, benefiting Taiwanese companies such as Wiwynn, Quanta, and Wistron, with strong business forecasts extending into 2025.


Meanwhile, the surging demand for AI servers, new traditional server platforms, and 400G/800G switches is also benefiting Taiwan's high-end copper clad laminate (CCL) and high-density interconnect (HDI) supply chain. Goldman Sachs estimates a compound annual growth rate (CAGR) of approximately 27% for the global high-end CCL market from 2023 to 2025, with very low loss (VLL) and above high-end products in particularly high demand.


In the future, driven by Nvidia and Super Micro, Taiwan's PCB supply chain is expected to benefit, including companies like Tripod Technology, Kinsus Interconnect Technology, and Elite Material, the world's largest halogen-free substrate supplier and a leading manufacturer of HDI/substrate-like materials for handheld devices, is also a CCL supplier for Nvidia's AI servers. As the sole high-end CCL material supplier in Taiwan, TUC's future revenue is promising as it rides the AI wave.


With Nvidia, Google, and Cisco successively launching 800G products, data center power consumption issues have surfaced, and co-packaged optics (CPO) has emerged as a key technology to address this challenge. Network equipment manufacturers (like Accton Technology) and upstream and midstream optical communication OEMs (like LuxNet and Sanchuen) are expected to benefit. Of course, due to the high computing power and power consumption of AI chips, heat dissipation specifications must also be enhanced, driving demand for high-end heat sinks. Major heat dissipation manufacturers like Asia Vital Components, Auras Technology, Jentech, Sunon and Kaori are expected to benefit as well.


Opportunities and Challenges Brought by the AI Megatrend

In April, the Market Intelligence & Consulting Institute (MIC) released its forecast for the information and communication technology (ICT) industry, predicting a slow global economic recovery in 2024, but with the AI wave driving growth in global information systems products. Global shipments of laptops and desktops are expected to reach 176 million and 69.77 million units respectively, representing growth rates of 4.4% and 2.7%. With the further integration of AI PC software and hardware in 2025, this sector is poised to become a key driver of industrial recovery.


AI servers, benefiting from the sustained demand for generative AI large language models and internal corporate model fine-tuning, will be the primary growth engine for the server market in 2024. Global shipments are projected to reach 13.49 million units, a growth of approximately 5.1%.


As for the semiconductor industry, with the global semiconductor market inventory adjustment nearing completion in 2024 and the resumption of positive growth in end-product shipments, coupled with long-term demand from automotive, HPC, and AIoT sectors, the global semiconductor market is expected to return to positive growth in 2024. MIC is particularly optimistic about three major development trends: smart city adoption of AI driving equipment opportunities, telecom operators leveraging AI for transformation and breakthroughs, and the application development of AI virtual humans.


Furthermore, advanced packaging plays a crucial role in the development of AI chips in Taiwan, enhancing functional integration capabilities or computing performance. Regarding AIoT chip development, MIC believes that if Taiwanese manufacturers aim to achieve cost control at scale, they must prioritize the flexible combination of general-purpose chips to meet customized specifications. For instance, utilizing advanced packaging to integrate different functional chiplets into a more compact and portable single-chip package, or leveraging advanced packaging technology to enable AIoT chips to stack chiplets. Memory capacity and bandwidth contribute to improving computing performance. In addition to combining HBM with advanced process computing chips through CoWoS packaging, some companies have also developed customized DRAM and mature process Edge AI chip stacking technology, which can be applied to lower-cost AI solutions.


MIC points out that for the next generation of high-end computing chips, the vertical stacking of transistors or the internal connection structure of back-end rails require chip 3D stacking technology with precision up to 10nm. This will be a significant opportunity and challenge for the development of advanced packaging.


Opening Pandora's Box

The changes brought by AI and digital technology are comprehensive, with immeasurable commercial value, but they also bring cybersecurity concerns. MIC warns of two categories of concerns: the first involves AI systems as attack targets, such as data poisoning during data processing, attacks after model deployment, adversarial attacks due to model theft, data leakage caused by prompt injection, and AI worm threats, all posing significant challenges to zero-trust mechanisms. The second category involves hackers utilizing AI for cyberattacks, such as Deepfakes, identifying system vulnerabilities, and phishing emails. Although generative AI holds immense potential, there are still many challenges to overcome in building generative AI systems, including establishing trustworthy AI mechanisms, addressing engineering challenges related to new and old software, and complying with global AI regulatory policies.


In addition to security issues, Meta CEO Mark Zuckerberg and Tesla CEO Elon Musk have both pointed out that the rise of generative AI and the extensive use of water-cooling technology by tech companies like Microsoft, Google, and Meta to cool data centers could lead to water and electricity shortages. Gary Gensler, Chairman of the U.S. Securities and Exchange Commission (SEC), even stated that while AI has the potential to revolutionize existing investment methods, it could also bring systemic risks and even trigger a financial crisis!


In Greek mythology, Pandora's opening of the box released greed, hypocrisy, pain, and other evils, causing turmoil in the once peaceful world. As humanity opens this AI "Pandora's Box," will it simply bring surprises and beauty, or will it be accompanied by other frights and uncertainties (such as fraud or crime)? Let's wait and see.


2727 Read

comments powered by Disqus