The semiconductor and related electronic design automation (EDA) chip development tool markets continued to shine during 2020 amid the pandemic. Which 5 key areas will impact these markets?

John Blyler

January 5, 2021


Industries that continued to shine during the 2020 ravages of the COVID-19 pandemic were the semiconductor and related electronic design automation (EDA) chip development tool markets.

The most recent SEMI Market Statistics Service report from the ESD Alliance, a SEMI Technology Community, shows that EDA industry revenue increased 12.6% in Q2 2020 over Q2 2019. The four-quarter moving average, which compares the most recent four quarters to the prior four quarters, increased by 6.7%.
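The four-quarter moving-average comparison described above is simple to compute. A minimal sketch, using hypothetical quarterly revenue figures (the report's underlying data is not reproduced here):

```python
# Four-quarter moving-average comparison: the mean of the most recent
# four quarters versus the mean of the four quarters before them.
# Revenue figures below are hypothetical placeholders, not SEMI data.
quarterly_revenue = [2.1, 2.2, 2.0, 2.3, 2.2, 2.4, 2.3, 2.5]  # oldest first, $B

prior_avg = sum(quarterly_revenue[:4]) / 4
recent_avg = sum(quarterly_revenue[4:]) / 4
growth_pct = (recent_avg / prior_avg - 1) * 100

print(f"Four-quarter moving average growth: {growth_pct:.1f}%")
```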

Further good news was found in the November 2020 semiconductor equipment report. According to Ajit Manocha, SEMI president and CEO, billings of North America-based semiconductor equipment manufacturers remain robust, though November shows some expected tapering after billings registered record highs early this fall.

What segments will help see the semiconductor and EDA markets through 2021? To answer that question, Design News called upon experts to share their predictions. While what follows is by no means a comprehensive list of semiconductor and chip predictions, it does highlight several of the key chip design trends that will dominate the coming year.

Intellectual Property (IP)

K. Charles Janac, President and CEO, Arteris IP: The IP market is moving from leading-edge chip architecture to large data-processing SoCs. Soon, SoCs will be able to make decisions on their own. Sophisticated software is required to reach decision-making SoCs, along with silicon that combines data processing and machine learning on the same die or in separate chiplets.


Increasingly, the data processing happens in accelerators dedicated to a specific workload or type of data; examples would be machine vision accelerators processing camera, radar, or lidar data. Increasingly, these accelerators are cache coherent in order to simplify the programming task. As a result, we see cache coherency in both the processor subsystem and the cache-coherent accelerator islands.

At the same time, there is peer-to-peer traffic that has to be managed in the machine learning sections of the SoC. The processor subsystem acts in a supervisory capacity and runs the operating system. The SoC also has to move large amounts of different types of traffic, and how efficiently you move data across the chip impacts power, performance, and area. These developments make interconnect IP (cache-coherent, non-coherent, and peer-to-peer) one of the most important IP categories in advanced SoCs.

Electronic Design Automation (EDA)


K. Charles Janac, President and CEO, Arteris IP: The move to large numbers of IPs on an SoC and the move toward sub-16nm silicon will also have an impact on EDA. A large number of commercially and internally developed SoCs will have to be packaged into a form that makes such IP easily deployable. One of the most helpful standards for this task is IP-XACT. The more IPs become exchangeable through the use of IP-XACT, the more valuable they become. You can have the greatest IP, but being able to deploy it easily is the key to SoC productivity. Together with IP-XACT packaging, it is very helpful to have automated register management tools and RTL repartitioning capability so that all the IP blocks in the SoC can easily be assembled in a virtual platform environment.
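As a concrete illustration, an IP-XACT description is plain XML, so SoC assembly tooling can extract block metadata with standard parsers. The component below is a heavily simplified, hypothetical fragment (real IP-XACT follows the Accellera/IEEE 1685 schema with namespaces); the sketch only shows the idea of machine-readable IP packaging:

```python
# Parse a simplified, hypothetical IP-XACT-style component description.
# Real IP-XACT (IEEE 1685) is namespaced and far richer; this fragment
# only illustrates why machine-readable packaging makes IP deployable.
import xml.etree.ElementTree as ET

component_xml = """
<component>
  <name>uart_lite</name>
  <busInterfaces>
    <busInterface><name>S_AXI</name><mode>slave</mode></busInterface>
  </busInterfaces>
  <registers>
    <register><name>CTRL</name><addressOffset>0x0</addressOffset></register>
    <register><name>STATUS</name><addressOffset>0x4</addressOffset></register>
  </registers>
</component>
"""

root = ET.fromstring(component_xml)
ip_name = root.findtext("name")
registers = {
    reg.findtext("name"): int(reg.findtext("addressOffset"), 16)
    for reg in root.iter("register")
}
print(ip_name, registers)  # register map recovered from the description
```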

Another trend is that physical design is rudely intruding on SoC architecture. At 16nm and below, you can have perfectly valid architectures that are not easily implementable in physical layout. There need to be linkages between front-end SoC architecture and back-end floor planning. In fact, the production floor plan needs to be available early in the design process to provide the ability to estimate timing closure and compute critical net latencies based on knowing the physical locations of IP blocks and interconnect IP elements. Tools supporting the combination of the SoC connectivity map, the performance objectives of the SoCs, and the floorplan are a necessity for advanced design in 2021.
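The early-floorplan latency estimate described here can be sketched crudely: given physical block placements, the Manhattan wire distance between two IPs bounds the number of pipeline stages a NoC link needs at a given clock. All coordinates and the reach-per-cycle figure below are made-up assumptions for illustration, not process numbers:

```python
# Crude floorplan-aware latency estimate: pipeline stages needed on a
# NoC link, from Manhattan distance between placed IP blocks.
# All coordinates and the 1 mm/cycle signal reach are hypothetical.
import math

placements_mm = {           # (x, y) position of each IP block on the floorplan
    "cpu_cluster": (0.0, 0.0),
    "ml_accel": (4.5, 3.0),
    "ddr_ctrl": (1.0, 6.0),
}
REACH_MM_PER_CYCLE = 1.0    # assumed registered-wire reach at the target clock

def link_cycles(a: str, b: str) -> int:
    """Pipeline stages (cycles) required between two placed blocks."""
    (x1, y1), (x2, y2) = placements_mm[a], placements_mm[b]
    distance = abs(x1 - x2) + abs(y1 - y2)       # Manhattan routing distance
    return math.ceil(distance / REACH_MM_PER_CYCLE)

print(link_cycles("cpu_cluster", "ml_accel"))  # 7.5 mm of wire -> 8 cycles
```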

Cybersecurity Technology

Walden (Wally) C. Rhines, Ph.D. EE, CEO Emeritus of Mentor, a Siemens business, and President and CEO, Cornami: Artificial intelligence (AI) and machine learning (ML) processors will continue to be introduced to provide non-von Neumann alternatives for machine learning. Meanwhile, Nvidia still dominates dedicated ML servers.

Today’s computer security will be broken by quantum computers sometime between 2023 and 2030. Evidence comes from many sources:

  • 2020: 65 qubits

  • 2023: 1,000 qubits (source: IBM)

  • 2030: 1,000,000 qubits (source: Google)
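Taken at face value, those projections imply a steep compounding rate. A quick back-of-the-envelope check of the implied annual growth from 65 qubits in 2020 to 1,000,000 in 2030:

```python
# Implied compound annual growth in qubit counts from the projections
# above: 65 qubits (2020) to 1,000,000 qubits (2030).
start_qubits, end_qubits, years = 65, 1_000_000, 10
annual_factor = (end_qubits / start_qubits) ** (1 / years)
print(f"Implied growth: ~{annual_factor:.2f}x per year")  # ~2.62x per year
```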

Further, companies are accelerating the adoption of "quantum-proof" fully homomorphic encryption (FHE) chips and technologies. Gartner has reported that 25% of all companies will have homomorphic encryption programs by 2025. Two major companies have accelerated rollout programs for 2021. Fintech and DoD companies are preparing for a world of FHE. Finally, machine learning (ML) models will be marketed in encrypted form so that they can be used without divulging the underlying data.
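While production FHE libraries are substantial, the core idea of computing on ciphertexts can be shown with a toy partially homomorphic scheme. The sketch below is a textbook Paillier cryptosystem with insecurely small, hard-coded primes, chosen only to demonstrate that two encrypted values can be added without decrypting either; FHE extends this to arbitrary computation:

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# The primes are insecurely small and hard-coded purely for illustration.
import math

p, q = 17, 19                      # demo-only primes (never use in practice)
n = p * q                          # public modulus
n2 = n * n
g = n + 1                          # standard choice of generator
lam = math.lcm(p - 1, q - 1)       # private key
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m: int, r: int) -> int:
    """Encrypt m with randomness r (r must be coprime to n)."""
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

c1, c2 = encrypt(15, 23), encrypt(27, 31)
total = decrypt((c1 * c2) % n2)    # multiplying ciphertexts adds plaintexts
print(total)  # 42, recovered without decrypting c1 or c2 separately
```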


Modeling vehicle security.

Sandeep Sovani, Ph.D. ME, Director, ADAS & Autonomy at Ansys: At present, there is no single holistic cybersecurity analysis tool for autonomous cars. This will change in 2021, with the launch of new cybersecurity analysis solutions from vendors like Ansys. Companies will, for the first time, be able to use technology that takes a systematic approach to cybersecurity analysis that is similar to the approach taken for other safety analysis such as for system or software failure or safety issues due to sensor limitations. They will have the ability to look at attack vectors to find the loopholes, see where an attack could happen, how it would affect the vehicle’s system, assess the extent of the damage, and take action to stop that attack from being successful. 

Chip Architectures and Materials

Chris Rowen, VP Engineering at Cisco, CEO at BabbleLabs: Deep learning at the edge for lower-bandwidth streams, like audio and sensor streams, will push the envelope more on functionality and less on raw compute levels. We expect to see widespread edge deployment of speech enhancement, recognition, and analytics, driven by improved latency, privacy, and cost. Dedicated deep learning silicon will play a role primarily in the lowest-power applications like voice trigger, while more fully programmable edge architectures (DSPs, GPUs, and CPUs) will play the biggest role in richer, more rapidly evolving applications that run above the simple triggers.

Richard Wawrzyniak, Principal Market Analyst, Semico Research Corp.: Since its emergence in 2012, progress in the AI field has proceeded unabated at a sometimes-dizzying pace. New innovations and architectural approaches are projected to turn over every 3.5 months, much faster than in any other period in the history of the semiconductor market.

There are two main areas of focus:

  • Accelerators for data center applications: work continues on innovative new silicon architectures for domain-specific accelerators that aid in training deep learning applications.

  • Inference architectures for edge and end-point silicon solutions: evolving market requirements for processing more data closer to the end application are pushing silicon architects to increase CPU power and the compute resources needed to support these more powerful CPUs in end-point devices. In addition, the view of which security functions are necessary is being recast toward embedding more powerful and robust security capabilities in these solutions.

Mark Gary, B.S. EE, Senior VP of Analog Power Products, Texas Instruments: Related to new chip processes, packaging, and circuit design, we see five key long-term trends in power management: improving power density; reducing electromagnetic interference (EMI); extending battery life with low quiescent current (IQ); increasing safety and reliability through isolation technologies; and preserving power and signal integrity with low-noise and high-precision innovations.

The most promising tech trend I see for 2021 has been 10 years in the making, and it's a game-changer. High-voltage gallium nitride (GaN) doubles the output power capacity for customers in the automotive and industrial areas, compared to the trend line of generational improvements in silicon-based technology over the past 10 or 20 years, and it delivers a substantial decrease in the total cost of ownership (TCO) for customers who consume a lot of electricity annually. TI recently introduced an automotive GaN FET with an integrated driver, protection, and active power management. These advancements deliver twice the power density and the highest efficiency in automotive onboard chargers and industrial power supplies, such as enterprise and factory automation systems, where every percent of increased efficiency can lead to significant cost savings. They also help customers get the most out of their systems and make them as easy as possible to use. Engineers in each area need to deliver more power in a smaller space; it's all about power density.
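The closing point, that each percent of efficiency matters at scale, is easy to quantify. The figures below (load, operating hours, electricity price, and the efficiency step) are hypothetical assumptions for illustration, not vendor data:

```python
# Annual energy-cost savings from a power-supply efficiency improvement.
# Load, hours, price, and the 95% -> 97.5% efficiency step are all
# hypothetical illustration values, not vendor figures.
LOAD_W = 3000           # delivered output power of one server PSU
HOURS = 24 * 365        # continuous operation
PRICE_PER_KWH = 0.12    # assumed electricity price, $/kWh

def annual_cost(efficiency: float) -> float:
    """Yearly electricity cost to deliver LOAD_W at a given efficiency."""
    input_w = LOAD_W / efficiency           # power drawn from the wall
    return input_w * HOURS / 1000 * PRICE_PER_KWH

saving = annual_cost(0.95) - annual_cost(0.975)
print(f"Saving per PSU per year: ${saving:.2f}")
```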

Test

Robert Ruiz, B.S. EE, Marketing Director, Hardware Analytics and Test at Synopsys: In the coming year and beyond, test technology will continue its accelerated innovation pace, not just for manufacturing test but also for new applications beyond it. Recent, early adoption of a new testing paradigm utilizing high-speed I/O ports such as USB and PCIe marks the beginning of a trend to dramatically increase test data bandwidth. Such bandwidth is a game-changer, potentially decreasing silicon test time and test cost by an order of magnitude for next-generation AI and graphics processors. But perhaps even more exciting is how this new scheme, delivering massive test data on and off chip, will enable other applications.

With the combination of increased bandwidth and use of standard, ubiquitous high-speed functional ports, test data can be applied and off-loaded at any point in the life of the silicon. That is, not only can die be tested for manufacturing defects via their USB ports immediately after fabrication, but they can also be tested much later after the die is packaged. Indeed, since the functional USB port is connected to the system, it is possible to test the packaged parts for defects while in the target system.  Furthermore, if built-in monitors and sensors report live chip conditions, such as voltage and temperature, and are connected to the design-for-test (DFT) network, additional information can be collected and analyzed. Effectively, the chip’s health can be determined, including if the silicon part will experience an early death or need a real-time adjustment to continue functioning. Coordinated monitoring and corresponding analytics lead to an emerging but increasingly essential activity known as silicon lifecycle management. The analytics and active silicon management will become key to increased quality as well as safety.
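A silicon lifecycle management flow of the kind described here could, at its simplest, aggregate on-die sensor readings and flag drift. The sketch below is entirely hypothetical; the thresholds and telemetry format are invented for illustration and are not Synopsys's scheme:

```python
# Hypothetical silicon-health check: aggregate on-die monitor readings
# and flag parts drifting outside expected voltage/temperature bands.
# Thresholds and record format are invented for illustration.
from statistics import mean

def assess_chip(readings: list) -> str:
    """Classify a die from {'v': volts, 't': celsius} telemetry samples."""
    avg_v = mean(r["v"] for r in readings)
    max_t = max(r["t"] for r in readings)
    if avg_v < 0.85 or max_t > 105:       # assumed failure bands
        return "fail"
    if avg_v < 0.88 or max_t > 95:        # assumed early-warning bands
        return "degrading"                 # candidate for real-time adjustment
    return "healthy"

telemetry = [{"v": 0.90, "t": 72}, {"v": 0.89, "t": 81}, {"v": 0.90, "t": 78}]
print(assess_chip(telemetry))
```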

With test technology innovation leading to more than screening for manufacturing defects, the long-established role of "EDA test" is being redefined.


Edge computing will be a leading market. 


About the Author(s)

John Blyler

John Blyler is a former Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an engineer and editor within the advanced manufacturing, IoT, and semiconductor industries. John has co-authored books related to RF design, systems engineering, and electronics for IEEE, Wiley, and Elsevier. John currently serves as a standards editor for Accellera-IEEE. He has been an affiliate professor at Portland State University and a lecturer at UC-Irvine.
