Future Data Center Needs Impose More Memory Demands
AI and conventional data server requirements force engineers to juggle memory resources while keeping eyes on power and security needs.

The insatiable demand of data centers for higher computing power, driven by the rapid growth of AI, is in turn imposing greater needs for memory that delivers higher bandwidth while meeting system considerations such as power and security. These topics will be discussed in a DesignCon session on Wednesday, January 29, at 8 am, titled “Technology Advancements for AI in the Data Center.”
Dr. Steven Woo, Fellow and Distinguished Inventor at Rambus, who will deliver the presentation, recently spoke to Design News about some of the challenges memory faces in keeping up with these more robust data needs.
According to Woo, one factor driving current and future memory designs is that processors now have many more silicon cores, which in turn means memory resources and bandwidth are spread across more of them.
“These large core count processors are getting bigger with more memory challenges,” Woo told Design News. He added that one challenge is the greater power required by processors with more memory.
In the case of AI, Woo noted that no single memory type or configuration suits all requirements. “For AI training, you need lots of memory for training suites. For inference AI, a lot of memory swapping is involved.”
Woo added that HBM is now the preferred memory type for AI servers because its multiple stacked layers each provide parallel access, and thus higher aggregate bandwidth. “Multiple layers run at lower data rates, which help mitigate thermal problems more efficiently than DDR memory.”
On the other hand, Woo noted that traditional servers use DDR memory, with a number of DDR modules running in parallel. “This configuration is better for general-purpose computing.” He added that the trick is configuring a system so that DDR and HBM memory work together. A potential solution is to assign data that does not require high performance to DDR, while keeping frequently accessed data in HBM.
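That tiered-placement idea can be sketched in a few lines. The sketch below is purely illustrative and assumes nothing about Rambus's actual designs: buffer names, the access-rate threshold, and capacities are all hypothetical, and placement is a simple greedy pass that fills HBM with the hottest buffers first and sends the rest to DDR.

```python
def place_buffers(buffers, hbm_capacity, hot_threshold=1000):
    """Assign each buffer to 'HBM' or 'DDR' by access frequency.

    buffers: list of (name, size_bytes, accesses_per_sec) tuples.
    Buffers above hot_threshold are placed in HBM until its capacity
    is exhausted; everything else falls back to DDR.
    """
    placement = {}
    hbm_free = hbm_capacity
    # Consider the hottest buffers first so HBM holds the most-used data.
    for name, size, rate in sorted(buffers, key=lambda b: -b[2]):
        if rate >= hot_threshold and size <= hbm_free:
            placement[name] = "HBM"
            hbm_free -= size
        else:
            placement[name] = "DDR"
    return placement

# Hypothetical workload: hot model data vs. rarely touched logs.
buffers = [
    ("model_weights", 8_000_000_000, 50_000),  # read on every inference
    ("kv_cache",      2_000_000_000, 20_000),  # updated per token
    ("cold_logs",     4_000_000_000, 10),      # rarely accessed
]
print(place_buffers(buffers, hbm_capacity=16_000_000_000))
# → {'model_weights': 'HBM', 'kv_cache': 'HBM', 'cold_logs': 'DDR'}
```

Real systems make this decision dynamically (e.g., by migrating pages between tiers based on observed access counts), but the static version above captures the basic trade-off Woo describes.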
Interface and security issues also face design engineers configuring memory systems, Woo noted. “Interface circuits can consume a lot of power.” Regarding security, Woo noted that encryption is not applied at all times, since the question arises of how one computes on encrypted data.
Woo will explore these and other memory-related issues during his talk on Wednesday, January 29 at 8 am in Great America Ballroom 1 at the Santa Clara Convention Center in Santa Clara, Calif.