DesignCon 2025, Day 2: It’s All About AI
January 30, 2025 | Marcy LaRont, I-Connect007 | Estimated reading time: 4 minutes
It’s hard to get away from the topic of artificial intelligence, but why would you? It’s everywhere and in everything, and my time attending presentations about AI at DesignCon 2025 was well worth it. The conference’s agenda featured engaging presentations and discussions focused on the technological advancements in AI, big data centers, and memory innovations, emphasizing the critical relationship between processors and circuit boards.
Those attending DesignCon 2025 at the Santa Clara Convention Center, Jan. 29–30, gained valuable perspectives on the future of design and engineering in this rapidly evolving field. The show's theme, “Where the Chip Meets the Board,” was reflected throughout the presentations.
I attended many presentations specifically focused on how AI is affecting big data centers. I was really interested in “Technology Advancements for AI in the Data Center,” by Tim Messegee, director of solutions marketing at Rambus (and a show sponsor).
“AI is increasingly used to extract meaning and to extract value, and we need lots of memory, lots of interfaces, lots of capacity,” he said. The short answer to what the customer wants is always “more of everything,” something that won’t change in the foreseeable future. He talked about HBM and GDDR/LPDDR memory, noting that we are no longer looking at an either/or proposition; both will be needed to meet the memory demands of running more and more AI.
When asked about DeepSeek, Tim responded that it is open source, meaning we can all take a hard look at it. He labeled it a “Sputnik moment,” providing a wakeup call to others in the industry. He said that leading-edge AI servers and racks must be able to use terabytes per second of bandwidth locally and will not be able to depend on stored data. He also discussed Compute Express Link (CXL™), which he described as “a new interconnect industry standard with wide industry support.” Beyond “memory expansion” is “memory pooling,” in which memory can be shared between hosts and transparently redeployed between them as needed. CXL facilitates this.
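To make the pooling idea concrete, here is a minimal, purely illustrative Python sketch of a shared pool that lends capacity to hosts and reclaims it as workloads change. It models only the bookkeeping concept; the class and method names are hypothetical, and it does not represent the CXL protocol or any vendor API.

```python
# Toy model of memory pooling: a shared pool lends capacity (in GB) to hosts
# and reclaims it when workloads shrink. Conceptual only -- not the CXL protocol.

class MemoryPool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocations: dict[str, int] = {}  # host name -> GB currently borrowed

    def available_gb(self) -> int:
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, host: str, gb: int) -> bool:
        """Lend `gb` of pooled memory to `host` if the pool has room."""
        if gb > self.available_gb():
            return False
        self.allocations[host] = self.allocations.get(host, 0) + gb
        return True

    def release(self, host: str, gb: int) -> None:
        """Return `gb` from `host` to the pool so other hosts can use it."""
        self.allocations[host] = max(0, self.allocations.get(host, 0) - gb)


pool = MemoryPool(capacity_gb=1024)
pool.allocate("host-a", 512)   # host A borrows half the pool for a large model
pool.allocate("host-b", 256)   # host B takes a smaller slice
pool.release("host-a", 256)    # host A's workload shrinks...
print(pool.available_gb())     # ...freeing capacity for redeployment to other hosts
```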
The idea, Tim said, is to compose the computing resources in a server to match the particular workload. He also raised the ever-present challenge of power: “The power budget for just moving the data is 25–40%, with 60% the worst-case scenario. You aren’t doing any work. You are just moving bits.”
Thus, packaging is an increasingly important part of the total equation as we stare down very dense packages that present significant power, thermal, SI, and reliability challenges. Not surprisingly, Tim was also asked about the implications of the Chinese AI product that shot to the top of Apple’s app download charts earlier in the week. He called it a huge wakeup call for the industry: We cannot rest on our laurels; we must continue innovating at a rapid pace, and while much investment must be dedicated to these endeavors, 80% will still go to infrastructure.
Next, I attended a white paper presentation on “The Influence of Copper Crystal Structure on Signal Integrity,” presented by Jaeyeol (Paul) Park, regional director at Nan Ya New Material Technology. (Stay tuned for Kelly Dack’s interview with Paul Park.)
“With the increase in signal transmission rates, the surface roughness of copper foil conductors has an increasingly significant impact on SI,” he said, adding that copper foils with different copper crystal structures and topographies were compared for insertion loss. “In the industry, most of the research on conductor loss focuses on further reducing roughness and specific surface area. Considering the need to ensure peel strength, it has basically reached the application limit of copper foil.” The key variables governing conductor loss are skin depth, permeability, conductivity, and frequency (see the relationship sketched below). The study also considered how the copper foil is processed, concluding that the annealing process does not have a measurable effect. More studies are being undertaken in this area.
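For context, the standard skin-depth relationship ties those variables together; this is a textbook formula, not taken from the paper itself:

\[
\delta = \sqrt{\frac{1}{\pi f \mu \sigma}} = \sqrt{\frac{\rho}{\pi f \mu}}
\]

where \(f\) is frequency, \(\mu\) is permeability, \(\sigma\) is conductivity, and \(\rho = 1/\sigma\) is resistivity. As frequency rises, current crowds into a thinner skin at the conductor surface, which is why roughness on the scale of \(\delta\) increasingly dominates conductor loss at higher data rates.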
The morning keynote was by John Linford, principal technical product manager at NVIDIA. “AI has arrived… it is a new industrial revolution,” he said. “At NVIDIA, we build AI to build chips for AI.” John discussed some of NVIDIA’s products unveiled at CES 2025, including its ChipNeMo AI tool, which helps its engineers with every aspect of product design and analysis. He said that AI is informing every level of the company’s design and operations, calling NVIDIA supporters of “extreme co-design.” He also spent a good amount of time talking about NVIDIA Modulus, an open-source framework for building, training, and fine-tuning physics-ML models with a simple Python interface. These new modes of aggregating and analyzing data and creating simulations and digital twins on GPUs, rather than older, traditional CPU engines, provide up to a 15,000x speed-up in completing designs and simulations, thereby increasing design freedom, reducing the computational cost of designs, and ultimately allowing for the creation of more energy-efficient designs. “Real-time performance is only possible with AI,” he concluded.
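To give a flavor of what “physics-ML” means in practice, below is a minimal physics-informed neural network sketch in plain PyTorch. It is not Modulus code and does not use the Modulus API; it simply shows the core idea such frameworks build on: training a network so its output satisfies a governing equation (here, a toy 1D Poisson problem u''(x) = -sin(x)) plus boundary conditions, with the whole training loop running on GPU-friendly tensor operations.

```python
# Minimal physics-informed NN sketch (illustrative only; not the Modulus API).
# Train a small MLP u(x) so that u''(x) = -sin(x) on [0, pi] with u(0) = u(pi) = 0.
# The exact solution is u(x) = sin(x), so the fit can be checked at the end.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(3000):
    # Collocation points where the PDE residual is enforced
    x = torch.rand(128, 1) * torch.pi
    x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    pde_loss = ((d2u + torch.sin(x)) ** 2).mean()   # residual of u'' = -sin(x)

    # Boundary conditions u(0) = u(pi) = 0
    xb = torch.tensor([[0.0], [torch.pi]])
    bc_loss = (net(xb) ** 2).mean()

    loss = pde_loss + bc_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# Compare against the exact solution u(pi/2) = 1
x_test = torch.tensor([[torch.pi / 2]])
print(net(x_test).item(), "vs exact", 1.0)
```

Production frameworks layer geometry handling, standard equation definitions, and large-scale GPU training on top of this basic pattern, which is where the reported speed-ups over CPU-based solvers come from.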
Andy Shaughnessy, managing editor of Design007 Magazine, also reviewed the show. Click here to read his highlights and see images from the show floor.