Cadence’s Brad Griffin Digs Deep Into DDR



Guest Editor Kelly Dack stopped by the Cadence Design Systems booth at DesignCon 2015, where he sat down with Product Marketing Manager Brad Griffin to discuss Cadence’s advanced PCB design and signal integrity tools, and the company’s focus on DDR.

Kelly Dack: Brad, since you’re the product marketing director for Cadence Design Systems, I’d like to ask a few questions about your DDR products. But first, please give us a brief overview of DDR.

Brad Griffin: I’d be happy to. One of the main things about a computer is that it has memory and you can store data in that memory; that’s a big part of what makes it a computing device. Over the life of electronics, people have been finding ways to store and retrieve data from memory faster. Somewhere around 2002, we came up with this idea of doubling the data rate in DDR memory, or double data rate memory. That was unique because we clocked the data into the memory on both the rising edge and the falling edge of the clock. It was a clever way, with the same sort of signaling, to basically double the data rate.
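
As a rough back-of-the-envelope illustration of why “double data rate” doubles throughput, the sketch below compares a single-data-rate and a double-data-rate transfer at the same clock. The 100 MHz clock and 64-bit bus width are illustrative assumptions only, not figures from the interview.

```python
# Minimal sketch: DDR transfers data on both clock edges, so the transfer
# rate (MT/s) is twice the I/O clock frequency.
# The clock rate and bus width below are illustrative assumptions.

clock_mhz = 100          # I/O clock frequency in MHz (assumed)
bus_width_bits = 64      # data bus width in bits (assumed)

sdr_mts = clock_mhz      # single data rate: one transfer per clock cycle
ddr_mts = 2 * clock_mhz  # double data rate: transfers on rising and falling edges

def peak_bandwidth_mb_s(transfers_mts, width_bits):
    """Peak bandwidth in MB/s for a given transfer rate and bus width."""
    return transfers_mts * width_bits / 8

print(f"SDR: {sdr_mts} MT/s -> {peak_bandwidth_mb_s(sdr_mts, bus_width_bits):.0f} MB/s")
print(f"DDR: {ddr_mts} MT/s -> {peak_bandwidth_mb_s(ddr_mts, bus_width_bits):.0f} MB/s")
```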

KD: Was there an organization involved? Was it standardized? 

BG: That’s a really good question. Right now there’s a standards committee called JEDEC, and I’m going to assume they were in place back in the 2002 timeframe, but I’d have to go back and check. But obviously there are memory companies, and they have to be able to plug and play with different controllers driving the memory, so there has probably always been a standard they’ve been marching toward. That process used to be a lot simpler. You’d be transferring data at maybe 100 megabits per second. You would send the data, clock it in, and it wasn’t nearly as complicated as it is now.


KD: So where has DDR come from, and where is it now?

BG: There was DDR2 and then DDR3, and probably 2015 is going to be the transition where most DDR3 designs go over to DDR4. Typically, this happens because the DDR4 memory will actually become less expensive than some of the DDR3 memory. 

KD: What does that mean as far as the technology from a power standpoint as well as a data standpoint?

BG: The main technology difference from DDR3 to DDR4 is speed. It basically just gets faster, so any application running with DDR4 memory will make for a faster computer than one running with DDR3. One of the exciting things that has emerged over the last five to seven years is a new version of DDR called LPDDR, which stands for low power. That’s been used primarily in mobile devices, because you certainly don’t want your cell phone to run out of power in the middle of the day.

KD: With this reference to power, if I understand correctly, DDR came from a 2.5 V system and shrunk to 1.8 V and 1.5 V, and DDR4 is down at a little over 1 V. That seems really low already, so where will the LPDDR take us? 

BG: If you can believe it, the LPDDR4 specification only has a 300 mV swing, so it’s really low. That means that for signal integrity and power integrity engineers, there’s really very little margin left. We said there was very little margin left when it was 1.5 V, and now we’re down to 300 mV. With such a small data swing, your signals have to be clean and your power planes have to be stable, because any power/ground bounce associated with simultaneously switching signals is going to make it so that you don’t meet the signal quality requirements that JEDEC puts in place for LPDDR4. So designs are getting really interesting. What we’re excited about this year at DesignCon are the things we’ve been putting into our tools to enable designers to validate that they’ve done everything they need to do to meet the LPDDR4 requirements.
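
To see why a 300 mV swing leaves so little room, here is a minimal margin-budget sketch. The swing figure comes from the discussion above; every noise contribution below (power/ground bounce, crosstalk, reflections, receiver threshold) is an assumed placeholder, not a JEDEC number.

```python
# Illustrative signal-margin budget for a low-swing interface such as LPDDR4.
# The 300 mV swing is from the interview; all noise values are assumptions
# chosen only to show how quickly the budget is consumed.

swing_mv = 300                      # total signal swing (from the discussion above)
per_side_budget_mv = swing_mv / 2   # margin above/below the reference level

noise_sources_mv = {
    "power/ground bounce (SSN)": 50,       # assumed
    "crosstalk": 30,                       # assumed
    "reflections / ISI": 40,               # assumed
    "receiver threshold uncertainty": 25,  # assumed
}

remaining_mv = per_side_budget_mv - sum(noise_sources_mv.values())

for name, value in noise_sources_mv.items():
    print(f"{name:35s} {value:4d} mV")
print(f"{'remaining margin':35s} {remaining_mv:4.0f} mV of {per_side_budget_mv:.0f} mV per side")
```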





