The problem with the definition of ESL
By Brian Bailey | November 8, 2011
How long have we heard the promises associated with a move to the Electronic System Level (ESL)? For the longest time it seemed as if all of the predictions about its growth kept moving out another year - every year. It always appeared to be just on the cusp of exploding, but the explosion never quite came. And then all of a sudden, without any kind of fanfare, it was here. People are talking about how they use it and about their successes; they are complaining about aspects of it as if it were technology they had been using for years. They have wish lists, and they openly talk about the limitations. Perhaps the most interesting part is that many of them don't even call it ESL - it is just getting the job done. How can this be?
The problem is with the definition of ESL. I remember when Synopsys first came out with RTL synthesis: it was a game changer, a classic case of the Innovator's Dilemma. When it appeared as if ESL was about to happen, everyone wanted to be the next Synopsys (except for Synopsys) and to have the next synthesis tool. They expected the transition to ESL to be exactly the same, for history to repeat itself. But, as always happens, something different becomes the new normal. So what did happen? I am not sure we know the final answer yet, but almost certainly high-level synthesis is not at the center of ESL.
A few things are certain. ESL is a convergence point for many things, but it is most certainly not a single abstraction or a defined flow. At least three pairs of things converge here: design and verification, hardware and software, and tools and IP. Let's explore each of them a little more.
Figure 1: ESL as a convergence point
Design and Verification: Verification has become a huge bottleneck, and one of the things I hear very frequently is that companies that adopt transaction-level synthesis in their design flow see an even larger benefit in their verification flow, often two or three times the benefit seen in design. Moving to a higher level of abstraction not only speeds up simulation by a factor of 100 or even 1,000, but it also makes system-level verification possible, something that in the past could not be attempted until almost just before tape-out. That is a game changer for verification methodologies, and a change we have only just started to see emerge. Of course, the design team also benefits, being able to define the functionality they want to build more quickly and to explore the implementation architecture.
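To make the abstraction point concrete, here is a minimal sketch of a transaction-level model written in SystemC with TLM-2.0, the standard vehicle for this style of modeling. The module names, address, and delay values are illustrative assumptions on my part, not taken from any particular flow. At this level an entire bus read or write is a single function call with an annotated delay, rather than cycle-by-cycle signal activity, which is where the orders-of-magnitude simulation speedup comes from:

// Minimal transaction-level model sketch (illustrative assumptions only).
#define SC_INCLUDE_DYNAMIC_PROCESSES
#include <cstring>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>

struct Memory : sc_core::sc_module {
    tlm_utils::simple_target_socket<Memory> socket;
    unsigned char mem[256] = {};

    SC_CTOR(Memory) : socket("socket") {
        socket.register_b_transport(this, &Memory::b_transport);
    }

    // One call completes a whole transaction; timing is a delay annotation,
    // not a sequence of clocked signal changes as it would be in RTL.
    void b_transport(tlm::tlm_generic_payload& tr, sc_core::sc_time& delay) {
        unsigned char* data = tr.get_data_ptr();
        sc_dt::uint64 addr = tr.get_address();
        if (tr.is_read())  std::memcpy(data, &mem[addr], tr.get_data_length());
        if (tr.is_write()) std::memcpy(&mem[addr], data, tr.get_data_length());
        delay += sc_core::sc_time(10, sc_core::SC_NS); // approximate access time
        tr.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};

struct Tester : sc_core::sc_module {
    tlm_utils::simple_initiator_socket<Tester> socket;

    SC_CTOR(Tester) : socket("socket") { SC_THREAD(run); }

    void run() {
        unsigned int value = 42;
        sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
        tlm::tlm_generic_payload tr;
        tr.set_data_ptr(reinterpret_cast<unsigned char*>(&value));
        tr.set_data_length(sizeof(value));
        tr.set_address(0x10);
        tr.set_command(tlm::TLM_WRITE_COMMAND);
        socket->b_transport(tr, delay);   // whole write in one call
        tr.set_command(tlm::TLM_READ_COMMAND);
        socket->b_transport(tr, delay);   // whole read in one call
    }
};

int sc_main(int, char*[]) {
    Tester tester("tester");
    Memory memory("memory");
    tester.socket.bind(memory.socket);
    sc_core::sc_start();
    return 0;
}

The same transaction stream can also serve as stimulus for system-level verification long before an RTL implementation exists, which is exactly the benefit described above.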
Hardware and Software: This convergence was perhaps the first to be observed. In many cases, systems today are defined more by their software than by their hardware. The hardware has become a platform for software execution, plus some optimizations for power or performance reasons. Software is often on the critical path, making it essential in some cases for virtual prototypes of the hardware to be made available long before the RTL has been written.
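As a small illustration of why this matters for software schedules, consider the sketch below. It is purely hypothetical: the UartModel class, the register offsets, and the uart_puts driver are invented names for this example, not a real device or API. The point is that the driver is written against a behavioral model of the peripheral's register map, so it can be compiled, run, and debugged on a host machine before any RTL exists, and the same driver source can later target emulation or silicon:

// Hypothetical virtual-prototype sketch; names and offsets are assumptions.
#include <cstdint>
#include <cstdio>
#include <map>

// Stand-in for the hardware: a behavioral model of a UART's registers.
struct UartModel {
    std::map<uint32_t, uint32_t> regs;
    uint32_t read(uint32_t off) { return regs[off]; }
    void write(uint32_t off, uint32_t v) {
        regs[off] = v;
        if (off == 0x00) std::putchar(static_cast<char>(v)); // TX register "transmits"
    }
};

// The driver logic is unchanged whether it targets this model,
// an emulator, or eventually the real silicon.
void uart_puts(UartModel& uart, const char* s) {
    const uint32_t TX = 0x00, STATUS = 0x04;
    for (; *s; ++s) {
        while (uart.read(STATUS) & 0x1) { /* wait while busy (model: never busy) */ }
        uart.write(TX, static_cast<uint32_t>(*s));
    }
}

int main() {
    UartModel uart;
    uart_puts(uart, "hello from the virtual prototype\n");
}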
Tools and IP: When this convergence point was first suggested in the book ESL Design and Verification back in 2006, few people understood it. Today we see many cases where a piece of IP without a sophisticated configuration tool has little value. On-chip interconnect fabrics, memories, and many other blocks are designed as a combination of IP and tool. Whole chips, such as the Cypress PSoC devices, would have had no success without the right balance of tool and IP, and many other startups have failed because they got this balance wrong.
So why is ESL so nebulous? In a nutshell, because it is not a single point and not intended for a single purpose. Each application of it potentially requires a different level of abstraction, or different models. Companies are adopting the pieces of ESL that provide them with the most value or that overcome their largest pain points. It may be many years before a unified ESL flow becomes apparent, and it will most certainly not hinge on a single tool such as synthesis. High-level synthesis may actually be one of the least important parts of an ESL flow.
Where do you think the most value lies?