Programmable logic has come a long way from the simple devices we started out with. Remember Programmable Array Logic, or PALs? In bipolar technology, the ICC of even a small PAL was over 100mA, and you could literally burn your finger if you touched the top of the package too long. Today, programmable devices pack the equivalent of an unimaginable number of simple PLDs, as well as on-chip memory, SERDES, DSP blocks, PLLs, etc. Who could have predicted where those initial devices would take us?
Microcontroller units, or MCUs, have also evolved quickly from the simple 8-bit processor with a small amount of on-chip ROM, a timer, and a UART, to multi-core 32/64-bit processors with megabytes of on-chip Flash and SRAM and a host of intelligent peripherals, including Ethernet, USB, and wireless connectivity. There is more computing power in a sub-$1 MCU than in a top-of-the-line IBM 360 from those early days of computing. How far we have come.
More recently, we are even seeing a convergence of MCUs and programmable logic devices. Field Programmable Gate Arrays, or FPGAs, are adding fixed-function, dedicated processors and peripherals, putting one or more complete high-end MCUs on-chip. These sub-systems have a host of dedicated peripherals and lots of memory, and some devices even have on-chip Flash for non-volatile storage, just like their familiar MCU cousins.
Can we imagine where we will be in another five years if these innovations continue? How about in 10 years? Will MCUs evolve to include some amount of programmable logic on-chip? Will FPGAs continue to follow MCU innovations, or will they take the lead in creating ever more powerful and easier-to-use programmable controller functions? Will the software evolve to the point where your code is independent of your target device? Just program an FPGA, an MCU, or a combination, depending on the cost, power, and performance you desire.
I'd like to use this space on Design News to explore the future of programmable devices -- MCUs and FPGAs and those in between. I’m hoping to include your comments and thoughts on where these pervasive technologies may take us in five years or 10 years. What do you think are the key challenges? Are there some current architectures that need to be dramatically changed in order to continue innovating?
Here are some initial thoughts to get us started. What changes will we see in the underlying FPGA programmable fabric? Are there really any big changes in store? Is the 4-input look-up table (with appropriate add-ons for simple arithmetic functions) as good as it gets?
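For readers who haven't worked down at the fabric level, it's worth recalling just how simple that building block is. Here is a minimal sketch (Python used purely for illustration; the function name and configuration values are my own) of why a 4-input LUT can realize any Boolean function of four variables: it is nothing more than a 16-bit truth table, with the inputs forming the index.

```python
def lut4(config: int, a: int, b: int, c: int, d: int) -> int:
    """Model a 4-input look-up table.

    config holds the 16-bit truth table; the four inputs (each 0 or 1)
    form a 4-bit index, and the stored bit at that index is the output.
    """
    index = (d << 3) | (c << 2) | (b << 1) | a
    return (config >> index) & 1

# Two example configurations (illustrative values):
AND4 = 0x8000  # only index 15 (all inputs high) outputs 1 -> 4-input AND
XOR4 = 0x6996  # bits set where the index has odd parity -> 4-input XOR

print(lut4(AND4, 1, 1, 1, 1))  # 1
print(lut4(XOR4, 1, 0, 0, 0))  # 1
```

Any of the 2^16 possible config values is a valid function, which is exactly why the LUT has proven such a durable, general-purpose primitive — the open question is whether anything more specialized can beat it.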
High-density FPGA devices today include a wide range of fixed-function elements: memory blocks, SERDES, PLLs, multipliers, MCU peripheral functions, processors, and even analog functions. Are we going to see an explosion of specialized devices that address specific applications, or will FPGAs remain primarily generic, with a wide application focus?
Will MCUs be adding programmable fabric so users can better control intelligent peripherals and create specialized co-processors to off-load the CPU? Will the use of multiple CPU cores be common? Will software tools automatically partition and allocate functions to CPU elements for optimum efficiency?
These are just a few of the multitude of questions we can ask ourselves as we try to see into our programmable future. What do you think are the critical issues, and where do you think we will be in the next five or 10 years? Let’s start a wide-ranging discussion here and see how we do at figuring out “what’s next.”