Are software development tools affecting your choice of 8-bit vs. 32-bit processors?

Wednesday, February 1st, 2012 by Robert Cravotta

I have always maintained that the market for 8-bit processors would not fade away. In fact, there are still a number of market niches that rely on 4-bit processors (such as clock movements and razor handles that vibrate during shaving). The smaller processor architectures can support the lowest cost price points and the lowest energy consumption years before the larger 32-bit architectures can begin to offer anything close to parity with the smaller processors. In other words, I believe there are very small application niches for which even 8-bit processors are currently too expensive or energy hungry.

Many marketing reports have identified that the available software development tool chains play a significant role in whether a given processor architecture is chosen for a design. It seems that the vast majority of resources spent evolving software development tools are focused on the 32-bit architectures. Is this difference in how software development tools for 8- and 32-bit processors are evolving affecting your choice of processor architectures?

I believe the answer is not as straightforward as some processor and development tool providers would like to make it out to be. First, 32-bit processors are generally much more complex to configure than 8-bit processors, so the development environments, which often include drivers and configuration wizards, are nearly a necessity for 32-bit processors and almost a non-issue for 8-bit processors. Second, the types of software that 8-bit processors are used for are generally smaller and contend with less system-level complexity. Additionally, as embedded processors continue to find their way into smaller tasks, the software may need to be even simpler than current 8-bit software to meet the energy requirements of the smallest subsystems.
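To make the configuration contrast concrete, here is a sketch of what it takes just to drive one output pin high on each class of part. The two fragments below target two different toolchains and are not one program; the 8-bit fragment uses Keil C51's SFR syntax, and the 32-bit fragment uses STM32F4-style CMSIS register names, so your own parts and registers will differ.

    /* 8-bit 8051 (Keil C51 syntax): the port is a single SFR and the
       pin is usable at reset; no clock or mode setup is required. */
    sfr P1 = 0x90;        /* special function register at a fixed address */

    void led_on_8051(void)
    {
        P1 |= 0x01;       /* drive P1.0 high */
    }

    /* 32-bit ARM Cortex-M (STM32F4-style registers): the peripheral
       clock and the pin mode must be configured before the first write. */
    #include "stm32f4xx.h"    /* CMSIS device header from the vendor */

    void led_on_cortex_m(void)
    {
        RCC->AHB1ENR |= RCC_AHB1ENR_GPIOAEN;   /* ungate the GPIOA clock        */
        GPIOA->MODER |= (1u << (5 * 2));       /* PA5 as general-purpose output */
        GPIOA->BSRR   = (1u << 5);             /* drive PA5 high, atomically    */
    }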

Do you feel there is a significant maturity difference between software development tools targeting 8- and 32-bit architectures? Do you think there is/will be a widening gap in the capabilities of software development tools targeting different size processors? Are software development tools affecting your choice of using an 8-bit versus a 32-bit processor or are other considerations, such as the need for additional performance headroom for future proofing, driving your decisions?


17 Responses to “Are software development tools affecting your choice of 8-bit vs. 32-bit processors?”

  1. The most significant reason why people hesitate to migrate from 8-bit to 32-bit processors, in my opinion, is the maneuvering needed to configure the never-ending list of SFRs for each of the peripherals. The libraries and wizards greatly simplify this, however. A worthy discussion.

  2. D.G. @ LI says:

    The tools do affect the choice of microprocessor, as well as overall application complexity at the subsystem boundary.

    The interesting trend that I have noticed is that the R&D is going into making ARM based microcontrollers consume less power and very little is being done about 8-bit microcontrller development. The mobile applications market is screeming for better battery life and these applications require a 32-bit processor given the UI complexity.

    Also, on a recent control application, the preferred processor was an 8-bit 8051 based processor and the product ended up using a 16 processor instead. The 16-bit processor had significantly more capabilities and consumed significantly less power. I have not looked at truely small appliciation like a basic watch but this 16-bit processor was used in a data acquisiton and control subsystem. It was remote from the processor that made control decisions. All it did was monitor local discrete and analog sensors and forward the data to the main processor and control an set of solenoids in response to commands.

    There are also quite a few ARM7-based microcontrollers that are very low power and have a lot of capabilities and useful interfaces.

    I recently looked at an application for a remote monitoring subsystem (goal of 5 years on a single battery) that sampled data and relayed the results to a central system. That was another application that was expected to use a 4-bit microcontroller. A 16-bit part was used instead, and the only criteria were availability and power consumption.

  3. D.W. @ LI says:

    This question will lead to an interesting debate, no doubt. I don’t believe that 32-bit is any more complex than 8-bit, where limitations in stack and address space create many issues. Code density is often worse for 8-bit performing the same task. See here for a discussion:

    http://www.erts2010.org/Site/0ANDGY78/Fichier/PAPIERS%20ERTS%202010/ERTS2010_0047_final.pdf

    This is, of course, a System-on-Chip viewpoint, where it is not the case that 8-bit cores are much smaller on chip, or even lower in power (energy) for a given task.

    However, there are many 8-bit MCU chips out there and last time I checked 8-bit is still the most dominant MCU by volume of shipments. I am sure price has a lot to do with that!

    In the 8-bit space there is a certain degree of ‘lock in’ with development skills and legacy code, and there is probably a perception (a myth perhaps) that 8-bit is simpler. Therefore, I am sure choice is mostly dictated by comfort zone: “my favorite compiler is X and it only supports processor A, B and C; I used processor B on my last 5 projects and I know it inside out; so I choose B again.”

    If the choice of processor is more strategic, surely 32-bit is the logical way to go, since it provides the best ‘future proofing’ option. It is not hard to move to 32-bit.

  4. F.W. @ LI says:

    There are big hardware differences between an 8-bit device and a 32-bit device. The difference between the address spaces alone is tremendous. This question is similar to asking if ease of use and configuration options would influence my choice between a Mazda Miata and a Chevy Suburban.

    Uh, yes, I think it might, if size did not matter.

  5. R.S. @ LI says:

    For any slightly-more-complicated-than-a-toaster class of embedded projects, the tools and development environment of 8-bit processors are in a sad, sad state compared to what’s available in the 32-bit world. Part of that is because of the architectures available (is there any licensable 8-bit core other than the 8051, which is a horrid architecture for modern languages?).

    Beyond that, as die size decreases and the end cost begins to be dominated by packaging (and the die by non-CPU logic and RAM/ROM), the more efficient 32-bit processors will push out the less cost-effective 8-bitters.

  6. A.M. @ LI says:

    I have a side question:

    Assume you were to make a decision about the 8/16/32-bit architecture for a processor to be used in the first microcontroller course that includes some introduction to microcontroller architecture (i.e. ALU, registers, assembly instruction set, etc.) and then goes on to program it and its peripherals in C. Which architecture would you pick, and why? The course is for an undergraduate EE/ECE program.

  7. R.S. @ LI says:

    ARM, without a single doubt.

    It’s cheap
    Widely available from multiple vendors
    Relatively modern
    With modern tools

    and, most of all, is useful to have on your resume.

  8. j.d. @ LI says:

    @Robert said that «… generally, 32-bit processors are much more complex to configure than 8-bit processors, making configuration wizards almost a necessity for 32-bit, while a non-issue for 8-bit.»
    That may be true for top application processors, with MMUs, virtualization layers and multiple cores.
    But for 32-bit microcontrollers, such as ARM7 and ARM Cortex-M, the configuration of pins, peripherals and low-level options is as straightforward as on any other microcontroller.
    The programmer’s model, linear address space, multiple pointer-capable data addressing modes, large internal static RAM, and powerful instructions are clearly an undeniable advantage when compared to any architecturally-constrained 8-bit micro. The statement that 32-bit micros are more complex simply does not hold.
    So, to answer the original question: NO, the development tools are second in importance, after the processor architecture, capabilities, energy, and peripherals. A good optimizing C compiler with an integrated IDE is basic to any uC tool set.
    In my opinion, the compiler quality, compliance with C99, and integration with multi-language development are the most important points in the toolset. A free compiler that generates 25% larger code will waste 128KB worth of code flash in a 512KB chip, and may force a move to a more expensive chip whose added cost may outrun the cost of the tool chain.
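    A minimal sketch of that programmer’s-model difference (two separate fragments for two different compilers; the Keil C51 memory-space keywords are real, but the names and sizes here are illustrative):

        /* Keil C51 (8051): every object must be assigned to one of several
           address spaces, and a generic pointer costs 3 bytes at runtime
           (a memory-type tag byte plus a 16-bit offset). */
        unsigned char idata small_buf[16];           /* internal RAM, indirectly addressed */
        unsigned char xdata big_buf[1024];           /* external RAM, reached via MOVX     */
        unsigned char code  table[4] = {1, 2, 3, 4}; /* read-only, in code memory          */
        unsigned char *any_ptr;                      /* generic 3-byte pointer             */

        /* 32-bit ARM (standard C): one flat address space, plain 32-bit
           pointers, and the linker decides where objects are placed. */
        unsigned char buf[1024];
        unsigned char *p = buf;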

    As a matter of fact, when you compare the IDEs used in embedded systems to the standard set by desktop and mobile tools, they all fall short.

    - Jonny

  9. R.W. @ LI says:

    I agree 8-bit controllers will be around for a long time. If you’re doing something simple with not too many inputs and outputs, it’s hard to beat an 8-bit controller, and the tools don’t need to be that sophisticated.

    For more complex applications, the debugging support being built into some of the 32-bit controllers is a big advantage over the 8-bit parts. The CoreSight debugging architecture in some of the ARM controllers has a lot of impressive features. Debugging tools are just as important as the rest of the tools.
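    For example, with ITM trace over the SWO pin you can retarget printf through CoreSight with almost no intrusion. A sketch in the Keil/ARM-compiler retargeting style; ITM_SendChar is the standard CMSIS call, the device header name is a placeholder, and the debugger/IDE normally sets up the SWO clock and stimulus port:

        #include <stdio.h>
        #include "device.h"   /* placeholder: your part's CMSIS device header */

        /* Route stdout through ITM stimulus port 0; CoreSight carries the
           bytes off-chip on the single SWO pin. */
        int fputc(int ch, FILE *f)
        {
            (void)f;
            return (int)ITM_SendChar((uint32_t)ch);   /* standard CMSIS call */
        }

        void log_sample(int value)
        {
            printf("sample=%d\r\n", value);   /* appears in the IDE's SWV console */
        }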

  10. R.W. @ LI says:

    @Alexander wrote:
    “…Which architecture would you pick, and why? The course is for an undergraduate EE/ECE program”

    I would consider the TI MSP430 or the ARM Cortex-M0.
    They have a clean architecture
    Small, regular instruction set (easy to learn)
    Support C well (the compiled C code is easy to follow alongside the source)
    Lots of sample code, documentation and books
    Inexpensive demo boards
    If you later want to get into RTOSes, you can. (After all, this is the Real Time Embedded Group.)

  11. J.D. @ LI says:

    @Rich: I think you may be right. The demise of the 8-bit micros is more a matter of economics than of architectural traits. Despite the 10K gates (very little) needed to realize a Cortex-M0, you can do a reasonable 8-bit PIC with much less.

    However, from the standpoint of the embedded designer, unless you are really hard pressed in an overexploited commodity market, there is no barrier whatsoever to taking an ARM as your next target….

    - Jonny

  12. J.D. @ LI says:

    @Alexander:

    I’d like to add to @Rich’s very good suggestion of the ARM Cortex-M0. I would expand on it and suggest a Cortex-M3 instead. This is why:

    - The Cortex-M0 actually implements ARMv6-M, a very architecturally constrained subset of the ARMv7-M architecture. Its cut-down instruction set is not representative of the ARM architecture and has several shortcomings. Using the -M0’s 16-bit Thumb dialect to teach ARM assembly is not very productive, because it is much harder to produce any decent assembly programming on that subset, and it is not as easy to understand.
    - The -M3, on the other hand, has a superb instruction set, the 32/16-bit Thumb-2, with many traits derived from the extremely powerful 32-bit ARM instruction set. Actually, even the ARM7TDMI-S is a better candidate for teaching, due to the ARM ISA and a simple architecture that scales very well to ARMv7-A/R.
    - These cores (ARM7 and Cortex-M3) can be used to teach current embedded computing, using a solid base of assembly to build on, and using ARM tools, which are among the best you can have in terms of optimizing compiler, full C99 compliance, no-constraints programming, and very wide availability, with literally dozens of manufacturers and proper kits to choose from.
    - Almost any ARM-based uC is loaded with a nice pack of peripherals and can be used as a serious embedded computing platform.
    - Take a look at the mbed boards, based on NXP parts, with a free cloud-based C/C++ compiler. The compiler is actually the ARM RealView compiler, made to run in the cloud. It is perfect for teaching purposes.

    - Jonny

  13. R.W. @ LI says:

    @Jonny: I agree and really like ARM parts. Some of them are now priced very competitively against the 16- and 8-bit controllers!

    Back to Robert’s original question: architecture and tools influence what I want, but on high-volume, size-constrained projects, cost, chip size, power and operations/manufacturing requirements often dictate what I get ;)

    @Alexander:
    I thought of the M0 because it’s a simpler part, but I like Jonny’s points about the M3.
    The mbed board is fun. You can do a lot with it quickly, and it’s very easy to set up since you don’t have to install any tools and can do everything from your web browser. But you don’t have a debugger and can’t set breakpoints, examine registers, step through code, etc. That may have changed since I tried one last fall.

  14. J.D. @ LI says:

    @Rich: I don’t know the extent of support for on-chip debug in mbed, but I don’t miss it. Using an oscilloscope and GPIOs/analog outputs can let you debug very complex systems.
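    A minimal sketch of that GPIO-bracketing technique (the register addresses and pin are purely illustrative; substitute the ones from your part’s reference manual):

        #include <stdint.h>

        /* Hypothetical memory-mapped set/clear registers for one GPIO port. */
        #define GPIO_SET   (*(volatile uint32_t *)0x40020018u)
        #define GPIO_CLR   (*(volatile uint32_t *)0x4002001Cu)
        #define DEBUG_PIN  (1u << 5)

        static void process_sample(void)
        {
            /* ... code under test ... */
        }

        void timed_section(void)
        {
            GPIO_SET = DEBUG_PIN;   /* rising edge marks section entry            */
            process_sample();
            GPIO_CLR = DEBUG_PIN;   /* falling edge: pulse width = execution time */
        }

    On a scope, the pulse width gives the section’s execution time, and jitter between pulses exposes timing problems that a halting debugger would mask.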

    Even then, you can always write classes for debug (if doing C++), or write handlers for semihosting/breakpoint exceptions (for C/C++) to enable debug.

    One interesting point is that you can now export the mbed project to gcc, uVision and several other non-cloud toolchains.

    - Jonny

  15. F.W. @ LI says:

    @Russell: “(is there any licensable 8-bit core other than the 8051, which is a horrid architecture for modern languages?)”

    That which does not kill us makes us stronger. Which is why the 8051 persists. :-)

  16. C.S. @ LI says:

    or prolongs the inevitable… which also could be why it persists…

  17. R.P. @ LI says:

    Tool support certainly influenced my choice of the PIC18 for some low-end embedded devices I designed for OVS. Mostly the fact that such tools are readily available as open source for Linux. But my old-fogey dislike of profligacy would have prevented me from using a 32-bit core for such relatively simple devices anyway.
