How to Program an SOC Without a Hardware Kit

microcontroller · pcb · production · soc

I am new to the production side of electronics. I was wondering whether I can directly use a DA14683 to implement a size-constrained project. I have a few questions in mind:

  1. Is interfacing an SoC similar to interfacing a microcontroller such as an Atmel or ARM part?

  2. What does electronics product development look like? Do people buy SoCs directly to implement their ideas?

  3. What prerequisites will I need before using a raw SoC in my project?

Best Answer

In this context, an SoC is just the same as a microcontroller. Yes, there are subtleties relating to exactly what is integrated, how many dice are actually bonded together, and so on, but this doesn't change much when it comes to using the part.

From a software point of view, the integrated hardware might come with some drivers. These might be secure, held in ROM, or supplied as a binary blob that you need to link in as you build an image. There might be some CPU-cycle overhead too (for example, a regular interrupt serviced by the driver).

Assuming the device is offered in the production volume you care about, you can get a reference design from the vendor. This defines the necessary support that is needed for a reliable implementation. That reference design might come as a dev board that you can buy, or you might need to fab your own.

Typically, you might build or buy a non-constrained implementation as a proof of concept, and at the same time work on a more optimised/constrained version of the design as a follow-on. To some extent this depends on how much you plan to parallelise the various development threads.

Note: Particularly for anyone generating/consuming lecture material

Back in the previous century, when MCU usually meant 8051, and almost certainly meant 8-bit, there was a more specific use of SoC which is now becoming obsolete.

It used to be that a CPU was almost always a stand-alone component, with exposed bus interfaces onto which you would connect decoders and memories. Around the time mobile phones became common, a whole 'computer on a chip' started to make sense, with more of the system integrated. These chips would run an operating system, but still be designed into an embedded system, often for fairly specific use-cases.

Fast forward to today: integration is much simpler (and not too expensive), so there are masses of general-purpose SoC devices with lots of integrated peripherals and memories. Stand-alone CPU chips are next to non-existent (even the PC ones have DDR controllers, integrated MCUs for management, etc.). SoC, MPU, etc. have lost most of their legacy differentiation, and it is just confusing to keep teaching the distinction.
