Until someone else finds you a better solution, my answer would be: look at what is called Veroboard, plus component sockets. Vero is a manufacturer, but similar board is now made in many varieties by others, and it can often be found under that term.
It's basically PCB material with holes on a 2.54mm (0.1") grid, so it fits many through-hole components without having to bend anything. On top of that, there is a socket for almost every component, a holdover from the olden days.
DIP sockets, TO92 sockets, etc.
When bought in bags, the sockets or socket strips (20- to 40-pin strips that you can cut to size to hold chips) are affordable enough at the larger distributors or on eBay.
That said, I have to admit I have not done this kind of reversible construction in the last 15 years, because my personal lab has the tools to make double-sided surface-mount boards with 8 mil (8/1000th inch) traces, and a stock of parts to match. So I am both spoiled and out of the loop. But 20 years ago I started with those sockets and Veroboard myself.
Obviously, all microcontroller chips (meaning a chip with some sort of processor, non-volatile memory such as flash, and RAM) are at least one-time programmable, if not re-programmable. These days you find they are often in-circuit programmable.
Atmel used to ship a SAM-BA bootloader on board, which was easy to use. Now that is gone: the SAMD21 has just the one flash space, and has ways to discourage erasure, but it is trivial to erase. So you get neither SAM-BA nor the Arduino experience. The Arduinos use an AVR, and the AVR is nice: you have ISP programming, which varies across the product line but is still good, and they have a bootloader area, which is where you find the serial-based bootloader that Arduinos use.
ST and NXP still include a serial bootloader in ROM on their M0/M0+ products that is easy to use, and both are easier to program than the Atmel SAMD21.
A number of these have USB-based bootloaders as well.
At the end of the day, though, it is a simple matter of reading the manual for the part you are interested in. Go to Mouser or Digi-Key or your favorite place, narrow down to microcontrollers from any company that meet your footprint requirements (are hand-solderable) or whatever, and then look at the datasheets there, or dig deeper at the vendor's website. They are ALL programmable. For some, JTAG is the only option (I have a Freescale part I still cannot program: SWD only, and I can't get it going with OpenOCD yet). The SAMD21 you can program with an ST-LINK and probably a CMSIS-DAP adapter. Getting a Nucleo board is a cheaper option than a standalone ST-LINK, and hey, you may find you like how easy it is to program an STM32. No, I don't work for any of these companies.
Best Answer
In this context, an SoC is just the same as a microcontroller. Yes, there are subtleties relating to exactly what is integrated, how many dice are actually bonded together, etc., but this doesn't change much when it comes to using the part.
From a software point of view, the integrated hardware might come with some drivers. These might be secured, in ROM, or shipped as a blob that you need to link in as you build an image. There might be some CPU-cycle overhead too (for example, a regular interrupt serviced by the driver).
Assuming the device is offered in the production volume you care about, you can get a reference design from the vendor. This defines the necessary support that is needed for a reliable implementation. That reference design might come as a dev board that you can buy, or you might need to fab your own.
Typically, you might build/buy a non-constrained implementation for proof of concept, and at the same time work on a more optimised/constrained version of the design as a follow-on. To some extent this depends on how much you plan to parallelise the various development threads.
Note: Particularly for anyone generating/consuming lecture material
Back in the previous century, when MCU usually meant 8051, and almost certainly meant 8-bit, there was a more specific use of SoC which is now becoming obsolete.
It used to be that a CPU was almost always a stand-alone component, with exposed bus interfaces onto which you would connect decoders and memories. Around the time that mobile phones started to be a thing, it began to make sense to put a whole 'computer on a chip', with more of the system integrated. These chips would run an operating system, but still be designed into an embedded system, often for fairly specific use cases.
Fast forward to today: the integration is much simpler (and maybe not too expensive), so there are masses of general-purpose SoC devices with lots of integrated peripherals and memories. Stand-alone CPU chips are next to non-existent (all the PC ones have DDR controllers, integrated MCUs for management, etc.). SoC, MPU, etc. have lost most of their legacy differentiation, and it is just confusing to keep teaching them.