This is a fantastic article. Of course there are opinions baked into it, like the take on Apple's SoC, but you presented a lot of information that shows real breadth of knowledge and understanding.
SoCs were attempted in the past, but the manufacturing tech was not there. I mean, 5nm? Just wow. Intel is hurting because their manufacturing stack is nearly 3-4 years old due to repeated failures. They are stuck on a process node with features roughly 3x the size, and then they had their pants pulled down by Spectre/Meltdown.
Spectre was a blessing, because it validated what I saw back in 2008 when building deployment designs. I picked AMD because they had integrated the memory controller onto the die, which is SoC-lite.
So the world revolved around Intel for nearly 12 years, and then 2020 came by and vendors were like, enough is enough.
People have long talked about mobile internet (I have worked in this industry since 2000) replacing the desktop, and now you can see what iterating a client every 9-12 months has accomplished.
Apple figured it out 3-4 years ago, drafted a plan, and executed it well. As a Mac owner (and Google Chromebook user), I saw the shitstorm from a lack of QA on their current 10.15 codeline. Now I know why it sucked: they spent all of their QA budget on Big Sur and on testing this chip.
They swung for a home run and they hit it.
I am first and foremost a Unix guy. I find Windows "interesting" but painful (I use VMs to run my Windows stuff), and I daily-drive a Mac.
The FreeBSD/Mach-based kernel plus this chip will run circles around quite a few things, and Intel had better watch out. Once Linux runs on this stuff natively, or a generic provider copies the design (which will happen), Intel and maybe even AMD will be in a world of hurt.
AMD is positioned to scale, but they will need to morph themselves. Nvidia, though, may be the dark horse in the long run.
Why? Cough, Nvidia Shield. The processor in that 2015 device still works beautifully in 2020 ... which should say a lot.