How do C's negatives affect embedded programming?

I’m learning C as my actual go-to language, but I’m in it for the microcontrollers, not software applications. I’ve done some basic C++ programming, some Arduino programming, etc.

So I literally just read the first section of the book (“This book is not…” through “The Undefined…”) and it got me thinking about how that applies to writing for embedded applications (PIC, ARM, AVR, insert-architecture microcontrollers, etc.).

To what extent does UB (or other C-isms) negatively affect embedded programming? I wouldn’t mind learning assembly, but of course every architecture has its own assembly language, so C seems to be the easier foot in the water if you don’t take into account microcontroller/architecture-specific libraries.

I think I replied in private, but I’ll put something else here:

The impact of C’s UB is mostly the same no matter what platform you’re on: hidden catastrophe. C compilers are the only programmers’ tools that will take the license UB grants them to create dangerous software, and then blame you for not knowing they would do that. The standard says compilers are allowed to identify UB and then do whatever they want, even if that’s a catastrophic action no programmer would want. Because this is in the standard, any time a compiler royally screws you over, you get blamed for not memorizing all the UB and how every compiler interprets it. In addition, there’s nothing in the standard that requires a “zero UB mode” that would stop compilation with an error when UB is detected. UB is thus simultaneously considered an error when you write it, but a boon when a compiler sees it and exploits it, since you are always responsible.

When you use C on any platform, your goal is to restrict the C you use to the smallest “attack surface” that you can. Restrict the language down to the subset that avoids all the UB you can, then run and test your code for any possible memory errors, use extensive fuzzing (throwing random trash data at your code to try to break it), and treat any “unexplained” crash as a sign you have a problem. You should also have test suites that you run against a binary built with zero optimization, then run the same data against an optimized build. If you don’t get the same output, the compiler has effectively seen some UB, failed to tell you about it, and exploited it in the optimization stage.

C is ultimately a dangerous programming language that should be avoided at all costs these days. If you can use any other language with less UB, do it.