We're absolutely talking about the same thing.
As I type, the software automatically starts a new line in the window every time the word I'm typing extends past the right margin of this subwindow. The people who implemented Discourse didn't have to implement this auto wrap! The browser itself provides that feature to all web pages. That's abstraction! Without it, and the bazillions of other browser affordances, nobody could write a program to add 2+2 and display the result in a web page.
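To put that in concrete terms, here's a minimal sketch of what "the browser provides it for free" looks like (TypeScript against the ordinary browser DOM, purely for illustration, nothing Discourse-specific): the page author writes zero wrapping logic, and the text still wraps.

```typescript
// Minimal sketch, assuming a standard browser DOM environment.
// We write no line-breaking code at all; the browser's layout engine
// (the abstraction) breaks lines at the right margin for us.
const box = document.createElement("div");
box.style.width = "200px";            // narrow box so the wrapping is easy to see
box.style.border = "1px solid grey";
box.textContent =
  "Any long run of text dropped in here gets pushed onto a new line " +
  "whenever a word would cross the right margin, with no code from us.";
document.body.appendChild(box);
```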
The assumption here is that the people writing the abstraction know what they're doing, and I do not believe that in the slightest.
I'm going to give some background as to why I think this.
Until very recently I was using a computer from 2007; I'm only on this beast now thanks to a generous birthday gift.
The problem I have is that the computer from 2007 was STILL RUNNING WELL...
At some point I want to rebuild that machine with a couple of changes to it, because I still think it would keep up with modern software... and that? That's a massive, massive MASSIVE problem.
Something is broken in the industry and people who should know better are assuming everything is fine fine fine, and it isn't.
For example, and look, this part might be lost on you as you're not a gamer in any sense, but that 2007 PC was running BRAND NEW (as of 2019) games at full detail. Borderlands 3 and Shadow of the Tomb Raider, to be specific, but the exact titles don't really matter.
Around the late 2000s I volunteered at a place that refurbished old machines and sold them to people on low incomes, at decent mark-ups; they were dodgy on many levels aside from that generally great intention.
Obnoxious person that I was (still am, to be brutally honest LOL), if we got in, say, a high-end Pentium 1 or 2, i.e. something that could run the maximum technical specifications of its era, I would try playing Far Cry on it (brand new around 2007) or a DooM level that stretched even that 2007 machine to its limits. Basically a machine that had no right trying anything of the sort, and it would absolutely struggle, one frame per hour kind of thing, and rightly so, because early Pentiums were still from the era of throwing mud at the wall and seeing what stuck, and that hardware was more error than trial.
(TL;DR == Computer performance basically stopped improving in 2007)
Then Intel realised they didn't need to actually research or improve their hardware any more, and could sell the same hardware over and over again with "Increased Performance" claims and no actual way to verify them outside of Intel.
Then Nvidia got in on the scam and started doing the same with their video cards, and now the industry is treading water: not innovating, just endlessly reselling the same hardware over and over again. And they learned it from phones. I love phones; bang for buck, the phone form factor >>>> the desktop form factor, but phones are being held back by the perception that they can't do anything and that you need a behemoth tower the size of a small car to do anything worthwhile, which hasn't been true for a decade or more either.
Abstraction isn't helping, because again, it assumes the people making the hardware and software know what they're doing, and RELIES on that with absolutely no way to verify anything.
Abstraction is basically saying "Just trust us" and I quite frankly do NOT.
The reason I want to learn to program has never changed, but I can't read jargon-heavy programming languages, and more specifically, I don't even want to, because thanks to the endless game of telephone, what the hardware is capable of and what people think the hardware is capable of are two very different things.
I think visually; I want to write my own software in a way that matches my train of thought, and block languages are the new foundation for that. But until someone writes an engineering-grade block language, we're stuck running around the same circles again and again and again, and it has to stop.