Exact opposite. The analogy I use is the Rosetta Stone. We're taught in school that there was one, but the truth is we had hundreds of the things; we just didn't realise what they were until someone worked out that the other sides weren't decoration but effectively an ancient dictionary. Time and context had stolen that knowledge.
My issue with abstraction is that every language, high or low level, insists on starting from scratch, and that every opposing brand-name language is wrong and incompetent: it's my way or the highway.
Which means we get cross-talk over artificial barriers, and no one dares venture outside their silo to actually check how much ground gets covered and re-covered.
I was trying to find the MIT open press page again, looking for the example text, when I stumbled upon the 1986 lecture by Gerald Jay Sussman (the one where he looks like Matt Smith's Doctor decades before Matt Smith chose the look, lol). In it he makes a comment along the lines of: "There are people who are suggesting graphical interfaces and I really can't see the point; with the images you're preventing the pass-through."
And he was thinking in terms of image compression and overhead.
He was already wrong.
The Unicode Consortium wasn't officially around yet (the founding group wouldn't meet until 1987), but Joe Becker's 1984 paper was the origin, so it was already being considered. Likewise, emoticons were first proposed in 1982, but it wasn't until the mid-1990s, when Japanese pagers aimed at teenagers popularised the little pictographs that came to be called emoji, that the idea started to grow.
It wasn't until 2007, when the Unicode Consortium began work on standardising emoji, that those two concepts finally merged.
Snap!/Scratch's [scratchblocks] (2 + 2) [/scratchblocks] == Scheme's (+ 2 2)
As you say, it's an abstraction, but the computer can't tell the difference between those two expressions, even though one is visual and the other is bracketed.
The computer does not care. We do... and that's the problem. That's the entire problem.
Whether it's Snap! or Scheme or C or Lua or Brand Name Here, the whitespace is there for human readability; the computer doesn't give the slightest crap.
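To make that concrete, here's a minimal sketch. It's my own toy Python, nothing from any actual Snap!, Scratch, or Scheme implementation, and the parsers are deliberately naive (binary operators, no error handling): two surface syntaxes, prefix and infix, parsed into the identical tree.

[code]
# Toy illustration: two surface syntaxes, one tree underneath.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def parse_prefix(tokens):
    """Parse Scheme-style tokens, e.g. ['(', '+', '2', '2', ')']."""
    tok = tokens.pop(0)
    if tok == "(":
        op = tokens.pop(0)
        args = []
        while tokens[0] != ")":
            args.append(parse_prefix(tokens))
        tokens.pop(0)  # discard the ')'
        return (op, *args)
    return int(tok)

def parse_infix(tokens):
    """Parse a flat block-style expression, e.g. ['2', '+', '2']."""
    left = int(tokens.pop(0))
    while tokens:
        op, right = tokens.pop(0), int(tokens.pop(0))
        left = (op, left, right)  # same tuple shape as parse_prefix
    return left

def evaluate(node):
    """Walk the tree; by this point the surface syntax is long gone."""
    if isinstance(node, int):
        return node
    op, *args = node
    return OPS[op](*map(evaluate, args))

scheme_ast = parse_prefix("( + 2 2 )".split())
block_ast = parse_infix("2 + 2".split())
print(scheme_ast == block_ast)  # True: identical trees
print(evaluate(scheme_ast))     # 4
[/code]

Once both inputs are parsed, there is literally no bit anywhere that records which notation they arrived in.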
I should be able to drop a .c file or a .scm or a .whatever file into Snap! and have Snap! read it. The idea that it's impossible is artificial. The fact that I can't is exactly the point, and what annoys me is that people DON'T GET IT.
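As a sketch of that direction (again toy code of my own; the scratchblocks-ish markup it emits is my rough approximation, not the forum renderer's actual grammar): read an s-expression, as if from a hypothetical .scm file, and print it back out as nested infix reporter blocks.

[code]
# Toy .scm-to-blocks translator. Handles only binary arithmetic; a real
# importer would need scoping, defines, and a standard library mapping.

def tokenize(src):
    """Split an s-expression string into tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """'(+ 2 (* 3 4))' -> ['+', '2', ['*', '3', '4']]."""
    tok = tokens.pop(0)
    if tok == "(":
        out = []
        while tokens[0] != ")":
            out.append(parse(tokens))
        tokens.pop(0)  # discard the ')'
        return out
    return tok

def to_blocks(node):
    """Render the tree as nested infix reporter blocks."""
    if isinstance(node, str):
        return f"({node})"
    op, left, right = node[0], to_blocks(node[1]), to_blocks(node[2])
    return f"({left} {op} {right})"

source = "(+ 2 (* 3 4))"  # stand-in for the contents of a .scm file
print(to_blocks(parse(tokenize(source))))  # ((2) + ((3) * (4)))
[/code]

Going the other way (blocks out to an s-expression) is the same tree walk with a prefix printer. The genuinely hard parts of a real importer are semantics and libraries, not the brackets.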
Scratchblocks' existence should be the clue. It's right there. It's RIGHT *********** THERE. But instead it's "nah, we can't do that due to 'translation layers'" or whatever excuse I'm given. They're wrong.