How to visually discriminate between sprite properties and user variables

Sigh, you guys want us to be C. (All your favorite languages have those notations because C does.) (EDIT: No you don't; rereading the original posts, you just want those semantics, not that notation.) And the reason C has them is that they reflect operations in the DEC PDP-11 machine language, so they were easy to compile back in ≈1980.

And they're part of the reason why a lot of kids think computer science is too hard for them. Because += isn't a notation they learned in elementary school, so it looks mysterious. Once you learn it, it feels obvious; what else could += mean? But it's not obvious at all.

(In fact it's so nonobvious that the original C notation back in 1980 was "=+", which is much more natural as a notation for x = x + 3. They changed it because you had to use spaces to distinguish it from x = +3.)
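(For anyone who hasn't hit this before, here's a tiny C sketch of that ambiguity; x is just an illustrative variable:)

```c
#include <stdio.h>

int main(void) {
    int x = 10;
    x = +3;    /* plain assignment with a unary plus: x is now 3 */
    printf("%d\n", x);
    x += 3;    /* compound assignment, x = x + 3: x is now 6 */
    printf("%d\n", x);
    /* With the old "=+" spelling, "x=+3" could be read either way,
       which is why the spaces mattered and the operator became "+=". */
    return 0;
}
```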

Our goal as a language for learners isn't to minimize keystrokes, especially since in a block language you don't have to use any keystrokes at all, so we can have primitive names such as ALL BUT FIRST OF. So there's no need for us to use terse but unfamiliar strings of punctuation characters.

P.S. foo++ and ++foo don't mean the same thing, speaking of obscurity of notation.
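In case the difference is new to anyone reading along, a minimal C sketch:

```c
#include <stdio.h>

int main(void) {
    int foo = 5;
    int a = foo++;   /* post-increment: a gets the old value, so a == 5 and foo == 6 */
    int b = ++foo;   /* pre-increment: foo is bumped first, so b == 7 and foo == 7 */
    printf("a=%d b=%d foo=%d\n", a, b, foo);   /* prints: a=5 b=7 foo=7 */
    return 0;
}
```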

&=, |=, != (but in a different way lol), ~=, ^=
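(To spell those out in C terms, a throwaway sketch with a made-up variable x; note that ~= isn't actually a C operator:)

```c
#include <stdio.h>

int main(void) {
    unsigned x = 0xA;   /* 1010 in binary */
    x &= 0x6;           /* bitwise AND-assign: x == 0x2 (0010) */
    x |= 0x8;           /* bitwise OR-assign:  x == 0xA (1010) */
    x ^= 0x3;           /* bitwise XOR-assign: x == 0x9 (1001) */
    /* != is just the "not equal" comparison, and ~= doesn't exist in C;
       ~ is only a unary operator, so you write x = ~x; instead. */
    printf("%#x %d\n", x, x != 9);   /* prints: 0x9 0 */
    return 0;
}
```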

Agree

Oh. Maybe the reason I don't think it's obscure is because I learned JS in the summer preceding first grade

Good grief.

...and then Scratch at about the same time

Processing.js is somewhat syntax-heavy. One day I got "Oh noes" and debugged for 2 hours. Then I found out that objects should have "," as the delimiter, instead of ";".

"Syntactic sugar leads to cancer of the semicolon." --Alan Perlis.

Yeah, kinda? I mean, not really, because C is a trainwreck of a language that has caused more problems for coding than it ever solved. But what I want is to be able to do what I can do with PNG files: drop a .c or .h or .* file into a block-based IDE and have it show me the text, the blocks, the assembly, the machine code AND what it's actually doing, with the ability to step through it if necessary.

I'd do it myself because it's what I personally want, except I can't do it. Is it me? Yes. Absolutely it's me. If I knew why I had so much trouble trying, I wouldn't be stuck. (I have a theory tho...)

I'm desperately waiting for the SAP Snap3 course, because I'm hoping it'll contain the Rosetta stone that helps me GET it. I don't think it will; I'm inclined to think it'll be awesome, but I'll still be stuck in neutral.

The reason I want to do it myself is that what I want is not the end goal of Snap!. I'd be happy to make it an extension, but I'm allergic to text languages.

TL;DR: Snap! is the tool I have; what I want is to extend Snap! into the tool I want, which will make the tool I need, and then when I get there, realise I'm at step 1 again.

This was kind of the idea of smart script pics: to have a representation that can be used at various levels of abstraction. But all inside Snap!-world, rather than with those other kinds of files.

Sure, and I love it; it's a favourite feature. The number of things I've saved and dropped into Snap! is fairly high.

The thing is, I want a compiler. Me personally.

I'd make it myself if I was capable.

I've been using computers at a very shallow level since I was six. I'm very much not six any more; haven't been for a couple of decades, and yet I still can't code. I built this machine, and yet this brand-new machine, aside from a couple of format changes, is the same as my first personal computer that wasn't the family computer. That was a Pentium 4. Chips are smaller, ribbon cables are GONE (Thank Duck), but... it's the same thing.

And it drives me up the wall that people don't see the problem with that.

Very much agree

To explain: I have the source for most of the Carmack id games and the Zork games, with the idea that I'd transpose them into Snap!, and I get nowhere because, well, I'm not John Carmack, and he was solving how to put 3D into a 486 without much loss of clarity. Something he did very well.

I also have the source of one of the precursors to block languages, an obscure program called Werkkzeug, made by a couple of German whizkids in the demo scene back in the mid-2000s and written in C++. It can do a whole heap of procedural wizardry, and I wanted to know how it ticks. Still do.

The idea is I want to port that to Snap! too. Whether or not it's a good idea remains to be seen.
My ambitions are well beyond the goals of Snap! and I'm well aware of this; my ambitions are also well above my skill level, and I'm well aware of that too.

I don't really want Snap!; I want an enterprise-level block language, preferably one I wrote, because I know how my brain works, and if someone else did it I'd ask them how, and they'd get annoyed because they didn't explain the detail I was after properly. Yes, I'm slightly crazy. ("Slightly")

Huh. That's news to me; I haven't looked inside a computer since forever ago. I guess a 64-bit-wide ribbon cable would be a little awkward. But they still make circuit boards with one edge full of gold-plated tongues sticking out; do they hide a parallel-to-serial converter inside the sockets they plug into? I guess they could, these days, with super fast serial protocols.

Still a child from my vantage point, then!

You mean you want to write a C++ compiler in Snap!? That's not so very crazy. It'll be easier when we get around to text boxes. I wrote a (subset) Pascal compiler in Logo long ago (https://people.eecs.berkeley.edu/~bh/v3ch5/langi.html). Someone wrote, astonishingly, a Scheme interpreter in Scratch! No custom blocks! No abstraction! The author then disappeared off the face of the earth. That was their one and only project.

That would be GP. Although yes I saw that you want to write your own.

Speaking as someone who is sometimes asked such questions, I diffidently suggest that if you want a detailed answer you have to ask a detailed question. If you can articulate what you really want to know, people will be happy to talk about their work, imho.

> Huh. That's news to me; I haven't looked inside a computer since forever ago. I guess a 64-bit-wide ribbon cable would be a little awkward. But they still make circuit boards with one edge full of gold-plated tongues sticking out; do they hide a parallel-to-serial converter inside the sockets they plug into? I guess they could, these days, with super fast serial protocols.

Hell, it's not even a cable any more. I do still have a couple of Serial ATA cabled drives in this, but the main drive is essentially a PCIe thumb drive at 1 TB capacity. It's kinda nuts, but it is still a gold-plated tongue; just where you put it has changed.

I used to build things at a company that used ribbon cables, and I always adored when the "Red cable is pin one" was either faded or dotted or otherwise invisible... SO ANNOYING.

> Still a child from my vantage point, then!

Yeah, I'm not that old, tbh. Numbers keep incrementing tho. Those darned numbers lol.

> You mean you want to write a C++ compiler in Snap!? That's not so very crazy. It'll be easier when we get around to text boxes. I wrote a (subset) Pascal compiler in Logo long ago (Computer Science Logo Style vol 3 ch 5: Programming Language Implementation). Someone wrote, astonishingly, a Scheme interpreter in Scratch! No custom blocks! No abstraction! The author then disappeared off the face of the earth. That was their one and only project.

Huh, that's a shame. That sounds cool.
When I discovered Snap! in December 2017, the first three textbooks I devoured were your Logo books. I then jumped onto SICP and Scheme. So I've messed around with text languages, and I understand Logo and Scheme far better than I ever did C or C++, but ask me to program something in them? Good luck. I can follow examples really well, but sit me in front of an interpreter and ask me to solve a problem, and the knowledge flies through the nearest window.

> That would be GP. Although yes I saw that you want to write your own.

I mean... if someone's beat me to it, I'd take it, but it all boils down to me wanting to know how the sausage is made. I don't really NEED to know that, but I do. I'll look into that.

> Speaking as someone who is sometimes asked such questions, I diffidently suggest that if you want a detailed answer you have to ask a detailed question. If you can articulate what you really want to know, people will be happy to talk about their work, imho.

Questions like, for example:
Why the flying Duck are we still on 64-bit? Where is 128- or 256-bit?
Why are we still pretending RISC is anything other than garbage?
Endianness? Nobody cares; can we leave binary in the bin, please?

You know, questions that are easy to answer and don't provoke arguments where just getting started will take half an hour to explain the basics of the basics.

I have issues with the path technology is taking. I'm the first to disclaim my opinions as crackpot nonsense, but only really because I can't prove/disprove them. As I said, Slightly Crazy. (Utterly Insane)

Ooh, them's fightin' words! Berkeley is a big center of RISC development. Why do you think it's garbage?

2^64 ≈ 10^19. That's a tiny number if you're thinking about how many water molecules on earth (≈10^40) or how many atoms in the observable universe (≈10^80), but if we're thinking about ordinary things, such as the number of devices on the Internet (≈1.4×10^11), or the number of bytes of memory in your computer (≈10^14 probably, including the filesystem), we're nowhere near needing more than 64 bits. Even those astronomical things will fit in 128 bits.
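(If anyone wants to see that 2^64 figure concretely, here's a two-line C check:)

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* The largest value a 64-bit unsigned integer can hold: 2^64 - 1 */
    printf("%llu\n", (unsigned long long)UINT64_MAX);   /* 18446744073709551615, i.e. ~1.8e19 */
    return 0;
}
```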

Also, think about cost. Adding one bit to the width of the bus doubles the amount of hardware you need to fill that width. There's also a cost in the speed of the computer, which goes down as the bus widens. (Luckily we get better at hardware density also, which counteracts the bus width in determining speed, but we're getting better at density more slowly these days, not least because more hardware means more heat, which threatens to melt the circuitry and/or set your house on fire.)

Programmers don't care, but the electrical engineers who build the circuitry have to pick a standard. It doesn't matter so much which one you pick, but it's like putting the hot water faucet on the right: there's no particular reason why right is better than left for that, but it had better be the same decision everywhere or people are going to burn their hands. Same for putting the red light above the green light in traffic lights. (Especially important if you're a color blind driver!)

(I learned programming at a time when different widely used computers had different endianness, and it was a real pain in the butt for network programmers.)
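(For anyone who hasn't bumped into it, here's a little C sketch that reports which convention your own machine uses, by looking at the first byte of a 32-bit value:)

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t word = 0x01020304;
    unsigned char *bytes = (unsigned char *)&word;
    /* Little-endian machines store the least significant byte (0x04) first in memory;
       big-endian machines store the most significant byte (0x01) first. */
    printf("first byte in memory: 0x%02x -> %s-endian\n",
           bytes[0], bytes[0] == 0x04 ? "little" : "big");
    return 0;
}
```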

I don't think we should inflict binary arithmetic on children, but then again I don't think we should inflict decimal arithmetic on children either. We do so (binary I mean) mainly because it's the only thing about computers that teachers really understand.

Are those the kinds of answers you want? Detailed enough?

Not gonna lie, I wasn't really after an answer to those questions; they were more to set a bar for the kind of questions I ask.

I mostly know the answers to them. My issue with those and other questions is that a lot of them were established a long time ago, and given everything we've learned, isn't it time to find where to put a new foundation?

Let alone a new ceiling?

The thing is, with 2^64, if we were to make a single hard drive at that capacity, then unless my math is off (VERY high chance), the current internet would fill one of those disks in two months, let alone any theoretical future internet where everyone is online (by everyone, I mean more devices than the current global population). So we're already heading towards a point where 64-bit is not enough, and we need to figure out that roadmap now instead of at some unknown time in the future. Which is how we got Y2K in the first place.

The thing is, strictly hypothetically, because as always it's easier said than done: instead of x-bit bytes, if we were to go to 16-digit bytes, but instead of 0 and 1 we use the alphabet, 26^16 gives a larger problem space than 2^64.
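A quick sanity check of that comparison (rough floating-point arithmetic, just to see the orders of magnitude):

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    /* Both values overflow 64-bit integers, so use doubles for a rough comparison. */
    double alphabet16 = pow(26.0, 16.0);   /* 16 positions, 26 symbols each: ~4.4e22 */
    double binary64   = pow(2.0, 64.0);    /* 64 positions, 2 symbols each:  ~1.8e19 */
    printf("26^16 = %.3e\n", alphabet16);
    printf("2^64  = %.3e\n", binary64);
    printf("ratio = %.0f\n", alphabet16 / binary64);   /* roughly 2400x bigger */
    return 0;
}
```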

Someone needs to actually BUILD that, and as I said, it's far easier said than done, not least because even two-path circuitry is complex; how stupid would 26-path circuitry be? But is anyone doing that research, and how are they getting on?

Unrelated, except for being the same topic: someone in another conversation elsewhere revealed the existence of https://godbolt.org to me, which is a compiler explorer; it takes code and shows how a compiler translates it depending on the architecture, and that's part of what I'm after! So watch as I mess with that for a bit and end up frustrated in a whole new way! But GP and Godbolt with a block component is very much a path I want to try to take.
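(For example, pasting something as small as this into Godbolt and switching compilers or architectures already gives the kind of view I'm after; square is just a throwaway function:)

```c
/* Paste into https://godbolt.org and compare targets: with gcc -O2 on x86-64
   this typically boils down to a couple of instructions (an imul and a mov/ret),
   while ARM or RISC-V output looks quite different. */
int square(int x) {
    return x * x;
}
```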

I think that we should learn binary, ternary and hexadecimal :~P

Yeah but you're one of that small number of freaks who enjoy and are good at math. We can't design curriculum around you.

Interesting. But I'm having trouble working out what you want. Sometimes you seem to want to understand computation on a more abstract level, e.g., with higher level languages, and you complain loudly about Bits as a topic, and other times you want to think precisely about bits, in the guise of bus widths and RISC vs. CISC.

(By the way, you're not allowed to say "everyone uses x86, so it must be better." The answer to that is "everyone also runs Windows.")

Yeah, hence the :~) (I think I saw you saying that this means sarcasm)

🙂

Ah, no, I said that about :~P (tongue sticking out smilie).