Logic programming library development log

I don't agree. Abstraction is actually an indispensable tool for coping with complexity and enabling cooperation. How can anyone make a decision about anything (like what's going to be for dinner tonight) if they insist on first understanding all the factors involved, down to the level of elementary particles (and everything - everything! - in between)?

If anything "has to go" it would be secrecy: one needs to be able to inspect / test any-thing reportedly underlying a phenomenon / service / claim. However even that is not always feasible, e.g. I wouldn't want criminals to spy on me, take the money, and run - "the enemy" to disrupt society / abolish the rule of law / trample dissidents, for that matter - would you?

It's a very real dilemma: sometimes we just need to trust others / authorities / "the process" beyond a certain point, and may be disappointed eventually; but if we don't, things could get far worse. So we need trust-enhancing mechanisms - a free press, a privacy authority, corporate reputations, open source platforms - and even those may turn out to be wolves in sheep's clothing. An ideal world isn't in sight.

I may have (unconsciously) hinted at you :wink:

BTW I recently wrote a Scheme pair constructor too, but rather differently:

[image: Logic programming SICP script pic]

Now isn't that interesting (@callietastrophic): two very different implementations that may be abstracted from by using the same hat block? The thing is, anyone can build a universe of Scheme-blocks code on top of either @bluebaritone21's implementation, or mine, or anyone else's, without necessarily knowing what's under the hood - and it could be ported to any other implementation without a rewrite.
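To illustrate in plain text (not our actual code, just the classic contrast between native pairs and procedural pairs, the latter in the spirit of SICP section 2.1.3):

    ;; Implementation A: pairs as native cons cells.
    (define (make-pair a b) (cons a b))
    (define (pair-first p) (car p))
    (define (pair-rest p) (cdr p))

    ;; Implementation B: pairs as procedures (cf. SICP 2.1.3).
    ;; (define (make-pair a b) (lambda (selector) (selector a b)))
    ;; (define (pair-first p) (p (lambda (a b) a)))
    ;; (define (pair-rest p) (p (lambda (a b) b)))

    ;; Client code neither knows nor cares which version is loaded:
    ;; (pair-first (make-pair 1 2))  ; => 1 either way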

C'mon. Obviously @callietastrophic doesn't want to program by hooking oscilloscope probes onto the wires inside a chip. (Although Brian Silverman's famous laptop has a circuit-level simulation of a 6502 running on it, and he programs it.) But she does mean something by this. (That's the right pronoun, right? Or do I mean "they"?) For her the particular abstraction barrier between blocks and text is super important, for reasons I admit I only slightly understand.

I'm having trouble understanding this, because to me these two paragraphs are contradictory. The first seems to say that you want to live above the block-programming abstraction; the second seems to say that you want to live below it.

Certainly something has to be done about that, but as you point out later, two of you have done that. I can imagine a version of PRINTABLE that recognizes some abstraction for a dotted pair and includes the dot in the printform. And a reader that reads dot notation, for the other direction.

Rather than adding dotted pairs to Snap! 's linked list implementation, if we're going to change Snap! itself for the sake of this project I'd rather generalize to a much more useful feature, namely, allowing users to provide read and print hooks for abstract data types, so (make-rational 2 3) would display as ⅔ rather than as a list. (Just imagine if the lowest level of the 3D array that represents the bitmap of a costume were displayed as little color squares!) And (more challenging) so that the reader would recognize [-]?[0-9]+/[-]?[1-9][0-9]* as a call to make-rational.
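To make "print hooks" concrete, here's a rough Scheme sketch of the kind of table I have in mind; nothing like this exists in Snap! today, and make-rational, rat?, numer, and denom are assumed SICP-style names, not real primitives:

    ;; Hypothetical registry of print hooks: (predicate . printer)
    ;; pairs, tried in order; a datum falls through to the default
    ;; display if no predicate claims it.
    (define print-hooks '())

    (define (add-print-hook! pred printer)
      (set! print-hooks (cons (cons pred printer) print-hooks)))

    (define (printable datum)
      (let loop ((hooks print-hooks))
        (cond ((null? hooks) datum)                 ; default display
              (((caar hooks) datum) ((cdar hooks) datum))
              (else (loop (cdr hooks))))))

    ;; Assuming SICP-style rationals (make-rational, rat?, numer, denom):
    (add-print-hook!
     rat?
     (lambda (r) (string-append (number->string (numer r))
                                "/"
                                (number->string (denom r)))))

    ;; (printable (make-rational 2 3))  ; => "2/3"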

Oh for sure!

Hahaha. You're good at this.

Not speaking on behalf of @callietastrophic, of course, but I think consistently using “they” is fine (I wouldn’t have agreed perhaps two years ago!). Even though it may seem absurd in cases where you actually (think you) know the "identity" of the person behind the @vatar - because, who knows, anyone might one day feel different (or even experience multiple personalities).

Huh? Too challenging for me, I'm afraid.

I'm a woman, so she/her are my pronouns, but in a neutral place like a forum I generally don't assume or ask. I do try to refer to people as "they", because they are a person no matter what (well, we'll see in a couple of years when Skynet... ChatGPT... nah, I was right the first time... takes over).

As for this topic: it frustrates me because I know what I'm driving at, but iunno. Until a miracle happens and I suddenly get past my brain block, I can't prove it.

But I also don't see it happening any time soon and I have no idea where I would start to even attempt it.

Also I feel that every time I post in a thread I hijack it, so yeah, I'm still here, mostly just reading

I hope you'll find what you're looking for one day!

... or at least: be able to define it (even if it doesn't exist yet, perhaps you a/o others may create it)

Well yeah, I kinda want to create it, if only because I neeeeeeeeeeeed the money, but I'd be fine if someone else creates it and I can use it properly and start making software I don't hate lol.

But until that day (Apocalypse +1), I'll keep sitting around getting annoyed at everything.

That's a regex meaning possibly a minus, followed by 1 or more digits, followed by a /, followed by possibly a minus, followed by the digits 1 through 9, followed by 1 or more digits, eh?

The only questions I have are: why is the last digit check not optional, and why a wildcard?

Oh, maybe I'm getting the notation wrong? I thought that [0-9]* means zero or more of 0-9, same as + means one or more.

The denominator is different from the numerator, of course, because you can't divide by zero.
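Concretely, here's how that pattern sorts a few candidate strings (worked examples only, against [-]?[0-9]+/[-]?[1-9][0-9]*):

    2/3      matches   ([0-9]+ = "2"; [1-9] = "3", [0-9]* = empty)
    -7/12    matches   (the leading [-]? is optional)
    1/10     matches   ([1-9] = "1", [0-9]* = "0")
    1/0      no match  (the denominator must start with 1-9)
    1/05     no match  (same reason: no leading zero)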

The general point I was after is to have a built-in reader that would keep a table of regexes corresponding to abstract types. So you could say ⌖([-]?[0-9]+,[0-9]+) as the input notation for a point, etc.
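If I were sketching it in text rather than blocks, it might look something like this; make-rational is assumed SICP-style, and the hand-written recognizer just stands in for a compiled regex (it's a bit looser than the regex, e.g. it tolerates a leading zero in the denominator):

    ;; Sketch of a reader that consults a table of
    ;; (recognizer . constructor) pairs before falling back
    ;; on the ordinary reader.

    (define (split-on-slash s)          ; "2/3" -> ("2" "3"), else #f
      (let loop ((i 0))
        (cond ((= i (string-length s)) #f)
              ((char=? (string-ref s i) #\/)
               (list (substring s 0 i)
                     (substring s (+ i 1) (string-length s))))
              (else (loop (+ i 1))))))

    (define (rational-text? s)          ; stands in for the regex
      (let ((parts (split-on-slash s)))
        (and parts
             (string->number (car parts))
             (let ((d (string->number (cadr parts))))
               (and d (exact? d) (not (zero? d)))))))

    (define reader-hooks
      (list (cons rational-text?
                  (lambda (s)
                    (let ((parts (split-on-slash s)))
                      (make-rational (string->number (car parts))
                                     (string->number (cadr parts))))))))

    (define (read-with-hooks s)
      (let loop ((hooks reader-hooks))
        (cond ((null? hooks) (string->number s))  ; ordinary reader
              (((caar hooks) s) ((cdar hooks) s))
              (else (loop (cdr hooks))))))

    ;; (read-with-hooks "2/3")  ; => the abstract rational 2/3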

Yeah, I wasn't sure about posting that; it's a fine line, but my understanding of the current state of etiquette is that you don't ask someone's gender, but you do ask their pronouns. I agree that there's an inconsistency in that.

I wonder if we should add a "pronouns" field to the forum profile.

OTOH, maybe I'm getting old, but I'm skeptical about identity politics. Except for, you know, workers vs. bosses.

Identity politics is pretty much accurate; it's just that its opponents are loud. Very loud.

It used to be an Or statement: you were either Male OR Female, and your life was decided based on which way your chromosomes folded. There has always been resistance to that; it's only now that the science has pretty much confirmed that it, like a whole lot of other things, is a spectrum.

Much like early computers could only do on or off for speakers (not quite true, but close enough; the original Lemmings soundtrack was written for PC speaker and burned itself into my brain, lol; Pachelbel's Canon sounds impressive even on limited hardware), or could only do a limited amount of colour within a fixed number of bytes. As computers have gotten more powerful, the amount of space allocated has increased, and thus colour and sound depth have increased and clarity is much improved.

To tie it back into the topic, kinda: this is why I'm frustrated with computation as a whole. A lot of this stuff, like abstraction, was designed when computers were HUGE, expensive, rare, and, most importantly, kinda absolutely useless at anything. Now none of that is true, but we still write and design and teach software/hardware as if nothing has changed.

Hardware is still relatively expensive, but what you can buy for the same price dwarfs what you could buy in the eighties, and if it fails, you can claim it on warranty. (Most of the time.)

Which is why I think abstraction has to go. We now have the ability and capacity to watch the machine in real time instead of guessing and working around it, yet the baseline is still "trust the machine", and a lot of the quirks of modern software/hardware exist because of those machine issues (see floating point, for example; it's absolutely insane). We can solve those problems, but instead we hyper-focus on decisions made in the eighties or earlier as if they're set in stone.

I think you and I must mean different things by "abstraction." The increasing power and utility, and decreasing size, of modern computers is all the more reason why abstraction is important.

We don't even have to look at the extreme case of de-abstracting down to the hardware level, which would make it impossible for anyone, however expert, to talk or think about a modern application program. Even just in software, I am typing this paragraph into a subwindow in the lower left corner of the browser window. As I type, the software automatically starts a new line in the window every time the word I'm typing extends past the right margin of this subwindow. The people who implemented Discourse didn't have to implement this auto wrap! The browser itself provides that feature to all web pages. That's abstraction! Without it, and the bazillions of other browser affordances, nobody could write a program to add 2+2 and display the result in a web page.

It was in the days of huge expensive computers that I could write programs that did their own input, output, formatting, parsing, and so on.

I can sympathize with what I think you're saying to the extent that I sometimes intensely miss those days when I understood the software I used, and wrote, all the way down to the hardware. (I still took the hardware architecture itself as an abstraction over the electronics, although I didn't have the word "abstraction" in my speaking vocabulary back then.)

But, as I understand "abstraction," you have then and now backward.


About identity politics: It seems to me that the understanding of gender as on a spectrum, like the modern understanding of race as just plain nonexistent, should get us lefties to understand such divisions as entirely a tool of the oppressors to divide us, rather than something to take pride in. And yes, I do understand that I'm saying that as a white, male, err... ace anti-Zionist Jewish atheist. Nobody is shooting at me because of my race or sex or gender, but they are (sort of) shooting at me because of my religion, and the people who share my religious background yell at me over my politics. I don't take pride in being white, and I recognize how it privileges me, but I don't take pride in my religion -- except I guess for my expertise about smoked fish -- either.

That's what I mean about identity politics. It's not that I begrudge people their identities, it's that I think it's harmful, politically, for people to define themselves as their identities.

Identity politics may be a complex topic, but this forum’s (informal?) identity policy - everyone is referred to as “they” - is undoubtedly valuable. It says: you can be whoever, and however, you want to be here; we (the community) are OK with that, we accept you as you are, and how you see yourself. Calling some people - persons (you think) you know - “him” or “her” could make those who don’t feel like a “her” or a “him” feel excluded once more … And let’s face it: using pronouns that divide a population into broad categories that are irrelevant in the context of the forum has no added value anyway - just like calling everyone taller than 6 feet “Beanpole”, and everyone under that, “Midget”.

Yeah, I'm fine with "they." Honestly, I was afraid you wouldn't like it. Sorry.

(I confess, I lump everyone under age 65 in "children.") ;~)

We're absolutely talking the same thing.

As I type, the software automatically starts a new line in the window every time the word I'm typing extends past the right margin of this subwindow. The people who implemented Discourse didn't have to implement this auto wrap! The browser itself provides that feature to all web pages. That's abstraction! Without it, and the bazillions of other browser affordances, nobody could write a program to add 2+2 and display the result in a web page.

The assumption here is the people writing the abstraction know what they're doing, and I do not believe that in the slightest.

I'm going to give a baseline as to why I think this.

Until very recently, I was using a computer from 2007, I'm using this beast thanks to a generous birthday gift.

The problem I have is that the computer from 2007 was STILL RUNNING WELL...

At some point I want to rebuild that machine with a couple of changes to it, because I still think it would keep up with modern software... and that? That's a massive, massive MASSIVE problem.

Something is broken in the industry and people who should know better are assuming everything is fine fine fine, and it isn't.

For example, and look, this part might be lost on you as you're not a gamer in any sense, but that 2007 PC was running BRAND NEW (to 2019) games at full detail. Borderlands 3 and Shadow Of The Tomb Raider to be specific, but those don't really matter.

I volunteered at a place around the late 2000s that refurbished old machines and sold them to people on low incomes, with decent markups; they were dodgy on many levels aside from that generally great intention.

Obnoxious person that I was (still am, to be brutally honest, LOL), if we got, say, a high-end Pentium 1 or 2, i.e. something that could run the max technical specifications of its era, I would try playing Far Cry on it (brand new as of 2007), or a DooM level that stretched that 2007 machine to its limits; basically a machine that had no right trying anything of the sort, and it would absolutely struggle, one frame per hour kind of thing. And rightly so, because early Pentiums were still from the era of throwing mud at the wall and seeing what stuck, but that hardware was error more than trial.

(TL;DR == Computer performance basically stopped in 2007)

Then Intel realised they didn't need to actually research or improve their hardware any more and could sell the same hardware over and over again with "Increased Performance" and no actual way to verify that outside of Intel.

Then Nvidia got in on the scam and started doing the same with their video cards, and now the industry is treading water: not innovating, just endlessly reselling the same hardware over and over again. And they learned it from phones. I love phones; I think, bang for buck, the phone form factor >>>> the desktop form factor, but phones are held back by the perception that they can't do anything and that you need a behemoth tower the size of a small car to do anything worthwhile, and that hasn't been true for a decade or more either.

Abstraction isn't helping, because again, it assumes the people making the hardware and software know what they're doing, and RELIES on that with absolutely no way to verify anything.

Abstraction is basically saying "Just trust us" and I quite frankly do NOT.

The reason I want to learn to program has never changed, but I can't read jargon programming brands, and more specifically, do not even want to, because thanks to the endless game of telephone, what the hardware is capable of and what people think the hardware is capable of are two very different things.

I think visually. I want to write my own software that matches my train of thought, and block languages are the new foundation for that, but until someone writes an engineering-grade block language, we're stuck running around the same circles again and again and again, and it has to stop.

I still don't quite get what you're trying to say.

Try writing a complex x86 Assembly program. Right now. Try it. Make, say, Minecraft.
I guarantee that it would be too hard, as Assembly is too basic for a complex task.

And that is still many levels of abstraction high.

Try building Minecraft by creating atoms and bonding them together to make wires, transistors, and lights. That's still an abstraction.

I submit that it is impossible (at least for us mortals) to make anything without abstraction. Even in the real world, houses, bricks, clay, heck, even the idea of objects in general, are abstractions over the real world.

Okay, two things about this:

  1. There's nothing new about software rot (≝ an unchanged program no longer working because something in its infrastructure has changed), a technical term to which I was introduced in 1965.

  2. I think you may be misattributing the very real slowdowns you are observing. My understanding of the situation is that there is a longstanding competition between the hardware developers, who are trying to make computers faster, and the software developers, who are trying to make computers slower by inventing things such as window systems and artificially intelligent user interfaces that drain processing power from the actual applications. I've been watching this competition for 60 years now, and the software developers are consistently winning; computer systems have gotten slower and slower. When I was a high school teacher in the early '80s, one PDP-11/70 processor could support around 20 connected terminals without the users complaining about lag. Today the vastly more powerful processor in your pocket can barely support one user. (The PDP-10 systems I used in the late '60s to mid-'70s, which were more powerful than the PDP-11 but still puny compared to my current phone, could support more like 100 users.)

The people who write advertising for the hardware companies are as dishonest as you say, but the engineers, when talking among themselves, are pretty clear about the capabilities of their systems. They tell customers about "gigacycles/second" when "cycles" aren't commensurate across architectures, but in the technical literature they use units such as "gigaflops" (billions of floating-point operations per second), which are comparable across architectures.

P.S. To be fair, we ask and expect more from our computers these days.

I'm not sure if this is what callietastrophic meant, but I interpreted it as how unpleasant it is living with the floating-point abstraction when modern computers are capable of using exact rational numbers, with a performance penalty that doesn't matter for nearly everything people do with Snap!. Floating point was a necessity a few decades ago, but it rarely is today for ordinary uses.
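For a concrete taste (in any Scheme with the full numeric tower; exact-rational results would look the same in a block language that adopted them):

    (+ 0.1 0.2)             ; => 0.30000000000000004  (binary floats)
    (+ 1/10 2/10)           ; => 3/10                 (exact rationals)
    (= (+ 1/10 2/10) 3/10)  ; => #t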

I almost hate to get back to topic - plenty of interesting points were made here recently (from post 115 onward), deserving topics of their own … (back to topic now).

In SICP, e.g. in the pattern-match definition (section 4.4.4.3), the constant 'failed is used to indicate a non-match. I wonder if I could safely translate that to [image: Logic programming SICP script pic 6], or perhaps [image: Logic programming SICP script pic 9], or even [image: Logic programming SICP script pic 8]?
Of course I can use something like [image: Logic programming SICP script pic 11], but that seems a bit exaggerated.
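For reference, here's the SICP original (var? and extend-if-consistent are defined elsewhere in the book); 'failed is just an ordinary symbol pressed into service as a sentinel, which is exactly why I'm unsure what can safely play its role in Snap!:

    (define (pattern-match pat dat frame)
      (cond ((eq? frame 'failed) 'failed)       ; failure propagates
            ((equal? pat dat) frame)
            ((var? pat) (extend-if-consistent pat dat frame))
            ((and (pair? pat) (pair? dat))
             (pattern-match (cdr pat)
                            (cdr dat)
                            (pattern-match (car pat)
                                           (car dat)
                                           frame)))
            (else 'failed)))                    ; the sentinel itself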