Logic programming library development log

I think you and I must mean different things by "abstraction." The increasing power and utility, and decreasing size, of modern computers is all the more reason why abstraction is important.

We don't even have to look at the extreme case of de-abstracting down to the hardware level, which would make it impossible for anyone, however expert, to talk or think about a modern application program. Even just in software, I am typing this paragraph into a subwindow in the lower left corner of the browser window. As I type, the software automatically starts a new line in the window every time the word I'm typing extends past the right margin of this subwindow. The people who implemented Discourse didn't have to implement this auto wrap! The browser itself provides that feature to all web pages. That's abstraction! Without it, and the bazillions of other browser affordances, nobody could write a program to add 2+2 and display the result in a web page.

It was in the days of huge expensive computers that I could write programs that did their own input, output, formatting, parsing, and so on.

I can sympathize with what I think you're saying to the extent that I sometimes intensely miss those days when I understood the software I used, and wrote, all the way down to the hardware. (I still took the hardware architecture itself as an abstraction over the electronics, although I didn't have the word "abstraction" in my speaking vocabulary back then.)

But, as I understand "abstraction," you have then and now backward.


About identity politics: It seems to me that the understanding of gender as on a spectrum, like the modern understanding of race as just plain nonexistent, should get us lefties to understand such divisions as entirely a tool of the oppressors to divide us, rather than something to take pride in. And yes, I do understand that I'm saying that as a white, male, err... ace, anti-Zionist, Jewish atheist. Nobody is shooting at me because of my race or sex or gender, but they are (sort of) shooting at me because of my religion, and the people who share my religious background yell at me over my politics. I don't take pride in being white, and I recognize how it privileges me, but I don't take pride in my religion -- except I guess for my expertise about smoked fish -- either.

That's what I mean about identity politics. It's not that I begrudge people their identities, it's that I think it's harmful, politically, for people to define themselves as their identities.

Identity politics may be a complex topic, but this forum’s (informal?) identity policy - everyone is referred to as “they” - is undoubtedly valuable. It says: you can be whoever, and however, you want to be here; we (the community) are OK with that, we accept you as you are, and how you see yourself. Calling some people - persons (you think) you know - “him” or “her” could make those who don’t feel like a “her” or a “him” feel excluded once more … And let’s face it: using pronouns that divide a population into broad categories irrelevant in the context of the forum has no added value anyway - just like calling everyone taller than 6 foot “Beanpole”, and everyone under that “Midget”.

Yeah, I'm fine with "they." Honestly, I was afraid you wouldn't like it. Sorry.

(I confess, I lump everyone under age 65 in "children.") ;~)

We're absolutely talking about the same thing.

As I type, the software automatically starts a new line in the window every time the word I'm typing extends past the right margin of this subwindow. The people who implemented Discourse didn't have to implement this auto wrap! The browser itself provides that feature to all web pages. That's abstraction! Without it, and the bazillions of other browser affordances, nobody could write a program to add 2+2 and display the result in a web page.

The assumption here is the people writing the abstraction know what they're doing, and I do not believe that in the slightest.

I'm going to give a baseline as to why I think this.

Until very recently, I was using a computer from 2007; I'm using this beast thanks to a generous birthday gift.

The problem I have is that the computer from 2007 was STILL RUNNING WELL...

At some point I want to rebuild that machine with a couple of changes to it, because I still think it would keep up with modern software... and that? That's a massive, massive MASSIVE problem.

Something is broken in the industry and people who should know better are assuming everything is fine fine fine, and it isn't.

For example, and look, this part might be lost on you as you're not a gamer in any sense, but that 2007 PC was running BRAND NEW (to 2019) games at full detail. Borderlands 3 and Shadow Of The Tomb Raider to be specific, but those don't really matter.

I volunteered at a place around the late 2000s that refurbished old machines and sold them to people with low incomes, with decent markups, but they were dodgy on many levels aside from that generally great intention.

Obnoxious person that I was (still am, to be brutally honest, LOL), if we got, say, a high-end Pentium 1 or 2 - i.e. one that could run the maximum technical specifications of its era - I would try playing Far Cry (brand new in 2007) on it, or a DooM level that stretched that 2007 machine to its limits: basically a machine that had no right trying anything of the sort. And they would absolutely struggle, one-frame-per-hour kind of thing, and rightly so, because early Pentiums were still from the era of throwing mud at the wall and seeing what stuck, but that hardware was error more than trial.

(TL;DR == Computer performance basically stopped in 2007)

Then Intel realised they didn't need to actually research or improve their hardware any more and could sell the same hardware over and over again with "Increased Performance" and no actual way to verify that outside of Intel.

Then Nvidia got in on the scam and started doing the same with their video cards, and the industry is treading water, not innovating, just endlessly reselling the same hardware over and over again. And they learned it from phones. I love phones; I think, bang for buck, the phone form factor >>>> the desktop form factor, but phones are being held back by the perception that they can't do anything and that you need a behemoth tower the size of a small car to do anything worthwhile, and that hasn't been true for a decade or more either.

Abstraction isn't helping, because again, it's assuming the people making the hardware and software know what they're doing, and RELYING on it with absolutely no way to verify anything.

Abstraction is basically saying "Just trust us" and I quite frankly do NOT.

The reason I want to learn to program has never changed, but I can't read jargon programming brands, and more specifically, do not even want to, because thanks to the endless game of telephone, what the hardware is capable of and what people think the hardware is capable of are two very different things.

I think visually, I want to write my own software that matches my train of thought, and block languages are the new foundation for that, but until someone writes an engineering grade block language, we're stuck running around the same circles again and again and again, and it has to stop.

I still don't quite get what you're trying to say.

Try writing a complex x86 Assembly program. Right now. Try it. Make, say, Minecraft.
I guarantee that it would be too hard, as Assembly is too low-level for a complex task.

And that is still many levels of abstraction high.

Try building Minecraft by creating atoms and bonding them together to make wires, transistors, and lights. That's still an abstraction.

I submit that it is impossible (at least for us mortals) to make anything without abstraction. Even in the real world, houses, bricks, clay, heck, even the idea of objects in general, are abstractions over the real world.

Okay, two things about this:

  1. There's nothing new about software rot (≝ an unchanged program no longer working because something in its infrastructure has changed), a technical term to which I was introduced in 1965.

  2. I think you may be misattributing the very real slowdowns you are observing. My understanding of the situation is that there is a longstanding competition between the hardware developers, who are trying to make computers faster, and the software developers, who are trying to make computers slower by inventing things such as window systems and artificially intelligent user interfaces that drain processing power from the actual applications. I've been watching this competition for 60 years now, and the software developers are consistently winning; computer systems have gotten slower and slower. When I was a high school teacher in the early '80s, one PDP-11/70 processor could support around 20 connected terminals without the users complaining about lag. Today the vastly more powerful processor in your pocket can barely support one user. (The PDP-10 systems I used in the late '60s to mid-'70s, which were more powerful than the PDP-11 but still puny compared to my current phone, could support more like 100 users.)

The people who write advertising for the hardware companies are as dishonest as you say, but the engineers, when talking among themselves, are pretty clear about the capabilities of their systems. They tell customers about "gigacycles/second" when "cycles" aren't commensurate across architectures, but in the technical literature they use units such as "gigaflops" (floating point operations per second), which are comparable across architectures.

P.S. To be fair, we ask and expect more from our computers these days.

I'm not sure if this is what callietastrophic meant, but I interpreted it as how unpleasant it is living with the floating point abstraction when modern computers are capable of using exact rational numbers, with a performance penalty that doesn't matter for nearly everything people do with Snap!. Floating point was a necessity a few decades ago, but it rarely is today for ordinary uses.
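For anyone who hasn't run into it, here is a minimal sketch of the difference (in Python rather than Snap!, purely for illustration):

```python
from fractions import Fraction

# Binary floating point cannot represent 0.1 exactly, so repeated
# addition accumulates error:
print(sum(0.1 for _ in range(10)))              # 0.9999999999999999

# Exact rational arithmetic gives the answer a person expects:
print(sum(Fraction(1, 10) for _ in range(10)))  # 1

# The trade-off is speed: Fraction arithmetic is much slower than
# hardware floats, but for typical classroom-scale programs that
# rarely matters.
```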

I almost hate to get back to topic - plenty of interesting points were made here recently (from post 115 onward), each deserving a topic of its own … (back to topic now).

In SICP, e.g. in the pattern-match definition (section 4.4.4.3), the constant 'failed is used to indicate a non-match. I wonder if I could safely translate that to Logic programming SICP script pic 6, or perhaps Logic programming SICP script pic 9, or even Logic programming SICP script pic 8?
Of course I can use something like Logic programming SICP script pic 11, but that seems a bit exaggerated.
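For reference, the SICP idea itself looks roughly like this in Python (just an illustrative sketch of the book's algorithm - the names FAILED and pattern_match are mine, not the library's blocks):

```python
# Sketch of SICP's pattern matcher (section 4.4.4.3): a distinguished
# sentinel signals "no match", so any ordinary binding dictionary --
# even an empty one -- still counts as success.

FAILED = object()   # stand-in for SICP's 'failed constant

def pattern_match(pattern, datum, bindings):
    """Extend bindings so that pattern matches datum, or return FAILED."""
    if bindings is FAILED:
        return FAILED
    if pattern == datum:
        return bindings
    if isinstance(pattern, str) and pattern.startswith("?"):
        return extend_if_consistent(pattern, datum, bindings)
    if isinstance(pattern, list) and isinstance(datum, list) and pattern and datum:
        return pattern_match(pattern[1:], datum[1:],
                             pattern_match(pattern[0], datum[0], bindings))
    return FAILED

def extend_if_consistent(variable, datum, bindings):
    if variable in bindings:
        return pattern_match(bindings[variable], datum, bindings)
    extended = dict(bindings)
    extended[variable] = datum
    return extended

print(pattern_match(["?x", "likes", "?y"],
                    ["kim", "likes", "cats"], {}))
# {'?x': 'kim', '?y': 'cats'}
```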

Pretty much. There are many things that are still done in computation because someone did something cool with it once upon a time.

But QW23's got a point and as I said, until I figure out how to get a new brain I can't explain what I'm trying to explain.

Mind, I'd suggest watching Mark Guzdial's keynote at last year's Snap!con.
Take note of the part where he queries what users want versus what "computation" wants, and think about that. Also think about the part where his brief to make Teaspoon languages included the instruction "Don't scare them off" - he didn't think of that as a WARNING; instead he decided it was a challenge.

It depends what you think of as ordinary. If you turn your turtle by something other than right angles, you need trig to know its screen coordinates, and you get irrational answers. Bunches of our users write ray tracing programs. Etc.

Maybe not. Might be possible to avoid the irrational intermediate values to get to the integer pixel coordinates.

I got curious about this and found discussions like this: https://community.nxp.com/t5/Classic-Legacy-CodeWarrior/Sine-Cos-functions-without-float-data-types/m-p/169697#M3317

Might be fast enough and accurate enough for some purposes.

Brian Silverman implemented fixed point trig functions for, I forget, LogoWriter or something. But you don't want to get results in integer pixel coordinates, even though that's what the graphics hardware will use. You want units of 1/10,000, or 1/32,768 as that article suggests, because if you round to integer pixels your polygons will end up not closing, because of accumulated error. Instead you have to keep track both of the hardware coordinates and of the more precise pseudo-irrational-valued coordinates.
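For what it's worth, here's a minimal Python sketch of that bookkeeping (illustrative only - not Brian Silverman's code, and not how Snap! does it): do the turtle arithmetic in 1/32768 units and round to hardware pixels only when drawing.

```python
import math

SCALE = 32768            # fixed-point unit: 1/32768 of a pixel

class FixedPointTurtle:
    def __init__(self):
        self.fx = 0          # precise position, in 1/32768-pixel units
        self.fy = 0
        self.heading = 0.0   # degrees

    def forward(self, distance):
        # One rounding per step, into fine units, so the per-step error
        # is at most half of 1/32768 of a pixel.
        self.fx += round(distance * SCALE * math.cos(math.radians(self.heading)))
        self.fy += round(distance * SCALE * math.sin(math.radians(self.heading)))

    def right(self, degrees):
        self.heading = (self.heading + degrees) % 360

    def pixel(self):
        # Only the *display* is rounded to hardware coordinates; the
        # precise position is never overwritten by that rounding.
        return round(self.fx / SCALE), round(self.fy / SCALE)

t = FixedPointTurtle()
for _ in range(360):     # a 360-gon, one degree at a time
    t.forward(10)
    t.right(1)
print(t.pixel())         # (0, 0): the polygon closes
```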

But you don't want to do that now, when floating point is both fast and 64 bits wide. It's back then, as in all the other cases we're talking about, that breaking the abstraction barrier might have been useful.

I mean, everyone got scared about trusting floating point because of the Pentium divide bug. But that happened because of a last-minute, pretty much literally, maybe last-15-minutes, reversal by an inexperienced engineer of a correction that had already been made to an algorithm, slowing it down a little, but making it correct. The doofus thought, "hey, I can speed this up!" If everyone had followed Intel's very detailed and carefully designed engineering process, it wouldn't have happened.

To return to the earlier question of whether we can avoid floating point, since it presents the user with a slightly flawed and confusing abstraction: perhaps we can still use floating point for turtles, since the user doesn't see the trigonometric operations, so it can be done without confusing the user. And one can imagine making rational numbers the default and still supporting floating point when users do advanced operations. Perhaps floating point numbers should display as grey to indicate that they are approximations.
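Concretely, a hypothetical sketch of that split (Python, not Snap!'s actual number system): exact operations stay exact, and only inherently approximate ones fall back to floats, which a UI could then grey out.

```python
from fractions import Fraction
import math

def divide(a, b):
    return Fraction(a) / Fraction(b)      # exact: no 0.30000000000000004

def square_root(x):
    f = Fraction(x)
    num_root, den_root = math.isqrt(f.numerator), math.isqrt(f.denominator)
    if num_root ** 2 == f.numerator and den_root ** 2 == f.denominator:
        return Fraction(num_root, den_root)   # still exact
    return math.sqrt(f)                       # approximate -> display in grey

print(divide(1, 3) + divide(1, 6))        # 1/2, exactly
print(square_root(Fraction(9, 4)))        # 3/2, still exact
print(square_root(2))                     # 1.4142135623730951, an approximation
```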

As an aside, it's interesting how many machine learning models use 4- or 8-bit floating point numbers. See Minifloat - Wikipedia.

???prolog???

Oh, I agree that dividing integers should give an exact rational answer, if that's what you mean. I just boggle at calling √2, for example, "advanced."

A cool language from the '70s that allows you to write code declaratively. That means that you describe what something is, unlike imperative languages, where you describe how to do something.

[example in Snap!]

[example]

A classical example is a Sudoku solver.

In an imperative language, you'd have to write an algorithm to calculate the boxes, make guesses, etc.

However, in a declarative language, you'd say (in code):

"A Sudoku is a 9×9 grid that where every row, column, and inner 3×3 grid should have one of every digit," and it works.

The goal of @qw23's library is to be able to do this in Snap!.