Pretty much. There are many things that are still done in computation because someone did something cool with it once upon a time.
But QW23's got a point and as I said, until I figure out how to get a new brain I can't explain what I'm trying to explain.
Mind, I do suggest watching Mark Guzdial's keynote at last year's Snap!con.
Take note of the part where he queries what users want versus what "computation" wants, and think about that. Also think about the part where his brief for creating Teaspoon languages included the instruction "Don't scare them off": he didn't take that as a WARNING; instead he decided it was a challenge.
It depends what you think of as ordinary. If you turn your turtle by something other than right angles, you need trig to know its screen coordinates, and you get irrational answers. Bunches of our users write ray tracing programs. Etc.
Brian Silverman implemented fixed-point trig functions for, I forget, LogoWriter or something. But you don't want to get results in integer pixel coordinates, even though that's what the graphics hardware will use. You want units of 1/10,000, or 1/32,768 as that article suggests, because if you round to integer pixels your polygons will end up not closing, due to accumulated error. Instead you have to keep track both of the hardware coordinates and of the more precise pseudo-irrational-valued coordinates.
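A toy illustration of that accumulated-error problem (not Brian Silverman's actual code; this just models pixel snapping with `floor`, the way a pixel-grid index is typically taken): walking a regular polygon while discarding the precise coordinates at every step leaves the turtle some distance from where it started.

```python
import math

def walk_polygon(sides, length, snap_to_pixels):
    # Walk a regular polygon and return the final (x, y),
    # which should be exactly where the turtle started: (0, 0).
    x, y, heading = 0.0, 0.0, 0.0
    for _ in range(sides):
        x += length * math.cos(math.radians(heading))
        y += length * math.sin(math.radians(heading))
        if snap_to_pixels:
            # Keep only the integer pixel coordinates (floor, as a
            # pixel-grid index), discarding the precise position.
            x, y = math.floor(x), math.floor(y)
        heading += 360 / sides
    return x, y

print(walk_polygon(7, 10, snap_to_pixels=False))  # (0, 0) up to float error
print(walk_polygon(7, 10, snap_to_pixels=True))   # ends several pixels away
```

Keeping the precise coordinates alongside the hardware ones is exactly what avoids the drift in the second call.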
But you don't want to do that now, when floating point is both fast and 128 bits wide. It's back then, as in all the other cases we're talking about, that breaking the abstraction barrier might have been useful.
I mean, everyone got scared about trusting floating point because of the Pentium divide bug. But that happened because of a last-minute, pretty much literally, maybe last-15-minutes, reversal by an inexperienced engineer of a correction that had already been made to an algorithm, slowing it down a little, but making it correct. The doofus thought, "hey, I can speed this up!" If everyone had followed Intel's very detailed and carefully designed engineering process, it wouldn't have happened.
To return to the earlier question of whether we can avoid floating point, since it presents the user with a slightly flawed and confusing abstraction: we can still use floating point for turtles, since the user doesn't see the trigonometric operations, so it can be done without confusing anyone. And one can imagine making rational numbers the default while still supporting floating point when users do advanced operations. Perhaps floating point numbers should be displayed in grey to indicate that they are approximations.
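For what it's worth, Python's `Fraction` type already sketches what that "rationals by default, floats only when forced" feel could be like:

```python
from fractions import Fraction
import math

third = Fraction(1, 3)              # dividing integers stays exact
print(third + third + third == 1)   # True: no 0.30000000000000004-style surprises

root = math.sqrt(2)                 # an "advanced" operation: falls back to floating point
print(root * root == 2)             # False: the result is only an approximation
```

The second result is the kind of value that could plausibly display in grey.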
As an aside, it's interesting how many machine learning models use 4- or 8-bit floating point numbers. Minifloat - Wikipedia
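To make that concrete, here's a minimal decoder for one common 8-bit layout (1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7). This is a simplified sketch: real FP8 formats such as E4M3 have their own special-case rules for NaN and infinity that are ignored here.

```python
def decode_minifloat(byte):
    # Decode an 8-bit float: 1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7.
    sign = -1 if byte & 0x80 else 1
    exp = (byte >> 3) & 0xF
    mant = byte & 0x7
    if exp == 0:                       # subnormal: no implicit leading 1
        return sign * (mant / 8) * 2 ** (1 - 7)
    return sign * (1 + mant / 8) * 2 ** (exp - 7)

print(decode_minifloat(0b0_0111_000))  # 1.0  (exponent = bias, mantissa = 0)
print(decode_minifloat(0b0_1000_100))  # 3.0  (1.5 x 2^1)
```

With only 256 possible values, you can enumerate the entire number line of such a format, which is a nice way to see how coarse the approximation gets.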
Oh, I agree that dividing integers should give an exact rational answer, if that's what you mean. I just boggle at calling √2, for example, "advanced."
A cool language from the 80s that allows you to write code declaratively. That means that you describe what something is, unlike imperative languages, where you describe how to do something.
I do wonder, if one analysed all the Snap! projects saved on the server, what fraction of them would use square root. My guess is less than 1/1000. Of course, real-valued functions should be supported as well as possible.
Oh, well, I wonder how many even use custom blocks! When people post projects here, the ideas are often great, but the code often makes me want to cry. The same chunk of code copy-pasted a dozen times instead of using a custom block, or a loop, or abstraction of any sort. (Note, I said "often," not "always," gang!)
But our target audience is old enough not to boggle at irrational numbers. Imho.
Man, this topic is bound to be the forum topic with the most off-topic posts deserving their own topics, ever (I tried to redirect discussion to the main topic at least twice, but it's no use - the diversions are too interesting) … here's my contribution to the discussion on FP: extending the "Bignums …" library with irrationals seems like a logical thing to do.
I think it is feasible (though presumably a big effort) to add an AI coach to Snap! that might pop up a message like "The project you just saved can be made smaller and easier to work with if you used custom blocks. Would you like me to help you rewrite it to be easier to understand and improve?" If the answer is yes, it could point out copies that are identical, or that only differ in some constants, and walk the user through how to make a custom block and call it.
Perhaps the first step (detecting copies of similar code) wouldn’t be too much work, provided it doesn’t have to be perfect. I doubt if it could even be called AI, but that’s your expertise I guess.
If it works well enough, two more things are required (IMAO) for a Minimum Viable Product:
a facility (with a user setting) that will start the (pop-up) check e.g. when the user saves a project (Snap! development team);
a general tutorial on custom blocks, loops, etc., that the popup-window can guide the user to (BJC?).
Detecting copies of similar code
I’ve been thinking of a block that will read pieces of code from a script or custom block. I think it should save those pieces in a database with keys such as:
the first block within the upper code level, and
the number of blocks within the same upper-level.
Next step: calculate a frequency distribution of code pieces that are exactly the same. Any selected piece beyond a certain size and/or frequency qualifies for being highlighted.
Thirdly, a more time-consuming comparison to select similar yet not identical pieces of code. Pieces beyond a certain size and degree of similarity qualify.
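A rough sketch of those three steps in Python. The list-of-block-names representation of a script is just an assumption for illustration; a real version would read Snap!'s XML project format.

```python
from collections import Counter
from difflib import SequenceMatcher

# Toy scripts, each represented as a list of block names (an assumed encoding).
scripts = [
    ["move", "turn", "move", "turn"],
    ["move", "turn", "move", "turn"],
    ["move", "turn", "move", "wait"],
    ["say", "wait"],
]

# Steps 1 and 2: key each piece by (first block, length) and count exact copies.
exact_copies = Counter((s[0], len(s), tuple(s)) for s in scripts)
candidates = [piece for (first, size, piece), freq in exact_copies.items()
              if freq >= 2 and size >= 3]
print(candidates)   # pieces worth turning into a custom block

# Step 3: a slower pass to catch similar-but-not-identical pieces.
def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

print(similarity(scripts[0], scripts[2]))   # 0.75: identical except the last block
```

The `(first block, length)` key makes the cheap exact pass fast; only pieces that survive it would need the expensive pairwise similarity comparison.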
I don't understand; we already have irrationals, even without bignums. I suppose we could widen the number of bits by some fixed amount, but if we try for infinite-precision decimals, never mind irrationals, even 1/3 will cause an infinite loop.
We could take exact rationals, and then make the extension field that adds square roots, representing them as (cons sqrt 3) or whatever, I suppose. But if we were to start down that road, I'd be more inclined to make another library that does full symbolic algebra as in Macsyma (free license) or Mathematica (the one you've heard of). That is to say, I'd be more inclined to encourage some user to do that.
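To make the (cons sqrt 3)-style representation concrete, here's a hypothetical sketch specialized to √2: the extension field Q(√2), where every number has the form a + b·√2 with a and b rational, and that form is closed under multiplication since (a + b√2)(c + d√2) = (ac + 2bd) + (ad + bc)√2.

```python
from fractions import Fraction as F

class Sqrt2:
    """Exact numbers of the form a + b*sqrt(2), with a and b rational."""
    def __init__(self, a, b=0):
        self.a, self.b = F(a), F(b)

    def __mul__(self, other):
        # (a + b*r)(c + d*r) = (ac + 2bd) + (ad + bc)*r, where r*r = 2
        return Sqrt2(self.a * other.a + 2 * self.b * other.b,
                     self.a * other.b + self.b * other.a)

    def __repr__(self):
        return f"{self.a} + {self.b}*sqrt(2)"

root2 = Sqrt2(0, 1)
print(root2 * root2)   # 2 + 0*sqrt(2): squaring sqrt(2) is exact here
```

A full symbolic-algebra library would subsume this, of course; the sketch just shows how small the single-square-root case is.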
No, Snap! has floating point, which is an approximation of what may be a rational or an irrational number.
I don’t want to argue about whether it should be in a separate library or an extension of Bignums, but yes, there may be users (not me, though I might contribute) willing and capable of doing so. Perhaps better start small.
= and comparison predicates might not terminate. Perhaps they should have an optional argument that indicates when to give up and raise an error.
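Here's one way that optional give-up argument could look, sketched with numbers as lazy decimal-digit streams (all names here are hypothetical):

```python
def digits_of(num, den):
    # Infinite stream of decimal digits of num/den, for 0 <= num < den.
    while True:
        num *= 10
        yield num // den
        num %= den

def lazy_equal(a, b, give_up_after=50):
    # Compare two digit streams; raise if they still agree at the cutoff,
    # since equality of two equal reals can never be confirmed digit by digit.
    for _ in range(give_up_after):
        if next(a) != next(b):
            return False
    raise RuntimeError(f"still equal after {give_up_after} digits; giving up")

print(lazy_equal(digits_of(1, 3), digits_of(333, 1000)))  # False, at the 4th digit

try:
    lazy_equal(digits_of(1, 3), digits_of(1, 3))
except RuntimeError as e:
    print(e)
```

Inequality always terminates (the streams eventually differ); it's only equality that needs the escape hatch.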
Regarding printing/display of (irrational) numbers I came up with a solution in
where each digit is slightly smaller than the previous one. The number is displayed until the digits get so small you can't tell which digit is which. The trick is that in the interface you can zoom (or expand the number) as much as you like, and more digits are computed.
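That zoom-for-more-digits trick is easy to sketch for square roots, since the extra digits can be computed exactly on demand (a minimal sketch using Python's arbitrary-precision integer square root):

```python
from math import isqrt

def sqrt_digits(n, count):
    # The digits of sqrt(n), computed exactly as the integer square root
    # of n * 100**count. Zooming in just means asking for a larger count.
    return str(isqrt(n * 100 ** count))

print(sqrt_digits(2, 5))    # 141421       i.e. 1.41421...
print(sqrt_digits(2, 10))   # 14142135623  five more digits, on demand
```

Since each call is exact, the display never has to commit to a rounded value; it just renders as many digits as the current zoom level can show.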