Random block without using random blocks!

Are you sure?

Yes. Look at the code:

It makes the number so big that Snap! starts to use floating-point arithmetic.

But are you sure floating-point arithmetic is random? I don't think it is. It just loses precision, which I think actually makes it less random.

I mean, it uses the modulo function, which returns the remainder when you divide that big number by the Unix time. And when it loses precision, that could make differences, which (to me) is essential for making something random.

No. That would be right if the loss of precision were chaotic, maybe, but it's not; floating point numbers are all rational, and the probability of a randomly chosen real number being rational is zero.
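A quick Python illustration of that point (the starting number here is just a made-up stand-in for whatever big number the blocks actually build):

import time

big = 1234567890123456789            # made-up stand-in for the "big number"
unix_ms = int(time.time() * 1000)    # current Unix time in milliseconds

exact = big % unix_ms                # exact integer arithmetic
rounded = float(big) % unix_ms       # the same computation after rounding to a float

# The two results differ, but the difference is completely deterministic:
# run this again with the same inputs and you get exactly the same error.
print(exact, rounded)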

Choosing a value at random from a finite set (such as the integers between A and B) is mathematically easy to get right. You run your algorithm a million times (or some number much much larger than |B−A|) and you should get each number with very nearly equal frequency.
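As a sketch of that kind of check in Python (using Python's own random.randint as the generator under test, just because it's easy to swap for another one):

import random
from collections import Counter

A, B = 1, 6                       # the finite set: integers from A to B
trials = 1_000_000                # much, much larger than |B - A|

counts = Counter(random.randint(A, B) for _ in range(trials))

for value in range(A, B + 1):
    # each value should show up with frequency close to 1 / (B - A + 1)
    print(value, counts[value] / trials)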

But choosing a random integer with no limits on size is much harder to analyze mathematically, and choosing a random real number is even harder.

In principle, since there are only finitely many floating point values (for a given width), all you have to do is run your algorithm 2^1024 times, or some such thing, to check its quality as a pseudorandom number generator. But there are enough floats so that there won't be any more human beings to care when that check finishes, so we have to do the math as if any real value were possible in floating point, or as if any integer value, regardless of magnitude, fit in a computer. (Even bignums have a limit, since there's only a finite amount of computer memory possible in the universe.)

Since we can't really count how many times each value occurs, people use other tests to evaluate randomness. For example, you can check normality: When you represent the chosen numbers in base ten, does every digit occur equally often, does every string of two digits occur equally often... does every string of n digits occur equally often for all n? (Note that no rational number is normal, a result that I found deeply funny in high school.)
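Here's a rough sketch of that digit-string count in Python, again with the built-in generator standing in for whatever is being tested (zero-padding keeps every sample at nine digits so the concatenation is fair):

import random
from collections import Counter

# Concatenate the base-ten digits of a batch of generated numbers.
digits = "".join(f"{random.randrange(10**9):09d}" for _ in range(100_000))

n = 2                                         # length of the digit strings to count
ngrams = Counter(digits[i:i + n] for i in range(len(digits) - n + 1))

# Each of the 10**n possible strings should occur with nearly equal frequency.
total = sum(ngrams.values())
print(min(ngrams.values()) / total, max(ngrams.values()) / total)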

For integers, you can ask, "Are 50% of these numbers 0 mod 2?" and similarly for other moduli.
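And a sketch of that modulus check, same idea:

import random

samples = [random.randrange(10**9) for _ in range(1_000_000)]

for m in (2, 3, 5, 7):
    for r in range(m):
        share = sum(1 for x in samples if x % m == r) / len(samples)
        print(f"{share:.4f} of the samples are {r} mod {m}")   # should be close to 1/m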

But, tl;dr, it takes some very sophisticated probability theory to be confident in a pseudorandom number generator. So in practice you just have faith in the applied mathematicians who worked on your computer's math library.

How would I determine whether this algorithm is useful?

If anyone wants to test this algorithm in other programming languages, here is the source code:


Python:

from datetime import datetime
from time import time


def random(length):                      # note: this shadows Python's built-in random module name
    # Build a "big number" from the current date and time, then fill a table
    # with that number minus the current Unix time in milliseconds.
    now = datetime.now()
    big = (((((now.year * 365 + now.month) * 30 + now.day) * 24
             + now.hour) * 60 + now.minute) * 60 + now.second) * 1000
    data = [int(big - time() * 1000) for _ in range(1000)]

    # Scramble each entry: divide it by its (1-based) position, raise that to
    # the 10th power, and reduce it modulo the current Unix time in milliseconds.
    for index, value in enumerate(data):
        data[index] = int(((value / (index + 1)) ** 10) % (time() * 1000))

    # Pick entries using the current sub-second time (0..999) as an index,
    # re-mixing the whole table with each value that gets picked.
    result = []
    for index in range(length):
        item = int((time() - int(time())) * 1000)
        result.append(data[item])
        data = [int((value + result[index]) % (time() * 1000)) for value in data]
    return result
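To try it out (the output hasn't been analyzed statistically, so treat it as an experiment rather than a vetted generator):

print(random(10))    # ten values derived from the current time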

I am so much not an expert at this, sorry. Look up pseudorandom numbers in Knuth, or in Wikipedia, or something.

Nitpicking a bit, but true randomness is actually kind of fundamentally impossible in reality, which is why it doesn't work in computing: if you knew every variable about the real world, you could predict literally everything.

True randomness would have to exist without any tangible variables that could ever be acquired, I guess; it's impossible to acquire and impossible to predict.

Pseudo-randomness is the best anyone will get.

quantum mechanics: You sure?

Have you thought of the brain though? :)

Unless the brain turns out to be a quantum device, which is quite possible, its capabilities are the same as those of (classical) computers. In particular, it's no more capable of true randomness.

Exactly. It doesn't have to be very complicated; as I keep saying, a simple diode is a quantum device; when you put it in a circuit, the voltage drop across it is ≈0.7 volts, but the exact value is different every time you measure it.

Or the total energy exerted on your microphone by air molecules (minus the average atmospheric pressure)
Or the total magnetic force exerted on a USB data cable (is that the correct term?)
Or the random bit-flipping caused by cosmic rays hitting your RAM
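All of those are raw noise sources rather than finished generators, though; in practice you would "whiten" the readings, for example by hashing them. Here's a minimal sketch of that step, with read_noise_bytes() as a purely hypothetical stand-in for whatever sensor you actually read:

import hashlib

def read_noise_bytes(count):
    # Hypothetical: in reality this would read the microphone, cable, or other sensor.
    raise NotImplementedError

def random_bytes_from_noise(n_out=32):
    raw = read_noise_bytes(4096)                   # gather plenty of raw, biased samples
    return hashlib.sha256(raw).digest()[:n_out]    # hash them to spread the entropy evenly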

Ehhh, we can still reasonably predict things that happen in the quantum realm, more or less; it's not completely out of reach, even if it isn't entirely exact.

I suppose, but it's never wildly out of tune with 0.7 volts.

I guess we can never know though, since we can't run quantum experiments with identical starting conditions without a time machine and impossibly precise tools to see if anything actually changes each time we go back.

I think the human brain might be worse at random than computers. I mean, try to come up with 10 random numbers, and compare them to 10 random numbers made by a computer.
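For the computer's half of that comparison, Python's built-in generator is enough:

import random
print([random.randint(1, 100) for _ in range(10)])    # ten pseudorandom picks to compare with your own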

Our brain has a lot of biased connections (by design), so obviously some of us like certain numbers more than others; personally, I like odd numbers and multiples of 5.

Right, but if you take the voltage and subtract 0.7, you get a small but entirely random value, and you can multiply it by 1000 or something to get a value in a reasonable range. Those fluctuations are really truly random, and things like that are the only really random values.
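As a sketch of that recipe, with read_diode_voltage() as a hypothetical stand-in for an actual ADC measurement:

def read_diode_voltage():
    # Hypothetical: in reality this would come from hardware measuring the circuit.
    raise NotImplementedError

def hardware_random():
    v = read_diode_voltage()     # roughly 0.7 V, with tiny truly random fluctuations
    return (v - 0.7) * 1000      # keep only the fluctuation, scaled into a handier range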

I believe physical TPMs can do this, so Snap!'s built-in random number might be truly random, depending on your PC and OS.
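On most systems the operating system already mixes hardware entropy (possibly including a TPM) into the randomness it hands out, so from Python you would normally just ask the OS rather than talk to the hardware yourself:

import os
import secrets

print(os.urandom(16))            # 16 bytes from the OS entropy pool
print(secrets.randbelow(100))    # an integer in [0, 100), suitable for security use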