I thought about using ((#) mod (#)) to check whether the number of 1’s and 0’s makes sense for the bit depth, but that wouldn’t work for one bit, since modding anything by one is just going to be zero.
Then I thought of checking the size, but someone could just be testing one color, and it would add a 0 and throw the check off.
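For what it's worth, here's a minimal Python sketch of that length check (the function name `length_ok` is just mine). It shows why the mod test tells you nothing at one bit: the remainder is always 0, so every length passes.

```python
# Sketch of the ((#) mod (#)) idea: valid input should split evenly
# into chunks of `bits` digits per pixel.
def length_ok(bit_string, bits):
    return len(bit_string) % bits == 0

print(length_ok("010101", 3))  # True: two 3-bit pixels
print(length_ok("01010", 3))   # False: 5 digits don't split into threes
print(length_ok("0", 1))       # True: anything mod 1 is 0, so 1-bit always passes
```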
For binary into color: 1-bit is black and white, 3-bit is red, green, and blue. 010101 should be green then purple, but it makes purple then green.
The numbers represent the level of the color in a pixel, so 3-bit has either an on or off for red, green, and blue, and 9-bit has three digits for each color. The way my block works is that it grabs each color’s digits: for 9-bit it grabs the first three for red, the next three for green, and the last three for blue. It then sums those digits up, divides 255 by that, and sets the level of that color to whatever that equals.
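If I'm reading the description right, here's a rough Python sketch of that 9-bit decoding (the function names are made up, and I'm taking "sums them up and divides 255 by it" literally, with a guard for the all-zeros chunk since you can't divide 255 by 0):

```python
# Decode one channel: sum the binary digits, then divide 255 by that sum.
def decode_channel(chunk):
    digit_sum = sum(int(b) for b in chunk)   # e.g. "101" -> 2
    return 0 if digit_sum == 0 else 255 / digit_sum

# Split a 9-bit string into three equal chunks: red, green, blue.
def decode_color(bits):
    size = len(bits) // 3                    # digits per channel
    chunks = [bits[i * size:(i + 1) * size] for i in range(3)]
    return tuple(decode_channel(c) for c in chunks)

print(decode_color("111000101"))  # -> (85.0, 0, 127.5)
```

Note that with this scheme, more 1’s means a *lower* level (255/3 = 85 for "111"), which might be backwards from what you want for full brightness.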
Look up color bit palettes on Wikipedia; it explains it better than I can.
If anyone else sees this: the code works. It only seemed broken because I was using the Show Variable block to check it. Turns out variables can’t show numbers that start with 0 and just show what comes after the leading zeros end. Ex: 0022 = 22, 0101 = 101, 0101001 = 101001, and so on. Everything works; what it was showing just didn’t match what I expected.
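This is the same thing that happens in most languages once text gets treated as a number, so a quick Python illustration:

```python
# Once "0101" is read as a number, the leading zero is gone: the values
# 0101 and 101 are identical, so the display drops the zeros.
print(int("0022"))     # -> 22
print(int("0101"))     # -> 101
print(int("0101001"))  # -> 101001

# Keeping the bits as a string preserves every digit.
print("0101")          # -> 0101
```

So the takeaway is to store and display the bit pattern as a string, not a number, if the leading zeros matter.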