hail mattbatwings
Impressive project. I'd never seen a neural network project in Snap! until you made this one. It's very accurate.
Would be nice if it could be instant.
Whoa.
Are the actual data in the variables with "Weight" in their names?
Where did the data come from? For that matter, where did the algorithm come from? This needs more project notes. :~)
I tried giving it Greek letters. It didn't do so well on lambda; it guessed 2. Some of its choices make visual sense even if not linguistic sense, e.g., it thought my lower case gamma ɣ was a Y.
I learned that, even though I know all the Greek letters from reading math books, I don't remember Greek alphabetical order past epsilon. :~(
Nice! In v10 (dev), instead of using REPLACE for updating the hidden layers, you can use CHANGE BY (hyper-mutation).
Very nice indeed!
When a letter or digit isn't recognized very well (like, < 2:3 certainty - whatever that may signify), you might want to show the next best match(es), too.
lol. the data is taken from a model i found
ok. how would this affect the speed?
it's trained on only numbers and letters
Yeah, I know. What I mean is, if e.g. the model thinks there's a 43% probability of the handwriting being a "B", a 37% probability that it's an "8", a 7% probability it's an "R", and so on - that it will then show:
"B" : 43%
"8" : 37%
honestly the data stores that information. i just couldn't find a nice way to display it
I reckoned it would.
You will, eventually.
no i won't. i already started on my next project
can this go into the comp sci studio?
Done!
ty
it only accepts English
yes
Because hyperblocks, a.k.a. linear algebra, beat looping and even HOFs in terms of how long they take.
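A rough analogue of that speed difference, sketched in Python with NumPy (the layer sizes are made up for illustration): computing one layer's pre-activations with an explicit loop over every weight, versus a single vectorized matrix-vector product, which is what a hyperblock does in one step.

```python
import time
import numpy as np

# Hypothetical layer: 256 neurons, 784 inputs (sizes chosen for illustration).
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 784))
inputs = rng.standard_normal(784)

def layer_loop(w, x):
    """Pre-activations via explicit Python loops, one multiply at a time."""
    out = [0.0] * len(w)
    for i, row in enumerate(w):
        total = 0.0
        for wij, xj in zip(row, x):
            total += wij * xj
        out[i] = total
    return out

t0 = time.perf_counter()
slow = layer_loop(weights, inputs)
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
fast = weights @ inputs  # the "hyperblock" version: one vectorized operation
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s  vectorized: {t_vec:.6f}s")
```

Both give the same numbers; the vectorized form just does the whole layer as one operation instead of tens of thousands of interpreted steps.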
interesting. honestly I don't know if I'll have the patience to change it.