Random Word Generator (Markov Chain)

Similar to my YouTube channel's pfp, which I custom made

[image]

Vector or Bitmap format?

Just for me

Try vector first

If it's good, I might use it as my normal pfp.

Definitely not my best, but...

Sladescar

Will it help if you use the distribution of bigrams (i.e. one letter followed by another)? Here's the distribution of 2-letter combinations in English words.

It means, for example, that H-followed-by-E has many more occurrences (dark red) than H-followed-by-A (orange), and Q is practically only ever followed by U.
I tried to upload the CSV representing the above, but the forum won't let me. You might want to see the source where I got it from:
https://blogs.sas.com/content/iml/files/2014/09/bigrams.txt

The data starts after "datalines;".
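In text form, the idea looks roughly like this minimal Python sketch (not the actual Snap! project). Instead of parsing the linked SAS file, it builds bigram counts from a small stand-in word list; the word list, the "^"/"$" start and end markers, and the function names are all assumptions for illustration.

```python
import random
from collections import defaultdict

# Stand-in input; the real project presumably uses a much larger word list.
WORDS = ["the", "there", "then", "think", "those", "that", "queen", "quick"]

def build_bigram_counts(words):
    """Count how often each letter follows another, using '^' for word
    start and '$' for word end so the chain knows where words begin and stop."""
    counts = defaultdict(lambda: defaultdict(int))
    for word in words:
        letters = ["^"] + list(word.lower()) + ["$"]
        for a, b in zip(letters, letters[1:]):
            counts[a][b] += 1
    return counts

def generate_word(counts, max_len=12):
    """Walk the Markov chain: repeatedly pick the next letter with
    probability proportional to its bigram count."""
    letter, word = "^", []
    while len(word) < max_len:
        nexts = counts[letter]
        letter = random.choices(list(nexts), weights=nexts.values())[0]
        if letter == "$":
            break
        word.append(letter)
    return "".join(word)

if __name__ == "__main__":
    counts = build_bigram_counts(WORDS)
    for _ in range(5):
        print(generate_word(counts))
```

With the pre-tabulated bigram counts from the SAS file you would skip the counting step and feed those frequencies straight into the sampling loop.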

Apparently "th" is the most common bigram in English, which makes sense, because the most common word in English is "the".

In fact, "HE" is also up there because it's part of "the"!
I'll upload the CSV to Google Drive in a few minutes. Here's the grid:

Open it, then download it as a CSV.

Yes, "he" is the second-most common bigram in English. Not to mention it's a whole word by itself.

Does this mean that you warped it and that's why it's faster?

Oh, that's pretty cool.

No, it is because MAP lags a lot.

Not to mention this, that, then, there, those, think...

Hmm, that's interesting. It definitely lagged a lot back when it was written in Snap!, but now that it's a primitive it's generally pretty fast. Is the mapped function simple enough that you can compile the MAP?

And the -th suffix for numbers: fourth, fifth, sixth, etc.

They are already compiled.

Huh. How big is your input list? (I loaded the project link way back at the top of this thread but it doesn't have any data in it.)

I like Machine Learning and/or AI projects.

Wow, this is cool.

Welcome to the community!