It may look like it's all completely random, but if you run it for a long time, it can produce some actual words.
How it works:
This uses a Markov chain to generate a word from overlapping two-letter sequences. For example, joecooldoo becomes the list jo, oe, ec, co, oo, ol, ld, do, and oo. It then counts how many times each sequence appears in the Markov chain, takes the average of all those counts, and randomly picks sequences from the ones that show up more often than average. Finally, it joins the picks together into a new word!
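Here's a minimal Python sketch of the steps described above. The function names (`bigrams`, `generate_word`) and the average-count cutoff are my own assumptions about the details; the actual project may differ.

```python
import random
from collections import Counter

def bigrams(word):
    """Split a word into overlapping two-letter sequences,
    e.g. "joecooldoo" -> ["jo", "oe", "ec", ...]."""
    return [word[i:i + 2] for i in range(len(word) - 1)]

def generate_word(corpus_words, length=4, seed=None):
    """Sketch of the approach described above: count every bigram
    in the corpus, keep the ones that appear at least as often as
    the average count, then join random picks from that pool."""
    rng = random.Random(seed)
    counts = Counter()
    for w in corpus_words:
        counts.update(bigrams(w))
    average = sum(counts.values()) / len(counts)
    common = [bg for bg, n in counts.items() if n >= average]
    return "".join(rng.choice(common) for _ in range(length))
```

With a tiny corpus like `["the", "then", "there"]`, only the bigrams "th" and "he" beat the average, so the generated words are strings like "thhethhe".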
@slate_technologies, I think you will be interested in this. It has to do with machine learning.
I want to wait for it to process the whole text first, then see what it makes. BTW, "th", "he", and "e " being the most common two-letter combinations doesn't surprise me: "the" is the most common word in the English language, and word-final "e" is also really common.