URL block broken after Snap version 10.0.0

I want to build a program that can get all the words in the English language,
so I'm fetching a file from GitHub (meaning no CORS proxy required) using the built-in url block.

But instead of returning, it freezes my editor. I tried this in Snap 10.0.5, then again in Snap 9.0.5, where it returned instantly. While I can code in Snap 9.0.5, I'd like to work in the latest version.

All the URLs I try work. Can you share the URL you're trying to get?

The forum is preventing me from posting URL links.

Try putting it in a code block (image)

https://raw.githubusercontent.com/dwyl/english-words/master/words_alpha.txt

Yeah, it also freezes for me. I don't know why.

Snap tries to render the contents of the URL, but the data is very large, causing a long graphical hang. Storing the result in a variable does not freeze the editor, because the result is not being displayed.
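To make the scale concrete: words_alpha.txt is roughly 370,000 lines (about 4 MB). Snap! itself is block-based, so this is only a Python sketch of the point being made, with an illustrative locally built payload standing in for the download (the file name and counts are taken from the thread and rough public file stats, not measured here):

```python
# Simulate a payload the size of words_alpha.txt (~370,000 lines).
payload = "\n".join(f"word{i}" for i in range(370_000))

# Storing the result in a variable is just keeping a reference --
# no per-line layout work happens, so the editor stays responsive.
stored = payload

# A result bubble that only shows the first few lines touches a tiny
# fraction of the data; rendering *every* line is what causes the hang.
preview = payload.split("\n", 5)[:5]
```

The analogy is loose, but it captures why Snap 9's few-line bubble returned instantly while Snap 10's full scrollable bubble hangs.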

Oh right, I forgot Snap 10 shows the full (scrollable) text in result bubbles, whereas older versions only showed the first few lines.

Thank you.

Hmm, I don't like Snap! freezing, no matter what the user did wrong.

I wonder if we can do a stream-style computation in which we compute and display the first five items or so, with a scroll bar claiming to have the entire list, and then compute more items when the user actually scrolls.

So, just what many websites already do: load the data when the user scrolls to it.

Yeah, but I'm guessing it'd be easier for us to do the same thing for all list computations than to special case the result from URL. I mean, what if you make a speech balloon from ALL BUT FIRST OF (URL ...)? That should also run fast.

I honestly have no idea what you're talking about, because I was talking about how the text result bubble (not just for the url block) would load each line when you scroll to it, and unload it when it's off screen, rather than rendering the whole thing.

That's called lazy loading: content is loaded only as the user scrolls to it. It works well for things like lists of big images. I might be mixing something up, though.

Right, I know. What I'm saying is that that shouldn't apply just to "loading," but to any computation of a large list.
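The idea being floated (lazy display for *any* large list computation, not just URL results) is essentially demand-driven evaluation. This is not how Snap!'s Morphic renderer actually works; it's just a small Python sketch, with invented names (`LazyList`, `all_but_first`), of computing items only when they are looked at, e.g. scrolled into view:

```python
class LazyList:
    """A list whose items are computed (and cached) only on first access."""

    def __init__(self, compute, length):
        self._compute = compute   # maps an index to its value
        self._cache = {}
        self._length = length

    def __len__(self):
        return self._length

    def __getitem__(self, i):
        if i not in self._cache:
            self._cache[i] = self._compute(i)
        return self._cache[i]

    def all_but_first(self):
        # "ALL BUT FIRST OF" as a shifted view: no copying, still lazy.
        return LazyList(lambda i: self[i + 1], self._length - 1)


words = LazyList(lambda i: f"word{i}", 370_000)
visible = [words[i] for i in range(5)]   # only these five items get computed
rest = words.all_but_first()             # instant, regardless of length
```

With a scheme like this, showing a bubble with a scroll bar for 370,000 items costs only as much as the handful of rows actually on screen, and `all_but_first` stays fast because it never materializes the tail.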