because you can't web-scrape
no, it's just that the block doesn't work on most sites i think
ok. 1. some sites have CORS restrictions. 2. any text on the site that's generated by JS can't be seen by the url block, because it doesn't actually web-scrape the rendered page
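A rough sketch of what that means, assuming the url block just issues a plain HTTP request and hands back the response body (an assumption about its internals, not something confirmed here):

```typescript
// Minimal sketch of a plain GET, roughly what a simple "url"-style request would see.
// The target address is just a stand-in; nothing here executes the page's scripts.
async function fetchStaticHtml(target: string): Promise<string> {
  const response = await fetch(target); // one request, no JS execution
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}`);
  }
  return response.text(); // the raw HTML exactly as the server sent it
}

// Text that the page's own scripts insert after load never shows up in this string,
// because nothing in this function runs those scripts.
fetchStaticHtml("https://example.com")
  .then((html) => console.log(html.length, "bytes of static HTML"))
  .catch((err) => console.error("blocked or failed:", err));
```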
google.com works on browsers without javascript
umm. google.com contains js
Dude, what's your problem?
It's not impossible; they have the url () block, and they're already scraping websites and parsing the HTML
by the way, I made two HTML parsers, one with all CSS color names, if that'll be of any use
maybe you could make your own browser then
Nah, you can do your own project. But any of that code is free to use (with credit of course) if you think any of it would be useful
it literally is impossible. you cannot create a fully functioning Google in Snap. you can't even load Google in Snap. scripted text is not visible
i think i'm good for now
it isn't though? the act of getting a website's content and parsing it is literally web scraping. That's what they're doing.
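For what it's worth, "get the content and parse it" looks roughly like this as a browser-side sketch (TypeScript; the target URL and the selectors are placeholders, not anything from the actual project):

```typescript
// Sketch of basic web scraping in a browser context: fetch the page, parse the HTML,
// and pull something out of the resulting document. DOMParser is a standard browser API.
async function scrapeHeadings(target: string): Promise<string[]> {
  const html = await (await fetch(target)).text();                // get the site's content
  const doc = new DOMParser().parseFromString(html, "text/html"); // parse it into a DOM
  return Array.from(doc.querySelectorAll("h1, h2"))               // extract some text
    .map((el) => el.textContent?.trim() ?? "");
}
```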
the point is not whether a functional Google is possible, the point is that the url block doesn't even try to load it
CORS restrictions; you could try a proxy
then do it. load google.
it can't.
You can't because of CORS restrictions.
You can use a proxy and use Google though. In fact, someone already has (I can't find the project ATM)
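The proxy idea, sketched out (the proxy address below is a placeholder, you'd have to run or find one yourself, and note it still won't run the page's JS):

```typescript
// Sketch: route the request through a CORS proxy that adds the missing
// Access-Control-Allow-Origin header. The proxy URL here is hypothetical.
const PROXY = "https://your-cors-proxy.example/?url=";

async function fetchViaProxy(target: string): Promise<string> {
  const response = await fetch(PROXY + encodeURIComponent(target));
  if (!response.ok) {
    throw new Error(`proxy returned HTTP ${response.status}`);
  }
  return response.text(); // still only static HTML; the proxy doesn't execute scripts either
}

// e.g. fetchViaProxy("https://www.google.com").then(console.log);
```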
if you successfully figure out how to get scripted text to return i would love to see it
It probably doesn't work anymore because Google has changed a morbillion times since April of 2022
what

