Converting list to JSON doesn't give me the correct list

There's a basic issue that could do with fixing re the single-element list

Re auto inverse matching: maybe key/value pairs could be signalled by using triple-nested lists instead of double-nested ones, e.g.

[untitled script pic (12)]
being mapped to {"item1":"item2"}

Then a double nested list could just return the simple JSON array ["item1","item2"]
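The proposed convention can be sketched in Python. This is a hypothetical helper (`list_to_json` is not a Snap! block), just illustrating the idea: a triple-nested list signals key/value pairs, anything else becomes a plain JSON array.

```python
import json

def list_to_json(lst):
    # Hypothetical sketch (not Snap!'s actual code) of the proposed convention:
    # a triple-nested list [[["k", "v"], ...]] signals key/value pairs,
    # while an ordinary list is emitted as a plain JSON array.
    def is_pair(x):
        return isinstance(x, list) and len(x) == 2 and not isinstance(x[0], list)

    if isinstance(lst, list) and lst and all(
        isinstance(row, list) and len(row) == 1 and is_pair(row[0]) for row in lst
    ):
        return json.dumps({row[0][0]: row[0][1] for row in lst})
    return json.dumps(lst)

print(list_to_json([[["item1", "item2"]]]))  # {"item1": "item2"}
print(list_to_json(["item1", "item2"]))      # ["item1", "item2"]
```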

But I don't think a number or a string representation of a number would necessarily survive a round-trip conversion and be returned in the same format it started off in,
e.g.
{"age":30} would be converted to list(list(list(age,30))), and when converted back it could end up the same if Snap! checked whether the value was valid numeric.

But {"age":"30"} would also be converted to list(list(list(age,30))), and when converted back it could end up as {"age":30}, as Snap! would have no idea that the original 30 was a string.
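The type-loss problem is easy to demonstrate in Python (standing in here for Snap!'s conversion): once both objects pass through a list-of-lists representation where every value is text, the distinction between 30 and "30" is gone.

```python
import json

# Both objects collapse to the same list-of-lists once every value is
# stored as text (Snap! list items are effectively strings here).
a = json.loads('{"age": 30}')    # value is the number 30
b = json.loads('{"age": "30"}')  # value is the string "30"

as_list_a = [[k, str(v)] for k, v in a.items()]
as_list_b = [[k, str(v)] for k, v in b.items()]
assert as_list_a == as_list_b == [["age", "30"]]  # original type unrecoverable

# A reverse conversion that re-detects numbers restores a correctly,
# but silently turns b's string "30" back into the number 30:
restored = {k: int(v) if v.isdigit() else v for k, v in as_list_b}
print(json.dumps(restored))  # {"age": 30}
```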

There is an actual bug in dealing with single-element lists (as scratchmodification posted about).

The other things being discussed are more subtle issues

another problem with the CSV format:
[image]
(3rd line is missing)

This time, the problem seems to come from the splitting block, because:
[image]
(the result has 3 lines...)
[image]
(2 rows in the list)
the split block always skips the last empty record...
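The pitfall is easy to reproduce outside Snap! too. In Python, for example, `str.split` keeps the empty record created by a trailing separator while `str.splitlines` silently drops it, which is exactly the kind of discrepancy described above:

```python
text = "row1\nrow2\n"  # two records plus a trailing newline

# split on "\n" keeps the final empty record...
print(text.split("\n"))   # ['row1', 'row2', '']
# ...while splitlines() drops it, like Snap!'s split block does:
print(text.splitlines())  # ['row1', 'row2']
```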

Yeah, I was planning to move to CSV. Then I noticed it doesn't work with 3D lists.

I need a format that supports converting any list without problems like this one.

Maybe converting a list to JSON without using ? I see this as a good solution for now.

Oh stop it please! JSON is a key-value format used to communicate with web services. That's why we're using it, not as a generic universal data format.

But if a user puts a 3-D or higher list in a variable, and then clicks "export" on its watcher, Snap! writes a file called "data.json" that contains stuff that looks like JSON but without keys, just values. If you want to say we use JSON only for key-value situations, then we should find a different notation for 3-D lists, e.g., Lisp notation. Or, I don't care, Python notation! It's not about Lisp; it's about having a good export/import notation for deep lists.

On the other hand, as I remember it, we picked JSON because there's external software that knows how to read the text file and create a structured list from it. This approach works perfectly in the case of 2-D lists represented as CSV, which is then readable by spreadsheet software as well as by Snap! itself. Apparently it doesn't work perfectly using JSON for higher-dimensional lists.
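A quick Python sketch of the trade-off being described: CSV round-trips a 2-D table exactly, but has no notation at all for a 3-D list, whereas JSON (values only, no keys) handles the deeper structure fine.

```python
import csv
import io
import json

table = [["a", "b"], ["c", "d"]]          # 2-D list: fits CSV naturally
cube = [[["a"], ["b"]], [["c"], ["d"]]]   # 3-D list: CSV has no notation for it

# CSV round-trips the 2-D table exactly...
buf = io.StringIO()
csv.writer(buf).writerows(table)
assert list(csv.reader(io.StringIO(buf.getvalue()))) == table

# ...while the deeper structure needs JSON (a keyless array-of-arrays):
assert json.loads(json.dumps(cube)) == cube
```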

Snap! "stringifies" lists as arrays based on internal JS type:
[untitled script pic - 2022-02-05T032422.035]
[untitled script pic - 2022-02-05T032649.447]
So why not on demand?

@scratchmodification
Have you got real-world example list structures that fail for you?

What is your use case for converting lists to JSON?

Is it for transferring variable data across scenes using browser database?

Yes.

Converting split blocks, using

Not exactly

Can you provide a link to it then please? :slight_smile:

https://snap.berkeley.edu/snap/snap.html#present:Username=scratchmodification&ProjectName=JSON%20problem

I'm sorry - that bit of code is way above my level of comprehension :frowning:

What is it meant to achieve?

A workaround, with a bit of cheating: add 2 dummy rows at the end of each list in the list before running the block "json of", and remove all of these dummy rows after running the block "split by json".

My tests: (project)

Project is private, again

Sorry (done)

I think your project is good and is trying to patch this error. Well done!

Although, I am planning to use a JSON converter built from blocks (made by me).

Which is pretty much the same as Lisp notation, just with brackets instead of parentheses. (Unless you look at Python tuple notation, in which case it does use parentheses.)

I know. I was just making the point that this isn't me finding an opportunity to push Lisp.

3-D lists...?

It can be used for that, yes, but that's not necessarily the reason.