September 28, 2015 Re: std.data.json formal review
Posted in reply to Marc Schütz

On Tue, 18 Aug 2015 09:05:32 +0000, "Marc Schütz" <schuetzm@gmx.net> wrote:

> Or, as above, leave it to the end user and provide a `to(T)` method that can support built-in types and `BigInt` alike.

You mean the user should write a JSON number parsing routine on their own? Then which part is responsible for validation of JSON constraints? If it is the to!(T) function, then it is code duplication with chances of getting something wrong; if it is the JSON parser, then the number is parsed twice. Besides, there is a lot of code to be shared for every T.

-- Marco
September 29, 2015 Re: std.data.json formal review
Posted in reply to Marco Leise

On Monday, 28 September 2015 at 07:02:35 UTC, Marco Leise wrote:
> On Tue, 18 Aug 2015 09:05:32 +0000, "Marc Schütz" <schuetzm@gmx.net> wrote:
>
>> Or, as above, leave it to the end user and provide a `to(T)` method that can support built-in types and `BigInt` alike.
>
> You mean the user should write a JSON number parsing routine
> on their own? Then which part is responsible for validation of
> JSON constraints? If it is the to!(T) function, then it is
> code duplication with chances of getting something wrong,
> if it is the JSON parser, then the number is parsed twice.
> Besides, there is a lot of code to be shared for every T.
No, the JSON type should just store the raw unparsed token and implement:
struct JSON {
    T to(T)() if (isNumeric!T && is(typeof(T("")))) {
        return T(this.raw);
    }
}
The end user can then call:
auto value = json.to!BigInt;
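For reference, `std.bigint.BigInt` does provide the string constructor this duck-typed approach relies on; a minimal sketch (independent of any JSON library):

```d
import std.bigint : BigInt;

void main()
{
    // BigInt is constructible from a decimal string, which is the
    // capability the proposed duck-typed to!T constraint checks for.
    auto n = BigInt("123456789012345678901234567890");
    assert(n > BigInt(long.max)); // larger than any built-in integral
}
```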
September 29, 2015 Re: std.data.json formal review
Posted in reply to Marc Schütz

On Tuesday, 29 September 2015 at 11:06:03 UTC, Marc Schütz wrote:
> On Monday, 28 September 2015 at 07:02:35 UTC, Marco Leise wrote:
>> On Tue, 18 Aug 2015 09:05:32 +0000, "Marc Schütz" <schuetzm@gmx.net> wrote:
>>
>>> Or, as above, leave it to the end user and provide a `to(T)` method that can support built-in types and `BigInt` alike.
>>
>> You mean the user should write a JSON number parsing routine
>> on their own? Then which part is responsible for validation of
>> JSON constraints? If it is the to!(T) function, then it is
>> code duplication with chances of getting something wrong,
>> if it is the JSON parser, then the number is parsed twice.
>> Besides, there is a lot of code to be shared for every T.
>
> No, the JSON type should just store the raw unparsed token and implement:
>
> struct JSON {
>     T to(T)() if (isNumeric!T && is(typeof(T("")))) {
>         return T(this.raw);
>     }
> }
>
> The end user can then call:
>
> auto value = json.to!BigInt;
I was just speaking to Sönke about another aspect of this. It's not just numbers where this might be the case - dates are also often in a weird format (because the data comes from some ancient mainframe, for example). And similarly for enums, where the field is a string but actually ought to fit in a fixed set of categories.
I forgot the original context to this long thread, so hopefully this point is relevant. It's more relevant for the layer that will go on top, where you want to be able to parse a JSON array or object as a D array/associative array of structs, as you can do in vibe.d currently. But maybe it needs to be considered at the lower level - I forget at this point.
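As a sketch of the enum case (the field value here is made up): `std.conv.to` already maps a string onto a fixed set of enum members and rejects anything outside the set, so a string field from a parsed JSON object can be funneled into a category type without hand-written validation:

```d
import std.conv : to, ConvException;

enum Category { billing, shipping, support }

void main()
{
    // A string field as it would come out of a parsed JSON object.
    string raw = "shipping";
    assert(raw.to!Category == Category.shipping);

    // Values outside the fixed set fail at conversion time.
    try { "unknown".to!Category; assert(false); }
    catch (ConvException) {}
}
```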
September 30, 2015 Re: std.data.json formal review
Posted in reply to Marc Schütz

On Tue, 29 Sep 2015 11:06:01 +0000, Marc Schütz <schuetzm@gmx.net> wrote:

> No, the JSON type should just store the raw unparsed token and implement:
>
> struct JSON {
>     T to(T)() if (isNumeric!T && is(typeof(T("")))) {
>         return T(this.raw);
>     }
> }
>
> The end user can then call:
>
> auto value = json.to!BigInt;

Ah, the duck typing approach of accepting any numeric type constructible from a string. Still: you need to parse the number first to know how long the digit string is that you pass to T's constructor. And then you have two sets of syntaxes for numbers: JSON's and that of T's constructor. T could potentially parse numbers with the system locale's setting for the decimal point, which may be ',' while JSON uses '.', or support hexadecimal numbers, which are also invalid JSON. On the other hand, a constructor for some integral type may not support the exponential notation "2e10", which could legitimately be used by JSON writers (Ruby's, for example, uses the shortest way to store numbers) to save on bandwidth.

-- Marco
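The exponential-notation mismatch is easy to demonstrate with std.conv (a sketch; std.data.json itself is not involved here): the integral string conversion rejects notation that is perfectly valid as a JSON number.

```d
import std.conv : to, ConvException;

void main()
{
    // JSON number syntax allows exponential notation...
    assert("2e10".to!double == 2e10);

    // ...but D's integral string conversion does not accept it, so a
    // T whose parsing works like this would reject valid JSON numbers.
    try { "2e10".to!long; assert(false); }
    catch (ConvException) {}
}
```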
October 02, 2015 Re: std.data.json formal review
Posted in reply to Atila Neves

On Tue, 28 Jul 2015 14:07:18 +0000, "Atila Neves" <atila.neves@gmail.com> wrote:

> Start of the two week process, folks.
>
> Code: https://github.com/s-ludwig/std_data_json
> Docs: http://s-ludwig.github.io/std_data_json/
>
> Atila

There is one thing I noticed today that I personally feel strongly about: serialized double values are not restored accurately. That is, when I send a double value via JSON and use enough digits to represent it accurately, it may not be decoded to the same value. `std.json` does not have this problem with the random values from [0..1) I tested with. I also tried `LexOptions.useBigInt/.useLong` to no avail. Looking at the unittests, it seems the decision was deliberate, as `approxEqual` is used in parsing tests. The JSON spec doesn't enforce any specific accuracy, but you can arrange for lossless transmission of the widely supported IEEE double values by using up to 17 significant digits.

-- Marco
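The 17-digit claim can be checked with a short round-trip sketch using std.format and std.conv (not the std.data.json API):

```d
import std.conv : to;
import std.format : format;

void main()
{
    // 17 significant digits are enough to round-trip any IEEE double.
    double x = 0.1;                  // not exactly representable in binary
    string s = format("%.17g", x);   // e.g. "0.10000000000000001"
    assert(s.to!double == x);        // lossless round trip
}
```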
October 06, 2015 Re: std.data.json formal review
Posted in reply to Atila Neves

JSON is a particular file format useful for serialising hierarchical data.

Given that D also has an XML module which appears to be deprecated, I wonder if it would be better to write a more abstract serialisation/persistence module that could use JSON, XML, some binary format, and future formats.

I would estimate that more than 70% of the time, the JSON data will only be read and written by a single D application, with only occasional inspection by developers etc. In these cases it is undesirable to have code littered with types coming from a particular serialisation file format library. As the software evolves, that file format might become obsolete/slow/unfashionable etc., and it would be much nicer if the format could be changed without a lot of code being touched.

The other 30% of uses will genuinely need raw JSON control when reading/writing files written/read by other software, and this needs to be in Phobos to implement the backends.

It would be better for most people to not write their code in terms of JSON, but in terms of the more abstract concept of persistence/serialisation (whatever you want to call it).
October 06, 2015 Re: std.data.json formal review
Posted in reply to Alex

On 06.10.2015 at 12:05, Alex wrote:

> JSON is a particular file format useful for serialising hierarchical data.
>
> Given that D also has an XML module which appears to be deprecated, I wonder if it would be better to write a more abstract serialisation/persistence module that could use JSON, XML, some binary format, and future formats.
>
> I would estimate that more than 70% of the time, the JSON data will only be read and written by a single D application, with only occasional inspection by developers etc. In these cases it is undesirable to have code littered with types coming from a particular serialisation file format library. As the software evolves, that file format might become obsolete/slow/unfashionable etc., and it would be much nicer if the format could be changed without a lot of code being touched.
>
> The other 30% of uses will genuinely need raw JSON control when reading/writing files written/read by other software, and this needs to be in Phobos to implement the backends.
>
> It would be better for most people to not write their code in terms of JSON, but in terms of the more abstract concept of persistence/serialisation (whatever you want to call it).

A generic serialization framework is definitely needed! Jacob Carlborg once tried to get the Orange[1] serialization library into Phobos, but the amount of requested changes was quite overwhelming and it hasn't worked out so far. There is also a serialization framework in vibe.d[2], but in contrast to Orange it doesn't handle cross references (for pointers/reference types). This is definitely outside the scope of this particular module and will require a separate effort, though the module is intended to be well suited for that purpose.

[1]: https://github.com/jacob-carlborg/orange
[2]: http://vibed.org/api/vibe.data.serialization/
October 06, 2015 Re: std.data.json formal review
Posted in reply to Alex

On Tuesday, 6 October 2015 at 10:05:46 UTC, Alex wrote:
> I wonder if it would be better to write a more abstract serialisation/persistance module that could use either json,xml,some binary format and future formats.
I think there are too many particulars making an abstract (de)serialization module unworkable.
If that wasn't the case it would be easy to transform any format into another, by simply deserializing from format A and serializing to format B. But a little experiment will show you that it requires a lot of complexity for the non-trivial case. And the format's particulars will still show up in your code.
At which point it raises the question: why not just write simple, primitive (de)serialization modules that only do one format? Probably easier to build, maintain and debug.
I am reminded of a binary file format I once wrote which supported referenced objects and had enough meta-data to allow garbage collection. It was a big ugly C++ template monster. Any abstract deserializer is going to stay away from that.
October 06, 2015 Re: std.data.json formal review
Posted in reply to Sebastiaan Koppe

On Tuesday, 6 October 2015 at 15:47:08 UTC, Sebastiaan Koppe wrote:

> At which point it raises the question: why not just write simple, primitive (de)serialization modules that only do one format? Probably easier to build, maintain and debug.

The binary one is the one I care about, so that's the one I wrote: https://github.com/atilaneves/cerealed

I've been thinking of adding other formats. I don't know if it's worth it.

Atila
October 09, 2018 Re: std.data.json formal review
Posted in reply to Atila Neves

On Tuesday, 28 July 2015 at 14:07:19 UTC, Atila Neves wrote:
> Start of the two week process, folks.
>
> Code: https://github.com/s-ludwig/std_data_json
> Docs: http://s-ludwig.github.io/std_data_json/
>
> Atila
Sorry for the late ping, but it's been 3 years - what has happened to this? Has it been forgotten?
Working with JSON in D is still quite painful.
Copyright © 1999-2021 by the D Language Foundation