HipJSON: A high-performance JSON parser with std.json syntax. Used by Redub and Hipreme Engine

Hipreme
October 23

Hey folks! I've decided to release HipJSON because this dependency is being used frequently in both Hipreme Engine and Redub. The main difference from std.json is that it uses highly optimized data structures internally:

  • StringBuffer, so all string allocations are linear
  • D Segmented Hashmap: a hashmap which does not require rehashing, thus making it the fastest solution possible for a JSON parser
  • JSONArray, which makes smart use of stack memory and promotion to dynamic memory on demand (see the sketch below)
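
To make that last point concrete, here is a hypothetical sketch of the "stack first, promote to the heap on demand" pattern. This is not HipJSON's actual JSONArray, just the general idea:

// Hypothetical small-buffer array: values live in a fixed stack buffer until
// it overflows, then everything is promoted to GC memory once.
struct SmallArray(T, size_t stackCapacity = 8)
{
    private T[stackCapacity] stackStorage;
    private T[] heapStorage;   // stays empty until the stack buffer overflows
    private size_t count;

    void push(T value)
    {
        if (heapStorage.length == 0 && count < stackCapacity)
        {
            stackStorage[count++] = value;        // no allocation at all
        }
        else
        {
            if (heapStorage.length == 0)
                heapStorage = stackStorage[].dup; // one-time promotion to the heap
            heapStorage ~= value;
            count++;
        }
    }

    inout(T)[] items() inout
    {
        return heapStorage.length ? heapStorage[0 .. count]
                                  : stackStorage[0 .. count];
    }
}

Short arrays, which are the common case in JSON documents, never touch the GC this way, which is where the savings of such a scheme come from.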

It is the only JSON parser in D that I have seen beat the one built into JavaScript, which I think is a good benchmark, as JSON is super important for the web. Here is a stats comparison with some common JSON libraries:

STD JSON: 336 ms, 836 μs, and 6 hnsecs (50000 Tests)
JSONPIPE: 206 ms and 571 μs (50000 Tests)
MIR JSON: 266 ms, 770 μs, and 7 hnsecs (50000 Tests)
HipJSON: 86 ms, 881 μs, and 8 hnsecs (50000 Tests)

You can test it yourself as the documentation shows.
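
Basic usage is meant to look just like std.json; roughly like this (exact names per the documentation):

import hip.data.json; // HipJSON's std.json-compatible module

void main()
{
    // Parse, index and inspect dynamically, with the same surface API as std.json.
    JSONValue v = parseJSON(`{"name": "redub", "targets": ["linux", "wasm"]}`);
    assert(v["name"].str == "redub");
    assert(v["targets"].array.length == 2);
}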

Against JavaScript:

JS performance of the parseJSON:
Parsed: 50 MB in 0.7036 s
Speed: 71.06 MB/s

HipJSON with dub test -b release-debug --compiler=ldc2:
Parsed: 50 MB in 606 ms
MB per Second: 86.5162
Allocated: 739.969 MB
Free: 68.7608 MB
Used: 739.962 MB
Collection Count: 7
Collection Time: 273 ms, 757 μs, and 5 hnsecs
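
If you want to reproduce the MB/s figure yourself, here is a rough sketch (file name and setup are placeholders, not the actual benchmark harness):

import hip.data.json;
import std.datetime.stopwatch : AutoStart, StopWatch;
import std.file : readText;
import std.stdio : writefln;

void main()
{
    string data = readText("big.json"); // e.g. a ~50 MB document
    auto sw = StopWatch(AutoStart.yes);
    JSONValue v = parseJSON(data);      // wall-clock time of a single full parse
    sw.stop();

    double secs = sw.peek.total!"usecs" / 1e6;
    writefln("Parsed %.1f MB in %.0f ms", data.length / 1e6, secs * 1e3);
    writefln("MB per Second: %.4f", data.length / 1e6 / secs);
}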

I may do a @nogc configuration in the future, but the refactor needed for that is fairly big, and I plan to use nulib as a backend.

Hope you guys enjoy it :)

https://code.dlang.org/packages/hipjson

Dennis
2 days ago

On Thursday, 23 October 2025 at 01:50:55 UTC, Hipreme wrote:

> Hope you guys enjoy it :)
>
> https://code.dlang.org/packages/hipjson

Very nice! I wonder why you chose to follow std.json's interface with dynamic objects, rather than an introspection based parseJson(T)(ref T obj, string jsonData) which fills a struct/class T with values from the json text, ignoring unknown keys.
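
Something along these lines, just as a rough sketch with std.json and compile-time reflection (type handling kept minimal):

import std.json;

void parseJson(T)(ref T obj, string jsonData)
{
    JSONValue root = parseJSON(jsonData);
    static foreach (member; __traits(allMembers, T))
    {
        alias M = typeof(__traits(getMember, obj, member));
        // Unknown JSON keys are ignored; missing members keep their defaults.
        if (auto p = member in root.object)
        {
            static if (is(M == string))
                __traits(getMember, obj, member) = p.str;
            else static if (is(M : long))
                __traits(getMember, obj, member) = cast(M) p.integer;
            else static if (is(M : double))
                __traits(getMember, obj, member) = cast(M) p.floating;
        }
    }
}

struct Config { string name; int jobs; }

unittest
{
    Config c;
    parseJson(c, `{"name": "redub", "jobs": 8, "unknown": true}`);
    assert(c.name == "redub" && c.jobs == 8);
}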

Can this become a std.json replacement in Phobos v3 (or even v2)?

Hipreme
2 days ago

On Thursday, 30 October 2025 at 12:16:36 UTC, Dennis wrote:

> On Thursday, 23 October 2025 at 01:50:55 UTC, Hipreme wrote:
>
>> Hope you guys enjoy it :)
>>
>> https://code.dlang.org/packages/hipjson
>
> Very nice! I wonder why you chose to follow std.json's interface with dynamic objects, rather than an introspection based parseJson(T)(ref T obj, string jsonData) which fills a struct/class T with values from the json text, ignoring unknown keys.
>
> Can this become a std.json replacement in Phobos v3 (or even v2)?

Redub uses a cache system in which the hash of an object becomes a key. This does not translate well to structs as they are "random" keys.

Beyond that, dub also uses platform filters such as "dflags-ldc", where one can make arbitrary key combinations, which is also a problem for a fixed struct. Dynamic objects are way more flexible there.

Another use case where structs don't fit very well is representing a file system, since its contents are dynamic. I use this for checking which files are accessible in my wasm API.

There is another case in which the dynamic type API is kinda important, which is mostly when you do union types, so one would still need a good representation of it:

"directionals": {
        "move": {
            "x": [
                {"keyboard": "a", "gamepad": "dPadLeft", "value": -1},
                {"keyboard": "d", "gamepad": "dPadRight","value": 1},
                {"analog": "left", "axis": "x"}
            ],
            "y": [
                {"keyboard": "w", "gamepad": "dPadUp", "value": -1},
                {"keyboard": "s", "gamepad": "dPadDown", "value": 1},
                {"analog": "left", "axis": "y"}
            ]
        }
    }

For my input API, it also receives arbitrary actions that can be referenced by the player, each holding an array of a union type of either a value binding or an axis, which really complicates the algorithm. I also wanted a std.json API since I was already using it in my projects, so migration is much easier.
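
Handling that union with the dynamic API boils down to checking which keys are present in each entry; roughly like this (written against plain std.json for illustration):

import std.json;

void loadAxis(JSONValue axisBindings)
{
    foreach (binding; axisBindings.array)
    {
        if ("analog" in binding)
        {
            // Analog variant: {"analog": "left", "axis": "x"}
            string stick = binding["analog"].str;
            string axis  = binding["axis"].str;
            // ...register an analog axis source here
        }
        else
        {
            // Digital variant: {"keyboard": "a", "gamepad": "dPadLeft", "value": -1}
            string key    = binding["keyboard"].str;
            string button = binding["gamepad"].str;
            long   value  = binding["value"].integer;
            // ...register a key/button binding with that value here
        }
    }
}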

As a final point, I also don't really like using templated code. It is also very easy to create a deserialization function based on a struct, so if one day I decide to add that API, it will be a zero-effort migration.

Dennis
2 days ago

On Thursday, 30 October 2025 at 13:02:16 UTC, Hipreme wrote:

> Redub uses a cache system in which the hash of an object becomes a key. This does not translate well to structs as they are "random" keys. (...)

So you have several use cases where the keys are not a small set known ahead of time, but constructed dynamically. Makes sense, though now I wonder why you would use JSON for your cache or filesystem when performance is such a concern. Or when it's not a concern, why std.json doesn't suffice.

> There is another case in which the dynamic type API is kinda important, which is mostly when you do union types, so one would still need a good representation of it:

You could define:

struct Input
{
    string keyboard;
    string gamepad;
    string analog;
    string axis;
}

And then check the length of those strings to see if they have a value.

Either way, it's still useful to have a fast JSON parser using dynamic objects, and using std.json's API makes it an easy upgrade.

Ben Jones
2 days ago

On Thursday, 30 October 2025 at 16:40:54 UTC, Dennis wrote:

> On Thursday, 30 October 2025 at 13:02:16 UTC, Hipreme wrote:

I think there's a push to use JSONIOPipe in phobos v3 (https://github.com/schveiguy/jsoniopipe) which has support for serializing/deserializing from structs. Steven, Gaofei (GSOC student) and Inkrementor have been improving it since the summer.

Hipreme
1 day ago

On Thursday, 30 October 2025 at 16:40:54 UTC, Dennis wrote:

> On Thursday, 30 October 2025 at 13:02:16 UTC, Hipreme wrote:
>
>> Redub uses a cache system in which the hash of an object becomes a key. This does not translate well to structs as they are "random" keys. (...)
>
> So you have several use cases where the keys are not a small set known ahead of time, but constructed dynamically. Makes sense, though now I wonder why you would use JSON for your cache or filesystem when performance is such a concern. Or when it's not a concern, why std.json doesn't suffice.
>
>> There is another case in which the dynamic type API is kinda important, which is mostly when you do union types, so one would still need a good representation of it:
>
> You could define:
>
> struct Input
> {
>     string keyboard;
>     string gamepad;
>     string analog;
>     string axis;
> }
>
> And then check the length of those strings to see if they have a value.
>
> Either way, it's still useful to have a fast JSON parser using dynamic objects, and using std.json's API makes it an easy upgrade.

I do plan to also add a streaming API in the near future. I think that with this addition, it will also become a lot easier to add this new struct dump API.

Btw, with HipJSON I can parse at up to 700 MB/s for my cache system. Considering the cache usually takes around 120 KB, that comes out to about 0.17 ms to parse it, so I don't think it is slow. The 80 MB/s I show above is for parsing a pure dictionary, which is the slowest operation you can do for JSON.

Hipreme
15 hours ago

On Thursday, 30 October 2025 at 16:40:54 UTC, Dennis wrote:

> On Thursday, 30 October 2025 at 13:02:16 UTC, Hipreme wrote:
>
>> Redub uses a cache system in which the hash of an object becomes a key. This does not translate well to structs as they are "random" keys. (...)
>
> So you have several use cases where the keys are not a small set known ahead of time, but constructed dynamically. Makes sense, though now I wonder why you would use JSON for your cache or filesystem when performance is such a concern. Or when it's not a concern, why std.json doesn't suffice.
>
>> There is another case in which the dynamic type API is kinda important, which is mostly when you do union types, so one would still need a good representation of it:
>
> You could define:
>
> struct Input
> {
>     string keyboard;
>     string gamepad;
>     string analog;
>     string axis;
> }
>
> And then check the length of those strings to see if they have a value.
>
> Either way, it's still useful to have a fast JSON parser using dynamic objects, and using std.json's API makes it an easy upgrade.

The reasons why std.json didn't suffice:

  • It uses a bunch of Phobos dependencies which I can't use, since I target multiple platforms with Hipreme Engine
  • Redub started using hipjson because I found out it was way faster than std.json, even though that meant I would still use std.conv.to and other Phobos functions
  • I made the project available because I found out it was the fastest implementation for dynamic JSON objects in D
  • After that, I grew a little nerdy about it and made d-segmented-hashmap, which was my way of going even further

Right now, HipJSON has just released v1.0.1, which also supports a streaming API:

import hip.data.json;
import std.exception;
import std.stdio;

JSONValue myJson;
JSONParseState state = JSONParseState.initialize(0);

// Feed the document one character at a time: every chunk except the last
// one reports an incomplete stream.
string doc = `{"hello": "world"}`;
foreach (i; 0 .. doc.length)
{
    auto status = JSONValue.parseStream(myJson, state, doc[i .. i + 1]);
    if (i + 1 < doc.length)
        enforce(status == JSONValue.IncompleteStream);
    else
        enforce(status != JSONValue.IncompleteStream);
}

writeln(myJson); //{"hello" : "world"}
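
In practice you would of course feed chunks bigger than a single character; for example something like this (chunk size and file name are just placeholders):

import hip.data.json;
import std.stdio : File;

void main()
{
    JSONValue doc;
    auto state = JSONParseState.initialize(0);

    // Stream a file through the parser in 64 KB slices.
    foreach (ubyte[] chunk; File("big.json").byChunk(64 * 1024))
    {
        if (JSONValue.parseStream(doc, state, cast(string) chunk.idup) != JSONValue.IncompleteStream)
            break; // document finished; stop feeding chunks
    }
}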

What was more of a POC also wasn't that hard to get done, and along the way I got some bugfixes and even more performance improvements in.