Is there a standard mapping between JSON and Protocol Buffers?

Posted by Daniel Earwicker on Stack Overflow. Published on 2010-03-30.

From a comment on the Protocol Buffers announcement blog post:

Regarding JSON: JSON is structured similarly to Protocol Buffers, but protocol buffer binary format is still smaller and faster to encode. JSON makes a great text encoding for protocol buffers, though -- it's trivial to write an encoder/decoder that converts arbitrary protocol messages to and from JSON, using protobuf reflection. This is a good way to communicate with AJAX apps, since making the user download a full protobuf decoder when they visit your page might be too much.
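For illustration only, here is a minimal sketch of the kind of reflection-based encoder the comment describes, written against the Python google.protobuf package (field-name casing, bytes, and 64-bit integer handling are glossed over, and the enums_as_strings flag is a made-up knob for the point raised below):

```python
import json
from google.protobuf.descriptor import FieldDescriptor

def message_to_dict(msg, enums_as_strings=True):
    """Convert any protobuf message to a plain dict using reflection."""
    out = {}
    for field, value in msg.ListFields():
        if field.label == FieldDescriptor.LABEL_REPEATED:
            out[field.name] = [convert_value(field, v, enums_as_strings) for v in value]
        else:
            out[field.name] = convert_value(field, value, enums_as_strings)
    return out

def convert_value(field, value, enums_as_strings):
    if field.type == FieldDescriptor.TYPE_MESSAGE:
        return message_to_dict(value, enums_as_strings)
    if field.type == FieldDescriptor.TYPE_ENUM and enums_as_strings:
        # Arbitrary choice: emit the symbolic enum name rather than the integer.
        return field.enum_type.values_by_number[value].name
    return value

# json_text = json.dumps(message_to_dict(some_message))
```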

It may be trivial to cook up a mapping, but is there a single "obvious" mapping between the two that any two separate dev teams would naturally settle on? If two products supported PB data and could interoperate because they shared the same .proto spec, I wonder if they would still be able to interoperate if they independently introduced a JSON reflection of the same spec. There are arbitrary decisions to be made, e.g. should enum values be represented by strings (human-readable, à la typical JSON) or by their integer values?
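To make the enum question concrete, here is the same hypothetical field rendered both ways (the Color enum is invented for the example; either rendering is defensible, which is exactly the interoperability risk):

```python
import json

# Hypothetical .proto:  enum Color { RED = 0; GREEN = 1; }  optional Color color = 2;
print(json.dumps({"color": "GREEN"}))  # symbolic name: human-readable, survives renumbering
print(json.dumps({"color": 1}))        # integer value: compact, survives renaming
```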

So is there an established mapping, and are there any open-source implementations for generating JSON encoders/decoders from .proto specs?
