Very large graph definitions, totaling over 64KB, don't fit into the page_props DB table, and thus end up truncated into invalid JSON. We need to switch to an alternative storage mechanism.
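To make the size issue concrete, here is a small Python sketch (illustrative only; the actual extension code is PHP). It builds a hypothetical oversized Vega-style graph spec, shows that its raw JSON overflows a 65,535-byte BLOB column, and that zlib compression brings it back under the limit:

```python
import json
import zlib

BLOB_LIMIT = 65535  # size of the MySQL BLOB column backing page_props values

# Hypothetical oversized graph spec: a Vega-style definition with many data points.
spec = {"data": [{"name": "points",
                  "values": [{"x": i, "y": i * i} for i in range(4000)]}]}

raw = json.dumps(spec).encode("utf-8")
compressed = zlib.compress(raw, 9)

# The raw JSON overflows the column and would be truncated on save;
# the compressed form fits comfortably.
print(len(raw) > BLOB_LIMIT)
print(len(compressed) <= BLOB_LIMIT)
```

Highly repetitive JSON structure compresses well, which is why gzip is an attractive stopgap here.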
Description
Details
Subject | Repo | Branch | Lines +/-
---|---|---|---
Compress graph_specs page property | mediawiki/extensions/Graph | master | +13 -3
Related Objects
- Mentioned In
  - T124840: Section edit preview doesn't let you preview references defined outside the section being previewed
  - T105898: Allow large page props data to be transparently compressed with gzip in storage
- Mentioned Here
  - T53740: TemplateData: page_props limits value length to 65535 bytes (MySQL 'blob' field)
Event Timeline
Note that TemplateData also uses page_props. It uses an edit hook to prevent oversized blobs from being saved in the first place. This doesn't solve the size problem, but it at least ensures data integrity.
It also compresses the JSON. See T53740 for discussion and patches. There were numerous complaints about limited length before, not a single one afterwards. :)
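A minimal sketch of that TemplateData-style approach, in Python for illustration (the real check lives in a PHP edit hook; `validate_and_pack` and `PP_VALUE_LIMIT` are hypothetical names): compress the JSON at save time, and reject the edit outright if even the compressed form won't fit, so truncated garbage never reaches the table.

```python
import json
import zlib

PP_VALUE_LIMIT = 65535  # size of the page_props value BLOB column

def validate_and_pack(spec: dict) -> bytes:
    """Compress the spec's JSON; refuse to save anything that still
    exceeds the column size, preserving data integrity."""
    packed = zlib.compress(json.dumps(spec).encode("utf-8"), 9)
    if len(packed) > PP_VALUE_LIMIT:
        raise ValueError("graph spec too large to store, even compressed")
    return packed

# A small spec packs easily; a pathologically incompressible one is rejected.
small = validate_and_pack({"width": 400, "height": 300})
print(len(small) <= PP_VALUE_LIMIT)
```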
Lemme do a quick pass on compression and if we do something fancier than that, great :D
OK, checked in with Yuri; it all makes sense now. ;) Graphoid fetches this via the API, so that needs tweaking.
OK, an intermediate fix could be something like:
- gzip the graph_specs page props data
- add an API query page-props extension to provide the decompressed graph specs
- tweak graphoid to use that instead of raw page_props
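The three steps above amount to a compress-on-write, decompress-on-serve round trip. A Python sketch under assumed names (`store_graph_specs` standing in for the save-time hook, `api_graph_specs` for the API module Graphoid would call):

```python
import json
import zlib

def store_graph_specs(specs: dict) -> bytes:
    """Step 1: the value written to the graph_specs page prop is
    gzip/zlib-compressed JSON instead of raw JSON."""
    return zlib.compress(json.dumps(specs).encode("utf-8"), 9)

def api_graph_specs(pp_value: bytes) -> dict:
    """Steps 2-3: the API module decompresses before serving, so
    Graphoid never has to understand the compressed blob."""
    return json.loads(zlib.decompress(pp_value).decode("utf-8"))

# Round trip: what Graphoid receives equals what the parser stored.
specs = {"abc123": {"width": 200, "marks": []}}
print(api_graph_specs(store_graph_specs(specs)) == specs)
```

The point of step 2 is that the compression becomes a storage-layer detail: any consumer going through the API is unaffected when the on-disk format changes.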
Proper fix for this is to do the data transformations on the MediaWiki side and bundle everything into a data blob indexed by (a fuller) hash value, but that's farther out.
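For the hash-indexed blob idea, the key property is that the index is derived from the content itself, so equal specs always map to the same key. A hypothetical sketch (all names illustrative, not the eventual MediaWiki implementation):

```python
import hashlib
import json

def blob_key(spec: dict) -> str:
    """Index a fully-transformed graph data blob by the hash of its
    canonical JSON serialization (a 'fuller' hash than a short prefix)."""
    canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Key order doesn't matter: canonicalization makes the key deterministic.
store = {}
spec = {"width": 200, "marks": []}
store[blob_key(spec)] = spec
print(blob_key({"marks": [], "width": 200}) in store)
```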
Per discussions with Yuri and JamesF, we prefer to do this the right way. :) Removing the dependency on the compression bit.
Change 255914 had a related patch set uploaded (by Yurik):
(WAIT) Compress graph_specs page property
In the above patch I added the compression, since there is now an API to get the data from. Waiting for the API to ride the train first, then we'll update the Graphoid service.