Are there any plans to serve compressed API responses? I’ve tested some pages, and they consistently showed a 5-10x reduction in size when gzipped, which could translate into significant load-time improvements for API apps that fetch many pages.
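For illustration, here’s a quick sketch of that kind of ratio on a synthetic payload (the record shape is made up, but repetitive JSON keys are exactly why API responses compress so well):

```python
import gzip
import json

# Hypothetical payload resembling a paginated API response:
# the same keys repeat in every record, so gzip does very well.
records = [
    {"id": i, "username": f"player{i}", "rank": "12k", "won": i % 2 == 0}
    for i in range(500)
]
raw = json.dumps(records).encode("utf-8")
compressed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes, "
      f"ratio: {len(raw) / len(compressed):.1f}x")
```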
And higher system load to perform the compression.
EDIT: Not trying to shoot it down outright, just pointing out that bandwidth vs system load is a tradeoff to consider.
I agree absolutely. I think some things, such as ongoing games, chat, and anything else that’s low-bandwidth, wouldn’t benefit much from gzipping, unless there’s enough spare CPU time to compress all network responses.
On the other hand, mostly static content, such as old game records and SGFs, could benefit a great deal, especially if it’s compressed once with a tool like 7zip (which achieves better ratios even when producing the gzip format) and then stored as .gz. Even libbaduk and the main CSS for the main page (not the API) could be gzipped this way.
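To make the compress-once idea concrete, here’s a rough sketch (the filenames are hypothetical): you compress the static file once at maximum level, keep the .gz next to the original, and a server can then send those bytes as-is with `Content-Encoding: gzip`, skipping per-request compression entirely.

```python
import gzip
from pathlib import Path

# Hypothetical static file: an old game record that never changes.
sgf = Path("game_12345.sgf")
sgf.write_text("(;GM[1]FF[4]SZ[19];B[pd];W[dp];B[pq];W[dd])")

# Compress once at maximum effort and store the result alongside it.
gz_path = sgf.with_suffix(".sgf.gz")
gz_path.write_bytes(gzip.compress(sgf.read_bytes(), compresslevel=9))

# A server would then serve gz_path's bytes directly with
# "Content-Encoding: gzip" instead of compressing on every request.
```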
One downside is that the API would no longer be able to sort and/or filter the returned data without major overhead (decompress, then sort, then recompress; unless it’s gzipped live, of course), and it would be better to store the data in bigger pages, which would break apps until they’re updated.
Oh, yeah, I’m totally opposed to that idea. Sorry. I thought you meant following the network standard of over-the-wire compression.
I don’t think this is using enough bandwidth that the devs would be likely to take on the added effort, especially when it would cause issues for everything using the software. Adding 7zip would be a little ridiculous.
7zip can produce the same gzip format, which can be served exactly like live gzip/deflate output. The only difference is that you compress once instead of on every request, though obviously the data has to be static for that to work. Per-request compression is the right choice if you need to sort or filter the data, or have variable page sizes.
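The key point is that both approaches emit ordinary gzip streams, so the client can’t tell them apart. A small sketch (the sample data is made up; compression levels stand in for “offline tool” vs “live server”):

```python
import gzip

data = b'{"moves": "pd dp pq dd"}' * 100

# One-time, maximum-effort compression, as an offline tool would do:
stored = gzip.compress(data, compresslevel=9)

# Live per-request compression typically favors speed over ratio:
live = gzip.compress(data, compresslevel=1)

# Both are valid gzip streams; any client that accepts
# "Content-Encoding: gzip" decodes either one identically.
assert gzip.decompress(stored) == gzip.decompress(live) == data
print(len(stored), len(live))
```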