https://bitbucket.org/wolfpld/etcpak/wiki/Home
It's the fastest open source ETC1 encoder that I'm aware of.
Notice the lack of any PSNR/MSE/SSIM statistics anywhere (that I can see). Also, the developer doesn't seem to get that the other tools/libraries he compares his stuff against were optimized for quality, not raw speed. In particular, rg_etc1 (and crunch's ETC1 support) was tuned to compete against the reference encoder along both the quality and perf. axes.
Anyhow, there are some interesting things to learn from etcpak:
- Best quality doesn't always matter. It obviously depends on your use case. If you have 10 gigs of textures to compress then iteration speed can be very important.
- The value spectrum spans from highest quality/slow encode (to ship final assets) to crap quality/fast as hell encode (favoring iteration speed).
- Visually, the ETC1/2 formats are nicely forgiving. Even a low quality ETC1 encoder produces decent enough looking output for many use cases.
Hi Rich! Thanks for the mention.
ReplyDeleteI "do get" the perf vs quality optimization problem. Each test was performed with the fast quality setting (as can be seen in command line column at http://zgredowo.blogspot.com/2013/06/fastest-etc-compressor-on-planet.html). And I expect "fast" to be actually fast, but nope. It's not even that hard in some cases, eg. when I was testing PVRTexTool, it wouldn't parallelize compression of ETC blocks.
As for the interesting things, here's another one. We are using etcpak in a production environment for a game with more than 5 million installs. It seems to be good enough, as no one is complaining about low quality of graphics.
Hi Bartosz,
>> as no one is complaining about low quality of graphics.
I'm very happy to hear that. etcpak is the existence proof that it is not always necessary to spend insane amounts of CPU time optimizing these formats to be as high quality as possible. It's like the programmers who write these encoders have an insatiable quality fetish.
There is clearly a point of diminishing returns at work here. Is it worth spending 10x or 100x more CPU time compressing a texture? A graph of CPU time invested vs. actual value gained would be interesting to look at.
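Gathering the data for that graph is cheap. A minimal sketch of the measurement loop is below; the encode callback is a stand-in for whatever encoder is under test (not any particular library's API), and pairing each row with a quality metric gives you both axes.

```cpp
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <functional>
#include <vector>

// Sketch: time a caller-supplied encoder at each quality level and print one
// CSV row per level. Pair each row with a quality metric to get both axes of
// the "CPU time invested vs. value gained" graph.
void benchmark_encoder(const std::vector<uint8_t>& rgba,
                       const std::function<void(const std::vector<uint8_t>&, int)>& encode,
                       int max_quality)
{
    std::printf("quality,seconds\n");
    for (int quality = 0; quality <= max_quality; ++quality)
    {
        auto t0 = std::chrono::steady_clock::now();
        encode(rgba, quality);                      // the encoder under test
        auto t1 = std::chrono::steady_clock::now();
        std::printf("%d,%f\n", quality,
                    std::chrono::duration<double>(t1 - t0).count());
    }
}
```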
Here's a visual comparison of etcpak vs other encoders:
http://i.imgur.com/FxlmUOF.png
It also shows how image quality metrics fare against the actual image quality and why I don't care about including them.
Thanks! That's interesting.
While working on crunch, and my previous texture compression efforts, I found image quality metrics absolutely essential. I was totally in the weeds until I switched to conducting experiments and studying the results both visually and through metrics.
PSNR obviously has issues, but generally I found the higher I pushed my DXT endpoint optimizer's PSNR, the better it looked visually too.
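For reference, the metric itself is tiny. Here's a minimal PSNR over 8-bit channel data, the plain textbook version with no luma weighting, gamma handling, or other perceptual refinements (a sketch, not what crunch or rg_etc1 actually use internally):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Plain PSNR over 8-bit samples: mean squared error, then 10*log10(255^2 / MSE).
// Identical buffers return +infinity; higher is better.
double psnr(const std::vector<uint8_t>& a, const std::vector<uint8_t>& b)
{
    assert(a.size() == b.size() && !a.empty());
    double mse = 0.0;
    for (size_t i = 0; i < a.size(); ++i)
    {
        const double d = double(a[i]) - double(b[i]);
        mse += d * d;
    }
    mse /= double(a.size());
    return (mse == 0.0) ? INFINITY : 10.0 * std::log10(255.0 * 255.0 / mse);
}
```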
Also, you should include some sort of image quality metric in your unit tests, just in case someone breaks your encoder in some subtle way by accident. Visually, the results could be hard to notice, and the perf. could be identical, but quality could be impacted. You need these metrics somewhere, at least for development. I wouldn't trust a codec library that doesn't do something like this to check for breakage over time.
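As a concrete sketch of that kind of check (hypothetical names; the psnr() helper is the one above, and the floor is just a number captured from a known-good build):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

double psnr(const std::vector<uint8_t>& a, const std::vector<uint8_t>& b); // from the snippet above

// Hypothetical regression test: round-trip a fixed test image through the
// encoder/decoder and fail if PSNR drops below a floor recorded from a
// known-good build. Catches subtle quality breakage that perf. tests miss.
bool quality_regression_ok(const std::vector<uint8_t>& original,
                           const std::vector<uint8_t>& roundtripped,
                           double recorded_floor_db)
{
    const double db = psnr(original, roundtripped);
    if (db + 0.01 < recorded_floor_db)  // tiny tolerance for FP noise
    {
        std::fprintf(stderr, "quality regression: %.2f dB < %.2f dB floor\n",
                     db, recorded_floor_db);
        return false;
    }
    return true;
}
```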
I need bit-exact results, so I check for that during development by comparing md5 sums. Breakages are well documented (e.g. precision improvements in 0.4, or AVX2 vs. scalar implementation for planar blocks in 0.5).
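That kind of bit-exactness check needs very little machinery. Here's a minimal sketch of the idea, comparing fresh output against a golden dump byte-for-byte instead of hashing it (the golden-file path and harness are made up for illustration, not etcpak's actual test code):

```cpp
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>

// Sketch of a bit-exactness check: compare freshly encoded blocks against a
// golden dump written by a known-good build. Any differing byte is a breakage;
// hashing the buffer (md5 etc.) is just a more compact way to store the same check.
bool matches_golden(const std::vector<uint8_t>& encoded, const char* golden_path)
{
    std::ifstream f(golden_path, std::ios::binary);
    std::vector<uint8_t> golden((std::istreambuf_iterator<char>(f)),
                                std::istreambuf_iterator<char>());
    if (golden != encoded)
    {
        std::fprintf(stderr, "output differs from %s (%zu vs %zu bytes)\n",
                     golden_path, golden.size(), encoded.size());
        return false;
    }
    return true;
}
```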