In this experiment, I've worked on reducing the size of the lookup table used to quickly convert a subset of ETC1 texture data (using only a single 5:5:5 base color, one 3-bit intensity table index, and 2-bit selectors) directly to DXT1 texture data. The ETC1 encoder can now optimize for both formats simultaneously, which is what lets me shrink the conversion table. To accomplish this, I modified the ETC1 base color/intensity optimizer function so it also factors the DXT1 block encoding error into each trial's computed ETC1 error.
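As a rough sketch, the modified trial scoring might look like the following. The function and parameter names are hypothetical (the post doesn't show the optimizer internals); only the weighting itself comes from the experiment:

```cpp
#include <cstdint>

// Hypothetical sketch of the combined trial error used by the modified
// ETC1 base color/intensity optimizer. etc_error is the ETC1 reconstruction
// error for this trial; dxt_error is the precomputed DXT1 encoding error
// looked up from the ETC1->DXT1 conversion table.
inline uint64_t combined_trial_error(uint64_t etc_error, uint64_t dxt_error)
{
    // Weighting from the experiment: ETC1 error dominates (x16), while the
    // DXT1 error acts as a tie-breaker that steers the optimizer toward
    // base color/intensity choices that also encode well as DXT1.
    return etc_error * 16 + dxt_error;
}
```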
The overall trial error reported back to the encoder in this experiment was etc_error*16+dxt_error. The ETC1->DXT1 lookup table is now 3.75MB, with precomputed DXT1 low/high endpoints for the three selector ranges used: 0-3, 0-2, 1-3. My previous experiment had 10 precomputed ranges, which seemed impractically large. I'm not sure yet which set of ranges is actually needed or optimal. Even just one (0-3) seems to work OK, but with more artifacts on very high contrast blocks.
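For a sense of the table's shape: 32,768 base colors (5:5:5) times 8 intensity tables times 3 selector ranges gives 786,432 entries, which at 3.75MB works out to 5 bytes per entry. The entry layout below is an assumption consistent with that arithmetic (two 16-bit DXT1 endpoints plus an error byte), not the actual on-disk format:

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical layout for the ETC1->DXT1 lookup table described above.
// 786,432 entries * 5 bytes = 3,932,160 bytes = 3.75MB, matching the
// table size in the post; the field layout itself is a guess.
#pragma pack(push, 1)
struct etc1_to_dxt1_entry
{
    uint16_t low_endpoint;   // precomputed DXT1 5:6:5 low color
    uint16_t high_endpoint;  // precomputed DXT1 5:6:5 high color
    uint8_t  error;          // quantized conversion error for this entry
};
#pragma pack(pop)

// The three precomputed selector ranges: 0-3, 0-2, 1-3.
enum selector_range { RANGE_0_3, RANGE_0_2, RANGE_1_3, NUM_RANGES };

// Flat index into the table: [base_color_555][intensity_table][range].
inline size_t table_index(uint32_t base_color_555, uint32_t intensity_table,
                          selector_range range)
{
    return ((size_t)base_color_555 * 8 + intensity_table) * NUM_RANGES
           + (size_t)range;
}
```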
Anyhow, here's kodim18.