Wednesday, April 24, 2019

Direct conversion of ETC1 to DXT1 texture data: 4th experiment (originally published 9/11/16)

In this experiment, I've worked on reducing the size of the lookup table used to quickly convert a subset of ETC1 texture data (using only a single 5:5:5 base color, one 3-bit intensity table index, and 2-bit selectors) directly to DXT1 texture data. The ETC1 encoder can now optimize for both formats simultaneously, which is what allows the conversion table to be smaller. To accomplish this, I've modified the ETC1 base color/intensity optimizer function so it also factors the DXT1 block encoding error into each trial's computed ETC1 error.
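
In rough C++ (illustrative names and helpers, not the actual encoder code), each trial now looks something like this:

// Sketch of a combined trial evaluation (illustrative names, not the real encoder code).
#include <cstdint>
#include <algorithm>

// Standard ETC1 intensity modifier tables (8 tables of 4 modifiers each).
static const int g_etc1_modifiers[8][4] =
{
    {   -8,  -2,  2,   8 }, {  -17,  -5,  5,  17 }, {  -29,  -9,  9,  29 }, {  -42, -13, 13,  42 },
    {  -60, -18, 18,  60 }, {  -80, -24, 24,  80 }, { -106, -33, 33, 106 }, { -183, -47, 47, 183 },
};

static int expand5(int v)  { return (v << 3) | (v >> 2); }             // 5-bit -> 8-bit
static int clamp255(int v) { return std::min(255, std::max(0, v)); }

// Best-case ETC1 error for a 4x4 block encoded with a single 5:5:5 base color
// and one intensity table (the "ETC1 subset" this series is restricted to).
static uint64_t etc1_subset_error(const uint8_t pixels[16][3], const int base555[3], uint32_t inten)
{
    const int base[3] = { expand5(base555[0]), expand5(base555[1]), expand5(base555[2]) };
    uint64_t total = 0;
    for (int p = 0; p < 16; p++)
    {
        uint64_t best = UINT64_MAX;
        for (int s = 0; s < 4; s++)  // pick the best of the 4 selectors per pixel
        {
            const int m  = g_etc1_modifiers[inten][s];
            const int dr = pixels[p][0] - clamp255(base[0] + m);
            const int dg = pixels[p][1] - clamp255(base[1] + m);
            const int db = pixels[p][2] - clamp255(base[2] + m);
            best = std::min<uint64_t>(best, uint64_t(dr * dr + dg * dg + db * db));
        }
        total += best;
    }
    return total;
}

// Each trial now reports a combined error back to the optimizer. The DXT1 error
// term comes from the precomputed endpoints in the lookup table (see below).
uint64_t combined_trial_error(const uint8_t pixels[16][3], const int base555[3],
                              uint32_t inten, uint64_t dxt_error)
{
    return etc1_subset_error(pixels, base555, inten) * 16 + dxt_error;
}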

The overall trial error reported back to the encoder in this experiment was etc_error*16 + dxt_error. The ETC1->DXT1 lookup table is now 3.75MB, with precomputed DXT1 low/high endpoints for the three selector ranges actually used: 0-3, 0-2, and 1-3. My previous experiment precomputed 10 ranges, which seemed impractically large. I'm not yet sure which set of ranges is actually needed or optimal. Even a single range (0-3) seems to work OK, but with more artifacts on very high-contrast blocks.
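
The exact entry format isn't important here, but conceptually each table entry and its DXT1 error evaluation look something like the sketch below (a hypothetical 5-byte entry over 32768 base colors x 8 intensity tables x 3 ranges works out to 3.75MB):

// Illustrative table entry and DXT1 error evaluation - the layout is an assumption.
#include <cstdint>
#include <algorithm>

enum selector_range { RANGE_0_3, RANGE_0_2, RANGE_1_3, NUM_SELECTOR_RANGES };

#pragma pack(push, 1)
struct etc1_to_dxt1_entry
{
    uint16_t low565;   // precomputed DXT1 low endpoint
    uint16_t high565;  // precomputed DXT1 high endpoint
    uint8_t  err;      // quantized approximation error for this entry
};
#pragma pack(pop)
// 32768 base colors * 8 intensity tables * 3 ranges * 5 bytes = 3.75MB

// Assumed indexing: [base 5:5:5 color][intensity table][selector range]
static const etc1_to_dxt1_entry &lookup_entry(const etc1_to_dxt1_entry *table,
                                              uint32_t base555, uint32_t inten, selector_range range)
{
    return table[(base555 * 8 + inten) * NUM_SELECTOR_RANGES + range];
}

// Expand a 5:6:5 endpoint to 8-bit RGB.
static void unpack565(uint16_t c, int rgb[3])
{
    rgb[0] = ((c >> 11) & 31) * 255 / 31;
    rgb[1] = ((c >>  5) & 63) * 255 / 63;
    rgb[2] = ( c        & 31) * 255 / 31;
}

// Squared error of a 4x4 block against the 4-color DXT1 palette defined by the
// entry's endpoints (assumes the endpoints are ordered for 4-color mode).
uint64_t dxt1_block_error(const uint8_t pixels[16][3], const etc1_to_dxt1_entry &e)
{
    int c0[3], c1[3], pal[4][3];
    unpack565(e.low565, c0);
    unpack565(e.high565, c1);
    for (int i = 0; i < 3; i++)
    {
        pal[0][i] = c0[i];
        pal[1][i] = c1[i];
        pal[2][i] = (c0[i] * 2 + c1[i]) / 3;
        pal[3][i] = (c0[i] + c1[i] * 2) / 3;
    }

    uint64_t total = 0;
    for (int p = 0; p < 16; p++)
    {
        uint64_t best = UINT64_MAX;
        for (int s = 0; s < 4; s++)
        {
            const int dr = pixels[p][0] - pal[s][0];
            const int dg = pixels[p][1] - pal[s][1];
            const int db = pixels[p][2] - pal[s][2];
            best = std::min<uint64_t>(best, uint64_t(dr * dr + dg * dg + db * db));
        }
        total += best;
    }
    return total;
}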

Anyhow, here's kodim18.

ETC1 subset:


Max:  80, Mean: 3.809, MSE: 30.663, RMSE: 5.537, PSNR: 33.265

DXT1:


Max:  76, Mean: 3.952, MSE: 32.806, RMSE: 5.728, PSNR: 32.971

ETC1 block selector range usage histogram:
0-3: 19161
1-3: 3012
0-2: 2403
