TinyImageFormat, Tiny KTX and Tiny DDS
I currently have three ‘tiny’ libraries on GitHub, all MIT licensed, that hopefully fill a gap in the open-source community.
Together they provide a fairly robust texture loading and saving solution for any C/C++ project.
TinyKTX and TinyDDS are loader/savers for the texture file formats of the same names. TinyImageFormat handles much of the work of actually decoding and encoding the image data itself.
KTX is a fairly well-thought-out format, with its biggest issues mostly being its GLisms. Version 2 is in the works (some comments on it at the end of this post) and fixes most, if not all, of those issues.
The weirdest part of KTX is GL_UNPACK_ALIGNMENT = 4, which means very small images have padding. It makes decoding/encoding KTX just a little bit harder and the code a little messier, for really no gain except compatibility with GL.
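To make the rule concrete: each row of uncompressed pixel data in a KTX v1 file is padded out to a 4-byte boundary, so a loader has to compute the padded row size rather than assuming tight packing. A minimal sketch (function name is mine, not from TinyKTX):

```c
#include <stdint.h>

/* KTX v1 pads each row of uncompressed data to a 4-byte boundary,
   mirroring GL_UNPACK_ALIGNMENT = 4. This is the row size a loader
   must actually step by, not width * bytesPerPixel. */
static uint32_t ktxPaddedRowSize(uint32_t width, uint32_t bytesPerPixel) {
    uint32_t unpadded = width * bytesPerPixel;
    return (unpadded + 3u) & ~3u; /* round up to a multiple of 4 */
}
```

So a 1×1 RGB8 mip level is 3 bytes of real data but occupies 4 bytes in the file, which is exactly the small-mipmap case tools get wrong.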
Interestingly enough, PVRTexTool, which is often used as the de facto test loader for KTX, gets the GL_UNPACK_ALIGNMENT rule wrong and is broken for small mipmaps. Other than that, it's a handy tool and helped verify much of my loader/saver.
The use of GL-style image formats is understandable given the era and platform KTX was originally designed for, but it makes life a pain. TinyKTX mainly uses an enum-style format based on Vulkan's VkFormat, but does allow access to the GL-style image format if you want it.
DDS is a funny old format, primarily because it's not really a file format at all.
Historically it evolved from the binary data passed to Direct Draw for surfaces (hence DDS). DDS was designed to support any pixel format in use around 20 years ago. Originally this meant 2D formats such as palettes and various bit encodings up to 32 bits per pixel. Then it became more 3D focused, with some weird formats to support strange hardware (like early bump mapping), before finally becoming a more proper on-disk format.
The original DDS was a copy of the in-memory Direct Draw structure passed to/from the API. It's largely based around a structure and some flags, which could describe basically any pixel format of up to 32 bits across 4 channels of normalised integers. Additionally, for formats that didn't fit into this structure, a FourCC (a 4-byte unique ID) value could uniquely describe special cases.
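That descriptor survives as the pixel-format block inside the DDS header, as documented in Microsoft's DDS programming guide. The masks-and-flags layout is what lets it express almost any ≤32-bit integer format:

```c
#include <stdint.h>

/* The DDS pixel-format block, per Microsoft's DDS documentation.
   Flags say which fields are valid; the bit masks carve channels
   out of a pixel of up to dwRGBBitCount (<= 32) bits. */
typedef struct {
    uint32_t dwSize;        /* always 32 */
    uint32_t dwFlags;       /* e.g. DDPF_RGB, DDPF_ALPHAPIXELS, DDPF_FOURCC */
    uint32_t dwFourCC;      /* special-case ID when DDPF_FOURCC is set */
    uint32_t dwRGBBitCount; /* total bits per pixel */
    uint32_t dwRBitMask;    /* e.g. 0x00ff0000 for B8G8R8A8 */
    uint32_t dwGBitMask;    /* e.g. 0x0000ff00 */
    uint32_t dwBBitMask;    /* e.g. 0x000000ff */
    uint32_t dwABitMask;    /* e.g. 0xff000000 */
} DDS_PIXELFORMAT;
```

A loader has to pattern-match these mask combinations back to a modern format enum, which is a large part of what makes DDS loading fiddly.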
As D3D got more complex, things got weird. Direct Draw was killed as an API and, instead of the structure previously used to describe surfaces, an enumeration type was used (D3DFORMAT). Rather than create a new file format, a hack was applied to DDS files to store image formats that Direct Draw couldn't express (like multi-channel float formats): D3DFORMAT values were placed directly in the FourCC field. This was pretty safe, as FourCCs are made of four 8-bit ASCII characters, so low numbers like D3DFORMAT values would never appear as a valid FourCC.
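The disambiguation a loader performs can be sketched like this (the threshold is my heuristic, not a rule from any spec; printable ASCII guarantees a real FourCC is a large number):

```c
#include <stdint.h>

/* Pack four ASCII characters into a little-endian FourCC value. */
#define MAKE_FOURCC(a, b, c, d) \
    ((uint32_t)(a) | ((uint32_t)(b) << 8) | \
     ((uint32_t)(c) << 16) | ((uint32_t)(d) << 24))

/* A genuine FourCC has four printable ASCII bytes, so its numeric
   value is large. D3DFORMAT enum values are small integers (e.g.
   D3DFMT_A16B16G16R16F == 113), so a small value in the FourCC
   field can be treated as a D3DFORMAT instead. */
static int fourCCIsD3DFormat(uint32_t fourCC) {
    return fourCC < 0x100; /* heuristic threshold, my assumption */
}
```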
The next big change occurred with D3D10 (the D3DFORMAT scheme only existed during D3D9's lifetime). Here a FourCC code of ‘DX10’ works as a version marker, indicating that additional data is stored in the DDS, with support for texture arrays and the use of DXGI_FORMAT values for the image format. So if you encounter the ‘DX10’ FourCC value, you read an extra structure which gives you explicit formats and support for things like texture arrays.
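That extra structure is small and, unlike the old mask matching, gives the format explicitly. Its layout, as documented by Microsoft:

```c
#include <stdint.h>

/* The DX10 extension header that follows the legacy DDS header when
   the FourCC is 'DX10', per Microsoft's DDS documentation. */
typedef struct {
    uint32_t dxgiFormat;        /* a DXGI_FORMAT value, stated explicitly */
    uint32_t resourceDimension; /* 2 = 1D, 3 = 2D, 4 = 3D texture */
    uint32_t miscFlag;          /* 0x4 marks a cubemap */
    uint32_t arraySize;         /* number of array slices */
    uint32_t miscFlags2;        /* alpha mode in later revisions */
} DDS_HEADER_DXT10;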
However, it's worth noting that, in general, if the format was expressible with the existing Direct Draw pixel structure, a Direct Draw Surface header could be synthesised and saved to stay compatible with older loaders.
There is a small revision (which TinyDDS doesn't support yet) related to alpha (whether it is pre-multiplied) in DirectXTex (the reference implementation for the latest version of DDS). I'm not sure how widely this is used, but I will support it eventually.
To finally add to the confusion, there are many files saved with incorrect data. The most famous example is probably that DirectX itself screwed up saving R10G10B10A2 format files, not once but twice! Additionally, I've found faulty cubemaps that I think were saved with the DevIL library, and there are likely many others.
All of which is meant to point out that loading and saving DDS reliably is troublesome, especially given that the original Direct Draw Surface descriptor can encode many different formats, but likely only a handful were ever actually used.
TinyDDS is still a work in progress, IMHO. It works, and likely loads and saves every DDS file you will find unless you're an obscure hoarder of exotic files (if you are, send me your DDS files!). However, I know it has enough faults with some of the more exotic formats that I can't say it's done yet.
In particular, it won't yet load palette files or any non-texture-compression FourCC formats.
If you've read this far, you've probably noticed that the common hard part of loading formats like KTX and DDS is pixel formats. TinyImageFormat is the solution to that. It's a generalised image-format library that is fast and easy to use and handles most of the cases real-time graphics will encounter. As I write, it has just over 200 formats in a single enumeration, and it has query functions that provide useful information about any format you pass in.
Additionally, for a subset of formats, it can directly encode and decode (to/from floats) data stored in that format.
There's a lot to talk about regarding the implementation of TinyImageFormat, but the key point is that it can convert much of the data loaded by TinyKTX or TinyDDS into whatever format you want to use with the GPU. Currently that's just packed pixel formats, but the plan is for it to be able to decode compressed and some video formats as well in future.
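To illustrate the kind of conversion involved (this is my own sketch of one case, not TinyImageFormat's actual API): a packed-format decoder takes raw bytes in a known layout and produces normalised floats, and the library generalises that across its whole enumeration.

```c
#include <stdint.h>

/* Illustrative only: decoding one R8G8B8A8_UNORM pixel to four
   floats in [0, 1]. TinyImageFormat does this kind of conversion
   generically for a couple of hundred packed formats. */
static void decodeR8G8B8A8Unorm(const uint8_t *src, float out[4]) {
    for (int c = 0; c < 4; ++c)
        out[c] = (float)src[c] / 255.0f; /* normalise 0..255 to 0..1 */
}
```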
It's never likely to be able to encode directly into compressed formats, simply because there's no cheap and easy way to do so, but it will support writing pre-encoded blocks.
The biggest missing feature, IMHO, is decoding compressed-format blocks (you can use compressed formats, but it won't decode them currently). That is coming soon, once I've verified and tested all the existing formats.
Thoughts on the Upcoming KTX 2
It's not ratified yet, but it has the potential to be the texture format of the future.
Gone is the GL-style image format, replaced with an enumeration based on Vulkan's. To support the many formats that aren't in Vulkan, it requires each image to have a DFD, a generalised descriptor of pixels defined by Khronos.
The problem is that DFDs are ridiculously complex to handle, as they support almost every image format possible, not just the ones games use. I'm not yet sure how I will handle them. I suspect I will keep a table of pre-formatted DFDs, one for each TinyKTX format, and save those on export, and simply refuse to load any texture that doesn't have an enumeration-based image format (KTX 2, whilst Vulkan-based, has extensions to additionally store DXGI and Metal pixel-format enumerations, which should cover most of the ground) and isn't one of my ‘blessed’ DFD formats that are simple to understand.
The supercompression in KTX 2 is a good feature, but it does increase the complexity of the TinyKTX library quite a bit. I'm leaning towards not shipping TinyKTX with decoders in the single header, but instead allowing the user to pass in decompressors. This gives the flexibility of using their own libraries rather than having yet another copy of zlib. Obviously I'll provide examples that use the common open-source implementations, so it can still be a simple job to load KTX files if you don't really care about bloat.
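A pass-in-your-own-decompressor design usually boils down to a callback contract like the following. This is purely my sketch of the idea; none of these names are the actual TinyKTX API.

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical callback contract: decompress src into dst, returning
   the number of bytes written, or 0 on failure. The user wires these
   up to zlib, zstd, or whatever they already link against. */
typedef size_t (*Ktx2Decompressor)(void *user,
                                   const void *src, size_t srcSize,
                                   void *dst, size_t dstCapacity);

typedef struct {
    void *user;             /* opaque pointer passed back to callbacks */
    Ktx2Decompressor zlib;  /* NULL if the user doesn't support it */
    Ktx2Decompressor zstd;
} Ktx2Callbacks;

/* Trivial identity "decompressor" demonstrating the contract. */
static size_t identityDecompress(void *user, const void *src, size_t srcSize,
                                 void *dst, size_t dstCapacity) {
    (void)user;
    if (srcSize > dstCapacity) return 0; /* would overflow: fail */
    memcpy(dst, src, srcSize);
    return srcSize;
}
```

The appeal of this shape is that the single header stays dependency-free while the loader can still handle supercompressed payloads.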
I have a test framework that creates, encodes, decodes, saves and then loads all the supported formats, and then I manually verify that each looks correct on the three GPU backends (Metal, Vulkan, D3D). It takes a lot of time, but it's about 60% done. Once it's finished, the images will also be tested against blessed golden images to ensure regressions don't occur.
One of the hardest parts of developing these libraries has been finding valid test images to verify against, especially DDS files, as the format evolved from Direct Draw in-memory surfaces and so only later saw any attempt at standardisation. If you find weird KTX/DDS files that don't load as they should, let me know, and ideally lend me the file so I can fix the loader up.
My top hit list of DDS files I want to inspect:
- 1, 2, 4 bit images
- P4, P4A4, P8, P8A8 palettes/clut images
- FourCC images
Anyway, that's enough about these three ‘tiny’ libraries for now.