32-bit to 16-bit Tiff - Is the height automatically scaled to 16-bit range

veddycent Global Mapper User Posts: 17
Hi all,

So I have a set of 32-bit .tiff files for DEM data and I need to export them as 16-bit GeoTiffs to use in Unity Game Engine.
When the 32-bit image is exported as 16-bit, is the height range scaled automatically to the 16-bit range? My 16-bit exports are not showing the desired result: the height is stepped at 1 m intervals, whereas the 32-bit original has nice smooth gradients.

Any help would be much appreciated

Best Answer

  • Ice Age Mark Global Mapper User Posts: 290
    Answer ✓

    The way I think I understand it is like this: the 32-bit depth IS the smooth, un-terraced DEM you're seeing.  USGS DEMs are only 24-bit and will show some terracing.  16-bit DEMs are terraced to whole-integer native units.  It's caused by how the decimals are truncated at each bit depth.  Only a 32-bit floating-point export will give you the original 32-bit "smoothness".



    (FYI - If you save a 32-bit DEM as a Global Mapper .gmg, it will only be 24-bit, and will show some terracing.  You cannot then recover the original bit depth to export except from the original 32-bit source.)  
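A minimal sketch of the truncation effect Mark describes, assuming numpy (the array values here are hypothetical, not from Global Mapper itself): casting a smooth 32-bit float DEM to 16-bit integers drops the fractional metres, which is exactly the 1 m staircase the question describes.

```python
import numpy as np

# Hypothetical 32-bit float DEM profile with smooth sub-metre gradients.
dem32 = np.linspace(100.0, 110.0, 8, dtype=np.float32)

# Naive 16-bit integer export: heights are truncated to whole native
# units (metres), which produces the 1 m "staircase" terracing.
dem16 = dem32.astype(np.int16)

print(dem32)  # smooth: 100.0, 101.43, 102.86, ...
print(dem16)  # terraced: 100, 101, 102, 104, ... (fractions lost)
```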


  • veddycent Global Mapper User Posts: 17
    Sorry for the late reply!

    Thanks Mark, it seems like a massive waste of precision if you have a terrain with an elevation range of 10 m and it clamps it to 1 m intervals. I wonder if GM would be able to use the terrain's height range to determine what precision to save the 16-bit image at, so as to maximise the full 16-bit range with the most accuracy.
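The rescaling idea above can be sketched in a few lines of numpy (a hypothetical workaround outside Global Mapper, not a GM feature): stretch the DEM's actual height range across all 65,536 levels of a 16-bit image, record the offset and scale, and let the consumer (e.g. Unity, which reads heightmaps as normalised 16-bit values anyway) map them back to real heights.

```python
import numpy as np

# Hypothetical DEM tile with only a 10 m elevation range.
dem32 = np.random.default_rng(0).uniform(50.0, 60.0, (4, 4)).astype(np.float32)

# Stretch the real height range across the full unsigned 16-bit range
# before quantising, instead of truncating to whole metres.
lo, hi = float(dem32.min()), float(dem32.max())
scale = 65535.0 / (hi - lo)
dem16 = np.round((dem32 - lo) * scale).astype(np.uint16)

# The consumer recovers heights from the stored offset and scale:
#   height = lo + value / scale
# With a 10 m range the quantisation step is ~0.15 mm, so the
# 1 m terracing disappears.
recovered = lo + dem16.astype(np.float32) / scale
```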
