• With the release of FS2020 we see an explosion of activity on the forum, and of course we are very happy to see this. But having all questions about FS2020 in one forum becomes a bit messy, so we would like to ask you all to use the following guidelines when posting your questions:

    • Tag FS2020 specific questions with the MSFS2020 tag.
    • Questions about making 3D assets can be posted in the 3D asset design forum. Either post them in the subforum of the modelling tool you use or in the general forum if they are general.
    • Questions about aircraft design can be posted in the Aircraft design forum.
    • Questions about airport design can be posted in the FS2020 airport design forum. Once airport development tools have been updated for FS2020, you can post tool-specific questions in the subforums of those tools as well, of course.
    • Questions about terrain design can be posted in the FS2020 terrain design forum.
    • Questions about SimConnect can be posted in the SimConnect forum.

    Any other question that is not specific to an aspect of development or a tool can be posted in the General chat forum.

    By following these guidelines we make sure that the forums remain easy to read for everybody and also that the right people can find your post to answer it.

Image size problems SBuilderX

Thanks for the update, Braedon! :)

I look forward to seeing your further conclusions arising from this interesting and complex testing scenario. ;)

GaryGB
 
OK, I tested the larger 2.9GB BGL @ LOD9 created with the 4GB-patched resample, and it created an invalid BGL as well. Nothing in TMFViewer or in P3D v3.4. Resample itself does not error (visibly), but the simulator and all the tools used to view it cannot read it.

I even tried the 2.9GB BGL in P3D v4 with its 64-bit improvements, thinking that it might handle the larger size, but it is still invalid there too.

So, to sum up:
  • LZW compression of GeoTIFF files is fine for keeping source input sizes down, or for using a (slightly) larger dimension image while keeping the file as a standard (Geo)TIFF rather than a BigTIFF. I did no testing on how long this took compared to a comparably sized uncompressed TIFF, as all my sources are compressed with file system compression rather than LZW, and this double compression would skew the results.
  • Once you have factored in seasons and night maps, it can be a long process to discover that you have an invalid BGL because it exceeds 2GB, if you use SplitFileLOD = 9 or bigger (i.e. 8, 7, 6, etc.). A fully filled LOD10 cell appears to reach a maximum of about 1.2GB with 4 seasons & blend/water masks at compression 80, so it would still be under the 2GB limit for the output BGL with a night map thrown in; splitting at LOD9 instead packs four LOD10 cells into each BGL, which can push a fully filled file well past 2GB. The orientation of the data and how much is NULL/blended out within any LOD cell of whatever size will have a lot to do with which LOD split you choose.
  • The 4GB-modified resample seems to have no effect on either the success or failure of the BGL creation process. The biggest limiting factor is BGL output size, which at ~2GB is well below 4GB.
  • Resample is hard disk intensive, so don't run too many parallel resample tasks using the same disk as either source or destination (or on the same PC if your hard disk choice is limited to 1). I had the luxury of six disks in a mix of RAID10 mechanical (sources), USB3 mechanical (destination), and SSD (destination). This allowed me to keep source and destination on different disks and spread the destinations around to avoid some HDD congestion, and it still took an age to run a very large area. Mechanical hard disk thrashing, trying to read different source file sections for the different processes, will dramatically kill your whole computer's performance, not just resample's.
  • Resample is single threaded, which means that multi-core processors won't be of much use to you other than to allow you to do other things with the rest of the CPU while resample does its thing, or to run multiple resample tasks simultaneously. While resample may appear to spread its processing over multiple cores, this is just each new (singular) thread being scheduled on a different core than the previous calculation, which has already finished. This can force the system to re-populate the CPU's L2/L3 caches in the process, as caches become more core-specific the closer you get to the actual processor, and you may be processing on a core where the old cache isn't accessible. You will notice that resample's CPU time never gets above 100/<#CPU cores> %, plus or minus a bit of overhead.
  • You might consider setting the affinity of the resample task to use a processor other than 0 (or maybe 0 & 1 if Hyperthreading is enabled?) so that arcane operating system tasks (like HDD access, or FSX ;)) that only run on processor 0 don't compete with resample as well. This *may* also help on modern CPUs that can 'turbo' the frequency of a small number of cores while the others are idle, giving more GHz to your single threaded process. You can set the affinity of a process by right-clicking it in the "Details" tab of the Windows Task Manager (right-click on the task bar to bring this up). YMMV.
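If you prefer to set the affinity at launch time rather than afterwards in Task Manager, here is a minimal sketch using the 'start' command's /AFFINITY switch from a batch file. The .inf file names here are hypothetical, and the hex mask is a bit field of allowed cores (0x2 = core 1 only, 0x4 = core 2 only):

    rem Pin two resample jobs to different cores, keeping both off core 0.
    start "resample area1" /AFFINITY 0x2 resample.exe area1.inf
    start "resample area2" /AFFINITY 0x4 resample.exe area2.inf

You can confirm the mask took effect afterwards via the same right-click menu in Task Manager's "Details" tab.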

As an additional note:
If you are tempted to use LZW compression of TIFF files simply to save space, I personally don't bother. As the uncompressed size of a file quadruples every time you double its X & Y dimensions, the law of diminishing returns starts to kick in for files that are largely filled with something other than solid colour or transparency.
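If you do want LZW anyway, a minimal sketch using GDAL's gdal_translate tool (assuming GDAL is installed and on your PATH; the file names are hypothetical):

    rem Rewrite a GeoTIFF with LZW compression, forcing a standard TIFF rather than BigTIFF.
    gdal_translate -co COMPRESS=LZW -co BIGTIFF=NO source.tif source_lzw.tif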

You might consider using the file system's built-in compression attribute to keep space usage down and the real bytes read/written to disk to a minimum. The compression is then not processed by RESAMPLE.EXE but by the Operating System itself. That frees up the single thread (~core) of the CPU that resample wants to use to...well...resample data, instead of compressing/decompressing the data as well. The Operating System is free to put that compression/decompression task on another CPU core and process it separately.

[Attachment: File System Compression.jpg]

This is not Windows compressed (zipped) folders, which are effectively ZIP files masquerading as file systems of their own inside the normal file system, and which require a temporary area to decompress to as well. It is the NTFS file system's built-in compression, which compresses and decompresses on the fly. Obviously the more data you have, the longer the initial compression will take, and you will need as much free space as your largest file to perform that initial compression too. If you put the data on a different drive, I find it is sometimes easier to create a compressed empty folder and copy all the data into it.
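You can set the attribute from the folder's Properties dialog (as in the screenshot above), or from the command line with Windows' built-in compact tool; a minimal sketch, with a hypothetical folder path:

    rem /C = compress, /S:dir = recurse into subfolders, /I = continue past errors.
    compact /C /I /S:"D:\PhotorealSource"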

cheers

Braedon

 