Out of Memory Error v22.1.1

Hi all, 

I'm trying to load a single LAZ file (362,641 KB) into GM v22.1.1 (64-bit), and I keep running into an Out of Memory error. I have restarted my computer, closed and re-opened the GM project, and even started a brand-new project with nothing else loaded into the Control Center. I have checked Task Manager and have plenty of memory left. I'm really not sure what to do — any suggestions? 

I have also tried loading the single file as a Map Catalog and get the same error message. 

Error pasted below:

The following errors were encountered trying to add files to the map catalog (this text is also copied to the clipboard and can be pasted using Ctrl+V):

Error loading file H:\SunshineCoast_LiDAR_DSM\_Data_07-18-2021\bc_092g072_1_2_1_xyes_8_utm10_2020.laz
Out of memory trying to perform operation, only 154,348,752,896 remaining bytes of memory available. The operation may succeed if you perform it over a smaller area. 
LidarLasOverlay.cpp - 1675
Version: v22.1.1 (131) (64-bit)
Build Time: Mar  8 2021 00:18:30
Thread: Main UI Thread

Stack Trace:
0000000140A99960 (global_mapper)
0000000140A9AF70 (global_mapper)
0000000140F46C5C (global_mapper)
0000000140F48046 (global_mapper)
0000000140F474BE (global_mapper)
0000000142C1F931 (global_mapper)
00007FF820441080 (VCRUNTIME140_1)
00007FF8204426D5 (VCRUNTIME140_1)
00007FF85AAC1446 (ntdll)
0000000141373C54 (global_mapper)
0000000141284C97 (global_mapper)
0000000141442330 (global_mapper)
00000001414208B6 (global_mapper)
000000014141E860 (global_mapper)
0000000141390DAC (global_mapper)
... (62 Additional Stack Items Hidden)

Windows 10 Enterprise (64-bit) Memory: 154,329,235,456 of 171,687,378,944 available, GDI Usage: 414 GDI (Peak 523), 218 User (Peak 329)



  • bmg_bob (Global Mapper Programmer)

    I suggest that you contact Blue Marble Support directly via email to address this issue. Please include a detailed description of your workflow, with screen captures of any options that are used, and the data set that is failing. Your data set is fairly large, so you probably won't be able to attach it to an email, but the support folks will provide instructions for uploading it. Thanks.