Memory Handling Problem with Scripts
Roger Edrinn
I'm gridding out a major extent using scripts, using the Embed_Script command to call the next script. However, I get a memory error after a while. All my heavy memory users are in catalogs and the lighter ones are full extent. The net result is ~1GB of memory in use at the start of the first script.
Watching Task Manager, each successive script adds 20-50MB to that 1GB, and I eventually run out of memory. My guess is that for some reason the no-longer-needed catalog extents are not being closed after the script is through with them.
What does the Oracle of Olathe think?
Comments
-
First of all, I'm the Oracle of Parker, CO now, not Olathe!
Are you unloading the layers loaded by each sub-script (assuming there are any) once you don't need them? It sounds like maybe you keep loading new layers in each called script, but aren't unloading those layers ever (they wouldn't be freed automatically until the top-most script exited).
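The pattern I'm describing would look roughly like this in a called sub-script. This is only a sketch: the file names ("tile_01.gmc", "tile_01.mp", "tile_02.gms") are hypothetical placeholders, and the export parameters would need to match your actual workflow.

```
GLOBAL_MAPPER_SCRIPT VERSION=1.00

// Load this tile's catalog, export it, then unload it explicitly
// rather than leaving it loaded until the top-most script exits.
IMPORT FILENAME="tile_01.gmc"
EXPORT_VECTOR FILENAME="tile_01.mp" TYPE=POLISH_MP
UNLOAD_LAYER FILENAME="tile_01.gmc"

// Chain to the next tile's script.
EMBED_SCRIPT FILENAME="tile_02.gms"
```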
Let me know if I can be of further assistance.
Thanks,
Mike
Global Mapper Support
support@globalmapper.com -
global_mapper wrote: » First of all, I'm the Oracle of Parker, CO now, not Olathe! Are you unloading the layers loaded by each sub-script (assuming there are any) once you don't need them? It sounds like maybe you keep loading new layers in each called script, but aren't unloading those layers ever (they wouldn't be freed automatically until the top-most script exited).
When the script ends, Task Manager drops back to 1GB, but the Embed_Script chain never ends, so the memory load keeps growing until the error message. When I've watched Task Manager I've seen 1.8GB. -
A little more info: I started over on one corrupted group of MP exports. Task Manager starts at ~200MB, not 1GB, and rapidly escalates to the 1GB range as it exports more MP files.
I think that unlike the GM grid tab on export, when I grid manually using scripts, GM is not closing catalog files when finished. So that's my story and I'm sticking to it.
Now, what say the Prophet of Parker?!? -
Even more: shortly after 1.8GB in Task Manager I get the Out of Memory error. Somewhere in the process I get corrupted MP exports, such that cgpsmapper reports "Can't have no data at the "0" level".
Perhaps there's a command I can periodically embed in the script to flush memory, then continue the script? I need to do something. -
I moved to Parker in late 2008 as Denver has a lot more outdoor stuff to do than KC and is the hub of mapping companies worldwide. Aspen is way too far from a major airport and doesn't get hot enough for me in the summer, plus I'm a fan of severe weather which doesn't happen much in the mountains (winter weather being the exception of course).
Can you post your script files that you are using (or email them to support@globalmapper.com) so that I can take a look and see what might be happening? Do your map catalogs reference just a few very large files? A map catalog will keep several previously loaded files around, so if you have a catalog with very large files then that extra memory might hang around a bit.
What you can do is periodically do an UNLOAD_ALL command and then reload your files (maybe put the loading in its own embedded script to make it easy to call again after unloading). This will force all files to be unloaded.
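As a rough sketch of that flush-and-reload pattern (the script name "load_catalogs.gms" is a hypothetical placeholder for whatever loading script you set up):

```
GLOBAL_MAPPER_SCRIPT VERSION=1.00

// ... export a batch of tiles ...

// Flush all loaded files, then reload via a dedicated loading
// script so the same loading steps can be reused after each flush.
UNLOAD_ALL
EMBED_SCRIPT FILENAME="load_catalogs.gms"

// ... continue exporting the next batch ...
```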
Let me know if I can be of further assistance.
Thanks,
Mike
Global Mapper Support
support@globalmapper.com -
global_mapper wrote: » I moved to Parker in late 2008 as Denver has a lot more outdoor stuff to do than KC and is the hub of mapping companies worldwide. Aspen is way too far from a major airport and doesn't get hot enough for me in the summer, plus I'm a fan of severe weather which doesn't happen much in the mountains (winter weather being the exception of course). Can you post your script files that you are using (or email them to support@globalmapper.com) so that I can take a look and see what might be happening? Do your map catalogs reference just a few very large files? A map catalog will keep several previously loaded files around, so if you have a catalog with very large files then that extra memory might hang around a bit.
Script file in your inbox. -
Actually I enjoy the big severe storms, always wanted to be a tornado chaser, so the weak storms in the mountains wouldn't do it for me. Gotta keep a bit of KS in my CO life!
I took a look and did discover where map catalogs wouldn't unload vector layers as they should during script processing in some situations (like what you are doing), which is likely what is happening. I have placed a new build at http://www.globalmapper.com/global_mapper11.zip with this fixed for you to try. Simply download that file and extract the contents into your existing v11.xx installation folder to give it a try. If you are using the 64-bit version, there is a new build available at http://www.globalmapper.com/global_mapper11_64bit.zip .
Let me know if I can be of further assistance.
Thanks,
Mike
Global Mapper Support
support@globalmapper.com -
Well, that made a BIG difference: memory in Task Manager never exceeded 0.5GB, and it moved down as rapidly as up.
Thanks for the prompt fix.