View Issue Details
ID: 0014177
Project: ParaView
Category: (No Category)
View Status: public
Date Submitted: 2013-07-18 19:57
Last Update: 2014-02-27 17:31
Reporter: Alan Scott
Assigned To: Alan Scott
Priority: high
Severity: minor
Reproducibility: have not tried
Status: closed
Resolution: fixed
Product Version: git-master
Target Version: 4.2
Summary: 0014177: AMR Contour filter memory needs grow 8X for every level increase
Description: From an e-mail between Utkarsh, Berk, and me:

Berk/Utkarsh,

I have somewhat figured out what is going on with the AMR Contour filter. The symptom is that when you try to run AMR Contour on big data (defined as a billion cells, close to a dozen levels, and a million blocks) on any size of computer (including Cielo), with lots of memory, nodes, and cores, you die. This happens even though the Memory Inspector shows memory use at under 25% or so before running the AMR Contour filter.

The root cause of the problem is in vtkAMRDualGridHelper.cxx, in the function vtkAMRDualGridHelperLevel::AddGridBlock(). For an arbitrary level, this code tries to figure out the full extent of all blocks (all blocks sized at this level) within any file. When you get to the higher levels of block refinement, such as level 10, you have a cuboid that is around 1024 x 1024 x 1024 blocks in size. The code then creates a 3D array of pointers, one per block of this space. Thus, a single new() of up to 8 GBytes can occur. This is where we are blowing up.

Specifically, the test that I am running and having issues with reaches level 10, with X, Y, and Z dimensions of around 504, 506, and 514. This tries to create an array of 131,082,336 pointers, which obviously dies.

Also note that this issue is happening with the smallest of the three datasets that I have on Cielo (only 16k files, a few billion cells).

Surprisingly, this shows up with only 1 file, and can be replicated with local server. Yay!

I am working on getting a dataset. (I am having trouble finding a dataset that has enough levels and that I can release to Kitware.) However, it turns out that you can see the issue with Dave’s large CTH dataset (although it won’t crash).
• Copy file spcta.0 into a temporary directory.
• In file vtkAMRDualGridHelper.cxx, function vtkAMRDualGridHelper::Initialize(), place a breakpoint at the line “numBlocks = input->GetNumberOfDataSets(level);”.
• Run ParaView with a local server.
• Open the file. Just read in one of the Volume Fractions. Apply.
• AMR Contour. Select the Volume Fraction. Apply.
• Now, at the breakpoint, keep continuing until level is 5.
• You only get one block. Step into AddBlock().
• Step into AddGridBlock().
• Place a breakpoint at the memset() line (near the top of the function). Run to there.
• Look at the size of newSize and the extents of the space. The extents are 20, 30, 22, and newSize is 14973 (the size of the array of pointers we will create). We are only at level 5, and this grows roughly 8X per level.


How do we proceed?

Thanks all!

Alan.

Tags: No tags attached.
Project: Sandia
Topic Name:
Type: crash
Attached Files: (none)

 Relationships

  Notes
(0032435)
Alan Scott (manager)
2014-02-27 17:31

For AMR with many levels, this filter has been deprecated in favor of the ExtractCTHParts filter. Thus, this bug no longer applies. Let's close this bug.

 Issue History
Date Modified Username Field Change
2013-07-18 19:57 Alan Scott New Issue
2014-01-16 17:06 Utkarsh Ayachit Target Version => 4.2
2014-02-27 17:31 Alan Scott Note Added: 0032435
2014-02-27 17:31 Alan Scott Status backlog => closed
2014-02-27 17:31 Alan Scott Assigned To => Alan Scott
2014-02-27 17:31 Alan Scott Resolution open => fixed


Copyright © 2000 - 2018 MantisBT Team