ParaView Use Cases

From ParaQ Wiki
Latest revision as of 15:26, 28 March 2006

Use Case 1: Analysis of a Structural Model
User: Bruce Kistler

Description: The analyst first generates a movie to obtain a visual understanding of model displacements. This movie will typically use either time-based model displacements or contour lines. Once they have a general overview of the structural displacements, they will then delve deeper into the model to obtain quantitative measurements. These measurements are obtained using queries and plots of stress, strain, and displacement. A final animation is then generated to display specific items of interest in the analysis.


Extracted requirements:

  1. Handle both structured and unstructured 3D meshes.
  2. Color the mesh by scalar variables including derived variables.
  3. Compute new variables from output variables.
  4. Produce a plane clip through a 3D mesh.
  5. Have the profile plots update automatically when you change time step.
  6. Provide linear plotter axes and labeled curve legends in different line styles and colors.
  7. Provide linear color palettes with color legends which are user adjustable in both location and size.
  8. Provide text annotations including time annotations.
  9. Be able to automatically deal with byte order differences in different data formats.
  10. Be able to work with data which resides on a remote platform like White without having to move it.
  11. Be able to work effectively with big data.
  12. Be able to generate animations which can be played without reformatting on a desktop.
  13. Provide an integrated movie tool for displaying these animations on a desktop or a multi-panel display wall.
  14. Provide a geometry display tool for displaying interactive flipbook animations on a desktop or multi-panel display wall.
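Requirement 3 above (computing new variables from output variables) can be sketched in a few lines. The von Mises formula is standard in structural post-processing, though the stress-component values below are purely illustrative:

```python
import math

def von_mises(sx, sy, sz, txy, tyz, tzx):
    """Derived scalar: von Mises equivalent stress from the six
    Cauchy stress components -- a typical 'new variable computed
    from output variables' in structural post-processing."""
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

# Uniaxial stress state: von Mises stress equals the axial stress.
print(von_mises(100.0, 0.0, 0.0, 0.0, 0.0, 0.0))  # 100.0
```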


Use Case 2: Time Domain Electromagnetic Simulation Set-up/Debugging
User: Michael Pasik

Description: When creating volumetric tetrahedral meshes from customer-provided ProE or SolidWorks solid models, the model typically goes through several tools (CUBIT, I-DEAS, etc.) and/or formats (ACIS, IGES, etc.). Even after geometry clean-up and the use of tools like virtual geometry or section meshing, small sections or facets typically remain. When applying boundary conditions for electrical simulations, it is essential that the boundary conditions be applied to all the necessary surfaces; otherwise, current flow and/or equipotential surfaces can be interrupted, causing large parallel runs to produce incorrect results. The problem is particularly acute with I-DEAS, as it identifies boundary conditions (encoded as pressures) using arrows normal to a surface. As a result, after generating a mesh we will typically visualize sidesets in the Genesis database by flying through the geometry while changing the visibility (and representation) of the individual sidesets to verify their electrical connectivity.

Our time-domain electromagnetic simulation tools employ edge-based variables. For diagnostic purposes we often need to identify a set of connected edges using sidesets. Because the Genesis format does not support edgesets, we use distribution factors to identify which edge of a particular element face is to be used. Currently, visualizing these “edgesets” requires some imagination in guessing the connected edges (think of the tail on a kite or a shark's teeth). Ideally, we would like to visualize the edges rather than the faces.
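One plausible decoding of such an encoding can be sketched as follows, assuming (hypothetically -- the actual convention is not specified here) that a distribution factor near 1.0 flags the two nodes of the intended edge on a cyclically ordered face:

```python
def edges_from_sideset_face(face_nodes, dist_factors, tol=0.5):
    """Hypothetical decoding of an 'edgeset' stored as a sideset:
    assume a distribution factor above `tol` marks a node of the
    intended edge, and that a marked edge is a pair of adjacent
    flagged nodes on the (cyclically ordered) face."""
    n = len(face_nodes)
    flagged = [df > tol for df in dist_factors]
    edges = []
    for i in range(n):
        j = (i + 1) % n  # next node around the face
        if flagged[i] and flagged[j]:
            edges.append((face_nodes[i], face_nodes[j]))
    return edges

# Quad face 10-11-12-13 with factors marking the edge (11, 12):
print(edges_from_sideset_face([10, 11, 12, 13], [0.0, 1.0, 1.0, 0.0]))
# [(11, 12)]
```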

Several unit conversions may have been involved in generating the final mesh, since mechanical designers prefer using mils/inches/feet, most meshing tools tend to have absolute geometry tolerances that can be on the scale of small devices, and most analysts prefer to work in SI units. We typically use visualization tools to confirm that the unit conversions have all been performed properly by measuring key mesh dimensions.
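The kind of unit sanity check described above can be sketched as follows; the gap dimension and the set of unit names are illustrative, not taken from any real model:

```python
# Sketch of a unit sanity check: a key gap designed as 5 mils
# should measure 1.27e-4 m once the mesh is in SI units.
MIL_TO_M = 25.4e-6    # 1 mil = 0.001 inch = 25.4 micrometres
INCH_TO_M = 0.0254

def to_si(value, unit):
    """Convert a length in designer units to metres."""
    factors = {"mil": MIL_TO_M, "inch": INCH_TO_M,
               "foot": 12 * INCH_TO_M, "m": 1.0}
    return value * factors[unit]

measured_gap_m = to_si(5.0, "mil")
print(round(measured_gap_m, 9))  # 0.000127
```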

When setting up input files for our simulation tools, analysts will often want to determine locations within the geometry to locate additional diagnostics. This step involves identifying coordinates and/or node/element numbers corresponding to key locations in the geometry.

Lastly, when debugging it is often useful to be able to query mesh connectivity and selectively label and/or display sets of elements/nodes.

Our primary platform is x86/Linux, but we expect this to change in the future to x86-64/Linux.

Extracted/Implied Tool Requirements

  1. Must be able to easily load all sidesets in a Genesis file as individual entities with their IDs.
  2. Must be able to easily control the visibility of individual sidesets.
  3. Must be able to easily discern the identity of a sideset (e.g., using labels, highlighting, etc.).
  4. Must be able to display sidesets with shaded surface, wireframe, and hidden-line representations.
  5. Sideset display must support tet, hex, and pyramid element types.
  6. It would be highly desirable if the tool could interpret/display sideset distribution factors (i.e., fields associated with the sideset).
  7. It would be highly desirable if the tools could visualize our “edgesets” encoded as sidesets as element edges rather than element faces.
  8. Must be able to easily measure distances between mesh entities such as nodes and sidesets.
  9. Must be able to easily determine the extent of selected entities (sidesets, element blocks, etc.).
  10. It would be highly desirable if the tools could report measured distances in multiple user selected units or perform unit conversions.
  11. It must be easy to probe coordinates and node/element numbers using a user-controlled cursor.
  12. Must be able to report the closest node/element numbers to an arbitrary user-specified location.
  13. It would be highly desirable if this probing operation could be easily scripted for automated probing of the mesh.
  14. Must be able to easily display user subsets of elements/nodes.
  15. Must be able to display element/node numbering (including understanding maps for parallel).
  16. Must be able to provide mesh connectivity (given a node number provide the elements that include it, etc.).
  17. Tool must support x86 and x86-64 Linux.
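Requirements 11-13 above amount to a scriptable nearest-node probe. A minimal sketch, using made-up node IDs and coordinates rather than a real Genesis file:

```python
import math

def closest_node(node_ids, coords, point):
    """Return (node_id, distance) of the node nearest to `point`,
    by brute-force search over all node coordinates."""
    best_id, best_d = None, float("inf")
    for nid, xyz in zip(node_ids, coords):
        d = math.dist(xyz, point)
        if d < best_d:
            best_id, best_d = nid, d
    return best_id, best_d

ids = [101, 102, 103]
xyz = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
print(closest_node(ids, xyz, (0.9, 0.1, 0.0)))  # node 102 is nearest
```

Because it is a plain function, the same probe can be looped over a list of query points, which is the scripted, automated probing that requirement 13 asks for.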


Use Case 3: Time Domain Electromagnetic Simulation Analysis
User: Michael Pasik

Description: Time-domain electromagnetic simulations can involve a variety of data, including structured-mesh vector fields with 1-50 million cells in hundreds of blocks; 1-10 million unstructured tetrahedral, hexahedral, and pyramid element-based vector fields in tens of blocks in Exodus files; and millions of particles (vertex-based, with scalar/vector attribute data such as charge and momentum, that move each time step) in our Portable File Format (PFF) files. The structured field data is stored by block using element-based fields in our PFF files or edge-based fields in SAF files. The unstructured fields produced by our simulation tools are actually edge-based, but we move them to the elements in order to use Exodus. Ideally, we would like to visualize the edge-based fields in our visualization tools. A single simulation can produce any combination of the above data. We are also currently developing a capability to adaptively refine tetrahedral meshes. In this case, separate Exodus files are produced at each time step.

Typically, an initial simulation will be performed that saves large (tens of GB) amounts of data. Unstructured data is produced as many separate Exodus files that are typically concatenated together. This step often fails on 32-bit Linux clusters, so the data is often moved to 64-bit SGI machines for analysis. This step is cumbersome and is often a barrier to analysts capturing all the data they would like to see. The transient data is typically first reviewed by looking at animations of the vector fields in cut planes as a function of time to verify correct simulation set-up, proper operation of the boundary conditions, etc. Once the analyst is satisfied that the qualitative behavior is correct, the designer of the component being simulated will often sit down with the analyst or otherwise be shown these animations. Next, the analyst will typically perform data reductions to verify the results quantitatively. This step typically involves performing integrations of the fields (or functions of the fields) over volumes, surfaces, or lines, or probing the data at selected points. This data will often be viewed as XY plots (as a function of time or, less frequently, position) directly in the viz tool, although in most cases it will also be exported for further analysis in tools such as IDL, MATLAB, Mathematica, etc. This one-dimensional data may also be used in mesh convergence studies. It is often desirable to be able to simultaneously view results from multiple simulation runs. However, the current tool interfaces for doing this often make it easier to invoke multiple instances of the viz tool unless, of course, quantitative rather than visual comparisons need to be performed. Future simulations tend to have less data, as the output requests are targeted at performing the analysis the customer requested. Eventually the analyst will need to produce images and animations for customer/analyst presentations (viewgraphs).

For structured fields, we prefer to spatially reduce the data output from the simulation tool rather than taking cuts/slices in the viz tool. However, not all file formats currently support this paradigm.

Particle visualization/analysis is currently done in a separate tool (IDL) from the one we use to view the fields. Ideally, the mesh-based viz tools would support simultaneous visualization of the fields and particles. Particles are typically windowed by spatial location and/or attribute (charge, energy, etc.) and plotted in an XY scatter plot with dots representing the particles. The particles may also be colored by their attributes.
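The attribute windowing described above can be sketched as a simple filter; the particle records and the energy band are hypothetical:

```python
# Window particles by an attribute (here a made-up energy value)
# before scatter-plotting their positions: keep only particles
# whose attribute falls inside the requested band.
particles = [
    {"x": 0.1, "y": 0.0, "energy": 5.0},
    {"x": 0.2, "y": 0.1, "energy": 50.0},
    {"x": 0.3, "y": 0.2, "energy": 12.0},
]

def window(parts, attr, lo, hi):
    """Select particles whose attribute lies in [lo, hi]."""
    return [p for p in parts if lo <= p[attr] <= hi]

selected = window(particles, "energy", 4.0, 20.0)
print([(p["x"], p["y"]) for p in selected])  # [(0.1, 0.0), (0.3, 0.2)]
```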

Extracted/Implied Tool Requirements

  1. Must support multi-block structured vector fields; unstructured vector fields on tets, hexes, and pyramids; and vertex-based particle data.
  2. It would be highly desirable if the tool can simultaneously visualize all the above, possibly overlapping, data.
  3. Must support Exodus, SAF, and PFF file formats.
  4. It would be highly desirable if FEM edge-based fields were supported in viz tools and data file formats.
  5. Must make it easy to view fields in tetrahedral adaptive meshes.
  6. Tools, data file formats, and viz platforms must make it easy to generate large (10-100 GB) amounts of data (especially for users on LANs other than ESHPC).
  7. Must be easy to produce animations of transient data as well as save animation formats easily exchanged with Windows and OS X users.
  8. Must be easy to visualize vector fields: both as magnitudes (and/or selectable components) and as actual vectors. Autoscaling of data over time intervals and fine control over the density of vectors is required.
  9. Must make it easy to create derived variables. These must support advanced mathematical operations such as dot and cross products; integrations over lines, surfaces, and volumes; calculation of surface normals; moving data from nodes to elements; etc.
  10. Must be able to easily produce XY plots of probed or reduced data.
  11. Must be able to export extracted XY data in a commonly supported format such as CSV.
  12. Must make it easy to deal with long variable names and should not mangle the user provided names (i.e., resize widgets to support long names instead of using scroll bars).
  13. Must make it easy to produce images that can be incorporated in viewgraphs (e.g., white backgrounds, large fonts, etc.).
  14. Must make it easy to load multiple results and make it easy to identify which results are which.
  15. Tool must provide the necessary filters/calculators to compare results from different discretizations.
  16. Data file formats must support reduction of structured mesh data.
  17. It would be highly desirable if mesh-based viz tools provided support for particle data sets (including attributes beyond position, such as charge, energy, etc.).
  18. Tool must make it easy to manipulate 2D-like data. All too often it is hard to manipulate (e.g., to zoom) data that is long and wide but very thin.
  19. Must be easy to save/restore the state of a session possibly using a different data set.
  20. Tool must support x86 and x86-64 Linux.
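The derived-variable operations named in requirements 8-9 (magnitudes, dot and cross products) can be sketched per node in plain Python; the field values are illustrative and no particular calculator syntax is assumed:

```python
import math

def dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(ai * bi for ai, bi in zip(a, b))

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def magnitude(a):
    """Euclidean magnitude of a vector."""
    return math.sqrt(dot(a, a))

E = (3.0, 4.0, 0.0)  # e.g. an electric-field vector at one node
B = (0.0, 0.0, 1.0)
print(magnitude(E))  # 5.0
print(cross(E, B))   # (4.0, -3.0, 0.0)
```

Applying such functions at every node or element of a mesh is exactly the "derived variable" workflow the requirement describes.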


Use Case 4: Exploratory Visualization
User: Tre` Shelton

In the V&V arena, an analyst spends a great deal of time comparing predictions with experiment. In the thermal case, simulation output is typically nodal values, which represent thermocouple temperatures. A great deal of time is spent formatting both the simulation and experimental data for formal comparisons. Several scripts/macros have been developed to automate this process as much as possible. The ability to quantitatively and qualitatively compare predictions with experiment provides valuable insight to the analyst. Producing SAND-report-quality images of these comparisons could save the analyst a great deal of time and effort in the formal documentation stage of the D2A process.

In the same vein as the comparisons with experimental data, the ability to quantitatively compare predictions with each other would add more power to the visualization tool. Simulation uncertainty quantification is becoming more standard at Sandia. The visualization tool needs to support that effort as well as facilitate it. All of my current UQ comparisons are performed in Excel, where I have to manage a great deal of data. The ability to define important parameters in the visualization tool and then have it monitor the effect these parameters have on predictions would really reduce the time I spend in Excel.
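The quantitative prediction-versus-experiment comparison described above reduces, in its simplest form, to an error norm over paired samples. A minimal sketch with illustrative temperatures (the values are made up, not from any real thermocouple record):

```python
import math

def rms_error(predicted, measured):
    """Root-mean-square difference between paired samples, e.g.
    predicted nodal temperatures vs. thermocouple measurements."""
    n = len(predicted)
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / n)

pred = [300.0, 310.0, 325.0]   # simulation nodal temperatures (K)
meas = [298.0, 312.0, 324.0]   # thermocouple readings (K)
print(round(rms_error(pred, meas), 3))  # 1.732
```

Automating this kind of metric inside the visualization tool, rather than in an external spreadsheet, is precisely the workflow reduction the paragraph above asks for.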