[Paraview] client/server error for large superquadrics
Moreland, Kenneth
kmorel at sandia.gov
Tue Jun 12 09:31:49 EDT 2007
It looks like your server is crashing because it is trying to open a
connection to the local X host and failing. You should be able to
correct this either by opening up the X host to the MPI job or by
compiling with OSMesa (in which case --use-offscreen-rendering will
prevent a connection to any X host).
On a larger issue, ParaView 2 was able to correctly detect this, give a
warning to the user at the client, and perform all rendering locally on
the client. Has this functionality gone away with ParaView 3? If so,
we need to bring it back.
-Ken
________________________________
From: paraview-bounces+kmorel=sandia.gov at paraview.org
[mailto:paraview-bounces+kmorel=sandia.gov at paraview.org] On Behalf Of
Glanfield, Wayne
Sent: Tuesday, June 05, 2007 3:12 AM
To: paraview at paraview.org
Subject: [Paraview] client/server error for large superquadrics
I am experiencing a problem with ParaView 3.0.1 when generating large
(i.e. 1024x1024) Superquadric sources while testing client/server mode.
Smaller values, e.g. 1024x16 or 128x128, work OK, and large values such
as 1024x1024 work OK in local mode. I have installed the repository
version of PV3 and MPICH-1.2.7, and everything appeared to install
correctly. I have tried it with and without offscreen rendering. Clients
and servers have the following configuration: Sun Ultra 40, AMD
Opteron, SLED 10.1, NVIDIA Quadro FX 3450 GPU with the latest NVIDIA
drivers.
Does anyone have any ideas why this is happening?
Regards
Wayne
I receive the following client and server errors
CLIENT
ERROR: In /apps/CFD/ParaView3/Servers/Common/vtkServerConnection.cxx,
line 67
vtkServerConnection (0xedd500): Server Connection Closed!
SERVER (MPI MASTER)
l-cfd-08:/apps/CFD/downloads/mpi/mpich-1.2.7p1 #
/usr/local/mpich-1.2.7/ch_p4/bin/mpirun -np 8 -machinefile
/apps/CFD/machinefile /apps/CFD/paraview-3.0.1/bin/pvserver -rc
-ch=sl-cfd-07 --connect-id=11111 --use-offscreen-rendering
Connected to client
Process id: 2 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fedd0): bad X server connection.
DISPLAY=p2_5680: p4_error: interrupt SIGSEGV: 11
rm_l_2_5685: (81.078125) net_send: could not write to fd=5, errno = 32
Process id: 6 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fea00): bad X server connection.
DISPLAY=p6_5698: p4_error: interrupt SIGSEGV: 11
Process id: 1 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fea00): bad X server connection.
DISPLAY=Process id: 3 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fea00): bad X server connection.
DISPLAY=Process id: 5 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fea00): bad X server connection.
DISPLAY=Process id: 4 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fea00): bad X server connection.
DISPLAY=p4_5689: p4_error: interrupt SIGSEGV: 11
Process id: 7 >> ERROR: In
/apps/CFD/ParaView3/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 1376
vtkXOpenGLRenderWindow (0x7fea00): bad X server connection.
DISPLAY=p7_3036: p4_error: interrupt SIGSEGV: 11
rm_l_7_3041: (77.406250) net_send: could not write to fd=5, errno = 32
p1_3015: p4_error: interrupt SIGSEGV: 11
rm_l_1_3020: (82.078125) net_send: could not write to fd=5, errno = 32
p5_3029: p4_error: interrupt SIGSEGV: 11
p3_3022: p4_error: interrupt SIGSEGV: 11
rm_l_3_3027: (80.535156) net_send: could not write to fd=5, errno = 32
rm_l_6_5703: (78.105469) net_send: could not write to fd=5, errno = 32
rm_l_4_5694: (79.593750) net_send: could not write to fd=5, errno = 32
rm_l_5_3034: (79.039062) net_send: could not write to fd=5, errno = 32
sl-cfd-08:/apps/CFD/downloads/mpi/mpich-1.2.7p1 # p2_5680: (85.132812)
net_send: could not write to fd=5, errno = 32
p5_3029: (89.078125) net_send: could not write to fd=5, errno = 32
p7_3036: (87.464844) net_send: could not write to fd=5, errno = 32
p6_5698: (88.152344) net_send: could not write to fd=5, errno = 32
p3_3022: (90.593750) net_send: could not write to fd=5, errno = 32
p4_5689: (89.652344) net_send: could not write to fd=5, errno = 32
p1_3015: (94.136719) net_send: could not write to fd=5, errno = 32
The following is a list of the processes running when I manually set up
the client and servers:
CLIENT SL-CFD-07
root 27926 2.1 1.0 246744 85764 pts/1 SL+ 09:32 0:12
/apps/CFD/paraview-3.0.1/bin/paraview --connect-id=11111
MPI MASTER SERVER 1 (SL-CFD-08)
sl-cfd-08:/apps/CFD/downloads/mpi/mpich-1.2.7p1 #
/usr/local/mpich-1.2.7/ch_p4/bin/mpirun -np 8 -machinefile
/apps/CFD/machinefile /apps/CFD/paraview-3.0.1/bin/pvserver -rc
-ch=sl-cfd-07 --connect-id=11111 --use-offscreen-rendering
Connected to client
root 4737 0.0 0.0 8336 1684 pts/3 S+ 09:36 0:00 /bin/sh
/usr/local/mpich-1.2.7/ch_p4/bin/mpirun -np 8 -machinefile
/apps/CFD/machinefile /apps/CFD/paraview-3
root 4968 0.3 0.2 161004 36668 pts/3 S+ 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver -rc -ch=sl-cfd-07
--connect-id=11111 --use-offscreen-rendering -p4pg /r
root 4969 0.0 0.0 156904 5632 pts/3 S+ 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver -rc -ch=sl-cfd-07
--connect-id=11111 --use-offscreen-rendering -p4pg /r
root 4970 0.0 0.0 6388 676 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-09 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4971 0.0 0.0 6388 672 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-08 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4972 0.0 0.0 24840 1216 ? Ss 09:36 0:00 in.rshd
-aL
root 4973 0.2 0.2 160308 36396 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-08 -p4rmrank 2
root 4978 0.0 0.0 6388 676 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-09 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4979 0.0 0.0 156344 5952 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-08 -p4rmrank 2
root 4980 0.0 0.0 6388 676 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-08 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4981 0.0 0.0 24836 1208 ? Ss 09:36 0:00 in.rshd
-aL
root 4982 0.2 0.2 160312 36392 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-08 -p4rmrank 4
root 4987 0.0 0.0 6384 676 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-09 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4988 0.0 0.0 156348 5956 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-08 -p4rmrank 4
root 4989 0.0 0.0 6384 676 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-08 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4990 0.0 0.0 24844 1212 ? Ss 09:36 0:00 in.rshd
-aL
root 4991 0.2 0.2 160308 36388 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-08 -p4rmrank 6
root 4996 0.0 0.0 6384 672 pts/3 S+ 09:36 0:00
/usr/bin/rsh sl-cfd-09 -l root -n /apps/CFD/paraview-3.0.1/bin/pvserver
sl-cfd-08 58395 \-p4amslave \-p4yourn
root 4997 0.0 0.0 156344 5956 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-08 -p4rmrank 6
SERVER 2 (SL-CFD-09)
root 2984 6.5 0.2 160308 36396 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 1
root 2989 0.0 0.0 156344 5180 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 1
root 2990 0.0 0.0 24840 1216 ? Ss 09:36 0:00 in.rshd
-aL
root 2991 7.2 0.2 160308 36392 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 3
root 2996 0.0 0.0 156344 5956 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 3
root 2997 0.0 0.0 24848 1216 ? Ss 09:36 0:00 in.rshd
-aL
root 2998 8.7 0.2 160312 36396 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 5
root 3003 0.0 0.0 156348 5964 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 5
root 3004 0.0 0.0 24848 1216 ? Ss 09:36 0:00 in.rshd
-aL
root 3005 10.0 0.2 160312 36392 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 7
root 3010 0.0 0.0 156348 5960 ? S 09:36 0:00
/apps/CFD/paraview-3.0.1/bin/pvserver sl-cfd-08 58395 4amslave
-p4yourname sl-cfd-09 -p4rmrank 7