Ctest doesn't pass all tests #477

Open
ghostforest opened this issue Jun 20, 2024 · 14 comments

@ghostforest

Hi, I managed to compile Elmer; however, about 130 of the ctest tests fail. The following tests don't pass:

  8 - AdvReactDB_np6 (Failed)
 10 - AdvReactDBmap_np6 (Failed)
 12 - AdvReactDG_np6 (Failed)
 85 - ContactFrictionHeating (Failed)
125 - CurvedBoundaryCylH_np3 (Failed)
126 - CurvedBoundaryCylH_np8 (Failed)
147 - DisContBoundaryDoubleMortar_np8 (Failed)
153 - DisContBoundaryMortarCont_np8 (Failed)
157 - DisContBoundaryMortarContElim_np8 (Failed)
161 - DisContBoundaryMortarJump_np8 (Failed)
165 - DisContBoundaryMortarJumpB_np8 (Failed)
169 - DisContBoundaryMortarJumpC_np8 (Failed)
220 - ExtrudedMeshSlices_np6 (Failed)
249 - HarmonicNS (Failed)
264 - HelmholtzFEM (Failed)
272 - HelmholtzStructure2 (Failed)
273 - HelmholtzStructure3 (Failed)
274 - Hybrid2dMeshPartitionCyl_np8 (Failed)
275 - Hybrid2dMeshPartitionMetis_np8 (Failed)
276 - Hybrid2dMeshPartitionMetisConnect_np8 (Failed)
278 - HydrostaticNSVec-ISMIP-HOM-C (Failed)
285 - InductionHeating2 (Failed)
286 - InductionHeating3 (Failed)
291 - InternalPartitioning_np8 (Failed)
293 - InternalPartitioning2_np6 (Failed)
295 - InternalPartitioning3_np6 (Failed)
335 - MazeMeshPartitionMetisContig_np6 (Failed)
360 - MortarPoissonPartz3D_np6 (Failed)
361 - MortarPoissonPartz3D_np8 (Failed)
399 - ParallelBoundaryMapGmsh_np8 (Failed)
426 - PlatesEigenComplex (Failed)
427 - PlatesHarmonic (Failed)
434 - PoissonDB_np8 (Failed)
437 - PoissonDG_np8 (Failed)
476 - RotatingBCPoisson3Daxial_np6 (Failed)
500 - SD_HarmonicNS (Failed)
524 - SD_ViscoElasticMaxwell (Failed)
563 - Shell_with_Solid_Beam_EigenComplex (Failed)
569 - ShoeboxFsiHarmonicPlate (Failed)
570 - ShoeboxFsiStatic (Failed)
572 - ShoeboxFsiStaticShell (Failed)
607 - StressConstraintModes3 (Failed)
633 - VectorHelmholtzImpMatrix (Failed)
634 - VectorHelmholtzWaveguide (Failed)
635 - VectorHelmholtzWaveguide2 (Failed)
636 - VectorHelmholtzWaveguide3 (Failed)
637 - VectorHelmholtzWaveguide4 (Failed)
639 - VectorHelmholtzWaveguideNodal (Failed)
640 - VectorHelmholtzWaveguideQuadBlock (Failed)
641 - VectorHelmholtzWaveguide_TM (Failed)
648 - WinkelBmPoissonCgIlu0_np8 (Failed)
650 - WinkelBmPoissonIdrsIlu0_np8 (Failed)
651 - WinkelPartitionMetis_np8 (Failed)
652 - WinkelPartitionMetisConnect_np8 (Failed)
653 - WinkelPartitionMetisRec_np8 (Failed)
654 - WinkelPartitionRecursive_np8 (Failed)
655 - WinkelPartitionRecursiveHaloBC_np8 (Failed)
656 - WinkelPartitionRecursiveLevel2_np8 (Failed)
663 - WinkelPoissonMetisKwayDual_np8 (Failed)
666 - WinkelPoissonMetisKwayNodal_np8 (Failed)
667 - WinkelPoissonPartitionRecursive_np8 (Failed)
693 - circuits2D_harmonic_stranded_homogenization (Failed)
694 - circuits2D_scan_harmonics (Failed)
702 - circuits_harmonic_foil (Failed)
703 - circuits_harmonic_foil_anl_rotm (Failed)
704 - circuits_harmonic_foil_wvector (Failed)
705 - circuits_harmonic_homogenization_coil_solver (Failed)
706 - circuits_harmonic_massive (Failed)
707 - circuits_harmonic_stranded (Failed)
708 - circuits_harmonic_stranded_homogenization (Failed)
738 - freesurf_maxd_np4 (Failed)
741 - freesurf_maxd_local_np4 (Failed)
766 - linearsolvers_cmplx (Failed)
768 - mgdyn2D_compute_average_b (Failed)
769 - mgdyn2D_compute_bodycurrent (Failed)
770 - mgdyn2D_compute_complex_power (Failed)
773 - mgdyn2D_em_harmonic (Failed)
774 - mgdyn2D_harmonic_anisotropic_permeability (Failed)
776 - mgdyn2D_scan_homogenization_elementary_solutions (Failed)
778 - mgdyn_3phase (Failed)
782 - mgdyn_airgap_force_np2 (Failed)
783 - mgdyn_airgap_harmonic (Failed)
799 - mgdyn_harmonic (Failed)
800 - mgdyn_harmonic_loss (Failed)
801 - mgdyn_harmonic_wire (Failed)
802 - mgdyn_harmonic_wire_Cgauge (Failed)
803 - mgdyn_harmonic_wire_Cgauge_automatic (Failed)
804 - mgdyn_harmonic_wire_impedanceBC (Failed)
805 - mgdyn_harmonic_wire_impedanceBC2 (Failed)
807 - mgdyn_lamstack_lowfreq_harmonic (Failed)
809 - mgdyn_lamstack_widefreq_harmonic (Failed)
814 - mgdyn_steady_coils (Failed)
828 - mgdyn_thinsheet_harmonic (Failed)
832 - mgdyn_torus_harmonic (Failed)
835 - mgdyn_wave_eigen (Failed)
836 - mydyn_wave_harmonic (Failed)
867 - poisson_transient_conforming_anti_np8 (Failed)
868 - radiation (Failed)
869 - radiation2 (Failed)
870 - radiation2d (Failed)
871 - radiation2dAA (Failed)
872 - radiation2d_deform (Failed)
873 - radiation2d_spectral (Failed)
874 - radiation2dsymm (Failed)
875 - radiation3d (Failed)
876 - radiation_bin (Failed)
877 - radiation_dg (Failed)
878 - radiator2d (Failed)
879 - radiator3d (Failed)
880 - radiator3d_box (Failed)
881 - radiator3d_box2 (Failed)
882 - radiator3d_open (Failed)
883 - radiator3d_radiosity (Failed)
884 - radiator3d_spectral (Failed)
885 - radiator3d_symm (Failed)

How do I solve this issue?

@raback
Contributor

raback commented Jun 20, 2024

There may be several different reasons for this. For example, I suspect that you cannot run with 6 or 8 MPI tasks; all such tests (with suffixes _np6 and _np8) seem to fail.

I suggest that you pick representative tests and check the logs. For example, for "radiation" you have the files:

elmeruser@elmer-VirtualBox:~/Source/builddir_devel/fem/tests/radiation$ ls test-std*
test-stderr_1.log  test-stdout_1.log

What is the content there? It is often easier to locate the issue if you raise the "Max Output Level" in the *.sif file to 20 or so, to make the solver more verbose. The reason why "radiation" fails is probably the same for all of tests 868-885.
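
For example (a minimal sketch; the other keywords already in the test's Simulation section stay as they are):

Simulation
  ! ... existing simulation keywords ...
  Max Output Level = 20
End

After raising the level you can rerun just that one test with ctest's standard filter, e.g. "ctest -R radiation --output-on-failure".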

-Peter

@ghostforest
Author

ghostforest commented Jun 20, 2024

I raised the "Max Output Level" to 30 and reran "ctest -j4" from my build directory, which is ~/User/elmer/elmerfem/build2.

test-stderr_1.log is empty,
and this is the log from "test-stdout_1.log":

ELMER SOLVER (v 9.0) STARTED AT: 2024/06/20 14:15:05
ParCommInit:  Initialize #PEs:            1
MAIN: 
MAIN: =============================================================
MAIN: ElmerSolver finite element software, Welcome!
MAIN: This program is free software licensed under (L)GPL
MAIN: Copyright 1st April 1995 - , CSC - IT Center for Science Ltd.
MAIN: Webpage http://www.csc.fi/elmer, Email [email protected]
MAIN: Version: 9.0 (Rev: 879006d32, Compiled: 2024-06-15)
MAIN:  Running one task without MPI parallelization.
MAIN:  Running with just one thread per task.
MAIN: =============================================================
MAIN: 
MAIN: 
MAIN: -------------------------------------
MAIN: Reading Model: radiation.sif
LoadInputFile: Scanning input file: radiation.sif
LoadInputFile: Scanning only size info
LoadInputFile: First time visiting
LoadInputFile: Reading base load of sif file
LoadInputFile: Loading input file: radiation.sif
LoadInputFile: Reading base load of sif file
LoadInputFile: Number of BCs: 3
LoadInputFile: Number of Body Forces: 1
LoadInputFile: Number of Initial Conditions: 1
LoadInputFile: Number of Materials: 2
LoadInputFile: Number of Equations: 1
LoadInputFile: Number of Solvers: 1
LoadInputFile: Number of Bodies: 2
ListTagKeywords: Setting weight for keywords!
ListTagKeywords: No parameters width suffix: normalize by area
ListTagKeywords: Setting weight for keywords!
ListTagKeywords: No parameters width suffix: normalize by volume
Loading user function library: [HeatSolve]...[HeatSolver_Init0]
LoadMesh: Loading serial mesh!
ElmerAsciiMesh: Performing step: 1
ElmerAsciiMesh: Base mesh name: ./radiation
ReadHeaderFile: Reading header info from file: ./radiation/mesh.header
InitializeMesh: Number of nodes in mesh: 148
InitializeMesh: Number of bulk elements in mesh: 105
InitializeMesh: Number of boundary elements in mesh: 76
InitializeMesh: Initial number of max element nodes: 4
ElmerAsciiMesh: Performing step: 2
ReadNodesFile: Reading nodes from file: ./radiation/mesh.nodes
SetMeshDimension: Dimension of mesh is: 2
SetMeshDimension: Max dimension of mesh is: 2
ElmerAsciiMesh: Performing step: 3
ReadElementsFile: Reading bulk elements from file: ./radiation/mesh.elements
ElmerAsciiMesh: Performing step: 4
ReadBoundaryFile: Reading boundary elements from file: ./radiation/mesh.boundary
PermuteNodeNumbering: Performing node mapping
MapBodiesAndBCs: Skipping remapping of bodies
MapBodiesAndBCs: Remapping boundaries
MapBodiesAndBCs: Minimum initial boundary index: 1
MapBodiesAndBCs: Maximum initial boundary index: 3
ElmerAsciiMesh: Performing step: 5
ElmerAsciiMesh: Performing step: 6
LoadMesh: Loading mesh done
NonNodalElements: Element dofs max: 4
LoadMesh: Preparing mesh done
LoadMesh: Elapsed REAL time:     0.0017 (s)
MeshStabParams: Computing stabilization parameters
MeshStabParams: Elapsed REAL time:     0.0003 (s)
LoadModel: Defined radition solver by Equation name "heat equation"
CompleteModelKeywords: Completing keywords for mortars and mechanics!
MAIN: -------------------------------------
AddMeshCoordinates: Setting mesh coordinates and time
VariableAdd: Adding variable > Coordinate 1 < of size 148
VariableAdd: Adding variable > Coordinate 2 < of size 148
VariableAdd: Adding variable > Coordinate 3 < of size 148
AddSolvers: Setting up 1 solvers
AddSolvers: Setting up solver 1: heat equation
AddEquationBasics: Setting up keywords internally for legacy solver: heat equation
AddEquationBasics: Using procedure: HeatSolve HeatSolver
AddEquationBasics: Setting up solver: heat equation
Loading user function library: [HeatSolve]...[HeatSolver_Init]
AddEquationBasics: Checking for _init solver
Loading user function library: [HeatSolve]...[HeatSolver_bulk]
AddEquationBasics: Checking for _bulk solver
Loading user function library: [HeatSolve]...[HeatSolver]
AddEquationBasics: Treating variable string: temperature
AddEquationBasics: Creating standard variable: temperature
AddEquationBasics: Computing size of permutation vector
RadiationFactors: ----------------------------------------------------
RadiationFactors: Computing radiation factors for heat transfer
RadiationFactors: ----------------------------------------------------
RadiationFactors: Using sparse matrix format for factor computations.
RadiationFactors: Using direct solver for radiation factors
RadiationFactors: Total number of Radiation Surfaces 39 out of 76
RadiationFactors: Temporarily updating the mesh.nodes file!
ERROR:: systemc: Command exit status was 6
RadiationFactors: Computing area info for set 1
RadiationFactors: Loading view factors!
WARNING:: RadiationFactors: View Factors File does NOT exist: ./radiation/ViewFactors.dat
RadiationFactors: All done time (s)                      1.2359E-01
RadiationFactors: ----------------------------------------------------
AddEquationBasics: Maximum size of permutation vector is: 148
AddEquationBasics: Creating solver matrix topology
CreateMatrix: Creating initial permutation
CreateMatrix: Creating inverse of initial order of size: 148
CreateMatrix: Creating list matrix for equation: heat equation
MakeListMatrix: Creating list matrix
MakeListMatrix: Adding radiation matrix
MakeListMatrix: Done Adding radiation matrix
OptimizeBandwidth: ---------------------------------------------------------
OptimizeBandwidth: Computing matrix structure for: heat equation
OptimizeBandwidth: Initial bandwidth for heat equation: 15
OptimizeBandwidth: Optimized bandwidth for heat equation: 12
OptimizeBandwidth: ---------------------------------------------------------
CreateMatrix: Initializing list matrix for equation
CRS_CreateMatrix: Creating CRS Matrix of size: 148
CRS_CreateMatrix: Creating CRS Matrix with nofs: 1070
CreateMatrix: Sparse matrix created
AddEquationBasics: Number of rows in CRS matrix: 148
AddEquationBasics: Creating solver variable
VariableAdd: Adding variable > temperature < of size 148
AddSolvers: Setting up solvers done
AddTimeEtc: Setting time and other global variables
VariableAdd: Adding variable > Time < of size 1
VariableAdd: Adding variable > Timestep < of size 1
VariableAdd: Adding variable > Timestep size < of size 1
VariableAdd: Adding variable > Timestep interval < of size 1
VariableAdd: Adding variable > nonlin iter < of size 1
VariableAdd: Adding variable > coupled iter < of size 1
VariableAdd: Adding variable > Partition < of size 1
MAIN: Random seed initialized to: 314159265
SetInitialConditions: Setting up initial conditions (if any)
InitCond: Initial conditions for 2D mesh:radiation
VectorValuesRange:  [min,max,sum] for PreInit: temperature:   0.0000000000000000        0.0000000000000000        0.0000000000000000
InitCond: Trying to initialize variable: coordinate 1
InitCond: Trying to initialize variable: coordinate 2
InitCond: Trying to initialize variable: coordinate 3
InitCond: Trying to initialize variable: temperature
InitCond: Trying to initialize variable: time
InitCond: Trying to initialize variable: timestep
InitCond: Trying to initialize variable: timestep size
InitCond: Trying to initialize variable: timestep interval
InitCond: Trying to initialize variable: nonlin iter
InitCond: Trying to initialize variable: coupled iter
InitCond: Trying to initialize variable: partition
ListInitElementKeyword: Treating keyword: temperature
ListInitElementKeyword: Initiated handle for: > temperature < of type: 4
VectorValuesRange:  [min,max,sum] for PostInit: temperature:   250.00000000000000        250.00000000000000        37000.000000000000
MAIN: Number of timesteps to be saved: 1
MAIN: 
MAIN: -------------------------------------
MAIN:  Steady state iteration:            1
MAIN: -------------------------------------
MAIN: 
SolveEquations: Solvers before timestep
SolveEquations: Solvers in main iteration loop
SolveEquations: Performing set of solvers in sequence
ListTagCount: Counting tags for keyword normalization!
SetActiveElementsTable: Creating active element table for: heat equation
SetActiveElementsTable: Number of active elements found : 105
SingleSolver: Attempting to call solver: 1
SingleSolver: Solver Equation string is: heat equation
HeatSolver: -------------------------------------------
HeatSolver: Solving the energy equation for temperature
DefaultStart: Starting solver: heat equation
HeatSolve: 
HeatSolve: 
HeatSolve: -------------------------------------
HeatSolve:  TEMPERATURE ITERATION           1
HeatSolve: -------------------------------------
HeatSolve: 
HeatSolve: Starting Assembly...
InitializeToZero: Initializing the linear system to zero
HeatSolve: Assembly:
GetNOFActive: Number of active elements: 105
VectorValuesRange:  [min,max,sum] for A_bulk:  -2.6111111111326677        13.555555555550669        2.3869795029440866E-015
VectorValuesRange:  [min,max,sum] for b_bulk:   0.0000000000000000        41.666666666750018        400.00000000000011
ERROR:: ComputeRadiationLoad: Gebhart factors not calculated for boundary!

It seems something is missing?

@juharu
Contributor

juharu commented Jun 20, 2024 via email

@ghostforest
Author

ghostforest commented Jun 20, 2024

adrian@MacBook-Pro-von-Adrian build2 % ViewFactors
dyld[15337]: Library not loaded: '@rpath/libmpi_stubs.dylib'
  Referenced from: '/usr/local/bin/ViewFactors'
  Reason: tried: '/usr/local/bin/libmpi_stubs.dylib' (no such file), '/usr/local/Cellar/gcc/14.1.0_1/lib/gcc/current/gcc/x86_64-apple-darwin21/14/libmpi_stubs.dylib' (no such file), '/usr/local/Cellar/gcc/14.1.0_1/lib/gcc/current/gcc/libmpi_stubs.dylib' (no such file), '/usr/local/Cellar/gcc/14.1.0_1/lib/gcc/current/libmpi_stubs.dylib' (no such file), '/usr/local/bin/../lib/elmersolver/libmpi_stubs.dylib' (no such file), '/usr/local/bin/libmpi_stubs.dylib' (no such file), '/usr/local/Cellar/gcc/14.1.0_1/lib/gcc/current/gcc/x86_64-apple-darwin21/14/libmpi_stubs.dylib' (no such file), '/usr/local/Cellar/gcc/14.1.0_1/lib/gcc/current/gcc/libmpi_stubs.dylib' (no such file), '/usr/local/Cellar/gcc/14.1.0_1/lib/gcc/current/libmpi_stubs.dylib' (no such file), '/usr/local/bin/../lib/elmersolver/libmpi_stubs.dylib' (no such file), '/usr/local/lib/libmpi_stubs.dylib' (no such file), '/usr/lib/libmpi_stubs.dylib' (no such file)
zsh: abort      ViewFactors

Yes, some dynamic library seems to be missing.
A quick search told me it exists at ~/User/elmer/elmerfem/build2/fem/src.

Why is that? I have had multiple problems with files ending up in the wrong places. Should I simply copy that file to /usr/local/bin/?
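
From the dyld search list above, the loader also looks in /usr/local/bin/../lib/elmersolver/, i.e. /usr/local/lib/elmersolver/, so I guess either of these might work (just a sketch, with the paths from my machine):

cp ~/User/elmer/elmerfem/build2/fem/src/libmpi_stubs.dylib /usr/local/lib/elmersolver/
# or, without copying anything, point dyld at the build tree for the current shell:
export DYLD_LIBRARY_PATH=~/User/elmer/elmerfem/build2/fem/src:$DYLD_LIBRARY_PATH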

Thank you so much for your help!

@juharu
Contributor

juharu commented Jun 20, 2024 via email

@ghostforest
Author

ghostforest commented Jun 20, 2024

I copied the file to "elmer_install_dir/lib/elmersolver" and into /usr/local/bin.

This did something. Before: "87% tests passed, 115 tests failed out of 918".
Now: "90% tests passed, 96 tests failed out of 918".

Looking at those tests:

8 - AdvReactDB_np6 (Failed)
10 - AdvReactDBmap_np6 (Failed)
12 - AdvReactDG_np6 (Failed)

I notice that the "test-stdout_6.log" files are all empty. No idea.

This was my cmake command. Maybe I set something up wrong, but I had to add all those flags because it wouldn't compile otherwise.

cmake -DCMAKE_C_COMPILER=/usr/local/Cellar/gcc/14.1.0_1/bin/gcc-14 \
-DCMAKE_CXX_COMPILER=/usr/local/Cellar/gcc/14.1.0_1/bin/g++-14 \
-DWITH_ELMERGUI=TRUE \
-DWITH_OpenMP:BOOLEAN=TRUE \
-DCMAKE_PREFIX_PATH="/usr/local/Cellar/open-mpi/5.0.3_1;$(brew --prefix libomp)" \
-DOpenMP_CXX_FLAGS="-Xpreprocessor -fopenmp -I$(brew --prefix libomp)/include" \
-DOpenMP_CXX_LIB_NAMES="omp" \
-DOpenMP_omp_LIBRARY=$(brew --prefix libomp)/lib/libomp.a \
-DCMAKE_EXE_LINKER_FLAGS="-L/usr/local -liomp5" \
-DCMAKE_SHARED_LINKER_FLAGS="-L/usr/local -liomp5" \
-DCMAKE_INSTALL_PREFIX=../install ..

As of now, the radiation tests seem to be fixed by adding libmpi_stubs.dylib to the correct directory.
Every test with a suffix of _np6 or _np8 fails, and some _np4 tests fail too. What are _np6 and _np8 for?

@ghostforest
Author

ghostforest commented Jun 21, 2024

I looked into "85 - ContactFrictionHeating (Failed)", reran it with Max Output Level = 30, and found:

WARNING:: CompareToReferenceSolution: Solver 2 FAILED:  Norm = 3.01112493E-06  RefNorm = 9.94112870E-02
CompareToReferenceSolution: Relative Error to reference norm: 9.999697E-01
ReloadInputFile: Realoading input file
LoadInputFile: Loading input file:
WARNING:: CompareToReferenceSolution: FAILED 1 tests out of 1!
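
If I read those numbers right, the relative error is simply |Norm - RefNorm| / RefNorm = |3.01112493E-06 - 9.94112870E-02| / 9.94112870E-02 ≈ 0.99997, i.e. the computed norm is essentially zero compared to the reference.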

The only explanation I can think of, if this test fails only for me, is that I compiled Elmer against some wrongly linked library that works but produces numerical errors or just handles things differently. I don't know what to do from here.

@ghostforest
Author

Any updates on this? Sadly, I need to get this to work...

@juharu
Contributor

juharu commented Jun 28, 2024

The "ContactFrictionHeating"- test fails for me too, one of the known failures,
other known failures include at least

CurvedBoundaryCylH_np3 (Failed)
CurvedBoundaryCylH_np8 (Failed)
ContactFrictionHeating (Failed)
ShoeboxFsiStatic (Failed)
ShoeboxFsiStaticShell (Failed)
StressConstraintModes3 (Failed)

The *_np6 tests run with 6 MPI tasks, for example. Maybe you could post the logs from some other test(s) so we could spot what is going on...
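
A quick sanity check of that, assuming Open MPI's mpirun is on your PATH:

mpirun -np 6 hostname
# if this aborts with something like "not enough slots", the runtime refuses to oversubscribe your cores:
mpirun --oversubscribe -np 6 hostname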

@juharu
Contributor

juharu commented Jun 28, 2024

> Looking at those tests:
>
> 8 - AdvReactDB_np6 (Failed)
> 10 - AdvReactDBmap_np6 (Failed)
> 12 - AdvReactDG_np6 (Failed)
>
> I notice that the "test-stdout_6.log" files are all empty. No idea.

How about test-stderr_6.log? Anyway, might it be the case that you can't run parallel jobs with 6 or more tasks?

@juharu
Contributor

juharu commented Jun 29, 2024

As an observation, the largest group of failing cases besides the parallel ones are the complex-valued cases: harmonic, complex, Helmholtz, etc.
I have a vague recollection of seeing such troubles on Macs before...

@raback
Contributor

raback commented Jul 1, 2024

I added a keyword to alternate the order in which nodal loads are computed vs. set when "apply limiter" is turned on. The problem is that sometimes you need to use the loads before determining the limiter, and sometimes the opposite. The updated test case "ContactFrictionHeating" now works in devel.
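
A sketch of how to pick this up, assuming your source tree is a git clone of elmerfem and you reuse the existing build directory:

git -C ~/User/elmer/elmerfem checkout devel
git -C ~/User/elmer/elmerfem pull
cmake --build ~/User/elmer/elmerfem/build2 --target install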

@ghostforest
Author

> > Looking at those tests:
> >
> > 8 - AdvReactDB_np6 (Failed)
> > 10 - AdvReactDBmap_np6 (Failed)
> > 12 - AdvReactDG_np6 (Failed)
> >
> > I notice that the "test-stdout_6.log" files are all empty. No idea.
>
> How about test-stderr_6.log? Anyway, might it be the case that you can't run parallel jobs with 6 or more tasks?

It's a quad core, but hyper-threaded, so 8, I'd suppose; my bad if that is not the case.

@tzwinger
Contributor

tzwinger commented Jul 16, 2024

If you have a quad-core, it might be that the MPI implementation by default doesn't allow runs exceeding that number of tasks (i.e., 4). One can limit the MPI tests by setting the CMake variable MPI_TEST_MAXPROC, or alternatively add the (Open MPI) option --oversubscribe, perhaps via one of these (I can't really tell which exactly): MPIEXEC_NUMPROC_FLAG, MPIEXEC_POSTFLAGS, MPIEXEC_PREFLAGS.
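
For example (a sketch; I can't verify which of the MPIEXEC_* variables Elmer's test driver actually picks up):

cmake -DMPI_TEST_MAXPROC=4 ..                    # cap the parallel tests at 4 tasks
cmake -DMPIEXEC_PREFLAGS="--oversubscribe" ..    # or pass --oversubscribe through to mpiexec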
