I am trying to compile LSDalton with
Code:
-DENABLE_PCMSOLVER
The setup command was: ./setup --mpi --omp --mkl=parallel --prefix=/opt -D ENABLE_PCMSOLVER=ON
System is Debian 9.
With deep gratitude,
Igors
Code:
env VERBOSE=1 make
Code:
/bin/sh: 1: -std=c++11: not found
Code:
src/pcm/CMakeLists.txt
Code:
-DENABLE_CXX11_SUPPORT=ON
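As an aside, an error of the form `/bin/sh: 1: -std=c++11: not found` usually means the compiler variable in the generated command expanded to nothing, so the shell tried to execute the flag itself as a command. A minimal sketch of that failure mode (the variable name `CXX` and file name are illustrative, not taken from the build system):

```shell
# If the compiler variable is empty, the shell sees the flag itself as
# the command name and fails just like the build log above.
CXX=""
sh -c "$CXX -std=c++11 hello.cpp" 2>&1 || true
# prints something like: sh: 1: -std=c++11: not found
```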
Code:
cd /home/igors/lsdalton/lsdalton/build/external/pcmsolver-build/src/green && /opt/openmpi-2.1.5/bin/mpicxx -DPCMSolver_EXPORTS -DTAYLOR_CXXIO -I/home/igors/lsdalton/lsdalton/build/external/pcmsolver-build/modules -I/home/igors/lsdalton/lsdalton/external/pcmsolver/api -isystem /home/igors/lsdalton/lsdalton/external/pcmsolver/external/eigen3/include/eigen3 -isystem /home/igors/lsdalton/lsdalton/external/pcmsolver/external/libtaylor -I/home/igors/lsdalton/lsdalton/external/pcmsolver/src -I/home/igors/lsdalton/lsdalton/build/external/pcmsolver-build/include -I/home/igors/lsdalton/lsdalton/external/pcmsolver/include -isystem /home/igors/lsdalton/lsdalton/external/pcmsolver/src/utils/getkw -I/home/igors/lsdalton/lsdalton/external/pcmsolver/src/dielectric_profile -std=gnu++98 -O3 -DNDEBUG -Wno-unused -fPIC -fvisibility=hidden -fvisibility-inlines-hidden -o CMakeFiles/green.dir/UniformDielectric.cpp.o -c /home/igors/lsdalton/lsdalton/external/pcmsolver/src/green/UniformDielectric.cpp
Code:
gfortran: error: unrecognized command line option ‘-mkl=parallel’
Code:
-DENABLE_CXX11_SUPPORT=ON
The GNU compilers do not understand the -mkl option; only the Intel compilers do. You can still use MKL with the GNU compilers, though. Just make sure that you set the MATH_ROOT environment variable to the MKL install folder, and CMake should pick it up automatically. The documentation at this link should hopefully still be relevant: https://dalton-installation.readthedocs ... /math.html
Code:
ctest --output-on-failure -L ContinuousIntegration
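The MATH_ROOT suggestion amounts to something like the following before re-running setup (the MKL path below is only an example; use your actual install location):

```shell
# Point LSDalton's CMake math detection at MKL instead of passing -mkl
# through the GNU compilers (the path is an example, not a known default).
export MATH_ROOT=/opt/intel/mkl
# then re-run setup without the --mkl option, e.g.:
#   ./setup --mpi --omp --prefix=/opt -D ENABLE_PCMSOLVER=ON
```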
Code:
igors@ibm2-lom:~/lsdalton/lsdalton/build$ ctest --output-on-failure -L ContinuousIntegration
Test project /home/igors/lsdalton/lsdalton/build
Start 1: linsca/linsca_energy
1/29 Test #1: linsca/linsca_energy .......................***Failed 3.57 sec
running test with input files ['energy.dal', 'h2o.mol'] and args None
Traceback (most recent call last):
  File "/home/igors/lsdalton/lsdalton/test/linsca/linsca_energy/test", line 26, in <module>
    filters={'out': f})
  File "/home/igors/lsdalton/lsdalton/test/linsca/linsca_energy/../../runtest/run.py", line 61, in run
    f.write(_s)
UnicodeEncodeError: 'ascii' codec can't encode character u'\xa0' in position 269: ordinal not in range(128)
Start 4: linsca/linsca_symtest
2/29 Test #4: linsca/linsca_symtest ...................... Passed 5.57 sec
Start 7: linsca/linsca_energy_restart
3/29 Test #7: linsca/linsca_energy_restart ............... Passed 4.70 sec
Start 72: linsca/linsca_trilevel
4/29 Test #72: linsca/linsca_trilevel ..................... Passed 12.06 sec
Start 73: dft/LSDALTON_dftfunc_xcfun_pbe0
5/29 Test #73: dft/LSDALTON_dftfunc_xcfun_pbe0 ............ Passed 5.72 sec
Start 74: dft/LSDALTON_dftfunc_xcfun_pbe
6/29 Test #74: dft/LSDALTON_dftfunc_xcfun_pbe ............. Passed 6.51 sec
Start 75: LSint/LSDALTON_dftfunc_GGAKey
7/29 Test #75: LSint/LSDALTON_dftfunc_GGAKey .............. Passed 15.20 sec
Start 78: LSint/LSDALTON_df_admm_beta_cam100_xcfun
8/29 Test #78: LSint/LSDALTON_df_admm_beta_cam100_xcfun ... Passed 189.94 sec
Start 80: LSint/LSDALTON_df_admm_beta_cam100_psfun
9/29 Test #80: LSint/LSDALTON_df_admm_beta_cam100_psfun ... Passed 17.22 sec
Start 140: LSint/LSDALTON_UHF_JENGINE_LINK
10/29 Test #140: LSint/LSDALTON_UHF_JENGINE_LINK ............ Passed 4.75 sec
Start 147: linsca/linsca_admm_rapid
11/29 Test #147: linsca/linsca_admm_rapid ................... Passed 6.84 sec
Start 161: LSresponse/LSresponse_HF_alpha
12/29 Test #161: LSresponse/LSresponse_HF_alpha ............. Passed 18.60 sec
Start 170: LSresponse/LSresponse_HF_molgra
13/29 Test #170: LSresponse/LSresponse_HF_molgra ............ Passed 3.13 sec
Start 171: LSresponse/LSresponse_HF_opa
14/29 Test #171: LSresponse/LSresponse_HF_opa ............... Passed 3.98 sec
Start 172: LSresponse/LSresponse_HF_tpa
15/29 Test #172: LSresponse/LSresponse_HF_tpa ............... Passed 6.61 sec
Start 185: LSresponse/LSresponse_DFT_opa
16/29 Test #185: LSresponse/LSresponse_DFT_opa .............. Passed 59.47 sec
Start 195: LSresponse/LSresponse_xcfun_lda_molgrad
17/29 Test #195: LSresponse/LSresponse_xcfun_lda_molgrad .... Passed 15.01 sec
Start 197: LSresponse/LSresponse_xcfun_lda_linrsp
18/29 Test #197: LSresponse/LSresponse_xcfun_lda_linrsp ..... Passed 13.26 sec
Start 211: geomopt/geoopt_rapid
19/29 Test #211: geomopt/geoopt_rapid ....................... Passed 7.59 sec
Start 217: ddynam/ddyn_rapid
20/29 Test #217: ddynam/ddyn_rapid .......................... Passed 19.02 sec
Start 218: dectests/decmp2_energy
21/29 Test #218: dectests/decmp2_energy .....................***Failed 6.06 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2_energy', 'CO2H2']
--- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
DEC calculations using MPI require at least two MPI processes!
running test: decmp2_energy CO2H2
Start 228: dectests/fullccsd_4
22/29 Test #228: dectests/fullccsd_4 ........................ Passed 16.22 sec
Start 247: dectests/decmp2_fragopt
23/29 Test #247: dectests/decmp2_fragopt ....................***Failed 1.47 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2_fragopt', 'StrangeString']
--- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
DEC calculations using MPI require at least two MPI processes!
running test: decmp2_fragopt StrangeString
Start 257: dectests/decmp2f12_quick
24/29 Test #257: dectests/decmp2f12_quick ...................***Failed 1.58 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2f12_quick', 'acetylene']
--- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
DEC calculations using MPI require at least two MPI processes!
running test: decmp2f12_quick acetylene
Start 272: pcm/energy
25/29 Test #272: pcm/energy ................................. Passed 51.85 sec
Start 273: pcm/linear_response
26/29 Test #273: pcm/linear_response ........................ Passed 87.71 sec
Start 275: pcm/cubic_response
27/29 Test #275: pcm/cubic_response ......................... Passed 52.05 sec
Start 290: dectests/decmp2_energy_decco
28/29 Test #290: dectests/decmp2_energy_decco ...............***Failed 1.26 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2_energy_decco', 'h2o']
--- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
DEC calculations using MPI require at least two MPI processes!
running test: decmp2_energy_decco h2o
Start 303: dectests/decccsdpt_quick
29/29 Test #303: dectests/decccsdpt_quick ...................***Failed 1.53 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decccsdpt_quick', 'He2']
--- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
DEC calculations using MPI require at least two MPI processes!
running test: decccsdpt_quick He2
79% tests passed, 6 tests failed out of 29
Label Time Summary:
ContinuousIntegration = 638.51 sec (29 tests)
ddynam = 19.02 sec (1 test)
dec = 28.13 sec (6 tests)
decccsd = 3.01 sec (2 tests)
essential = 17.74 sec (5 tests)
f12 = 1.58 sec (1 test)
fulldec = 16.22 sec (1 test)
linsca = 638.51 sec (29 tests)
lsresponse = 266.67 sec (10 tests)
pcm = 191.62 sec (3 tests)
restart = 4.70 sec (1 test)
runtest = 191.62 sec (3 tests)
xcfun = 28.27 sec (2 tests)
Total Test time (real) = 638.99 sec
The following tests FAILED:
1 - linsca/linsca_energy (Failed)
218 - dectests/decmp2_energy (Failed)
247 - dectests/decmp2_fragopt (Failed)
257 - dectests/decmp2f12_quick (Failed)
290 - dectests/decmp2_energy_decco (Failed)
303 - dectests/decccsdpt_quick (Failed)
Errors while running CTest
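The linsca_energy failure above is the classic ascii-codec write error: a non-ASCII character (here U+00A0, a no-break space) is written to a stream whose default codec is 'ascii'. A minimal, self-contained sketch of the failure mode and a workaround (the sample string and file name are hypothetical, not taken from runtest):

```python
# -*- coding: utf-8 -*-
# Reproduce the UnicodeEncodeError seen in the test log, then show that
# writing with an explicit UTF-8 encoding avoids it.
import io

line = u"Final energy:\xa0-76.0854"  # output line containing a no-break space

try:
    line.encode("ascii")  # what an ascii-default stream effectively does
except UnicodeEncodeError as exc:
    print("reproduced:", exc)

# Workaround: open the output file with an explicit encoding.
with io.open("filtered.out", "w", encoding="utf-8") as handle:
    handle.write(line)  # succeeds
```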
Code:
command.append('-DBOOST_LIBRARYDIR="{0}"'.format(arguments['--build-boost']))
Code:
command.append('-DFORCE_CUSTOM_BOOST={0}'.format('--build-boost'))
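For context, lines like the two quoted above follow the pattern a docopt-style setup script uses to turn parsed options into CMake -D definitions. A self-contained sketch (the arguments dict, its value, and the ON setting are hypothetical; the real script differs in detail):

```python
# Hypothetical sketch of how a setup script assembles CMake -D flags
# from parsed command-line arguments.
arguments = {"--build-boost": "/home/igors/boost"}  # example parsed value

command = []
command.append('-DBOOST_LIBRARYDIR="{0}"'.format(arguments["--build-boost"]))
command.append("-DFORCE_CUSTOM_BOOST=ON")
print(" ".join(command))
# -> -DBOOST_LIBRARYDIR="/home/igors/boost" -DFORCE_CUSTOM_BOOST=ON
```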
Code:
-- found mpi_f08 mod, however only the f90 module is supported, droppin back to -DUSE_MPI_MOD_F90
-- found 32bit integer mpi module
-- found an MPI 3 compatible MPI lib, setting -DVAR_HAVE_MPI3
Source check returned:
('WARNING: no documentation found for this_is_separator in ', 'src/tensor_algebra_dil.F90', ' +2031')
('WARNING: no documentation found for this_is_letter in ', 'src/tensor_algebra_dil.F90', ' +2380')
('WARNING: no documentation found for split_argument in ', 'src/tensor_algebra_dil.F90', ' +4168')
('WARNING: no documentation found for get_next_mlndx in ', 'src/tensor_algebra_dil.F90', ' +4216')
('WARNING: no documentation found for Tile_precondition_doubles_unrest in ', 'src/tensor_pdm_operations_cc.F90', ' +1765')
('WARNING: no documentation found for Tile_precondition_doubles in ', 'src/tensor_pdm_operations_cc.F90', ' +1865')
TESTSTATUS: GOOD
In file included from /home/igors/lsdalton/build/external/pcmsolver-build/include/Config.hpp(62),
from /home/igors/lsdalton/external/pcmsolver/src/cavity/Element.hpp(29),
from /home/igors/lsdalton/external/pcmsolver/src/cavity/Element.cpp(24):
/home/igors/lsdalton/external/pcmsolver/include/Cxx11Workarounds.hpp(131): catastrophic error: cannot open source file "boost/core/enable_if.hpp"
#include <boost/core/enable_if.hpp>
^
compilation aborted for /home/igors/lsdalton/external/pcmsolver/src/cavity/Element.cpp (code 4)
make[5]: *** [src/cavity/CMakeFiles/cavity.dir/Element.cpp.o] Error 4
make[4]: *** [src/cavity/CMakeFiles/cavity.dir/all] Error 2
make[3]: *** [all] Error 2
make[2]: *** [external/pcmsolver-stamp/pcmsolver-build] Error 2
make[1]: *** [src/pcm/CMakeFiles/pcmsolver.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
Source check returned:
('WARNING: no documentation found for this_is_separator in ', 'src/tensor_algebra_dil.F90', ' +2031')
('WARNING: no documentation found for this_is_letter in ', 'src/tensor_algebra_dil.F90', ' +2380')
('WARNING: no documentation found for split_argument in ', 'src/tensor_algebra_dil.F90', ' +4168')
('WARNING: no documentation found for get_next_mlndx in ', 'src/tensor_algebra_dil.F90', ' +4216')
('WARNING: no documentation found for Tile_precondition_doubles_unrest in ', 'src/tensor_pdm_operations_cc.F90', ' +1765')
('WARNING: no documentation found for Tile_precondition_doubles in ', 'src/tensor_pdm_operations_cc.F90', ' +1865')
TESTSTATUS: GOOD
make: *** [all] Error 2
Code:
./setup --mpi --prefix=$HOME/bin/lsd2018_70to89 -D ENABLE_PCMSOLVER=ON -D ENABLE_DEC=ON -D ENABLE_CXX11_SUPPORT=ON --cmake=/home/igors/cmake372/bin/cmake -D ZLIB_ROOT=/home/igors/zlib-1.2.11
Code:
env VERBOSE=1 make -j16 > makelog.logg 2> makeerr.logg
Code:
[100%] Built target lsdalton.x
Install the project...
-- Install configuration: "release"
-- Up-to-date: /home/igors/bin/lsd2018_70to89/dalton
-- Installing: /home/igors/bin/lsd2018_70to89/dalton/lsdalton.x
-- Set runtime path of "/home/igors/bin/lsd2018_70to89/dalton/lsdalton.x" to ""
CMake Error at cmake_install.cmake:60 (file):
file INSTALL cannot find "/home/igors/lsdalton/build/dalton".
make: *** [install] Error 1
Code:
ctest --output-on-failure -L ContinousIntegration
Code:
Test project /home/igors/lsdalton/build
No tests were found!!!
Code:
1 - linsca/linsca_energy (Failed)
218 - dectests/decmp2_energy (Failed)
219 - dectests/decmp2_density (Failed)
220 - dectests/decmp2_gradient (Failed)
221 - dectests/decmp2_geoopt (Failed)
234 - dectests/decrimp2 (Failed)
235 - dectests/decsosrimp2 (Failed)
236 - dectests/decrimp2_restart (Failed)
237 - dectests/decrimp2_density (Failed)
238 - dectests/decrimp2_gradient (Failed)
239 - dectests/decrimp2laplace (Failed)
244 - dectests/decmp2_gradient_debug (Failed)
246 - dectests/decmp2_counterpoise (Failed)
247 - dectests/decmp2_fragopt (Failed)
254 - dectests/fullccsd_f12 (Failed)
255 - dectests/decmp2f12 (Failed)
256 - dectests/decmp2f12_h2 (Failed)
257 - dectests/decmp2f12_quick (Failed)
258 - dectests/decrimp2_f12 (Failed)
259 - dectests/decccsd_noniter_rif12 (Failed)
260 - dectests/decccsd_iter_rif12 (Failed)
261 - dectests/mldec_ccsd_ccsdpt (Failed)
262 - dectests/mldec_hf_rimp2 (Failed)
263 - dectests/mldec_hf_rimp2_laplace (Failed)
264 - dectests/mldec_nolowering (Failed)
265 - dectests/mldec_rimp2_ccsd (Failed)
266 - dectests/mldec_rimp2_ccsdpt (Failed)
267 - dectests/fullccsd_3 (Failed)
268 - dectests/fullccsd_2 (Failed)
269 - dectests/mldec_hf_rimp2_nooptskip (Failed)
283 - dectests/decnp_rimp2_inclfull (Failed)
284 - dectests/decmp2_StressTest (Failed)
287 - dectests/decmp2_gradient_debug2 (Failed)
288 - dectests/decmp2_energy_FO (Failed)
289 - dectests/decmp2_energy_decnp (Failed)
290 - dectests/decmp2_energy_decco (Failed)
291 - dectests/decmp2_gradient_decco (Failed)
292 - dectests/decmp2_geoopt_FO (Failed)
293 - dectests/decccsd_BSSE (Failed)
294 - dectests/decccsd_restart (Failed)
295 - dectests/decmp2_restart (Failed)
296 - dectests/deccc2 (Failed)
298 - dectests/decrpa_energy (Failed)
302 - dectests/decrpa_rsh (Failed)
303 - dectests/decccsdpt_quick (Failed)
304 - dectests/decccsdpt_fragopt (Failed)
305 - dectests/decccsdpt_virt (Failed)
306 - dectests/decccsdpt_abc_occ (Failed)
Code:
--- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
DEC calculations using MPI require at least two MPI processes!
Code:
LSDALTON_LAUNCHER="mpirun -np 4"
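In other words, set the launcher before re-running the tests, so the DEC tests get the two or more MPI processes they require (the rank count here is an example; the variable name is the one quoted above):

```shell
# Give the lsdalton run script an MPI launcher with >= 2 ranks so DEC
# calculations no longer abort with "require at least two MPI processes".
export LSDALTON_LAUNCHER="mpirun -np 4"
# then re-run, e.g.:
#   ctest --output-on-failure -L dec
```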
Code:
The following tests FAILED:
1 - linsca/linsca_energy (Failed)
224 - dectests/fullmpmp2 (Failed)
225 - dectests/fullmp2_energyCC (Timeout)
227 - dectests/fullcc2 (Timeout)
228 - dectests/fullccsd_4 (Timeout)
231 - dectests/fullrimp2naf (Failed)
238 - dectests/decrimp2_gradient (Failed)
240 - dectests/snoopmp2 (Timeout)
241 - dectests/snoopmp2_ghost (Timeout)
243 - dectests/fullccsd_restart (Timeout)
245 - dectests/ccsd_counterpoise (Timeout)
248 - dectests/fullccsd_noniter_rif12 (Timeout)
253 - dectests/fullmp2_local_f12 (Timeout)
254 - dectests/fullccsd_f12 (Timeout)
256 - dectests/decmp2f12_h2 (Failed)
267 - dectests/fullccsd_3 (Timeout)
268 - dectests/fullccsd_2 (Timeout)
285 - dectests/fullccsdpt (Timeout)
286 - dectests/fullccsdpt_abc (Timeout)
288 - dectests/decmp2_energy_FO (Timeout)
289 - dectests/decmp2_energy_decnp (Timeout)
292 - dectests/decmp2_geoopt_FO (Timeout)
293 - dectests/decccsd_BSSE (Timeout)
294 - dectests/decccsd_restart (Timeout)
296 - dectests/deccc2 (Timeout)
297 - dectests/fullccsd_debug_mult (Timeout)
299 - dectests/ccsd_exci_left (Timeout)
300 - dectests/lofex_tdhf_ccsd (Timeout)
301 - dectests/rpa_rsh (Timeout)
304 - dectests/decccsdpt_fragopt (Timeout)
305 - dectests/decccsdpt_virt (Timeout)
306 - dectests/decccsdpt_abc_occ (Timeout)
307 - dectests/fullccsdpt_print (Timeout)
308 - dectests/fullccsdpt_abc_print (Timeout)
Code:
CMake Error at cmake_install.cmake:72 (file):
file INSTALL cannot find "/home/igors/lsdalton/build/tools"
Code:
can't open file '$HOME/lsdalton/test/linsca/linsca_energy/test': [Errno 2] No such file or directory
Code:
runtest/ runtest_config.py runtest_lsdalton.py runtest_v1.py
Code:
Could not find executable $HOME/lsdalton/test/TEST
Code:
LSDALTON_LAUNCHER="mpirun -np 3"
Code:
export OMP_NUM_THREADS=4
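Taken together, the two settings give a hybrid MPI/OpenMP run; assuming a 12-core machine, 3 ranks x 4 threads is one consistent split (the core count is an assumption, adjust to your hardware):

```shell
# Hybrid setup: 3 MPI ranks, each running 4 OpenMP threads (3 x 4 = 12 cores).
export LSDALTON_LAUNCHER="mpirun -np 3"
export OMP_NUM_THREADS=4
# then re-run the test set, e.g.:
#   ctest --output-on-failure -L ContinuousIntegration
```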
esmuigors wrote: ↑18 Jun 2019, 21:08
4) On the machine mentioned in 1) and 2), my only problems right now are with DEC things and linsca_energy (the last one just says there is some error in Unicode and ASCII usage by some Python script in the installation). I am somewhat interested in using DEC, but it is not crucial for me. PCM is the most important thing I need right now. So can I consider the installation on the first machine OK to use, or should I (or maybe you) first solve the linsca problem?

Yes, you can most likely consider it OK (but maybe not for DEC). The problem with the linsca_energy test is related to a Python incompatibility, and the result is not likely to be actually wrong. Regarding DEC, I ran the test set using only MPI (but with 6 cores) and managed to get through most of them; still, a few failed.