Compiling LSDalton with PCM Solver

Problems with Dalton installation? Find answers or ask for help here
esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Compiling LSDalton with PCM Solver

Post by esmuigors » 14 Apr 2019, 20:30

Dear all,

I am trying to compile LSDalton with

Code: Select all

-DENABLE_PCMSOLVER
, but am facing a problem: make complains about redefined structures (enable_if, enable_if_c, disable_if, disable_if_c) from the Boost library. I am aware there are multiple versions of this library because of the changing C/C++ standard; I have installed the Debian packages libboost-all-dev (v. 1.62.0.1) and the available packages of libboost 1.67.0. Do I need some other libboost version, or is the problem elsewhere?

The setup command was ./setup --mpi --omp --mkl parallel --prefix=/opt -D ENABLE_PCMSOLVER=ON
The system is Debian 9.
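
In case it helps, this is how the Boost versions visible to the system can be checked (assuming the stock Debian package layout):

Code: Select all

# list installed Debian Boost packages
dpkg -l 'libboost*' | grep '^ii'
# report the version of the headers on the default include path
grep BOOST_LIB_VERSION /usr/include/boost/version.hpp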

With deep gratitude,
Igors
Attachments
make_err.log
(214.18 KiB) Downloaded 32 times
make.log
(6.86 KiB) Downloaded 31 times

rob
Posts: 22
Joined: 15 Oct 2014, 13:43
First name(s): Roberto
Last name(s): Di Remigio
Affiliation: CTCC
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by rob » 15 Apr 2019, 14:24

Hi, Boost 1.55.0 and later are fine with PCMSolver. I am confused as to which version of Boost you actually have installed and which one is picked up by CMake; I cannot find this information in the logs you posted. So, a few questions to help you troubleshoot:
1. Which compilers are you using? Do they support C++11?
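For instance, a quick smoke test with GCC (assuming g++ is the compiler CMake picks up):

Code: Select all

g++ --version
printf 'int main() { auto x = 0; return x; }\n' > cxx11_check.cpp
g++ -std=c++11 -c cxx11_check.cpp && echo "C++11 OK"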
2. Can you run a clean

Code: Select all

./setup --mpi --omp --mkl parallel --prefix=/opt -D ENABLE_PCMSOLVER=ON
and post the output?
3. Can you re-run the build in serial and verbosely

Code: Select all

env VERBOSE=1 make
and post the output?
Thank you! Roberto

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Apr 2019, 09:23

This sheds some light, as I am using Debian (a "rock stable", hence somewhat outdated, system) and g++ (version 6.3.0) requires the parameter -std=c++11. How should I include it? Just putting a compound parameter (even in single quotes) after the --cxx option brings up the following error:

Code: Select all

/bin/sh: 1: -std=c++11: not found
so I guess it is not being interpreted as I intended.

The LSDalton manual linked from daltonprogram.org still describes using ./configure instead of ./setup, and there is no Makefile.config, so I do not know which file to put this option into.
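
Or is there perhaps a setup option for extra compiler flags? If the script is autocmake-generated it might accept something like the following (just my guess; I have not verified that this release supports --extra-cxx-flags):

Code: Select all

./setup --mpi --omp --mkl parallel --prefix=/opt \
        --extra-cxx-flags="-std=c++11" -D ENABLE_PCMSOLVER=ON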

rob
Posts: 22
Joined: 15 Oct 2014, 13:43
First name(s): Roberto
Last name(s): Di Remigio
Affiliation: CTCC
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by rob » 17 Apr 2019, 15:54

Hi again, you need to edit this file:

Code: Select all

src/pcm/CMakeLists.txt
and set

Code: Select all

-DENABLE_CXX11_SUPPORT=ON
at line 23. The PCMSolver submodule's own CMake system will check whether your compiler really supports C++11 (GCC 6.3.0 is fully compliant) and enable it.
It being 2019, I should probably turn C++11 on by default...
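
If you prefer doing it from the shell, something along these lines should work, assuming the flag currently appears in that file set to OFF (do check what line 23 actually contains first):

Code: Select all

# inspect the arguments passed to the PCMSolver subproject
sed -n '15,30p' src/pcm/CMakeLists.txt
# flip the flag, assuming it currently reads -DENABLE_CXX11_SUPPORT=OFF
sed -i 's/-DENABLE_CXX11_SUPPORT=OFF/-DENABLE_CXX11_SUPPORT=ON/' src/pcm/CMakeLists.txt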

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Apr 2019, 16:24

For now, I have created a specs file in the build directory (I guess it should go there?) with the -std=c++11 option after *cc1plus:, as suggested here. The ./setup and make outputs are attached. Thank You for taking the time to help!
Attachments
SETUP_OUTPUT.log
(4.11 KiB) Downloaded 26 times
make_stserr.log
(2.56 KiB) Downloaded 28 times
make_stdout.log
(726.98 KiB) Downloaded 28 times

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Apr 2019, 16:28

Sorry, the previous answer was prepared some time ago and I got distracted by other tasks! I am now compiling with the option advised by You. No errors so far.

rob
Posts: 22
Joined: 15 Oct 2014, 13:43
First name(s): Roberto
Last name(s): Di Remigio
Affiliation: CTCC
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by rob » 17 Apr 2019, 16:31

That's neat, but it won't do. The C++11 availability check in PCMSolver will err on the side of not enabling C++11 unless you tell it to explicitly. Indeed, in the standard output you can see that it is still defaulting to C++98 (note the -std=gnu++98 flag):

Code: Select all

cd /home/igors/lsdalton/lsdalton/build/external/pcmsolver-build/src/green && /opt/openmpi-2.1.5/bin/mpicxx   -DPCMSolver_EXPORTS -DTAYLOR_CXXIO -I/home/igors/lsdalton/lsdalton/build/external/pcmsolver-build/modules -I/home/igors/lsdalton/lsdalton/external/pcmsolver/api -isystem /home/igors/lsdalton/lsdalton/external/pcmsolver/external/eigen3/include/eigen3 -isystem /home/igors/lsdalton/lsdalton/external/pcmsolver/external/libtaylor -I/home/igors/lsdalton/lsdalton/external/pcmsolver/src -I/home/igors/lsdalton/lsdalton/build/external/pcmsolver-build/include -I/home/igors/lsdalton/lsdalton/external/pcmsolver/include -isystem /home/igors/lsdalton/lsdalton/external/pcmsolver/src/utils/getkw -I/home/igors/lsdalton/lsdalton/external/pcmsolver/src/dielectric_profile  -std=gnu++98 -O3 -DNDEBUG -Wno-unused -fPIC -fvisibility=hidden -fvisibility-inlines-hidden   -o CMakeFiles/green.dir/UniformDielectric.cpp.o -c /home/igors/lsdalton/lsdalton/external/pcmsolver/src/green/UniformDielectric.cpp

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Apr 2019, 17:51

OK, now I do not have this error, but rather:

Code: Select all

gfortran: error: unrecognized command line option ‘-mkl=parallel’
I am quite puzzled by this. Files are attached. Thank You!
Attachments
SETUP_OUTPUT3.log
(4.18 KiB) Downloaded 27 times
make_stdout3.log
(744.85 KiB) Downloaded 28 times
make_stderr3.log
(3.65 KiB) Downloaded 28 times

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Apr 2019, 17:53

Now I also used the flag

Code: Select all

-DENABLE_CXX11_SUPPORT=ON
for ./setup script itself.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Apr 2019, 17:57

Sorry, I have found this thread:
viewtopic.php?f=8&t=1001&p=6286&hilit=m ... llel#p6286

So does it mean I had better remove MKL parallelism from the compilation setup, i.e. no --omp and using --mkl sequential?

rob
Posts: 22
Joined: 15 Oct 2014, 13:43
First name(s): Roberto
Last name(s): Di Remigio
Affiliation: CTCC
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by rob » 17 Apr 2019, 21:36

Ooops! Sorry, I should have spotted that inconsistency before. The GNU compilers do not recognize the -mkl option; only the Intel compilers do. You can still use MKL with the GNU compilers, though: just make sure that you set the MATH_ROOT environment variable to the MKL install folder, and CMake should pick it up automatically. The documentation at this link should hopefully still be relevant: https://dalton-installation.readthedocs ... /math.html
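
For example, assuming a system-wide Intel install (adjust the path to wherever MKL actually lives on your machine):

Code: Select all

export MATH_ROOT=/opt/intel/mkl
./setup --mpi --omp --prefix=/opt -D ENABLE_PCMSOLVER=ON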

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 18 Apr 2019, 20:59

Now I have exported MATH_ROOT=/usr/lib and removed the --mkl option altogether, and LSDalton compiled successfully. The make output actually mentions mkl_sequential, so I hope everything is fine; I will not be able to check whether the program runs for some time. Thank You very much for Your help, Dr. Di Remigio!
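
In case it is useful to others, one way to double-check which math libraries actually got linked (assuming the binary ended up as build/lsdalton.x):

Code: Select all

# run from the source root
ldd build/lsdalton.x | grep -i -E 'mkl|blas|lapack'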
Attachments
make_stdout7.log
(1.08 MiB) Downloaded 24 times
make_stderr7.log
(6.05 KiB) Downloaded 23 times

rob
Posts: 22
Joined: 15 Oct 2014, 13:43
First name(s): Roberto
Last name(s): Di Remigio
Affiliation: CTCC
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by rob » 18 Apr 2019, 21:06

Yes, this should be fine. Try running:

Code: Select all

ctest --output-on-failure -L ContinuousIntegration
to run a subset of the tests and report any errors.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 23 May 2019, 20:30

Dr. Di Remigio,

Thank You, it seems the PCM solver works now.
Unfortunately, there are some other errors in the tests. For the first test it seems to be just a misprint in the input file, but for the rest it is an apparent inability to connect to MPI. At the same time, another Dalton (not LSDalton) job is running on the machine through MPI; could that be the culprit?

The ctest output is as follows:

Code: Select all

igors@ibm2-lom:~/lsdalton/lsdalton/build$ ctest --output-on-failure -L ContinuousIntegration
Test project /home/igors/lsdalton/lsdalton/build
      Start   1: linsca/linsca_energy
 1/29 Test   #1: linsca/linsca_energy .......................***Failed    3.57 sec

running test with input files ['energy.dal', 'h2o.mol'] and args None
Traceback (most recent call last):
  File "/home/igors/lsdalton/lsdalton/test/linsca/linsca_energy/test", line 26, in <module>
    filters={'out': f})
  File "/home/igors/lsdalton/lsdalton/test/linsca/linsca_energy/../../runtest/run.py", line 61, in run
    f.write(_s)
UnicodeEncodeError: 'ascii' codec can't encode character u'\xa0' in position 269: ordinal not in range(128)

      Start   4: linsca/linsca_symtest
 2/29 Test   #4: linsca/linsca_symtest ......................   Passed    5.57 sec
      Start   7: linsca/linsca_energy_restart
 3/29 Test   #7: linsca/linsca_energy_restart ...............   Passed    4.70 sec
      Start  72: linsca/linsca_trilevel
 4/29 Test  #72: linsca/linsca_trilevel .....................   Passed   12.06 sec
      Start  73: dft/LSDALTON_dftfunc_xcfun_pbe0
 5/29 Test  #73: dft/LSDALTON_dftfunc_xcfun_pbe0 ............   Passed    5.72 sec
      Start  74: dft/LSDALTON_dftfunc_xcfun_pbe
 6/29 Test  #74: dft/LSDALTON_dftfunc_xcfun_pbe .............   Passed    6.51 sec
      Start  75: LSint/LSDALTON_dftfunc_GGAKey
 7/29 Test  #75: LSint/LSDALTON_dftfunc_GGAKey ..............   Passed   15.20 sec
      Start  78: LSint/LSDALTON_df_admm_beta_cam100_xcfun
 8/29 Test  #78: LSint/LSDALTON_df_admm_beta_cam100_xcfun ...   Passed  189.94 sec
      Start  80: LSint/LSDALTON_df_admm_beta_cam100_psfun
 9/29 Test  #80: LSint/LSDALTON_df_admm_beta_cam100_psfun ...   Passed   17.22 sec
      Start 140: LSint/LSDALTON_UHF_JENGINE_LINK
10/29 Test #140: LSint/LSDALTON_UHF_JENGINE_LINK ............   Passed    4.75 sec
      Start 147: linsca/linsca_admm_rapid
11/29 Test #147: linsca/linsca_admm_rapid ...................   Passed    6.84 sec
      Start 161: LSresponse/LSresponse_HF_alpha
12/29 Test #161: LSresponse/LSresponse_HF_alpha .............   Passed   18.60 sec
      Start 170: LSresponse/LSresponse_HF_molgra
13/29 Test #170: LSresponse/LSresponse_HF_molgra ............   Passed    3.13 sec
      Start 171: LSresponse/LSresponse_HF_opa
14/29 Test #171: LSresponse/LSresponse_HF_opa ...............   Passed    3.98 sec
      Start 172: LSresponse/LSresponse_HF_tpa
15/29 Test #172: LSresponse/LSresponse_HF_tpa ...............   Passed    6.61 sec
      Start 185: LSresponse/LSresponse_DFT_opa
16/29 Test #185: LSresponse/LSresponse_DFT_opa ..............   Passed   59.47 sec
      Start 195: LSresponse/LSresponse_xcfun_lda_molgrad
17/29 Test #195: LSresponse/LSresponse_xcfun_lda_molgrad ....   Passed   15.01 sec
      Start 197: LSresponse/LSresponse_xcfun_lda_linrsp
18/29 Test #197: LSresponse/LSresponse_xcfun_lda_linrsp .....   Passed   13.26 sec
      Start 211: geomopt/geoopt_rapid
19/29 Test #211: geomopt/geoopt_rapid .......................   Passed    7.59 sec
      Start 217: ddynam/ddyn_rapid
20/29 Test #217: ddynam/ddyn_rapid ..........................   Passed   19.02 sec
      Start 218: dectests/decmp2_energy
21/29 Test #218: dectests/decmp2_energy .....................***Failed    6.06 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2_energy', 'CO2H2']

  --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
 DEC calculations using MPI require at least two MPI processes!

running test: decmp2_energy CO2H2

      Start 228: dectests/fullccsd_4
22/29 Test #228: dectests/fullccsd_4 ........................   Passed   16.22 sec
      Start 247: dectests/decmp2_fragopt
23/29 Test #247: dectests/decmp2_fragopt ....................***Failed    1.47 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2_fragopt', 'StrangeString']

  --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
 DEC calculations using MPI require at least two MPI processes!

running test: decmp2_fragopt StrangeString

      Start 257: dectests/decmp2f12_quick
24/29 Test #257: dectests/decmp2f12_quick ...................***Failed    1.58 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2f12_quick', 'acetylene']

  --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
 DEC calculations using MPI require at least two MPI processes!

running test: decmp2f12_quick acetylene

      Start 272: pcm/energy
25/29 Test #272: pcm/energy .................................   Passed   51.85 sec
      Start 273: pcm/linear_response
26/29 Test #273: pcm/linear_response ........................   Passed   87.71 sec
      Start 275: pcm/cubic_response
27/29 Test #275: pcm/cubic_response .........................   Passed   52.05 sec
      Start 290: dectests/decmp2_energy_decco
28/29 Test #290: dectests/decmp2_energy_decco ...............***Failed    1.26 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decmp2_energy_decco', 'h2o']

  --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
 DEC calculations using MPI require at least two MPI processes!

running test: decmp2_energy_decco h2o

      Start 303: dectests/decccsdpt_quick
29/29 Test #303: dectests/decccsdpt_quick ...................***Failed    1.53 sec
ERROR: crash during ['/home/igors/lsdalton/lsdalton/build/lsdalton', '-noarch', 'decccsdpt_quick', 'He2']

  --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
 DEC calculations using MPI require at least two MPI processes!

running test: decccsdpt_quick He2


79% tests passed, 6 tests failed out of 29

Label Time Summary:
ContinuousIntegration    = 638.51 sec (29 tests)
ddynam                   =  19.02 sec (1 test)
dec                      =  28.13 sec (6 tests)
decccsd                  =   3.01 sec (2 tests)
essential                =  17.74 sec (5 tests)
f12                      =   1.58 sec (1 test)
fulldec                  =  16.22 sec (1 test)
linsca                   = 638.51 sec (29 tests)
lsresponse               = 266.67 sec (10 tests)
pcm                      = 191.62 sec (3 tests)
restart                  =   4.70 sec (1 test)
runtest                  = 191.62 sec (3 tests)
xcfun                    =  28.27 sec (2 tests)

Total Test time (real) = 638.99 sec

The following tests FAILED:
	  1 - linsca/linsca_energy (Failed)
	218 - dectests/decmp2_energy (Failed)
	247 - dectests/decmp2_fragopt (Failed)
	257 - dectests/decmp2f12_quick (Failed)
	290 - dectests/decmp2_energy_decco (Failed)
	303 - dectests/decccsdpt_quick (Failed)
Errors while running CTest

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 23 May 2019, 20:44

Also, I now have another problem... I am compiling LSDalton for another, considerably less up-to-date host (the policy on that cluster is to leave everything un-updated, to avoid breaking user environments). I had to install cmake and zlib myself, and I have run into problems with C++11 and/or the Boost library again: make reports that PCMSolver cannot find something of that kind. PCMSolver does have an option to build Boost on the fly, but I am not sure how to pass that option through the LSDalton compilation (it is not a -D option, but --build-boost). I tried doctoring the file lsdalton/external/pcmsolver/setup.py by changing

Code: Select all

command.append('-DBOOST_LIBRARYDIR="{0}"'.format(arguments['--build-boost']))
to

Code: Select all

command.append('-DFORCE_CUSTOM_BOOST={0}'.format('--build-boost'))
but nothing changed: the compilation still throws the following error:

Code: Select all

-- found mpi_f08 mod, however only the f90 module is supported, droppin back to -DUSE_MPI_MOD_F90
-- found 32bit integer mpi module
-- found an MPI 3 compatible MPI lib, setting -DVAR_HAVE_MPI3
Source check returned:
('WARNING: no documentation found for this_is_separator in ', 'src/tensor_algebra_dil.F90', ' +2031')
('WARNING: no documentation found for this_is_letter in ', 'src/tensor_algebra_dil.F90', ' +2380')
('WARNING: no documentation found for split_argument in ', 'src/tensor_algebra_dil.F90', ' +4168')
('WARNING: no documentation found for get_next_mlndx in ', 'src/tensor_algebra_dil.F90', ' +4216')
('WARNING: no documentation found for Tile_precondition_doubles_unrest in ', 'src/tensor_pdm_operations_cc.F90', ' +1765')
('WARNING: no documentation found for Tile_precondition_doubles in ', 'src/tensor_pdm_operations_cc.F90', ' +1865')
TESTSTATUS: GOOD
In file included from /home/igors/lsdalton/build/external/pcmsolver-build/include/Config.hpp(62),
                 from /home/igors/lsdalton/external/pcmsolver/src/cavity/Element.hpp(29),
                 from /home/igors/lsdalton/external/pcmsolver/src/cavity/Element.cpp(24):
/home/igors/lsdalton/external/pcmsolver/include/Cxx11Workarounds.hpp(131): catastrophic error: cannot open source file "boost/core/enable_if.hpp"
  #include <boost/core/enable_if.hpp>
                                     ^

compilation aborted for /home/igors/lsdalton/external/pcmsolver/src/cavity/Element.cpp (code 4)
make[5]: *** [src/cavity/CMakeFiles/cavity.dir/Element.cpp.o] Error 4
make[4]: *** [src/cavity/CMakeFiles/cavity.dir/all] Error 2
make[3]: *** [all] Error 2
make[2]: *** [external/pcmsolver-stamp/pcmsolver-build] Error 2
make[1]: *** [src/pcm/CMakeFiles/pcmsolver.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
Source check returned:
('WARNING: no documentation found for this_is_separator in ', 'src/tensor_algebra_dil.F90', ' +2031')
('WARNING: no documentation found for this_is_letter in ', 'src/tensor_algebra_dil.F90', ' +2380')
('WARNING: no documentation found for split_argument in ', 'src/tensor_algebra_dil.F90', ' +4168')
('WARNING: no documentation found for get_next_mlndx in ', 'src/tensor_algebra_dil.F90', ' +4216')
('WARNING: no documentation found for Tile_precondition_doubles_unrest in ', 'src/tensor_pdm_operations_cc.F90', ' +1765')
('WARNING: no documentation found for Tile_precondition_doubles in ', 'src/tensor_pdm_operations_cc.F90', ' +1865')
TESTSTATUS: GOOD
make: *** [all] Error 2
The command to setup was:

Code: Select all

./setup --mpi --prefix=$HOME/bin/lsd2018_70to89 -D ENABLE_PCMSOLVER=ON -D ENABLE_DEC=ON -D ENABLE_CXX11_SUPPORT=ON --cmake=/home/igors/cmake372/bin/cmake -D ZLIB_ROOT=/home/igors/zlib-1.2.11
The command to compile was:

Code: Select all

env VERBOSE=1 make -j16 > makelog.logg 2> makeerr.logg
EDIT: I downloaded an older Boost from www.boost.org and installed it (not all targets were built, though), and added it to the INCLUDE and LIBRARY_PATH variables, but still nothing improved in the make output. I forgot to mention that I am now using the Intel compiler (icpc (ICC) 14.0.1 20131008). What should I do? Try a different Boost version?
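Or should I instead point CMake at my own Boost build via BOOST_ROOT, which as far as I know CMake's FindBoost honours? Something like this (the Boost prefix below is hypothetical):

Code: Select all

# hypothetical prefix where Boost was installed
export BOOST_ROOT=$HOME/boost_1_58_0
./setup --mpi --prefix=$HOME/bin/lsd2018_70to89 -D ENABLE_PCMSOLVER=ON \
        -D ENABLE_DEC=ON -D ENABLE_CXX11_SUPPORT=ON \
        --cmake=/home/igors/cmake372/bin/cmake -D ZLIB_ROOT=/home/igors/zlib-1.2.11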
Thank You very much for all Your past and future help!

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 24 May 2019, 17:08

I have now installed the latest Boost version, and the compilation apparently went through successfully, but make install failed at:

Code: Select all

[100%] Built target lsdalton.x
Install the project...
-- Install configuration: "release"
-- Up-to-date: /home/igors/bin/lsd2018_70to89/dalton
-- Installing: /home/igors/bin/lsd2018_70to89/dalton/lsdalton.x
-- Set runtime path of "/home/igors/bin/lsd2018_70to89/dalton/lsdalton.x" to ""
CMake Error at cmake_install.cmake:60 (file):
  file INSTALL cannot find "/home/igors/lsdalton/build/dalton".


make: *** [install] Error 1
To me this looks like a copy-paste typo in the install script, because it is LSDalton that is being built, not Dalton… But unfortunately I cannot find a file with that name in the build directory.

What is stranger (to me) is that the command

Code: Select all

ctest --output-on-failure -L ContinousIntegration
produced the following output:

Code: Select all

Test project /home/igors/lsdalton/build
No tests were found!!!
The directory lsdalton/build/test actually contained only three Python scripts and the runtest subdirectory. I moved the contents of lsdalton/test into this folder but got the same result. Could You give me any suggestions? Many thanks for the help I have received up till now.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 07 Jun 2019, 16:34

Um... Hello?

I understand that I myself made it look like this issue is not that important to me. It was not of top importance then, but it still requires some proper advice, if anyone competent has time for it, of course.

Sorry for everything.

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 08 Jun 2019, 13:49

I'm not sure about the install part; it may very well be broken. To run the test set it is not necessary to copy files or anything else: just running 'ctest' from the build directory after a successful make should be enough. Perhaps try simply running 'ctest' without the options.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 11 Jun 2019, 21:45

Dear Prof. Olsen,

I tried running just 'ctest' and got noticeably more failed tests (but at least the tests were found this time):

Code: Select all

	  1 - linsca/linsca_energy (Failed)
	218 - dectests/decmp2_energy (Failed)
	219 - dectests/decmp2_density (Failed)
	220 - dectests/decmp2_gradient (Failed)
	221 - dectests/decmp2_geoopt (Failed)
	234 - dectests/decrimp2 (Failed)
	235 - dectests/decsosrimp2 (Failed)
	236 - dectests/decrimp2_restart (Failed)
	237 - dectests/decrimp2_density (Failed)
	238 - dectests/decrimp2_gradient (Failed)
	239 - dectests/decrimp2laplace (Failed)
	244 - dectests/decmp2_gradient_debug (Failed)
	246 - dectests/decmp2_counterpoise (Failed)
	247 - dectests/decmp2_fragopt (Failed)
	254 - dectests/fullccsd_f12 (Failed)
	255 - dectests/decmp2f12 (Failed)
	256 - dectests/decmp2f12_h2 (Failed)
	257 - dectests/decmp2f12_quick (Failed)
	258 - dectests/decrimp2_f12 (Failed)
	259 - dectests/decccsd_noniter_rif12 (Failed)
	260 - dectests/decccsd_iter_rif12 (Failed)
	261 - dectests/mldec_ccsd_ccsdpt (Failed)
	262 - dectests/mldec_hf_rimp2 (Failed)
	263 - dectests/mldec_hf_rimp2_laplace (Failed)
	264 - dectests/mldec_nolowering (Failed)
	265 - dectests/mldec_rimp2_ccsd (Failed)
	266 - dectests/mldec_rimp2_ccsdpt (Failed)
	267 - dectests/fullccsd_3 (Failed)
	268 - dectests/fullccsd_2 (Failed)
	269 - dectests/mldec_hf_rimp2_nooptskip (Failed)
	283 - dectests/decnp_rimp2_inclfull (Failed)
	284 - dectests/decmp2_StressTest (Failed)
	287 - dectests/decmp2_gradient_debug2 (Failed)
	288 - dectests/decmp2_energy_FO (Failed)
	289 - dectests/decmp2_energy_decnp (Failed)
	290 - dectests/decmp2_energy_decco (Failed)
	291 - dectests/decmp2_gradient_decco (Failed)
	292 - dectests/decmp2_geoopt_FO (Failed)
	293 - dectests/decccsd_BSSE (Failed)
	294 - dectests/decccsd_restart (Failed)
	295 - dectests/decmp2_restart (Failed)
	296 - dectests/deccc2 (Failed)
	298 - dectests/decrpa_energy (Failed)
	302 - dectests/decrpa_rsh (Failed)
	303 - dectests/decccsdpt_quick (Failed)
	304 - dectests/decccsdpt_fragopt (Failed)
	305 - dectests/decccsdpt_virt (Failed)
	306 - dectests/decccsdpt_abc_occ (Failed)
What should I check now?

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 12 Jun 2019, 08:12

Ah yeah, sorry, I should have told you to keep the "--output-on-failure" option. I'm not sure about linsca_energy, but the failing dectests are most likely related to the fact that these tests require a certain number of MPI processes. If you run, e.g., "ctest --output-on-failure -R dectests/decmp2_density" you will probably get the following message:

Code: Select all

  --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
 DEC calculations using MPI require at least two MPI processes!
To run the tests with MPI you can export the LSDALTON_LAUNCHER environment variable, e.g.,

Code: Select all

LSDALTON_LAUNCHER="mpirun -np 4"
and then run the tests again.
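
Put together, a bash session could look like this (a sketch; -np 4 assumes you have at least four free cores):

Code: Select all

export LSDALTON_LAUNCHER="mpirun -np 4"
ctest --output-on-failure -L ContinuousIntegration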

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 14 Jun 2019, 22:20

1. For the node where testing was the sole problem: I did what You told me, and the tests are running now (though slowly); some did not complete because of timeouts, and others failed (even with memory leaks and dumps posted to the output).

The summary is here:

Code: Select all

The following tests FAILED:
	  1 - linsca/linsca_energy (Failed)
	224 - dectests/fullmpmp2 (Failed)
	225 - dectests/fullmp2_energyCC (Timeout)
	227 - dectests/fullcc2 (Timeout)
	228 - dectests/fullccsd_4 (Timeout)
	231 - dectests/fullrimp2naf (Failed)
	238 - dectests/decrimp2_gradient (Failed)
	240 - dectests/snoopmp2 (Timeout)
	241 - dectests/snoopmp2_ghost (Timeout)
	243 - dectests/fullccsd_restart (Timeout)
	245 - dectests/ccsd_counterpoise (Timeout)
	248 - dectests/fullccsd_noniter_rif12 (Timeout)
	253 - dectests/fullmp2_local_f12 (Timeout)
	254 - dectests/fullccsd_f12 (Timeout)
	256 - dectests/decmp2f12_h2 (Failed)
	267 - dectests/fullccsd_3 (Timeout)
	268 - dectests/fullccsd_2 (Timeout)
	285 - dectests/fullccsdpt (Timeout)
	286 - dectests/fullccsdpt_abc (Timeout)
	288 - dectests/decmp2_energy_FO (Timeout)
	289 - dectests/decmp2_energy_decnp (Timeout)
	292 - dectests/decmp2_geoopt_FO (Timeout)
	293 - dectests/decccsd_BSSE (Timeout)
	294 - dectests/decccsd_restart (Timeout)
	296 - dectests/deccc2 (Timeout)
	297 - dectests/fullccsd_debug_mult (Timeout)
	299 - dectests/ccsd_exci_left (Timeout)
	300 - dectests/lofex_tdhf_ccsd (Timeout)
	301 - dectests/rpa_rsh (Timeout)
	304 - dectests/decccsdpt_fragopt (Timeout)
	305 - dectests/decccsdpt_virt (Timeout)
	306 - dectests/decccsdpt_abc_occ (Timeout)
	307 - dectests/fullccsdpt_print (Timeout)
	308 - dectests/fullccsdpt_abc_print (Timeout)
According to the output, the problems are mostly with RI-MP2?
Attachments
ctest_output_15jun2019.txt
Command output
(52.11 KiB) Downloaded 5 times
ctest_err_15jun2019.txt
Nothing, actually; it is the stderr output for the command.
(27 Bytes) Downloaded 4 times

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 17 Jun 2019, 21:50

2. For the node with the INSTALL problem: I did some vaguely justified things (first, ln -s lsdalton dalton to circumvent the "not found" error for $SRC_DIR/build/dalton); then I was shown the error

Code: Select all

CMake Error at cmake_install.cmake:72 (file):
 file INSTALL cannot find "/home/igors/lsdalton/build/tools"
after which I, again, made a symlink, this time to the tools directory of a successful Dalton installation. Then I tried to run the tests but received the following error for every test:

Code: Select all

can't open file '$HOME/lsdalton/test/linsca/linsca_energy/test': [Errno 2] No such file or directory
And indeed, the only contents of $HOME/lsdalton/test/ are the following:

Code: Select all

runtest/  runtest_config.py  runtest_lsdalton.py  runtest_v1.py
Then I copied the test files and directories from the LSDalton install on the first machine (from the previous post), and some tests ran OK, though many failed with

Code: Select all

Could not find executable $HOME/lsdalton/test/TEST
Some others failed outright.

What should I do now about all this? How insane was my idea of making those symlinks? Thanks in advance!
Attachments
ctest_output_17jun2019.txt
(914.88 KiB) Downloaded 3 times
ctest_err_17jun2019.txt
(6.71 KiB) Downloaded 3 times

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 18 Jun 2019, 10:43

Regarding the testing: first of all, note that the DEC functionality is disabled by default, possibly because some of its tests are failing. If you don't need this functionality, just leave it disabled. In any case, the timeouts may be related to the number of threads being set to the system default, which would result in each MPI process trying to use too many resources and would slow everything down a great deal. You have to set the number of MPI processes and OpenMP threads so that MPI * OpenMP equals the number of available physical cores. For example, if you have one node with 12 physical cores, you could set

Code: Select all

LSDALTON_LAUNCHER="mpirun -np 3"
and

Code: Select all

export OMP_NUM_THREADS=4
assuming that you're using bash. Note, however, that I'm not sure what an efficient combination is, nor whether the DEC code will benefit from OpenMP threading.
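
Putting the two together for the 12-core example (a sketch):

Code: Select all

# 12 physical cores = 3 MPI processes x 4 OpenMP threads each
export LSDALTON_LAUNCHER="mpirun -np 3"
export OMP_NUM_THREADS=4
ctest --output-on-failure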

Now, regarding your install problems: sorry, but there is just too much information in this thread, which makes it difficult to follow exactly what it is you're trying to do.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 18 Jun 2019, 21:08

Prof. Olsen and others,

I am indeed sorry for the mess I created in this topic. It went as follows:
1) I tried installing LSDalton on one machine, with a relatively up-to-date system, encountered problems, and asked for help.
2) Then, as the installation on that first machine was complete and only some testing problems persisted, I thought they would be solved straight away, and so
3) asked about another installation problem, on a machine belonging to a computational cluster with a moderately outdated system. This is where I got the problems with make install and then proceeded with some wacky moves (symlinking and copying things from other installations, both of Dalton and LSDalton). These were:
  1. ln -s lsdalton dalton to circumvent "not found" error for $HOME/lsdalton/build/dalton
  2. ln -s $SOMEWHERE/dalton/tools $HOME/lsdalton/build/tools
  3. cp [LSDalton tests from the machine mentioned in 1)] $HOME/lsdalton/build/test/
    cp [LSDalton tests from the machine mentioned in 1)] $HOME/lsdalton/test/
It is on this machine that I still got many, many tests failed or not found. Then I recognized that, since I had copied via a vfat-formatted flash drive, the TEST and MakeFileTest scripts simply did not have the 'x' bit set. I have now set it, and the testing is in progress right now (with some failures so far). I will post the output when the tests have finished. Up to now, the failures are:
  1. LSint/LSDALTON_magderiv (not magderiv2 or magderiv3, reporting memory leak)
  2. dft/LSDALTON_dftdisp_d3_ANY_FUNCTIONAL, reporting segmentation fault (also the same error for d3bj, no error for d2)
  3. linsca/linsca_admm[ANYTHING] (result absolutely far from the desired one, rel diff: 1.39e+01). What actually is linsca? Some linear-scaling method?
  4. LSresponse/LSresponse_DFT_[d]tpa (timeouts; I am really interested in this property, so how can one solve this error?)
  5. and various more
4) On the machine mentioned in 1) and 2), my only problems right now are with the DEC things and the linsca energy test (the last one just reports some error in Unicode vs. ASCII handling by a Python script in the installation). I am somewhat interested in using DEC, but it is not crucial for me. PCM is the most important thing I need right now :-) So can I consider the installation on the first machine OK to use, or should I (or maybe You :roll: ) first solve the linsca problem?

Many thanks to everyone who had already helped and are helping now!

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 19 Jun 2019, 08:02

esmuigors wrote:
18 Jun 2019, 21:08
4) On the machine mentioned in 1) and 2), my only problems right now are with the DEC things and the linsca energy test (the last one just reports some error in Unicode vs. ASCII handling by a Python script in the installation). I am somewhat interested in using DEC, but it is not crucial for me. PCM is the most important thing I need right now :-) So can I consider the installation on the first machine OK to use, or should I (or maybe You :roll: ) first solve the linsca problem?
Yes, you can most likely consider it OK (but maybe not for DEC). The linsca_energy test failure is related to some Python incompatibility, and the result is not likely to be actually wrong. Regarding DEC, I ran the test set using only MPI (but with 6 cores) and managed to get through most of the tests, but a few still failed.
