Compiling LSDalton with PCM Solver

Problems with Dalton installation? Find answers or ask for help here
magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 19 Jun 2019, 08:09

esmuigors wrote:
18 Jun 2019, 21:08
3) asked about another installation problem, on a machine belonging to a computational cluster with a moderately outdated system. This is where I got problems with make install and then proceeded with some wacky moves (symlinking and copying things from other installations, both of Dalton and LSDalton). These were:
  1. ln -s lsdalton dalton to circumvent "not found" error for $HOME/lsdalton/build/dalton
  2. ln -s $SOMEWHERE/dalton/tools $HOME/lsdalton/build/tools
  3. cp [LSDalton tests from the machine mentioned in 1)] $HOME/lsdalton/build/test/
    cp [LSDalton tests from the machine mentioned in 1)] $HOME/lsdalton/test/
It is on this machine that I still got many, many tests failed or not found. Then I recognized that, as I copied via a vfat-formatted flash drive, the TEST and MakeFileTest scripts just did not have the 'x' bit set. I have now set it, and the testing is in progress right now (with some failures so far). I will post the output when the tests have ended. So far, the failures are:
  1. LSint/LSDALTON_magderiv (not magderiv2 or magderiv3, reporting memory leak)
  2. dft/LSDALTON_dftdisp_d3_ANY_FUNCTIONAL, reporting segmentation fault (also the same error for d3bj, no error for d2)
  3. linsca/linsca_admm[ANYTHING] (results very far from the reference, rel. diff. 1.39e+01). What actually is linsca? Some linear-scaling method?
  4. LSresponse/LSresponse_DFT_[d]tpa (timeouts; I am really interested in this property, how can one solve this error?)
  5. and various others
What were your steps? The way to do it is

Code: Select all

./setup [some option] build
cd build
make
If the make was successful, you should run the test set (after setting LSDALTON_LAUNCHER and OMP_NUM_THREADS to something reasonable):

Code: Select all

ctest [--output-on-failure]
If this is successful, you can either use lsdalton from the build directory or install it somewhere else (and this is where it doesn't work for you?). There is no need to run the tests after install.
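
For example, a minimal sketch of that environment setup before running the tests (the launcher string and thread count here are placeholder values; pick what suits your machine):

Code: Select all

export LSDALTON_LAUNCHER="mpirun -np 4"
export OMP_NUM_THREADS=1
ctest --output-on-failure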

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 24 Jun 2019, 20:00

magnus wrote:
19 Jun 2019, 08:09
What were your steps? The way to do it is

Code: Select all

./setup [some option] build
cd build
make
Yes, I indeed did so, but after running make I also ran make install, and this was where some errors appeared, which I "solved" by the "wacky" symlink creation. I thought I needed to run it (and I did on the first machine, I think) because the installation prefix submitted to the ./setup script was different from the current directory. The full commands I used are as follows:

Code: Select all

./setup --mpi --prefix=$HOME/bin/lsd2018_70to89 -D ENABLE_PCMSOLVER=ON -D ENABLE_DEC=ON -D ENABLE_CXX11_SUPPORT=ON --cmake=$HOME/cmake372/bin/cmake -D ZLIB_ROOT=$HOME/zlib-1.2.11
cd build
env VERBOSE=1 make -j16 > make_output_19jun2019 2> make_errors_19jun2019
make install
ctest --output-on-failure > ctest_output_19jun2019 2> ctest_err_19jun2019
I also finally got the results of the testing. The file with them is linked via Google Drive because it is too large to be uploaded to the forum. There are actually quite a lot of errors. Stderr, on the other hand, was empty this run (except for the title line).

So, where am I now and how to proceed? The errors look quite worrisome, especially for the PCM tests which I am interested in, but also for geometry optimization and Grimme's dispersion correction.

EDIT: OK, PCM errors actually seem to be solvable by asking the system administrator to install the missing python module. The other errors, however, are not...

Link to the file: https://drive.google.com/open?id=1I9el4 ... yLSou1JRMq

taylor
Posts: 532
Joined: 15 Oct 2013, 05:37
First name(s): Peter
Middle name(s): Robert
Last name(s): Taylor
Affiliation: Tianjin University
Country: China

Re: Compiling LSDalton with PCM Solver

Post by taylor » 25 Jun 2019, 08:25

I am not able to build LSDalton with PCMsolver, even after (I think) following the various suggestions. With the setup command
# setup command was executed 25-June-2019 14:22:01
./setup --fc=mpiifort --cc=mpiicc --cxx=mpiicpc --type=release --omp --scalapack --blacs=intelmpi --explicit-libs="-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -liomp5 -lpthread -lm -ldl" --mkl=parallel -D ENABLE_DEC=ON -D ENABLE_PCMSOLVER=ON -DENABLE_CXX11_SUPPORT=ON build_solv

# cmake command generated by this setup command was:
# FC=mpiifort CC=mpiicc CXX=mpiicpc cmake -DENABLE_MPI=ON -DENABLE_SGI_MPT=OFF -DENABLE_OMP=ON -DENABLE_64BIT_INTEGERS=OFF -DENABLE_GPU=OFF -DENABLE_CUBLAS=OFF -DENABLE_CSR=OFF -DENABLE_SCALASCA=OFF -DENABLE_VAMPIRTRACE=OFF -DENABLE_TIMINGS=OFF -DENABLE_STATIC_LINKING=OFF -DENABLE_SCALAPACK=ON -DBLACS_IMPLEMENTATION=intelmpi -DENABLE_AUTO_BLAS=ON -DENABLE_AUTO_LAPACK=ON -DMKL_FLAG="-mkl=parallel" -DENABLE_AUTO_BLAS=OFF -DENABLE_AUTO_LAPACK=OFF -DEXPLICIT_LIBS="-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -liomp5 -lpthread -lm -ldl" -DCMAKE_ARGS="--no-warn-unused-cli" -DCMAKE_BUILD_TYPE=release -DENABLE_DEC=ON -DENABLE_PCMSOLVER=ON -DENABLE_CXX11_SUPPORT=ON /home/taylor/src/2018/lsdalton


I get the errors seen in the attached (truncated) make output. Everything up to that point worked. Note that the inclusion of -DENABLE_CXX11_SUPPORT=ON seems to have no effect: I got the same errors when I first tried without it.

Boost is 1.70.0, by the way: standard install. Anyone got any ideas?

Best regards
Pete
Attachments
slask.out
Output from make failure
(13.32 KiB)

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 25 Jun 2019, 14:57

Hi Pete,
Perhaps it will work if you follow Roberto's suggestion quoted below. At least that did the trick for me on Fedora 30.
rob wrote:
17 Apr 2019, 15:54
Hi again, you need to edit this file:

Code: Select all

src/pcm/CMakeLists.txt
and set

Code: Select all

-DENABLE_CXX11_SUPPORT=ON
at line 23. The PCMSolver submodule's own CMake system will check whether your compiler really supports C++11 (GCC 6.3.0 is fully compliant) and enable it.
It being 2019, I should probably turn C++11 on by default...
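
In case it helps to find the right spot before editing, a hypothetical check (the exact line number varies between LSDalton versions):

Code: Select all

grep -n -e ENABLE_CXX11_SUPPORT -e CMAKE_ARGS src/pcm/CMakeLists.txt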

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 25 Jun 2019, 15:16

esmuigors wrote:
24 Jun 2019, 20:00
So, where am I now and how to proceed? The errors look quite worrisome, especially for the PCM tests which I am interested in, but also for geometry optimization and Grimme's dispersion correction.

EDIT: OK, PCM errors actually seem to be solvable by asking the system administrator to install the missing python module. The other errors, however, are not...

Link to the file: https://drive.google.com/open?id=1I9el4 ... yLSou1JRMq
Can you provide the output from the setup? Probably you already mentioned it, but which compiler are you using? Also which version of LSDalton are you compiling?

taylor
Posts: 532
Joined: 15 Oct 2013, 05:37
First name(s): Peter
Middle name(s): Robert
Last name(s): Taylor
Affiliation: Tianjin University
Country: China

Re: Compiling LSDalton with PCM Solver

Post by taylor » 26 Jun 2019, 05:33

That does not work for me either, but I get a new error (see below). In this case, having enabled CXX11 support in src/pcm/CMakeLists.txt, I did not use the directive on the setup command line, by the way.

cd /home/taylor/src/2018/lsdalton/build_solv/external/pcmsolver-build/src/green && /apps/intelmpi/5.1/compilers_and_libraries_2016.0.109/linux/mpi/intel64/bin/mpiicpc -DHAS_CXX11 -DHAS_CXX11_AUTO -DHAS_CXX11_AUTO_RET_TYPE -DHAS_CXX11_CLASS_OVERRIDE -DHAS_CXX11_CONSTEXPR -DHAS_CXX11_CSTDINT_H -DHAS_CXX11_DECLTYPE -DHAS_CXX11_FUNC -DHAS_CXX11_INITIALIZER_LIST -DHAS_CXX11_LAMBDA -DHAS_CXX11_LONG_LONG -DHAS_CXX11_NOEXCEPT -DHAS_CXX11_NORETURN -DHAS_CXX11_NULLPTR -DHAS_CXX11_RVALUE_REFERENCES -DHAS_CXX11_SIZEOF_MEMBER -DHAS_CXX11_STATIC_ASSERT -DHAS_CXX11_VARIADIC_TEMPLATES -DPCMSolver_EXPORTS -DTAYLOR_CXXIO -DVAR_IFORT -I/home/taylor/src/2018/lsdalton/build_solv/external/pcmsolver-build/modules -I/home/taylor/src/2018/lsdalton/external/pcmsolver/api -isystem /home/taylor/src/2018/lsdalton/external/pcmsolver/external/eigen3/include/eigen3 -isystem /home/taylor/src/2018/lsdalton/external/pcmsolver/external/libtaylor -I/home/taylor/src/2018/lsdalton/external/pcmsolver/src -I/home/taylor/src/2018/lsdalton/build_solv/external/pcmsolver-build/include -I/home/taylor/src/2018/lsdalton/external/pcmsolver/include -isystem /home/taylor/src/2018/lsdalton/external/pcmsolver/src/utils/getkw -I/home/taylor/src/2018/lsdalton/external/pcmsolver/src/dielectric_profile -std=c++11 -O3 -DNDEBUG -fPIC -fvisibility=hidden -o CMakeFiles/green.dir/SphericalDiffuse.cpp.o -c /home/taylor/src/2018/lsdalton/external/pcmsolver/src/green/SphericalDiffuse.cpp
In file included from /usr/local/include/boost/noncopyable.hpp(15),
from /usr/local/include/boost/numeric/ublas/detail/config.hpp(23),
from /usr/local/include/boost/numeric/ublas/exception.hpp(19),
from /usr/local/include/boost/numeric/ublas/storage.hpp(25),
from /usr/local/include/boost/numeric/ublas/vector.hpp(21),
from /usr/local/include/boost/numeric/odeint/util/ublas_wrapper.hpp(23),
from /usr/local/include/boost/numeric/odeint.hpp(25),
from /home/taylor/src/2018/lsdalton/external/pcmsolver/src/green/InterfacesImpl.hpp(38),
from /home/taylor/src/2018/lsdalton/external/pcmsolver/src/green/SphericalDiffuse.hpp(42),
from /home/taylor/src/2018/lsdalton/external/pcmsolver/src/green/SphericalDiffuse.cpp(24):
/usr/local/include/boost/core/noncopyable.hpp(42): error: defaulted default constructor cannot be constexpr because the corresponding implicitly declared default constructor would not be constexpr
BOOST_CONSTEXPR noncopyable() = default;
^

compilation aborted for /home/taylor/src/2018/lsdalton/external/pcmsolver/src/green/SphericalDiffuse.cpp (code 2)


Best regards
Pete

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 26 Jun 2019, 11:13

I tried using the same setup command and was able to compile and pass the pcm tests, but I think the trick is that no Boost was found, which forces the use of a custom Boost:

Code: Select all

-- Could NOT find Boost
--   No libraries required, installing headers
You may be able to force it by adding -DBUILD_CUSTOM_BOOST=ON in src/pcm/CMakeLists.txt, around the same place as -DENABLE_CXX11_SUPPORT=ON, but I'm not sure whether it works.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 26 Jun 2019, 14:22

magnus wrote:
25 Jun 2019, 15:16
Can you provide the output from the setup? Probably you already mentioned it, but which compiler are you using? Also which version of LSDalton are you compiling?
I am using the Intel 14.0.1.20131008 compiler and compiling LSDalton v2018.0 (I cloned the git repository, which actually had only this branch, despite v2018.1 being mentioned somewhere).
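
(For reference, the tags and branches present in a clone can be listed offline; these are read-only commands:)

Code: Select all

git -C $HOME/lsdalton tag --list
git -C $HOME/lsdalton branch --all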

The ./setup command is:

Code: Select all

./setup --mpi --prefix=$HOME/bin/lsd2018_70to89 -D ENABLE_PCMSOLVER=ON -D ENABLE_DEC=ON -D ENABLE_CXX11_SUPPORT=ON --cmake=$HOME/cmake372/bin/cmake -D ZLIB_ROOT=$HOME/zlib-1.2.11
./setup output is as follows:

Code: Select all

 FC=mpif90 CC=mpicc CXX=mpicxx /home/igors/cmake372/bin/cmake -DENABLE_MPI=ON -DENABLE_SGI_MPT=OFF -DENABLE_OMP=OFF -DENABLE_64BIT_INTEGERS=OFF -DENABLE_GPU=OFF -DENABLE_CUBLAS=OFF -DENABLE_CSR=OFF -DENABLE_SCALASCA=OFF -DENABLE_VAMPIRTRACE=OFF -DENABLE_TIMINGS=OFF -DENABLE_STATIC_LINKING=OFF -DENABLE_SCALAPACK=OFF -DCMAKE_INSTALL_PREFIX=/home/igors/bin/lsd2018_70to89 -DCMAKE_ARGS="--no-warn-unused-cli" -DCMAKE_BUILD_TYPE=release -DENABLE_PCMSOLVER=ON -DENABLE_DEC=ON -DENABLE_CXX11_SUPPORT=ON -DZLIB_ROOT=/home/igors/zlib-1.2.11 /home/igors/lsdalton

-- compiler understands Fortran 2003
-- BLAS will be searched for based on MKLROOT=/opt/intel/composer_xe_2013_sp1.1.106/mkl
-- LAPACK will be searched for based on MKLROOT=/opt/intel/composer_xe_2013_sp1.1.106/mkl
-- found mpi mod, setting -DUSE_MPI_MOD_F90
-- found 32bit integer mpi module
-- found an MPI 3 compatible MPI lib, setting -DVAR_HAVE_MPI3
-- GIT_BRANCH            : (no branch)
-- System                : Linux
-- Processor type        : x86_64
-- Fortran compiler flags: -fpp -assume byterecl -DVAR_IFORT  -O3 -ip
-- C compiler flags      : -g -wd981 -wd279 -wd383 -vec-report0 -wd1572 -wd1777 -restrict -DRESTRICT=restrict -O3 -ip
-- Libraries             : /opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_intel_lp64.so;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_sequential.so;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_core.so;/usr/lib64/libpthread.so;/usr/lib64/libm.so;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_lapack95_lp64.a;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_intel_lp64.so
-- Definitions           : SYS_LINUX;SYS_UNIX;VAR_IFORT;COMPILER_UNDERSTANDS_FORTRAN_2003;VAR_PTR_RESHAPE;HAVE_MKL_BLAS;HAVE_MKL_LAPACK;VAR_MPI;USE_MPI_MOD_F90;VAR_HAVE_MPI3;VAR_DEC;VAR_MFDS;_FILE_OFFSET_BITS=64;IMPLICIT_NONE;BINARY_INFO_AVAILABLE;INSTALL_BASDIR="/home/igors/lsdalton/build/basis";VAR_MKL;VAR_RSP;HAS_PCMSOLVER;INSTALL_WRKMEM=64000000;INSTALL_MMWORK=1;VAR_XCFUN;VAR_ENABLE_TENSORS
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_ARGS
    ENABLE_CXX11_SUPPORT


-- The Fortran compiler identification is Intel 14.0.1.20131008
-- The C compiler identification is Intel 14.0.1.20131008
-- The CXX compiler identification is Intel 14.0.1.20131008
-- Check for working Fortran compiler: /opt/openmpi-2.1.5/bin/mpif90
-- Check for working Fortran compiler: /opt/openmpi-2.1.5/bin/mpif90  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /opt/openmpi-2.1.5/bin/mpif90 supports Fortran 90
-- Checking whether /opt/openmpi-2.1.5/bin/mpif90 supports Fortran 90 -- yes
-- Check for working C compiler: /opt/openmpi-2.1.5/bin/mpicc
-- Check for working C compiler: /opt/openmpi-2.1.5/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /opt/openmpi-2.1.5/bin/mpicxx
-- Check for working CXX compiler: /opt/openmpi-2.1.5/bin/mpicxx -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test COMPILER_UNDERSTANDS_FORTRAN03
-- Performing Test COMPILER_UNDERSTANDS_FORTRAN03 - Success
-- Performing Test PTR_RESHAPE_WORKS
-- Performing Test PTR_RESHAPE_WORKS - Success
-- Math lib search order is MKL;ESSL;OPENBLAS;ATLAS;ACML;SYSTEM_NATIVE
-- You can select a specific type by defining for instance -D BLAS_TYPE=ATLAS or -D LAPACK_TYPE=ACML
-- or by redefining MATH_LIB_SEARCH_ORDER
-- Found BLAS: MKL (/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_intel_lp64.so;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_sequential.so;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_core.so;/usr/lib64/libpthread.so;/usr/lib64/libm.so)
-- Found LAPACK: MKL (/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_lapack95_lp64.a;/opt/intel/composer_xe_2013_sp1.1.106/mkl/lib/intel64/libmkl_intel_lp64.so)
-- Found MPI_C: /opt/openmpi-2.1.5/lib/libmpi.so  
-- Found MPI_CXX: /opt/openmpi-2.1.5/lib/libmpi.so  
-- Found MPI_Fortran: /opt/openmpi-2.1.5/lib/libmpi_usempif08.so;/opt/openmpi-2.1.5/lib/libmpi_usempi_ignore_tkr.so;/opt/openmpi-2.1.5/lib/libmpi_mpifh.so;/opt/openmpi-2.1.5/lib/libmpi.so  
-- Performing Test MPI_COMPATIBLE
-- Performing Test MPI_COMPATIBLE - Success
-- Performing Test MPI_F90_I4
-- Performing Test MPI_F90_I4 - Success
-- Performing Test MPI_F90_I8
-- Performing Test MPI_F90_I8 - Failed
-- Performing Test ENABLE_MPI3_FEATURES
-- Performing Test ENABLE_MPI3_FEATURES - Success
-- Found Git: /usr/bin/git  
-- Polarizable Continuum Model via PCMSolver ENABLED
-- Configuring done
-- Generating done
-- Build files have been written to: /home/igors/lsdalton/build

   configure step is done
   now you need to compile the sources:
   $ cd build
   $ make

taylor
Posts: 532
Joined: 15 Oct 2013, 05:37
First name(s): Peter
Middle name(s): Robert
Last name(s): Taylor
Affiliation: Tianjin University
Country: China

Re: Compiling LSDalton with PCM Solver

Post by taylor » 27 Jun 2019, 08:02

But what about the build finding the boost libraries in their default location? Does it make more sense to "hide" these or set the directories explicitly to null values (as well as trying to force a local boost headers build)?

Best regards
Pete

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 29 Jun 2019, 11:59

Prof. Taylor,

Isn't it the case that make searches for the libraries in the order in which You list their locations in $INCLUDE and $LIBRARY_PATH? I put the path to the tarball installation of Boost in front of everything that was already there:

Code: Select all

INCLUDE="$HOME/boost_1_70_0:$INCLUDE"
and make seems to have found it, as far as I can judge from its output (though various problems persist with tests). Or do I misunderstand You?
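
A small sketch of the same idea with the variables exported, so that make's child processes also see them (assuming a bash-like shell; CPLUS_INCLUDE_PATH is an analogous variable that GCC-compatible C++ compilers read):

Code: Select all

export INCLUDE="$HOME/boost_1_70_0:$INCLUDE"
export CPLUS_INCLUDE_PATH="$HOME/boost_1_70_0:$CPLUS_INCLUDE_PATH"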

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 29 Jun 2019, 15:01

I tried a different version of BLAS/LAPACK (3.8.0, the newest one) and got the following error at the very end of the compilation:

Code: Select all

[100%] Linking Fortran executable lsdalton.x
/home/igors/cmake372/bin/cmake -E cmake_link_script CMakeFiles/lsdalton.x.dir/link.txt --verbose=1
/opt/openmpi-2.1.5/bin/mpif90    -fpp -assume byterecl -DVAR_IFORT  -O3 -ip CMakeFiles/lsdalton.x.dir/src/lsdaltonsrc/lsdalton_wrapper.f90.o  -o lsdalton.x  -L/home/igors/lsdalton/build/external/lib  -L/home/igors/lsdalton/build/external/lib64  -L/home/igors/lsdalton/build/external/ls-matrix-defop-build/external/lib  -L/home/igors/lsdalton/build/external/xcfun-build/external/lib  -L/home/igors/lsdalton/build/external/ls-openrsp-build/external/lib -Wl,-rpath,/home/igors/lsdalton/build/external/lib:/home/igors/lsdalton/build/external/lib64:/home/igors/lsdalton/build/external/ls-matrix-defop-build/external/lib:/home/igors/lsdalton/build/external/xcfun-build/external/lib:/home/igors/lsdalton/build/external/ls-openrsp-build/external/lib:/home/igors/lapack-3.8.0/lib64:/home/igors/zlib-1.2.11: lib/liblsdaltonmain.a lib/libxcfun_interface.a lib/libgeooptlib.a lib/libwannierlib.a lib/librsp_propertieslib.a lib/liblinearslib.a lib/librspsolverlib.a lib/libsolverutillib.a lib/liblspcm.a lib/libdeclib.a lib/libpbclib.a lib/libddynamlib.a lib/liblsintlib.a lib/libdftfunclib.a lib/libfmmlib.a lib/liblsutillib.a lib/liblsutiltypelib_common.a lib/libpdpacklib.a lib/libmatrixulib.a lib/libmatrixolib.a lib/liblsutillib_common8.a lib/liblsutillib_common7.a lib/liblsutillib_common6.a lib/liblsutillib_common5.a lib/liblsutillib_common4.a lib/liblsutillib_common3.a lib/liblsutillib_common2.a lib/liblsutillib_common1.a lib/libmatrixmlib.a lib/libcuda_gpu_interfaces.a lib/liblsutillib_precision.a /home/igors/lapack-3.8.0/lib64/libblas.so /home/igors/lapack-3.8.0/lib64/liblapack.so lib/libls_frame_input.a external/lib/libopenrsp.a external/lib/libScaTeLib.a external/lib/libxcfun_f90_bindings.a external/lib/libxcfun.a external/lib/libmatrix-defop.a lib/libmatrixulib.a -lstdc++ -lmpi -limf -lsvml -lirng -lm -lipgo -ldecimal -lcilkrts -lstdc++ -lirc -lpthread -lsvml -lc -lirc_s -ldl -lc -lpcm -lmpi -limf -lsvml -lirng -lm -lipgo -ldecimal -lcilkrts -lstdc++ -lirc -lpthread -lsvml -lc -lirc_s -ldl -lc /home/igors/zlib-1.2.11/libz.so /home/igors/lapack-3.8.0/lib64/libblas.so /home/igors/lapack-3.8.0/lib64/liblapack.so external/lib/libopenrsp.a external/lib/libScaTeLib.a external/lib/libxcfun_f90_bindings.a external/lib/libxcfun.a external/lib/libmatrix-defop.a lib/libmatrixulib.a -lstdc++ -lmpi -limf -lsvml -lirng -lm -lipgo -ldecimal -lcilkrts -lirc -lpthread -lc -lirc_s -ldl -lpcm /home/igors/zlib-1.2.11/libz.so -lirng -ldecimal -lcilkrts -lstdc++ -lirng -ldecimal -lcilkrts -lstdc++

/home/igors/lapack-3.8.0/lib64/liblapack.so: undefined reference to `slacgv_'
/home/igors/lapack-3.8.0/lib64/liblapack.so: undefined reference to `dlacgv_'
make[2]: *** [lsdalton.x] Error 1
make[1]: *** [CMakeFiles/lsdalton.x.dir/all] Error 2
make: *** [all] Error 2

make[2]: Leaving directory `/home/igors/lsdalton/build'
make[1]: Leaving directory `/home/igors/lsdalton/build'
(I'm combining the stderr log from one file with the stdout log from another, so I'm not sure about the sequence; the stderr part is in the middle, separated by the blank lines.)

Why would that happen? Is the library too new for the system, or incompatible with the compiler? I set the FC, CC and CXX variables for the cmake command just as for the LSDalton compilation (but not before the make line; could that be the reason?)...

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 03 Jul 2019, 12:46

taylor wrote:
27 Jun 2019, 08:02
But what about the build finding the boost libraries in their default location? Does it make more sense to "hide" these or set the directories explicitly to null values (as well as trying to force a local boost headers build)?

Best regards
Pete
I'm not sure, but in my case Boost wasn't detected by cmake, so it is perhaps worth trying to prevent detection.
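
Purely a sketch of one possible way, untested here: CMake's FindBoost honours Boost_NO_SYSTEM_PATHS, and the setup script passes -D options through to CMake, so something like the following (abridged setup command) might keep a system Boost from being detected. Whether PCMSolver's own detection respects it is not verified:

Code: Select all

./setup --fc=mpiifort --cc=mpiicc --cxx=mpiicpc -D ENABLE_PCMSOLVER=ON -D Boost_NO_SYSTEM_PATHS=ON build_solv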

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 03 Jul 2019, 12:57

esmuigors wrote:
26 Jun 2019, 14:22
magnus wrote:
25 Jun 2019, 15:16
Can you provide the output from the setup? Probably you already mentioned it, but which compiler are you using? Also which version of LSDalton are you compiling?
I am using the Intel 14.0.1.20131008 compiler and compiling LSDalton v2018.0 (I cloned the git repository, which actually had only this branch, despite v2018.1 being mentioned somewhere). [./setup command and its full output are quoted in the post above]
Can you run "mpif90 --version" and "mpirun --version" and paste the output here? Just wondering if there is maybe some mismatch.

There is only a 2018.0 version of LSDalton. The other one is for Dalton only.

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 03 Jul 2019, 13:14

esmuigors wrote:
29 Jun 2019, 15:01
I tried a different version of BLAS/LAPACK (3.8.0, the newest one) and got the following error at the very end of the compilation:

Code: Select all

/home/igors/lapack-3.8.0/lib64/liblapack.so: undefined reference to `slacgv_'
/home/igors/lapack-3.8.0/lib64/liblapack.so: undefined reference to `dlacgv_'

[full link command quoted in the post above]
I'm not sure I understood whether this is a clean build or not. If not, then try a clean build; if it is a clean build, then I would assume it is a problem with your LAPACK.

taylor
Posts: 532
Joined: 15 Oct 2013, 05:37
First name(s): Peter
Middle name(s): Robert
Last name(s): Taylor
Affiliation: Tianjin University
Country: China

Re: Compiling LSDalton with PCM Solver

Post by taylor » 03 Jul 2019, 15:59

I have tried adding the line

-DBUILD_CUSTOM_BOOST=ON

to src/pcm/CMakeLists.txt, but all that happens now is that I get

No rule to make target `/home/taylor/src/2018/lsdalton/external/pcmsolver/include/custom_boost', needed by `include/CMakeFiles/generate-config-hpp'. Stop.

So clearly this is not enough to ensure that I get a custom boost build. Anybody got any ideas...?
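
(One low-risk experiment, a guess rather than a known fix: wipe the PCMSolver external-project state so the new option is actually re-consumed at configure time. The pcmsolver-build path below is taken from the compile command earlier in the thread:)

Code: Select all

rm -rf build_solv/external/pcmsolver-build
cd build_solv && make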

Best regards
Pete

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 04 Jul 2019, 19:17

magnus wrote:
03 Jul 2019, 12:57
Can you run "mpif90 --version" and "mpirun --version" and paste the output here? Just wondering if there is maybe some mismatch.

Code: Select all

[lasc87:~]$ mpif90 --version
ifort (IFORT) 14.0.1 20131008
Copyright (C) 1985-2013 Intel Corporation.  All rights reserved.

[lasc87:~]$ mpirun --version
mpirun (Open MPI) 2.1.5

Report bugs to http://www.open-mpi.org/community/help/
[lasc87:~]$ 
There is only a 2018.0 version of LSDalton. The other one is for Dalton only.
Got it, thanks for the clarification!

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 05 Jul 2019, 09:37

esmuigors wrote:
04 Jul 2019, 19:17

Code: Select all

[lasc87:~]$ mpif90 --version
ifort (IFORT) 14.0.1 20131008
Copyright (C) 1985-2013 Intel Corporation.  All rights reserved.

[lasc87:~]$ mpirun --version
mpirun (Open MPI) 2.1.5

Report bugs to http://www.open-mpi.org/community/help/
[lasc87:~]$ 
I was hoping it would give more information, but apparently OpenMPI doesn't do that. So what I'm wondering is whether the problem is related to some mismatch between the Intel compiler and the OpenMPI build that you're using. Perhaps try adding -D USE_MPIF_H=ON to your setup command and, importantly, start from scratch, i.e. a clean build directory.
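
A sketch of that, reusing the flags from earlier in the thread (the important part is removing the old build directory first; adjust paths and options to your setup):

Code: Select all

cd $HOME/lsdalton
rm -rf build
./setup --mpi --prefix=$HOME/bin/lsd2018_70to89 -D ENABLE_PCMSOLVER=ON -D ENABLE_DEC=ON -D USE_MPIF_H=ON --cmake=$HOME/cmake372/bin/cmake -D ZLIB_ROOT=$HOME/zlib-1.2.11 build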

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 05 Jul 2019, 09:48

Just to note: yes, I am starting from scratch every time.

Also, I found the solution to the LAPACK problem (at least I hope it is the solution):
https://github.com/Reference-LAPACK/lapack/issues/228

After I applied it, LSDalton compiled and linked properly. But even with the newest LAPACK/BLAS, all the test problems persist. E.g., the dftdisp_d3_b3lyp_h2o test output just ends abruptly after printing out the dispersion model parameters, and the file LSDALTON.ERR is empty.
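
(A quick way to confirm that the rebuilt library really exports the previously missing symbols; nm is part of binutils:)

Code: Select all

nm -D $HOME/lapack-3.8.0/lib64/liblapack.so | grep -i lacgv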

I will now try adding -D USE_MPIF_H=ON, thank You! The ./setup script output is as follows:

Code: Select all

[lasc87:~/lsdalton]$ PATH="/home/igors/Python-2.7.16:$PATH" FC=mpif90 CC=mpicc CXX=mpicxx nice ./setup --mpi --prefix=$HOME/bin/lsd2018_50to64 -D ENABLE_PCMSOLVER=ON -D ENABLE_DEC=ON -D ENABLE_CXX11_SUPPORT=ON --cmake=$HOME/cmake372/bin/cmake -D ZLIB_ROOT=$HOME/zlib-1.2.11 --blas=$HOME/lapack-3.8.0/lib64/libblas.so --lapack=$HOME/lapack-3.8.0/lib64/liblapack.so -D USE_MPIF_H=ON
 FC=mpif90 CC=mpicc CXX=mpicxx /home/igors/cmake372/bin/cmake -DENABLE_MPI=ON -DENABLE_SGI_MPT=OFF -DENABLE_OMP=OFF -DENABLE_64BIT_INTEGERS=OFF -DENABLE_GPU=OFF -DENABLE_CUBLAS=OFF -DENABLE_CSR=OFF -DENABLE_SCALASCA=OFF -DENABLE_VAMPIRTRACE=OFF -DENABLE_TIMINGS=OFF -DENABLE_STATIC_LINKING=OFF -DENABLE_SCALAPACK=OFF -DEXPLICIT_BLAS_LIB=/home/igors/lapack-3.8.0/lib64/libblas.so -DENABLE_AUTO_BLAS=OFF -DEXPLICIT_LAPACK_LIB=/home/igors/lapack-3.8.0/lib64/liblapack.so -DENABLE_AUTO_LAPACK=OFF -DCMAKE_INSTALL_PREFIX=/home/igors/bin/lsd2018_50to64 -DCMAKE_ARGS="--no-warn-unused-cli" -DCMAKE_BUILD_TYPE=release -DENABLE_PCMSOLVER=ON -DENABLE_DEC=ON -DENABLE_CXX11_SUPPORT=ON -DZLIB_ROOT=/home/igors/zlib-1.2.11 -DUSE_MPIF_H=ON /home/igors/lsdalton

-- compiler understands Fortran 2003
-- BLAS: using explit library (/home/igors/lapack-3.8.0/lib64/libblas.so)
-- LAPACK: using explit library (/home/igors/lapack-3.8.0/lib64/liblapack.so)
-- WARNING: integer check not successful, assuming a 32bit mpif.h
-- found an MPI 3 compatible MPI lib, setting -DVAR_HAVE_MPI3
-- GIT_BRANCH            : (no branch)
-- System                : Linux
-- Processor type        : x86_64
-- Fortran compiler flags: -fpp -assume byterecl -DVAR_IFORT  -O3 -ip
-- C compiler flags      : -g -wd981 -wd279 -wd383 -vec-report0 -wd1572 -wd1777 -restrict -DRESTRICT=restrict -O3 -ip
-- Libraries             : /home/igors/lapack-3.8.0/lib64/libblas.so;/home/igors/lapack-3.8.0/lib64/liblapack.so
-- Definitions           : SYS_LINUX;SYS_UNIX;VAR_IFORT;COMPILER_UNDERSTANDS_FORTRAN_2003;VAR_PTR_RESHAPE;VAR_MPI;VAR_HAVE_MPI3;VAR_DEC;VAR_MFDS;_FILE_OFFSET_BITS=64;IMPLICIT_NONE;BINARY_INFO_AVAILABLE;INSTALL_BASDIR="/home/igors/lsdalton/build/basis";VAR_RSP;HAS_PCMSOLVER;INSTALL_WRKMEM=64000000;INSTALL_MMWORK=1;VAR_XCFUN;VAR_ENABLE_TENSORS
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_ARGS
    ENABLE_CXX11_SUPPORT


-- The Fortran compiler identification is Intel 14.0.1.20131008
-- The C compiler identification is Intel 14.0.1.20131008
-- The CXX compiler identification is Intel 14.0.1.20131008
-- Check for working Fortran compiler: /opt/openmpi-2.1.5/bin/mpif90
-- Check for working Fortran compiler: /opt/openmpi-2.1.5/bin/mpif90  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /opt/openmpi-2.1.5/bin/mpif90 supports Fortran 90
-- Checking whether /opt/openmpi-2.1.5/bin/mpif90 supports Fortran 90 -- yes
-- Check for working C compiler: /opt/openmpi-2.1.5/bin/mpicc
-- Check for working C compiler: /opt/openmpi-2.1.5/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /opt/openmpi-2.1.5/bin/mpicxx
-- Check for working CXX compiler: /opt/openmpi-2.1.5/bin/mpicxx -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test COMPILER_UNDERSTANDS_FORTRAN03
-- Performing Test COMPILER_UNDERSTANDS_FORTRAN03 - Success
-- Performing Test PTR_RESHAPE_WORKS
-- Performing Test PTR_RESHAPE_WORKS - Success
-- Math lib search order is MKL;ESSL;OPENBLAS;ATLAS;ACML;SYSTEM_NATIVE
-- You can select a specific type by defining for instance -D BLAS_TYPE=ATLAS or -D LAPACK_TYPE=ACML
-- or by redefining MATH_LIB_SEARCH_ORDER
-- Found MPI_C: /opt/openmpi-2.1.5/lib/libmpi.so  
-- Found MPI_CXX: /opt/openmpi-2.1.5/lib/libmpi.so  
-- Found MPI_Fortran: /opt/openmpi-2.1.5/lib/libmpi_usempif08.so;/opt/openmpi-2.1.5/lib/libmpi_usempi_ignore_tkr.so;/opt/openmpi-2.1.5/lib/libmpi_mpifh.so;/opt/openmpi-2.1.5/lib/libmpi.so  
-- Performing Test MPI_COMPATIBLE
-- Performing Test MPI_COMPATIBLE - Success
-- Performing Test ENABLE_MPI3_FEATURES
-- Performing Test ENABLE_MPI3_FEATURES - Success
-- Found Git: /usr/bin/git  
-- Polarizable Continuum Model via PCMSolver ENABLED
-- Configuring done
-- Generating done
-- Build files have been written to: /home/igors/lsdalton/build

   configure step is done
   now you need to compile the sources:
   $ cd build
   $ make
It differs in that it doesn't set the USE_MPI_MOD_F90 option (WARNING: integer check not successful, assuming a 32bit mpif.h).

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 05 Jul 2019, 09:52

I'm quite certain that the problem is not related to BLAS/LAPACK, since you used MKL before and it also failed there. The question is whether it is even related to MPI, so one other thing you could try is to compile without MPI (but optionally with OpenMP) and then see if the tests still fail. In this case the DEC tests will fail because, AFAIK, they require MPI, so don't worry about those.
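
A sketch of such a serial configure, reusing paths from earlier posts (the prefix and build directory names here are arbitrary placeholders):

Code: Select all

cd $HOME/lsdalton
rm -rf build_serial
./setup --omp --prefix=$HOME/bin/lsd2018_serial -D ENABLE_PCMSOLVER=ON --cmake=$HOME/cmake372/bin/cmake -D ZLIB_ROOT=$HOME/zlib-1.2.11 build_serial
cd build_serial && make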

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 05 Jul 2019, 10:00

OK, but I should still test the compilation with -D USE_MPIF_H=ON, shouldn't I?

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 05 Jul 2019, 10:01

Sure, since you would like to have MPI enabled in the end. The serial compilation is just to check whether or not the failed tests are related to MPI.

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 05 Jul 2019, 12:43

Unfortunately, -D USE_MPIF_H=ON did not help: the errors seem to be all the same (and in quantity they definitely are). Hence I will try a sequential build...

The ctest output is attached.
Attachments
ctest_output_05jul2019.log
(1.19 MiB)

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 05 Jul 2019, 12:58

Just to be sure, can you run "git status" from your build directory and paste the output here? Also can you run a calculation using the attached input files and attach the output file as well as the output from the terminal? You can run the calculation, e.g., as:

Code: Select all

/path/to/build/lsdalton -mol h2o.mol -dal dftdisp_d3_Slater.dal -o test.out
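
If the terminal output is also needed, the run can be wrapped with tee (the log file name is a placeholder):

Code: Select all

/path/to/build/lsdalton -mol h2o.mol -dal dftdisp_d3_Slater.dal -o test.out 2>&1 | tee terminal.log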
Attachments
dftdisp_d3_Slater.dal
(135 Bytes)
h2o.mol
(232 Bytes)

esmuigors
Posts: 43
Joined: 14 Nov 2018, 17:54
First name(s): Igors
Middle name(s): N.
Last name(s): Mihailovs
Affiliation: Institute of Solid State Physics, University of Latvia
Country: Latvia

Re: Compiling LSDalton with PCM Solver

Post by esmuigors » 05 Jul 2019, 19:35

`git status` produces no output and hangs, possibly because the nodes are not connected to the Internet (their normal state).

I now have many builds, and can provide outcomes from all of them.
test…out files are more or less the same all the time. LSDalton script output (script…out) changes, though.

1. script_MKL_cmplr_not_set.out – my first try, using MKL and not explicitly setting the compilers env. variables in front of the ./setup command.
2. script_LAPACK_cmplr_not_set_mpi.out – using LAPACK 3.8.0, not setting compiler variables (FC, CC and CXX)
3. script_LAPACK_mpicc.out – setting compilers (mpif90, mpicc, mpicxx)
4. script_useMPIFh_mpicc.out - after adding -D USE_MPIF_H
5. script_OMP_mpicc.out - switching from MPI to OpenMP, but through an oversight CC=mpicc, etc. persisted.
6. script_OMP_icc.out - specifying serial compilers (ifort, icc, icpc).
7. script_OMP_MKL.out - serial + MKL libraries.

UPDATED: for the serial build with MKL LAPACK (instead of the standalone one), all tests also failed.
Attachments
script_OMP_MKL.out
(6.58 KiB)
test_OMP_MKL.out
(13.88 KiB)
testing_water.zip
(38.14 KiB)

magnus
Posts: 497
Joined: 27 Jun 2013, 16:32
First name(s): Jógvan Magnus
Middle name(s): Haugaard
Last name(s): Olsen
Affiliation: Hylleraas Centre, UiT The Arctic University of Norway
Country: Norway

Re: Compiling LSDalton with PCM Solver

Post by magnus » 07 Jul 2019, 14:00

Running "git status" should produce output even if you do not have an internet connection. I've experienced before that it can take a long time on some machines but it should produce some output at some point. This output would be good to have just to be absolutely sure that the code is unchanged.

OK, so it's not an MPI problem, and I'm pretty sure it's not a problem with the math libraries, so I'm leaning towards a problem with the specific version of the Intel compiler that you're using (assuming that the code is unchanged). You already have a more or less successful build using gcc compilers on a different machine, right?
