ERROR in "lsutil" during compilation of DALTON2013
- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
ERROR in "lsutil" during compilation of DALTON2013
Hi!
When I try to install DALTON 2013, I get the following error message while compiling:
/pfs/data2/home/kn/kn_kn/kn_popxxxxxx/src/chem/dalton/2013/DALTON-2013.4-Source/external/pelib/src/polarizable_embedding.f90(3089): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_GATHERV]
call mpi_gatherv(Fpe(:,i), cubedists(myid), rmpi, 0, 0, 0, rmpi,&
---------------------^
compilation aborted for /pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/external/pelib/src/polarizable_embedding.f90 (code 1)
make[5]: *** [CMakeFiles/pelib.dir/src/polarizable_embedding.f90.o] Fehler 1
make[4]: *** [CMakeFiles/pelib.dir/all] Fehler 2
make[3]: *** [all] Fehler 2
make[2]: *** [external/pelib-stamp/pelib-build] Fehler 2
make[1]: *** [CMakeFiles/pelib.dir/all] Fehler 2
make: *** [all] Fehler 2
[kn_popxxxxxx@uc1n997 build]$ module list
...
Currently Loaded Modulefiles:
1) compiler/intel/14.0(default) 3) numlib/mkl/11.1.4(default)
2) mpi/openmpi/1.8-intel-14.0 4) devel/cmake/2.8.11
Within the build folder I called: ./setup --mpi
Any suggestions?
Thanks.
rainer.rutka@uni-konstanz.de
Last edited by rainer.rutka on 23 Jan 2015, 09:37, edited 1 time in total.
- magnus
- Posts: 524
- Joined: 27 Jun 2013, 16:32
- First name(s): Jógvan Magnus
- Middle name(s): Haugaard
- Last name(s): Olsen
- Affiliation: Aarhus University
- Country: Denmark
Re: Installing DALTON 2013 on our bwUniveral-Cluster
This has been fixed in a newer version of the Polarizable Embedding library, but it has not yet been released with Dalton. As far as I remember, the error only appears with more recent OpenMPI versions (and probably also some other MPI implementations). You could try an older version of OpenMPI, or, if you don't need the polarizable embedding functionality, you can try the following: ./setup --mpi -DENABLE_PELIB=OFF
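A minimal sketch of the full reconfigure sequence with that option (assuming an existing build directory; removing the cache makes sure the new option is picked up):
# remove the stale CMake cache so the new option takes effect
rm -f build/CMakeCache.txt
# reconfigure without the PE library and rebuild
./setup --mpi -DENABLE_PELIB=OFF
cd build
make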
- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
Re: ERROR while install DALTON 2013 on our bwUniveral-Cluste
Thanks. OK, I used the flags you mentioned.
But now I get these errors:
[...]
[ 68%] Building Fortran object CMakeFiles/dalton.dir/DALTON/densfit/dposv.F.o
[ 68%] Building Fortran object CMakeFiles/dalton.dir/binary_info.F90.o
Linking Fortran static library lib/libdalton.a
[ 68%] Built target dalton
Scanning dependencies of target dalton.x
[ 68%] Building Fortran object CMakeFiles/dalton.x.dir/DALTON/abacus/dalton.F.o
Linking Fortran executable dalton.x
lib/libdalton.a(eri2par.F.o): In function `dalton_nodedriver_':
/pfs/data2/home/kn/kn_kn/kn_popxxxxxxx/src/chem/dalton/2013/DALTON-2013.4-Source/DALTON/eri/eri2par.F:(.text+0x4a96): undefined reference to `pe_ifc_mpi_'
make[2]: *** [dalton.x] Fehler 1
make[1]: *** [CMakeFiles/dalton.x.dir/all] Fehler 2
make: *** [all] Fehler 2
[kn_popxxxxxx@uc1n997 build]$
- magnus
- Posts: 524
- Joined: 27 Jun 2013, 16:32
- First name(s): Jógvan Magnus
- Middle name(s): Haugaard
- Last name(s): Olsen
- Affiliation: Aarhus University
- Country: Denmark
Re: ERROR while install DALTON 2013 on our bwUniveral-Cluste
Sorry about that. I was sure that it worked. Luckily it is easy to fix. I've attached a patch that you can apply from the top-level Dalton directory:
patch -p0 < pelib.patch.txt
- Attachments
- pelib.patch.txt (601 Bytes) Downloaded 401 times
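A minimal sketch of applying it, assuming the attachment is saved as pelib.patch.txt in the top-level Dalton source directory (the --dry-run step only checks that the patch would apply cleanly, without modifying any files):
# verify the patch applies cleanly first
patch --dry-run -p0 < pelib.patch.txt
# then apply it for real
patch -p0 < pelib.patch.txt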
- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
Re: ERROR while install DALTON 2013 on our bwUniveral-Cluste
... cool. Thanks a lot. 

- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
Re: ERROR while install DALTON 2013 on our bwUniveral-Cluste
Hi!
After applying the patch for the above error, the compilation ran up to 71%.
There I got another error:
[...]
[ 71%] Building Fortran object CMakeFiles/lsutillib_common.dir/manual_reordering/reord4d_4_utils_f2t.F90.o
[ 71%] Building Fortran object CMakeFiles/lsutillib_common.dir/manual_reordering/reord4d_4_utils_t2f.F90.o
[ 71%] Building Fortran object CMakeFiles/lsutillib_common.dir/LSDALTON/lsutil/lsmpiType.F90.o
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(1058): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_SEND]
call MPI_SEND(buffer,nbuf,dtype,receiver,tag,comm,ierr)
-------------^
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(1103): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_RECV]
call MPI_RECV(buffer,nbuf,dtype,sender,tag,comm,status,ierr)
-------------^
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(1260): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_SEND]
call MPI_SEND(buffer(1:nbuf),n,DATATYPE,receiver,tag,comm,ierr)
----------------^
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(1262): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_RECV]
call MPI_RECV(buffer(1:nbuf),n,DATATYPE,sender,tag,comm,status,ierr)
----------------^
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(3666): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_REDUCE]
CALL MPI_REDUCE(MPI_IN_PLACE,BUFFER,n,MPI_INTEGER8,MPI_SUM,&
-------------^
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(3670): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_REDUCE]
CALL MPI_REDUCE(BUFFER,NULL,n,MPI_INTEGER8,MPI_SUM,&
-------------^
/pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90(5771): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_ALLGATHERV]
call MPI_ALLGATHERV(sendbuf,n,dtype,recbuf,reccounts,&
-----------^
compilation aborted for /pfs/data2/home/kn/kn_kn/kn_pop235844/src/chem/dalton/2013/DALTON-2013.4-Source/LSDALTON/lsutil/lsmpiType.F90 (code 1)
make[3]: *** [CMakeFiles/lsutillib_common.dir/LSDALTON/lsutil/lsmpiType.F90.o] Fehler 1
make[2]: *** [CMakeFiles/lsutillib_common.dir/LSDALTON/lsutil/lsmpiType.F90.o.provides] Fehler 2
make[1]: *** [CMakeFiles/lsutillib_common.dir/all] Fehler 2
make: *** [all] Fehler 2
Any help is most welcome!
rainer.rutka@uni-konstanz.de
- magnus
- Posts: 524
- Joined: 27 Jun 2013, 16:32
- First name(s): Jógvan Magnus
- Middle name(s): Haugaard
- Last name(s): Olsen
- Affiliation: Aarhus University
- Country: Denmark
Re: ERROR while install DALTON 2013 on our bwUniveral-Cluste
I suggest that you try compiling with an older OpenMPI, e.g. 1.6.x, and Intel 13 or 12.
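For example, a minimal sketch of switching the toolchain and rebuilding from a clean cache (the module names below are only illustrative; adjust them to whatever versions your cluster provides):
# load an older compiler and MPI stack
module load compiler/intel/12.1
module load mpi/openmpi/1.6.5-intel-12.1
# reconfigure from a clean cache (keep -DENABLE_PELIB=OFF if you still want the PE library disabled) and rebuild
rm -f build/CMakeCache.txt
./setup --mpi -DENABLE_PELIB=OFF
cd build
make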
-
- Posts: 600
- Joined: 15 Oct 2013, 05:37
- First name(s): Peter
- Middle name(s): Robert
- Last name(s): Taylor
- Affiliation: Tianjin University
- Country: China
Re: ERROR while install DALTON 2013 on our bwUniveral-Cluste
Is the OpenMPI build done by your system administrators or did you do it yourself? Either way it looks as if the build process is confused about the MPI routines LSDalton is trying to call. What flags/configuration were used to build the OpenMPI? And what flags are being specified to the Dalton build (I know you posted earlier that it was just --mpi; is that still the case)? Also, if your site has the full "Intel Cluster Studio" (not just Intel Composer Suite) you could try using Intel's MPI instead of OpenMPI.
I am not directly involved in LSDalton development so I am not able to advise much here, but I think the more information you provide the more the LSDalton experts should be able to help.
While I was typing this Magnus's post came in. I would very much agree with his view that trying older compilers may help (we here do not use 2014 and we would not go near 2015 with a bargepole at this point!). Here at VLSCI (the centre I direct) our default for users is Intel 2013 and OpenMPI 1.6.5. At home on our cluster we use the same compiler and OpenMPI 1.6.4.
Best regards
Pete
- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
Re: ERROR in "lsutil" during compilation of DALTON2013
Hi magnus (and the rest of the group).
As you suggested, I used some older compiler and OpenMPI versions.
After testing with the newer Intel compilers
compiler/intel/13.1
compiler/intel/14.0(default)
and their corresponding OpenMPI builds, I ended up with
these modules (the oldest versions available on our cluster):
#(3) Load required modules for build process
module load compiler/intel/12.1
module load mpi/openmpi/1.6.5-intel-12.1
module load numlib/mkl/11.1.4
module load devel/cmake/2.8.11
module list
Currently Loaded Modulefiles:
1) compiler/intel/12.1 3) numlib/mkl/11.1.4(default)
2) mpi/openmpi/1.6.5-intel-12.1 4) devel/cmake/2.8.11
I applied the patch for the "DALTON/eri/eri2par.F" problem, too.
The build was set up with:
# Patch the current version because of Fortran errors inside the code
cp ${SOURCE_DIR}/bwhpc-patch/dalton2013-patch.txt .
patch -p0 < eri2par-patch.txt
#
rm -f build/CMakeCache.txt # delete this file if you run setup again
./setup --mpi -DENABLE_PELIB=OFF -DCMAKE_INSTALL_PREFIX=${TARGET_DIR} | tee ${LOG}/setup.out
cd build
make 2>&1 | tee ${LOG}/make.out
and ....
NO ERRORS!
Summary:
compiler/mpi intel-13.1 or intel-14.0 : NO
compiler/mpi intel-12.1 : YES
Luckily we had not yet gotten rid of the old Intel compiler!
On 23.01.2015 at 09:38, no-reply@daltonprogram.org wrote:
> Posted by magnus
> I suggest that you try compiling using an older OpenMPI like 1.6.x and
> Intel 13 or 12.
--
Rainer Rutka
Universität Konstanz
Kommunikations-, Informations-, Medienzentrum (KIM)
High-Performance-Computing (HPC) [Raum V511]
78457 Konstanz
+49 7531 88-5413
- magnus
- Posts: 524
- Joined: 27 Jun 2013, 16:32
- First name(s): Jógvan Magnus
- Middle name(s): Haugaard
- Last name(s): Olsen
- Affiliation: Aarhus University
- Country: Denmark
Re: ERROR in "lsutil" during compilation of DALTON2013
That's great!
Not that you should change anything now, since you already have it installed, but for the record I should say that Intel 13.x and OpenMPI 1.6.x should also work. Also, you don't have to deactivate the PE library when using OpenMPI 1.6.x or older.
- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
Re: ERROR in "lsutil" during compilation of DALTON2013
OK, I'll give it a try!
- rainer.rutka
- Posts: 9
- Joined: 27 Nov 2013, 15:57
- First name(s): Rainer
- Last name(s): Rutka
- Affiliation: University of Konstanz, Germany
- Country: Germany
Re: ERROR in "lsutil" during compilation of DALTON2013
OK magnus! You're right!
[...]
./setup --mpi -DENABLE_PELIB=ON -DCMAKE_INSTALL_PREFIX=${TARGET_DIR} | tee ${LOG}/setup.out
[...]
Scanning dependencies of target xyz2dalton
[100%] Building Fortran object CMakeFiles/tools/xyz2dalton.dir/DALTON/tools/xyz2dalton.f90.o
Linking Fortran executable tools/xyz2dalton
[100%] Built target tools/xyz2dalton
[kn_popxxxxxx@uc1n996 build]$ module list
Currently Loaded Modulefiles:
1) numlib/mkl/11.1.4(default)
2) devel/cmake/2.8.11
3) compiler/intel/13.1
4) mpi/openmpi/1.6.5-intel-13.1(default)
[kn_popxxxxxx@uc1n996 build]$
This works, too.
Thank you for the hints.
Best support ever!