LSDalton memory requirements

Find answers or ask questions regarding LSDalton calculations
taylor
Posts: 526
Joined: 15 Oct 2013, 05:37
First name(s): Peter
Middle name(s): Robert
Last name(s): Taylor
Affiliation: Tianjin University
Country: China

LSDalton memory requirements

Post by taylor » 09 Feb 2014, 01:50

I thought I had asked something like this question before, but maybe I'm imagining it. Anyway, how does one estimate the likely (I don't mean precise) memory requirements for an LSDalton calculation? What is the dependence on the basis set: on its size, and perhaps its composition, etc.? I ask because we have run quite successfully with up to about 11,000 basis functions and 1000 atoms, but I have been largely unsuccessful with larger runs, encountering issues like ODBatch_allocation_1dim messages and job aborts. This is completely independent of using ScaLAPACK or straight MPI, of using threading or not, and of whether it's an int64 build or not. I'm happy to post more information about the jobs, but the smallest of them runs for more than 60 hours on 64 cores before falling over, which is asking a lot of someone who wants to help out...

There's plenty of memory on the machines we use; physical memory will obviously be the ultimate limitation, but we have at least 6 GB per core.

Best regards
Pete

simensr
Posts: 182
Joined: 28 Aug 2013, 09:54
First name(s): Simen
Middle name(s): Sommerfelt
Last name(s): Reine
Affiliation: University of Oslo
Country: Norway

Re: LSDalton memory requirements

Post by simensr » 09 Feb 2014, 09:07

We have no a priori estimate of the memory consumption in LSDALTON. However, the memory statistics are printed a posteriori at the end of a successful run, for example:

Max allocated memory, TOTAL 43.272 GB
Max allocated memory, type(matrix) 9.730 GB
Max allocated memory, real(realk) 43.270 GB
...

If you look at the TOTAL memory consumption of your completed calculations, this would at least give some indication of the memory requirements to expect for the larger systems.
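If you want to compare several completed runs, here is a minimal Python sketch that pulls these lines out of an output file (the file name and the exact spacing of the lines are my assumptions based on the excerpt above, not a documented format):

import re
import sys

def max_allocated(path):
    # Map each "Max allocated memory" label to (value, unit),
    # e.g. {"TOTAL": (43.272, "GB"), ...}.
    stats = {}
    pattern = re.compile(r"Max allocated memory,\s+(.+?)\s+([\d.]+)\s+(\w+)")
    with open(path) as f:
        for line in f:
            m = pattern.search(line)
            if m:
                label, value, unit = m.groups()
                stats[label] = (float(value), unit)
    return stats

if __name__ == "__main__":
    for label, (value, unit) in max_allocated(sys.argv[1]).items():
        print(f"{label:20s} {value:10.3f} {unit}")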

Hope this is of help,
Simen

tkjaer
Posts: 300
Joined: 27 Aug 2013, 20:35
First name(s): Thomas
Last name(s): Kjaergaard

Re: LSDalton memory requirements

Post by tkjaer » 09 Feb 2014, 09:52

First of all, thank you, Peter. You seem to be the only one running LSDalton - or the only one with problems running LSDalton. We appreciate your feedback and your questions. However, there is currently no way to determine a priori the memory required for an LSDalton calculation.

As you know, all memory is allocated on the heap using Fortran's intrinsic allocate statement, so it is the responsibility of the individual developer to limit memory usage. This was not a concern at the beginning of the LSDalton development; afterwards, when we tried to apply the code to larger and larger molecules, we had to go back and reduce the memory footprint.

I believe a simple HF/DFT energy calculation should be fairly thoroughly tested - but only up to around 1000 atoms and around 11,000 basis functions (although I have done a 17,000 basis function calculation).

If you are doing an HF/DFT calculation it should be possible to estimate the memory requirements. Assuming that you use the dense matrix type (no .SCALAPACK and no .CSR), the memory per matrix should be

nbast*nbast*8.0/(1000*1000*1000) GB

where nbast = number of basis functions.

Depending on the SCF algorithm, the number of matrices held can be as many as 55. The actual number of matrices used depends on the number of iterations; it can be reduced by keeping matrices on disk (.DISK and .DISKSOLVER), though this is naturally associated with a time penalty. A back-of-envelope estimate along these lines is sketched below.
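To make that concrete, a small Python sketch (the 8-byte reals and the 55-matrix worst case are taken from the formula and numbers above; treat the result as an upper bound for the dense-matrix part only):

def matrix_gb(nbast):
    # One dense nbast x nbast matrix of 8-byte reals, in GB.
    return nbast * nbast * 8.0 / (1000 * 1000 * 1000)

def scf_matrix_gb(nbast, n_matrices=55):
    # Worst-case dense-matrix memory for the SCF, in GB.
    return n_matrices * matrix_gb(nbast)

print(matrix_gb(11000))      # ~0.97 GB per matrix at 11,000 basis functions
print(scf_matrix_gb(11000))  # ~53 GB if all 55 matrices are held at once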

Using an input file like

**WAVE FUNCTIONS
.HF
**END OF INPUT

the matrices should be the dominant memory consumer in the SCF part. Naturally the integral routines take up some memory in addition to the memory used for the matrices, but it should be minor.

I do not have an overview of the response code - I have no idea how much memory is used in that part of the code. As far as I know, the response people (which does, to some extent, include myself) have not been too concerned with memory, and we have not done much testing of the response code for large molecular systems.

The **DEC and **CC modules use a lot of memory, but they require a .MEMORY keyword and should not use more than this amount of memory per node; see the input fragment below.
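As a sketch of what that might look like in the input (the placement directly under **DEC, the value-on-the-next-line convention, and the unit are my assumptions here - check the manual for your version):

**DEC
.MEMORY
16

where 16 would be the per-node limit (presumably in GB).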

However, all calculations should provide you with a memory overview at the end.

There are two parts.

The first part detects memory leaks - please contact us if you experience a memory leak.

The second part

Max allocated memory, TOTAL 43.272 GB
Max allocated memory, type(matrix) 9.730 GB
Max allocated memory, real(realk) 43.270 GB
...

contains information about the maximum memory used during the calculation. Looking at this overview for a slightly smaller system can hopefully help you estimate the memory requirements of the larger systems - but since the memory requirements scale linearly with neither system size nor the number of basis functions, it can be tricky. A crude extrapolation is sketched below.
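As an illustration, and only under the assumption (for the reason just given, a rough one) that a single power law captures the trend between two completed runs:

import math

def fit_power_law(n1, mem1, n2, mem2):
    # Fit mem = a * n**p through two (nbast, max-memory) measurements.
    p = math.log(mem2 / mem1) / math.log(n2 / n1)
    a = mem1 / n1 ** p
    return a, p

# Hypothetical numbers: 14 GB at 6,000 and 43.3 GB at 11,000 basis functions.
a, p = fit_power_law(6000, 14.0, 11000, 43.3)
print(p)               # fitted exponent
print(a * 17000 ** p)  # extrapolated GB for a 17,000-function run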

The LSDalton development team has limited experience running calculations bigger than 11,000 basis functions and 1000 atoms, so maybe you are experiencing a memory problem where some routine allocates too much memory.

I would like to try out the system you mentioned. I am aware that it is a big system, but the program should be able to handle it, and 6 GB per core should be enough. So please attach it or send it to me personally.

On several occasions we have discussed having some kind of statistics run where we estimate memory requirements - and maybe MPI efficiency and estimated run time - but since this is a "nice to have" feature without any scientific impact, it is not high on anyone's to-do list.

Best Regards
Thomas Kjærgaard

tkjaer
Posts: 300
Joined: 27 Aug 2013, 20:35
First name(s): Thomas
Last name(s): Kjaergaard

Re: LSDalton memory requirements

Post by tkjaer » 11 Feb 2014, 14:33

I would like to try out the system you mentioned - or some other big system that you have problems with. The program should be able to handle it, and 6 GB per core should be enough. Maybe I can reduce the memory requirements for the next patch or the next release.

Best Regards
Thomas Kjærgaard
