
9.6. Memory Used by @RISK Simulations

Applies to: @RISK for Excel 5.x–7.x

How much memory is used during a simulation?

@RISK saves the values of each output, each input (unless you have changed the default on the Sampling tab of Simulation Settings), and each cell referred to by a statistics function such as RiskMean( ) or RiskPtoX( ). The memory required is 8 bytes per value per iteration per simulation. However, to avoid the 2 GB limit (see below), @RISK pages data to disk as needed.
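As a rough back-of-the-envelope check, you can estimate the size of this iteration data yourself. The short Python sketch below is only an illustration, not part of @RISK; the counts of outputs, inputs, and statistics-function references are made-up examples you would replace with the numbers from your own model.

  # Rough estimate of the iteration data @RISK collects, at 8 bytes per
  # stored value per iteration per simulation. Counts are examples.
  BYTES_PER_VALUE = 8

  outputs = 50          # cells defined with RiskOutput()
  inputs = 200          # distribution functions, if samples are collected
  stat_refs = 10        # cells referenced by RiskMean(), RiskPtoX(), etc.
  iterations = 100_000
  simulations = 3

  values_per_iteration = outputs + inputs + stat_refs
  total_bytes = values_per_iteration * iterations * simulations * BYTES_PER_VALUE
  print(f"Estimated iteration data: {total_bytes / 1024**2:.0f} MB")
  # (50 + 200 + 10) values x 100,000 iterations x 3 simulations x 8 bytes = about 595 MB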

@RISK needs additional memory for its own code and for data other than the iterations of simulation inputs and outputs. To get an idea of overall memory requirements for your simulation:

  1. In Simulation Settings » General, change Multiple CPU to Disabled and run a simulation with the number of iterations unchanged.

  2. When the @RISK progress window shows iterations being run, open Task Manager (Ctrl+Shift+Esc) and look in the Commit Size column to see how much memory Excel.exe is using.

    • In Windows 7 or XP, look at the Processes tab. If you don't see the Commit Size column, click View » Select Columns » Memory–Commit Size.
    • In Windows 8, look at the Details tab. If you don't see the Commit Size column, right-click on any column head and click Select Columns » Commit Size.
  3. You can then shut down the simulation with the "stop" button in the progress window.

When you re-enable Multiple CPU, the master Excel process will use about this much memory, and each worker process will use somewhat less.
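If you would rather watch the figure change than read it off Task Manager, the following Python sketch polls the commit size of every running Excel.exe process once a second. It is only an illustration, not something @RISK provides; it assumes the third-party psutil package is installed, and it relies on psutil's vms field, which on Windows tracks the same committed memory that Task Manager labels Commit Size.

  import time
  import psutil  # third-party package: pip install psutil

  def excel_commit_sizes():
      """Return {pid: committed memory in MB} for every Excel.exe process."""
      sizes = {}
      for proc in psutil.process_iter(['name', 'memory_info']):
          name = (proc.info['name'] or '').lower()
          mem = proc.info['memory_info']
          if name == 'excel.exe' and mem is not None:
              # On Windows, vms reflects the process's committed
              # (pagefile-backed) memory, i.e. Task Manager's Commit Size.
              sizes[proc.pid] = mem.vms / 1024**2
      return sizes

  if __name__ == '__main__':
      # Poll once per second; press Ctrl+C to stop once you have your reading.
      while True:
          print(excel_commit_sizes())
          time.sleep(1)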

When I disable Smart Sensitivity Analysis, my simulation starts faster, but does it also reduce memory use?

Yes and no. Once the precedent tracing for Smart Sensitivity Analysis is complete, @RISK saves the results of the trace but frees the memory that was used to perform it. So there is no appreciable memory saving once the simulation's iterations are running.

However, if your model is large and complicated enough, @RISK could run out of memory during the process of tracing precedents. In that case, turning off Smart Sensitivity Analysis will bypass precedent tracing and the associated out-of-memory condition.

I have heard that Excel has a memory limit of 2 GB. Does @RISK have such a limit?

Every 32-bit process, including 32-bit Excel, is limited to 2 GB of address space. @RISK running in 32-bit Excel does not have its own memory space; it shares Excel's. As mentioned above, @RISK pages data to disk during a simulation to avoid overrunning the 2 GB limit.

If you are using multiple processors, then each Excel process has a separate 2 GB limit, so the overall simulation can use up to 2 GB times the number of processors. If you want to limit the number of processors used by a simulation, please see CPUs Used by @RISK.

There is an additional limit, which may be tighter in practice: the physical RAM installed in your computer. Not all of that RAM is available to Excel and @RISK; the operating system and other running applications need some as well. On the Processes tab of Task Manager, you can see how much memory is in use by which processes.
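To put the two limits side by side, here is another small, purely illustrative Python sketch (again assuming the psutil package is installed): the 2 GB address-space ceiling applies to each Excel process separately, while physical RAM is shared by everything running on the machine. The number of Excel processes is an example; substitute the CPU count your simulation actually uses.

  import psutil  # third-party package: pip install psutil

  ADDRESS_SPACE_PER_PROCESS_GB = 2   # limit for each 32-bit Excel process
  excel_processes = 4                # example: Multiple CPU running on 4 CPUs

  address_space_total_gb = ADDRESS_SPACE_PER_PROCESS_GB * excel_processes
  available_ram_gb = psutil.virtual_memory().available / 1024**3

  print(f"Address space across {excel_processes} Excel processes: "
        f"{address_space_total_gb} GB")
  print(f"Physical RAM currently available: {available_ram_gb:.1f} GB")
  # The smaller of the two figures is the tighter practical ceiling.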

Does @RISK take advantage of 64-bit Excel?

The great majority of simulations run just fine in 32-bit Excel and @RISK and do not see significant benefit from switching to a 64-bit platform. If your simulation generates gigabytes of data, and you have enough RAM to hold it all, you may see some benefit. Please see Should I Install 64-bit Excel? for more information.

See also: "Out of Memory" and "Not enough memory to run simulation" for techniques to reduce the memory used.

Last edited: 2015-07-14
