Palisade Knowledge Base


4.16. Outputs with Small Means Don't Converge

Applies to:
@RISK 6.x/7.x

I set up my simulation with convergence monitoring using the simulated mean. Most of my outputs converged, but one or two of them were still at or close to 0% converged. Their means are smaller than the means of the inputs that did converge; does that matter?

That's the key to the mystery. For simplicity, suppose you used the defaults, so @RISK runs until it is 95% confident, for each tested output, that the simulated mean is within ±3% of that output's true mean.

If Output A has a mean of 200,000,000, the ±3% tolerance band is 12,000,000 units wide. But if Output B has a mean of 10, the ±3% tolerance band is only 0.6 units wide. If the two outputs have similar standard deviations, Output B will take far longer to converge, because it takes many more iterations to pin a mean down between 9.7 and 10.3 than between 194,000,000 and 206,000,000.
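The effect is easy to quantify with the standard confidence-interval formula for a mean: the 95% half-width after n iterations is roughly 1.96·σ/√n, so the mean "converges" once that half-width shrinks below 3% of |μ|. A minimal sketch (this is the textbook formula, not @RISK's exact internal test, and the shared standard deviation of 5,000 is an assumed figure for illustration):

```python
import math

def iterations_for_mean_convergence(mu, sigma, tol=0.03, z=1.96):
    """Estimate iterations n needed before the 95% CI half-width
    z*sigma/sqrt(n) falls within tol*|mu| of the true mean."""
    half_width = tol * abs(mu)          # width of the allowed band around mu
    return math.ceil((z * sigma / half_width) ** 2)

# Output A: huge mean; Output B: tiny mean; same sigma (assumed 5,000)
n_a = iterations_for_mean_convergence(200_000_000, 5_000)
n_b = iterations_for_mean_convergence(10, 5_000)
print(n_a, n_b)
```

With the same spread, Output A's mean is effectively converged after a single iteration, while Output B would need on the order of a billion iterations: the tolerance band scales with the mean, but the sampling noise does not.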

The means don't even have to be that extreme. Take a look at the attached example, and click Start Simulation. (There's a fixed random number seed, so your numbers should match the ones in this article.) C55 through G55 have means around $1,700 to $6,200, but H55 has a mean of minus $16. All six have standard deviations in the range $3,500 to $5,300. Even though the other means are only about 100 to 400 times the magnitude of H55's mean, the other outputs converge and H55 does not. (To see the standard deviations, right-click in the Results Summary window and select Columns for Table.)
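You can reproduce the same behavior with a toy Monte Carlo run. The sketch below is not @RISK's actual convergence algorithm; it simply draws normal samples (means and standard deviations loosely mirroring the attached example) and, every 1,000 iterations, checks whether the 95% CI half-width of the running mean has fallen within 3% of that mean:

```python
import random
import statistics

def converged_within(mu, sigma, budget=200_000, tol=0.03, z=1.96, seed=1):
    """Return the iteration count at which the simulated mean converges
    (95% CI half-width <= tol * |running mean|), or None if it never
    converges within `budget` iterations."""
    rng = random.Random(seed)           # fixed seed, like the example model
    samples = []
    for i in range(1, budget + 1):
        samples.append(rng.gauss(mu, sigma))
        if i % 1000 == 0:               # test convergence periodically
            m = statistics.fmean(samples)
            s = statistics.stdev(samples)
            if z * s / i**0.5 <= tol * abs(m):
                return i
    return None

print(converged_within(3000, 4000))   # a C55-like output: converges
print(converged_within(-16, 4000))    # an H55-like output: None
```

The large-mean output converges after a few thousand iterations, while the near-zero-mean output never does within the budget: its ±3% band is less than $0.50 wide, far tighter than the sampling noise can satisfy.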

See also: For more about how convergence monitoring works, see Convergence Monitoring in @RISK.

Last edited: 2019-02-04
