Palisade Knowledge Base


15.18. Tradeoffs: Accuracy, Flexibility, Training Speed

Applies to: NeuralTools 5.x–7.x

Is there a way to trade off some of the neural network's accuracy in exchange for greater flexibility?

The key is the distinction between the two types of neural networks available in NeuralTools: Probabilistic Neural Networks (PN nets) and Multi-Layer Feedforward (MLF) nets.

By default, NeuralTools uses PN nets because they train much faster. MLF nets are more flexible, in the sense that they have a greater capacity to generalize beyond the range of the training data. If you need that extra flexibility and generalization, try MLF nets and accept the longer training time.
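The following is a minimal sketch, not NeuralTools itself, of why this tradeoff exists. It uses NumPy and scikit-learn (the MLPRegressor class stands in for an MLF-style net; the hand-rolled kernel average stands in for a PN/GRNN-style net, and the sigma value is an arbitrary illustrative choice). A PN/GRNN-style prediction is a weighted average of the training outputs, so "training" is nearly instant but the prediction can never leave the range of outputs already seen; an MLF-style network trains iteratively and is not bounded in that way.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Training data: y = 2x + noise, with x restricted to [0, 5]
x_train = rng.uniform(0, 5, size=(200, 1))
y_train = 2 * x_train.ravel() + rng.normal(0, 0.1, size=200)

# Test inputs extend beyond the training range: x in [5, 8]
x_test = np.linspace(5, 8, 50).reshape(-1, 1)


def grnn_predict(x_query, x_tr, y_tr, sigma=0.5):
    """GRNN/PN-style prediction: Gaussian-kernel weighted average of training outputs."""
    d2 = (x_query - x_tr.ravel()) ** 2        # squared distances to training points
    w = np.exp(-d2 / (2 * sigma ** 2))        # kernel weights
    return (w @ y_tr) / w.sum()               # always within [min(y_tr), max(y_tr)]


grnn_pred = np.array([grnn_predict(x, x_train, y_train) for x in x_test.ravel()])

# Small MLF-style network; slower to train, but free to extrapolate
mlf = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
mlf.fit(x_train, y_train)
mlf_pred = mlf.predict(x_test)

print("max training target:       %.2f" % y_train.max())
print("max GRNN-style prediction:  %.2f (cannot exceed the training max)" % grnn_pred.max())
print("max MLF-style prediction:   %.2f (not bounded by the training max)" % mlf_pred.max())

Running the sketch shows the kernel-based predictor flattening out at the edge of the training data, while the MLF-style net can follow the trend past it, which is the sense in which MLF nets generalize beyond the range of the training data.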

Last edited: 2016-08-02
