MML Inference of Single-Layer Neural Networks

Enes Makalic, Lloyd Allison & David L. Dowe,
School of Computer Science and Software Engineering,
Monash University, Clayton, Victoria 3800, Australia.

(The Third IASTED International Conference on Artificial Intelligence and Applications, AIA 2003, September 8-10, 2003, Benalmádena, Spain.)


Abstract: The architecture selection problem is of central importance when designing neural networks. A network that is too simple cannot learn the problem sufficiently well; conversely, a larger-than-necessary network tends to overfit the training data and so generalises poorly. This paper presents a novel architecture selection criterion for single-hidden-layer feedforward networks. The optimal network size is determined using a version of the Minimum Message Length (MML) inference method. Performance is demonstrated on several problems and compared with a Minimum Description Length (MDL) based selection criterion.
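The paper's MML criterion itself is not reproduced in this abstract. As a rough illustration of the general idea, the sketch below selects the number of hidden units for a single-hidden-layer network by minimising a two-part description length: a data-fit cost plus a BIC-style parameter cost. This is a crude MDL-flavoured stand-in, not the authors' method; all function names, hyperparameters, and the toy data set are this sketch's own assumptions.

```python
# Minimal sketch of description-length-based architecture selection.
# NOT the paper's MML criterion: the score below is a BIC-style
# two-part approximation, used only to illustrate the selection loop.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: y = sin(2*pi*x) + Gaussian noise.
n = 80
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.normal(size=(n, 1))

def train_net(x, y, hidden, epochs=3000, lr=0.05, seed=1):
    """Train a 1-hidden-layer tanh network by full-batch gradient descent.
    Returns the residual sum of squares and the number of free parameters."""
    r = np.random.default_rng(seed)
    W1 = r.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)            # forward pass
        err = (h @ W2 + b2) - y
        gW2 = h.T @ err / n; gb2 = err.mean(0)      # backprop
        dh = (err @ W2.T) * (1.0 - h * h)
        gW1 = x.T @ dh / n; gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    h = np.tanh(x @ W1 + b1)
    sse = float((((h @ W2 + b2) - y) ** 2).sum())
    k = W1.size + b1.size + W2.size + b2.size       # free parameters
    return sse, k

def description_length(sse, k, n):
    """Two-part code length (in nits): data cost + parameter cost."""
    return 0.5 * n * np.log(sse / n) + 0.5 * k * np.log(n)

# Score candidate architectures and keep the shortest message.
scores = {}
for hidden in range(1, 6):
    sse, k = train_net(x, y, hidden)
    scores[hidden] = description_length(sse, k, n)

best = min(scores, key=scores.get)
print("message lengths:", {h: round(s, 2) for h, s in scores.items()})
print("selected hidden units:", best)
```

The MML criterion developed in the paper differs from this BIC-style score in how it prices the parameters (it accounts for parameter precision rather than charging a flat (1/2) log n per weight), but the selection loop — train each candidate, compute a message length, keep the minimum — has the same shape.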




© L. Allison (or as otherwise indicated).