MML Inference of Single-Layer Neural Networks

Enes Makalic, Lloyd Allison & David L. Dowe,
School of Computer Science and Software Engineering,
Monash University, Clayton, Victoria 3800, Australia.

(The Third IASTED International Conference on Artificial Intelligence and Applications, AIA 2003, September 8-10, 2003, Benalmadena, Spain.)


Abstract: The architecture selection problem is of great importance when designing neural networks. A network that is too simple cannot learn the problem sufficiently well, while a larger than necessary network is prone to overfitting and generalises poorly. This paper presents a novel architecture selection criterion for single-hidden-layer feedforward networks. The optimal network size is determined using a version of the Minimum Message Length (MML) inference method. Performance is demonstrated on several problems and compared with a Minimum Description Length (MDL) based selection criterion.
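The two-part message idea underlying MML model selection can be sketched as follows. This is a minimal illustration only, not the paper's actual coding scheme: the fixed bits-per-weight model cost, the Gaussian noise model, the toy training routine, and all parameter values below are assumptions made for the sketch. For each candidate hidden-layer size, the total message length is the cost of stating the model plus the cost of the data given the model, and the size minimising this total is selected.

```python
# Hypothetical sketch of two-part MML-style architecture selection
# (assumed costs and noise model; NOT the criterion derived in the paper).
import numpy as np

rng = np.random.default_rng(0)

def train_net(X, y, hidden, epochs=200, lr=0.1):
    """Train a one-hidden-layer net (tanh hidden, linear output) by gradient descent."""
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden))
    W2 = rng.normal(0, 0.5, (hidden, 1))
    for _ in range(epochs):
        H = np.tanh(X @ W1)          # hidden activations
        err = H @ W2 - y             # residuals
        gW2 = H.T @ err / n
        gW1 = X.T @ ((err @ W2.T) * (1 - H ** 2)) / n
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def message_length(X, y, W1, W2, sigma=0.5, bits_per_weight=8.0):
    """Crude two-part message length in bits (illustrative assumptions only)."""
    pred = np.tanh(X @ W1) @ W2
    n = len(y)
    # Part 1: state the model -- here simply a fixed cost per weight.
    model_bits = bits_per_weight * (W1.size + W2.size)
    # Part 2: encode the data given the model, under Gaussian noise.
    nll_nats = (0.5 * np.sum((y - pred) ** 2) / sigma ** 2
                + n * np.log(sigma * np.sqrt(2 * np.pi)))
    return model_bits + nll_nats / np.log(2)

# Toy regression data.
X = rng.uniform(-1, 1, (100, 1))
y = np.sin(3 * X) + rng.normal(0, 0.1, X.shape)

lengths = {h: message_length(X, y, *train_net(X, y, h)) for h in (1, 2, 4, 8, 16)}
best = min(lengths, key=lengths.get)
print("selected hidden units:", best)
```

A too-small network pays little for its weights but a lot to encode the residuals; a too-large one pays the reverse, so the minimum of the total message length trades the two off, which is the effect the abstract describes.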

[preprint.ps].

www: AIA 2003

© L. Allison, www.allisons.org/ll/ (or as otherwise indicated).