MML Inference of Single-Layer Neural Networks
Enes Makalic, Lloyd Allison & David L. Dowe
Abstract: The architecture selection problem is of great importance when designing neural networks. A network that is too simple does not learn the problem sufficiently well. Conversely, a network that is larger than necessary is prone to overfitting and generalises poorly. This paper presents a novel architecture selection criterion for single hidden layer feedforward networks. The optimal network size is determined using a version of the Minimum Message Length (MML) inference method. Performance is demonstrated on several problems and compared with a Minimum Description Length (MDL) based selection criterion. [preprint.ps]
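The idea of message-length-based architecture selection can be illustrated with a minimal sketch: score each candidate hidden-layer size by a two-part cost (model cost plus data fit) and keep the size that minimises the total. This is an assumption-laden simplification, not the paper's criterion: the actual MML87 calculation involves the Fisher information of the network weights, whereas here the model cost is a crude (k/2) log n stand-in, and `nll_for`, `select_hidden_size`, and all parameter values are hypothetical names for illustration.

```python
import numpy as np

def message_length(neg_log_likelihood, n_params, n_data):
    """Two-part score: data cost plus a crude (k/2) log n parameter cost.

    Hypothetical stand-in for the MML87 message length, which would use
    the Fisher information of the weights rather than this BIC-like term.
    """
    model_cost = 0.5 * n_params * np.log(n_data)
    return neg_log_likelihood + model_cost

def select_hidden_size(candidates, nll_for, n_inputs, n_outputs, n_data):
    """Pick the hidden-layer size with the shortest total message length.

    `nll_for(h)` is assumed to return the negative log-likelihood of the
    training data under the best-fitting network with h hidden units.
    """
    best_h, best_len = None, np.inf
    for h in candidates:
        # Weight count of a single-hidden-layer net with bias terms:
        # input->hidden plus hidden->output layers.
        k = h * (n_inputs + 1) + n_outputs * (h + 1)
        total = message_length(nll_for(h), k, n_data)
        if total < best_len:
            best_h, best_len = h, total
    return best_h, best_len
```

With a toy fit curve such as `nll_for = lambda h: 500.0 / h`, small networks are penalised by poor fit and large ones by parameter cost, so an intermediate size wins; this is the trade-off the abstract describes.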
© L. Allison, www.allisons.org/ll/ (or as otherwise indicated).