dc.contributor.author | Pensuwon, W. | |
dc.contributor.author | Adams, R.G. | |
dc.contributor.author | Davey, N. | |
dc.date.accessioned | 2009-10-20T08:16:51Z | |
dc.date.available | 2009-10-20T08:16:51Z | |
dc.date.issued | 2004 | |
dc.identifier.citation | Pensuwon, W., Adams, R.G. & Davey, N. 2004, 'Optimising a Neural Tree Using Subtree Retraining', Lecture Notes in Computer Science (LNCS), vol. 2004, pp. 256-262. | |
dc.identifier.issn | 0302-9743 | |
dc.identifier.other | dspace: 2299/3972 | |
dc.identifier.uri | http://hdl.handle.net/2299/3972 | |
dc.description | “The original publication is available at www.springerlink.com”. Copyright Springer. [Full text of this article is not available in the UHRA] | |
dc.description.abstract | Subtree retraining applied to a Stochastic Competitive Evolutionary Neural Tree model (SCENT) is introduced. This subtree retraining process is designed to improve the performance of the original model, which provides a hierarchical classification of unlabelled data. Subtree retraining produces stable classificatory structures by repeatedly restructuring the weakest branch of the classification tree, based on the internal relations between its members. An experimental comparison using well-known real-world data sets, chosen to provide a variety of clustering scenarios, showed that the new approach produced more reliable performance. | en |
dc.language.iso | eng | |
dc.relation.ispartof | Lecture Notes in Computer Science (LNCS) | |
dc.title | Optimising a Neural Tree Using Subtree Retraining | en |
dc.contributor.institution | School of Computer Science | |
dc.contributor.institution | Centre for Computer Science and Informatics Research | |
dc.contributor.institution | Science & Technology Research Institute | |
dc.contributor.institution | School of Physics, Engineering & Computer Science | |
dc.description.status | Peer reviewed | |
rioxxterms.type | Journal Article/Review | |
herts.preservation.rarelyaccessed | true | |