Continuous Naive Bayesian Classification

dc.contributor.author: Vinsensius Berlin Vega S. N.
dc.contributor.author: Stephane Bressan
dc.date.accessioned: 2004-10-21T14:28:52Z
dc.date.accessioned: 2017-01-23T07:00:03Z
dc.date.available: 2004-10-21T14:28:52Z
dc.date.available: 2017-01-23T07:00:03Z
dc.date.issued: 2003-06-01T00:00:00Z
dc.description.abstract: The most common model of machine learning algorithms involves two life-stages, namely the learning stage and the application stage. The cost of human expertise makes labeling large sets of training data difficult. In this paper, we propose to challenge this strict dichotomy in the life cycle while addressing the issue of data labeling. We discuss a learning paradigm called Continuous Learning. After an initial training phase based on human-labeled data, a Continuously Learning algorithm iteratively trains itself on the results of its own previous application stage, without the benefit of any external feedback. We elucidate the intuitive motivation and idea of this paradigm, then explain how it differs from other learning models. Finally, we present an empirical evaluation of Continuous Learning applied to the Naive Bayesian Classifier for the classification of newsgroup articles from a well-known benchmark.
dc.format.extent: 258235 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: https://dl.comp.nus.edu.sg/xmlui/handle/1900.100/1429
dc.language.iso: en
dc.relation.ispartofseries: TRB6/03
dc.title: Continuous Naive Bayesian Classification
dc.type: Technical Report
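The life cycle described in the abstract can be sketched as a self-training loop: a Naive Bayesian classifier is first trained on a small human-labeled seed set, then repeatedly retrained on its own predictions over unlabeled data, with no external feedback. The following is a minimal illustrative sketch, not the report's actual method; all function names and toy data are assumptions.

```python
import math
from collections import defaultdict

# Hypothetical sketch of the Continuous Learning paradigm: an initial
# training stage on labeled data, then iterative self-retraining on the
# classifier's own labels. Data and names are illustrative only.

def train_nb(docs):
    """Fit class priors and per-class word counts from (words, label) pairs."""
    class_counts = defaultdict(int)
    word_counts = defaultdict(lambda: defaultdict(int))
    vocab = set()
    for words, label in docs:
        class_counts[label] += 1
        for w in words:
            word_counts[label][w] += 1
            vocab.add(w)
    return class_counts, word_counts, vocab

def classify(model, words):
    """Pick the class with the highest Laplace-smoothed log posterior."""
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label, count in class_counts.items():
        lp = math.log(count / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            lp += math.log((word_counts[label].get(w, 0) + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

def continuous_learning(seed, unlabeled, iterations=3):
    """After initial training on the seed, retrain on self-labeled data."""
    model = train_nb(seed)
    for _ in range(iterations):
        self_labeled = [(doc, classify(model, doc)) for doc in unlabeled]
        model = train_nb(seed + self_labeled)
    return model

seed = [
    (["ball", "goal", "team"], "sports"),
    (["cpu", "gpu", "code"], "tech"),
]
unlabeled = [["team", "goal"], ["code", "cpu"], ["ball", "team", "goal"]]
model = continuous_learning(seed, unlabeled)
print(classify(model, ["goal", "ball"]))  # -> sports
print(classify(model, ["gpu", "code"]))   # -> tech
```

The loop never consults an oracle after the seed set: the unlabeled documents absorbed in each iteration carry only the model's own previous decisions, which is the distinguishing feature of the paradigm relative to standard supervised retraining.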
Files
Original bundle
Name: report.pdf
Size: 252.18 KB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.52 KB
Format: Plain Text